ADAM learning notes 21: building and installing ADAM on Ubuntu

Environment:
ADAM 0.19.0 (Scala 2.10 build), source checkout at ~/cloud/adam-2.10-0.19-git
Build steps: clean package, test, install
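The three build steps above can be sketched as the following command sequence; this is a minimal outline, with the checkout directory name assumed from the shell prompt in the logs below:

```shell
# Assumed location of the ADAM source tree (taken from the prompt in the logs)
BUILD_DIR="$HOME/cloud/adam-2.10-0.19-git"

# Print the Maven invocations used in this post, in order:
#   1. compile and package, skipping tests
#   2. run the test suites
#   3. install the jars into the local ~/.m2 repository
for goal in "clean package -DskipTests" "test" "install"; do
  echo "cd $BUILD_DIR && mvn $goal"
done
```

Running the printed commands in order reproduces the logs recorded in the sections that follow.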

1. Running mvn clean package -DskipTests on ADAM under Ubuntu

xubo@xubo:~/cloud/adam-2.10-0.19-git$ mvn clean package -DskipTests
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.10
[INFO] ADAM_2.10: Core
[INFO] ADAM_2.10: APIs for Java
[INFO] ADAM_2.10: CLI
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent_2.10 ---
[INFO] Modified 0 of 199 .scala files
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: Core 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ adam-core_2.10 ---
[INFO] Deleting /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-core_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core_2.10 ---
[INFO] Modified 0 of 159 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/java:-1: info: compiling
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala:-1: info: compiling
[INFO] Compiling 108 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/scala-2.10.4/classes at 1463194196606
[WARNING] warning: there were 37 deprecation warning(s); re-run with -deprecation for details
[WARNING] warning: there were 33 feature warning(s); re-run with -feature for details
[WARNING] two warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 57 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-core_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/scala-2.10.4/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-core_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 65 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala:-1: info: compiling
[INFO] Compiling 53 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/scala-2.10.4/test-classes at 1463194259111
[WARNING] /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala/org/bdgenomics/adam/rdd/read/recalibration/BaseQualityRecalibrationSuite.scala:38: warning: object DecadentRead in package rich is deprecated: Use RichAlignmentRecord wherever possible in new development.
[WARNING]     val bqsr = new BaseQualityRecalibration(cloy(reads), snps)
[WARNING]                                             ^
[WARNING] /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala/org/bdgenomics/adam/rdd/read/recalibration/BaseQualityRecalibrationSuite.scala:58: warning: object DecadentRead in package rich is deprecated: Use RichAlignmentRecord wherever possible in new development.
[WARNING]     val bqsr = new BaseQualityRecalibration(cloy(reads), snps)
[WARNING]                                             ^
[WARNING] warning: there were 1 feature warning(s); re-run with -feature for details
[WARNING] three warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 59 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-core_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.6:test-jar (default) @ adam-core_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: APIs for Java 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ adam-apis_2.10 ---
[INFO] Deleting /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-apis_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis_2.10 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/scala:-1: info: compiling
[INFO] Compiling 3 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/scala-2.10.4/classes at 1463194321294
[WARNING] warning: there were 2 feature warning(s); re-run with -feature for details
[WARNING] one warning found
[INFO] prepare-compile in 0 s
[INFO] compile in 6 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-apis_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/java:-1: info: compiling
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/scala:-1: info: compiling
[INFO] Compiling 3 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/scala-2.10.4/test-classes at 1463194327622
[INFO] prepare-compile in 0 s
[INFO] compile in 7 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-apis_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/scala-2.10.4/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-apis_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-apis_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-apis_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.6:test-jar (default) @ adam-apis_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: CLI 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ adam-cli_2.10 ---
[INFO] Deleting /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.0:revision (default) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0-alpha-3:filter-sources (filter-src) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-cli_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli_2.10 ---
[INFO] Modified 0 of 36 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates:-1: info: compiling
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala:-1: info: compiling
[INFO] Compiling 26 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes at 1463194338099
[INFO] prepare-compile in 0 s
[INFO] compile in 23 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-cli_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-cli_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 9 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/test/scala:-1: info: compiling
[INFO] Compiling 11 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/test-classes at 1463194361602
[INFO] prepare-compile in 0 s
[INFO] compile in 21 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-cli_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-cli_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-cli_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-cli_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0.jar
[INFO] 
[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ adam-cli_2.10 ---
[INFO] Including commons-cli:commons-cli:jar:1.2 in the shaded jar.
[INFO] Including commons-httpclient:commons-httpclient:jar:3.1 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.4 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.1.3 in the shaded jar.
[INFO] Including org.apache.commons:commons-compress:jar:1.4.1 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.13 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13 in the shaded jar.
[INFO] Including com.google.code.findbugs:jsr305:jar:1.3.9 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.10 in the shaded jar.
[INFO] Including log4j:log4j:jar:1.2.17 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.1.7 in the shaded jar.
[INFO] Including com.thoughtworks.paranamer:paranamer:jar:2.6 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-io_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-misc_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.5.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.3 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-cli_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-avro:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-column:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-common:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-encoding:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-hadoop:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-jackson:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-format:jar:2.3.0-incubating in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-metrics_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including com.netflix.servo:servo-core:jar:0.10.0 in the shaded jar.
[INFO] Including com.google.code.findbugs:annotations:jar:2.0.0 in the shaded jar.
[INFO] Including com.netflix.servo:servo-internal:jar:0.10.0 in the shaded jar.
[INFO] Including org.scoverage:scalac-scoverage-plugin_2.10:jar:1.1.1 in the shaded jar.
[INFO] Including org.bdgenomics.bdg-formats:bdg-formats:jar:0.7.0 in the shaded jar.
[INFO] Including org.apache.avro:avro:jar:1.7.7 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-core_2.10:jar:0.19.0 in the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.24.0 in the shaded jar.
[INFO] Including com.esotericsoftware.minlog:minlog:jar:1.2 in the shaded jar.
[INFO] Including org.objenesis:objenesis:jar:2.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:1.3.2 in the shaded jar.
[INFO] Including it.unimi.dsi:fastutil:jar:6.6.5 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-scala_2.10:jar:1.8.1 in the shaded jar.
[INFO] Including org.seqdoop:hadoop-bam:jar:7.1.0 in the shaded jar.
[INFO] Including com.github.samtools:htsjdk:jar:1.139 in the shaded jar.
[INFO] Including org.apache.commons:commons-jexl:jar:2.1.1 in the shaded jar.
[INFO] Including org.tukaani:xz:jar:1.5 in the shaded jar.
[INFO] Including org.apache.ant:ant:jar:1.8.2 in the shaded jar.
[INFO] Including org.apache.ant:ant-launcher:jar:1.8.2 in the shaded jar.
[INFO] Including com.google.guava:guava:jar:16.0.1 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-apis_2.10:jar:0.19.0 in the shaded jar.
[INFO] Including org.scala-lang:scala-library:jar:2.10.4 in the shaded jar.
[INFO] Including org.slf4j:slf4j-log4j12:jar:1.7.12 in the shaded jar.
[INFO] Including args4j:args4j:jar:2.0.31 in the shaded jar.
[INFO] Including net.codingwell:scala-guice_2.10:jar:4.0.0 in the shaded jar.
[INFO] Including com.google.inject:guice:jar:4.0 in the shaded jar.
[INFO] Including javax.inject:javax.inject:jar:1 in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including com.google.inject.extensions:guice-multibindings:jar:4.0 in the shaded jar.
[WARNING] annotations-2.0.0.jar, jsr305-1.3.9.jar define 34 overlapping classes: 
[WARNING]   - javax.annotation.Nonnegative
[WARNING]   - javax.annotation.CheckForSigned
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.Tainted
[WARNING]   - javax.annotation.meta.TypeQualifierValidator
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.Detainted
[WARNING]   - javax.annotation.Nonnull$Checker
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - 24 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0.jar with /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0-shaded.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.10 .......................................... SUCCESS [ 11.263 s]
[INFO] ADAM_2.10: Core .................................... SUCCESS [02:14 min]
[INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 14.594 s]
[INFO] ADAM_2.10: CLI ..................................... SUCCESS [ 55.031 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:35 min
[INFO] Finished at: 2016-05-14T10:53:10+08:00
[INFO] Final Memory: 77M/395M
[INFO] ------------------------------------------------------------------------
xubo@xubo:~/cloud/adam-2.10-0.19-git$ 

2. Running mvn test on ADAM under Ubuntu

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.10
[INFO] ADAM_2.10: Core
[INFO] ADAM_2.10: APIs for Java
[INFO] ADAM_2.10: CLI
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent_2.10 ---
[INFO] Modified 0 of 199 .scala files
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: Core 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-core_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core_2.10 ---
[INFO] Modified 0 of 159 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-core_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 65 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-core_2.10 ---
Discovery starting.
Discovery completed in 1 second, 545 milliseconds.
Run starting. Expected test count is: 373
RichGenotypeSuite:
- different ploidy
- all types for diploid genotype
ReferenceUtilsSuite:
- unionReferenceSet: empty
- unionReferenceSet: one region
- unionReferenceSet: multiple regions on one contig, all overlap
- unionReferenceSet: multiple regions on one contig, some overlap
- unionReferenceSet: multiple regions on multiple contigs
MdTagSuite:
- null md tag
- zero length md tag
- md tag with non-digit initial value
- md tag invalid base
- md tag, pure insertion
- md tag, pure insertion, test 2
- md tag pure insertion equality
- md tag equality and hashcode
- valid md tags
- get start of read with no mismatches or deletions
- get start of read with no mismatches, but with a deletion at the start
- get start of read with mismatches at the start
- get end of read with no mismatches or deletions
- check that mdtag and rich record return same end
- get end of read with no mismatches, but a deletion at end
- CIGAR with N operator
- CIGAR with multiple N operators
- CIGAR with P operators
- Get correct matches for mdtag with insertion
- Get correct matches for mdtag with mismatches and insertion
- Get correct matches for mdtag with insertion between mismatches
- Get correct matches for mdtag with intron between mismatches
- Get correct matches for mdtag with intron and deletion between mismatches
- Throw exception when number of deleted bases in mdtag disagrees with CIGAR
- Get correct matches for mdtag with mismatch, insertion and deletion
- Get correct matches for mdtag with mismatches, insertion and deletion
- Get correct matches for MDTag with mismatches and deletions
- Get correct matches base from MDTag and CIGAR with N
- get end of read with mismatches and a deletion at end
- get correct string out of mdtag with no mismatches
- get correct string out of mdtag with mismatches at start
- get correct string out of mdtag with deletion at end
- get correct string out of mdtag with mismatches at end
- get correct string out of complex mdtag
- check complex mdtag
- move a cigar alignment by two for a read
- rewrite alignment to all matches
- rewrite alignment to two mismatches followed by all matches
- rewrite alignment to include a deletion but otherwise all matches
- rewrite alignment to include an insertion at the start of the read but otherwise all matches
- create new md tag from read vs. reference, perfect match
- create new md tag from read vs. reference, perfect alignment match, 1 mismatch
- create new md tag from read vs. reference, alignment with deletion
- create new md tag from read vs. reference, alignment with insert
- handle '=' and 'X' operators
- CIGAR/MD tag mismatch should cause errors
GenotypesToVariantsConverterSuite:
- Simple test of integer RMS
- Simple test of floating point RMS
- Max genotype quality should lead to max variant quality
- Genotype quality = 0.5 for two samples should lead to variant quality of 0.75
PairingRDDSuite:
2016-05-14 11:07:09 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:07:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an empty RDD returns an empty RDD
2016-05-14 11:07:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() < width returns an empty RDD
2016-05-14 11:07:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() == width returns an RDD with one element.
2016-05-14 11:07:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on a small RDD works correctly
2016-05-14 11:07:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding works correctly on a partitioned RDD
2016-05-14 11:07:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a simple sequence works
2016-05-14 11:07:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing an empty sequence returns an empty sequence
2016-05-14 11:07:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a sorted sequence works
2016-05-14 11:07:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds on an empty sequence returns an empty sequence
2016-05-14 11:07:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds gives us the right number and set of values
ADAMVariationRDDFunctionsSuite:
2016-05-14 11:07:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover samples from variant context
2016-05-14 11:07:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joins SNV database annotation
2016-05-14 11:07:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can write, then read in .vcf file
SingleReadBucketSuite:
- convert unmapped pair to fragment
- convert proper pair to fragment
- convert read pair to fragment with first of pair chimeric read
FlankReferenceFragmentsSuite:
- don't put flanks on non-adjacent fragments
- put flanks on adjacent fragments
ReferencePositionSuite:
- create reference position from mapped read
- create reference position from variant
- create reference position from genotype
AlignmentRecordRDDFunctionsSuite:
2016-05-14 11:07:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sorting reads
2016-05-14 11:07:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts integer tag values correctly
2016-05-14 11:07:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag returns only those records which have the appropriate tag
2016-05-14 11:07:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag, when given a tag name that doesn't exist in the input, returns an empty RDD
2016-05-14 11:07:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTagValues counts distinct values of a tag
2016-05-14 11:07:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts tags in a SAM file correctly
2016-05-14 11:07:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to SAM and back to ADAM produces equivalent Read values
2016-05-14 11:07:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- SAM conversion sets read mapped flag properly
2016-05-14 11:07:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert malformed FASTQ (no quality scores) => SAM => well-formed FASTQ => SAM
2016-05-14 11:07:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to FASTQ and back to ADAM produces equivalent Read values
2016-05-14 11:07:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to paired-FASTQ and back to ADAM produces equivalent Read values
2016-05-14 11:07:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:33 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:07:33 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing a small sorted file as SAM should produce the expected result
2016-05-14 11:07:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:33 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:07:33 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing unordered sam from unordered sam
2016-05-14 11:07:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:34 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:07:34 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing ordered sam from unordered sam
2016-05-14 11:07:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:35 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:07:35 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single sam file back
2016-05-14 11:07:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:36 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:07:36 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single bam file back
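The repeated RawLocalFileSystem warnings during the save tests above are expected when running against the local filesystem: ADAM first tries to merge the output shards through the Hadoop FileSystem API, and when the local filesystem does not implement that call, it retries as a manual copy through the driver, which is slower but still produces a single output file. The fallback shape, sketched in Python with hypothetical names (`merge_via_fs_api` stands in for the unimplemented Hadoop call, this is not ADAM's actual code):

```python
import shutil

def merge_via_fs_api(shard_paths, out_path):
    # Stands in for the Hadoop FileSystem merge that RawLocalFileSystem
    # does not implement, hence the exception in the log.
    raise NotImplementedError("Not implemented by the RawLocalFileSystem")

def manual_merge(shard_paths, out_path):
    # Slow path: stream every shard through the driver into one file.
    with open(out_path, "wb") as out:
        for shard in shard_paths:
            with open(shard, "rb") as f:
                shutil.copyfileobj(f, out)

def merge(shard_paths, out_path):
    try:
        merge_via_fs_api(shard_paths, out_path)
    except NotImplementedError:
        # Matches the WARN pair above: retry as a manual copy,
        # degrading performance but still succeeding.
        manual_merge(shard_paths, out_path)
```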
FlattenerSuite:
- Flatten schema and record
ShuffleRegionJoinSuite:
2016-05-14 11:07:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Overlapping reference regions
2016-05-14 11:07:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Multiple reference regions do not throw exception
2016-05-14 11:07:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- RegionJoin2 contains the same results as cartesianRegionJoin
ConsensusGeneratorFromReadsSuite:
2016-05-14 11:07:39 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking search for consensus list for artificial reads
NucleotideContigFragmentRDDFunctionsSuite:
2016-05-14 11:07:39 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- generate sequence dict from fasta
2016-05-14 11:07:39 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from a single contig fragment
2016-05-14 11:07:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from a single contig fragment
2016-05-14 11:07:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from multiple contig fragments
2016-05-14 11:07:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from multiple contig fragments
2016-05-14 11:07:41 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment as FASTA text file
2016-05-14 11:07:41 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with description as FASTA text file
2016-05-14 11:07:41 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fields as FASTA text file
2016-05-14 11:07:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fragment number as FASTA text file
2016-05-14 11:07:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null number of fragments in contig as FASTA text file
2016-05-14 11:07:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments from same contig as FASTA text file
2016-05-14 11:07:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments with description from same contig as FASTA text file
2016-05-14 11:07:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment null fragment number
2016-05-14 11:07:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment number zero
2016-05-14 11:07:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge multiple contig fragments
FragmentConverterSuite:
- build a fragment collector and convert to a read
2016-05-14 11:07:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of discontinuous fragments, all from the same contig
2016-05-14 11:07:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of contiguous fragments, all from the same contig
2016-05-14 11:07:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of varied fragments from multiple contigs
InterleavedFastqInputFormatSuite:
2016-05-14 11:07:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample1.ifq->interleaved_fastq_sample1.ifq.output
2016-05-14 11:07:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample2.ifq->interleaved_fastq_sample2.ifq.output
2016-05-14 11:07:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample3.ifq->interleaved_fastq_sample3.ifq.output
2016-05-14 11:07:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample4.ifq->interleaved_fastq_sample4.ifq.output
ReferenceRegionSuite:
- contains(: ReferenceRegion)
- contains(: ReferencePosition)
- merge
- overlaps
- distance(: ReferenceRegion)
- distance(: ReferencePosition)
- create region from unmapped read fails
- create region from mapped read contains read start and end
- validate that adjacent regions can be merged
- validate that non-adjacent regions cannot be merged
- compute convex hull of two sets
- region name is sanitized when creating region from read
- intersection fails on non-overlapping regions
- compute intersection
- overlap tests for oriented reference region
- check the width of a reference region
AlignmentRecordConverterSuite:
- testing the fields in a converted ADAM Read
- converting a read with null quality is OK
- convert a read to fastq
- reverse complement reads when converting to fastq
- converting to fastq with unmapped reads
- converting a fragment with no alignments should yield unaligned reads
- converting a fragment with alignments should restore the alignments
IntervalListReaderSuite:
- Can read the simple GATK-supplied example interval list file
SingleFastqInputFormatSuite:
2016-05-14 11:07:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample1.fq->single_fastq_sample1.fq.output
2016-05-14 11:07:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample2.fq->single_fastq_sample2.fq.output
2016-05-14 11:07:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample3.fq->single_fastq_sample3.fq.output
2016-05-14 11:07:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample4.fq->single_fastq_sample4.fq.output
RegExpSuite:
- matches returns Some(matcher) when a complete match is found
- find returns Some(matcher) when a partial match is found
AttributeUtilsSuite:
- parseTags returns a reasonable set of tagStrings
- parseTags works with NumericSequence tagType
- empty string is parsed as zero tagStrings
- incorrectly formatted tag throws an exception
- string tag with a ':' in it is correctly parsed
MarkDuplicatesSuite:
2016-05-14 11:07:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- single read
2016-05-14 11:07:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at different positions
2016-05-14 11:07:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position
2016-05-14 11:07:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position with clipping
2016-05-14 11:07:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads on reverse strand
2016-05-14 11:07:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- unmapped reads
2016-05-14 11:07:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs
2016-05-14 11:07:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs with fragments
- quality scores
2016-05-14 11:07:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs that cross chromosomes
IndelRealignmentTargetSuite:
2016-05-14 11:07:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking simple realignment target
2016-05-14 11:07:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with deletion
2016-05-14 11:07:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with insertion
2016-05-14 11:07:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on same chr
2016-05-14 11:07:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on different chr throws exception
2016-05-14 11:07:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, same indel
2016-05-14 11:07:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, two different indel
2016-05-14 11:07:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from two disjoint reads
2016-05-14 11:07:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: one-by-one
2016-05-14 11:07:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: all-at-once (merged)
2016-05-14 11:07:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating indel targets for mason reads
TwoBitSuite:
- correctly read sequence from .2bit file
- correctly return masked sequences from .2bit file
- correctly return Ns from .2bit file
ConsensusSuite:
- test the insertion of a consensus insertion into a reference
- test the insertion of a consensus deletion into a reference
- inserting empty consensus returns the reference
DecadentReadSuite:
- reference position of decadent read
- reference position of decadent read with insertions
- build a decadent read from a read with null qual
- converting bad read should fail
2016-05-14 11:07:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:51 WARN  DecadentRead:64 - Converting read {"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null} to decadent read failed with java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null}). Skipping...
- convert an RDD that has an bad read in it with loose validation
2016-05-14 11:07:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:52 ERROR Executor:96 - Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)
	at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)
	at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
	at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
	at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)
	at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)
	at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
	at scala.collection.immutable.List.foldLeft(List.scala:84)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)
	at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)
	... 15 more
2016-05-14 11:07:52 WARN  TaskSetManager:71 - Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)
	at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)
	at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
	at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
	at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)
	at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)
	at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)
	at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
	at scala.collection.immutable.List.foldLeft(List.scala:84)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)
	at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)
	at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)
	at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)
	... 15 more

2016-05-14 11:07:52 ERROR TaskSetManager:75 - Task 1 in stage 0.0 failed 1 times; aborting job
- converting an RDD that has an bad read in it with strict validation will throw an error
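The MatchError above is the expected outcome of this test: with strict validation, a deliberately bad read (MD tag "3^C71" against CIGAR "4S1M1D71M") makes MdTag.apply fail while pairing an MD token with a deletion operator. To illustrate what an MD string encodes, here is a simplified, hypothetical tokenizer — `MdTokenizer` is an invented name, not ADAM's actual MdTag parser:

```scala
// Hypothetical, simplified MD-tag tokenizer (not ADAM's MdTag implementation).
// "3^C71" encodes: 3 matching bases, a deletion of reference base "C",
// then 71 more matching bases.
object MdTokenizer {
  sealed trait Token
  case class Matches(n: Int) extends Token
  case class Mismatch(base: Char) extends Token
  case class Deletion(bases: String) extends Token

  private val pattern = "([0-9]+)|([ACGTN])|(\\^[ACGTN]+)".r

  def tokenize(md: String): List[Token] =
    pattern.findAllIn(md).map { t =>
      if (t.head == '^') Deletion(t.tail)       // "^C"  -> deleted reference bases
      else if (t.head.isDigit) Matches(t.toInt) // "3"   -> run of matches
      else Mismatch(t.head)                     // "A"   -> mismatched reference base
    }.toList
}
```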
AlphabetSuite:
- test size of a case-sensitive alphabet
- test apply of a case-sensitive alphabet
- test reverse complement of a case-sensitive alphabet
- test exact reverse complement of a case-sensitive alphabet
- test size of a case-insensitive alphabet
- test apply of a case-insensitive alphabet
- test reverse complement of a case-insensitive alphabet
- test exact reverse complement of a case-insensitive alphabet
- DNA alphabet
- map unknown bases to N
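The AlphabetSuite behavior above — case-insensitive complements and mapping unknown bases to N — can be sketched in a few lines. This is a hypothetical stand-in (`Dna` is an invented object), not ADAM's Alphabet class:

```scala
// Hypothetical sketch of a case-insensitive DNA alphabet with reverse
// complement, mapping unknown bases to N (mirrors what AlphabetSuite checks).
object Dna {
  private val complement =
    Map('A' -> 'T', 'T' -> 'A', 'C' -> 'G', 'G' -> 'C', 'N' -> 'N')

  def reverseComplement(s: String): String =
    s.reverse.map(c => complement.getOrElse(c.toUpper, 'N'))
}
```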
FieldEnumerationSuite:
2016-05-14 11:07:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- Empty projections are illegal
2016-05-14 11:07:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:07:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Simple projection on Read works
BroadcastRegionJoinSuite:
- alternating returns an alternating seq of items
- Single region returns itself
- Two adjacent regions will be merged
- Nonoverlapping regions will all be returned
- Many overlapping regions will all be merged
- ADAMRecords return proper references
2016-05-14 11:07:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Ensure same reference regions get passed together
2016-05-14 11:07:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Overlapping reference regions
2016-05-14 11:07:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Multiple reference regions do not throw exception
2016-05-14 11:07:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- regionJoin contains the same results as cartesianRegionJoin
CoverageSuite:
- regionToWindows
- calculate empty coverage
- calculate coverage of one region
- calculate coverage of two regions
- calculate coverage of three regions
- calculate coverage of two adjacent regions
- calculate coverage of two nearby regions
- calculate coverage of three out-of-order regions
- calculate coverage of two regions which join at a window boundary
2016-05-14 11:07:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find empty coverage
2016-05-14 11:07:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of one region
2016-05-14 11:07:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions
2016-05-14 11:07:58 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three regions
2016-05-14 11:08:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two adjacent regions
2016-05-14 11:08:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two nearby regions
2016-05-14 11:08:02 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three out-of-order regions
2016-05-14 11:08:03 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions which join at a window boundary
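The CoverageSuite cases above all reduce to counting, per reference position, how many regions overlap it. A minimal sketch of that idea over half-open `(start, end)` intervals — an invented `Coverage.depth` helper, not ADAM's Coverage implementation:

```scala
// Hypothetical per-position coverage counter over half-open regions
// (start inclusive, end exclusive), illustrating what CoverageSuite verifies.
object Coverage {
  def depth(regions: Seq[(Long, Long)]): Map[Long, Int] =
    regions
      .flatMap { case (start, end) => start until end } // expand to positions
      .groupBy(identity)                                // bucket by position
      .map { case (pos, hits) => pos -> hits.size }     // count overlaps
}
```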
VariantContextConverterSuite:
- Convert GATK site-only SNV to ADAM
- Convert GATK site-only SNV to ADAM with contig conversion
- Convert GATK site-only CNV to ADAM
- Convert GATK SNV w/ genotypes w/ phase information to ADAM
- Convert GATK SNV with different filters to ADAM
- Convert ADAM site-only SNV to GATK
- Convert ADAM site-only SNV to GATK with contig conversion
- Convert ADAM SNV w/ genotypes to GATK
- Convert GATK multi-allelic sites-only SNVs to ADAM
- Convert GATK multi-allelic SNVs to ADAM
- Convert gVCF reference records to ADAM
AttributeSuite:
- test SAMTagAndValue parsing
- Attributes can be correctly re-encoded as text SAM tags
SAMRecordConverterSuite:
- testing the fields in an alignmentRecord obtained from a mapped samRecord conversion
- testing the fields in an alignmentRecord obtained from an unmapped samRecord conversion
- '*' quality gets nulled out
ADAMContextSuite:
2016-05-14 11:08:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not fail on unmapped reads
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not load a file without a type specified
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM file
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM with all attribute tag types
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can filter a .SAM file based on quality
- Can convert to phred
- Can convert from phred
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- findFiles correctly finds a nested set of directories
2016-05-14 11:08:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- loadADAMFromPaths can load simple RDDs that have just been saved
2016-05-14 11:08:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .gtf file
2016-05-14 11:08:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .bed file
2016-05-14 11:08:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .narrowPeak file
2016-05-14 11:08:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .interval_list file
2016-05-14 11:08:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .vcf file
2016-05-14 11:08:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 1
2016-05-14 11:08:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 2
2016-05-14 11:08:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 3
2016-05-14 11:08:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 4
2016-05-14 11:08:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 1
2016-05-14 11:08:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 2
2016-05-14 11:08:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 3
2016-05-14 11:08:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 4
2016-05-14 11:08:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- filter on load using the filter2 API
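Among the ADAMContextSuite cases above are "Can convert to phred" and "Can convert from phred". The underlying arithmetic is the standard Phred relation Q = -10·log10(error probability); the sketch below uses invented names and is not ADAM's PhredUtils:

```scala
// Hypothetical Phred conversion helpers (standard Phred arithmetic,
// not ADAM's PhredUtils API).
object Phred {
  // success probability p -> phred score: Q = -10 * log10(1 - p)
  def successProbabilityToPhred(p: Double): Int =
    math.round(-10.0 * math.log10(1.0 - p)).toInt

  // phred score Q -> success probability: p = 1 - 10^(-Q/10)
  def phredToSuccessProbability(q: Int): Double =
    1.0 - math.pow(10.0, -q / 10.0)
}
```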
RichCigarSuite:
- moving 2 bp from a deletion to a match operator
- moving 2 bp from a insertion to a match operator
- moving 1 base in a two element cigar
- move to start of read
UtilSuite:
- isSameConfig
MDTaggingSuite:
2016-05-14 11:08:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags over boundary
2016-05-14 11:08:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads span full contig
2016-05-14 11:08:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment
2016-05-14 11:08:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads end inside last fragment
2016-05-14 11:08:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment and end inside last fragment
2016-05-14 11:08:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start and end in middle fragements
NormalizationUtilsSuite:
- cannot move an indel left if there are no bases to it's left
- move a simple indel to farthest position left until bases run out
- move a simple indel to farthest position left, past length of indel
- cannot move a left normalized indel in a short tandem repeat
- move an indel in a short tandem repeat
- move an indel in a short tandem repeat of more than 2 bases, where shift is not an integer multiple of repeated sequence length
- moving a simple read with single deletion that cannot shift
- shift an indel left by 0 in a cigar
- shift an indel left by 1 in a cigar
- do not left align a complex read which is already left aligned
SmithWatermanSuite:
- gather max position from simple scoring matrix
- gather max position from irregular scoring matrix
- gather max position from irregular scoring matrix with deletions
- score simple alignment with constant gap
- score irregular scoring matrix
- score irregular scoring matrix with indel
- can unroll cigars correctly
- execute simple trackback
- execute trackback with indel
- run end to end smith waterman for simple reads
- run end to end smith waterman for short sequences with indel
- run end to end smith waterman for longer sequences with snp
- run end to end smith waterman for longer sequences with short indel
- run end to end smith waterman for shorter sequence in longer sequence
- run end to end smith waterman for shorter sequence in longer sequence, with indel
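The SmithWatermanSuite above exercises local alignment with a constant gap penalty. As a generic textbook sketch of the scoring recurrence (not ADAM's SmithWatermanConstantGapScoring class), the best local score is the maximum cell of a dynamic-programming matrix floored at zero:

```scala
// Textbook Smith-Waterman local-alignment score with constant gap penalty.
// Generic sketch only; parameter names and defaults are assumptions.
object SmithWaterman {
  def score(a: String, b: String,
            matchScore: Double = 1.0,
            mismatch: Double = -1.0,
            gap: Double = -1.0): Double = {
    val m = Array.ofDim[Double](a.length + 1, b.length + 1) // zero-initialized
    var best = 0.0
    for (i <- 1 to a.length; j <- 1 to b.length) {
      val sub = if (a(i - 1) == b(j - 1)) matchScore else mismatch
      // local alignment: never let a cell drop below 0
      m(i)(j) = Seq(0.0,
                    m(i - 1)(j - 1) + sub, // substitution / match
                    m(i - 1)(j) + gap,     // gap in b
                    m(i)(j - 1) + gap).max // gap in a
      best = math.max(best, m(i)(j))
    }
    best
  }
}
```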
FastaConverterSuite:
2016-05-14 11:08:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find contig index
- convert a single record without naming information
- convert a single record with naming information
2016-05-14 11:08:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert single fasta sequence
2016-05-14 11:08:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences
2016-05-14 11:08:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences; short fragment
2016-05-14 11:08:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert reference fasta file
IndelTableSuite:
- check for indels in a region with known indels
- check for indels in a contig that doesn't exist
- check for indels in a region without known indels
2016-05-14 11:08:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- build indel table from rdd of variants
RealignIndelsSuite:
2016-05-14 11:08:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking mapping to targets for artificial reads
2016-05-14 11:08:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking alternative consensus for artificial reads
2016-05-14 11:08:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking extraction of reference from reads
2016-05-14 11:08:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking realigned reads for artificial input
2016-05-14 11:08:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring
2016-05-14 11:08:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring for no mismatches
2016-05-14 11:08:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring after unpacking read
- we shouldn't try to realign a region with no target
2016-05-14 11:08:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- we shouldn't try to realign reads with no indel evidence
2016-05-14 11:08:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test OP and OC tags
BaseQualityRecalibrationSuite:
2016-05-14 11:08:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1
2016-05-14 11:08:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1 w/ VCF Sites
ProgramRecordSuite:
- convert a sam program record with no optional fields into a program record and v/v
- convert a sam program record into a program record and v/v
SequenceDictionarySuite:
- Convert from sam sequence record and back
- Convert from SAM sequence dictionary file (with extra fields)
- merge into existing dictionary
- Convert from SAM sequence dictionary and back
- Can retrieve sequence by name
- SequenceDictionary's with same single element are equal
- SequenceDictionary's with same two elements are equals
- SequenceDictionary's with different elements are unequal
- SequenceDictionaries with same elements in different order are compatible
- isCompatible tests equality on overlap
- The addition + works correctly
- The append operation ++ works correctly
- ContainsRefName works correctly for different string types
- Apply on name works correctly for different String types
- convert from sam sequence record and back
- convert from sam sequence dictionary and back
- conversion to sam sequence dictionary has correct sort order
GenomicPositionPartitionerSuite:
- partitions the UNMAPPED ReferencePosition into the top partition
- if we do not have a contig for a record, we throw an IAE
- partitioning into N pieces on M total sequence length, where N > M, results in M partitions
- correctly partitions a single dummy sequence into two pieces
- correctly counts cumulative lengths
- correctly partitions positions across two dummy sequences
2016-05-14 11:08:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:08:30 WARN  TaskSetManager:71 - Stage 0 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
2016-05-14 11:08:30 WARN  TaskSetManager:71 - Stage 1 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
- test that we can range partition ADAMRecords
2016-05-14 11:08:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test that simple partitioning works okay on a reasonable set of ADAMRecords
RichAlignmentRecordSuite:
- referenceLengthFromCigar
- Unclipped Start
- Unclipped End
- Illumina Optics
- Cigar Clipping Sequence
- tags contains optional fields
- Reference Positions
- read overlap unmapped read
- read overlap reference position
- read overlap same position different contig
GeneSuite:
2016-05-14 11:08:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from an Ensembl GTF file
2016-05-14 11:08:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from a Gencode GTF file
2016-05-14 11:08:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- chr20 gencode transcript sequences match the published sequences
MapToolsSuite:
- add two nonzero integer maps
- add two nonzero float maps
- adding an empty map is the identity
RecordGroupDictionarySuite:
- simple conversion to and from sam read group
- sample name must be set
- simple equality checks
Run completed in 1 minute, 26 seconds.
Total number of tests run: 373
Suites: completed 52, aborted 0
Tests: succeeded 373, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
May 14, 2016 11:08:11 AM INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
May 14, 2016 11:08:11 AM INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: APIs for Java 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-apis_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis_2.10 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
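These warnings are harmless for this build: json4s-core 3.2.10 transitively declares scala-library 2.10.0 while every other dependency wants 2.10.4, and Maven mediates them to a single version anyway. If you want to silence the warning, one common approach is to pin scala-library explicitly — a hypothetical pom.xml fragment, untested against this project:

```xml
<!-- Hypothetical pom.xml fragment: pin scala-library so transitive
     dependencies (e.g. json4s-core, which declares 2.10.0) resolve to
     a single version matching the build's Scala 2.10.4. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```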
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-apis_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-apis_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-apis_2.10 ---
Discovery starting.
Discovery completed in 465 milliseconds.
Run starting. Expected test count is: 2
JavaADAMContextSuite:
2016-05-14 11:08:37 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:08:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM file
2016-05-14 11:08:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- can read a small .SAM file inside of java
Run completed in 7 seconds, 879 milliseconds.
Total number of tests run: 2
Suites: completed 2, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: CLI 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.0:revision (default) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0-alpha-3:filter-sources (filter-src) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-cli_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli_2.10 ---
[INFO] Modified 0 of 36 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates:-1: info: compiling
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala:-1: info: compiling
[INFO] Compiling 26 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes at 1463195327139
[INFO] prepare-compile in 0 s
[INFO] compile in 22 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-cli_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-cli_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 9 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-cli_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-cli_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-cli_2.10 ---
Discovery starting.
Discovery completed in 392 milliseconds.
Run starting. Expected test count is: 33
TransformSuite:
2016-05-14 11:09:12 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:09:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:17 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:09:17 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to unordered sam
2016-05-14 11:09:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:18 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:09:18 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to ordered sam
2016-05-14 11:09:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2016-05-14 11:09:20 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:09:20 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam, to adam, to sam
2016-05-14 11:09:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:21 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:09:21 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam, to adam, to ordered sam
FlattenSuite:
2016-05-14 11:09:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can flatten a simple VCF file
FlagStatSuite:
2016-05-14 11:09:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Standard FlagStat test
ViewSuite:
2016-05-14 11:09:23 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 0 -F 0 is a no-op
2016-05-14 11:09:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- no -f or -F args is a no-op
2016-05-14 11:09:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 4: only unmapped reads
2016-05-14 11:09:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 4: only mapped reads
2016-05-14 11:09:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 4 -F 8: unmapped reads with mapped mates
2016-05-14 11:09:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 12: unmapped reads with unmapped mates
2016-05-14 11:09:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -g 12: reads that are unmapped or whose mate is unmapped
2016-05-14 11:09:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 12: mapped reads with mapped mates
2016-05-14 11:09:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -g 36: unmapped reads or reads with mate on negative strand
2016-05-14 11:09:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 36: unmapped reads or reads with mate on negative strand
PluginExecutorSuite:
2016-05-14 11:09:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- take10 works correctly on example SAM
2016-05-14 11:09:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- java take10 works correctly on example SAM
2016-05-14 11:09:31 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- takeN works correctly on example SAM with arg of '3'
ADAM2FastaSuite:
2016-05-14 11:09:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip FASTA to nucleotide contig fragments in ADAM format to FASTA
Features2ADAMSuite:
- can convert a simple BED file !!! IGNORED !!!
2016-05-14 11:09:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can convert a simple wigfix file
AboutSuite:
- template variables have been replaced
- templated values are not empty
ADAMMainSuite:
- default command groups is not empty
- module provides default command groups
- inject default command groups when called via main
- command groups is empty when called via apply
- single command group
- add new command group to default command groups
- module restores default command groups when called via apply
- custom module with single command group
- custom module with new command group added to default command groups
Adam2FastqSuite:
2016-05-14 11:09:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:09:35 ERROR AlignmentRecordRDDFunctions:75 - Found 16 read names that don't occur exactly twice:
    1x: 16

Samples:
    SRR062634.16445865
    SRR062634.9119161
    SRR062634.17190076
    SRR062634.17969132
    SRR062634.7301099
    SRR062634.2087100
    SRR062634.20911784
    SRR062634.16769670
    SRR062634.18958430
    SRR062634.12099057
    SRR062634.12606172
    SRR062634.14985224
    SRR062634.10448889
    SRR062634.4789722
    SRR062634.3203184
    SRR062634.17698657
- convert SAM to paired FASTQ
Run completed in 24 seconds, 919 milliseconds.
Total number of tests run: 33
Suites: completed 11, aborted 0
Tests: succeeded 33, failed 0, canceled 0, ignored 1, pending 0
All tests passed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.10 .......................................... SUCCESS [  8.840 s]
[INFO] ADAM_2.10: Core .................................... SUCCESS [01:40 min]
[INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 10.696 s]
[INFO] ADAM_2.10: CLI ..................................... SUCCESS [ 52.030 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:52 min
[INFO] Finished at: 2016-05-14T11:09:36+08:00
[INFO] Final Memory: 48M/400M
[INFO] ------------------------------------------------------------------------

3. mvn install log:
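The exact command for this step is not captured in the paste below; based on the section title and the build directory shown in the log, it would presumably be the following (note that `mvn install` builds every module, runs the ScalaTest suites via scalatest-maven-plugin, and then copies the resulting jars and poms into the local repository under `~/.m2`):

```shell
# Presumed command for this step (not shown verbatim in the pasted log).
# Run from the ADAM source checkout; artifacts are installed into
# ~/.m2/repository/org/bdgenomics/adam/... as seen in the log output.
cd ~/cloud/adam-2.10-0.19-git
mvn install
```

To skip the several minutes of tests, `mvn install -DskipTests` works the same way as the `mvn clean package -DskipTests` invocation in step 1.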

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] ADAM_2.10
[INFO] ADAM_2.10: Core
[INFO] ADAM_2.10: APIs for Java
[INFO] ADAM_2.10: CLI
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-parent_2.10 ---
[INFO] Modified 0 of 199 .scala files
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-parent_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-parent_2.10 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-parent_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-parent_2.10 ---
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-parent_2.10 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-parent_2.10 ---
[INFO] No source files found
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ adam-parent_2.10 ---
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/pom.xml to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-parent_2.10/0.19.0/adam-parent_2.10-0.19.0.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: Core 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-core_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-core_2.10 ---
[INFO] Modified 0 of 159 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-core_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 65 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-core_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-core_2.10 ---
Discovery starting.
Discovery completed in 1 second, 532 milliseconds.
Run starting. Expected test count is: 373
RichGenotypeSuite:
- different ploidy
- all types for diploid genotype
ReferenceUtilsSuite:
- unionReferenceSet: empty
- unionReferenceSet: one region
- unionReferenceSet: multiple regions on one contig, all overlap
- unionReferenceSet: multiple regions on one contig, some overlap
- unionReferenceSet: multiple regions on multiple contigs
MdTagSuite:
- null md tag
- zero length md tag
- md tag with non-digit initial value
- md tag invalid base
- md tag, pure insertion
- md tag, pure insertion, test 2
- md tag pure insertion equality
- md tag equality and hashcode
- valid md tags
- get start of read with no mismatches or deletions
- get start of read with no mismatches, but with a deletion at the start
- get start of read with mismatches at the start
- get end of read with no mismatches or deletions
- check that mdtag and rich record return same end
- get end of read with no mismatches, but a deletion at end
- CIGAR with N operator
- CIGAR with multiple N operators
- CIGAR with P operators
- Get correct matches for mdtag with insertion
- Get correct matches for mdtag with mismatches and insertion
- Get correct matches for mdtag with insertion between mismatches
- Get correct matches for mdtag with intron between mismatches
- Get correct matches for mdtag with intron and deletion between mismatches
- Throw exception when number of deleted bases in mdtag disagrees with CIGAR
- Get correct matches for mdtag with mismatch, insertion and deletion
- Get correct matches for mdtag with mismatches, insertion and deletion
- Get correct matches for MDTag with mismatches and deletions
- Get correct matches base from MDTag and CIGAR with N
- get end of read with mismatches and a deletion at end
- get correct string out of mdtag with no mismatches
- get correct string out of mdtag with mismatches at start
- get correct string out of mdtag with deletion at end
- get correct string out of mdtag with mismatches at end
- get correct string out of complex mdtag
- check complex mdtag
- move a cigar alignment by two for a read
- rewrite alignment to all matches
- rewrite alignment to two mismatches followed by all matches
- rewrite alignment to include a deletion but otherwise all matches
- rewrite alignment to include an insertion at the start of the read but otherwise all matches
- create new md tag from read vs. reference, perfect match
- create new md tag from read vs. reference, perfect alignment match, 1 mismatch
- create new md tag from read vs. reference, alignment with deletion
- create new md tag from read vs. reference, alignment with insert
- handle '=' and 'X' operators
- CIGAR/MD tag mismatch should cause errors
GenotypesToVariantsConverterSuite:
- Simple test of integer RMS
- Simple test of floating point RMS
- Max genotype quality should lead to max variant quality
- Genotype quality = 0.5 for two samples should lead to variant quality of 0.75
PairingRDDSuite:
2016-05-14 11:14:14 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:14:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an empty RDD returns an empty RDD
2016-05-14 11:14:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() < width returns an empty RDD
2016-05-14 11:14:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on an RDD where count() == width returns an RDD with one element.
2016-05-14 11:14:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding on a small RDD works correctly
2016-05-14 11:14:22 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sliding works correctly on a partitioned RDD
2016-05-14 11:14:24 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a simple sequence works
2016-05-14 11:14:25 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing an empty sequence returns an empty sequence
2016-05-14 11:14:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairing a sorted sequence works
2016-05-14 11:14:26 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds on an empty sequence returns an empty sequence
2016-05-14 11:14:27 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- pairWithEnds gives us the right number and set of values
ADAMVariationRDDFunctionsSuite:
2016-05-14 11:14:28 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover samples from variant context
2016-05-14 11:14:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joins SNV database annotation
2016-05-14 11:14:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can write, then read in .vcf file
SingleReadBucketSuite:
- convert unmapped pair to fragment
- convert proper pair to fragment
- convert read pair to fragment with first of pair chimeric read
FlankReferenceFragmentsSuite:
- don't put flanks on non-adjacent fragments
- put flanks on adjacent fragments
ReferencePositionSuite:
- create reference position from mapped read
- create reference position from variant
- create reference position from genotype
AlignmentRecordRDDFunctionsSuite:
2016-05-14 11:14:30 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sorting reads
2016-05-14 11:14:32 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts integer tag values correctly
2016-05-14 11:14:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag returns only those records which have the appropriate tag
2016-05-14 11:14:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- withTag, when given a tag name that doesn't exist in the input, returns an empty RDD
2016-05-14 11:14:33 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTagValues counts distinct values of a tag
2016-05-14 11:14:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- characterizeTags counts tags in a SAM file correctly
2016-05-14 11:14:34 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to SAM and back to ADAM produces equivalent Read values
2016-05-14 11:14:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- SAM conversion sets read mapped flag properly
2016-05-14 11:14:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert malformed FASTQ (no quality scores) => SAM => well-formed FASTQ => SAM
2016-05-14 11:14:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to FASTQ and back to ADAM produces equivalent Read values
2016-05-14 11:14:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip from ADAM to paired-FASTQ and back to ADAM produces equivalent Read values
2016-05-14 11:14:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing a small sorted file as SAM should produce the expected result
2016-05-14 11:14:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing unordered sam from unordered sam
2016-05-14 11:14:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:14:38 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- writing ordered sam from unordered sam
2016-05-14 11:14:38 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:39 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:14:39 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single sam file back
2016-05-14 11:14:40 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:41 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:14:41 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- write single bam file back
FlattenerSuite:
- Flatten schema and record
ShuffleRegionJoinSuite:
2016-05-14 11:14:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Overlapping reference regions
2016-05-14 11:14:42 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Multiple reference regions do not throw exception
2016-05-14 11:14:43 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- RegionJoin2 contains the same results as cartesianRegionJoin
ConsensusGeneratorFromReadsSuite:
2016-05-14 11:14:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking search for consensus list for artificial reads
NucleotideContigFragmentRDDFunctionsSuite:
2016-05-14 11:14:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- generate sequence dict from fasta
2016-05-14 11:14:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from a single contig fragment
2016-05-14 11:14:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from a single contig fragment
2016-05-14 11:14:44 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover reference string from multiple contig fragments
2016-05-14 11:14:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- recover trimmed reference string from multiple contig fragments
2016-05-14 11:14:45 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment as FASTA text file
2016-05-14 11:14:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with description as FASTA text file
2016-05-14 11:14:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fields as FASTA text file
2016-05-14 11:14:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null fragment number as FASTA text file
2016-05-14 11:14:46 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save single contig fragment with null number of fragments in contig as FASTA text file
2016-05-14 11:14:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments from same contig as FASTA text file
2016-05-14 11:14:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- save multiple contig fragments with description from same contig as FASTA text file
2016-05-14 11:14:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment null fragment number
2016-05-14 11:14:47 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge single contig fragment number zero
2016-05-14 11:14:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- merge multiple contig fragments
FragmentConverterSuite:
- build a fragment collector and convert to a read
2016-05-14 11:14:48 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of discontinuous fragments, all from the same contig
2016-05-14 11:14:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of contiguous fragments, all from the same contig
2016-05-14 11:14:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert an rdd of varied fragments from multiple contigs
InterleavedFastqInputFormatSuite:
2016-05-14 11:14:49 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample1.ifq->interleaved_fastq_sample1.ifq.output
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample2.ifq->interleaved_fastq_sample2.ifq.output
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample3.ifq->interleaved_fastq_sample3.ifq.output
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- interleaved FASTQ hadoop reader: interleaved_fastq_sample4.ifq->interleaved_fastq_sample4.ifq.output
ReferenceRegionSuite:
- contains(: ReferenceRegion)
- contains(: ReferencePosition)
- merge
- overlaps
- distance(: ReferenceRegion)
- distance(: ReferencePosition)
- create region from unmapped read fails
- create region from mapped read contains read start and end
- validate that adjacent regions can be merged
- validate that non-adjacent regions cannot be merged
- compute convex hull of two sets
- region name is sanitized when creating region from read
- intersection fails on non-overlapping regions
- compute intersection
- overlap tests for oriented reference region
- check the width of a reference region
AlignmentRecordConverterSuite:
- testing the fields in a converted ADAM Read
- converting a read with null quality is OK
- convert a read to fastq
- reverse complement reads when converting to fastq
- converting to fastq with unmapped reads
- converting a fragment with no alignments should yield unaligned reads
- converting a fragment with alignments should restore the alignments
IntervalListReaderSuite:
- Can read the simple GATK-supplied example interval list file
SingleFastqInputFormatSuite:
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample1.fq->single_fastq_sample1.fq.output
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample2.fq->single_fastq_sample2.fq.output
2016-05-14 11:14:50 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample3.fq->single_fastq_sample3.fq.output
2016-05-14 11:14:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- FASTQ hadoop reader: fastq_sample4.fq->single_fastq_sample4.fq.output
RegExpSuite:
- matches returns Some(matcher) when a complete match is found
- find returns Some(matcher) when a partial match is found
AttributeUtilsSuite:
- parseTags returns a reasonable set of tagStrings
- parseTags works with NumericSequence tagType
- empty string is parsed as zero tagStrings
- incorrectly formatted tag throws an exception
- string tag with a ':' in it is correctly parsed
MarkDuplicatesSuite:
2016-05-14 11:14:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- single read
2016-05-14 11:14:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at different positions
2016-05-14 11:14:51 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position
2016-05-14 11:14:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads at the same position with clipping
2016-05-14 11:14:52 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- reads on reverse strand
2016-05-14 11:14:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- unmapped reads
2016-05-14 11:14:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs
2016-05-14 11:14:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs with fragments
- quality scores
2016-05-14 11:14:53 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- read pairs that cross chromosomes
IndelRealignmentTargetSuite:
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking simple realignment target
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with deletion
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating simple target from read with insertion
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on same chr
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- joining simple realignment targets on different chr throws exception
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, same indel
2016-05-14 11:14:54 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from three intersecting reads, two different indel
2016-05-14 11:14:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets from two disjoint reads
2016-05-14 11:14:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: one-by-one
2016-05-14 11:14:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating targets for artificial reads: all-at-once (merged)
2016-05-14 11:14:55 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- creating indel targets for mason reads
TwoBitSuite:
- correctly read sequence from .2bit file
- correctly return masked sequences from .2bit file
- correctly return Ns from .2bit file
ConsensusSuite:
- test the insertion of a consensus insertion into a reference
- test the insertion of a consensus deletion into a reference
- inserting empty consensus returns the reference
DecadentReadSuite:
- reference position of decadent read
- reference position of decadent read with insertions
- build a decadent read from a read with null qual
- converting bad read should fail
2016-05-14 11:14:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:56 WARN  DecadentRead:64 - Converting read {"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null} to decadent read failed with java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null}). Skipping...
- convert an RDD that has an bad read in it with loose validation
2016-05-14 11:14:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:56 ERROR Executor:96 - Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)
    at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)
    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
    at scala.collection.immutable.List.foldLeft(List.scala:84)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)
    at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)
    ... 15 more
2016-05-14 11:14:56 WARN  TaskSetManager:71 - Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.IllegalArgumentException: Error "(D,Some(3)) (of class scala.Tuple2)" while constructing DecadentRead from Read({"readInFragment": 0, "contig": {"contigName": "1", "contigLength": null, "contigMD5": null, "referenceURL": null, "assembly": null, "species": null, "referenceIndex": null}, "start": 248262648, "oldPosition": null, "end": 248262721, "mapq": 23, "readName": null, "sequence": "GATCTTTTCAACAGTTACAGCAGAAAGTTTTCATGGAGAAATGGAATCACACTTCAAATGATTTCATTTTGTTGGG", "qual": "IBBHEFFEKFCKFHFACKFIJFJDCFHFEEDJBCHIFIDDBCGJDBBJAJBJFCIDCACHBDEBHADDDADDAED;", "cigar": "4S1M1D71M", "oldCigar": null, "basesTrimmedFromStart": 0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true, "mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand": false, "mateNegativeStrand": false, "primaryAlignment": false, "secondaryAlignment": false, "supplementaryAlignment": false, "mismatchingPositions": "3^C71", "origQual": null, "attributes": null, "recordGroupName": null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContig": null, "inferredInsertSize": null})
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:42)
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:34)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:57)
    at org.bdgenomics.adam.rich.DecadentRead$$anonfun$cloy$1.apply(DecadentRead.scala:55)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1125)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: scala.MatchError: (D,Some(3)) (of class scala.Tuple2)
    at org.bdgenomics.adam.util.MdTag$.apply(MdTag.scala:71)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag$lzycompute(RichAlignmentRecord.scala:98)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.mdTag(RichAlignmentRecord.scala:96)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceBase$1(RichAlignmentRecord.scala:176)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.getReferenceContext(RichAlignmentRecord.scala:192)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:213)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1$$anonfun$2.apply(RichAlignmentRecord.scala:211)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:211)
    at org.bdgenomics.adam.rich.RichAlignmentRecord$$anonfun$1.apply(RichAlignmentRecord.scala:200)
    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
    at scala.collection.immutable.List.foldLeft(List.scala:84)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts$lzycompute(RichAlignmentRecord.scala:200)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referenceContexts(RichAlignmentRecord.scala:198)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions$lzycompute(RichAlignmentRecord.scala:196)
    at org.bdgenomics.adam.rich.RichAlignmentRecord.referencePositions(RichAlignmentRecord.scala:196)
    at org.bdgenomics.adam.rich.DecadentRead.<init>(DecadentRead.scala:91)
    at org.bdgenomics.adam.rich.DecadentRead$.apply(DecadentRead.scala:38)
    ... 15 more

2016-05-14 11:14:56 ERROR TaskSetManager:75 - Task 1 in stage 0.0 failed 1 times; aborting job
- converting an RDD that has an bad read in it with strict validation will throw an error
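The exception and aborted job above are expected output, not a build failure: both DecadentRead tests feed in a read whose MD tag ("3^C71") does not agree with its CIGAR ("4S1M1D71M"), so MdTag parsing fails with the MatchError shown. Under lenient validation ADAM logs the bad read and skips it; under strict validation the error propagates and aborts the job. A minimal sketch of that skip-or-throw pattern (the names `convert`, `convertAll`, and the validation objects are illustrative, not ADAM's API):

```scala
// Skip-or-throw conversion, mirroring the lenient vs. strict
// validation behaviour seen in the test log above.
sealed trait ValidationMode
case object Lenient extends ValidationMode
case object Strict extends ValidationMode

// Stand-in for a record conversion that can fail on malformed input.
def convert(raw: String): Int =
  raw.toInt // throws NumberFormatException on a "bad read"

def convertAll(records: Seq[String], mode: ValidationMode): Seq[Int] =
  records.flatMap { r =>
    try {
      Some(convert(r))
    } catch {
      case e: NumberFormatException => mode match {
        case Lenient =>
          // log and drop the record, keep processing the rest
          Console.err.println(s"Skipping bad record '$r': ${e.getMessage}")
          None
        case Strict =>
          // propagate, aborting the whole job
          throw e
      }
    }
  }
```

With `Lenient`, `convertAll(Seq("1", "oops", "3"), Lenient)` drops the bad record and returns the rest; with `Strict`, the same input throws.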
AlphabetSuite:
- test size of a case-sensitive alphabet
- test apply of a case-sensitive alphabet
- test reverse complement of a case-sensitive alphabet
- test exact reverse complement of a case-sensitive alphabet
- test size of a case-insensitive alphabet
- test apply of a case-insensitive alphabet
- test reverse complement of a case-insensitive alphabet
- test exact reverse complement of a case-insensitive alphabet
- DNA alphabet
- map unknown bases to N
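The AlphabetSuite cases above cover case-sensitive and case-insensitive DNA alphabets, reverse complements, and mapping unknown bases to N. The core operation is small enough to sketch standalone (a case-insensitive variant; ADAM's actual Alphabet classes are richer):

```scala
// Reverse-complement a DNA string; unknown bases map to 'N'.
// Case-insensitive: lowercase input is complemented to uppercase here.
def reverseComplement(dna: String): String = {
  val comp = Map('A' -> 'T', 'T' -> 'A', 'C' -> 'G', 'G' -> 'C')
  dna.reverse.map(b => comp.getOrElse(b.toUpper, 'N'))
}
```

For example, "ACGT" is its own reverse complement, and any base outside {A, C, G, T} comes back as N.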
FieldEnumerationSuite:
2016-05-14 11:14:56 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- Empty projections are illegal
2016-05-14 11:14:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:14:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Simple projection on Read works
BroadcastRegionJoinSuite:
- alternating returns an alternating seq of items
- Single region returns itself
- Two adjacent regions will be merged
- Nonoverlapping regions will all be returned
- Many overlapping regions will all be merged
- ADAMRecords return proper references
2016-05-14 11:14:58 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Ensure same reference regions get passed together
2016-05-14 11:14:58 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Overlapping reference regions
2016-05-14 11:14:59 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Multiple reference regions do not throw exception
2016-05-14 11:14:59 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- regionJoin contains the same results as cartesianRegionJoin
CoverageSuite:
- regionToWindows
- calculate empty coverage
- calculate coverage of one region
- calculate coverage of two regions
- calculate coverage of three regions
- calculate coverage of two adjacent regions
- calculate coverage of two nearby regions
- calculate coverage of three out-of-order regions
- calculate coverage of two regions which join at a window boundary
2016-05-14 11:15:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find empty coverage
2016-05-14 11:15:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of one region
2016-05-14 11:15:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions
2016-05-14 11:15:02 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three regions
2016-05-14 11:15:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two adjacent regions
2016-05-14 11:15:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two nearby regions
2016-05-14 11:15:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of three out-of-order regions
2016-05-14 11:15:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find coverage of two regions which join at a window boundary
VariantContextConverterSuite:
- Convert GATK site-only SNV to ADAM
- Convert GATK site-only SNV to ADAM with contig conversion
- Convert GATK site-only CNV to ADAM
- Convert GATK SNV w/ genotypes w/ phase information to ADAM
- Convert GATK SNV with different filters to ADAM
- Convert ADAM site-only SNV to GATK
- Convert ADAM site-only SNV to GATK with contig conversion
- Convert ADAM SNV w/ genotypes to GATK
- Convert GATK multi-allelic sites-only SNVs to ADAM
- Convert GATK multi-allelic SNVs to ADAM
- Convert gVCF reference records to ADAM
AttributeSuite:
- test SAMTagAndValue parsing
- Attributes can be correctly re-encoded as text SAM tags
SAMRecordConverterSuite:
- testing the fields in an alignmentRecord obtained from a mapped samRecord conversion
- testing the fields in an alignmentRecord obtained from an unmapped samRecord conversion
- '*' quality gets nulled out
ADAMContextSuite:
2016-05-14 11:15:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not fail on unmapped reads
2016-05-14 11:15:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- sc.loadParquet should not load a file without a type specified
2016-05-14 11:15:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM file
2016-05-14 11:15:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM with all attribute tag types
2016-05-14 11:15:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can filter a .SAM file based on quality
- Can convert to phred
- Can convert from phred
2016-05-14 11:15:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- findFiles correctly finds a nested set of directories
2016-05-14 11:15:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- loadADAMFromPaths can load simple RDDs that have just been saved
2016-05-14 11:15:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .gtf file
2016-05-14 11:15:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .bed file
2016-05-14 11:15:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .narrowPeak file
2016-05-14 11:15:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Can read a .interval_list file
2016-05-14 11:15:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .vcf file
2016-05-14 11:15:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 1
2016-05-14 11:15:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 2
2016-05-14 11:15:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 3
2016-05-14 11:15:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from interleaved FASTQ: 4
2016-05-14 11:15:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 1
2016-05-14 11:15:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 2
2016-05-14 11:15:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 3
2016-05-14 11:15:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- import records from single ended FASTQ: 4
2016-05-14 11:15:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- filter on load using the filter2 API
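Among the ADAMContextSuite cases, the two phred-conversion tests cover the standard Phred scale, Q = -10 * log10(p), where p is the base-call error probability. A minimal version of that round trip:

```scala
// Phred quality score from an error probability, and back.
def toPhred(errorProb: Double): Int =
  math.round(-10.0 * math.log10(errorProb)).toInt

def fromPhred(q: Int): Double =
  math.pow(10.0, -q / 10.0)
```

So an error probability of 0.001 corresponds to Q30, and Q30 converts back to 0.001.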
RichCigarSuite:
- moving 2 bp from a deletion to a match operator
- moving 2 bp from a insertion to a match operator
- moving 1 base in a two element cigar
- move to start of read
UtilSuite:
- isSameConfig
MDTaggingSuite:
2016-05-14 11:15:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags over boundary
2016-05-14 11:15:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads span full contig
2016-05-14 11:15:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment
2016-05-14 11:15:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads end inside last fragment
2016-05-14 11:15:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start inside first fragment and end inside last fragment
2016-05-14 11:15:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test adding MDTags; reads start and end in middle fragements
NormalizationUtilsSuite:
- cannot move an indel left if there are no bases to it's left
- move a simple indel to farthest position left until bases run out
- move a simple indel to farthest position left, past length of indel
- cannot move a left normalized indel in a short tandem repeat
- move an indel in a short tandem repeat
- move an indel in a short tandem repeat of more than 2 bases, where shift is not an integer multiple of repeated sequence length
- moving a simple read with single deletion that cannot shift
- shift an indel left by 0 in a cigar
- shift an indel left by 1 in a cigar
- do not left align a complex read which is already left aligned
SmithWatermanSuite:
- gather max position from simple scoring matrix
- gather max position from irregular scoring matrix
- gather max position from irregular scoring matrix with deletions
- score simple alignment with constant gap
- score irregular scoring matrix
- score irregular scoring matrix with indel
- can unroll cigars correctly
- execute simple trackback
- execute trackback with indel
- run end to end smith waterman for simple reads
- run end to end smith waterman for short sequences with indel
- run end to end smith waterman for longer sequences with snp
- run end to end smith waterman for longer sequences with short indel
- run end to end smith waterman for shorter sequence in longer sequence
- run end to end smith waterman for shorter sequence in longer sequence, with indel
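The SmithWatermanSuite above runs the classic local-alignment recurrence end to end, including traceback to CIGARs. A compact, self-contained version of just the scoring step with a linear gap penalty (the scoring parameters here are illustrative defaults, not ADAM's):

```scala
// Smith-Waterman local alignment score with a linear gap penalty.
// Each cell holds the best score of any local alignment ending at (i, j);
// the overall answer is the maximum over all cells.
def smithWatermanScore(a: String, b: String,
                       matchScore: Int = 2,
                       mismatch: Int = -1,
                       gap: Int = -2): Int = {
  val h = Array.ofDim[Int](a.length + 1, b.length + 1)
  var best = 0
  for (i <- 1 to a.length; j <- 1 to b.length) {
    val s = if (a(i - 1) == b(j - 1)) matchScore else mismatch
    h(i)(j) = List(0,                     // start a fresh alignment
                   h(i - 1)(j - 1) + s,   // match / mismatch
                   h(i - 1)(j) + gap,     // gap in b (deletion)
                   h(i)(j - 1) + gap      // gap in a (insertion)
                  ).max
    if (h(i)(j) > best) best = h(i)(j)
  }
  best
}
```

Because every cell is floored at zero, two sequences with no common substring score 0, while "CGTA" embedded in "ACGTACGT" scores a full four matches.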
FastaConverterSuite:
2016-05-14 11:15:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- find contig index
- convert a single record without naming information
- convert a single record with naming information
2016-05-14 11:15:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert single fasta sequence
2016-05-14 11:15:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences
2016-05-14 11:15:17 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert fasta with multiple sequences; short fragment
2016-05-14 11:15:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- convert reference fasta file
IndelTableSuite:
- check for indels in a region with known indels
- check for indels in a contig that doesn't exist
- check for indels in a region without known indels
2016-05-14 11:15:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- build indel table from rdd of variants
RealignIndelsSuite:
2016-05-14 11:15:18 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking mapping to targets for artificial reads
2016-05-14 11:15:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking alternative consensus for artificial reads
2016-05-14 11:15:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking extraction of reference from reads
2016-05-14 11:15:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- checking realigned reads for artificial input
2016-05-14 11:15:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring
2016-05-14 11:15:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring for no mismatches
2016-05-14 11:15:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test mismatch quality scoring after unpacking read
- we shouldn't try to realign a region with no target
2016-05-14 11:15:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- we shouldn't try to realign reads with no indel evidence
2016-05-14 11:15:20 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test OP and OC tags
BaseQualityRecalibrationSuite:
2016-05-14 11:15:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1
2016-05-14 11:15:29 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- BQSR Test Input #1 w/ VCF Sites
ProgramRecordSuite:
- convert a sam program record with no optional fields into a program record and v/v
- convert a sam program record into a program record and v/v
SequenceDictionarySuite:
- Convert from sam sequence record and back
- Convert from SAM sequence dictionary file (with extra fields)
- merge into existing dictionary
- Convert from SAM sequence dictionary and back
- Can retrieve sequence by name
- SequenceDictionary's with same single element are equal
- SequenceDictionary's with same two elements are equals
- SequenceDictionary's with different elements are unequal
- SequenceDictionaries with same elements in different order are compatible
- isCompatible tests equality on overlap
- The addition + works correctly
- The append operation ++ works correctly
- ContainsRefName works correctly for different string types
- Apply on name works correctly for different String types
- convert from sam sequence record and back
- convert from sam sequence dictionary and back
- conversion to sam sequence dictionary has correct sort order
GenomicPositionPartitionerSuite:
- partitions the UNMAPPED ReferencePosition into the top partition
- if we do not have a contig for a record, we throw an IAE
- partitioning into N pieces on M total sequence length, where N > M, results in M partitions
- correctly partitions a single dummy sequence into two pieces
- correctly counts cumulative lengths
- correctly partitions positions across two dummy sequences
2016-05-14 11:15:35 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:15:35 WARN  TaskSetManager:71 - Stage 0 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
2016-05-14 11:15:35 WARN  TaskSetManager:71 - Stage 1 contains a task of very large size (131 KB). The maximum recommended task size is 100 KB.
- test that we can range partition ADAMRecords
2016-05-14 11:15:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- test that simple partitioning works okay on a reasonable set of ADAMRecords
RichAlignmentRecordSuite:
- referenceLengthFromCigar
- Unclipped Start
- Unclipped End
- Illumina Optics
- Cigar Clipping Sequence
- tags contains optional fields
- Reference Positions
- read overlap unmapped read
- read overlap reference position
- read overlap same position different contig
GeneSuite:
2016-05-14 11:15:36 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from an Ensembl GTF file
2016-05-14 11:15:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can load a set of gene models from a Gencode GTF file
2016-05-14 11:15:37 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- chr20 gencode transcript sequences match the published sequences
MapToolsSuite:
- add two nonzero integer maps
- add two nonzero float maps
- adding an empty map is the identity
RecordGroupDictionarySuite:
- simple conversion to and from sam read group
- sample name must be set
- simple equality checks
Run completed in 1 minute, 26 seconds.
Total number of tests run: 373
Suites: completed 52, aborted 0
Tests: succeeded 373, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
May 14, 2016 11:15:14 AM INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
May 14, 2016 11:15:14 AM INFO: org.apache.parquet.filter2.compat.FilterCompat: Filtering using predicate: eq(start, 16097631)
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-jar-plugin:2.6:test-jar (default) @ adam-core_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0-tests.jar
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-core_2.10 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-core_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-core_2.10 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-core_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-misc_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-metrics_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-io_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.utils:utils-cli_2.10:0.2.4 requires scala version: 2.10.4
[WARNING]  org.scoverage:scalac-scoverage-plugin_2.10:1.1.1 requires scala version: 2.10.4
[WARNING]  org.bdgenomics.adam:adam-core_2.10:0.19.0 requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
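A note on the "Multiple versions of scala libraries detected!" warning above: it fires because json4s-core 3.2.10 declares a transitive requirement on scala-library 2.10.0 while every other dependency requires 2.10.4. All 2.10.x releases are binary compatible, so the warning is harmless for this build. If you want to silence it, one option is to pin scala-library in the parent POM; the snippet below is only a sketch (the exact POM location in the ADAM sources is assumed, not taken from the log):

```xml
<!-- Sketch: pin scala-library to a single 2.10.x version so mixed
     transitive requirements (e.g. json4s asking for 2.10.0) all
     resolve to 2.10.4. Goes in the parent pom.xml. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

`mvn dependency:tree` will show which artifact pulls in the odd version if you want to confirm before pinning.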
warning: there were 33 feature warning(s); re-run with -feature for details
model contains 258 documentable templates
/home/xubo/cloud/adam-2.10-0.19-git/adam-core/src/main/scala/org/bdgenomics/adam/models/VariantContext.scala:24: warning: Could not find any member to link for "org.bdgenomics.adam.rdd.variation.VariationContext".
/**
^
two warnings found
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0-javadoc.jar
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ adam-core_2.10 ---
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-core_2.10/0.19.0/adam-core_2.10-0.19.0.jar
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-core/pom.xml to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-core_2.10/0.19.0/adam-core_2.10-0.19.0.pom
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0-tests.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-core_2.10/0.19.0/adam-core_2.10-0.19.0-tests.jar
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-core/target/adam-core_2.10-0.19.0-javadoc.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-core_2.10/0.19.0/adam-core_2.10-0.19.0-javadoc.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: APIs for Java 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-apis_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-apis_2.10 ---
[INFO] Modified 0 of 4 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-apis_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-apis_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/test/resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-apis_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-apis_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-apis_2.10 ---
Discovery starting.
Discovery completed in 270 milliseconds.
Run starting. Expected test count is: 2
JavaADAMContextSuite:
2016-05-14 11:16:17 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:16:19 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can read a small .SAM file
2016-05-14 11:16:21 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
- can read a small .SAM file inside of java
Run completed in 6 seconds, 913 milliseconds.
Total number of tests run: 2
Suites: completed 2, aborted 0
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
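The `SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"` lines in the test run above mean no SLF4J binding was found on that test classpath, so SLF4J falls back to its no-operation logger: the tests still pass, only their log output is dropped. If you want the logging, a binding can be added in test scope; this is a sketch, with the version matching the slf4j-log4j12 1.7.12 artifact that appears in the shaded-jar listing of this same build:

```xml
<!-- Sketch: provide an SLF4J binding in test scope so SLF4J stops
     defaulting to the NOP logger during the adam-apis tests. -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.12</version>
  <scope>test</scope>
</dependency>
```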
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- maven-jar-plugin:2.6:test-jar (default) @ adam-apis_2.10 ---
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-apis_2.10 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-apis_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-apis_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-apis_2.10 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-apis_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
warning: there were 2 feature warning(s); re-run with -feature for details
model contains 11 documentable templates
one warning found
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0-javadoc.jar
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ adam-apis_2.10 ---
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-apis_2.10/0.19.0/adam-apis_2.10-0.19.0.jar
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/pom.xml to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-apis_2.10/0.19.0/adam-apis_2.10-0.19.0.pom
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0-tests.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-apis_2.10/0.19.0/adam-apis_2.10-0.19.0-tests.jar
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-apis/target/adam-apis_2.10-0.19.0-javadoc.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-apis_2.10/0.19.0/adam-apis_2.10-0.19.0-javadoc.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building ADAM_2.10: CLI 0.19.0
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.0:revision (default) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0-alpha-3:filter-sources (filter-src) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-cli_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala added.
[INFO] 
[INFO] --- scalariform-maven-plugin:0.1.4:format (default-cli) @ adam-cli_2.10 ---
[INFO] Modified 0 of 36 .scala files
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates:-1: info: compiling
[INFO] /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala:-1: info: compiling
[INFO] Compiling 26 source files to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes at 1463195790028
[INFO] prepare-compile in 0 s
[INFO] compile in 22 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ adam-cli_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/scala-2.10.4/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ adam-cli_2.10 ---
[INFO] Test Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 9 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ adam-cli_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ adam-cli_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ adam-cli_2.10 ---
Discovery starting.
Discovery completed in 317 milliseconds.
Run starting. Expected test count is: 33
TransformSuite:
2016-05-14 11:16:55 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-14 11:16:57 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:16:59 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:16:59 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to unordered sam
2016-05-14 11:17:00 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:01 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:17:01 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam to ordered sam
2016-05-14 11:17:01 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
2016-05-14 11:17:02 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:17:02 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam, to adam, to sam
2016-05-14 11:17:03 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:04 WARN  AlignmentRecordRDDFunctions:520 - Caught exception when merging via Hadoop FileSystem API:
java.lang.UnsupportedOperationException: Not implemented by the RawLocalFileSystem FileSystem implementation
2016-05-14 11:17:04 WARN  AlignmentRecordRDDFunctions:521 - Retrying as manual copy from the driver which will degrade performance.
- unordered sam, to adam, to ordered sam
FlattenSuite:
2016-05-14 11:17:04 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can flatten a simple VCF file
FlagStatSuite:
2016-05-14 11:17:05 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- Standard FlagStat test
ViewSuite:
2016-05-14 11:17:06 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:07 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 0 -F 0 is a no-op
2016-05-14 11:17:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:08 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- no -f or -F args is a no-op
2016-05-14 11:17:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 4: only unmapped reads
2016-05-14 11:17:09 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 4: only mapped reads
2016-05-14 11:17:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:10 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 4 -F 8: unmapped reads with mapped mates
2016-05-14 11:17:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -f 12: unmapped reads with unmapped mates
2016-05-14 11:17:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:11 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -g 12: reads that are unmapped or whose mate is unmapped
2016-05-14 11:17:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 12: mapped reads with mapped mates
2016-05-14 11:17:12 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -g 36: unmapped reads or reads with mate on negative strand
2016-05-14 11:17:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:13 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- -F 36: unmapped reads or reads with mate on negative strand
PluginExecutorSuite:
2016-05-14 11:17:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- take10 works correctly on example SAM
2016-05-14 11:17:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- java take10 works correctly on example SAM
2016-05-14 11:17:14 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- takeN works correctly on example SAM with arg of '3'
ADAM2FastaSuite:
2016-05-14 11:17:15 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- round trip FASTA to nucleotide contig fragments in ADAM format to FASTA
Features2ADAMSuite:
- can convert a simple BED file !!! IGNORED !!!
2016-05-14 11:17:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
- can convert a simple wigfix file
AboutSuite:
- template variables have been replaced
- templated values are not empty
ADAMMainSuite:
- default command groups is not empty
- module provides default command groups
- inject default command groups when called via main
- command groups is empty when called via apply
- single command group
- add new command group to default command groups
- module restores default command groups when called via apply
- custom module with single command group
- custom module with new command group added to default command groups
Adam2FastqSuite:
2016-05-14 11:17:16 WARN  MetricsSystem:71 - Using default name DAGScheduler for source because spark.app.id is not set.
2016-05-14 11:17:18 ERROR AlignmentRecordRDDFunctions:75 - Found 16 read names that don't occur exactly twice:
    1x: 16

Samples:
    SRR062634.16445865
    SRR062634.9119161
    SRR062634.17190076
    SRR062634.17969132
    SRR062634.7301099
    SRR062634.2087100
    SRR062634.20911784
    SRR062634.16769670
    SRR062634.18958430
    SRR062634.12099057
    SRR062634.12606172
    SRR062634.14985224
    SRR062634.10448889
    SRR062634.4789722
    SRR062634.3203184
    SRR062634.17698657
- convert SAM to paired FASTQ
Run completed in 25 seconds, 412 milliseconds.
Total number of tests run: 33
Suites: completed 11, aborted 0
Tests: succeeded 33, failed 0, canceled 0, ignored 1, pending 0
All tests passed.
[INFO] 
[INFO] --- maven-jar-plugin:2.6:jar (default-jar) @ adam-cli_2.10 ---
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0.jar
[INFO] 
[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ adam-cli_2.10 ---
[INFO] Including commons-cli:commons-cli:jar:1.2 in the shaded jar.
[INFO] Including commons-httpclient:commons-httpclient:jar:3.1 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.4 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.1.3 in the shaded jar.
[INFO] Including org.apache.commons:commons-compress:jar:1.4.1 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.13 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13 in the shaded jar.
[INFO] Including com.google.code.findbugs:jsr305:jar:1.3.9 in the shaded jar.
[INFO] Including org.slf4j:slf4j-api:jar:1.7.10 in the shaded jar.
[INFO] Including log4j:log4j:jar:1.2.17 in the shaded jar.
[INFO] Including org.xerial.snappy:snappy-java:jar:1.1.1.7 in the shaded jar.
[INFO] Including com.thoughtworks.paranamer:paranamer:jar:2.6 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-io_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-misc_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.5.1 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.3 in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-cli_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-avro:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-column:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-common:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-encoding:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-hadoop:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-jackson:jar:1.8.1 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-format:jar:2.3.0-incubating in the shaded jar.
[INFO] Including org.bdgenomics.utils:utils-metrics_2.10:jar:0.2.4 in the shaded jar.
[INFO] Including com.netflix.servo:servo-core:jar:0.10.0 in the shaded jar.
[INFO] Including com.google.code.findbugs:annotations:jar:2.0.0 in the shaded jar.
[INFO] Including com.netflix.servo:servo-internal:jar:0.10.0 in the shaded jar.
[INFO] Including org.scoverage:scalac-scoverage-plugin_2.10:jar:1.1.1 in the shaded jar.
[INFO] Including org.bdgenomics.bdg-formats:bdg-formats:jar:0.7.0 in the shaded jar.
[INFO] Including org.apache.avro:avro:jar:1.7.7 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-core_2.10:jar:0.19.0 in the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.24.0 in the shaded jar.
[INFO] Including com.esotericsoftware.minlog:minlog:jar:1.2 in the shaded jar.
[INFO] Including org.objenesis:objenesis:jar:2.1 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:1.3.2 in the shaded jar.
[INFO] Including it.unimi.dsi:fastutil:jar:6.6.5 in the shaded jar.
[INFO] Including org.apache.parquet:parquet-scala_2.10:jar:1.8.1 in the shaded jar.
[INFO] Including org.seqdoop:hadoop-bam:jar:7.1.0 in the shaded jar.
[INFO] Including com.github.samtools:htsjdk:jar:1.139 in the shaded jar.
[INFO] Including org.apache.commons:commons-jexl:jar:2.1.1 in the shaded jar.
[INFO] Including org.tukaani:xz:jar:1.5 in the shaded jar.
[INFO] Including org.apache.ant:ant:jar:1.8.2 in the shaded jar.
[INFO] Including org.apache.ant:ant-launcher:jar:1.8.2 in the shaded jar.
[INFO] Including com.google.guava:guava:jar:16.0.1 in the shaded jar.
[INFO] Including org.bdgenomics.adam:adam-apis_2.10:jar:0.19.0 in the shaded jar.
[INFO] Including org.scala-lang:scala-library:jar:2.10.4 in the shaded jar.
[INFO] Including org.slf4j:slf4j-log4j12:jar:1.7.12 in the shaded jar.
[INFO] Including args4j:args4j:jar:2.0.31 in the shaded jar.
[INFO] Including net.codingwell:scala-guice_2.10:jar:4.0.0 in the shaded jar.
[INFO] Including com.google.inject:guice:jar:4.0 in the shaded jar.
[INFO] Including javax.inject:javax.inject:jar:1 in the shaded jar.
[INFO] Including aopalliance:aopalliance:jar:1.0 in the shaded jar.
[INFO] Including com.google.inject.extensions:guice-multibindings:jar:4.0 in the shaded jar.
[WARNING] annotations-2.0.0.jar, jsr305-1.3.9.jar define 34 overlapping classes: 
[WARNING]   - javax.annotation.Nonnegative
[WARNING]   - javax.annotation.CheckForSigned
[WARNING]   - javax.annotation.CheckForNull
[WARNING]   - javax.annotation.Tainted
[WARNING]   - javax.annotation.meta.TypeQualifierValidator
[WARNING]   - javax.annotation.meta.TypeQualifier
[WARNING]   - javax.annotation.Syntax
[WARNING]   - javax.annotation.Detainted
[WARNING]   - javax.annotation.Nonnull$Checker
[WARNING]   - javax.annotation.meta.TypeQualifierNickname
[WARNING]   - 24 more...
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin
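The overlapping-class warning above arises because both `com.google.code.findbugs:jsr305:1.3.9` and `com.google.code.findbugs:annotations:2.0.0` ship the same `javax.annotation.*` classes; the shade plugin simply keeps one copy, which is safe for annotation-only jars, so the warning can be ignored. If you prefer a clean build, one way is to exclude one of the two artifacts from the shaded jar. The snippet below is a sketch of the maven-shade-plugin configuration (it assumes annotations-2.0.0 covers the jsr305 classes, as the 34-class overlap listed above suggests):

```xml
<!-- Sketch: exclude the jsr305 artifact from the uber jar so its
     javax.annotation.* classes no longer collide with those in
     annotations-2.0.0. Goes inside the maven-shade-plugin
     <configuration> of adam-cli's pom.xml. -->
<artifactSet>
  <excludes>
    <exclude>com.google.code.findbugs:jsr305</exclude>
  </excludes>
</artifactSet>
```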
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0.jar with /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0-shaded.jar
[INFO] 
[INFO] >>> scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) > generate-sources @ adam-cli_2.10 >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-versions) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- git-commit-id-plugin:2.2.0:revision (default) @ adam-cli_2.10 ---
[INFO] 
[INFO] --- templating-maven-plugin:1.0-alpha-3:filter-sources (filter-src) @ adam-cli_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/generated-sources/java-templates added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ adam-cli_2.10 ---
[INFO] Source directory: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/src/main/scala added.
[INFO] 
[INFO] <<< scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) < generate-sources @ adam-cli_2.10 <<<
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) @ adam-cli_2.10 ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.5.2 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
[WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
[WARNING] Multiple versions of scala libraries detected!
model contains 81 documentable templates
[INFO] Building jar: /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0-javadoc.jar
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ adam-cli_2.10 ---
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-cli_2.10/0.19.0/adam-cli_2.10-0.19.0.jar
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/pom.xml to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-cli_2.10/0.19.0/adam-cli_2.10-0.19.0.pom
[INFO] Installing /home/xubo/cloud/adam-2.10-0.19-git/adam-cli/target/adam-cli_2.10-0.19.0-javadoc.jar to /home/xubo/.m2/repository/org/bdgenomics/adam/adam-cli_2.10/0.19.0/adam-cli_2.10-0.19.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] ADAM_2.10 .......................................... SUCCESS [  9.207 s]
[INFO] ADAM_2.10: Core .................................... SUCCESS [02:15 min]
[INFO] ADAM_2.10: APIs for Java ........................... SUCCESS [ 14.505 s]
[INFO] ADAM_2.10: CLI ..................................... SUCCESS [01:12 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:51 min
[INFO] Finished at: 2016-05-14T11:17:40+08:00
[INFO] Final Memory: 75M/437M
[INFO] ------------------------------------------------------------------------
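The build succeeds, and the maven-shade-plugin warning above about 34 overlapping `javax.annotation` classes (from `annotations-2.0.0.jar` vs. `jsr305-1.3.9.jar`) is harmless, as the log itself notes. If you want to silence it, one option, sketched below and not part of the upstream ADAM build, is to exclude the duplicate `jsr305` artifact from the shaded jar in `adam-cli/pom.xml`, after confirming with `mvn dependency:tree -Ddetail=true` which dependency pulls it in:

```xml
<!-- Hypothetical sketch: drop the duplicate jsr305 artifact from the
     uber jar so only annotations-2.0.0 supplies the javax.annotation
     classes. This goes in the maven-shade-plugin configuration of
     adam-cli/pom.xml; verify the artifact coordinates first with
     mvn dependency:tree -Ddetail=true. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <exclude>com.google.code.findbugs:jsr305</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```

The "Multiple versions of scala libraries detected!" warnings are similarly benign here: everything requires Scala 2.10.4 except `json4s-core_2.10:3.2.10`, which declares 2.10.0, and both resolve to the same binary-compatible 2.10 line.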
