@(OTHERS)[maven]
Sometimes Maven fails while downloading artifacts during compile or package. This is usually a network problem, so trust yourself: you did not write anything wrong. For example:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project myusml: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test failed: Plugin org.apache.maven.plugins:maven-surefire-plugin:2.18.1 or one of its dependencies could not be resolved: Failed to collect dependencies at org.apache.maven.plugins:maven-surefire-plugin:jar:2.18.1 -> org.apache.maven.surefire:maven-surefire-common:jar:2.18.1: Failed to read artifact descriptor for org.apache.maven.surefire:maven-surefire-common:jar:2.18.1: Could not transfer artifact org.apache.maven.surefire:maven-surefire-common:pom:2.18.1 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake: SSL peer shut down incorrectly -> [Help 1]
Access to some foreign sites from mainland China is unstable, so add a domestic mirror of the central repository:
1. Add the domestic mirror to $M2_HOME/conf/settings.xml
(1) Configure the mirror:
```xml
...
<mirrors>
  <mirror>
    <id>osc</id>
    <mirrorOf>central</mirrorOf>
    <url>http://maven.oschina.net/content/groups/public/</url>
  </mirror>
  <mirror>
    <id>osc_thirdparty</id>
    <mirrorOf>thirdparty</mirrorOf>
    <url>http://maven.oschina.net/content/repositories/thirdparty/</url>
  </mirror>
</mirrors>
...
```
(2) Configure the profile:
```xml
...
<profiles>
  <profile>
    <id>osc</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <repositories>
      <repository>
        <id>osc</id>
        <url>http://maven.oschina.net/content/groups/public/</url>
      </repository>
      <repository>
        <id>osc_thirdparty</id>
        <url>http://maven.oschina.net/content/repositories/thirdparty/</url>
      </repository>
    </repositories>
    <pluginRepositories>
      <pluginRepository>
        <id>osc</id>
        <url>http://maven.oschina.net/content/groups/public/</url>
      </pluginRepository>
    </pluginRepositories>
  </profile>
</profiles>
...
```
See http://maven.oschina.net/help.html for details.
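If the oschina mirror is unavailable, any other domestic mirror can be configured the same way. A minimal sketch, assuming the Aliyun public repository URL (not part of the original notes):

```xml
<!-- Assumed alternative mirror: Aliyun public repository -->
<mirror>
  <id>aliyun</id>
  <mirrorOf>central</mirrorOf>
  <url>https://maven.aliyun.com/repository/public</url>
</mirror>
```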
2. If the following error appears:
Description Resource Path Location Type Could not calculate build plan: Failure to transfer org.apache.maven.plugins:maven-compiler-plugin:pom:2.0.2
run:
find ~/.m2 -name "*.lastUpdated" -exec grep -q "Could not transfer" {} \; -print -exec rm {} \;
The cause is that some earlier downloads failed and the failures were cached as *.lastUpdated files.
See http://stackoverflow.com/questions/5074063/maven-error-failure-to-transfer for details.
3. Configure Eclipse to use the locally installed Maven instead of the embedded one, and in User Settings point it to the settings.xml that contains the domestic mirrors.
Otherwise, when the central repository cannot be reached from mainland China, exceptions like the following appear:
Description Resource Path Location Type
Failed to read artifact descriptor for org.apache.storm:storm-core:jar:0.10.0
org.eclipse.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for org.apache.storm:storm-core:jar:0.10.0
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:302)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:218)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.resolveCachedArtifactDescriptor(DefaultDependencyCollector.java:535)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.getArtifactDescriptorResult(DefaultDependencyCollector.java:519)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:409)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:363)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.process(DefaultDependencyCollector.java:351)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:254)
at org.eclipse.aether.internal.impl.DefaultRepositorySystem.collectDependencies(DefaultRepositorySystem.java:316)
at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:172)
at org.apache.maven.project.DefaultProjectBuilder.resolveDependencies(DefaultProjectBuilder.java:215)
at org.apache.maven.project.DefaultProjectBuilder.build(DefaultProjectBuilder.java:188)
at org.apache.maven.project.DefaultProjectBuilder.build(DefaultProjectBuilder.java:119)
at org.eclipse.m2e.core.internal.embedder.MavenImpl.readMavenProject(MavenImpl.java:636)
at org.eclipse.m2e.core.internal.project.registry.DefaultMavenDependencyResolver.resolveProjectDependencies(DefaultMavenDependencyResolver.java:63)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refreshPhase2(ProjectRegistryManager.java:529)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager$3.call(ProjectRegistryManager.java:491)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager$3.call(ProjectRegistryManager.java:1)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.executeBare(MavenExecutionContext.java:176)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:151)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:495)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:350)
at org.eclipse.m2e.core.internal.project.registry.ProjectRegistryManager.refresh(ProjectRegistryManager.java:297)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.updateProjectConfiguration0(ProjectConfigurationManager.java:398)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$2.call(ProjectConfigurationManager.java:345)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager$2.call(ProjectConfigurationManager.java:1)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.executeBare(MavenExecutionContext.java:176)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:151)
at org.eclipse.m2e.core.internal.embedder.MavenExecutionContext.execute(MavenExecutionContext.java:99)
at org.eclipse.m2e.core.internal.embedder.MavenImpl.execute(MavenImpl.java:1351)
at org.eclipse.m2e.core.internal.project.ProjectConfigurationManager.updateProjectConfiguration(ProjectConfigurationManager.java:342)
at org.eclipse.m2e.core.ui.internal.UpdateMavenProjectJob.runInWorkspace(UpdateMavenProjectJob.java:77)
at org.eclipse.core.internal.resources.InternalWorkspaceJob.run(InternalWorkspaceJob.java:38)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Failure to transfer org.apache.storm:storm-core:pom:0.10.0 from https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced. Original error: Could not transfer artifact org.apache.storm:storm-core:pom:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:444)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:246)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:223)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:287)
... 33 more
Caused by: org.eclipse.aether.transfer.ArtifactTransferException: Failure to transfer org.apache.storm:storm-core:pom:0.10.0 from https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced. Original error: Could not transfer artifact org.apache.storm:storm-core:pom:0.10.0 from/to central (https://repo.maven.apache.org/maven2): Remote host closed connection during handshake
at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.newException(DefaultUpdateCheckManager.java:238)
at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.checkArtifact(DefaultUpdateCheckManager.java:206)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.gatherDownloads(DefaultArtifactResolver.java:585)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:503)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)
... 36 more
pom.xml /stormfilter line 1 Maven Dependency Problem
If that still does not work, run mvn clean package from the command line first and check the actual cause; it is very likely a failure to connect to the central repository. Since failed downloads are cached (note the "resolution will not be reattempted" message above), adding -U to the command forces Maven to re-check them.
By default, mvn package only packages your own code; it assumes that the dependencies it needs are already on the classpath of the machine it runs on. Sometimes, however, all of the project's dependencies must be bundled into the published jar (an uber jar). This can be done with the following plugin (placed under <build><plugins> in pom.xml):
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
```
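If the uber jar is meant to be launched directly with java -jar, the shade plugin can also write a Main-Class entry into the manifest. A minimal sketch, assuming a hypothetical main class com.example.Main; the block goes inside the <configuration> above:

```xml
<transformers>
  <!-- Writes Main-Class into MANIFEST.MF; com.example.Main is a hypothetical class name -->
  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
    <mainClass>com.example.Main</mainClass>
  </transformer>
</transformers>
```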
If some dependencies should not be packed into the jar, declare them with the provided scope. For example, when running a Spark program the Spark jars are always on the classpath, so Spark can be marked as provided:
```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.1</version>
  <scope>provided</scope>
</dependency>
```
Using this plugin may also lead to the following error:
2016-08-11 10:08:54,902 ERROR org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost: Failed to load coprocessor com.lujinhong.hbase.solr.indexer.SolrIndexerObserver
java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:286)
at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:239)
at java.util.jar.JarVerifier.processEntry(JarVerifier.java:317)
at java.util.jar.JarVerifier.update(JarVerifier.java:228)
at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
at java.util.jar.JarFile.getInputStream(JarFile.java:415)
at sun.misc.URLClassPath$JarLoader$2.getInputStream(URLClassPath.java:775)
at sun.misc.Resource.cachedInputStream(Resource.java:77)
at sun.misc.Resource.getByteBuffer(Resource.java:160)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:436)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at org.apache.hadoop.hbase.util.CoprocessorClassLoader.loadClass(CoprocessorClassLoader.java:292)
at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.load(CoprocessorHost.java:183)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.loadTableCoprocessors(RegionCoprocessorHost.java:326)
at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.(RegionCoprocessorHost.java:225)
at org.apache.hadoop.hbase.regionserver.HRegion.(HRegion.java:773)
at org.apache.hadoop.hbase.regionserver.HRegion.(HRegion.java:681)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:5677)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5987)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5959)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5915)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5866)
at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:356)
at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:126)
at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
The solution is described at:
http://stackoverflow.com/questions/999489/invalid-signature-file-when-attempting-to-run-a-jar
i.e. add the following filter to the plugin configuration (already included in the example above):
```xml
<filters>
  <filter>
    <artifact>*:*</artifact>
    <excludes>
      <exclude>META-INF/*.SF</exclude>
      <exclude>META-INF/*.DSA</exclude>
      <exclude>META-INF/*.RSA</exclude>
    </excludes>
  </filter>
</filters>
```
If a NoSuchMethodError or ClassNotFoundException occurs at runtime even though you are sure the class is on the classpath, it is because two different versions of the same jar are on the classpath. A common example is log4j: you add log4j.jar to the classpath, Spark's lib directory also ships a log4j.jar, and if the two versions differ this kind of problem appears.
There are two solutions:
(1) Change your code so that it uses the version of the jar provided by Spark (or whichever framework).
(2) Shade your application. Use the maven-shade-plugin mentioned above with some advanced configuration (see the sketch after this list). Shading keeps the conflicting classes under a different namespace and automatically rewrites the application's bytecode to use the renamed version. It is somewhat crude, but very effective for resolving runtime dependency conflicts.
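A minimal relocation sketch: the package names below (org.apache.commons.lang3 and the shaded.* prefix) are only illustrative and not taken from the original notes. The block goes into the shade plugin's <configuration>:

```xml
<relocations>
  <!-- Move the conflicting package into a private namespace and rewrite
       references to it in the application's bytecode. -->
  <relocation>
    <pattern>org.apache.commons.lang3</pattern>
    <shadedPattern>shaded.org.apache.commons.lang3</shadedPattern>
  </relocation>
</relocations>
```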
Another example: Storm uses log4j 2.x while HBase uses 1.x, so when pulling HBase into a Storm project the old logging jars have to be excluded:
```xml
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```
Any similar log4j conflict that shows up as NoClassDefFoundError (or the like) can be resolved the same way.
To use CDH builds of the Hadoop/HBase artifacts, add the Cloudera repository to pom.xml:
```xml
<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
```
Then specify the CDH version of the dependency:
```xml
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.0.0-cdh5.4.5</version>
</dependency>
```
The available version numbers can be looked up at https://repository.cloudera.com/artifactory/cloudera-repos/.
Two ways to skip tests:
-DskipTests skips running the tests but still compiles them, putting the class files under target/test-classes.
-Dmaven.test.skip=true skips both running and compiling the tests.
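The same behaviour can also be set in the pom. A minimal sketch, assuming the maven-surefire-plugin (reusing the surefire version from the error above):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.18.1</version>
  <configuration>
    <!-- equivalent of -DskipTests: compile the tests but do not run them -->
    <skipTests>true</skipTests>
  </configuration>
</plugin>
```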
To specify the JDK version, add the following to the <properties> section:

```xml
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
  <maven.compiler.compilerVersion>1.8</maven.compiler.compilerVersion>
</properties>
```
If these parameters are not specified, the JDK pointed to by $JAVA_HOME is used, so changing $JAVA_HOME also changes the behaviour.
Alternatively, add the following to pom.xml:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.7</source>
    <target>1.7</target>
  </configuration>
</plugin>
```
A better solution is described in point 2 above (shade and uber jars).
When packaging with Maven (especially building an uber jar), the following error may appear:
Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:286)
at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:239)
at java.util.jar.JarVerifier.processEntry(JarVerifier.java:317)
at java.util.jar.JarVerifier.update(JarVerifier.java:228)
at java.util.jar.JarFile.initializeVerifier(JarFile.java:348)
at java.util.jar.JarFile.getInputStream(JarFile.java:415)
at sun.misc.URLClassPath$JarLoader$2.getInputStream(URLClassPath.java:775)
at sun.misc.Resource.cachedInputStream(Resource.java:77)
at sun.misc.Resource.getByteBuffer(Resource.java:160)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:436)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
This is caused by signature metadata files (the *.SF/*.DSA/*.RSA entries under META-INF) being packed into the jar; they can be removed from the jar directly:
zip -d jarname.jar META-INF/*.RSA META-INF/*.DSA META-INF/*.SF
If the Maven project contains Scala code, add the following to pom.xml:
```xml
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <execution>
      <id>compile-scala</id>
      <phase>compile</phase>
      <goals>
        <goal>add-source</goal>
        <goal>compile</goal>
      </goals>
    </execution>
    <execution>
      <id>test-compile-scala</id>
      <phase>test-compile</phase>
      <goals>
        <goal>add-source</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <scalaVersion>2.11.8</scalaVersion>
  </configuration>
</plugin>
```
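The Scala runtime itself usually has to be declared as a dependency as well. A minimal sketch, assuming Scala 2.11.8 to match the plugin configuration above:

```xml
<dependency>
  <!-- Scala standard library, version matched to scalaVersion above -->
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.8</version>
</dependency>
```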
Referencing the project's own version with ${project.version}:
When defining it (in the project's own pom):

```xml
<version>0.2</version>
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
```

When using it (e.g. a dependency on another artifact of the same project):

```xml
<dependency>
  <groupId>com.lujinhong</groupId>
  <artifactId>lujinhong-commons</artifactId>
  <version>${project.version}</version>
</dependency>
```
To depend on a locally built project, first run mvn install to put it into the local repository, then add a repo to the project:
```xml
<repository>
  <id>in-project</id>
  <name>In Project Repo</name>
  <url>file:///Users/liaoliuqing/.m2/repository</url>
</repository>
```
After that the dependency can be used as normal, e.g.:
```xml
<dependency>
  <groupId>com.lujinhong</groupId>
  <artifactId>lujinhong-commons-hadoop</artifactId>
  <version>${project.version}</version>
</dependency>
```
This matters especially for sub-modules that depend on a parent project; otherwise you run into various problems: http://stackoverflow.com/questions/1981151/warning-on-using-project-parent-version-as-the-version-of-a-module-in-maven-3
[WARNING] ‘version’ contains an expression but should be a constant. @ com.lujinhong:lujinhong-commons:${project.version}, /Users/liaoliuqing/99_Project/lujinhong-commons/pom.xml, line 9, column 11
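One way to avoid that warning, as suggested in the Stack Overflow thread above, is to give the parent a literal version in the child's <parent> block and let the child inherit it instead of repeating ${project.version}. A minimal sketch, reusing the artifact names from the examples above (0.2 being the version defined earlier):

```xml
<!-- Child module pom: the parent version is a literal, and the child's own
     <version> element is omitted so it is inherited from the parent. -->
<parent>
  <groupId>com.lujinhong</groupId>
  <artifactId>lujinhong-commons</artifactId>
  <version>0.2</version>
</parent>
<artifactId>lujinhong-commons-hadoop</artifactId>
```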