Resolving the Guava Dependency Conflict Between HBase and Spark

Background

Component   Version           Guava version
Spark       2.2.0             12.0.1
HBase       1.0.0-cdh5.6.0    20.0

Guava versions after 16.0 are not backward compatible, so a Spark program that integrates HBase fails at runtime with an error about an inaccessible method. Here, HBase's MetaTableLocator tries to invoke a Stopwatch constructor that newer Guava versions no longer expose:

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.&lt;init&gt;()V from class org.apache.hadoop.hbase.zookeeper.MetaTableLocator
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:229)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:202)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:326)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:301)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:166)
    at org.apache.hadoop.hbase.client.ClientScanner.&lt;init&gt;(ClientScanner.java:161)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
    at org.springframework.data.hadoop.hbase.HbaseTemplate$1.doInTable(HbaseTemplate.java:132)
    at org.springframework.data.hadoop.hbase.HbaseTemplate.execute(HbaseTemplate.java:61)
    ... 75 more

Solution

Use the maven-shade-plugin to bundle the HBase dependencies into the project jar and rename (relocate) the packages of the bundled Guava. HBase then references the Guava classes under the renamed package and no longer depends on the original Guava on the classpath.

    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <parent>
            <artifactId>xql</artifactId>
            <groupId>com.xxx.xxx.xxx</groupId>
            <version>1.0.0</version>
        </parent>
        <modelVersion>4.0.0</modelVersion>

        <artifactId>your-hbase</artifactId>

        <dependencies>
            <dependency>
                <groupId>org.apache.hbase</groupId>
                <artifactId>hbase-server</artifactId>
                <version>1.0.0</version>
                <exclusions>
                    <exclusion>
                        <groupId>com.google.guava</groupId>
                        <artifactId>guava</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>
            <dependency>
                <groupId>org.apache.hbase</groupId>
                <artifactId>hbase-common</artifactId>
                <version>1.0.0</version>
            </dependency>
            <dependency>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
                <version>16.0</version>
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <version>3.0.0</version>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                        </execution>
                    </executions>
                    <configuration>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </project>
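The filter section above only strips jar signature files; the package renaming described in the solution is done with a relocation rule inside the same plugin's configuration. A sketch of that rule follows (the `shadedPattern` prefix is illustrative, pick any prefix private to your project):

```xml
<configuration>
    <relocations>
        <relocation>
            <!-- rewrite every reference under Guava's package prefix... -->
            <pattern>com.google.common</pattern>
            <!-- ...to a prefix that cannot collide with Spark's own Guava
                 (the prefix name here is an illustrative placeholder) -->
            <shadedPattern>com.xxx.shaded.com.google.common</shadedPattern>
        </relocation>
    </relocations>
</configuration>
```

With this in place, the shaded jar carries its own copies of the Guava classes under the new prefix, and every reference HBase makes to them is rewritten accordingly.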

Then, when integrating HBase into Spark, depend on the shaded artifact produced by the maven-shade-plugin instead of on HBase directly, and the conflict is resolved:

    <dependency>
        <groupId>com.xxx.xxx.xxx</groupId>
        <artifactId>your-hbase</artifactId>
        <version>1.0.0</version>
    </dependency>
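To see why relocation resolves the clash, here is a hypothetical sketch (not part of the build) of what the shade plugin's relocation step does conceptually: for every class file in the shaded jar, references under the relocated package prefix are rewritten to the shaded prefix, so the bundled Guava can never collide with the Guava already on Spark's classpath. The `RelocateDemo` class and its shaded prefix are illustrative, not part of the plugin's API.

```java
public class RelocateDemo {

    // Original prefix to relocate, as a JVM internal name
    static final String PATTERN = "com/google/common";
    // Renamed prefix (illustrative placeholder, mirrors the <shadedPattern>)
    static final String SHADED = "com/xxx/shaded/com/google/common";

    // Rewrite one JVM internal class name if it falls under the relocated prefix.
    public static String relocate(String internalName) {
        if (internalName.equals(PATTERN) || internalName.startsWith(PATTERN + "/")) {
            return SHADED + internalName.substring(PATTERN.length());
        }
        return internalName; // classes outside the pattern are left untouched
    }

    public static void main(String[] args) {
        // The Guava class HBase references is rewritten to the shaded package
        System.out.println(relocate("com/google/common/base/Stopwatch"));
        // HBase's own classes are not touched
        System.out.println(relocate("org/apache/hadoop/hbase/client/HTable"));
    }
}
```

Because both the bundled Guava classes and HBase's references to them are rewritten consistently, HBase inside the shaded jar resolves Stopwatch from the relocated copy, while Spark continues to use its own, older Guava.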
