Big Data Application Notes: an example of the "hbase.DoNotRetryIOException" error with Phoenix + HBase in a Java application

The project runs on a big data platform built on Phoenix + HBase.

Under a Java 8 + MyBatis + SpringMVC stack, querying the data table through the phoenix-4.14.0-cdh5.14.2-client.jar driver and browsing the same table from the SQuirrel client's query window both fail with the following error:

org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: MyDATA_TABLE: at index 76
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:598)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16357)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7874)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1989)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1971)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33652)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:185)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:165)
Caused by: java.lang.NullPointerException: at index 76
    at com.google.common.collect.ImmutableList.checkElementNotNull(ImmutableList.java:311)
    at com.google.common.collect.ImmutableList.construct(ImmutableList.java:302)
    at com.google.common.collect.ImmutableList.copyOf(ImmutableList.java:278)
    at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:548)
    at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:421)
    at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:406)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1073)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:614)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3430)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:3380)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:556)
    ... 9 more
    at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:144)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1379)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1343)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1560)
    at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:643)
    at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:538)
    at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:573)
    at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:391)
    at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:228)
    at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:206)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:482)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:456)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:302)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:291)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:290)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:283)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:1793)
    at net.sourceforge.squirrel_sql.client.session.mainpanel.objecttree.tabs.table.ContentsTab.createResultSet(ContentsTab.java:349)
    at net.sourceforge.squirrel_sql.client.session.mainpanel.objecttree.tabs.table.ContentsTab.createDataSet(ContentsTab.java:249)
    at net.sourceforge.squirrel_sql.client.session.mainpanel.objecttree.tabs.BaseDataSetTab$1.run(BaseDataSetTab.java:131)
    at net.sourceforge.squirrel_sql.fw.util.TaskExecuter.run(TaskExecuter.java:82)
    at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: MyDATA_TABLE: at index 76
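
For context, the failure is not tied to any particular query: any plain SELECT against this table through the Phoenix thick-client driver dies while the statement is being compiled. Below is a minimal sketch of that query path; the ZooKeeper quorum in the JDBC URL is a placeholder, not the project's real one.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BrokenTableQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper quorum; the real project URL differs
        String url = "jdbc:phoenix:zk-host:2181:/hbase";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // The PhoenixIOException is thrown here, while Phoenix resolves the
             // table metadata to compile the plan, before any row is fetched
             ResultSet rs = stmt.executeQuery("SELECT * FROM MYDATA_TABLE LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}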

All other tables can be read and accessed normally. After searching Stack Overflow and having a colleague verify it, the fix is as follows:

Still using the SQuirrel client, open a SQL worksheet and execute:

delete from SYSTEM.CATALOG WHERE TABLE_NAME='MYDATA_TABLE'  -- removes every metadata row for this table from the Phoenix system catalog

Then re-run the table's original creation statement: CREATE TABLE XXX (omitted here).

Access the table again and the data is readable as before; none of the existing data is lost, since the DELETE only touches Phoenix metadata, not the underlying HBase table.
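
If you would rather script the same two steps from Java instead of SQuirrel, a hedged sketch over the Phoenix JDBC driver is below. The JDBC URL is a placeholder, and the column list in the CREATE TABLE is a hypothetical stand-in for the table's real DDL, which is omitted here just as above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CatalogRepair {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper quorum; the real project URL differs
        String url = "jdbc:phoenix:zk-host:2181:/hbase";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Phoenix connections do not auto-commit by default
            conn.setAutoCommit(true);

            // Step 1: remove the table's metadata rows from the Phoenix catalog.
            // Only SYSTEM.CATALOG is touched; the HBase data table is left alone.
            stmt.executeUpdate(
                "DELETE FROM SYSTEM.CATALOG WHERE TABLE_NAME = 'MYDATA_TABLE'");

            // Step 2: re-run the original DDL so Phoenix rebuilds the mapping.
            // Hypothetical column list below; substitute the table's real definition.
            stmt.executeUpdate(
                "CREATE TABLE IF NOT EXISTS MYDATA_TABLE ("
                    + "ID VARCHAR PRIMARY KEY, CF.COL1 VARCHAR)");
        }
    }
}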

Root cause: the column mapping Phoenix had recorded for this table was no longer consistent with the table's columns in HBase; one column's definition was missing, so Phoenix hit a NullPointerException (the "at index 76" in the trace) whenever it rebuilt the table metadata for reads or writes.
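
A quick way to see what Phoenix has actually recorded for the table, and to spot the broken column entry behind the "at index 76" NullPointerException, is to dump the table's rows from SYSTEM.CATALOG. A small diagnostic sketch, again with a placeholder JDBC URL:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CatalogInspect {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper quorum; use the cluster's real one
        String url = "jdbc:phoenix:zk-host:2181:/hbase";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // One row per column Phoenix has mapped for the table
             // (plus a header row where COLUMN_NAME is null)
             ResultSet rs = stmt.executeQuery(
                 "SELECT COLUMN_NAME, COLUMN_FAMILY, ORDINAL_POSITION "
                     + "FROM SYSTEM.CATALOG WHERE TABLE_NAME = 'MYDATA_TABLE'")) {
            while (rs.next()) {
                // A missing or inconsistent entry here is what the
                // MetaDataEndpointImpl coprocessor trips over when it
                // rebuilds the PTable and hits the null column
                System.out.printf("%s.%s (position %s)%n",
                        rs.getString("COLUMN_FAMILY"),
                        rs.getString("COLUMN_NAME"),
                        rs.getString("ORDINAL_POSITION"));
            }
        }
    }
}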
