ERROR 604 (42P00): Syntax error. Mismatched input. Expecting "RPAREN", got "ffffd72e"

Importing HBase data into Hive with Sqoop (through the Phoenix JDBC driver):

sqoop import \
  --driver org.apache.phoenix.jdbc.PhoenixDriver \
  --connect jdbc:phoenix:192.168.35.27,192.168.35.32,192.168.35.33:2181 \
  --query "select * from \"ods_app_deviceInfo\" where \$CONDITIONS and \"DAYSTR\" = '20190921'" \
  --hive-table s_evt_sjl_ods_app_eventInfo_i_d \
  --hive-database stage \
  --target-dir /user/hive/warehouse/stage.db/s_evt_sjl_ods_app_eventInfo_i_d/20190921 \
  --hive-import \
  --delete-target-dir \
  --hive-overwrite \
  --split-by rowkey \
  --hive-partition-key dt \
  --hive-partition-value 20190921
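With --split-by rowkey and no --boundary-query, Sqoop first runs a min/max query over the split column and then interpolates intermediate split points, which it embeds as string literals in each mapper's WHERE clause in place of $CONDITIONS. A rough sketch of what that means for this import is below; the exact wrapping of the free-form query is Sqoop-internal, so the first statement is only an approximation, while the second reproduces the kind of per-mapper query visible in the log further down, where the interpolated split points carry arbitrary multi-byte characters and a trailing backslash that break Phoenix's string-literal parsing:

-- 1) Approximate boundary query Sqoop derives from --split-by rowkey
--    (free-form query wrapped as a subselect; exact form is Sqoop-internal).
SELECT MIN(rowkey), MAX(rowkey)
FROM (
  SELECT * FROM "ods_app_deviceInfo" WHERE (1 = 1) AND "DAYSTR" = '20190921'
) t1;

-- 2) Per-mapper query: $CONDITIONS becomes a range predicate whose bounds are
--    interpolated between MIN and MAX. With a text rowkey those bounds can
--    contain arbitrary bytes, e.g. (taken from the failing split in the log):
SELECT * FROM "ods_app_deviceInfo"
WHERE ( rowkey >= 'X聘聘聙聛쀹쀲\' ) AND ( rowkey < 'ffffd72e' )
  AND "DAYSTR" = '20190921';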

Phoenix error output:

19/10/09 16:06:57 INFO db.DBInputFormat: Using read commited transaction isolation
19/10/09 16:06:57 INFO mapred.MapTask: Processing split: rowkey >= 'KKKMS耼耲S' AND rowkey < 'X聘聘聙聛쀹쀲\'
19/10/09 16:06:57 INFO db.DBRecordReader: Working on split: rowkey >= 'KKKMS耼耲S' AND rowkey < 'X聘聘聙聛쀹쀲\'
19/10/09 16:06:57 ERROR db.DBRecordReader: Top level exception: 
org.apache.phoenix.exception.PhoenixParserException: ERROR 601 (42P00): Syntax error. Unexpected char: '''
        at org.apache.phoenix.exception.PhoenixParserException.newException(PhoenixParserException.java:33)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:118)
        at org.apache.phoenix.jdbc.PhoenixStatement$PhoenixStatementParser.parseStatement(PhoenixStatement.java:1185)
        at org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:1268)
        at org.apache.phoenix.jdbc.PhoenixPreparedStatement.<init>(PhoenixPreparedStatement.java:94)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:715)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:744)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:562)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unexpected char: '''
        at org.apache.phoenix.parse.PhoenixSQLLexer.mOTHER(PhoenixSQLLexer.java:4170)
        at org.apache.phoenix.parse.PhoenixSQLLexer.mTokens(PhoenixSQLLexer.java:5234)
        at org.antlr.runtime.Lexer.nextToken(Lexer.java:85)
        at org.antlr.runtime.BufferedTokenStream.fetch(BufferedTokenStream.java:143)
        at org.antlr.runtime.BufferedTokenStream.sync(BufferedTokenStream.java:137)
        at org.antlr.runtime.CommonTokenStream.skipOffTokenChannels(CommonTokenStream.java:113)
        at org.antlr.runtime.CommonTokenStream.LT(CommonTokenStream.java:102)
        at org.antlr.runtime.BufferedTokenStream.LA(BufferedTokenStream.java:174)
        at org.antlr.runtime.BaseRecognizer.mismatchIsUnwantedToken(BaseRecognizer.java:127)
        at org.apache.phoenix.parse.PhoenixSQLParser.recoverFromMismatchedToken(PhoenixSQLParser.java:346)
        at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
        at org.apache.phoenix.parse.PhoenixSQLParser.not_expression(PhoenixSQLParser.java:6519)
        at org.apache.phoenix.parse.PhoenixSQLParser.and_expression(PhoenixSQLParser.java:6353)
        at org.apache.phoenix.parse.PhoenixSQLParser.or_expression(PhoenixSQLParser.java:6271)
        at org.apache.phoenix.parse.PhoenixSQLParser.expression(PhoenixSQLParser.java:6236)
        at org.apache.phoenix.parse.PhoenixSQLParser.single_select(PhoenixSQLParser.java:4439)
        at org.apache.phoenix.parse.PhoenixSQLParser.unioned_selects(PhoenixSQLParser.java:4521)
        at org.apache.phoenix.parse.PhoenixSQLParser.select_node(PhoenixSQLParser.java:4586)
        at org.apache.phoenix.parse.PhoenixSQLParser.oneStatement(PhoenixSQLParser.java:766)
        at org.apache.phoenix.parse.PhoenixSQLParser.statement(PhoenixSQLParser.java:500)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:108)
        ... 20 more
19/10/09 16:06:57 INFO mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
19/10/09 16:06:57 INFO mapred.LocalJobRunner: Starting task: attempt_local609028226_0001_m_000004_0
19/10/09 16:06:57 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/10/09 16:06:57 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19/10/09 16:06:57 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
19/10/09 16:06:57 INFO db.DBInputFormat: Using read commited transaction isolation
19/10/09 16:06:57 INFO mapred.MapTask: Processing split: rowkey >= 'X聘聘聙聛쀹쀲\' AND rowkey < 'ffffd72e'
19/10/09 16:06:57 INFO db.DBRecordReader: Working on split: rowkey >= 'X聘聘聙聛쀹쀲\' AND rowkey < 'ffffd72e'
19/10/09 16:06:57 ERROR db.DBRecordReader: Top level exception: 
org.apache.phoenix.exception.PhoenixParserException: ERROR 604 (42P00): Syntax error. Mismatched input. Expecting "RPAREN", got "ffffd72e" at line 1, column 84.
        at org.apache.phoenix.exception.PhoenixParserException.newException(PhoenixParserException.java:33)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:111)
        at org.apache.phoenix.jdbc.PhoenixStatement$PhoenixStatementParser.parseStatement(PhoenixStatement.java:1185)
        at org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:1268)
        at org.apache.phoenix.jdbc.PhoenixPreparedStatement.<init>(PhoenixPreparedStatement.java:94)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:715)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:744)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:562)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: MismatchedTokenException(98!=124)
        at org.apache.phoenix.parse.PhoenixSQLParser.recoverFromMismatchedToken(PhoenixSQLParser.java:352)
        at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
        at org.apache.phoenix.parse.PhoenixSQLParser.not_expression(PhoenixSQLParser.java:6519)
        at org.apache.phoenix.parse.PhoenixSQLParser.and_expression(PhoenixSQLParser.java:6334)
        at org.apache.phoenix.parse.PhoenixSQLParser.or_expression(PhoenixSQLParser.java:6271)
        at org.apache.phoenix.parse.PhoenixSQLParser.expression(PhoenixSQLParser.java:6236)
        at org.apache.phoenix.parse.PhoenixSQLParser.single_select(PhoenixSQLParser.java:4439)
        at org.apache.phoenix.parse.PhoenixSQLParser.unioned_selects(PhoenixSQLParser.java:4521)
        at org.apache.phoenix.parse.PhoenixSQLParser.select_node(PhoenixSQLParser.java:4586)
        at org.apache.phoenix.parse.PhoenixSQLParser.oneStatement(PhoenixSQLParser.java:766)
        at org.apache.phoenix.parse.PhoenixSQLParser.statement(PhoenixSQLParser.java:500)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:108)
        ... 20 more
19/10/09 16:06:57 INFO mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
19/10/09 16:06:57 INFO mapred.LocalJobRunner: Starting task: attempt_local609028226_0001_m_000005_0
19/10/09 16:06:57 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
19/10/09 16:06:57 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
19/10/09 16:06:57 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
19/10/09 16:06:57 INFO db.DBInputFormat: Using read commited transaction isolation
19/10/09 16:06:57 INFO mapred.MapTask: Processing split: rowkey >= 'ffffd72e' AND rowkey <= 'ffffd72e-cb66-4b1f-953d-152b75c18952_281682053156_1569082666308'
19/10/09 16:06:57 INFO db.DBRecordReader: Working on split: rowkey >= 'ffffd72e' AND rowkey <= 'ffffd72e-cb66-4b1f-953d-152b75c18952_281682053156_1569082666308'
19/10/09 16:06:57 INFO db.DBRecordReader: Executing query: select *  from "ods_app_deviceInfo" where ( rowkey >= 'ffffd72e' ) AND ( rowkey <= 'ffffd72e-cb66-4b1f-953d-152b75c18952_281682053156_1569082666308' ) and "DAYSTR" ='20190921'
19/10/09 16:06:57 INFO mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
19/10/09 16:06:57 INFO mapred.LocalJobRunner: 
19/10/09 16:06:57 INFO mapred.Task: Task:attempt_local609028226_0001_m_000005_0 is done. And is in the process of committing
19/10/09 16:06:57 INFO mapred.LocalJobRunner: 
19/10/09 16:06:57 INFO mapred.Task: Task attempt_local609028226_0001_m_000005_0 is allowed to commit now
19/10/09 16:06:57 INFO output.FileOutputCommitter: Saved output of task 'attempt_local609028226_0001_m_000005_0' to hdfs://thhadoop/user/hive/warehouse/stage.db/s_evt_sjl_ods_app_eventInfo_i_d/20190921/_temporary/0/task_local609028226_0001_m_000005
19/10/09 16:06:57 INFO mapred.LocalJobRunner: map
19/10/09 16:06:57 INFO mapred.Task: Task 'attempt_local609028226_0001_m_000005_0' done.
19/10/09 16:06:57 INFO mapred.LocalJobRunner: Finishing task: attempt_local609028226_0001_m_000005_0
19/10/09 16:06:57 INFO mapred.LocalJobRunner: map task executor complete.
19/10/09 16:06:57 WARN mapred.LocalJobRunner: job_local609028226_0001
java.lang.Exception: java.io.IOException: SQLException in nextKeyValue
        at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:489)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:549)
Caused by: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:562)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.phoenix.exception.PhoenixParserException: ERROR 601 (42P00): Syntax error. Unexpected char: '''
        at org.apache.phoenix.exception.PhoenixParserException.newException(PhoenixParserException.java:33)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:118)
        at org.apache.phoenix.jdbc.PhoenixStatement$PhoenixStatementParser.parseStatement(PhoenixStatement.java:1185)
        at org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:1268)
        at org.apache.phoenix.jdbc.PhoenixPreparedStatement.<init>(PhoenixPreparedStatement.java:94)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:715)
        at org.apache.phoenix.jdbc.PhoenixConnection.prepareStatement(PhoenixConnection.java:744)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 13 more
Caused by: java.lang.RuntimeException: Unexpected char: '''
        at org.apache.phoenix.parse.PhoenixSQLLexer.mOTHER(PhoenixSQLLexer.java:4170)
        at org.apache.phoenix.parse.PhoenixSQLLexer.mTokens(PhoenixSQLLexer.java:5234)
        at org.antlr.runtime.Lexer.nextToken(Lexer.java:85)
        at org.antlr.runtime.BufferedTokenStream.fetch(BufferedTokenStream.java:143)
        at org.antlr.runtime.BufferedTokenStream.sync(BufferedTokenStream.java:137)
        at org.antlr.runtime.CommonTokenStream.skipOffTokenChannels(CommonTokenStream.java:113)
        at org.antlr.runtime.CommonTokenStream.LT(CommonTokenStream.java:102)
        at org.antlr.runtime.BufferedTokenStream.LA(BufferedTokenStream.java:174)
        at org.antlr.runtime.BaseRecognizer.mismatchIsUnwantedToken(BaseRecognizer.java:127)
        at org.apache.phoenix.parse.PhoenixSQLParser.recoverFromMismatchedToken(PhoenixSQLParser.java:346)
        at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
        at org.apache.phoenix.parse.PhoenixSQLParser.not_expression(PhoenixSQLParser.java:6519)
        at org.apache.phoenix.parse.PhoenixSQLParser.and_expression(PhoenixSQLParser.java:6353)
        at org.apache.phoenix.parse.PhoenixSQLParser.or_expression(PhoenixSQLParser.java:6271)
        at org.apache.phoenix.parse.PhoenixSQLParser.expression(PhoenixSQLParser.java:6236)
        at org.apache.phoenix.parse.PhoenixSQLParser.single_select(PhoenixSQLParser.java:4439)
        at org.apache.phoenix.parse.PhoenixSQLParser.unioned_selects(PhoenixSQLParser.java:4521)
        at org.apache.phoenix.parse.PhoenixSQLParser.select_node(PhoenixSQLParser.java:4586)
        at org.apache.phoenix.parse.PhoenixSQLParser.oneStatement(PhoenixSQLParser.java:766)
        at org.apache.phoenix.parse.PhoenixSQLParser.statement(PhoenixSQLParser.java:500)
        at org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:108)
        ... 20 more
19/10/09 16:06:58 INFO mapreduce.Job: Job job_local609028226_0001 failed with state FAILED due to: NA
19/10/09 16:06:58 INFO mapreduce.Job: Counters: 20
        File System Counters
                FILE: Number of bytes read=651296584
                FILE: Number of bytes written=658094664
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=0
                HDFS: Number of bytes written=14942279
                HDFS: Number of read operations=38
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=26
        Map-Reduce Framework
                Map input records=17961
                Map output records=17961
                Input split bytes=619
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=19
                Total committed heap usage (bytes)=4020240384
        File Input Format Counters 
                Bytes Read=0
        File Output Format Counters 
                Bytes Written=5181467
19/10/09 16:06:58 INFO mapreduce.ImportJobBase: Transferred 14.2501 MB in 13.4899 seconds (1.0564 MB/sec)
19/10/09 16:06:58 INFO mapreduce.ImportJobBase: Retrieved 17961 records.
19/10/09 16:06:58 ERROR tool.ImportTool: Import failed: Import job failed!

Solution:

Add a --boundary-query parameter so that Sqoop derives its split boundaries from a clean expression rather than from the raw rowkey values:

sqoop import \
  --driver org.apache.phoenix.jdbc.PhoenixDriver \
  --connect jdbc:phoenix:192.168.35.27,192.168.35.32,192.168.35.33:2181 \
  --boundary-query "SELECT min(substr(rowkey, 0, 1)), max(substr(rowkey, 0, 1)) from \"ods_app_deviceInfo\"" \
  --query "select * from \"ods_app_deviceInfo\" where \$CONDITIONS and \"DAYSTR\" = '20190921'" \
  --hive-table s_evt_sjl_ods_app_eventInfo_i_d \
  --hive-database stage \
  --target-dir /user/hive/warehouse/stage.db/s_evt_sjl_ods_app_eventInfo_i_d/20190921 \
  --hive-import \
  --delete-target-dir \
  --hive-overwrite \
  --split-by rowkey \
  --hive-partition-key dt \
  --hive-partition-value 20190921
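This works because the boundary query hands Sqoop only the first character of each rowkey, so the interpolated split points are short, printable values instead of raw multi-byte rowkey prefixes, and the literals Sqoop embeds in each mapper's WHERE clause remain parseable by Phoenix. A sketch of the kind of per-mapper query this should produce (the boundary characters 'a' and 'f' are hypothetical):

SELECT * FROM "ods_app_deviceInfo"
WHERE ( rowkey >= 'a' ) AND ( rowkey < 'f' )
  AND "DAYSTR" = '20190921';

The full rowkeys are still scanned; only the split boundaries come from the one-character prefixes, so splits may be coarser and less evenly sized than before, which is a reasonable trade-off for a query that actually parses.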

https://github.com/forcedotcom/phoenix/wiki/Tuning
