ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user

While using Sqoop to import data from a MySQL database:

[root@node3 bin]# ./sqoop import --connect jdbc:mysql://192.168.0.109:3306/fantest1 --username root --password root --table goods
the following error appears:
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
17/05/21 10:46:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/05/21 10:46:57 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/05/21 10:46:57 INFO tool.CodeGenTool: Beginning code generation
17/05/21 10:46:57 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'root'@'node3' (using password: YES)
java.sql.SQLException: Access denied for user 'root'@'node3' (using password: YES)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3536)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3468)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:917)
	at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3974)
	at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1282)
	at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2142)
	at com.mysql.jdbc.ConnectionImpl.&lt;init&gt;(ConnectionImpl.java:773)
	at com.mysql.jdbc.JDBC4Connection.&lt;init&gt;(JDBC4Connection.java:46)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
	at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:352)
	at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:282)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:247)
	at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:801)
	at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
	at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:660)
	at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:683)
	at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
	at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:223)
	at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:347)
	at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1277)
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1089)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
17/05/21 10:46:57 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1095)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

Cause: the MySQL `root` account lacks the privileges to connect from the cluster node (`'root'@'node3'`). The `IOException: No columns to generate for ClassWriter` that follows is just a downstream symptom: code generation has no schema to work with because the connection itself was refused. The fix:

mysql> grant all privileges on *.* to 'root'@'192.168.0.%' identified by 'root';
Query OK, 0 rows affected (0.00 sec)
mysql> flush privileges;
Since this is a cluster, I granted the privileges to the whole IP range covering every cluster node. Now run the import again:
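To confirm the grant actually took effect before rerunning the import, you can check it from the mysql client. This is a quick sketch assuming the same `'root'@'192.168.0.%'` account created above:

```sql
-- Show the privileges recorded for the account granted above
SHOW GRANTS FOR 'root'@'192.168.0.%';

-- List which hosts each account is allowed to connect from;
-- node3's IP (192.168.0.108) must be matched by one of the host patterns
SELECT user, host FROM mysql.user;
```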

[root@node3 bin]# ./sqoop import --connect jdbc:mysql://192.168.0.109:3306/fantest1 --username root --password root --table goods
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
17/05/21 11:54:20 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/05/21 11:54:20 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/05/21 11:54:20 INFO tool.CodeGenTool: Beginning code generation
17/05/21 11:54:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `goods` AS t LIMIT 1
17/05/21 11:54:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `goods` AS t LIMIT 1
17/05/21 11:54:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hadoop/hadoop-2.2.0
Note: /tmp/sqoop-root/compile/4cf3dc2d0f700c28ecd48d5af5b57199/goods.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/05/21 11:54:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/4cf3dc2d0f700c28ecd48d5af5b57199/goods.jar
17/05/21 11:54:22 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/05/21 11:54:22 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/05/21 11:54:22 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/05/21 11:54:22 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/05/21 11:54:22 INFO mapreduce.ImportJobBase: Beginning import of goods
17/05/21 11:54:23 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/05/21 11:54:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/05/21 11:54:24 INFO client.RMProxy: Connecting to ResourceManager at node3/192.168.0.108:8032
17/05/21 11:54:27 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `goods`
17/05/21 11:54:28 INFO mapreduce.JobSubmitter: number of splits:3
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
17/05/21 11:54:28 INFO Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
17/05/21 11:54:28 INFO Configuration.deprecation: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
17/05/21 11:54:28 INFO Configuration.deprecation: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.job.classpath.files is deprecated. Instead, use mapreduce.job.classpath.files
17/05/21 11:54:28 INFO Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes
17/05/21 11:54:28 INFO Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
17/05/21 11:54:28 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495332360698_0002
17/05/21 11:54:29 INFO impl.YarnClientImpl: Submitted application application_1495332360698_0002 to ResourceManager at node3/192.168.0.108:8032
17/05/21 11:54:29 INFO mapreduce.Job: The url to track the job: http://node3:8088/proxy/application_1495332360698_0002/
17/05/21 11:54:29 INFO mapreduce.Job: Running job: job_1495332360698_0002
17/05/21 11:54:39 INFO mapreduce.Job: Job job_1495332360698_0002 running in uber mode : false
17/05/21 11:54:39 INFO mapreduce.Job:  map 0% reduce 0%
17/05/21 11:54:51 INFO mapreduce.Job:  map 100% reduce 0%
17/05/21 11:54:52 INFO mapreduce.Job: Job job_1495332360698_0002 completed successfully
17/05/21 11:54:52 INFO mapreduce.Job: Counters: 27
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=274617
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=295
		HDFS: Number of bytes written=20
		HDFS: Number of read operations=12
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=6
	Job Counters 
		Launched map tasks=3
		Other local map tasks=3
		Total time spent by all maps in occupied slots (ms)=27853
		Total time spent by all reduces in occupied slots (ms)=0
	Map-Reduce Framework
		Map input records=3
		Map output records=3
		Input split bytes=295
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=524
		CPU time spent (ms)=4190
		Physical memory (bytes) snapshot=280723456
		Virtual memory (bytes) snapshot=6218539008
		Total committed heap usage (bytes)=54829056
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=20
17/05/21 11:54:52 INFO mapreduce.ImportJobBase: Transferred 20 bytes in 28.4932 seconds (0.7019 bytes/sec)
17/05/21 11:54:52 INFO mapreduce.ImportJobBase: Retrieved 3 records.
The import into HDFS succeeded.
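Since no `--target-dir` was given, Sqoop writes the output under the invoking user's HDFS home directory, in a directory named after the table. The commands below are a sketch for inspecting the result (the path `/user/root/goods` is the assumed default for user `root`), plus one way to silence the "password on the command-line is insecure" warning seen above by using `--password-file` instead of a plaintext `--password`:

```shell
# Inspect the imported files (default target dir is /user/<user>/<table>)
hdfs dfs -ls /user/root/goods
hdfs dfs -cat /user/root/goods/part-m-*

# Safer credential handling: store the password in an HDFS file with
# restrictive permissions (echo -n avoids a trailing newline, which
# would otherwise become part of the password)
echo -n 'root' | hdfs dfs -put - /user/root/.mysql.pwd
hdfs dfs -chmod 400 /user/root/.mysql.pwd
./sqoop import \
  --connect jdbc:mysql://192.168.0.109:3306/fantest1 \
  --username root \
  --password-file /user/root/.mysql.pwd \
  --table goods
```

Alternatively, `-P` prompts for the password interactively, which the warning in the log itself suggests.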
