Big Data Learning 15: Error FAILED: SemanticException Unable to determine if

The error: it is caused by changing the HDFS port. The query uses a user-defined function (UDF), so Hive has to pull the jar from HDFS into the distributed cache, but the metadata stored in MySQL was never updated to the new port.
hive (default)> select ename ,hello(ename) from emp;
converting to local hdfs://hadoop002:8020/lib/hive_train-1.0.jar
Added [/tmp/03626a44-b1f4-4a3a-b60d-676754ec373c_resources/hive_train-1.0.jar] to class path
Added resources: [hdfs://hadoop002:8020/lib/hive_train-1.0.jar]
FAILED: SemanticException Unable to determine if hdfs://hadoop002:9000/user/hive/warehouse/emp is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://hadoop002:9000/user/hive/warehouse/emp, expected: hdfs://hadoop002:8020
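For context, here is a minimal sketch of how such a permanent UDF might have been registered; the function name hello and the jar path are taken from the log above, while the class name is purely an assumption:

-- Hypothetical registration (class name com.example.udf.HelloUDF is an assumption;
-- the function name and jar path come from the log above).
CREATE FUNCTION hello AS 'com.example.udf.HelloUDF'
USING JAR 'hdfs://hadoop002:8020/lib/hive_train-1.0.jar';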

Solution:
The DBS and SDS tables in MySQL hold Hive's metadata, so after the HDFS port is changed the locations stored there have to be updated as well:
update <table> set <column>=REPLACE(<column>, '<old value>', '<new value>')
Example:

update DBS set DB_LOCATION_URI=REPLACE (DB_LOCATION_URI,'bdc240.hexun.com:8020','nameservice1');  
update SDS set LOCATION=REPLACE (LOCATION,'bdc240.hexun.com:8020','nameservice1');  

The commands actually run here:

update DBS set DB_LOCATION_URI=REPLACE (DB_LOCATION_URI,'hadoop002:9000','hadoop002:8020');
update SDS set LOCATION=REPLACE (LOCATION,'hadoop002:9000','hadoop002:8020');
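A quick sanity check, before and after running the updates, confirms how many metastore rows still reference the old port; this check is an extra suggestion, not part of the original steps:

-- Rows still pointing at the old port; both counts should drop to 0 after the updates.
SELECT COUNT(*) FROM DBS WHERE DB_LOCATION_URI LIKE '%hadoop002:9000%';
SELECT COUNT(*) FROM SDS WHERE LOCATION LIKE '%hadoop002:9000%';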

Result
Before the update:

mysql> select * from dbs;
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
| DB_ID | DESC                  | DB_LOCATION_URI                                  | NAME    | OWNER_NAME | OWNER_TYPE |
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
|     1 | Default Hive database | hdfs://hadoop002:9000/user/hive/warehouse        | default | public     | ROLE       |
|     6 | NULL                  | hdfs://hadoop002:9000/user/hive/warehouse/wxk.db | wxk     | root       | USER       |
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
mysql> select * from sds;
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+
| SD_ID | CD_ID | INPUT_FORMAT                             | IS_COMPRESSED | IS_STOREDASSUBDIRECTORIES | LOCATION                                              | NUM_BUCKETS | OUTPUT_FORMAT                                              | SERDE_ID |
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+
|    10 |    10 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/hive_external/emp               |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       10 |
|    13 |    13 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/aa_imported |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       13 |
|    14 |    14 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/emp         |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       14 |
|    16 |    16 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/wxk.db/aaa  |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       16 |
|    21 |    21 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/load_a      |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       21 |
|    22 |    22 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/load_a2     |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       22 |
|    26 |    26 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/a           |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       26 |
|    27 |    27 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/b           |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       27 |
|    28 |    28 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:9000/user/hive/warehouse/dept        |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       28 |
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+
9 rows in set (0.00 sec)

After the update:

mysql> select * from dbs;
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
| DB_ID | DESC                  | DB_LOCATION_URI                                  | NAME    | OWNER_NAME | OWNER_TYPE |
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
|     1 | Default Hive database | hdfs://hadoop002:8020/user/hive/warehouse        | default | public     | ROLE       |
|     6 | NULL                  | hdfs://hadoop002:8020/user/hive/warehouse/wxk.db | wxk     | root       | USER       |
+-------+-----------------------+--------------------------------------------------+---------+------------+------------+
mysql> select * from sds;
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+
| SD_ID | CD_ID | INPUT_FORMAT                             | IS_COMPRESSED | IS_STOREDASSUBDIRECTORIES | LOCATION                                              | NUM_BUCKETS | OUTPUT_FORMAT                                              | SERDE_ID |
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+
|    10 |    10 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/hive_external/emp               |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       10 |
|    13 |    13 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/aa_imported |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       13 |
|    14 |    14 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/emp         |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       14 |
|    16 |    16 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/wxk.db/aaa  |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       16 |
|    21 |    21 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/load_a      |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       21 |
|    22 |    22 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/load_a2     |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       22 |
|    26 |    26 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/a           |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       26 |
|    27 |    27 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/b           |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       27 |
|    28 |    28 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://hadoop002:8020/user/hive/warehouse/dept        |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |       28 |
+-------+-------+------------------------------------------+---------------+---------------------------+-------------------------------------------------------+-------------+------------------------------------------------------------+----------+

PS:
Check whether MySQL autocommit is enabled:

mysql> show variables like 'autocommit';
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| autocommit    | ON    |
+---------------+-------+
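Autocommit is ON here, so the two UPDATE statements take effect immediately. If it were OFF, an explicit commit would be required; a minimal sketch (an assumption, not part of the original session):

-- Only needed when autocommit is OFF.
SET autocommit = 0;
UPDATE DBS SET DB_LOCATION_URI = REPLACE(DB_LOCATION_URI, 'hadoop002:9000', 'hadoop002:8020');
UPDATE SDS SET LOCATION = REPLACE(LOCATION, 'hadoop002:9000', 'hadoop002:8020');
COMMIT;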

Then run the query again, and it works!

hive (default)> select ename ,hello(ename) from emp;
OK
SMITH   Hello: SMITH
ALLEN   Hello: ALLEN
WARD    Hello: WARD
JONES   Hello: JONES
MARTIN  Hello: MARTIN
BLAKE   Hello: BLAKE
CLARK   Hello: CLARK
SCOTT   Hello: SCOTT
KING    Hello: KING
TURNER  Hello: TURNER
ADAMS   Hello: ADAMS
JAMES   Hello: JAMES
FORD    Hello: FORD
MILLER  Hello: MILLER
HIVE    Hello: HIVE
Time taken: 2.403 seconds, Fetched: 15 row(s)
