Sqoop: importing data as ORC files (and exporting ORC data to MySQL)

Is there any option in Sqoop to import data from an RDBMS and store it in HDFS in the ORC file format?

Alternative tried: imported the data in text format, then used a Hive temp table to read the text input and write it back to HDFS as ORC.
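For reference, that workaround can be sketched in HiveQL. The table names, columns, delimiter, and HDFS path below are all hypothetical placeholders, not taken from the question:

```sql
-- Hypothetical staging table over Sqoop's text-format import output.
CREATE EXTERNAL TABLE fact_text (
  id INT,
  amount DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/foobar/fact_text';

-- Target table stored as ORC with Snappy compression.
CREATE TABLE fact_orc (
  id INT,
  amount DOUBLE
)
STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY");

-- Rewrite the text data as ORC.
INSERT OVERWRITE TABLE fact_orc SELECT * FROM fact_text;
```

This works, but it needs an extra Hive step and twice the HDFS writes, which is what the HCatalog-based answer below avoids.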

Solution

At least as of Sqoop 1.4.5, there is HCatalog integration that supports the ORC file format (among others).

For example, you can use the option

--hcatalog-storage-stanza

which can be set to

stored as orc tblproperties ("orc.compress"="SNAPPY")

Example:

sqoop import \
  --connect jdbc:postgresql://foobar:5432/my_db \
  --driver org.postgresql.Driver \
  --connection-manager org.apache.sqoop.manager.GenericJdbcManager \
  --username foo \
  --password-file hdfs:///user/foobar/foo.txt \
  --table fact \
  --hcatalog-home /usr/hdp/current/hive-webhcat \
  --hcatalog-database my_hcat_db \
  --hcatalog-table fact \
  --create-hcatalog-table \
  --hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")'
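Once the import finishes, you can sanity-check that the files written really are ORC. The warehouse path below is an assumption; substitute your own Hive warehouse directory and part-file name:

```shell
# Hypothetical warehouse location; adjust to your cluster's layout.
hdfs dfs -ls /apps/hive/warehouse/my_hcat_db.db/fact

# Hive ships an ORC file dump utility that prints the file's
# metadata: type structure, stripe statistics, and compression.
hive --orcfiledump /apps/hive/warehouse/my_hcat_db.db/fact/part-m-00000
```

If the dump shows `Compression: SNAPPY` and your table's column types, the storage stanza was applied as intended.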
