Import/Export
Overview
The EXPORT command exports the data of a table or partition, along with the metadata, to a specified output location. This output location can then be moved to a different Hadoop or Hive instance and imported from there with the IMPORT command.
When exporting a partitioned table, the original data may be located in different HDFS locations. Exporting and importing a subset of the partitions is also supported.
Exported metadata is stored in the target directory, and data files are stored in subdirectories.
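For example, exporting an unpartitioned table named department might produce a layout like the following (the directory names below are illustrative, not guaranteed by any particular Hive version):

    hdfs_exports_location/department/
        _metadata        -- serialized table metadata (schema, properties)
        data/            -- subdirectory holding the table's data files
            000000_0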
The EXPORT and IMPORT commands work independently of the source and target metastore DBMS; for example, they can be used between Derby and MySQL databases.
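A typical cross-cluster workflow combines EXPORT, a distributed copy, and IMPORT. A sketch, assuming hypothetical NameNode hostnames and export paths:

    # On the source cluster: export the table (metadata + data).
    hive -e "EXPORT TABLE department TO '/tmp/exports/department';"

    # Copy the export directory to the target cluster
    # (hostnames and ports here are illustrative).
    hadoop distcp hdfs://source-nn:8020/tmp/exports/department \
                  hdfs://target-nn:8020/tmp/exports/department

    # On the target cluster: import from the copied directory.
    hive -e "IMPORT FROM '/tmp/exports/department';"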
Export Syntax
EXPORT TABLE tablename [PARTITION (part_column="value"[, ...])]
  TO 'export_target_path'
Import Syntax
IMPORT [[EXTERNAL] TABLE new_or_original_tablename [PARTITION (part_column="value"[, ...])]]
  FROM 'source_path'
  [LOCATION 'import_target_path']
Examples
Simple export and import:
export table department to 'hdfs_exports_location/department';
import from 'hdfs_exports_location/department';
Rename table on import:
export table department to 'hdfs_exports_location/department';
import table imported_dept from 'hdfs_exports_location/department';
Export partition and import:
export table employee partition (emp_country="in", emp_state="ka") to 'hdfs_exports_location/employee';
import from 'hdfs_exports_location/employee';
Export table and import partition:
export table employee to 'hdfs_exports_location/employee';
import table employee partition (emp_country="us", emp_state="tn") from 'hdfs_exports_location/employee';
Specify the import location:
export table department to 'hdfs_exports_location/department';
import table department from 'hdfs_exports_location/department' location 'import_target_location/department';
Import as an external table:
export table department to 'hdfs_exports_location/department';
import external table department from 'hdfs_exports_location/department';