In Importing data from RDBMS into Hive I blogged about how to import data from an RDBMS into Hive using Sqoop. In that case the import command took care of both creating the table in Hive based on the RDBMS table and importing the data into it. But Sqoop can also be used to load data already sitting in an HDFS text file into Hive. I wanted to try that out, so instead of letting the import command do everything, I first exported the contact table from MySQL into HDFS as a text file, then created a matching contact table in Hive, and finally loaded the file into it. First, the export into HDFS:
sqoop import --connect jdbc:mysql://macos/test --table contact -m 1
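As an aside, the one-step route from the earlier post can be sketched roughly like this; --hive-import is the Sqoop option that creates the Hive table and loads the data in a single run (this exact invocation is a sketch I have not re-tested here):

sqoop import --connect jdbc:mysql://macos/test --table contact -m 1 --hive-import

The rest of this post deliberately takes the longer, manual route so each step is visible.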
After the import is done I can see the content of the text file by executing hdfs dfs -cat contact/part-m-00000. Next I created the matching table in Hive using Sqoop's create-hive-table tool, which builds the Hive table definition from the RDBMS table's schema, like this
sqoop create-hive-table --connect jdbc:mysql://macos/test --table contact --fields-terminated-by ','
Finally I loaded the exported file from HDFS into the Hive table:
LOAD DATA INPATH 'contact' INTO TABLE contact;
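Once the load completes, a quick sanity check from the Hive shell confirms the rows arrived (a hypothetical check I'd run; the column layout shown by SELECT * depends on the actual contact schema):

SELECT COUNT(*) FROM contact;
SELECT * FROM contact LIMIT 5;

Note that LOAD DATA INPATH moves the file out of the 'contact' directory into Hive's warehouse location rather than copying it, so the original part-m-00000 file will no longer be at its old path afterwards.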