In the Importing data from RDBMS into Hadoop using Sqoop post I blogged about how to import data from an RDBMS into Hive, but now I wanted to figure out how to export data from Hive back to an RDBMS. Sqoop has an export feature that lets you export data from a Hadoop directory (CSV files in a directory) to an RDBMS. To try it out, I first created a simple contact_hive table and populated it with some data, then used Sqoop to export the contents of contact_hive into a CONTACT table in MySQL. I followed these steps; if you already have a populated Hive table you can skip the first 5 steps and go to step 6.
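At a high level, sqoop export needs three pieces of information: a JDBC connection string for the target database, the name of the target table, and the HDFS directory that holds the delimited files. Here is a minimal sketch of the command shape; the host, database, table, and directory below are placeholders, not values from my setup:

sqoop export \
  --connect jdbc:mysql://<mysql-host>/<database> \
  --table <target-table> \
  --export-dir <hdfs-directory-containing-csv-files>

The rest of this post builds up a concrete version of this command step by step.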
First, I created a contacthive.csv file with the records that I wanted in the Hive table:

1,MahendraSingh,Dhoni,mahendra@bcci.com
2,Virat,Kohali,virat@bcci.com
5,Sachin,Tendulkar,sachin@bcci.com
Next, I uploaded contacthive.csv into the /tmp directory on HDFS:

hdfs dfs -put contacthive.csv /tmp
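If you want to confirm the upload worked, listing the file on HDFS (using the /tmp path from the previous command) should show it:

hdfs dfs -ls /tmp/contacthive.csv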
Then I created a contact_hive table in Hive, with columns that match the CSV file and a comma as the field delimiter:

CREATE TABLE contact_hive(contactId Int, firstName String, lastName String, email String) row format delimited fields terminated by "," stored as textfile;
After that I loaded the data from /tmp/contacthive.csv into the contact_hive table:

LOAD DATA INPATH "/tmp/contacthive.csv" OVERWRITE INTO TABLE contact_hive;
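As a quick sanity check (not required for the export), you can query the table from the Hive shell and make sure the three records are there:

SELECT * FROM contact_hive;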
Hive stores the data for this table under the /apps/hive/warehouse directory. You can verify that by executing the following command on HDFS:

hdfs dfs -ls /apps/hive/warehouse/contact_hive
Next, create the target CONTACT table in the MySQL test database, with columns that match the contact_hive table:

CREATE TABLE CONTACT ( contactid INTEGER NOT NULL, firstname VARCHAR(50), lastname VARCHAR(50), email VARCHAR(50));
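If you want to double-check the table definition before exporting, you can describe it from the MySQL command line client; the root user and local server here are assumptions about your setup:

mysql -u root -p test -e "DESCRIBE CONTACT;"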
Finally, run the sqoop export command, pointing it at the MySQL test database, the CONTACT table, and the HDFS directory that holds the contact_hive data:

sqoop export --connect jdbc:mysql://localhost/test --table CONTACT --export-dir /apps/hive/warehouse/contact_hive
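Once the export finishes, the rows from contact_hive should show up in MySQL; you can verify with a quick query (again assuming the local test database and root user):

mysql -u root -p test -e "SELECT * FROM CONTACT;"

Two things worth noting: Sqoop's export expects comma-separated input by default, which matches the delimiter the contact_hive table was created with, so no extra flag is needed here; if your Hive table uses a different delimiter you would add --input-fields-terminated-by with that character. Also, depending on how your MySQL server is secured, you may need to pass --username and --password to the sqoop export command as well.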