
The Sqoop in Hadoop story: processing structured data


Sqoop in the Hadoop ecosystem
Why you need Sqoop while working on Hadoop

Sqoop's primary purpose is to import data from structured data sources such as Oracle or DB2 into HDFS (the Hadoop Distributed File System).
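As a sketch of what such an import looks like, here is a typical Sqoop invocation pulling one Oracle table into HDFS. The JDBC URL, username, table name, and target directory below are placeholder assumptions, not details from this post:

```shell
# Import the EMPLOYEES table from an Oracle database into HDFS.
# Connection string, user, and table are hypothetical examples.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost.example.com:1521/ORCL \
  --username etl_user \
  -P \
  --table EMPLOYEES \
  --target-dir /user/hadoop/employees \
  --num-mappers 4
```

`-P` prompts for the password interactively instead of exposing it on the command line, and `--num-mappers` controls how many parallel map tasks Sqoop uses for the transfer.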

For our readers, I have included a good video from Edureka that helps you understand the functionality of Sqoop.

A comparison between Sqoop and Flume


The word Sqoop comes from SQL + Hadoop

The name Sqoop comes from SQL + Hadoop. Sqoop is a data transfer tool: its main use is to import and export large amounts of data between an RDBMS and HDFS.
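In the export direction, moving HDFS data back into a relational table might look like the following. The MySQL URL, credentials, table, and directory are illustrative assumptions:

```shell
# Export delimited files under an HDFS directory into an
# existing MySQL table (URL, user, and table are hypothetical).
sqoop export \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user \
  -P \
  --table daily_totals \
  --export-dir /user/hadoop/daily_totals \
  --input-fields-terminated-by ','
```

Note that `sqoop export` requires the target table to exist already; by default it inserts the exported rows into that table.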

List of basic Sqoop commands
  • codegen - Generates code to interact with database records.
  • create-hive-table - Imports a table definition into Hive.
  • eval - Evaluates a SQL statement and displays the results.
  • export - Exports an HDFS directory into a database table.
  • help - Lists the available commands.
  • import - Imports a table from a database into HDFS.
  • import-all-tables - Imports all tables from a database into HDFS.
  • list-databases - Lists the available databases on a server.
  • list-tables - Lists the tables in a database.
  • version - Displays version information.
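To make the list above concrete, here are a few of these subcommands in use. The server address, database name, and query are placeholders, not values from the original post:

```shell
# List the databases visible on a MySQL server (placeholder URL).
sqoop list-databases \
  --connect jdbc:mysql://dbhost.example.com:3306 \
  --username etl_user -P

# List the tables in one of those databases.
sqoop list-tables \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user -P

# Run an ad hoc SQL statement on the database and print the result.
sqoop eval \
  --connect jdbc:mysql://dbhost.example.com:3306/sales \
  --username etl_user -P \
  --query "SELECT COUNT(*) FROM daily_totals"
```

`list-databases`, `list-tables`, and `eval` are handy for verifying connectivity and inspecting the source schema before running a full `import`.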


This post first appeared on ApplyBigAnalytics: Data Articles, Trainings, Certif; please read the original post here.
