in Big Data | Hadoop
External Table

An external table differs from a managed table: Apache Hive does not manage its data and does not move it into the warehouse directory. Instead, the table points at the HDFS location specified when the table is created.

Step 1: Creating a directory

hadoopusr$ hdfs dfs -mkdir /HiveData

Step 2: Load data in HDFS

hadoopusr$ hdfs dfs -put /home/arani/Desktop/student4.csv /HiveData

Step 3: Create an external table.

hive> create external table student4(name string,rollno int,dept string)
    > row format delimited 
    > fields terminated by ','
    > location '/HiveData';
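The row format delimited fields terminated by ',' clause tells Hive how to split each line of the file into the declared columns. As a rough illustration (the sample rows below are made up for this sketch, not taken from the actual student4.csv), the splitting Hive performs is equivalent to:

```python
# Sketch of how Hive's delimited row format splits lines into columns.
# The sample data is hypothetical; the schema matches the CREATE TABLE
# statement above: name string, rollno int, dept string.
sample = "Arani,1,CSE\nTate,2,ECE\n"

rows = []
for line in sample.splitlines():
    name, rollno, dept = line.split(",")  # fields terminated by ','
    rows.append((name, int(rollno), dept))  # rollno declared as int

print(rows)  # [('Arani', 1, 'CSE'), ('Tate', 2, 'ECE')]
```

Note that this is a plain split on the delimiter, which is also why a comma inside a field value would break the column alignment in Hive.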

Step 4: Create a table and load data from HDFS.

hive> create table student1(name string, rollno int, dept string)
    > row format delimited 
    > fields terminated by ',';
hive> load data inpath '/HiveData/student4.csv' into table student1;

(Because the LOCAL keyword is not used in this query, Hive looks for the file in HDFS rather than on the local filesystem. Also note that LOAD DATA INPATH moves the file from /HiveData into student1's warehouse directory, so it will no longer be visible to the external table student4.)
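The practical difference between the two table types shows up when they are dropped. A sketch, assuming the tables above still exist:

hive> drop table student1;   -- managed table: metadata AND the data files
                             -- in its warehouse directory are deleted
hive> drop table student4;   -- external table: only the metadata is deleted;
                             -- the files under /HiveData are left in HDFS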
