Inserting data into a Hive table

9rygscc1  posted on 2021-06-04  in Hadoop

I want to insert data into a Hive table. The steps are:
1) Create a database.
2) Create a table in that database.
3) Create a dummy table at a specific location.
4) Insert data into the main table from the dummy table.
When I run the insert, the process completes without any exception, but no data ends up in the table.

hive> create database final;

OK
Time taken: 2.56 seconds

hive> create table final.abc (user_name string, password string)
> ROW FORMAT DELIMITED   
> FIELDS TERMINATED BY ',' 
> LINES TERMINATED BY '\n'
> STORED AS TEXTFILE;

OK
Time taken: 0.591 seconds

hive> create table foo (user string , password string)          
> ROW FORMAT DELIMITED   
> FIELDS TERMINATED BY ',' 
> LINES TERMINATED BY '\n'
> STORED AS TEXTFILE
> Location '/usr/hive/hive-0.10.0/fiels';

OK
Time taken: 0.051 seconds

hive> insert into table final.abc select 'username','password' from foo;

Total MapReduce jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201306191046_0002, Tracking URL =/jobdetails.jsp?jobid=job_201306191046_0002
Kill Command = /usr/hadoop/hadoop-1.1.2/libexec/../bin/hadoop job  -kill job_201306191046_0002
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2013-06-19 12:04:36,870 Stage-1 map = 0%,  reduce = 0%
2013-06-19 12:04:37,878 Stage-1 map = 100%,  reduce = 100%
Ended Job = job_201306191046_0002
Ended Job = -331805541, job is filtered out (removed at runtime).
Ended Job = -1750065493, job is filtered out (removed at runtime).
Moving data to: hdfs://localhost:9000/tmp/hive-root/hive_2013-06-19_12-04-32_830_4819535129373917658/-ext-10000
Loading data to table final.abc
Table final.abc stats: [num_partitions: 0, num_files: 0, num_rows: 0, total_size: 0, raw_data_size: 0]

MapReduce Jobs Launched: 
Job 0:  HDFS Read: 0 HDFS Write: 0 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Time taken: 5.475 seconds

Please let me know if you have any ideas about where I am going wrong.


9udxz4iz1#

select 'username','password' from foo

The query above should return at least one row; only then will any data be inserted into the new table.
So upload some sample data to /usr/hive/hive-0.10.0/fiels so that your query works.
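
As a quick check (a minimal sketch using the table from the question), you can confirm whether foo currently has any rows at all:

hive> select count(*) from foo;

If this returns 0, the INSERT ... SELECT runs successfully but writes nothing, which matches the job output above (HDFS Read: 0, num_rows: 0).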
Create a sample file with data like this:

username,password
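
For example, assuming a local file named sample.txt (the name is just a placeholder), it could be created like this:

echo "username,password" > sample.txt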

Upload that file to the Hadoop cluster:

hadoop fs -put filename /usr/hive/hive-0.10.0/fiels
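
Once the file is in place, re-running the statements from the question should show data moving through (a sketch reusing the same tables):

hive> select * from foo;
hive> insert into table final.abc select 'username','password' from foo;
hive> select * from final.abc;

The insert writes one literal ('username', 'password') row into final.abc for every row found in foo.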
