Error loading data into Hive hosted on Docker

Asked by jum4pzuy on 2021-05-27 in Hadoop

I am trying to load data into a Hive table running in a Docker container using the steps below, and I get the following error:

Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

Try the new cross-platform PowerShell https://aka.ms/pscore6

PS C:\Users\John Mekubo> cd desktop
PS C:\Users\John Mekubo\desktop> cd hive
PS C:\Users\John Mekubo\desktop\hive> cd docker-hive
PS C:\Users\John Mekubo\desktop\hive\docker-hive> docker-compose up -d
Creating network "docker-hive_default" with the default driver
Creating docker-hive_hive-metastore-postgresql_1 ... done
Creating docker-hive_namenode_1                  ... done
Creating docker-hive_datanode_1                  ... done
Creating docker-hive_presto-coordinator_1        ... done
Creating docker-hive_hive-server_1               ... done
Creating docker-hive_hive-metastore_1            ... done
PS C:\Users\John Mekubo\desktop\hive\docker-hive> docker container ls
CONTAINER ID        IMAGE                                                    COMMAND                  CREATED             STATUS                             PORTS                                            NAMES
cbd69ea93202        shawnzhu/prestodb:0.181                                  "./bin/launcher run"     45 seconds ago      Up 20 seconds                      8080/tcp, 0.0.0.0:8083->8083/tcp                 docker-hive_presto-coordinator_1
4dc6aa6f1ea2        bde2020/hive:2.3.2-postgresql-metastore                  "entrypoint.sh /opt/…"   45 seconds ago      Up 27 seconds                      10000/tcp, 0.0.0.0:9083->9083/tcp, 10002/tcp     docker-hive_hive-metastore_1
088d30586261        bde2020/hive:2.3.2-postgresql-metastore                  "entrypoint.sh /bin/…"   45 seconds ago      Up 28 seconds                      0.0.0.0:10000->10000/tcp, 10002/tcp              docker-hive_hive-server_1
25df8dba5dab        bde2020/hadoop-datanode:2.0.0-hadoop2.7.4-java8          "/entrypoint.sh /run…"   45 seconds ago      Up 24 seconds (health: starting)   0.0.0.0:50075->50075/tcp                         docker-hive_datanode_1
84a080977f21        bde2020/hive-metastore-postgresql:2.3.0                  "/docker-entrypoint.…"   45 seconds ago      Up 22 seconds                      5432/tcp                                         docker-hive_hive-metastore-postgresql_1
9d603abd33f0        bde2020/hadoop-namenode:2.0.0-hadoop2.7.4-java8          "/entrypoint.sh /run…"   45 seconds ago      Up 28 seconds (health: starting)   0.0.0.0:50070->50070/tcp                         docker-hive_namenode_1
34cd1ca07bec        bde2020/hadoop-namenode:2.0.0-hadoop3.2.1-java8          "/entrypoint.sh /run…"   4 minutes ago       Up 4 minutes (healthy)             0.0.0.0:9000->9000/tcp, 0.0.0.0:9870->9870/tcp   namenode
0f723195fd59        bde2020/hadoop-datanode:2.0.0-hadoop3.2.1-java8          "/entrypoint.sh /run…"   4 minutes ago       Up 4 minutes (healthy)             9864/tcp                                         datanode
962d67fd2a7a        bde2020/hadoop-resourcemanager:2.0.0-hadoop3.2.1-java8   "/entrypoint.sh /run…"   4 minutes ago       Up 2 minutes (healthy)             8088/tcp                                         resourcemanager
a7aabae79a39        bde2020/hadoop-historyserver:2.0.0-hadoop3.2.1-java8     "/entrypoint.sh /run…"   4 minutes ago       Up 4 minutes (healthy)             8188/tcp                                         historyserver
6a76b861dde6        bde2020/hadoop-nodemanager:2.0.0-hadoop3.2.1-java8       "/entrypoint.sh /run…"   4 minutes ago       Up 4 minutes (healthy)             8042/tcp                                         nodemanager
PS C:\Users\John Mekubo\desktop\hive\docker-hive> docker exec -it docker-hive_hive-server_1 /bin/bash
root@088d30586261:/opt# /opt/hive/bin/beeline -u jdbc:hive2://localhost:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000
20/11/17 07:06:02 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 2.3.2 by Apache Hive
beeline>
PS C:\Users\John Mekubo\desktop\hive\docker-hive> docker exec -it docker-hive_hive-server_1 /bin/bash
root@088d30586261:/opt# hadoop fs -ls /
Found 3 items
drwxr-xr-x   - root supergroup          0 2020-11-15 10:07 /home
drwxrwxr-x   - root supergroup          0 2020-11-13 05:48 /tmp
drwxr-xr-x   - root supergroup          0 2020-11-13 09:32 /user
root@088d30586261:/opt# hadoop fs -ls /user/root/regdata
Found 1 items
drwxr-xr-x   - root supergroup          0 2020-11-15 11:10 /user/root/regdata/payments
root@088d30586261:/opt# /opt/hive/bin/beeline -u jdbc:hive2://localhost:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000
Connected to: Apache Hive (version 2.3.2)
Driver: Hive JDBC (version 2.3.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.2 by Apache Hive
0: jdbc:hive2://localhost:10000> show databases;
+----------------+
| database_name  |
+----------------+
| default        |
+----------------+
1 row selected (3.947 seconds)
0: jdbc:hive2://localhost:10000> create database bigdata;
No rows affected (0.518 seconds)
0: jdbc:hive2://localhost:10000> use bigdata;
No rows affected (0.109 seconds)
0: jdbc:hive2://localhost:10000> CREATE EXTERNAL TABLE IF NOT EXISTS bigdata.payments(county string, sub_county string, firstname string, lastname string,ffa string,inua_mama string,uct string, facility string, amount_paid float)
. . . . . . . . . . . . . . . .> ROW FORMAT DELIMITED
. . . . . . . . . . . . . . . .> FIELDS TERMINATED BY ','
. . . . . . . . . . . . . . . .> STORED AS TEXTFILE
. . . . . . . . . . . . . . . .> LOCATION '/user/root/regdata/'
. . . . . . . . . . . . . . . .> ;
No rows affected (1.265 seconds)
0: jdbc:hive2://localhost:10000> show tables;
+-----------+
| tab_name  |
+-----------+
| payments  |
+-----------+
1 row selected (0.172 seconds)
0: jdbc:hive2://localhost:10000> LOAD DATA INPATH '/user/root/regdata/payments' INTO TABLE payments;
Error: Error while compiling statement: FAILED: SemanticException Line 1:17 Invalid path ''/user/root/regdata/payments'': No files matching path hdfs://namenode:8027/user/root/regdata/payments (state=42000,code=40000)
0: jdbc:hive2://localhost:10000>


I have even tried using an external table. The external table gets created, but when I select data from it, the query returns only the table's columns, as shown below:
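A minimal illustration of that behavior (the query below is assumed for the example, not taken from the original session, and uses the bigdata.payments table created in the transcript above):

```sql
-- Illustration only (assumed query, not from the original session).
-- With LOCATION '/user/root/regdata/' containing just the 'payments'
-- subdirectory and no files at the top level, Hive (unless configured to
-- read subdirectories recursively) finds nothing to scan:
SELECT * FROM bigdata.payments;
-- Beeline prints the column headers followed by "No rows selected".
```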

Answer 1 (by 6pp0gazn)

I see that '/user/root/regdata/payments' is itself a directory. Are you storing the files inside that directory? If so, try changing the CREATE EXTERNAL TABLE DDL so that it ends with something like: STORED AS TEXTFILE LOCATION '/user/root/regdata/payments'
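For illustration, here is a minimal sketch of what the adjusted DDL could look like, reusing the column list from the question; the table name payments_dir is hypothetical, chosen to avoid clashing with the existing payments table:

```sql
-- Sketch only: point the external table at the directory that actually
-- holds the CSV files, rather than at its parent.
-- 'payments_dir' is a hypothetical name used for this example.
CREATE EXTERNAL TABLE IF NOT EXISTS bigdata.payments_dir (
  county      STRING,
  sub_county  STRING,
  firstname   STRING,
  lastname    STRING,
  ffa         STRING,
  inua_mama   STRING,
  uct         STRING,
  facility    STRING,
  amount_paid FLOAT
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/root/regdata/payments';

-- With LOCATION pointing directly at the data directory, no LOAD DATA
-- step is needed; the files can be queried in place:
SELECT * FROM bigdata.payments_dir LIMIT 10;
```

Note that LOAD DATA INPATH moves the given HDFS path into the table's location, so once the external table points directly at the data directory, that step becomes unnecessary.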
