How do I connect from AWS Glue to Hive installed on an EC2 instance?

qxgroojn · asked 2021-06-27 · tagged Hive

I want to access a Hive metastore from a Spark job running on AWS Glue. To do that, the job needs to point at the Hive instance's IP address and reach it over the network. This works from my own machine, but not from AWS Glue.
I tried to reach Hive with the following code:

spark_session = (
    glueContext.spark_session
    .builder
    .appName('example-pyspark-read-and-write-from-hive')
    # Note: passing conf=SparkConf() together with a key/value pair makes
    # SparkSession.Builder.config() ignore the key/value, so the metastore
    # URI must be set directly:
    .config("hive.metastore.uris", "thrift://172.16.12.34:9083")
    .enableHiveSupport()
    .getOrCreate()
)

I have also gone through various docs, but none of them shows how to connect from Glue to an EC2 instance on a specific port.
The full job script is:
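As an alternative to setting the metastore URI in code, Glue jobs accept a `--conf` special job parameter that applies Spark configuration before the session is created. A sketch, reusing the metastore address from the question (substitute your own host and port):

```text
--conf spark.hadoop.hive.metastore.uris=thrift://172.16.12.34:9083
```

This is a job-parameter fragment, not code; it goes in the Glue job's "Job parameters" settings rather than in the script.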

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

"""
SparkSession ss = SparkSession
.builder()
.appName(" Hive example")
.config("hive.metastore.uris", "thrift://localhost:9083")
.enableHiveSupport()
.getOrCreate();
"""
args = getResolvedOptions(sys.argv, ['JOB_NAME'])
sc = SparkContext()
glueContext = GlueContext(sc)
spark_session = (
    glueContext.spark_session
    .builder
    .appName('example-pyspark-read-and-write-from-hive')
    # Set the metastore URI directly; adding conf=SparkConf() here would
    # cause the key/value pair to be ignored.
    .config("hive.metastore.uris", "thrift://172.16.12.34:9083")
    .enableHiveSupport()
    .getOrCreate()
)
job = Job(glueContext)
job.init(args['JOB_NAME'], args)
data = [('First', 1), ('Second', 2), ('Third', 3), ('Fourth', 4), ('Fifth', 5)]
df = spark_session.createDataFrame(data)
df.write.saveAsTable('example_2')
job.commit()

I expect the table to be written to Hive, but instead I get the following error from Glue:

An error occurred while calling o239.saveAsTable. No Route to Host from ip-172-31-14-64/172.31.14.64 to ip-172-31-15-11.ap-south-1.compute.internal:8020 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host;
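The `No Route to Host` in the stack trace targets port 8020 (the HDFS NameNode), which suggests the Glue job resolves the metastore configuration but cannot reach the cluster's network, typically a Glue connection/VPC or security-group issue rather than a Spark one. Before digging into Spark, it can help to verify raw TCP reachability from inside the job. A minimal sketch using only the standard library (the hosts and ports below are placeholders taken from the question):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refusals, timeouts, and "no route to host" alike.
        return False

# Example usage inside the Glue job (placeholders from the question):
# print(can_connect("172.16.12.34", 9083))  # Hive metastore thrift port
# print(can_connect("172.31.15.11", 8020))  # HDFS NameNode from the error
```

If both checks fail from Glue but succeed from your machine, the problem is network-level (Glue connection, subnet routing, or security-group rules), not the Spark configuration.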

No answers yet.