Connecting a local IPython notebook to Spark on an air-gapped cluster

nfg76nw0  asked on 2021-06-02  in  Hadoop
Follow (0) | Answers (1) | Views (495)

I have a W7 machine that I use for my daily work. My company also has an air-gapped Hadoop cluster on a private cloud. I can only reach the cluster through PuTTY. When I want to use Spark on the cluster, I start PuTTY and then do one of two things:
Launch pyspark from the shell
Use VNC to access the Red Hat GUI on the cluster and launch an IPython notebook in Spark mode from there
Is there a way to connect to Spark using my local W7 IPython notebook?
Edit, after some trial and error following Daniel Darabos's comment:
Following this tutorial, I installed Spark locally on my W7 machine. I then created a new pyspark profile and changed the startup files following this tutorial. At this point I can launch IPython locally and have it successfully create a Spark context. But when I run:

sc.stop()
conf = SparkConf().setAppName('SPark Test').setMaster('localhost:7077')
sc = SparkContext(conf=conf)

I get an error:

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<ipython-input-15-1e8f5b112924> in <module>()
      1 sc.stop()
      2 conf = SparkConf().setAppName('SPark Test').setMaster('localhost:7077')
----> 3 sc = SparkContext(conf=conf)

C:\Spark\python\pyspark\context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    111         try:
    112             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 113                           conf, jsc, profiler_cls)
    114         except:
    115             # If an error occurs, clean up in order to allow future SparkContext creation:

C:\Spark\python\pyspark\context.pyc in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
    168 
    169         # Create the Java SparkContext through Py4J
--> 170         self._jsc = jsc or self._initialize_context(self._conf._jconf)
    171 
    172         # Create a single Accumulator in Java that we'll send all our updates through;

C:\Spark\python\pyspark\context.pyc in _initialize_context(self, jconf)
    222         Initialize SparkContext in function to allow subclass specific initialization
    223         """
--> 224         return self._jvm.JavaSparkContext(jconf)
    225 
    226     @classmethod

C:\Spark\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py in __call__(self, *args)
    699         answer = self._gateway_client.send_command(command)
    700         return_value = get_return_value(answer, self._gateway_client, None,
--> 701                 self._fqn)
    702 
    703         for temp_arg in temp_args:

C:\Spark\python\pyspark\sql\utils.pyc in deco(*a,**kw)
     34     def deco(*a,**kw):
     35         try:
---> 36             return f(*a,**kw)
     37         except py4j.protocol.Py4JJavaError as e:
     38             s = e.java_exception.toString()

C:\Spark\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py in get_return_value(answer, gateway_client, target_id, name)
    298                 raise Py4JJavaError(
    299                     'An error occurred while calling {0}{1}{2}.\n'.
--> 300                     format(target_id, '.', name), value)
    301             else:
    302                 raise Py4JError(

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Could not parse Master URL: 'localhost:7077'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2693)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:506)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Unknown Source)
swvgeqrz1#

Use PuTTY to create an SSH tunnel forwarding a local port (e.g. 7077) to the Spark master (e.g. spark-master:7077). Then, in your local IPython notebook, use the local port (spark://localhost:7077) as the Spark master's address.
