Environment variables for PySpark on CentOS

cngwdvgl · asked 2022-11-07 · Spark

I have installed Hadoop in cluster mode, and now I have installed Spark on top of it. I want to use pyspark. This is my .bashrc:


# User specific aliases and functions

export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:/opt/hadoop/spark/bin:/opt/hadoop/spark/sbin
export JAVA_HOME=/usr/java/jdk1.8.0_202-amd64

# These variables are added for Spark

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
export SPARK_HOME=/opt/hadoop/spark

# For pyspark

export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.9.3-src.zip:$PYTHONPATH
export PATH=$SPARK_HOME/python:$PATH
export PYSPARK_PYTHON=/usr/bin/python2.7
export PYSPARK_DRIVER_PYTHON=/usr/bin/python2.7

When I run the pyspark command, this is what happens:

[hadoop@nodo1 ~]$ pyspark
Python 2.7.5 (default, Nov 16 2020, 22:23:17) 
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/opt/hadoop/spark/python/pyspark/shell.py", line 29, in <module>
    from pyspark.context import SparkContext
  File "/opt/hadoop/spark/python/pyspark/__init__.py", line 53, in <module>
    from pyspark.rdd import RDD, RDDBarrier
  File "/opt/hadoop/spark/python/pyspark/rdd.py", line 34, in <module>
    from pyspark.java_gateway import local_connect_and_auth
  File "/opt/hadoop/spark/python/pyspark/java_gateway.py", line 31, in <module>
    from pyspark.find_spark_home import _find_spark_home
  File "/opt/hadoop/spark/python/pyspark/find_spark_home.py", line 68
    print("Could not find valid SPARK_HOME while searching {0}".format(paths), file=sys.stderr)
                                                                                   ^
SyntaxError: invalid syntax

I am using Hadoop 3.2.3, Spark 3.1.2, Python 2.7.5, and CentOS 7.
Where is the error?


ffx8fchx · answer #1

The problem is the Python version. The PySpark shell code uses Python 3 syntax (`print(..., file=sys.stderr)`), which a Python 2.7 interpreter cannot parse, and Spark 3.1 no longer supports Python 2.7. Installing Python 3 and keeping only the following environment variables fixed the problem:

export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:/opt/hadoop/spark/bin:/opt/hadoop/spark/sbin
export JAVA_HOME=/usr/java/jdk1.8.0_202-amd64
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
export SPARK_HOME=/opt/hadoop/spark
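
If you prefer to pin the interpreter explicitly rather than dropping the PYSPARK_* variables, a minimal sketch is shown below. It assumes Python 3 has been installed (for example via `yum install python3`) and is available at /usr/bin/python3; adjust the path to your actual installation.

# Optional: pin PySpark to the Python 3 interpreter
# (the /usr/bin/python3 path is an assumption; adjust to your install)
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3

# Verify which interpreter will be picked up, then relaunch the shell
python3 --version
pyspark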
