Running Spark/Python from a Jupyter notebook

wfauudbj posted on 2021-07-09 in Spark

I have created a shell script to access PySpark from a Jupyter notebook. When I run the script, I get the error below.

sudo /home/scripts/jupyspark.sh test.py 

**/home/scripts/jupyspark.sh: line 6: /bin/pyspark: No such file or directory**
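The `/bin/pyspark` path in the message suggests that `${SPARK_HOME}` expanded to an empty string when line 6 of the script ran. Here is a minimal check I would use to confirm that (a sketch; as far as I know, `sudo` resets the environment by default on most systems, so exports from `~/.bash_profile` may never reach the script):

```bash
# Print the variable in the interactive shell, then in the environment
# that sudo hands to the command it runs. If the second line prints
# nothing from env, sudo dropped SPARK_HOME.
echo "interactive: SPARK_HOME=${SPARK_HOME}"
sudo env | grep SPARK_HOME || echo "sudo: SPARK_HOME is not set"
```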

Here is my jupyspark script:


#!/bin/bash

export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --NotebookApp.open_browser=True --NotebookApp.ip='localhost' --NotebookApp.port=8888"

# Note: pyspark/spark-submit keeps only the last --packages flag, so all
# coordinates must be passed as a single comma-separated list.
${SPARK_HOME}/bin/pyspark \
  --master local[4] \
  --executor-memory 1G \
  --driver-memory 1G \
  --conf spark.sql.warehouse.dir="file:///tmp/spark-warehouse" \
  --packages com.databricks:spark-csv_2.11:1.5.0,com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.7.3
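
One defensive tweak (a sketch, not part of my original script) would be to fail fast at the top of the script when `SPARK_HOME` is empty, so the failure is explicit instead of surfacing as the misleading `/bin/pyspark` path:

```bash
#!/bin/bash
# Abort early if SPARK_HOME is unset or empty, instead of letting
# ${SPARK_HOME}/bin/pyspark silently degrade to /bin/pyspark.
if [ -z "${SPARK_HOME}" ]; then
    echo "SPARK_HOME is not set; export it or preserve the environment" >&2
    exit 1
fi
```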

I have also done the following steps:

cat ~/.bash_profile 
export SPARK_HOME=/usr/local/spark
export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
export HADOOP_HOME=/usr/local/hadoop
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH
export AWS_ACCESS_KEY_ID='MY_ACCESS_KEY'
export AWS_SECRET_ACCESS_KEY='MY_SECRET_ACCESS_KEY'
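
Since `~/.bash_profile` is only sourced by my own login shells and `sudo` starts with a scrubbed environment by default, I assume these variables may not survive the `sudo` call. Two hedged ways to carry them through (a sketch; whether `-E` is permitted depends on the sudoers policy):

```bash
# Option 1: ask sudo to preserve the caller's environment
sudo -E /home/scripts/jupyspark.sh test.py

# Option 2: set the variable explicitly on the sudo command line
sudo SPARK_HOME=/usr/local/spark /home/scripts/jupyspark.sh test.py
```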

Do you have any idea how to solve this problem?
