I tried running spark-shell in the macOS Terminal with `./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell`. It starts up, but I want to be able to launch it with just `spark-shell`.

I watched a 4-minute video showing how to do this, but it doesn't work for me. I don't fully understand `~/.bash_profile`, but here is what it looks like:
```
# added by Anaconda3 5.3.1 installer
# >>> conda init >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$(CONDA_REPORT_ERRORS=false '/Users/ajay/anaconda3/bin/conda' shell.bash hook 2> /dev/null)"
if [ $? -eq 0 ]; then
    \eval "$__conda_setup"
else
    if [ -f "/Users/ajay/anaconda3/etc/profile.d/conda.sh" ]; then
        . "/Users/ajay/anaconda3/etc/profile.d/conda.sh"
        CONDA_CHANGEPS1=false conda activate base
    else
        \export PATH="/Users/ajay/anaconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda init <<<

export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"
```
`echo $PATH` gives `/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin`
How do I need to change `~/.bash_profile` for `spark-shell` to work?
EDIT

This is what I get when running `./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell`:
```
20/08/27 16:51:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1598527288778).
Spark session available as 'spark'.
```
When I run `spark-shell`, it shows: `-bash: spark-shell: command not found`
1 Answer
These two lines are in the wrong order: you append the Spark installation to `$PATH`, and then the very next line overwrites `$PATH`, discarding the Spark entry. You probably want something like this:
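A minimal sketch of the corrected ordering, assuming the same paths as in the question: assign the base `PATH` first, then append Spark's `bin` directory so it survives.

```
# Set the base PATH first, so it does not clobber later additions
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"

# Then point SPARK_HOME at the Spark distribution and append its bin
# directory, so spark-shell stays on the PATH
export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
```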
Don't forget that changes to `.bash_profile`, `.profile`, or `.bashrc` only take effect in new shells (unless you load them manually).
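To pick up the change in the current session (assuming the edits went into `~/.bash_profile`), source the file and re-check:

```
# Reload the profile in the current shell
source ~/.bash_profile

# spark-shell should now resolve from $SPARK_HOME/bin
which spark-shell
```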