I'm really stuck on this. I'm trying to run the SparkSubmitOperator in Airflow in local mode. Although the required folders have read and execute permission for all users, I get the error below. Please help.
[2020-12-08 11:00:54,456] {base_hook.py:89} INFO - Using connection to: id: spark_local. Host: local[*], Port: None, Schema: None, Login: None, Password: None, extra: XXXXXXXX
[2020-12-08 11:00:54,458] {spark_submit_hook.py:325} INFO - Spark-Submit cmd: /opt/spark/bin/ --master local[*] --name airflow-spark --queue root.default --deploy-mode client /home/ubuntu/market_risk/utils/etl.py
[2020-12-08 11:00:54,463] {taskinstance.py:1150} ERROR - [Errno 13] Permission denied: '/opt/spark/bin/'
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/airflow/contrib/operators/spark_submit_operator.py", line 187, in execute
    self._hook.submit(self._application)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/airflow/contrib/hooks/spark_submit_hook.py", line 395, in submit
    **kwargs)
  File "/usr/lib/python3.6/subprocess.py", line 729, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.6/subprocess.py", line 1364, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
PermissionError: [Errno 13] Permission denied: '/opt/spark/bin/'
My Spark connection in Airflow is set up as follows:
In the Airflow UI, I have not set any environment variables.
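For reference, the task that produces the log above is defined roughly along these lines. This is a minimal sketch only: the DAG id, schedule, and default_args are assumptions, while conn_id, the application path, and the job name are taken from the log.

# Minimal sketch of the failing task (Airflow 1.10.x, matching the contrib
# import path in the traceback). DAG id and schedule are assumed; conn_id,
# application path, and name come from the log above.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

default_args = {
    "owner": "airflow",
    "start_date": datetime(2020, 12, 1),
}

with DAG(
    dag_id="market_risk_etl",      # assumed DAG id
    default_args=default_args,
    schedule_interval=None,
) as dag:
    submit_etl = SparkSubmitOperator(
        task_id="spark_submit_etl",
        conn_id="spark_local",      # connection id shown in the log
        application="/home/ubuntu/market_risk/utils/etl.py",
        name="airflow-spark",       # --name shown in the Spark-Submit cmd
    )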