PySpark won't run on Ubuntu, failing with OSError: [Errno 99]

Asked by ws51t4hk on 2021-05-31 in Hadoop

Whenever I try to run PySpark on my Ubuntu server, I get the error below. Hadoop is installed and all the environment variables are set. I can run a Jupyter notebook if I bind it to a different port.

Traceback (most recent call last):
  File "/usr/local/anaconda3/bin/jupyter-notebook", line 11, in <module>
    sys.exit(main())
  File "/usr/local/anaconda3/lib/python3.7/site-packages/jupyter_core/application.py", line 268, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/usr/local/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 663, in launch_instance
    app.initialize(argv)
  File "</usr/local/anaconda3/lib/python3.7/site-packages/decorator.py:decorator-gen-7>", line 2, in initialize
  File "/usr/local/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/usr/local/anaconda3/lib/python3.7/site-packages/notebook/notebookapp.py", line 1769, in initialize
    self.init_webapp()
  File "/usr/local/anaconda3/lib/python3.7/site-packages/notebook/notebookapp.py", line 1490, in init_webapp
    self.http_server.listen(port, self.ip)
  File "/usr/local/anaconda3/lib/python3.7/site-packages/tornado/tcpserver.py", line 151, in listen
    sockets = bind_sockets(port, address=address)
  File "/usr/local/anaconda3/lib/python3.7/site-packages/tornado/netutil.py", line 174, in bind_sockets
    sock.bind(sockaddr)
OSError: [Errno 99] Cannot assign requested address
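
For context, OSError [Errno 99] (EADDRNOTAVAIL) is raised when a process tries to bind a socket to an IP address that is not assigned to any network interface on the machine, which is what tornado's bind_sockets hits in the traceback above. A minimal sketch that reproduces the error and the usual workaround; the 203.0.113.5 address is hypothetical, chosen only because it is guaranteed not to be local:

import socket

# Binding to an address no local interface owns raises [Errno 99].
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    sock.bind(("203.0.113.5", 8888))  # hypothetical non-local address
except OSError as exc:
    print(exc)  # OSError: [Errno 99] Cannot assign requested address
finally:
    sock.close()

# Binding to all interfaces (or 127.0.0.1) succeeds when the port is free.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("0.0.0.0", 8888))
sock.close()

Since the traceback shows PySpark launching Jupyter as its driver, the poster's setup presumably uses something like PYSPARK_DRIVER_PYTHON=jupyter with PYSPARK_DRIVER_PYTHON_OPTS="notebook ..." (assumed, not stated in the post). In that case, passing --ip=0.0.0.0, or an address the server actually owns, in the notebook options typically avoids the bind failure.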

No answers yet.
