apache spark - Error when starting pyspark on Windows

gg58donl · posted 2021-05-29 in Hadoop

I am trying to use MLlib with Python on Windows, which requires Spark, which in turn requires Hadoop. I have Anaconda2 installed, which includes Python 2.7, NumPy, and so on.
I have been following this recipe, which seems to get me most of the way there, but I think I am stuck on one final error:

Python 2.7.13 |Anaconda 4.3.1 (64-bit)| (default, Dec 19 2016, 13:29:36) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "C:\spark\bin\..\python\pyspark\shell.py", line 43, in <module>
    spark = SparkSession.builder\
  File "C:\spark\python\pyspark\sql\session.py", line 179, in getOrCreate
    session._jsparkSession.sessionState().conf().setConfString(key, value)
  File "C:\spark\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 1133, in __call__
  File "C:\spark\python\pyspark\sql\utils.py", line 79, in deco
    raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"

As this output shows, there is no error about winutils.exe not being found.
Also, the exception originates on the Java side via py4j, but because of the IllegalArgumentException wrapping we have lost the underlying traceback.
Thanks for any guidance!
Cheers
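A side note on why the message is so bare: the `deco` wrapper in `pyspark/sql/utils.py` (line 79 of the traceback above) splits the Java-side error string at the first `': '` and keeps only the tail, so any nested cause before or after that point is dropped. A minimal illustration of that split, using a hypothetical Java error string of the kind py4j returns:

```python
# Hypothetical Java-side error string; pyspark's deco() keeps only
# the part after the first ': ', which is why the nested cause of
# the HiveSessionState failure never reaches the Python traceback.
s = ("java.lang.IllegalArgumentException: "
     "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':")

# Same split as in pyspark/sql/utils.py: split once at the first ': '.
message = s.split(': ', 1)[1]
print(message)
# -> Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
```

The real root cause (often a permissions problem that would otherwise name winutils.exe or /tmp/hive) therefore only shows up in the JVM-side logs, not in this Python exception.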

No answers yet.
