Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

gz5pxeao · posted on 2021-05-24 in Spark

When I try to run the one-liner provided by the nlu package:

import nlu
nlu.load('ner').predict('Angela Merkel from Germany and the American Donald Trump dont share many opinions')
---------------------------------------------------------------------------
Py4JError                                 Traceback (most recent call last)
<ipython-input-3-eb1d63b68bdb> in <module>
----> 1 nlu.load('ner').predict('Angela Merkel from Germany and the American Donald Trump dont share many opinions')

~\AppData\Roaming\Python\Python37\site-packages\nlu\__init__.py in load(request, verbose)
    119     '''
    120     gc.collect()
--> 121     spark = sparknlp.start()
    122     spark_started = True
    123     if verbose:

~\AppData\Roaming\Python\Python37\site-packages\sparknlp\__init__.py in start(gpu, spark23)
     61         builder.config("spark.jars.packages", maven_spark24)
     62 
---> 63     return builder.getOrCreate()
     64 
     65 

~\AppData\Roaming\Python\Python37\site-packages\pyspark\sql\session.py in getOrCreate(self)

~\AppData\Roaming\Python\Python37\site-packages\pyspark\context.py in getOrCreate(cls, conf)

~\AppData\Roaming\Python\Python37\site-packages\pyspark\context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)

~\AppData\Roaming\Python\Python37\site-packages\pyspark\context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)

~\Anaconda3\envs\tf2\lib\site-packages\py4j\java_gateway.py in __getattr__(self, name)
   1485                 answer, self._gateway_client, self._fqn, "__dir__")
   1486             self._statics = return_value.split("\n")
-> 1487         return self._statics[:]
   1488 
   1489     @property

Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

I have already installed pyspark and py4j in my environment:

Successfully installed py4j-0.10.9 pyspark-3.0.1
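
For what it's worth, the traceback shows sparknlp.start() requesting builder.config("spark.jars.packages", maven_spark24), i.e. Spark NLP jars built for the Spark 2.4 series, while pyspark 3.0.1 is installed. This Py4JError usually means the Python side is calling a JVM method the running Spark does not expose, which fits such a version mismatch. Below is a minimal sketch to check for it; the "2.4" target is an assumption read off the maven_spark24 name in the traceback, not a confirmed requirement:

# Minimal sketch: check whether the installed pyspark matches the Spark
# series the spark-nlp jars appear to target. The "2.4" value is an
# assumption taken from the `maven_spark24` config name in the traceback.
import pyspark

expected_series = "2.4"  # assumed from maven_spark24
print("installed pyspark:", pyspark.__version__)
if not pyspark.__version__.startswith(expected_series):
    print("version mismatch: try `pip install pyspark==2.4.7`, "
          "or a spark-nlp/nlu release built for pyspark", pyspark.__version__)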

How can I solve this problem?
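
A workaround often suggested for this class of error is to let findspark wire up a local Spark distribution before importing nlu. This is only a sketch, not a confirmed fix: it assumes the findspark package is installed and that SPARK_HOME points at a Spark build whose version matches the installed pyspark.

# Sketch of a commonly suggested workaround; assumes findspark is
# installed and SPARK_HOME points to a Spark build matching pyspark.
import findspark
findspark.init()  # puts the SPARK_HOME pyspark on sys.path first

import nlu
nlu.load('ner').predict('Angela Merkel from Germany and the American Donald Trump dont share many opinions')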

No answers yet!

