Java Virtual Machine error

3htmauhk · posted 2021-05-29 · in Hadoop
Follow (0) | Answers (0) | Views (470)

I upgraded Apache Spark from 2.1.1 to 2.2.1 (making no other changes), but now I can no longer run spark-submit or the pyspark shell without errors.
If I try to run the pi-calculation example from the command line with ./bin/run-example SparkPi 10, I get the following error:

Exception in thread "main" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

Alternatively, if I try to start the shell by running ./bin/pyspark, I get:

Exception in thread "Thread-2" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

...

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
Exception in thread "Thread-16" java.lang.VerifyError: class org.apache.spark.sql.execution.LogicalRDD overrides final method sameResult.(Lorg/apache/spark/sql/catalyst/plans/QueryPlan;)Z

...

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/shell.py", line 47, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/usr/local/spark/python/pyspark/sql/session.py", line 177, in getOrCreate
    session = SparkSession(sc)
  File "/usr/local/spark/python/pyspark/sql/session.py", line 211, in __init__
    jsparkSession = self._jvm.SparkSession(self._jsc.sc())
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1532, in __getattr__
py4j.protocol.Py4JError: SparkSession does not exist in the JVM

How can I fix this?
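For what it's worth, a VerifyError on a final method like sameResult usually means classes from two different Spark releases ended up on the same classpath (e.g. stale 2.1.1 jars left next to the new 2.2.1 ones). Below is a minimal diagnostic sketch, not part of Spark itself: the helper name and the jar-name pattern are my own assumptions, and the directory path would be your own $SPARK_HOME/jars.

```python
import re

def detect_mixed_spark_versions(jar_names):
    """Return the set of distinct Spark versions found among spark-*.jar names.

    If this set contains more than one version, jars from two Spark
    releases are mixed on the classpath, which would explain the
    VerifyError above. The filename pattern assumed here is the usual
    spark-<module>_<scala-version>-<spark-version>.jar convention.
    """
    versions = set()
    for name in jar_names:
        m = re.match(r"spark-[a-z-]+_[\d.]+-(\d+\.\d+\.\d+)\.jar", name)
        if m:
            versions.add(m.group(1))
    return versions

# Hypothetical usage against a real install:
#   import os
#   detect_mixed_spark_versions(os.listdir("/usr/local/spark/jars"))
print(detect_mixed_spark_versions(
    ["spark-core_2.11-2.2.1.jar",
     "spark-catalyst_2.11-2.1.1.jar",
     "py4j-0.10.4.jar"]))
# A result with two versions, e.g. {'2.2.1', '2.1.1'}, would confirm the mix.
```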

No answers yet.
