Py4JJavaError when training an ALS model in PySpark

shyt4zoc · published 2021-05-27 in Spark

I'm having trouble reproducing DataCamp course code on my local machine. The course is called "Recommendation Engines in PySpark". I've tried the same steps on both my local machine and Google Colab and hit this error every time, so I'm probably missing something very basic. If anyone knows how to train an ALS model with PySpark locally or on Google Colab, please help.

ratings = ratings.select(ratings.user_id.cast("integer"), ratings.movie_id.cast("integer"), ratings.rating.cast("double"))

ratings.printSchema()

(train, test) = ratings.randomSplit([0.8, 0.2], seed = 1234)

from pyspark.ml.recommendation import ALS

als = ALS(userCol="user_id", itemCol="movie_id", ratingCol="rating",
          rank=25, maxIter=100, regParam=0.5, nonnegative=True,
          coldStartStrategy="drop", implicitPrefs=False)

model = als.fit(train)

Output:

Py4JJavaError                             Traceback (most recent call last)
<ipython-input-5-c31a339ff67a> in <module>
     11 als = ALS(userCol="user_id", itemCol="movie_id", ratingCol="rating",rank =25, maxIter = 100, regParam = 0.5, nonnegative = True,coldStartStrategy= "drop", implicitPrefs = False)
     12 
---> 13 model=als.fit(train)

~/venv/lib/python3.6/site-packages/pyspark/ml/base.py in fit(self, dataset, params)
    127                 return self.copy(params)._fit(dataset)
    128             else:
--> 129                 return self._fit(dataset)
    130         else:
    131             raise ValueError("Params must be either a param map or a list/tuple of param maps, "

~/venv/lib/python3.6/site-packages/pyspark/ml/wrapper.py in _fit(self, dataset)
    319 
    320     def _fit(self, dataset):
--> 321         java_model = self._fit_java(dataset)
    322         model = self._create_model(java_model)
    323         return self._copyValues(model)

~/venv/lib/python3.6/site-packages/pyspark/ml/wrapper.py in _fit_java(self, dataset)
    316         """
    317         self._transfer_params_to_java()
--> 318         return self._java_obj.fit(dataset._jdf)
    319 
    320     def _fit(self, dataset):

~/venv/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1303         answer = self.gateway_client.send_command(command)
   1304         return_value = get_return_value(
-> 1305             answer, self.gateway_client, self.target_id, self.name)
   1306 
   1307         for temp_arg in temp_args:

~/venv/lib/python3.6/site-packages/pyspark/sql/utils.py in deco(*a,**kw)
    129     def deco(*a,**kw):
    130         try:
--> 131             return f(*a,**kw)
    132         except py4j.protocol.Py4JJavaError as e:
    133             converted = convert_exception(e.java_exception)

~/venv/lib/python3.6/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
--> 328                     format(target_id, ".", name), value)
    329             else:
    330                 raise Py4JError(

Py4JJavaError: An error occurred while calling o59.fit.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 62.0 failed 1 times, most recent failure: Lost task 0.0 in stage 62.0 (TID 544, 192.168.101.225, executor driver): java.lang.StackOverflowError
------------------------------------------------    
----------------------------------------
Exception happened during processing of request from ('127.0.0.1', 54328)
Traceback (most recent call last):
  File "/usr/lib/python3.6/socketserver.py", line 320, in _handle_request_noblock
    self.process_request(request, client_address)
  File "/usr/lib/python3.6/socketserver.py", line 351, in process_request
    self.finish_request(request, client_address)
  File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "/usr/lib/python3.6/socketserver.py", line 724, in __init__
    self.handle()
  File "/home/abhilash/venv/lib/python3.6/site-packages/pyspark/accumulators.py", line 268, in handle
    poll(accum_updates)
  File "/home/abhilash/venv/lib/python3.6/site-packages/pyspark/accumulators.py", line 241, in poll
    if func():
  File "/home/abhilash/venv/lib/python3.6/site-packages/pyspark/accumulators.py", line 245, in accum_updates
    num_updates = read_int(self.rfile)
  File "/home/abhilash/venv/lib/python3.6/site-packages/pyspark/serializers.py", line 595, in read_int
    raise EOFError
EOFError
---------------------------------------

No answers yet.