I'm using Celery with a Redis broker in a Django project, and I've tried about five times to get send_email to run as an asynchronous task for the user. My Celery worker is running and responds to the celery CLI, and it even receives the task from Django, but then I get the following error:
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
task.py -
from celery.decorators import task
from django.core.mail import EmailMessage
import time

@task(name="Sending_Emails")
def send_email(to_email, message):
    time1 = 1
    while time1 != 5:
        print("Sending Email")
        email = EmailMessage('Checking Asynchronous Task', message + str(time1), to=[to_email])
        email.send()
        time.sleep(1)
        time1 += 1
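As an aside, the loop in the task above runs only four times (time1 goes 1, 2, 3, 4 and the loop stops once it reaches 5), so at most four emails would be sent, not five. A minimal sketch of the counting issue and a range-based alternative, with the email call stubbed out:

```python
# Reproduce the original loop's iteration count without sending email.
count = 0
time1 = 1
while time1 != 5:       # body runs for time1 = 1, 2, 3, 4
    count += 1
    time1 += 1
print(count)            # 4 iterations, not the 5 the question expects

# A range-based loop makes "send five emails" explicit:
attempts = list(range(1, 6))   # [1, 2, 3, 4, 5]
print(len(attempts))           # 5
```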
views.py -
print("sending for Queue")
send_email.delay(request.user.email,"Email sent : ")
print("sent for Queue")
settings.py -
# CELERY STUFF
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/India'
celery.py -
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ECartApplication.settings')
app = Celery('ECartApplication')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
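For reference, the Celery documentation for Django integration suggests a slightly different celery.py, with a settings namespace so all Celery keys in settings.py share a CELERY_ prefix. A sketch only, not the asker's exact setup (assumes Celery 4+):

```python
# Sketch of the documented Django/Celery setup. With namespace='CELERY',
# Celery reads settings named CELERY_BROKER_URL, CELERY_RESULT_BACKEND,
# etc. from settings.py instead of BROKER_URL / CELERY_RESULT_BACKEND.
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ECartApplication.settings')

app = Celery('ECartApplication')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Without arguments, autodiscover_tasks() finds tasks.py in each
# installed app, so the settings import is no longer needed here.
app.autodiscover_tasks()
```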
I expected the email to be sent five times, but instead I get this error:
[tasks]
  . ECartApplication.celery.debug_task
  . Sending_Emails

[2019-05-19 12:41:27,695: INFO/SpawnPoolWorker-2] child process 3628 calling self.run()
[2019-05-19 12:41:27,696: INFO/SpawnPoolWorker-1] child process 5748 calling self.run()
[2019-05-19 12:41:28,560: INFO/MainProcess] Connected to redis://localhost:6379//
[2019-05-19 12:41:30,599: INFO/MainProcess] mingle: searching for neighbors
[2019-05-19 12:41:35,035: INFO/MainProcess] mingle: all alone
[2019-05-19 12:41:39,069: WARNING/MainProcess] c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2019-05-19 12:41:39,070: INFO/MainProcess] celery@vipin-PC ready.
[2019-05-19 12:41:46,448: INFO/MainProcess] Received task: Sending_Emails[db10dad4-a8ec-4ad2-98a6-60e8c3183dd1]
[2019-05-19 12:41:47,455: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
2 Answers

Answer 1 (vvppvyoh):
This is a known problem when running Celery under Python on Windows 7/10. There is a workaround: you just need to use the eventlet module, which you can install with pip:

pip install eventlet

After that, start the worker with -P eventlet at the end of the command:

celery -A MyWorker worker -l info -P eventlet
Answer 2 (pgccezyw):

The following command also works on Windows 11: