Implementing asynchronous and scheduled tasks with django-celery

Dependencies and environment

python 3.6
django 1.11.23
celery==3.1.25  asynchronous tasks
django-celery==3.2.2  periodic task management package
redis==2.10.6
django-redis-cache==1.7.1  for convenient Redis cache configuration

Install the dependencies above with pip; the details are not repeated here.
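For reference, the pinned install can be done in one pip command; a minimal sketch, assuming a Python 3.6 virtualenv is already active:

pip install "Django==1.11.23" "celery==3.1.25" "django-celery==3.2.2" \
    "redis==2.10.6" "django-redis-cache==1.7.1"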

Configuration

In the Django project's configuration package (the directory that contains settings.py), add the following to __init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

In the same configuration package, create celery.py as follows:

#! /usr/bin/python3
# -*- coding:utf-8 -*-
# file: celery.py
# author: wangchenxi
# mail: wongchenxi@icloud.com
# brief:
# version: 0.1.00
# Create Time:2019-12-08 22:21:00
# Last Update: 2019-12-10 01:26:48 AM
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery, platforms
from django.conf import settings
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'wangchenxi_top.settings')

app = Celery('wangchenxi_top')
# imports = ('api.tasks', 'blog.tasks')
platforms.C_FORCE_ROOT = True
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes. With django-celery,
# the celery settings live directly in Django's settings module,
# so no namespace/prefix argument is passed here.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

Still in the same configuration package, add the celery-related configuration to settings.py:

import djcelery
# ...
INSTALLED_APPS = [
    # ...
    'djcelery',
    # ....
]
TIME_ZONE = 'Asia/Shanghai'
USE_I18N = True
USE_L10N = True
USE_TZ = False

djcelery.setup_loader()
BROKER_URL = 'redis://127.0.0.1:6379'
CELERYD_CONCURRENCY = 10  # worker concurrency
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
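If you do not need to edit the schedule at runtime, celery 3.x can also read a static CELERYBEAT_SCHEDULE dict from settings.py instead of relying on the database scheduler. A minimal sketch; the entry name and crontab time are assumptions, and 'api.tasks.task_hello_world' refers to the example task defined in the next section:

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Hypothetical entry: run the example task every morning at 07:30.
    'hello-world-every-morning': {
        'task': 'api.tasks.task_hello_world',
        'schedule': crontab(hour=7, minute=30),
        'args': (),
    },
}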

Writing tasks

Create a tasks.py under the project app and import it in that app's views.py (import it even if you never call it there, otherwise celery cannot discover the project's tasks); only then are the task functions auto-discovered and registered at startup. For example, in an api app I wrote, tasks.py looks like this:

from celery import shared_task


@shared_task
def task_hello_world(*args):
    '''
    Example task: simply returns a greeting.
    '''
    return 'hello,world'
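Tasks can also be declared with bind=True so they can retry themselves on transient failures. This is a hedged sketch, not part of the original project; task_flaky_hello and its simulated failure are made up for illustration:

import random

from celery import shared_task


@shared_task(bind=True, max_retries=3, default_retry_delay=10)
def task_flaky_hello(self):
    '''
    Hypothetical task that retries itself when it hits a transient error.
    '''
    try:
        if random.random() < 0.5:
            raise RuntimeError('simulated transient failure')
        return 'hello,world'
    except RuntimeError as exc:
        # Re-queue the task; celery gives up after max_retries attempts.
        raise self.retry(exc=exc)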

Then import the task in the app's views.py, like this:

from api.tasks import task_hello_world
# Once imported, the task can be called directly to start an asynchronous job,
# or be triggered through the configured scheduler
# (CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler').
# If a task is not imported in views.py it counts as unregistered and
# cannot be triggered through CELERYBEAT_SCHEDULER.

To trigger the task directly from views.py:

from api.tasks import task_hello_world


def every_where(*args):
    '''
    Trigger the task asynchronously from view code.
    '''
    result = task_hello_world.delay(*args)
    # Keep result.id if you need to fetch the return value later.
    return result.id
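The recorded id can later be used to poll for and fetch the result. A minimal sketch; fetch_hello_world_result is a hypothetical helper and assumes the id returned above was stored somewhere (session, model field, etc.):

from celery.result import AsyncResult


def fetch_hello_world_result(task_id):
    '''
    Hypothetical helper: look up an earlier task by its id.
    '''
    result = AsyncResult(task_id)
    if result.ready():
        # get() returns 'hello,world' once the worker has finished the task.
        return result.get(timeout=5)
    return None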

Preparation and startup

After installing the dependencies and finishing the configuration, run:

python manage.py migrate
# django-celery needs this migration to create the database tables
# used by djcelery.schedulers.DatabaseScheduler

# Then start the celery worker together with beat
python manage.py celery multi start worker --loglevel=info --beat
# Without beat, periodic tasks will not be executed
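During development it can be easier to run the worker and beat as two foreground processes instead of the daemonized multi variant; splitting them this way is my own habit, not part of the setup above:

# terminal 1: the worker
python manage.py celery worker --loglevel=info

# terminal 2: the beat scheduler (uses the DatabaseScheduler configured above)
python manage.py celery beat --loglevel=info

# stop workers that were started via `multi`
python manage.py celery multi stop worker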

Other notes

On other questions, such as how to configure and invoke tasks through the periodic-task tables, one thing is worth stating up front: every time you adjust or add a periodic task, celery has to be restarted so it reloads the schedule.
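For illustration only, a periodic-task row can be created through django-celery's own models, e.g. from python manage.py shell; this is a hedged sketch (the schedule values and entry name are assumptions), and the restart note above still applies after any change:

from djcelery.models import IntervalSchedule, PeriodicTask

# Run api.tasks.task_hello_world every 60 seconds.
schedule, _ = IntervalSchedule.objects.get_or_create(every=60, period='seconds')
PeriodicTask.objects.get_or_create(
    name='hello-world-every-minute',
    task='api.tasks.task_hello_world',
    interval=schedule,
    enabled=True,
)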

A full walkthrough of managing periodic tasks is, in my view, beyond the scope of this basic celery configuration and usage note. One thing I did notice while using celery is that the beat scheduler's CPU usage was too high; this needs to be tuned to your actual workload:

# The default is 0, which in practice behaves like a busy loop
# CELERYBEAT_MAX_LOOP_INTERVAL = 0
CELERYBEAT_MAX_LOOP_INTERVAL = 1

In short, just tune it to your own situation.

