Application¶
The Celery class is the main entry point for creating a celery-asyncio application.
Celery¶
Celery ¶
Celery application.
Arguments:
main (str): Name of the main module if running as __main__.
This is used as the prefix for auto-generated task names.
Keyword Arguments:
broker (str): URL of the default broker used.
backend (Union[str, Type[celery.backends.base.Backend]]): The result store
backend class, or the name of the backend class to use.
Default is the value of the :setting:`result_backend` setting.
autofinalize (bool): If set to False a :exc:`RuntimeError`
will be raised if the task registry or tasks are used before
the app is finalized.
set_as_current (bool): Make this the global current app.
include (List[str]): List of modules every worker should import.
amqp (Union[str, Type[AMQP]]): AMQP object or class name.
events (Union[str, Type[celery.app.events.Events]]): Events object or
class name.
log (Union[str, Type[Logging]]): Log object or class name.
control (Union[str, Type[celery.app.control.Control]]): Control object
or class name.
tasks (Union[str, Type[TaskRegistry]]): A task registry, or the name of
a registry class.
fixups (List[str]): List of fix-up plug-ins (e.g., see
:mod:`celery.fixups.django`).
config_source (Union[str, class]): Take configuration from a class,
or object. Attributes may include any settings described in
the documentation.
task_cls (Union[str, Type[celery.app.task.Task]]): Base task class to
use. See :ref:`this section <custom-task-cls-app-wide>` for usage.
producer_pool property ¶
Legacy producer pool (not used in celery-asyncio).
Raises NotImplementedError as celery-asyncio uses async producers directly.
__init__ ¶
__init__(
main=None,
loader=None,
backend=None,
amqp=None,
events=None,
log=None,
control=None,
set_as_current=True,
tasks=None,
broker=None,
include=None,
changes=None,
config_source=None,
fixups=None,
task_cls=None,
autofinalize=True,
namespace=None,
strict_typing=True,
**kwargs,
)
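The `main` argument above becomes the prefix for auto-generated task names when tasks are defined in a module run as `__main__`. A minimal sketch of that naming rule (a simplified stand-in for illustration, not Celery's actual `gen_task_name` implementation):

```python
def gen_task_name(main, func_name, module):
    """Simplified sketch: tasks defined in a script run as __main__
    are named after the app's main module instead of '__main__'."""
    if module == "__main__" and main:
        module = main
    return f"{module}.{func_name}"

# A task defined in a script run directly gets the app's main name:
print(gen_task_name("proj", "refresh_feed", "__main__"))    # proj.refresh_feed
# A task defined in an importable module keeps its module path:
print(gen_task_name("proj", "refresh_feed", "feeds.tasks"))  # feeds.tasks.refresh_feed
```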
task ¶
Decorator to create a task class out of any callable.
See :ref:`Task options <task-options>` for a list of the
arguments that can be passed to this decorator.
Examples:

.. code-block:: python

    @app.task
    def refresh_feed(url):
        store_feed(feedparser.parse(url))

with setting extra options:

.. code-block:: python

    @app.task(exchange='feeds')
    def refresh_feed(url):
        return store_feed(feedparser.parse(url))
Note:
App Binding: For custom apps the task decorator will return a proxy
object, so that the act of creating the task is not performed until
the task is used or the task registry is accessed.

If you're depending on binding to be deferred, then you must
not access any attributes on the returned object until the
application is fully set up (finalized).
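The deferred-binding behavior described in the note can be illustrated with a toy proxy (a hypothetical stand-in, not Celery's real `celery.local.Proxy`): the wrapped object is not created until an attribute is first accessed.

```python
class LazyProxy:
    """Toy stand-in for a deferred task proxy: the factory is not
    called until an attribute is first accessed."""

    def __init__(self, factory):
        self._factory = factory
        self._obj = None

    def __getattr__(self, name):
        # Only reached for attributes not on the proxy itself.
        if self._obj is None:
            self._obj = self._factory()
        return getattr(self._obj, name)

created = []

def make_task():
    created.append(True)
    class Task:
        name = "tasks.add"
    return Task()

task = LazyProxy(make_task)
print(created)      # [] -- nothing created yet
print(task.name)    # tasks.add -- first access triggers creation
print(created)      # [True]
```

Accessing `task.name` too early in a real app would force binding before the app is finalized, which is exactly what the note warns against.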
send_task ¶
send_task(
name,
args=None,
kwargs=None,
countdown=None,
eta=None,
task_id=None,
producer=None,
connection=None,
router=None,
result_cls=None,
expires=None,
publisher=None,
link=None,
link_error=None,
add_to_parent=True,
group_id=None,
group_index=None,
retries=0,
chord=None,
reply_to=None,
time_limit=None,
soft_time_limit=None,
root_id=None,
parent_id=None,
route_name=None,
shadow=None,
chain=None,
task_type=None,
replaced_task_nesting=0,
**options,
)
Send task by name.
Supports the same arguments as :meth:`@-Task.apply_async`.
Arguments:
name (str): Name of task to call (e.g., "tasks.add").
result_cls (AsyncResult): Specify custom result class.
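`send_task` lets a caller enqueue a task knowing only its name, which is useful when the task is defined in another codebase. The by-name lookup can be sketched roughly as follows (a toy in-process registry standing in for broker publishing, not celery-asyncio's actual internals):

```python
registry = {}

def register(name):
    """Toy decorator that records a callable under a task name."""
    def deco(func):
        registry[name] = func
        return func
    return deco

@register("tasks.add")
def add(x, y):
    return x + y

def send_task(name, args=(), kwargs=None):
    """Toy dispatch by name; a real app would serialize the call
    and publish it to the broker instead of invoking directly."""
    func = registry[name]
    return func(*args, **(kwargs or {}))

result = send_task("tasks.add", args=(2, 3))
print(result)  # 5
```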
config_from_object ¶
Read configuration from object.
Object is either an actual object or the name of a module to import.
Example:

>>> celery.config_from_object('myapp.celeryconfig')

>>> from myapp import celeryconfig
>>> celery.config_from_object(celeryconfig)
Arguments:
silent (bool): If true then import errors will be ignored.
force (bool): Force reading configuration immediately. By default
the configuration will be read only when required.
config_from_envvar ¶
Read configuration from environment variable.
The value of the environment variable must be the name of a module to import.
Example:

>>> os.environ['CELERY_CONFIG_MODULE'] = 'myapp.celeryconfig'
>>> celery.config_from_envvar('CELERY_CONFIG_MODULE')
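Under the hood this amounts to resolving the environment variable to a module name and importing it. A rough standard-library sketch (the module name `json` is used purely for illustration; a real config module would hold Celery settings):

```python
import importlib
import os

def config_from_envvar(var_name):
    """Sketch: look up an env var and import the module it names."""
    module_name = os.environ[var_name]
    return importlib.import_module(module_name)

os.environ["CELERY_CONFIG_MODULE"] = "json"  # any importable module works here
config = config_from_envvar("CELERY_CONFIG_MODULE")
print(config.__name__)  # json
```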
autodiscover_tasks ¶
Auto-discover task modules.
Searches a list of packages for a "tasks.py" module (or use the related_name argument).
If the name is empty, this will be delegated to fix-ups (e.g., Django).
For example if you have a directory layout like this:
.. code-block:: text

    foo/__init__.py
        tasks.py
        models.py

    bar/__init__.py
        tasks.py
        models.py

    baz/__init__.py
        models.py
Then calling app.autodiscover_tasks(['foo', 'bar', 'baz']) will
result in the modules foo.tasks and bar.tasks being imported.
Arguments:
packages (List[str]): List of packages to search.
This argument may also be a callable, in which case the
value returned is used (for lazy evaluation).
related_name (Optional[str]): The name of the module to find. Defaults
to "tasks": meaning "look for 'module.tasks' for every
module in packages.". If None will only try to import
the package, i.e. "look for 'module'".
force (bool): By default this call is lazy so that the actual
auto-discovery won't happen until an application imports
the default modules. Forcing will cause the auto-discovery
to happen immediately.
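The discovery step itself is essentially "for each package, try importing `<package>.<related_name>`". A simplified sketch, using stdlib packages purely to demonstrate the lookup rule (the real method also defers work until needed and delegates to fix-ups):

```python
import importlib

def autodiscover_tasks(packages, related_name="tasks"):
    """Sketch: import `<pkg>.<related_name>` for each package,
    silently skipping packages that have no such module."""
    found = []
    for pkg in packages:
        name = pkg if related_name is None else f"{pkg}.{related_name}"
        try:
            importlib.import_module(name)
        except ImportError:
            continue
        found.append(name)
    return found

# stdlib stand-ins: email.parser exists, json.parser does not
print(autodiscover_tasks(["email", "json"], related_name="parser"))
# ['email.parser']
```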
connection_for_write ¶
Establish connection used for producing.
See Also:
:meth:`connection` for supported arguments.
select_queues ¶
Select subset of queues.
Arguments:
queues (Sequence[str]): A list of queue names to keep.
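Conceptually, selecting queues filters the app's configured queue mapping down to the named subset, as in this toy sketch (not the real implementation, which operates on the amqp queues object):

```python
def select_queues(queues, keep):
    """Sketch: keep only the named entries from a name -> options mapping."""
    return {name: opts for name, opts in queues.items() if name in keep}

all_queues = {"celery": {}, "feeds": {}, "images": {}}
selected = select_queues(all_queues, ["feeds"])
print(sorted(selected))  # ['feeds']
```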
shared_task¶
shared_task ¶
Create shared task (decorator).
This can be used by library authors to create tasks that'll work for any app environment.
Returns: :class:`~celery.local.Proxy`: A proxy that always takes the task from the current app's task registry.
Example:
>>> from celery import Celery, shared_task
>>> @shared_task
... def add(x, y):
...     return x + y
...
>>> app1 = Celery(broker='amqp://')
>>> add.app is app1
True
>>> app2 = Celery(broker='redis://')
>>> add.app is app2
True