# Lime Taskhandler
Lime Taskhandler is a wrapper around Celery.
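In practice this means the standard Celery model applies: tasks are published to a broker queue and executed asynchronously by a worker process. The sketch below is a minimal illustration in plain Celery, not lime-task's own API; the app name, the task, and the connection strings are assumptions.

```python
# Minimal sketch of the plain Celery model that Lime Taskhandler wraps.
# This is NOT lime-task's own API; names and connection strings are assumed.
from celery import Celery

app = Celery(
    "example",
    broker="amqp://guest@localhost//",    # messages go to RabbitMQ
    backend="elasticsearch://localhost",  # results are stored in Elasticsearch
)

@app.task
def add(x, y):
    """A trivial task; a running worker (the taskhandler) executes it."""
    return x + y

# .delay() only publishes a message to the queue and returns immediately;
# the result can later be fetched from the backend via result.get().
result = add.delay(2, 3)
```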
## Run lime-task as a service
The service accepts the same options as a regular Celery worker.
- Loglevel: The default Celery loglevel is `warning`, so it can be very helpful for local development to explicitly lower it to `info`:

  ```shell
  lime-task-handler --loglevel info
  ```
- Namespace: If you have configured a namespace in your `config.yaml` file, as shown below, you need to tell the taskhandler to consume from that queue when you start the service. Otherwise the service will only consume from `lime_task_queue_default`.

  ```yaml
  globals:
    namespace: <NAMESPACE>
  ```

  ```shell
  lime-task-handler --queues lime_task_queue_<NAMESPACE>
  ```
- Scheduled tasks: You need to start the taskhandler with `--beat` in order to run tasks on a schedule (a sketch of the underlying beat mechanism follows at the end of this section):

  ```shell
  lime-task-handler --beat
  ```
All of these options can of course be combined in one command:

```shell
lime-task-handler --loglevel info --queues lime_task_queue_<NAMESPACE> --beat
```
Alternatively, make sure that the Lime Docker container `taskhandler` is running.
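For context, here is a minimal sketch of what running "on beat" means in terms of plain Celery. It does not show lime-task's own way of registering scheduled tasks; the task name and schedule are assumptions.

```python
# Hypothetical illustration of the beat mechanism in plain Celery:
# a beat scheduler publishes tasks at fixed times and a worker executes them.
# lime-task's own way of registering scheduled tasks may differ.
from celery import Celery
from celery.schedules import crontab

app = Celery("example", broker="amqp://guest@localhost//")

@app.task(name="nightly_cleanup")  # explicit, assumed task name
def nightly_cleanup():
    ...

app.conf.beat_schedule = {
    "nightly-cleanup": {
        "task": "nightly_cleanup",
        "schedule": crontab(hour=2, minute=0),  # every night at 02:00
    },
}
# Starting the worker with --beat runs a beat scheduler alongside it,
# so entries like the one above actually get triggered.
```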
## Dependencies
The Taskhandler uses RabbitMQ as a message broker and stores task results in Elasticsearch, so make sure those services are running (Docker service names: `rabbitmq` and `elastic`).
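If you want a quick sanity check that both services are reachable before starting the taskhandler, a small script along these lines works; the hostnames and ports are assumptions for a default local setup.

```python
# Optional sanity check (assumed local defaults): verify that the RabbitMQ
# broker and the Elasticsearch result store are reachable.
from kombu import Connection
from elasticsearch import Elasticsearch

# Assumed default ports for the local docker services.
Connection("amqp://guest@localhost//").ensure_connection(max_retries=1)
print("RabbitMQ is reachable")

es = Elasticsearch("http://localhost:9200")
print("Elasticsearch is reachable" if es.ping() else "Elasticsearch is NOT reachable")
```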
## Configuration
The service can be configured in `config.yaml` as follows:
```yaml
# config.yaml
tasks:
  broker_connection_string: amqp://
  backend_connection_string: elasticsearch://localhost
  elastic_connection_string: elasticsearch://localhost
  task_time_limit: None
  task_soft_time_limit: None
  task_queue_name: lime_default_task_queue
  task_exchange_name: lime_default_task_exchange
  task_routing_key_name: lime_default_routing_key
  enable_scheduled_tasks: True
  enable_system_scheduled_tasks: False
features:
  importer_with_taskhandler: False
importer:
  connection_string: amqp://guest@localhost//
  use_sql_server: False
  jobs_days_visible: 30
  sql_server_host: localhost
  sql_server_database: lime_crm_import
  sql_server_username:
  sql_server_password:
  use_s3: False
  s3_bucket:
  s3_region:
  s3_aws_access_key_id: None
  s3_aws_secret_access_key: None
```
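Most of the `tasks:` options map onto familiar Celery settings. The sketch below shows an assumed correspondence in plain Celery terms; the actual wiring is done inside lime-task and may differ.

```python
# Rough, assumed mapping from the config.yaml options above to standard
# Celery settings. The actual wiring lives inside lime-task; this sketch
# only shows which Celery concepts the options correspond to.
from celery import Celery

app = Celery(
    "example",
    broker="amqp://",                     # broker_connection_string
    backend="elasticsearch://localhost",  # backend_connection_string
)
app.conf.update(
    task_time_limit=None,        # task_time_limit (hard limit, kills the task)
    task_soft_time_limit=None,   # task_soft_time_limit (raises SoftTimeLimitExceeded)
    task_default_queue="lime_default_task_queue",         # task_queue_name
    task_default_exchange="lime_default_task_exchange",   # task_exchange_name
    task_default_routing_key="lime_default_routing_key",  # task_routing_key_name
)
```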