- in the project in which you want to use it, run:

  ```
  pipenv install -e git+https://github.com/astrosat/django-astrosat-tasks.git@master#egg=django-astrosat-tasks
  ```
- add "astrosat_tasks" to your INSTALLED_APPS setting like this:

  ```python
  INSTALLED_APPS = [
      ...
      'astrosat_tasks',
      ...
  ]
  ```
- add the required settings; look at "astrosat_tasks/conf/settings.py" to see what is needed
- include the astrosat_tasks URLconf in your project "urls.py" in the usual way:

  ```python
  api_urlpatterns += astrosat_tasks_api_urlpatterns

  urlpatterns = [
      ...
      path("astrosat_tasks/", include(astrosat_tasks_urlpatterns)),
      ...
  ]
  ```
- run

  ```
  python manage.py migrate
  ```

  to create the astrosat_tasks models.
- add whatever tasks you want in "<app>/tasks.py", using the same syntax as "example/tasks.py".
- profit!
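The "tasks.py" step above can be sketched with a plain Celery task (a minimal, hypothetical sketch; "example/tasks.py" shows the conventions this project actually uses):

```python
# <app>/tasks.py -- hypothetical sketch; see "example/tasks.py" for
# the project's actual conventions
from celery import shared_task


@shared_task
def add(x, y):
    # executed asynchronously by a worker once a broker is running
    return x + y
```

Calling `add.delay(2, 3)` then queues the task on the broker instead of running it inline.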
django-astrosat-tasks comes with an example project to help with development and testing. Because it requires a task broker (RabbitMQ), it runs in Docker.
```
git clone <repo> django-astrosat-tasks
cd django-astrosat-tasks/example
docker-compose up
```

This will start the "db", "broker", and "server" services; you can also run them separately if desired.

- go to "http://localhost:8000" and enjoy
- you can monitor the task queue at "http://localhost:15672"
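Running the services separately, as mentioned above, might look like this (service names taken from "docker-compose.yml"):

```shell
# start the database and the rabbitmq broker in the background
docker-compose up -d db broker

# then run the django server in the foreground
docker-compose up server
```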
Note that the reference to django-astrosat-tasks in "Pipfile" was created with `pipenv install -e .`. This looks for the "setup.py" file in the current directory. If the distribution changes, just run `pipenv update django-astrosat-tasks`; otherwise, code changes should be picked up automatically because of the "-e" flag.
Note also that, in order for runserver to pick up live changes to the code, the Pipfile, Dockerfile, etc. are at the ROOTDIR rather than in the example app, and both "example" and "astrosat_tasks" are mounted as volumes in "docker-compose.yml".
In most projects, you will want to run celery as a service, using something like this:

```
celery worker --app=astrosat_tasks.celery:app --beat --scheduler django_celery_beat.schedulers.DatabaseScheduler --workdir=$APP_HOME/server --loglevel=INFO -n worker.%%h
```
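Because the command above uses django-celery-beat's DatabaseScheduler, periodic tasks live in the database and can be registered through its models. A sketch, assuming a hypothetical task at "example.tasks.add", to be run inside a Django context (e.g. a shell or data migration):

```python
# a sketch: register a task to run every 10 seconds with
# django-celery-beat's DatabaseScheduler
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# get_or_create keeps this idempotent across restarts
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

PeriodicTask.objects.get_or_create(
    interval=schedule,
    name="add every 10 seconds",   # any unique human-readable name
    task="example.tasks.add",      # hypothetical dotted path to the task
)
```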