What is the proper way to deploy the same application/code to Elastic Beanstalk server and worker environments?
I have a web service (Flask + MySQL + Celery), and I'm trying to figure out the proper way to deploy it on Elastic Beanstalk with separate web server and worker environments/tiers. I have it working by launching the worker (using this answer) on the same instance as the web server, but I want the worker(s) running in a separately auto-scaled environment. Note that the Celery tasks rely on the main server code (e.g. for making queries), so they cannot be separated. It's one app with two entry points.
The only way I can think of is to have a code/config script examine an environment variable (e.g. ENV_TYPE = "worker" or "server") to determine whether to launch the standard Flask app or a Celery worker. The other caveat here is that I would have to "eb deploy" the code to two separate environments (server and worker), when I'd like/expect them to be deployed simultaneously since both use the same code base.
Apologies if this has been asked before; I've looked around a lot and couldn't find anything, which I find surprising since this seems like a common use case.
Edit: I found this answer, which addresses my concern about deploying twice (I guess it's technically deploying once and then updating two environments, which is scriptable). The question regarding how to bootstrap the application into server vs. worker mode still stands.
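Scripting the two updates is straightforward. Here is a minimal sketch that shells out to `eb deploy` once per environment; the environment names `myapp-web` and `myapp-worker` are placeholders for your own, and the `runner` parameter exists only so the loop can be exercised without the EB CLI installed.

```python
import subprocess

# Hypothetical environment names; replace with your own web and worker tiers.
ENVIRONMENTS = ["myapp-web", "myapp-worker"]

def deploy_all(environments, runner=subprocess.check_call):
    """Run `eb deploy` once per environment so both tiers get the same code."""
    for env in environments:
        runner(["eb", "deploy", env])

# To actually deploy, call: deploy_all(ENVIRONMENTS)
```

Since both environments receive the same application version, the only per-environment difference left is the bootstrap flag discussed below in the question.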
Regarding bootstrapping: if you set the environment variable on the Elastic Beanstalk environment itself (docs here), you never have to touch it again when you re-deploy the code with your script. You only need to add the environment variable when you create a new environment.
Thus when starting up, you can check the environment variable in Python and bootstrap from there, loading whatever you need.
My preference, instead of creating an enum specifying "worker" or "server", would be a boolean environment variable like ENV_WORKER=1 or something. It removes the possibility of typing mistakes and is easier to read.
```python
import os

if os.environ.get('ENV_WORKER') is not None:
    pass  # bootstrap worker stuff here
else:
    pass  # server-specific stuff here
```
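Fleshed out slightly, the startup check could look like the sketch below. The `myapp` package, `celery_app`, and `flask_app` names are placeholders for your own application; `Celery.worker_main()` and `Flask.run()` are the respective in-process entry points.

```python
import os

def is_worker(environ=os.environ):
    """Return True when the ENV_WORKER flag is set to a truthy value."""
    return environ.get("ENV_WORKER", "") not in ("", "0")

def main():
    if is_worker():
        # Worker environment: start Celery against the shared code base.
        from myapp.tasks import celery_app  # placeholder module
        celery_app.worker_main()
    else:
        # Web server environment: run the Flask app.
        from myapp import flask_app  # placeholder module
        flask_app.run()
```

Treating any value other than an empty string or "0" as "worker" keeps the check forgiving of how the flag gets set in the EB console.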