I have a few Bokeh applications that I'm embedding throughout my webpage using autoload_server. These applications use data from a database, but for performance's sake I would like them to read from a dataframe instead. The issue is that this dataframe is very large (~20 GB), so I don't want one copy per Bokeh app, just one per server. My current solution is as follows:
```python
# server_lifecycle.py
def on_server_loaded(server_context):
    # this only runs in one app's server_lifecycle
    global dbAsDf
    dbAsDf = ...  # read in database

def on_session_created(session_context):
    # this runs on all apps' server_lifecycle
    ...
```
The dataframe is then accessed by:
tornado_context is a small change I made in tornado.py and application_context.py so that both apps can see the server running them.
This works, but feels rather hackish, and I’d rather not change the bokeh source code if possible.
Is there a better way to share data between appA and appB when they are run as `bokeh serve appA appB`?
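For context on what I've tried: since `bokeh serve appA appB` runs both apps in the same Python process by default, one approach that avoids patching Bokeh internals would be a plain module that both apps import, holding a process-level cache. Below is a minimal sketch of that idea; `shared_data`, `get_shared_df`, and the loader function are hypothetical names, and the actual database read is elided:

```python
# shared_data.py (hypothetical module) -- import this from both appA and appB.
# Because both apps live in the same process, the module (and its cache) is
# created once and shared; the dataframe is loaded on first access only.

_cache = {}

def get_shared_df(loader):
    """Return the shared dataframe, loading it once per server process."""
    if "df" not in _cache:
        _cache["df"] = loader()  # e.g. the database read done in on_server_loaded
    return _cache["df"]

# Demo with a stand-in loader that counts how many times it runs:
calls = []

def fake_loader():
    calls.append(1)
    return {"rows": [1, 2, 3]}  # stand-in for the real ~20 GB dataframe

a = get_shared_df(fake_loader)  # triggers the load
b = get_shared_df(fake_loader)  # reuses the cached object
```

The caveat is that this sharing breaks if the server is started with multiple worker processes (e.g. `bokeh serve --num-procs`), since each process would then load its own copy.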