Passing a dataframe between running applications

I have a few Bokeh applications that I'm embedding throughout my webpage using autoload_server. These applications use data from a database, but for performance's sake I would like them to read from a dataframe instead. The issue is that this dataframe is very large (~20 GB), and I don't want one copy per Bokeh app, just one per server. My current solution is as follows:
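For context, each app is embedded roughly like this (a minimal sketch; the exact arguments depend on the Bokeh version, and the app path and server URL here are just placeholders):

    from bokeh.embed import autoload_server

    # returns a <script> tag that gets dropped into the page template
    script = autoload_server(model=None, app_path="/appA",
                             url="http://localhost:5006")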

    # this only runs in one app's server_lifecycle
    def on_server_loaded(server_context):
        dbAsDf = ...  # read the database into a dataframe here
        setattr(server_context.application_context._tornado_context, 'df', dbAsDf)

    # this runs in all apps' server_lifecycle
    def on_session_created(session_context):
        setattr(session_context._document, 'server_context',
                session_context.server_context.application_context._tornado_context)

The dataframe is then accessed inside each app with:

    curdoc().server_context.df
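That is, in each app's main.py it looks something like this (sketch):

    from bokeh.io import curdoc

    # the shared dataframe attached to the document in on_session_created
    df = curdoc().server_context.df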

_tornado_context is a small change I made in Bokeh's tornado.py and application_context.py so that both apps can see the server running them.

This works, but feels rather hackish, and I’d rather not change the bokeh source code if possible.
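To make the constraint concrete, the kind of non-invasive setup I'm hoping is possible would be a plain shared module that both apps import, since "bokeh serve appA appB" runs both apps in one process (a rough, untested sketch; module and function names are hypothetical):

    # shared_data.py -- importable by both apps (hypothetical names).
    # This relies on "bokeh serve appA appB" running both apps in a single
    # process; with --num-procs > 1 each worker would load its own copy.
    import pandas as pd

    _cache = {}

    def get_df():
        # load lazily, exactly once per process
        if 'df' not in _cache:
            _cache['df'] = pd.DataFrame()  # stand-in for the real ~20 GB database read
        return _cache['df']

    # appA/server_lifecycle.py -- warm the cache at startup
    import shared_data

    def on_server_loaded(server_context):
        shared_data.get_df()

Each app's main.py would then call shared_data.get_df() instead of going through curdoc().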

Is there a better way to share data between appA and appB when they are run as "bokeh serve appA appB"?

Thank you