I have a suite of multiple sensors that are sending data to a laptop over Ethernet. This data is then put into a live Bokeh table. What I have noticed in my testing is that each new client that connects gets a different instance of the Bokeh table. I am worried that multiple instances will each pull data from the socket buffer on their own, leading to spotty data collection per user.
It sounds like I can use bokeh.client for this and just pretend multiple instances are one user, but I heard that was not recommended. I have also been looking at Flask, and that has led to multiple errors. The main reason I am struggling is that my Bokeh code takes user input (controlling which tables the user wants to see) and also pushes socket inputs to the live Bokeh table.
I’m a little unclear on the architecture/requirements. It sounds like you have some stream of external telemetry coming in from different sources, and you want every app session that starts up to present the same, complete view of that data, is that correct? If so, my first suggestion would be to not pull the data from inside the app code itself, since that means every session gets/updates the data independently. Bokeh has various lifecycle hooks, e.g. an “on server start” hook that you can use to set up or update central data structures (say, via a thread) that all sessions can then refer to. For an example, see the spectrogram example and how it handles audio data.
Yes. I would like one unified interactive page for all users. To summarize: I have data coming into a laptop over a serial port, and the Bokeh program pulls from the serial buffer. If each session pulls from the serial buffer, all of them will miss data. This is the main concern. This laptop will eventually be moved to a remote place, and the data stream will be monitored through this Bokeh site (it is meant to simplify viewing the live data for non-programmers). Thank you for your response, I will take a look and let you know how it works out.
I read through a bit of the example, and I see some similarities, but I think it misses my main issue. I have a serial buffer with data packets. As soon as one app session takes the data from a packet, it can no longer be seen. So if two users joined at once, each one would only get about half of the data points.
The Bokeh site is designed to display a varying number of sensors, each measuring different things. It was built this way because the set of sensors actually running can change, as some of them break. I made it so that, through Bokeh dropdown boxes and the like, the user can select which tables they want to generate.
My current idea is to have an on_load_up that duplicates a baseline queue for each specific app session. And to display a unified server uptime on each app session, I will most likely just create a single file that holds that data. Thank you for the help, I will give this a try.
Right, you would need to assemble and maintain the “complete” set of data, in whatever way is most convenient, and then the app sessions would interact with that, not with the serial buffer directly. I had in mind a data structure assembled in a shared module (like audio.py in the example), but a file could work too.
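A minimal sketch of such a shared module (all names here are hypothetical): one reader thread calls `ingest` for every packet it drains, and each session fetches only the rows it has not yet rendered via `snapshot`:

```python
# telemetry_store.py -- sketch of the shared-module idea (an audio.py
# analogue); names are made up. Drain the buffer once, fan out to sessions.
import threading

_lock = threading.Lock()
_history = []   # complete record, appended to only by the reader thread

def ingest(packet):
    """Called by the single reader thread for each packet it pulls."""
    with _lock:
        _history.append(packet)

def snapshot(since=0):
    """Called by each Bokeh session (e.g. from a periodic callback) to get
    the rows it has not seen yet, without touching the serial port."""
    with _lock:
        return _history[since:], len(_history)
```

A session would then call `snapshot` from a periodic callback, stream the new rows into its own ColumnDataSource, and remember the returned count for its next call.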
Thanks for the help. I just got it figured out. I had never used a data structure like this before, or had to import my own modules. The way I ended up doing it is just having a Python file named config, with variables that both main and app_hook reference. I imagine that it is not the prettiest solution, but it works.