Making Bokeh applications available on AWS EC2 with Docker

Hi there, we are working on a Bokeh application (similar to this one: https://demo.bokeh.org/crossfilter), which runs fine locally via the Bokeh server.

However, I want to make it available to colleagues via a Docker container on AWS EC2. Ideally, the end user would send a GET request and the application would open in their browser. What would be the best way to implement this?


I can’t speak directly to using Docker on bare EC2, but I can describe how the Docker setup for https://demo.bokeh.org works, and hopefully much of that knowledge is transferable. Here are some comments:

  • The demo site has its own GitHub repository: https://github.com/bokeh/demo.bokeh.org that anyone can refer to

  • The content of the repo is mostly just a Dockerfile that uses Alpine and Miniconda to set dependencies up, then runs bokeh serve at the end.

  • The site can be run locally:

    docker build --tag demo.bokeh.org .
    
    docker run --rm -p 5006:5006 -it demo.bokeh.org
    

    Then navigate to http://localhost:5006

  • The actual production deployment of the site is on AWS Elastic Beanstalk, according to the instructions at Deploying Elastic Beanstalk Applications from Docker Containers

  • When deploying on AWS, the load balancer protocol needs to be set to TCP to allow websocket connections; similar rules are needed in the security group configuration as well
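
For illustration, a Dockerfile along those lines might look something like this (a sketch only; the base image, dependency list, and app path are assumptions, not the demo repo's actual contents):

```dockerfile
# Sketch of a Bokeh-serving Dockerfile; image, paths, and packages are placeholders
FROM continuumio/miniconda3

# Install the app's dependencies with conda
RUN conda install --yes bokeh pandas

# Copy the Bokeh app directory into the image (hypothetical path)
COPY ./myapp /app/myapp

EXPOSE 5006

# Serve the app; --allow-websocket-origin is needed so that browsers
# connecting from hosts other than localhost can open the websocket
CMD ["bokeh", "serve", "/app/myapp", "--port", "5006", "--allow-websocket-origin", "*"]
```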

I’m not sure how things would be different running on bare EC2 instances. My guess is that you could just explicitly build/run the container as described above, and make sure the network is configured to allow users to reach it (e.g. forward ports, allow websocket traffic, terminate SSL if needed, etc.). I’m not an expert at those things, which is why I used Elastic Beanstalk. It makes deployment basically as simple as uploading the Dockerfile. If that is an option for your use case, I would recommend looking into it.
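
On bare EC2, that might look roughly like the following (a sketch with placeholder values; it assumes the AWS CLI is configured and Docker is installed on the instance):

```shell
# Allow inbound TCP traffic on the Bokeh port
# (sg-0123456789abcdef0 is a placeholder security group ID)
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 5006 --cidr 0.0.0.0/0

# On the instance: build the image and run it in the background
docker build --tag myapp .
docker run --rm -d -p 5006:5006 myapp
```

Note that bokeh serve inside the container also needs to accept connections arriving via the instance's public hostname, e.g. by passing an appropriate --allow-websocket-origin.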

Thanks for your answer. I’d also be willing to deploy our app on AWS Elastic Beanstalk, since just uploading the Dockerfile seems easier. One question to be fully sure: if we do that, does all the data used for the plots stay in our own AWS environment?

I’m not quite sure what answer you are looking for. The Bokeh server app is a Python program, so it will do whatever you tell it to do. If you want it to load and store data only in other AWS services (e.g. S3 or DocumentDB), then you can certainly write an app that does that. If you want to load or store data outside AWS, there’s nothing to prevent you from writing an app that does that, either.
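
To make that concrete, here is a minimal sketch of a data-loading helper such an app could use (the S3 branch is an assumption: it requires boto3 and appropriate bucket permissions, and the bucket/key naming is hypothetical):

```python
import csv
from pathlib import Path

def load_records(source):
    """Load tabular data for the plots as a list of dicts.

    `source` may be an S3 URI ("s3://bucket/key.csv") or a local path.
    The S3 branch is a sketch: it assumes boto3 is installed and that
    the instance role has read access to the bucket.
    """
    if source.startswith("s3://"):
        import boto3  # only needed for the S3 case
        bucket, _, key = source[len("s3://"):].partition("/")
        body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
        text = body.read().decode("utf-8")
        return list(csv.DictReader(text.splitlines()))
    # Local file: read and parse with the stdlib csv module
    return list(csv.DictReader(Path(source).read_text().splitlines()))
```

Pointing `source` at an object in your own S3 bucket keeps the data inside your AWS environment; pointing it elsewhere does not, and Bokeh itself imposes no restriction either way.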

I’ve just deployed a Bokeh app via AWS Elastic Container Service (ECS). After the Docker container is pushed to ECR, you can set up a task definition with ECS. I recommend creating a service based on your image. You can tie the service to a target group so that your load balancer can find the host port and do all the port-mapping magic for you. I even went so far as to create a CloudFormation template to deploy all the parts.
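
For context, the task definition is a JSON document; a minimal sketch (account ID, region, names, and sizes are placeholders) might use hostPort 0 so ECS assigns a dynamic host port that the target group can discover:

```json
{
  "family": "bokeh-app",
  "containerDefinitions": [
    {
      "name": "bokeh-app",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/bokeh-app:latest",
      "memory": 512,
      "essential": true,
      "portMappings": [
        {"containerPort": 5006, "hostPort": 0, "protocol": "tcp"}
      ]
    }
  ]
}
```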

We have even used the load balancer to provide authentication to the app. If you have more specific questions, I’d be happy to elaborate.

If you have the bandwidth to put together a little mini-guide (e.g. with a few AWS console screenshots to help people have context along the steps), that would be an amazing resource.

I second @Bryan - if you have some hints on what needs to be done in general (any setup changes to the docker container specific to cloud infrastructure vs. local PC for example) I would be interested as well, since we are using Azure App Service instead of Amazon.
