This is more of a Docker-related question than anything else, but it might be useful for someone else.
I’m working on a PoC with Torizon that uses several containers. I’m using Flask for a simple website in one container, and a Python program plus InfluxDB to collect sensor data in another (I’ll probably switch to Grafana in the end, but since I want to add some bidirectional logic between the website and the Python program, I don’t think Grafana would work for that).
Now I would like to have another Python program / container that controls the GPIO using gpiod.
# For Apalis iMX8 use IMAGE_ARCH=arm64v8
RUN apt-get update \
&& apt-get install -y --no-install-recommends python3 python3-pip procps \
&& apt-get clean && apt-get autoremove && rm -rf /var/lib/apt/lists/*
RUN python3 -m pip install -U --user pip gpiod
COPY gpio-python.py /usr/bin
gpio-python.py simply uses gpiod to interact with the GPIOs. Nothing too fancy, but it works.
# docker build -t gpio-python .
# docker run --rm -it --device /dev/gpiochip0 gpio-python /bin/bash
## python3 /usr/bin/gpio-python.py
I managed to make the script work on its own, by launching the container and then running the script inside it. A simple while True loop with some blinking logic, just to test that this works on gpiochip0 (in the PoC one would pass the pin).
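For reference, a blink loop like the one described might look roughly like this. This is a sketch, not the actual gpio-python.py: it assumes the v1-style gpiod Python API (Chip/get_line/request; gpiod 2.x uses a different interface), and the chip name and line offset are placeholders.

```python
import time


def toggle(state: int) -> int:
    """Return the next output level for a blinking line (0 -> 1 -> 0 ...)."""
    return 1 - state


def blink(chip_name="gpiochip0", offset=0, period_s=0.5, cycles=None):
    """Blink a GPIO line. Runs forever when cycles is None, like a
    simple `while True` test loop."""
    # Imported here so the toggle logic above stays usable without hardware.
    import gpiod

    chip = gpiod.Chip(chip_name)
    line = chip.get_line(offset)
    line.request(consumer="blink", type=gpiod.LINE_REQ_DIR_OUT)
    state = 0
    n = 0
    try:
        while cycles is None or n < cycles:
            state = toggle(state)
            line.set_value(state)
            time.sleep(period_s)
            n += 1
    finally:
        line.release()
```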
Now, I want to have a button on my Flask website that makes a call to this Python script, which lives in another container… but I have no idea how to route this or make this call.
Since this is a PoC, in the worst-case scenario the gpiod program and Flask could live in the same container, but I wanted to check whether there is some other way that doesn’t involve network forwarding or something of the sort (like using InfluxDB for this as well and having the Python script continuously poll it… which doesn’t sound ideal).
You have many solutions for this problem:

- Convert your GPIO app into a REST server and communicate with it via a REST API. This will require more effort, but would probably be the best approach in a real-world application (where the component will do more than GPIO toggling), keeping the different components separated and connected with clearly defined interfaces. This will allow you to update the different parts independently, as long as the “contract” defined by the REST API is not broken.
- Share the Docker socket /var/run/docker.sock inside the container and use the Docker API for Python to run containers/apps inside containers. You need to ensure that the user running the Flask app has the rights to access the Docker socket (it must be root or part of the docker group, with GID=990). This is what tools like Portainer do.
- Share the Docker socket, install the Docker CLI in the container, and run it from your app. This may be simpler for testing/demos, but using the API provides more control over errors and parameters. The main advantage is that you could easily test from the command line. Of course you can mix this with the previous option, using the API from your app while also having a command line ready for quick tests.
- Enable the Docker TCP interface and use it to access Docker from running containers. This is technically doable, but I would not suggest it because the interface is insecure by default and, even if you enable HTTPS and certificates, it will be exposed to the outside and may become a security concern. Sharing the local socket limits accessibility to the local machine and allows you to perform the same operations.
- As you suggested, move both apps into the same container, maybe converting the GPIO one into a Python module. This may be more efficient from a resource-usage point of view, but it forces a tight integration that may not be too bad for a simple demo but may become an issue on production systems where you have multiple components interacting (an “engine” controlling the HW, like your GPIO sample, a web frontend, and maybe also a local GUI) and you may not want to update all of them at the same time or have to manage them as a single unit.
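To make the first option concrete, here is a minimal sketch of a control endpoint for the GPIO container. It deliberately uses only the Python standard library so it is self-contained; in the actual PoC you would likely write it with Flask instead, as discussed below. The /blink route and the on_blink callback are illustrative names, not part of any existing code.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def make_handler(on_blink):
    """Build a request handler that calls on_blink() on POST /blink.
    on_blink would wrap the gpiod toggling logic in the real app."""
    class Handler(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path == "/blink":
                result = on_blink()
                body = json.dumps({"ok": True, "state": result}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

        def log_message(self, *args):
            # Keep the demo quiet; remove to get default request logging.
            pass

    return Handler


def serve(on_blink, port=8080):
    """Block forever serving the control API on the given port."""
    HTTPServer(("0.0.0.0", port), make_handler(on_blink)).serve_forever()
```

The Flask frontend would then simply POST to this endpoint (e.g. with `requests` or `urllib`) when the button is pressed, keeping the two containers connected only through the HTTP “contract”.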
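And a hedged sketch of the second option, using the docker SDK for Python (pip install docker). The image name, script path, and device are taken from the question; the function name is hypothetical. It assumes /var/run/docker.sock is bind-mounted into the Flask container and the user has access to it.

```python
def trigger_gpio_container(image="gpio-python",
                           script="/usr/bin/gpio-python.py",
                           device="/dev/gpiochip0"):
    """Start the GPIO container via the Docker socket and return it."""
    # Imported lazily so the module loads even without the SDK installed.
    import docker

    # from_env() talks to /var/run/docker.sock by default; the caller
    # must be root or in the docker group for this to work.
    client = docker.from_env()
    return client.containers.run(
        image,
        command=["python3", script],
        devices=[f"{device}:{device}:rwm"],  # pass the GPIO chip through
        detach=True,
        auto_remove=True,  # clean up the container when the script exits
    )
```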
Thanks @valter.tx! For the RESTful API, should I use Flask again in my GPIO app, or is there a better approach?
It will, of course, depend on the language.
Flask is a very good solution for simple APIs in Python.
If you want to define an API that can be easily accessed from different languages my suggestion is to use Swagger/OpenAPI. This will allow you to define your API using a yaml file and then generate servers/clients for the most popular development languages/frameworks.
In Python you may use connexion to load the Swagger/OpenAPI definition; it implements the “glue” logic between the HTTP server and your code in a very easy way.
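To illustrate, a minimal OpenAPI definition for the blink use case could look like this. The path, operationId, and the gpio_api module are hypothetical names for the PoC, not an existing file.

```yaml
openapi: "3.0.0"
info:
  title: GPIO control API
  version: "1.0"
paths:
  /blink:
    post:
      # connexion dispatches this operation to gpio_api.blink()
      operationId: gpio_api.blink
      responses:
        "200":
          description: Blink toggled
```

Connexion would load this with something like app.add_api("api.yaml") and route POST /blink to the gpio_api.blink function, so your code only has to implement the handlers.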