In the “Virtual Sensors for Data Processing” lab, you developed Python scripts to simulate virtual sensors for a Smart Mailbox project. These virtual sensors processed data from a PIR motion sensor and a synthetic weight sensor to determine the number of letters in a mailbox. The next step involves deploying these virtual sensors using Docker containers. This lab, “Deployment of VSs – Using Containers,” will guide you through the process of containerizing your Python scripts. This will enhance your understanding of modern deployment practices and prepare you for real-world IoT application development.
What is Docker?
Docker is a platform for developing, shipping, and running applications in lightweight and portable containers. A container can be thought of as a standardized unit of software that packages up code and all its dependencies so the application runs quickly and reliably in different computing environments.
Why Use Docker for Virtual Sensors?
- Isolation: Each virtual sensor runs in its own environment, ensuring that dependencies do not conflict.
- Reproducibility: Docker containers ensure that your application runs the same way everywhere.
- Scalability: Easily scale your application by creating multiple instances of a container.
Step-by-Step Guide
Install Docker
Ensure Docker is installed on your system. Docker provides a consistent environment for your application, regardless of where it runs. It should already be installed, since we used Docker for the OpenRemote installation.
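To confirm that Docker is available before you continue, you can run the following commands; the first prints the installed version and the second starts a minimal test container:
docker --version
docker run hello-world
If either command fails, revisit the Docker installation steps from the OpenRemote lab before proceeding.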
Create a Dockerfile
A Dockerfile is a script containing commands to assemble a Docker image. Each command adds a layer to the image, installing software, copying files, or configuring settings.
Example for mail_virtual_sensor.py:
# Start with a Python base image
FROM python:3.8-slim
# Set a working directory in the container
WORKDIR /usr/src/app
# Copy the Python script into the container
COPY mail_virtual_sensor.py ./
# Install the paho-mqtt MQTT client library
RUN pip install --no-cache-dir paho-mqtt==1.5.0
# Command to run the script when the container starts
CMD ["python", "./mail_virtual_sensor.py"]
Explanation: This Dockerfile starts with a Python 3.8 image, sets up a working directory, copies your script into the container, installs the necessary packages, and specifies the command to run your script.
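As a side note, if your sensor scripts later need more dependencies, a common pattern is to list them in a requirements.txt file and install them in one step. A minimal sketch, assuming you create a requirements.txt next to the script containing the line paho-mqtt==1.5.0:
# Copy the dependency list into the image and install everything it names
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt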
Build the Docker Image
Use the docker build command to create an image from your Dockerfile. This image contains everything needed to run your virtual sensor.
docker build -t mail_virtual_sensor .
Explanation: This command builds an image named mail_virtual_sensor from the Dockerfile in the current directory.
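You can confirm that the image was created by listing your local images; mail_virtual_sensor should appear in the output with the latest tag:
docker images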
Run the Container
Start a container based on your new image. This is your virtual sensor running in an isolated environment.
docker run -d --name mail_sensor_container mail_virtual_sensor
Explanation: This command runs the mail_virtual_sensor image in a container named mail_sensor_container. The -d flag runs the container in detached mode, allowing it to run in the background.
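To verify that the container is actually running, list the active containers:
docker ps
If the script exits immediately (for example, because it cannot reach the MQTT broker), the container will stop; docker ps -a also shows stopped containers together with their exit status.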
Monitoring and Logs
Docker provides commands to check the status of your containers and view logs, helping you understand the behavior of your virtual sensors.
docker logs mail_sensor_container
Explanation: This command displays logs from the mail_sensor_container, which can be crucial for debugging and monitoring.
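Two useful variations: -f streams new log lines as they are produced, and --tail limits the output to the most recent lines:
docker logs -f mail_sensor_container
docker logs --tail 50 mail_sensor_container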
By following these steps, you have successfully containerized one of the two scripts, making it more portable and easier to deploy. Docker’s simplicity and efficiency are why it is a popular choice for deploying applications like your virtual sensors. Using the same steps, you can also containerize the other Python script developed last time (the weight sensor), as sketched below.
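As a starting point, here is a minimal Dockerfile sketch for the weight sensor, assuming the script is named weight_virtual_sensor.py (a hypothetical filename; adjust it and any dependencies to match your own script):
# Start with the same Python base image
FROM python:3.8-slim
# Set a working directory in the container
WORKDIR /usr/src/app
# Copy the weight sensor script (hypothetical filename) into the container
COPY weight_virtual_sensor.py ./
# Install the paho-mqtt MQTT client library
RUN pip install --no-cache-dir paho-mqtt==1.5.0
# Command to run the script when the container starts
CMD ["python", "./weight_virtual_sensor.py"]
Build and run it just like before, with its own names, for example docker build -t weight_virtual_sensor . followed by docker run -d --name weight_sensor_container weight_virtual_sensor.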
OpenRemote Backup/Migration
We have successfully containerized the virtual sensors, but what about everything we have set up and stored in OpenRemote? How can we replicate what we have already created? We can use what are called “Docker volumes”. By using volumes effectively, you can ensure that your application’s data is preserved, easily backed up, and migrated or replicated as needed.
Understanding Docker Volumes in OpenRemote Context
- Persistence: Docker volumes store persistent data such as database files, configuration settings, and state information for OpenRemote. Without volumes, this data would be ephemeral and lost when containers are stopped or removed.
- Data Sharing and Consistency: Volumes ensure consistent data across container restarts and updates, essential for services like databases in OpenRemote.
- Backup and Migration: Volumes are key for backing up OpenRemote’s important data and facilitating migration to new instances or hosts.
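You can list the volumes on your host and inspect where Docker stores them with:
docker volume ls
docker volume inspect manager-data
Note that in a Docker Compose deployment the volume names are usually prefixed with the project name (for example openremote_manager-data); docker volume ls shows the exact names to use in the commands below.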
How to Use the Docker Volumes of OpenRemote
- Creating a Volume for OpenRemote: OpenRemote uses predefined volumes, such as proxy-data, manager-data, and postgresql-data, which are defined in docker-compose.yml.
- Using Volumes with OpenRemote Containers: example of attaching the manager-data volume in docker-compose.yml:
services:
  manager:
    image: openremote/manager
    volumes:
      - manager-data:/storage
volumes:
  manager-data:
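After editing docker-compose.yml, recreate the affected services so that the volume mapping takes effect:
docker compose up -d
(On older installations the command is docker-compose up -d.)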
- Backing Up OpenRemote Volumes: backup command example:
docker run --rm -v manager-data:/data -v $(pwd):/backup ubuntu tar -czvf /backup/manager-data-backup.tar.gz -C /data ./
Repeat for the other volumes, such as proxy-data and postgresql-data.
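Since the same pattern applies to every volume, a small shell loop can back them all up in one go. A sketch, assuming the default volume names and that you have stopped the stack first (for example with docker compose stop) so the PostgreSQL files are in a consistent state:
for v in proxy-data manager-data postgresql-data; do
  docker run --rm -v ${v}:/data -v $(pwd):/backup ubuntu \
    tar -czvf /backup/${v}-backup.tar.gz -C /data ./
done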
- Restoring OpenRemote Volumes: restore command example:
docker run --rm -v manager-data:/data -v $(pwd):/backup ubuntu tar -xzvf /backup/manager-data-backup.tar.gz -C /data
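When restoring on a new machine, stop the OpenRemote containers first so that nothing writes to the volume while you unpack the archive. Docker creates a named volume automatically the first time it is used, but you can also create it explicitly:
docker compose stop
docker volume create manager-data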
- Replicating Data for a New OpenRemote Deployment: use the backed-up data and volumes to set up a new instance. Define the same volumes in the new docker-compose.yml file and restore the data into them before starting the stack.
Important Notes for OpenRemote
- Volume Location: Managed by Docker, with custom paths optional.
- Volume Drivers: Different drivers offer features like remote storage.
- Data Consistency: Regular backups are essential to prevent data loss.