Set Up an IoT Gateway on a Pi 4

Continuing from my last post, this part covers how I set up docker-compose on the Pi 4 (64-bit Pi OS) so that I can recover quickly from a system failure.

Learning Docker Compose required some effort. Using the scripts at IOTStack works for the most part, but they leave you no better equipped to troubleshoot when something goes wrong.

The compose file, which configures Node-RED, InfluxDB 2, Grafana, MQTT, Portainer, and Telegraf, can be found at https://github.com/chrapchp/Raspberry-DockerCompose. Telegraf was not strictly needed but is included anyway to demonstrate that it works by pushing system metrics to InfluxDB.

Docker Compose was run on both Pi OS 64-bit and Windows 10. This setup is not meant for a production environment, as it falls short on security. A DevOps mess, really: the GitHub repo has .pem and .env files checked in, which is an anti-pattern. Please refer to sources like this one to properly manage secrets.

Back-of-Napkin Requirements

Node-RED

  • SSL via self-signed certificate
  • Username/password authentication configured by default
  • Config, flows, additional components saved outside of docker container
  • Parked – SSL certificate by trusted authority
  • Parked – Reverse Proxy

InfluxDB

  • Use Version 2
  • Config is preset so that no additional configuration is required when Docker Compose runs
  • Test with Python by pushing three sinusoidal signals at different frequencies
  • Parked – SSL

MQTT

  • Accept any connection with no authentication
  • Parked – Authentication (the config should allow for it)
  • Parked – SSL

Telegraf

  • Work without configuration and publish system metrics to InfluxDB
  • Parked – SSL

Grafana

  • Work with InfluxDB V2
  • Parked – SSL
  • Parked – SAML
  • Parked – Reverse Proxy

Getting Started

Preconditions:

  • Pi OS 64-bit installed and ready for SSH. A how-to can be found here; scroll to the Bullseye section and follow the instructions. The latest images are here. My system boots from an SSD; a how-to for that can be found here.
  • Docker installed via the convenience script run from the home directory, and enabled as a service. This could also work with Debian ARM64, but I had not tried that at the time of writing.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo systemctl enable docker
pip3 install docker-compose

Steps

Change to your home directory, e.g. cd ~ under Linux or C:\Users\theuser\ under Windows 10, and clone the repo:

git clone https://github.com/chrapchp/Raspberry-DockerCompose.git

or, if you don’t like Raspberry-DockerCompose as the folder name, pass the desired folder name to the git command, e.g. to clone into folder XYZ or c:\sandbox:

git clone https://github.com/chrapchp/Raspberry-DockerCompose.git XYZ
git clone https://github.com/chrapchp/Raspberry-DockerCompose.git c:\sandbox

Note you will have to edit the .env file under the cloned repo directory to reflect the non-default folder. The default is BIND_VOLUME_ROOT=~/Raspberry-DockerCompose; change it to match, e.g.

BIND_VOLUME_ROOT=~/XYZ
BIND_VOLUME_ROOT=c:\sandbox

Docker Compose will create the host volumes automatically and keeps existing data (e.g. the config files) intact if they already exist.

Edit ~/Raspberry-DockerCompose/.env (or the .env wherever you cloned the repo) and change INFLUX_IP to the IP of the device that is running docker-compose, e.g.

INFLUX_IP=192.168.1.86

From a terminal, run the following. In my case I was in ~/Raspberry-DockerCompose.

docker-compose up 

Docker will download the required images and stream each container's startup log to the terminal.

Smoke Test

Portainer

Open your browser and navigate to https://<ip>:9000. You will be prompted to set the admin password. Provide one and proceed to log in.

Click on Stacks, follow the raspberry-dockercompose link, and the six containers will show up. Explore as you see fit.

Node-RED

Open your browser and navigate to https://<ip>:1880 to log into Node-RED, replacing <ip> with your device's IP address. The repo ships with a self-signed certificate that is not useful beyond getting the images up and running with SSL; the browser will flag it as untrusted since it is self-signed.

Log in with admin/changeOnInstall.

To generate your own certificate, cd into ~/Raspberry-DockerCompose/volumes/nodered/data and run

sudo openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout key.pem -out cert.pem

Since the certificate is self-signed, most of the prompts can be left at their defaults; nobody outside will trust it anyway. Note that Common Name (e.g. server FQDN or YOUR name) should be set to your device's IP, e.g. 192.168.1.23.

Changing the admin password requires running a node-red command, which can be done from the terminal or via Portainer.

From a terminal run

docker exec -it node-red /bin/bash

Or, from Portainer, select the node-red container's console, then connect.

Then run:

node-red admin hash-pw

You will be prompted for a password; provide one and a hash will be generated. Copy the hash; it goes into settings.js.

Assuming you are still in Raspberry-DockerCompose/volumes/nodered/data, edit settings.js and change the admin entry’s password to the new hash (copy and paste).

Restart Node-RED to apply the changes. Flows and installed libraries reside in the data folder; if you set up a new docker-compose environment and want to preserve your settings, copy this folder and everything below it to the new host volume.

InfluxDB

Open your browser and navigate to http://<ip>:8086 to log into InfluxDB, replacing <ip> with your device's IP address. The username/password is theUser/changeOnInstall, as defined in dev-influxdb.env:

DOCKER_INFLUXDB_INIT_MODE=setup
DOCKER_INFLUXDB_INIT_USERNAME=theUser
DOCKER_INFLUXDB_INIT_PASSWORD=changeOnInstall
DOCKER_INFLUXDB_INIT_ORG=theOrg
DOCKER_INFLUXDB_INIT_BUCKET=theBucket
DOCKER_INFLUXDB_INIT_ADMIN_TOKEN=makeMeComplicated

Navigate to Data and then to API Tokens. You will notice a token for theUser; drill down into it. The value is set to makeMeComplicated. This is the same token used by Telegraf, which is not good practice but is OK for learning.

Click on Explore. Select theBucket, filter on name=sda and _field=io_time, and submit. A plot will appear, confirming that Telegraf is running.

From the terminal, change to ~/Raspberry-DockerCompose (or wherever you cloned the repo). On Pi OS 64-bit, pip3 is already installed. Run

pip3 install matplotlib
pip3 install influxdb_client

If you SSH'd in, matplotlib won't be able to display anything useful. Either way, run the following and wait about 20 s.

python ./injectSignals.py
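
For reference, here is a minimal sketch of the kind of thing injectSignals.py does: push three sinusoids into theBucket through the influxdb_client API. The measurement and field names below are assumptions for illustration, not necessarily what the repo script uses.

# Minimal sketch: push three sinusoids into InfluxDB 2 with influxdb_client.
# The measurement name "signal" and field "value" are illustrative assumptions.
import math
import time

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# Credentials as defined in dev-influxdb.env / .env; replace the IP with your device's
client = InfluxDBClient(url="http://192.168.1.86:8086",
                        token="makeMeComplicated", org="theOrg")
write_api = client.write_api(write_options=SYNCHRONOUS)

for n in range(20):                               # about 20 s of data at 1 s resolution
    t = float(n)                                  # treat the sample index as seconds
    carrier = math.sin(2 * math.pi * 0.2 * t)     # 5 s period
    message = math.sin(2 * math.pi * 0.05 * t)    # 20 s period
    signals = {
        "carrier": carrier,
        "message": message,
        "modulatedAM": (1 + 0.5 * message) * carrier,
    }
    points = [Point("signal").tag("tag", name).field("value", value)
              for name, value in signals.items()]
    write_api.write(bucket="theBucket", record=points)
    time.sleep(1)                                 # one sample per second

client.close()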

Go back to the InfluxDB URL and select Explore. You may have to refresh your browser. Select filter=tag and check carrier, message, and modulatedAM.

Select Past 15m, go into the custom window period, and change it to 1s; the Python code injects samples at 1 s resolution. Click Submit and the waveforms should appear.
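
If you would rather check from a terminal than the UI, the same data can be read back with the influxdb_client query API. A minimal sketch, assuming the credentials from dev-influxdb.env (the Flux mirrors the Grafana query shown later):

# Minimal sketch: read the injected signals back via the Flux query API.
# Assumes injectSignals.py ran within the last 15 minutes.
from influxdb_client import InfluxDBClient

query = '''
from(bucket: "theBucket")
  |> range(start: -15m)
  |> filter(fn: (r) => r["tag"] == "carrier" or r["tag"] == "message" or r["tag"] == "modulatedAM")
  |> aggregateWindow(every: 1s, fn: mean, createEmpty: false)
'''

client = InfluxDBClient(url="http://192.168.1.86:8086",
                        token="makeMeComplicated", org="theOrg")
for table in client.query_api().query(query):
    for record in table.records:
        print(record.get_time(), record.values.get("tag"), record.get_value())
client.close()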

Grafana

Open your browser and navigate to http://<ip>:3000 to log into Grafana, replacing <ip> with your device's IP address. The default user/password is admin/admin; change the password when prompted. Select Data Sources, then select InfluxDB.

Select Flux as the query language and turn off Basic auth. Ensure the URL points to your device IP. Port defaults to 8086.

As per dev-influxdb.env, set Organization to theOrg, Token to makeMeComplicated, and Default Bucket to theBucket. Save & Test.

Go to Explore, make sure the InfluxDB source is selected, and paste the following as the query. Run the query. Depending on how long ago you ran injectSignals.py, you may have to change the period in the dropdown.

from(bucket: "theBucket")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["tag"] == "carrier" or r["tag"] == "message" or r["tag"] == "modulatedAM")
  |> aggregateWindow(every: 1s, fn: mean, createEmpty: false)
  |> yield(name: "mean")

If all works, three signals should appear.

MQTT

On Windows 10, install MQTT Explorer (mqtt-explorer.com), start it up, create a new connection with your device's IP address, and connect.

On Linux, run

sudo apt install -y mosquitto mosquitto-clients

In one terminal, run the subscriber:

mosquitto_sub -d -t YourTopic

In another terminal, run the publisher

mosquitto_pub -d -t YourTopic -m "Hello world!"

Hello world! will appear in the subscriber terminal.
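
The same smoke test can be scripted in Python with the paho-mqtt package (pip3 install paho-mqtt). A minimal sketch, assuming a paho-mqtt 1.x constructor, the broker on 192.168.1.86, and the default port 1883:

# Minimal sketch: subscribe and publish against the Mosquitto container.
# Assumes paho-mqtt 1.x and a broker that allows anonymous connections.
import time
import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

BROKER = "192.168.1.86"   # replace with your device IP
TOPIC = "YourTopic"

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

sub = mqtt.Client()                # paho-mqtt 1.x style constructor
sub.on_message = on_message
sub.connect(BROKER, 1883)
sub.subscribe(TOPIC)
sub.loop_start()                   # process network traffic in a background thread
time.sleep(1)                      # give the subscription a moment to register

publish.single(TOPIC, "Hello world!", hostname=BROKER)   # one-shot publish

time.sleep(2)                      # give the message time to arrive
sub.loop_stop()
sub.disconnect()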

System Restart

Since Docker runs as a service and the restart policy for each docker-compose service is set to restart: unless-stopped, everything required comes back up automatically on system startup; nothing more is needed. Tested under Pi OS 64-bit.
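
If you want to double-check after a reboot, the Docker SDK for Python (pip3 install docker) can list the containers and their status. A minimal sketch, assuming your user can reach the Docker socket:

# Quick post-reboot check: list every container in the stack and its status.
import docker

client = docker.from_env()
for container in client.containers.list(all=True):
    print(f"{container.name:<12} {container.status}")   # expect "running" for all six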


IoT Gateway on the Raspberry Pi 4

Twelve years ago, I built a power monitoring system acting as a Modbus slave device. It has been running 24×7 ever since. The hydroponic monitoring and control system built four years ago provides me with lettuce and herbs for the winter. Both systems integrate with version 3 of Mango, an open-source SCADA/HMI. Mango ran on old PCs, then on a repurposed Mac Mini with Ubuntu 20.04. Still, something was missing. I wanted to run 24×7 on even lower power. Enter the trusty Raspberry Pi.

Edge Computing

Having worked in both control systems engineering and pure tech, I can see the appeal of the IoT buzzword mania. Some PLC vendors are bolting MQTT onto their products and rebranding them as IoT-ready, while some larger legacy vendors are making their already bloated ecosystems of products even more bloated.

Mango and atvise have architected their solutions to be cloud ready.

Sadly, a lot of the “home” examples with Node-RED and InfluxDB are overly simplistic and do not reflect the reality of the workflow involved in designing, commissioning, and maintaining large industrial control systems.

I tried the MQTT-to-InfluxDB route via Node-RED and would not want to maintain a large system with that approach. I am not a fan of flow-chart programming, as it just becomes a pain to “refactor”. Weidmueller now has Node-RED baked into some of their controllers, and I briefly looked at it. What I do like about Node-RED, though, is that it is built on Node.js and gives engineers the option of developing custom components to implement recurring design patterns. This tames the flow-chart hell.

TO-BE Context Diagram

Flash back a few months. I installed Node-RED, Mosquitto MQTT, and InfluxDB on my Pi 4 8 GB. Node-RED polled the power monitor via Modbus and pushed the data into Influx. All was humming along until I did something stupid and could no longer boot my Pi.

Guess what: I forgot what I did and only had a handful of config backups. Time to think about containerizing the software using Docker and Docker Compose to avoid this quagmire next time.

Note that the architecture below increases the number of failure points compared to the Modbus-to-SCADA integration. I would not architect something for real control systems without security and fault tolerance baked in. For home use it is more than good enough.

Modbus is simple. Floats, integers, etc. flow nicely through a simple protocol, and many devices implement it. MQTT, on the other hand, adds more overhead and is overkill in this scenario. I need to explore how best to define a “tag” and propagate it through the system without much coding, just like traditional PLC/HMI systems. Over-the-air updates would help in that situation and remain TBD.

Plans to push Influx/Grafana to the cloud are in the works. On-premise is good enough for repurposing the existing controllers as edge devices. The hydroponics controller once hosted the CO2/temperature/relative humidity sensors, which are not used for control; they have now moved to two new boards dedicated to air quality.

I like the fact that Python gives me access to data science, ML, and signal processing tools, so I opted to use it to handle MQTT messages and marshal them to Influx. Migrating to a custom Node-RED component is in the backlog, but I can crank out code faster in Python than in JavaScript.
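
As a taste of what that Python glue can look like, here is a minimal sketch of an MQTT-to-Influx bridge built on paho-mqtt and influxdb_client. The topic layout, payload format, and measurement names are assumptions for illustration, not the actual implementation.

# Minimal sketch of an MQTT-to-InfluxDB bridge (illustrative, not the actual implementation).
# Assumptions: numeric payloads on topics shaped like "home/<device>/<tag>", and the
# same InfluxDB credentials as dev-influxdb.env.
import paho.mqtt.client as mqtt
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

influx = InfluxDBClient(url="http://192.168.1.86:8086",
                        token="makeMeComplicated", org="theOrg")
write_api = influx.write_api(write_options=SYNCHRONOUS)

def on_message(client, userdata, msg):
    # e.g. home/powerMonitor/voltage -> measurement "powerMonitor", tag "voltage"
    try:
        _, device, tag = msg.topic.split("/", 2)
        value = float(msg.payload.decode())
    except ValueError:
        return                               # unexpected topic shape or non-numeric payload
    point = Point(device).tag("tag", tag).field("value", value)
    write_api.write(bucket="theBucket", record=point)

client = mqtt.Client()                       # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("192.168.1.86", 1883)
client.subscribe("home/#")                   # everything under the assumed topic root
client.loop_forever()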

Dev Environment

My environment contains the standard oscilloscope/logic analyzer, signal generator, and power supply. On the software side, I spend most of my time in Jupyter or Visual Studio Code. The following outlines some of the software I use.

Next Steps

  • Dockerize InfluxDB 2, Node-RED, Grafana, and Portainer
  • Learn Flux
  • Create the Grafana dashboards
  • Write the MQTT to Influx logic in Python
  • Refactor the board logic to communicate via MQTT
  • Create basic control screens in Node-RED


MEAN Tools Installation

Well, after some thought, I figured it was time to roll up my sleeves and install some tools and frameworks to start on my minimalist IoT playground. I use macOS and will focus just on that.

Environment under macOS

I first started to go down the path provided at mean.io and felt it was too much of a heavy lift for a newbie trying to ramp up on four technologies at the same time. I opted to install each of them by hand so I could see the types of problems that can occur.

I installed the following:

Sublime Text – Nice editor and I started using it for Arduino development as well

MongoDB – I used the Homebrew approach.

$ brew install mongodb --with-openssl
$ sudo mkdir -p /data/db
$ whoami
youraccount
$ sudo chown youraccount /data/db
# Default is no authentication required, so user beware.
# launch mongodb
$ mongod

Node Version Manager (NVM) – Used to manage different versions of Node.js. Note that I have Xcode installed; you may need the command-line tools later.

$ curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.2/install.sh | bash
$ nvm install node
$ nvm ls # list of node versions installed
$ nvm 
$ nvm alias default 7.8.0 # I want to keep version 7.8.0 as default

If NVM is too much of a hassle, get Node.js directly via download from nodejs.org.

Node.js – The current download is already newer than the version I have (7.8.0). This is an easy install and should not pose any problems.

Express Generator – another straightforward install for a lightweight web framework

$ npm install -g express-generator
# change to the directory where you want to install the express templates. e.g. mine is chrapchp/Dev/nodes
$ express HomeSensors # what I called my app
$ cd HomeSensors
$ npm install

I installed the following as well based on what I thought I needed for this learning exercise.

  • log4js – log4js-based logging services for node.js. Install: npm install log4js -S
  • monk – wrapper for MongoDB that is simpler, yet not as powerful, as mongoose. Install: npm install monk -S
  • nodemon – listens for file changes and restarts the server. Install: npm install nodemon -g
  • dummy-json – tool to generate JSON files used for my testing. Install: npm install dummy-json -g
  • Robomongo – MongoDB manager. Install: download and point to the MongoDB instance (default localhost:27017)
  • Bluebird – promise library implementation. Install: npm install bluebird -S
  • SerialPort – serial port driver for node.js. Install: npm install serialport -S # have 4.0.7
  • xbee-api – XBee API for node.js. Install: npm install xbee-api -S

Off to learning this stuff.

Empowering the Many

Hello MEAN stack

A few years ago I had boatloads of temperature-envelope data for the inside and outside of my house. When I was looking for quotes to re-insulate my old house, an insulation vendor expressed interest in purchasing my before-and-after analysis and results. I did not proceed with a full re-insulation, but I did end up losing my data, which was 100% my fault: I did not back up to a NAS and experienced a hard drive failure.

Fast forward to today. There is lots of talk of IoT, analytics, and cloud services. Many vendors, I feel, are putting lipstick on their outdated products, so buyer beware. That said, the various IoT ecosystems provided through services such as Microsoft Azure are making it easier to mash up, collect, aggregate, and analyze data. Alarm management and historians may become moot at some point unless vendors provide added-value services such as predictive analytics and performance management solutions.

My interests these days revolve around machine learning and visual analytics, but I do like to keep on top of technology that can be used to marry IoT with the enterprise. With a handful of XBee devices lying around, I have set my sights on ramping up on the MEAN (MongoDB, ExpressJS, AngularJS, Node.js) stack to see what I can come up with for my own use at home. I chose a TypeScript/JavaScript environment as I can get by with basic open-source tools and decent editors without having to get something like Visual Studio.

Key System Architecture Components

1. Configure XBee end devices to sleep and send AI/DI data (temperature, ambient light, etc.) to the coordinator. I tested this a few years ago, so I know it works. Mesh network using API mode.

2. Zero or more routers to relay the messages from the end devices to the coordinator.

3. One coordinator that feeds into the system via a serial port.

4. Node.js + Express to handle the configuration of the I/O wired to the XBees (e.g. scaling, tag name), MongoDB to persist the data, and AngularJS to render the UI.

5. There are three IoT platforms (GE Predix, Xively/ThingSpeak, and Azure IoT) that I have accounts with and would like to push data to as a test. I also have two SCADA systems and one HMI whose IIoT readiness I am going to test.

6. My home power monitoring has been running for 8 years on Arduino and XBee. The next step is to push data rather than poll from the host, to see what that SCADA system can do.

Further down the horizon is the inclusion of some MQTT flavour and Node.js integration.