The software running on a Raspberry Pi is like any (production) system: it benefits from proper logs. Logs are crucial for debugging, monitoring, and alerting, and for keeping systems up and running without interruptions. Depending on your setup, you might have several Raspberry Pis (or any other IoT devices), so the logs from all the devices need to be aggregated and accessible in a single place.
There are many alternatives for centralized logging, and Google Cloud Logging is one of them. But why choose it? Well, it’s practically free, at least on a small scale, and it provides centralized logging with easy integration. You can also easily create log-based metrics and alerts on top of them, which gives visibility into your devices.
Create a service account
First, you need to create a service account that allows writing logs from the Raspberry Pi. You can create the service account from the GCP console: https://cloud.google.com/docs/authentication/production At a minimum, the service account needs the “Logs Writer” role.
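If you prefer the command line, the same can be done with the gcloud CLI. A rough sketch; the project ID and account name below are placeholders for your own values:

```shell
# Placeholders: my-project (project ID), pi-logger (service account name)
gcloud iam service-accounts create pi-logger \
    --project=my-project \
    --display-name="Raspberry Pi log writer"

# Grant the Logs Writer role to the new service account
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:pi-logger@my-project.iam.gserviceaccount.com" \
    --role="roles/logging.logWriter"
```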
Once the service account is ready, you need to create a new key and download it as a JSON file. This file contains the authentication credentials for the logging driver, and you provide it by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable. I’m running the application using Docker, so the easiest way is to configure Docker through systemd. Many Linux distributions use systemd to start the Docker daemon, so you can override the environment variables by adding a configuration file to /etc/systemd/system/docker.service.d/.
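For example, a drop-in file like /etc/systemd/system/docker.service.d/gcp-credentials.conf could look as follows (the file name and the key path are my assumptions; adjust them to wherever you store the key):

```ini
[Service]
# Path to the downloaded service account key (placeholder path)
Environment="GOOGLE_APPLICATION_CREDENTIALS=/etc/docker/gcp-service-account.json"
```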
This makes the environment variable available to the Docker daemon and thus to the logging driver. Remember to restart Docker after this:
sudo systemctl daemon-reload
sudo systemctl restart docker
Using the Google Cloud Logging driver
There are different ways of uploading the logs to GCP. An easy one is the Google Cloud Logging driver, which can be set as the default logging driver for Docker containers. You can enable it per container with the --log-driver flag:
docker run --log-driver=gcplogs ...
Or by adding the following JSON to /etc/docker/daemon.json, so the logging driver is used by all Docker containers.
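A minimal /etc/docker/daemon.json sketch; the project ID and instance name are placeholders:

```json
{
  "log-driver": "gcplogs",
  "log-opts": {
    "gcp-project": "my-project",
    "gcp-meta-name": "raspberrypi-1"
  }
}
```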
gcp-meta-name specifies the name of the instance the logs are coming from. If you have multiple Raspberry Pi devices sending logs to the same log bucket, you can use the Raspberry Pi’s serial number to differentiate the devices. Run the following command on the Raspberry Pi and use the output as the gcp-meta-name value:
cat /proc/cpuinfo | grep Serial | cut -d ' ' -f 2
Another way is to use Fluentd, which can forward the logs to various destinations, including Google Cloud Logging.
Setting up Fluentd requires a bit more work though. I decided to build a separate Docker image for Fluentd which contains the necessary configurations.
I’m using the ARMv7-specific image of Fluentd as the base image and install the fluent-plugin-google-cloud output plugin on top of it. This plugin pushes the logs to the Cloud Logging API.
I also ran into plenty of permission issues with the Fluentd plugins until I adjusted the permissions of the plugin files. I never really understood why this happens, but changing the file permissions to 644 fixed the issue.
FROM fluent/fluentd:v1.11.2-debian-armhf-1.0
USER root
RUN apt-get update
RUN apt-get install -y make gcc g++ libc6-dev ruby-dev libffi-dev ruby
RUN echo 'gem: --no-document' >> /etc/gemrc
RUN gem install fluent-plugin-google-cloud -v 0.6.25.1
RUN chmod 644 /usr/local/bundle/gems/fluent-plugin-google-cloud-0.6.25.1/lib/fluent/plugin/*.rb
COPY ./config/fluent/fluent.conf /fluentd/etc/fluent.conf
USER fluent
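The image can then be built and the configuration sanity-checked before deploying; the image tag here is a placeholder:

```shell
# Build the custom Fluentd image (tag is a placeholder)
docker build -t fluentd-gcp .

# Optionally check that the configuration parses without starting Fluentd for real
docker run --rm fluentd-gcp fluentd --dry-run -c /fluentd/etc/fluent.conf
```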
The configuration file is quite simple. I’m also using JSON detection, as the applications log everything as JSON.
<match **>
  @type google_cloud
  detect_json true
  buffer_type file
  buffer_queue_full_action block
  num_threads 2
</match>
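That covers the output side, but the logs also need a way into Fluentd. Assuming the containers log through Docker’s fluentd logging driver to the standard forward input on port 24224 (the same port mapped in the docker-compose file), the input section might look like:

```
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
```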
The docker-compose file starts the application container(s) and the Fluentd instance. Something along these lines (the application image name and build path are placeholders):

services:
  fluentd:
    build: ./fluentd
    ports:
      - "24224:24224"
  mycontainer:
    image: myapp
    logging:
      driver: fluentd
There you go! Logs are automatically uploaded from the Raspberry Pi and available on GCP: