How to use Google Cloud Logging with Raspberry Pi

Antti Havanko
4 min read · Sep 30, 2020

The software running on a Raspberry Pi is like any (production) system: it benefits from proper logs. Logs are crucial for debugging, monitoring, alerting, and just keeping the system up and running without interruptions. Depending on your setup, you might have several Raspberry Pis (or any other IoT devices), so the logs from all the devices need to be aggregated and accessible in a single place.

There are many alternatives for centralized logging, and Google Cloud Logging is one of them. But why choose it? Well, it’s practically free, at least on a small scale, and it provides centralized logging with easy integration. You can also easily create log-based metrics and alerts on top of them, which gives you visibility into your devices.

Create a service account

First, you need to create a service account that allows writing logs from the Raspberry Pi. You can create the service account from the GCP console: https://cloud.google.com/docs/authentication/production At minimum, it needs the “Logs Writer” role.
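If you prefer the command line, the same can be done with the gcloud CLI. A minimal sketch, where the rpi-logger account name and the my-gcp-project project ID are placeholders:

# Create a service account for the Raspberry Pi (name is just an example)
gcloud iam service-accounts create rpi-logger \
  --project=my-gcp-project \
  --display-name="Raspberry Pi log writer"

# Grant it permission to write logs
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:rpi-logger@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"

# Create and download a JSON key for it
gcloud iam service-accounts keys create service-account.json \
  --iam-account=rpi-logger@my-gcp-project.iam.gserviceaccount.com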

Once the service account is ready, you need to create a new key and download it as a JSON file. This file contains the authentication credentials for the logging driver, and you provide it by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable. I’m running the application using Docker, so the easiest way is to configure Docker through systemd. Many Linux distributions use systemd to start the Docker daemon, so you can override its environment variables by adding a configuration file to /etc/systemd/system/docker.service.d/.

Example: /etc/systemd/system/docker.service.d/service-account-env-variables.conf:

[Service]
Environment="GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json"

This passes the environment variable to the Docker daemon, and from there to the logging driver. Remember to restart Docker after this:

sudo systemctl daemon-reload
sudo systemctl restart docker
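If you want to double-check that the override took effect, systemd can show the daemon’s environment:

sudo systemctl show docker --property=Environment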

Using the Google Cloud Logging driver

There are different ways of uploading the logs to GCP. An easy way is to use the Google Cloud Logging driver, which can be set as the default logging driver for Docker containers. You can enable it per container with the --log-driver option:

docker run --log-driver=gcplogs ...

Or add the following JSON to /etc/docker/daemon.json so the driver is used by all Docker containers:

{
  "log-driver": "gcplogs",
  "log-opts": {
    "gcp-meta-name": "10000000c8ac179f",
    "gcp-project": "my-gcp-project",
    "mode": "non-blocking",
    "max-buffer-size": "2m"
  }
}
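As with the systemd override, restart Docker after editing daemon.json so the new default takes effect:

sudo systemctl restart docker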

gcp-meta-name specifies the name of the instance the logs are coming from. If you have multiple Raspberry Pi devices sending logs to the same log bucket, you can use the Raspberry Pi’s serial number to differentiate the devices. Run the following command on the Raspberry Pi and use the output as gcp-meta-name.

cat /proc/cpuinfo | grep Serial | cut -d ' ' -f 2
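To verify that entries are actually arriving, you can query them with the gcloud CLI. As far as I can tell, the gcplogs driver writes under the gcplogs-docker-driver log name; adjust the project ID to match yours:

gcloud logging read 'logName="projects/my-gcp-project/logs/gcplogs-docker-driver"' \
  --project=my-gcp-project --limit=5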

Using Fluentd

Another way is to use Fluentd, which can forward logs to various destinations, including Google Cloud Logging.

Setting up Fluentd requires a bit more work, though. I decided to build a separate Docker image for Fluentd that contains the necessary configuration.

I’m using the ARMv7-specific Fluentd image as the base and installing the fluent-plugin-google-cloud output plugin, which pushes the logs to the Cloud Logging API.

I also ran into plenty of permission issues with the Fluentd plugins unless I updated the permissions on all of them. I never really understood why this happens, but changing the file permissions to 644 fixed the issue.

FROM fluent/fluentd:v1.11.2-debian-armhf-1.0

USER root

RUN apt-get update
RUN apt-get install -y make gcc g++ libc6-dev ruby-dev libffi-dev \
    ca-certificates \
    liblz4-1 \
    ruby
RUN echo 'gem: --no-document' >> /etc/gemrc
RUN gem install fluent-plugin-google-cloud -v 0.6.25.1
RUN chmod 644 /usr/local/bundle/gems/fluent-plugin-google-cloud-0.6.25.1/lib/fluent/plugin/*.rb

COPY ./config/fluent/fluent.conf /fluentd/etc/fluent.conf

USER fluent
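The image builds like any other; the myfluentimage tag is just the name I refer to in the docker-compose file below:

docker build -t myfluentimage .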

The configuration file is quite simple. I’m also using the JSON parser filter plugin, since the applications log everything as JSON.

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<filter **>
  @type parser
  key_name log
  reserve_data true
  <parse>
    @type json
  </parse>
</filter>

<filter **>
  @type add_insert_ids
</filter>

<match **>
  @type google_cloud
  enable_metadata_agent false
  vm_id "<GCP_PROJECT_ID>"
  zone "<GCP_ZONE>"
  split_logs_by_tag false
  use_metadata_service false
  detect_json true
  buffer_type file
  buffer_path /fluentd/log/fluentd.buffer
  buffer_queue_full_action block
  num_threads 2
  use_grpc true
</match>
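To make the parser filter concrete, here is roughly what happens to a record (a made-up example; the wrapper fields come from the fluentd logging driver):

# The application writes one JSON line to stdout:
{"temp": 21.5, "status": "ok"}

# Fluentd receives it wrapped by the fluentd logging driver, roughly:
{"log": "{\"temp\": 21.5, \"status\": \"ok\"}", "container_name": "/mycontainer", "source": "stdout"}

# After the parser filter (reserve_data true keeps the wrapper fields):
{"log": "{\"temp\": 21.5, \"status\": \"ok\"}", "container_name": "/mycontainer", "source": "stdout", "temp": 21.5, "status": "ok"}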

The docker-compose file starts the application container(s) and the Fluentd instance.

services:
  fluentd:
    image: myfluentimage
    container_name: fluentd
    ports:
      - "24224:24224"
  mycontainer:
    image: mycontainer
    container_name: mycontainer
    depends_on:
      - fluentd
    logging:
      driver: fluentd
      options:
        fluentd-address: localhost:24224
        fluentd-async-connect: "true"
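Then bring everything up and check that Fluentd started cleanly:

docker-compose up -d
docker logs fluentd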

There you go! Logs are automatically uploaded from the Raspberry Pi and available on GCP:

[Screenshot: logs in the Cloud Logging console]
