The idea behind IoT sensors is a network-connected device that provides data about something: temperature, humidity, brightness, distance, electric power consumption, and more. The data is collected or submitted somewhere, and something else can react to it. For example, when a temperature sensor in a house notices that the temperature drops, a smart heating system could fire up and raise the temperature. Another use case is refilling oil-fired heating systems. I have one myself: once in a while I need to order fuel oil as soon as the fuel level reaches a certain threshold. A smart sensor could trigger notifications or even orders to my local oil dealer, so I get a delivery before I run out of fuel.

But how do you transport sensor data? Either you use MQTT or you roll your own, for example using logstash.

I’m a big fan of the ELK stack, and I maintain logstash-gelf, a GELF connector for various Java-based logging frameworks. I came to realize that the ELK stack is suitable for much more than log or transactional data. Event sourcing with the ELK stack gives very detailed insight into applications and provides real-time monitoring of a business. logstash has versatile connectors, and one of them is the TCP input combined with JSON. logstash lets you pipeline the data somewhere: towards ElasticSearch, triggering Nagios events, sending e-mails, and many more.
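As a sketch of that fan-out idea, a logstash output section could route the same events to ElasticSearch and trigger an e-mail alert on a condition. The condition, threshold, and address below are made up for illustration and not part of my setup:

```
output {
    elasticsearch {
        host => "localhost"
    }
    if [temperature] < 15 {
        email {
            to      => "me@example.com"
            subject => "Temperature dropped"
        }
    }
}
```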

I created a test setup incorporating a RaspberryPi carrying an ultrasonic range sensor and an ELK stack.


The RaspberryPi regularly sends its data to the logstash TCP input as JSON. JSON is the simplest data format available on IoT platforms; any binary protocol would work as well but is harder to implement. Sensor data is received by logstash and forwarded to ElasticSearch. The Kibana dashboard allows you to select the data and create visualizations for it.
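For example, a single reading serialized as one JSON line looks like this (the field names are simply the ones this setup uses; the trailing newline is what logstash's json_lines codec splits events on):

```python
import json
import socket

# One sensor event, using the same field names as the sender code below
reading = {
    'message': 'distance 42.5 cm',
    'distance': 42.5,
    'hostname': socket.gethostname(),
}

# json_lines expects one JSON document per line
line = json.dumps(reading) + '\n'
print(line)
```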

Here is the Python code running on the RaspberryPi that gathers the distance value from the sonic sensor and submits it to logstash (more details on wiring the sonic sensor can be found here):

import socket
import json
import time
from distancemeter import get_distance, cleanup

# Logstash TCP/JSON Host
JSON_HOST = 'localhost'  # adjust to the host running logstash
JSON_PORT = 9400

if __name__ == '__main__':
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((JSON_HOST, JSON_PORT))

    try:
        while True:
            distance = get_distance()
            data = {'message': 'distance %.1f cm' % distance,
                    'distance': distance,
                    'hostname': socket.gethostname()}

            # json_lines expects one JSON document per line
            s.sendall((json.dumps(data) + '\n').encode('utf-8'))

            print("Received distance = %.1f cm" % distance)
            time.sleep(1)

    # interrupt
    except KeyboardInterrupt:
        print("Program interrupted")
    finally:
        cleanup()
        s.close()

The logstash config contains two config elements: The input and the output. The input is a simple TCP input accepting messages in JSON format. The output goes straight to ElasticSearch:

input {
    tcp {
        port => 9400
        codec => "json_lines"
    }
}

output {
    elasticsearch {
        host => "localhost"
        port => 9200
        index => "distance-%{+YYYY.MM.dd}"
    }
}
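The pipeline can also be exercised without the RaspberryPi by hand-feeding an event into the TCP input. The helper below is a sketch (the function names are mine; host and port match the config above):

```python
import json
import socket

def encode_reading(distance, hostname='test-host'):
    """Encode one distance reading as a newline-terminated JSON line."""
    event = {'message': 'distance %.1f cm' % distance,
             'distance': distance,
             'hostname': hostname}
    return (json.dumps(event) + '\n').encode('utf-8')

def send_reading(distance, host='localhost', port=9400):
    """Send a single fake reading to the logstash TCP input."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    s.sendall(encode_reading(distance))
    s.close()

# send_reading(42.5)  # uncomment with logstash listening on localhost:9400
```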

The visualization on the Kibana frontend displays tabular data and a trend of the distance.

Playing around with the setup is great fun. The setup is minimal with maximum effect. I posted the code on Github so you can try it yourself. The Github samples also contain a demo using MQTT, which is straightforward as well.

@jordansissel is giving a talk on March 11 about time series data and playing around with sensors. Don't miss it!