Prometheus is a time series database. I am currently using Prometheus with Kafka, having the JMX agent expose beans for Prometheus to scrape, with Grafana for visualization. It's great, but temporary: Prometheus is designed to be an ephemeral cache and does not try to solve distributed data storage. To that end, Prometheus provides a "remote_write" configuration option that POSTs sampled data to an endpoint of your choosing, which can then serve as the ingest point for persistence.
The protocol is a snappy-compressed protobuf message containing the sampled data.
Most of the ecosystem around this feature is Go based, to the point where Googling how to do this in other languages mostly turns up suggestions to strip the gogo (golang) items out of the protobuf spec files. I could not find a single example of doing this in another language, so here you go, internet.
1. Get the .proto files Prometheus uses for the "remote_write" protobuf. They live in the prompb/ directory of the Prometheus repository. Make sure you get the ones that match the version of the Prometheus server you are running.
- remote.proto
- types.proto
You will then need gogo.proto, which the Prometheus .proto files import; it can be found in the gogo/protobuf repository under gogoproto/.
In summary, you now have remote, types, and gogo .proto files.
2. Compile these .proto files with protoc into your language of choice. For this tutorial I am going to use Python 3. You could build protoc from source, but I found it easier to just download a precompiled binary from the protobuf releases page. You want the protoc-3.11.2-PLATFORM.zip for your platform.
Directory layout for these instructions:
./
  protoc/      // unzipped binaries from above
  output/      // destination for our language files
  imports/
    remote.proto
    types.proto
    gogoproto/
      gogo.proto
./protoc/bin/protoc --proto_path=./imports --python_out=./output/ imports/remote.proto
./protoc/bin/protoc --proto_path=./imports --python_out=./output/ imports/types.proto
./protoc/bin/protoc --proto_path=./imports --python_out=./output/ imports/gogoproto/gogo.proto
Order doesn't matter; for some reason it won't generate the imported items automatically, so each .proto has to be compiled explicitly... I assume this is because I don't know enough about the protobuf system.
In any event, you should then see the following:
ls -R output/
output/:
gogoproto  remote_pb2.py  types_pb2.py

output/gogoproto:
gogo_pb2.py
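If you want a quick sanity check that the generated modules actually import, a small sketch like this works (the sys.path line is only needed because the generated files live under output/):

import sys
sys.path.insert(0, "output")  # make the generated modules importable

import remote_pb2  # pulls in types_pb2 and gogoproto/gogo_pb2 as well

print(remote_pb2.WriteRequest)  # should print the generated message class

If that import fails, something went wrong with the protoc step above.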
3. Spin up a server:
from http.server import HTTPServer, BaseHTTPRequestHandler
from google.protobuf.json_format import MessageToJson
import remote_pb2  # generated in step 2; run from output/ or copy the generated files next to this script
import snappy


class SimpleHTTPRequestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'Hello, world!')

    def do_POST(self):
        # Prometheus POSTs a snappy-compressed, serialized WriteRequest.
        content_length = int(self.headers['Content-Length'])
        body = self.rfile.read(content_length)
        msg = remote_pb2.WriteRequest()
        msg.ParseFromString(snappy.uncompress(body))
        print(MessageToJson(msg))
        # Acknowledge the write so Prometheus doesn't treat it as a failure.
        self.send_response(200)
        self.end_headers()


httpd = HTTPServer(('localhost', 8000), SimpleHTTPRequestHandler)
httpd.serve_forever()
Make sure you have installed (via pip or otherwise) the Google protobuf and python-snappy libraries.
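Before pointing Prometheus at it, you can exercise the server with a hand-rolled request. This is just a sketch: the metric name and label value are made up for illustration, and it assumes the server above is running on localhost:8000 and the generated remote_pb2 module is importable.

import time
import urllib.request

import snappy
import remote_pb2

# Build a WriteRequest with one series and one sample (illustrative values).
write_req = remote_pb2.WriteRequest()
series = write_req.timeseries.add()

label = series.labels.add()
label.name = "__name__"          # Prometheus stores the metric name as a label
label.value = "example_metric"

sample = series.samples.add()
sample.value = 42.0
sample.timestamp = int(time.time() * 1000)  # milliseconds since the epoch

# Same wire format Prometheus uses: serialized protobuf, snappy-compressed.
payload = snappy.compress(write_req.SerializeToString())
req = urllib.request.Request(
    "http://localhost:8000/receive",
    data=payload,
    headers={
        "Content-Type": "application/x-protobuf",
        "Content-Encoding": "snappy",
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 means the server parsed and logged the request

If everything is wired up, the server's console should print the request as JSON, the same way it will for real Prometheus traffic.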
4. Set the remote_write config section in prometheus.yml for your Prometheus server:
remote_write:
  - url: 'http://localhost:8000/receive'
5. Start the Python server and start/restart Prometheus. You should shortly see data coming in fairly frequently.
--
That was an intense afternoon. The original documentation and suggestions indicated that this was not doable outside of Go, which seemed weird considering the whole point of an agnostic data format like protobuf is specifically to avoid ecosystem bullshittery.
Happy coding!