Configuration as Code framework based on Protobuf and Starlark
Modern services comprise many dynamic variables that need to change regularly. Today, the process is unstructured and error-prone. From ML model parameters, kill switches, and gradual rollout settings to A/B experiment configuration, developers want their code to be configurable down to the finest detail.
Protoconf is a modern approach to software configuration, inspired by Facebook's Configerator.
Using Protoconf enables a structured, validated, code-reviewed configuration workflow.

Configuration is consumed by the service at runtime. This paradigm encourages you to write software that can reconfigure itself at runtime rather than requiring a restart.
As Protoconf uses Protobuf and gRPC, it supports delivering configuration to all major languages. See also: Protobuf overview.
Step-by-step instructions to start developing with Protoconf, using an example from an imaginary Python web crawler service. See the full example under examples/.

Install the protoconf binary (see build from source below).
Write a Protobuf schema under protoconf/src/ (syntax guide: https://developers.google.com/protocol-buffers/docs/proto3).

protoconf/src/crawler/crawler.proto:

```proto
syntax = "proto3";

message Crawler {
  string user_agent = 1;
  int32 http_timeout = 2;
  bool follow_redirects = 3;
}

message CrawlerService {
  repeated Crawler crawlers = 1;
  enum AdminPermission {
    READ_WRITE = 0;
    GOD_MODE = 1;
  }
  map<string, AdminPermission> admins = 2;
  int32 log_level = 3;
}
```
Write validators (optional): a Starlark file placed next to the .proto file, with a .proto-validator suffix.

protoconf/src/crawler/crawler.proto-validator:

```python
load("crawler.proto", "Crawler", "CrawlerService")

def test_crawlers_not_empty(cs):
    if len(cs.crawlers) < 1:
        fail("Crawlers can't be empty")

add_validator(CrawlerService, test_crawlers_not_empty)

def test_http_timeout(c):
    MIN_TIMEOUT = 10
    if c.http_timeout < MIN_TIMEOUT:
        fail("Crawler HTTP timeout must be at least %d, got %d" % (MIN_TIMEOUT, c.http_timeout))

add_validator(Crawler, test_http_timeout)
```
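To make the validation logic concrete outside of Starlark, here is a plain-Python sketch of the same two checks. The dataclasses and the `validate` helper are hypothetical stand-ins for the generated Protobuf messages and Protoconf's validator runner, not part of the library:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for the messages generated from crawler.proto.
@dataclass
class Crawler:
    user_agent: str = ""
    http_timeout: int = 0
    follow_redirects: bool = False

@dataclass
class CrawlerService:
    crawlers: list = field(default_factory=list)

MIN_TIMEOUT = 10

def validate(cs: CrawlerService):
    """Mirrors the two Starlark validators above, raising instead of fail()."""
    if len(cs.crawlers) < 1:
        raise ValueError("Crawlers can't be empty")
    for c in cs.crawlers:
        if c.http_timeout < MIN_TIMEOUT:
            raise ValueError(
                "Crawler HTTP timeout must be at least %d, got %d"
                % (MIN_TIMEOUT, c.http_timeout))

validate(CrawlerService(crawlers=[Crawler("Linux", 30)]))  # passes
```

A config with no crawlers, or a crawler with a timeout below 10, would be rejected at compile time rather than reaching production.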
Write a config: a Starlark .pconf file. Your code can be modular, export functions (ideally in .pinc files), and build a complete custom stack for your configuration needs.

For example, protoconf/src/crawler/text_crawler.pconf:

```python
load("crawler.proto", "Crawler", "CrawlerService")

def default_crawler():
    return Crawler(user_agent="Linux", http_timeout=30)

def main():
    crawlers = []
    for i in range(3):
        crawler = default_crawler()
        crawler.http_timeout = 30 + 30*i
        if i == 0:
            crawler.follow_redirects = True
        crawlers.append(crawler)
    admins = {'superuser': CrawlerService.AdminPermission.GOD_MODE}
    return CrawlerService(crawlers=crawlers, admins=admins, log_level=2)
```
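Since Starlark is a dialect of Python, the loop in main() can be traced in plain Python. This sketch substitutes hypothetical dicts for the Protobuf messages just to show the values the config produces:

```python
# Plain-dict stand-in for default_crawler() in the .pconf above.
def default_crawler():
    return {"user_agent": "Linux", "http_timeout": 30, "follow_redirects": False}

# Same loop as main(): three crawlers with timeouts 30, 60, 90,
# and only the first one following redirects.
def main():
    crawlers = []
    for i in range(3):
        crawler = default_crawler()
        crawler["http_timeout"] = 30 + 30*i
        if i == 0:
            crawler["follow_redirects"] = True
        crawlers.append(crawler)
    return crawlers

print([c["http_timeout"] for c in main()])  # → [30, 60, 90]
```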
Compile with protoconf compile; this creates a materialized config file under protoconf/materialized_config/.

For example, protoconf compile protoconf/ crawler/text_crawler will create protoconf/materialized_config/crawler/text_crawler.materialized_JSON:
```json
{
  "protoFile": "crawler/crawler.proto",
  "value": {
    "@type": "https://CrawlerService",
    "admins": {
      "superuser": "GOD_MODE"
    },
    "crawlers": [
      {
        "userAgent": "Linux",
        "httpTimeout": 30,
        "followRedirects": true
      },
      {
        "userAgent": "Linux",
        "httpTimeout": 60
      },
      {
        "userAgent": "Linux",
        "httpTimeout": 90
      }
    ],
    "logLevel": 2
  }
}
```
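A materialized file is ordinary JSON, so it can be inspected with nothing but the standard library. A minimal sketch, using the JSON literal from the example above inlined as a string:

```python
import json

# The materialized_JSON content from the example above, inlined for illustration.
materialized = json.loads("""
{
  "protoFile": "crawler/crawler.proto",
  "value": {
    "@type": "https://CrawlerService",
    "admins": {"superuser": "GOD_MODE"},
    "crawlers": [
      {"userAgent": "Linux", "httpTimeout": 30, "followRedirects": true},
      {"userAgent": "Linux", "httpTimeout": 60},
      {"userAgent": "Linux", "httpTimeout": 90}
    ],
    "logLevel": 2
  }
}
""")

value = materialized["value"]
timeouts = [c["httpTimeout"] for c in value["crawlers"]]
print(timeouts)         # → [30, 60, 90]
print(value["admins"])  # → {'superuser': 'GOD_MODE'}
```

Note that field names are in the Protobuf JSON mapping's camelCase (user_agent becomes userAgent), and fields left at their default value (e.g. followRedirects = false) are omitted.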
Run the Protoconf agent in dev mode:

```shell
protoconf agent protoconf/
```
Prepare your application to work with Protobuf configs coming from Protoconf. Compile your .proto schema; this will generate an object to work with. For Python you can use grpcio-tools, for example:

```shell
pip3 install grpcio-tools
python3 -m grpc_tools.protoc --python_out=. -I../protoconf/src ../protoconf/src/crawler/crawler.proto
```

Other languages can use the protoc binary (https://developers.google.com/protocol-buffers/docs/tutorials).

Install the Protoconf Python library:

```shell
pip3 install -r python/requirements.txt python/
```
See the full example under examples/. The code mainly consists of:

```python
from protoconf import ProtoconfSync
from crawler.crawler_pb2 import CrawlerService

def got_config(new_crawler_service):
    print("got a new config:", new_crawler_service)

protoconf = ProtoconfSync()
crawler_service = protoconf.get_and_subscribe("crawler/text_crawler", CrawlerService, got_config)
print("config:", crawler_service)
```
Commit all changes under protoconf/ (including the .materialized_JSON files).
In production, run the Protoconf agent (protoconf agent) and run protoconf update on changed .materialized_JSON files.

Build from source:

```shell
git clone https://github.com/protoconf/protoconf.git
cd protoconf && go install ./cmd/protoconf
```
Run a Consul agent in dev mode (consul agent -dev), then run the example:

```shell
bazel run protoconf agent
bazel run protoconf compile "$(pwd)/examples/protoconf" crawler/text_crawler.pconf
bazel run protoconf insert "$(pwd)/examples/protoconf" crawler/text_crawler.materialized_JSON
bazel run //examples/grpc_clients/go_client
```

The client will get the config from the agent and listen for changes. The example config lives at examples/protoconf/src/crawler/text_crawler.pconf.