This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments (e.g. Protobuf 2). The following are supported:
- Protobuf 3 & gRPC code generation
- Both binary & JSON serialization is built-in
- Python 3.6+ making use of:
  - Enums
  - Dataclasses
  - `async`/`await`
  - Timezone-aware `datetime` and `timedelta` objects
  - Relative imports
  - Mypy type checking
This project is heavily inspired by, and borrows functionality from:
- https://github.com/protocolbuffers/protobuf/tree/master/python
- https://github.com/eigenein/protobuf/
- https://github.com/vmagamedov/grpclib
This project exists because I am unhappy with the state of the official Google protoc plugin for Python.
- No `async` support (requires additional `grpclib` plugin)
- No typing support or code completion/intelligence (requires additional `mypy` plugin)
- No `__init__.py` module files get generated
- Output is not importable
  - Import paths break in Python 3 unless you mess with `sys.path`
- Bugs when names clash (e.g. `codecs` package)
- Generated code is not idiomatic
  - Completely unreadable runtime code-generation
  - Much code looks like C++ or Java ported 1:1 to Python
  - Capitalized function names like `HasField()` and `SerializeToString()`
  - Uses `SerializeToString()` rather than the built-in `__bytes__()`
  - Special wrapped types don't use Python's `None`
  - Timestamp/duration types don't use Python's built-in `datetime` module
This project is a reimplementation from the ground up focused on idiomatic modern Python to help fix some of the above. While it may not be a 1:1 drop-in replacement due to changed method names and call patterns, the wire format is identical.
First, install the package. Note that the `[compiler]` feature flag tells it to install extra dependencies only needed by the `protoc` plugin:
```sh
# Install both the library and compiler
$ pip install "betterproto[compiler]"

# Install just the library (to use the generated code output)
$ pip install betterproto
```
Now, given you installed the compiler and have a proto file, e.g. `example.proto`:
```proto
syntax = "proto3";

package hello;

// Greeting represents a message you can tell a user.
message Greeting {
    string message = 1;
}
```
You can run the following:
```sh
$ protoc -I . --python_betterproto_out=. example.proto
```
This will generate `hello.py` which looks like:
```python
# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: hello.proto
# plugin: python-betterproto
from dataclasses import dataclass

import betterproto


@dataclass
class Greeting(betterproto.Message):
    """Greeting represents a message you can tell a user."""

    message: str = betterproto.string_field(1)
```
Now you can use it!
```python
>>> from hello import Greeting
>>> test = Greeting()
>>> test
Greeting(message='')

>>> test.message = "Hey!"
>>> test
Greeting(message='Hey!')

>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'

>>> another = Greeting().parse(serialized)
>>> another
Greeting(message='Hey!')

>>> another.to_dict()
{'message': 'Hey!'}
>>> another.to_json(indent=2)
'{\n  "message": "Hey!"\n}'
```
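Since the wire format matches the official implementation, bytes produced by one can be parsed by the other. Below is a minimal sketch of such a round trip; it assumes you also ran the stock code generator (`protoc -I . --python_out=. example.proto`) to produce `example_pb2.py`, which is not part of the steps above:

```python
import example_pb2  # official protoc output (assumed generated as above)

from hello import Greeting  # betterproto output

# Serialize with betterproto...
serialized = bytes(Greeting(message="Hey!"))

# ...and parse with the official generated code.
official = example_pb2.Greeting()
official.ParseFromString(serialized)
assert official.message == "Hey!"
```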
The generated Protobuf `Message` classes are compatible with grpclib, so you are free to use it if you like. That said, this project also includes support for async gRPC stub generation with better static type checking and code completion support. It is enabled by default.
Given an example like:
```proto
syntax = "proto3";

package echo;

message EchoRequest {
    string value = 1;
    // Number of extra times to echo
    uint32 extra_times = 2;
}

message EchoResponse {
    repeated string values = 1;
}

message EchoStreamResponse {
    string value = 1;
}

service Echo {
    rpc Echo(EchoRequest) returns (EchoResponse);
    rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
}
```
You can use it like so (enable async in the interactive shell first):
```python
>>> import echo
>>> from grpclib.client import Channel

>>> channel = Channel(host="127.0.0.1", port=1234)
>>> service = echo.EchoStub(channel)

>>> await service.echo(value="hello", extra_times=1)
EchoResponse(values=['hello', 'hello'])

>>> async for response in service.echo_stream(value="hello", extra_times=1):
...     print(response)
...
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')
```
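Because the messages are grpclib-compatible, you can also hand-write the server side against grpclib's low-level API. The following is a minimal sketch, not generated output: betterproto does not emit server base classes, so the `EchoService` class and its `__mapping__` wiring below are illustrative assumptions built on grpclib's documented `Handler`/`Cardinality` types.

```python
import asyncio

from grpclib.const import Cardinality, Handler
from grpclib.server import Server

from echo import EchoRequest, EchoResponse, EchoStreamResponse


class EchoService:
    """Hand-written handlers for the Echo service defined above."""

    async def Echo(self, stream):
        request = await stream.recv_message()
        # Echo the value back once, plus `extra_times` additional copies.
        values = [request.value] * (request.extra_times + 1)
        await stream.send_message(EchoResponse(values=values))

    async def EchoStream(self, stream):
        request = await stream.recv_message()
        # Stream one response per copy instead of batching them.
        for _ in range(request.extra_times + 1):
            await stream.send_message(EchoStreamResponse(value=request.value))

    def __mapping__(self):
        # grpclib dispatches on the fully-qualified method path.
        return {
            "/echo.Echo/Echo": Handler(
                self.Echo, Cardinality.UNARY_UNARY, EchoRequest, EchoResponse
            ),
            "/echo.Echo/EchoStream": Handler(
                self.EchoStream,
                Cardinality.UNARY_STREAM,
                EchoRequest,
                EchoStreamResponse,
            ),
        }


async def main():
    server = Server([EchoService()])
    await server.start("127.0.0.1", 1234)
    await server.wait_closed()


asyncio.get_event_loop().run_until_complete(main())
```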
Both serializing and parsing are supported to/from JSON and Python dictionaries using the following methods:
- Dicts: `Message().to_dict()`, `Message().from_dict(...)`
- JSON: `Message().to_json()`, `Message().from_json(...)`
For compatibility, the default is to convert field names to `camelCase`. You can control this behavior by passing a casing value, e.g.:

```python
>>> MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
```
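For example, given a hypothetical `MyMessage` whose proto declares `string my_value = 1;` (this message is illustrative, not defined earlier), the two casings differ like so:

```python
>>> msg = MyMessage(my_value="hi")
>>> msg.to_dict()  # default casing: camelCase
{'myValue': 'hi'}
>>> msg.to_dict(casing=betterproto.Casing.SNAKE)  # original snake_case
{'my_value': 'hi'}
```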
Sometimes it is useful to be able to determine whether a message has been sent on the wire. This is how the Google wrapper types work, for example, letting you know whether a value is unset, set to the default (zero) value, or set to something else.
Use `betterproto.serialized_on_wire(message)` to determine if it was sent. This is a little bit different from the official Google generated Python code, and it lives outside the generated `Message` class to prevent name clashes. Note that it only supports Proto 3 and thus can only be used to check if `Message` fields are set. You cannot check if a scalar was sent on the wire.
```python
# Old way (official Google Protobuf package)
>>> mymessage.HasField('myfield')

# New way (this project)
>>> betterproto.serialized_on_wire(mymessage.myfield)
```
Protobuf supports grouping fields in a `oneof` clause. Only one of the fields in the group may be set at a given time. For example, given the proto:
```proto
syntax = "proto3";

message Test {
    oneof foo {
        bool on = 1;
        int32 count = 2;
        string name = 3;
    }
}
```
You can use `betterproto.which_one_of(message, group_name)` to determine which of the fields was set. It returns a tuple of the field name and value, or a blank string and `None` if unset.
```python
>>> test = Test()
>>> betterproto.which_one_of(test, "foo")
('', None)

>>> test.on = True
>>> betterproto.which_one_of(test, "foo")
('on', True)

# Setting one member of the group resets the others.
>>> test.count = 57
>>> betterproto.which_one_of(test, "foo")
('count', 57)
>>> test.on
False

# Default (zero) values also work.
>>> test.name = ""
>>> betterproto.which_one_of(test, "foo")
('name', '')
>>> test.count
0
>>> test.on
False
```
Again, this is a little different from the official Google code generator:
```python
# Old way (official Google protobuf package)
>>> message.WhichOneof("group")
'foo'

# New way (this project)
>>> betterproto.which_one_of(message, "group")
('foo', "foo's value")
```
Google provides several well-known message types like a timestamp, duration, and several wrappers used to provide optional zero value support. Each of these has a special JSON representation and is handled a little differently from normal messages. The Python mapping for these is as follows:
| Google Message              | Python Type                        | Default                |
| --------------------------- | ---------------------------------- | ---------------------- |
| `google.protobuf.duration`  | `datetime.timedelta`               | `0`                    |
| `google.protobuf.timestamp` | Timezone-aware `datetime.datetime` | `1970-01-01T00:00:00Z` |
| `google.protobuf.*Value`    | `Optional[...]`                    | `None`                 |
For the wrapper types, the Python type corresponds to the wrapped type, e.g. `google.protobuf.BoolValue` becomes `Optional[bool]` while `google.protobuf.Int32Value` becomes `Optional[int]`. All of the optional values default to `None`, so don't forget to check for that possible state. Given:
```proto
syntax = "proto3";

import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";

message Test {
    google.protobuf.BoolValue maybe = 1;
    google.protobuf.Timestamp ts = 2;
    google.protobuf.Duration duration = 3;
}
```
You can do stuff like:
```python
>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))

>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)

>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'

>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
```
First, make sure you have Python 3.6+ and `pipenv` installed, along with the official Protobuf compiler (`protoc`) for your platform. Then:
```sh
# Get set up with the virtual env & dependencies
$ pipenv install --dev

# Link the local package
$ pipenv shell
$ pip install -e .
```
There are two types of tests:
- Manually-written tests for some behavior of the library
- Proto files and JSON inputs for automated tests
For #2, you can add a new `*.proto` file into the `betterproto/tests` directory along with a sample `*.json` input, and it will get automatically picked up.
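For instance, a hypothetical `bool.proto` and its matching JSON input might look like this (the file name and message shape here are made up for illustration):

```proto
// betterproto/tests/bool.proto
syntax = "proto3";

message Test {
    bool value = 1;
}
```

```json
{ "value": true }
```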
Here's how to run the tests.
```sh
# Generate assets from sample .proto files
$ pipenv run generate

# Run the tests
$ pipenv run test
```
Roadmap:

- Fixed length fields
  - Packed fixed-length
- Zig-zag signed fields (sint32, sint64)
- Don't encode zero values for nested types
- Enums
- Repeated message fields
- Maps
  - Maps of message fields
- Support passthrough of unknown fields
- Refs to nested types
- Imports in proto files
- Well-known Google types
- OneOf support
  - Basic support on the wire
  - Check which was set from the group
  - Setting one unsets the others
- JSON that isn't completely naive
  - 64-bit ints as strings
  - Maps
  - Lists
  - Bytes as base64
  - Any support
  - Enum strings
  - Well-known types support (timestamp, duration, wrappers)
  - Support different casing (orig vs. camel vs. others?)
- Async service stubs
  - Unary-unary
  - Server streaming response
  - Client streaming request
- Renaming messages and fields to conform to Python name standards
- Renaming clashes with language keywords
- Python package
- Automate running tests
- Cleanup!
Copyright © 2019 Daniel G. Taylor