brocaar/chirpstack-api

Support for Python client.

sophiekovalevsky opened this issue · 17 comments

Is this a bug or a feature request?

Feature Request

What did you expect?

Have Python stubs through which programmers can communicate with the ChirpStack API. I would like to make this contribution, and I am in the middle of generating the stubs. However, I do have some questions regarding how to generate and structure the files.

  • How would you handle as, which is a reserved keyword in Python?
    When I try to import the generated files, I get an invalid syntax error caused by the use of "as", a reserved keyword in Python.

  • What folder tree structure would you expect for the python folder?
    Currently, this is the output that I get. Is this something you would like sorted in a way that fits with the current js and rust folders?
    src
    └── gen
        ├── as_pb
        │   ├── as
        │   │   ├── as_pb2_grpc.py
        │   │   └── as_pb2.py
        │   ├── external
        │   │   └── api
        │   │       └── as
        │   │           └── external
        │   │               └── api
        │   │                   ├── application_pb2_grpc.py
        │   │                   ├── application_pb2.py
        │   │                   ├── device_pb2_grpc.py
        │   │                   ├── device_pb2.py
        │   │                   ├── deviceProfile_pb2_grpc.py
        │   │                   ├── deviceProfile_pb2.py
        │   │                   ├── deviceQueue_pb2_grpc.py
        │   │                   ├── deviceQueue_pb2.py
        │   │                   ├── frameLog_pb2_grpc.py
        │   │                   ├── frameLog_pb2.py
        │   │                   ├── fuotaDeployment_pb2_grpc.py
        │   │                   ├── fuotaDeployment_pb2.py
        │   │                   ├── gateway_pb2_grpc.py
        │   │                   ├── gateway_pb2.py
        │   │                   ├── gatewayProfile_pb2_grpc.py
        │   │                   ├── gatewayProfile_pb2.py
        │   │                   ├── internal_pb2_grpc.py
        │   │                   ├── internal_pb2.py
        │   │                   ├── multicastGroup_pb2_grpc.py
        │   │                   ├── multicastGroup_pb2.py
        │   │                   ├── networkServer_pb2_grpc.py
        │   │                   ├── networkServer_pb2.py
        │   │                   ├── organization_pb2_grpc.py
        │   │                   ├── organization_pb2.py
        │   │                   ├── profiles_pb2_grpc.py
        │   │                   ├── profiles_pb2.py
        │   │                   ├── serviceProfile_pb2_grpc.py
        │   │                   ├── serviceProfile_pb2.py
        │   │                   ├── user_pb2_grpc.py
        │   │                   └── user_pb2.py
        │   └── integration
        │       └── as
        │           └── integration
        │               └── integration_pb2.py
        ├── common
        │   └── common
        │       └── common_pb2.py
        ├── geo
        │   └── geo
        │       ├── geo_pb2_grpc.py
        │       └── geo_pb2.py
        ├── gw
        │   └── gw
        │       └── gw_pb2.py
        ├── nc
        │   └── nc
        │       ├── nc_pb2_grpc.py
        │       └── nc_pb2.py
        ├── ns
        │   └── ns
        │       ├── ns_pb2_grpc.py
        │       ├── ns_pb2.py
        │       └── profiles_pb2.py
        └── protobuf
            └── google
                └── protobuf
                    └── empty_pb2.py

What happened?

Not Required

What version are you using?

Not Required

How can your issue be reproduced?

Not Required

Could you share your log output?

Not Required

That would be great!

Answers to your questions:

With regards to as: I had the same issue for Rust. The Rust Protobuf compiler automatically transformed this into as_pb, and I think that would be the best approach for Python too.

For Go and Rust, the "exported" types follow the Protobuf package. There can be multiple files for one package (as with as/external/api), but I try to avoid ending up with many different sub-packages / modules.

So I guess a tree structure (of the packages / modules) would look like:

as_pb
  integration
  external
    api
common
geo
gw
nc
ns

Then in Python, this would be used like:

from chirpstack_api import ns
from chirpstack_api.as_pb import integration
from chirpstack_api.as_pb.external import api
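
A consumer could then, for example, talk to the external API like this (a hypothetical sketch; the stub and message names are assumed from the current device.proto, and the package layout is the one proposed above):

import grpc
from chirpstack_api.as_pb.external import api

# Hypothetical sketch: DeviceServiceStub / ListDeviceRequest are assumed
# to be re-exported by the api module.
channel = grpc.insecure_channel("localhost:8080")
client = api.DeviceServiceStub(channel)

# The external API expects a JWT passed as gRPC metadata.
auth = [("authorization", "Bearer <api-token>")]

req = api.ListDeviceRequest()
req.application_id = 1
req.limit = 10

resp = client.List(req, metadata=auth)
print(resp.total_count)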

What do you think? Do you think this makes sense / would be the best approach?

After some initial testing, I finally got the API working with Python. However, there are a couple of points to discuss, since additional steps are required to make it work. I am listing them here:

  1. The Protocol Buffers compiler and gRPC client interface that I have used for Python 3 is grpcio-tools. This comes with the protocol buffer compiler and a gRPC plug-in. The protobuf directory structure is as follows:
protobuf
├── as
├── common
├── geo
├── gw
├── nc
└── ns

If I generate the equivalent Python messages and services from e.g. internal.proto, which includes the import definition import "as/external/api/user.proto";:

python -m grpc_tools.protoc -I../protobuf/as/external/api -I=../protobuf -I=/googleapis --python_out=src/gen/as_pb --grpc_python_out=src/gen/as_pb internal.proto 

This will generate the corresponding *_pb2_grpc.py and *_pb2.py files within src/gen/as_pb, with the following lines:

For internal_pb2.py:

import importlib
as_dot_external_dot_api_dot_user__pb2 = importlib.import_module('as.external.api.user_pb2')

For internal_pb2_grpc.py:

import internal_pb2 as internal__pb2

As can be observed, the first issue is that grpc_tools does not replace reserved keywords (as in this case). Then, when importing the module "as.external.api.user_pb2", it will throw a syntax error.
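
To illustrate the keyword problem in isolation (a minimal demonstration, independent of the generated files):

import importlib

# A literal "import as.external.api.user_pb2" does not even parse,
# because "as" is a reserved keyword:
try:
    compile("import as.external.api.user_pb2", "<demo>", "exec")
except SyntaxError as err:
    print(err)  # invalid syntax

# importlib sidesteps the parser, since the module name is just a string;
# this is what the generated importlib.import_module() call relies on:
# user_pb2 = importlib.import_module("as.external.api.user_pb2")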

The second issue is that the generated Python code won't work if the plan is to make the compiled protos part of a Python package. In this case, it's required to use absolute imports (e.g. import chirpstack_api_package.src.gen.as_pb.internal_pb2) or relative imports (e.g. from .as_pb import internal_pb2); I prefer absolute imports in any case.

The generated import "as.external.api.user_pb2" comes from the use of import "as/external/api/user.proto"; within the internal.proto file. To avoid the reserved Python keyword in the generated files, I can suggest:

  • Adding a step to the Makefile that replaces all as imports with as_pb
  • Adding a step that replaces the generated import statements in the compiled Python files

In my opinion, I tend to go with the first option, but I am not sure how this will affect the rest of the stack. I prefer the first option over the second one, since it's better not to touch the generated Python files.

What are your thoughts on the above?

About the second issue: I was able to generate the imports as follows (taking the same internal.proto file example):

python -m grpc_tools.protoc -I../protobuf -I=/googleapis --python_out=src/gen --grpc_python_out=src/gen ../protobuf/as/external/api/internal.proto

This will create the as/external/api folders and the imports:

For internal_pb2.py:

import importlib
as_dot_external_dot_api_dot_user__pb2 = importlib.import_module('as.external.api.user_pb2')

For internal_pb2_grpc.py:

from as.external.api import internal_pb2

In this case, when planning to generate a Python package, this is almost what is desired. I was planning a directory structure like:

chirpstack-api
└── chirpstack_api
    ├── as_pb
    │   ├── external
    │   │   └── api
    │   └── integration
    ├── common
    ├── geo
    ├── gw
    ├── nc
    └── ns

This is quite close to what you recommended. The main addition is that I wrapped it in a way that makes it a Python package. In that case, the import statements in the compiled protos should look like:

 from chirpstack_api.as_pb.external.api import internal_pb2

I found out that, to achieve this, I needed to change the protobuf directory structure to match the Python package structure exactly; this also requires modifying the import statements within the proto files. This is how the folder structure looks:

 protobuf
 └── chirpstack_api
     ├── as_pb
     │   ├── external
     │   │   └── api
     │   │       └── application.proto (import "chirpstack_api/as_pb/external/api/user.proto";)
     │   └── integration
     ├── common
     ├── geo
     ├── gw
     ├── nc
     └── ns

This way, when compiling:

python -m grpc_tools.protoc -I../protobuf -I=/googleapis --python_out=src/gen --grpc_python_out=src/gen ../protobuf/chirpstack_api/as_pb/external/api/internal.proto

It will create the chirpstack_api/as_pb/external/api folder with the Python files using the following import:

from chirpstack_api.as_pb.external.api import internal_pb2

In a nutshell, to generate the compiled Python proto files, the following should be fixed:

  • Somehow replace the as keyword, since grpc_tools won't replace it automatically
  • Somehow achieve the right imports so the files can be hosted within a Python package

Aside from that, once the Python package is generated, I can modify the code within the package so that developers can use all the generated stubs and message classes as follows:

from chirpstack_api import ns
from chirpstack_api.as_pb import integration
from chirpstack_api.as_pb.external import api

I think this approach is okay, so we can stick with it.

Maybe I can do an initial commit to my fork for you to test locally. However, as it will take me some time to polish the code on my side, I would first like to know your thoughts about it.

Best regards,

@brocaar, I created a branch for you to test; please let me know your feedback.

@sophiekovalevsky thanks! I'm planning to do some testing next week and provide you with feedback.

@brocaar that sounds great. We'll stay in touch.

Happy new year @sophiekovalevsky!

As can be observed, the first issue is that grpc_tools does not replace reserved keywords (as in this case). Then, when importing the module "as.external.api.user_pb2", it will throw a syntax error.

The second issue is that the generated Python code won't work if the plan is to make the compiled protos part of a Python package. In this case, it's required to use absolute imports

My Python skills are a bit rusty, but I think this issue can be solved without making changes to the protobuf/ directory. What if we use the following output structure for Python (a subset of the final structure, just to validate that this works):

python/
  src/
    __init__.py
    gen/
      as/
        __init__.py
        as_pb2.py
  • python/src/gen is in this case the directory that is used by the gRPC Protobuf compiler.
  • Then for each directory, we add an __init__.py file to make the modules in that folder accessible as one. See the example content below.
  • In src/__init__.py the as module is re-exported as as_pb.

src/gen/as/__init__.py

from .as_pb2 import *

src/__init__.py

import os
import sys
from pathlib import Path

# Make the generated modules importable by their absolute names.
sys.path.append(os.path.join(str(Path(__file__).parent), 'gen'))

import importlib

# 'as' cannot be imported directly (reserved keyword), so load it by name
# and re-export it as as_pb.
as_pb = importlib.import_module('as')

(I believe this also fixes the absolute imports that are generated by the gRPC Protobuf compiler as the ./gen folder is added to the Python path).


Looking forward to your thoughts!

Actually, the above probably also works with:

python/
  src/
    __init__.py
    as/
      __init__.py
      as_pb2.py

In which case src/__init__.py would contain:

import sys
from pathlib import Path
sys.path.append(str(Path(__file__).parent))

import importlib

as_pb = importlib.import_module('as')

That way there is no need to re-export everything, only the as module :)
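
To illustrate the consumer side (a hypothetical sketch; the package / distribution name here is a placeholder):

# Importing the package runs src/__init__.py, which re-exports the
# generated 'as' module under a legal name.
import src

print(src.as_pb)  # the generated 'as' package, reachable as an attribute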

@brocaar, Happy new year to you as well. I hope you had a great time with your loved ones.

Thank you for your feedback.

If I understood your thoughts well, I tried to replicate your ideas and this is what I got:

When I use python/src/gen as the directory used by the gRPC Protobuf compiler to generate the as *_pb2_grpc.py and *_pb2.py files:

python -m grpc_tools.protoc -I../protobuf --python_out=src --grpc_python_out=src ../protobuf/as/as.proto

The imports will be:

For as_pb2.py:

from gw import gw_pb2 as gw_dot_gw__pb2

which I think is okay.

However, as regards as_pb2_grpc.py, it will contain:

from as import as_pb2 as as_dot_as__pb2

Then, when imported from src/as/__init__.py as:

from .as_pb2 import *

It will throw a syntax error, because of the from as statement in as_pb2_grpc.py.

Is that what you meant? If so, the issue with as as a reserved keyword still persists.

Tomorrow morning I will try another workaround, following the ideas that you have laid out, and let you know if it works. One approach I can think of is to use the following:

python -m grpc_tools.protoc -I../protobuf -I../protobuf/as --python_out=src/as --grpc_python_out=src/as as.proto

And generate the output folder manually through the Makefile, then add an __init__.py file to each folder that we want to be a package, and import modules relatively from there.
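
Something along these lines (a rough sketch; the output path is illustrative):

# After code generation, drop an __init__.py into every generated folder
# so that it becomes a regular Python package.
from pathlib import Path

for folder in Path("src").rglob("*"):
    if folder.is_dir():
        (folder / "__init__.py").touch()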

I will let you know soon.

Ah, you are right! Interestingly, for some imports the generator is using importlib, e.g. for the API definitions, device_pb2.py contains the following (that was the file I was testing with; I assumed it did so because of the as. import prefix, and I was hoping it would do so for all the imports):

as_dot_external_dot_api_dot_frameLog__pb2 = importlib.import_module('as.external.api.frameLog_pb2')

Let me look into the impact of changing the protobuf/as folder into protobuf/as_pb tomorrow. I don't think there is another possibility, except maybe reporting this as a grpc_tools bug or feature request. For Go it is possible to define the import path in the .proto file; potentially this could work for Python too, e.g. option python_module = "chirpstack_api.as_pb". However, I don't expect that this will be implemented soon (I already found some discussions on the grpc repo about this).

Yeah, I agree with you that it would be a great feature if such an option worked the same way it does in Go.

I have inspected the official gRPC documentation for working with Python, and as far as I know there is no equivalent option.

This morning, I tried a couple of combinations to obtain the desired import structure. I discovered that if you separate the proto and gRPC outputs, you can get the desired proto imports within the gRPC output with:

python -m grpc_tools.protoc -I../protobuf -I=$(GOOGLEAPIS_PATH) --grpc_python_out=src ../protobuf/as_pb/external/api/deviceProfile.proto

It will generate deviceProfile_pb2_grpc.py, which will import the proto definition as:

from as_pb.external.api import deviceProfile_pb2 as as__pb_dot_external_dot_api_dot_deviceProfile__pb2

In this regard, I created a symbolic link protobuf/as_pb pointing to protobuf/as, and it worked. However, when I tried to generate the proto objects, I didn't get any great results. What I found is that, if there is a dependency on another proto file, as is the case with deviceProfile which relies on import "as/external/api/profiles.proto";, then no matter how I set the output, it will generate the following:

import importlib
as_dot_external_dot_api_dot_profiles__pb2 = importlib.import_module('as.external.api.profiles_pb2')

Here as.external.api.profiles_pb2 follows the import "as/external/api/profiles.proto"; statement.

So, unless the import structure changes in each proto file, we will still have the same result.

In addition, when I tried to import the modules in the main __init__.py as:

common = importlib.import_module("common")
gw = importlib.import_module("gw")    

I got the following error:
TypeError: Couldn't build proto file into descriptor pool!

The way I temporarily managed to fix this was by using the pure Python implementation:

pip uninstall protobuf
pip install --no-binary=protobuf protobuf
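
If I am not mistaken, the same can also be forced at runtime through protobuf's implementation switch, without reinstalling (worth double-checking):

import os

# Must be set before any generated *_pb2 module is imported; it selects
# the pure-Python protobuf implementation instead of the C++ extension.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"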

Given the aforementioned experiences, I tend to think that we have few options left to grab onto.

I know that the ChirpStack ecosystem is already too mature to start making these kinds of changes, since they would affect the whole structure. So I would completely understand your point of view in sticking with the current one.

Let me know your thoughts; I am looking forward to reading them.

Best regards,

@sophiekovalevsky I have been trying to make some changes to the Protobuf structure, but this has the side-effect that the Go code structure becomes incorrect. This can probably be fixed with some renames / moving of files, which is doable but not ideal.

I have made this Protobuf proposal: protocolbuffers/protobuf#7061.

Do you mind if we hold off on making changes to the Protobuf structure until we get some feedback on that issue? Feel free to also share your thoughts there.

@brocaar, sure I can wait.

I will watch the issue that you have opened and share my thoughts once we get some response back.

Thanks for all the effort on this,

Unfortunately I have not seen any response on my issue for supporting a python_package option. Would it be an idea to move forward as follows:

I'm a bit hesitant to change protobuf/as to protobuf/as_pb, as this issue is (currently) specific to Python. It also means that for every language which does support "as", we would need to "undo" this. What if the Makefile for Python were to copy the protobuf files to a temporary folder (e.g. python/protobuf), rename protobuf/as to protobuf/as_pb, and use sed to fix the imports that reference the as folder so they point to as_pb?

That way we can move forward with Python without affecting the support for other languages. Hopefully this copying of the protobuf files, renaming, and updating of the imports can be made obsolete in the future, when a python_package-like option becomes available.
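
Roughly something like this (a sketch of that pre-processing step, written in Python instead of sed; paths and patterns are illustrative):

# Copy the protobuf sources, rename the "as" folder to "as_pb", and
# rewrite the .proto imports / package statements that reference it.
import re
import shutil
from pathlib import Path

SRC = Path("protobuf")          # original protobuf sources
DST = Path("python/protobuf")   # temporary copy, used only for Python codegen

if DST.exists():
    shutil.rmtree(DST)
shutil.copytree(SRC, DST)

# Rename the folder whose name collides with the Python keyword.
(DST / "as").rename(DST / "as_pb")

# Fix every .proto import (and package statement) referencing the old name.
for proto in DST.rglob("*.proto"):
    text = proto.read_text()
    text = text.replace('import "as/', 'import "as_pb/')
    text = re.sub(r'^package as\b', 'package as_pb', text, flags=re.M)
    proto.write_text(text)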

I think this approach will temporarily fix our issue while we wait for python_package support.
I am not really sure I will be able to make these changes today, so I would prefer to do this when I come back from travelling, which will be around two weeks from now.
I am thrilled to see this implementation being merged into the API. Thanks @brocaar for the effort; we are almost at the finish line on this.

@brocaar, I just came back from holidays.

I've pushed a branch with all the modifications discussed. Could you please take a look?

Summing up, this feature introduces:

  • Documentation on how to use the Python library

  • Replacing the as folder with as_pb using sed

  • Replacing the internal packages (common, gw, ns) to match the absolute-import Python package structure

  • A setup file to make the package available in the official Python repositories (possibly PyPI?)

Let me know your feedback. Before opening a pull request, I would rather wait for your comments to see if there is something that I missed, so that I can deliver cleaner commit(s).

Thanks in advance,

Thanks again @sophiekovalevsky for all the work on this! I'll review it in the next few days. Please feel free to already create a pull request. Usually I squash-merge, so having multiple commits is not an issue and will not pollute the git history :-)

@brocaar alright then :D