TypeError: the JSON object must be str, bytes or bytearray, not SchemaRegistryClient in consumer_ccsr.py
sejongk opened this issue · 0 comments
sejongk commented
Description
I followed the instructions for 'Avro And Confluent Cloud Schema Registry' in the Confluent docs (Python: Code Example for Apache Kafka) and hit a TypeError when running examples/clients/cloud/python/consumer_ccsr.py:
$ python consumer_ccsr.py -f /home/.confluent/librdkafka.config -t test2
Traceback (most recent call last):
File "consumer_ccsr.py", line 48, in
name_avro_deserializer = AvroDeserializer(ccloud_lib.name_schema, schema_registry_client, ccloud_lib.Name.dict_to_name)
File "/usr/local/lib/python3.7/site-packages/confluent_kafka/schema_registry/avro.py", line 278, in __init__
self._reader_schema = parse_schema(loads(schema_str)) if schema_str else None
File "/usr/local/lib/python3.7/json/__init__.py", line 341, in loads
raise TypeError(f'the JSON object must be str, bytes or bytearray, '
TypeError: the JSON object must be str, bytes or bytearray, not SchemaRegistryClient
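The error makes sense once you see what AvroDeserializer does with its second positional argument: whatever arrives as schema_str is passed straight to json.loads, so putting the SchemaRegistryClient in that slot triggers exactly this TypeError. A minimal sketch of the failure mode (SchemaRegistryClientStub is a hypothetical stand-in, not the real client):

from json import loads

class SchemaRegistryClientStub:
    # Hypothetical stand-in for confluent_kafka's SchemaRegistryClient
    pass

try:
    # With the old argument order, the client object lands in schema_str and
    # is handed straight to json.loads inside AvroDeserializer.__init__.
    loads(SchemaRegistryClientStub())
except TypeError as err:
    print(err)  # the JSON object must be str, bytes or bytearray, not SchemaRegistryClientStub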
Troubleshooting
I changed lines 48–50 from
name_avro_deserializer = AvroDeserializer(ccloud_lib.name_schema, schema_registry_client, ccloud_lib.Name.dict_to_name)
to
name_avro_deserializer = AvroDeserializer(schema_registry_client, ccloud_lib.name_schema, ccloud_lib.Name.dict_to_name)
and it works.
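For anyone else hitting this: in recent confluent-kafka releases the AvroDeserializer constructor takes the SchemaRegistryClient first and the schema string second, which is why the 6.2.0-post example breaks against confluent-kafka 1.7.0. Below is a self-contained sketch of the corrected construction, assuming that signature; the Schema Registry URL, credentials, schema, and dict_to_name helper are illustrative placeholders, not the actual values from ccloud_lib:

from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

# Illustrative Avro schema standing in for ccloud_lib.name_schema
name_schema = """
{
    "name": "Name",
    "type": "record",
    "fields": [{"name": "name", "type": "string"}]
}
"""

def dict_to_name(obj, ctx):
    # Stand-in for ccloud_lib.Name.dict_to_name; returns the decoded dict as-is
    return obj

schema_registry_client = SchemaRegistryClient({
    "url": "https://<schema-registry-endpoint>",             # placeholder
    "basic.auth.user.info": "<sr-api-key>:<sr-api-secret>",  # placeholder
})

# Client first, then the schema string, then the from_dict callable
name_avro_deserializer = AvroDeserializer(schema_registry_client,
                                          name_schema,
                                          dict_to_name)

consumer_ccsr.py then passes this deserializer to its consumer as the value deserializer.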
Environment
- GitHub branch: 6.2.0-post
- confluent-kafka: 1.7.0