When creating a load job programmatically, load_job.schema has to be a list of bigquery.SchemaField objects
rconnol opened this issue · 2 comments
rconnol commented
I wrote a recursive function to walk a flattened schema_map and convert everything to bigquery.SchemaField objects. This could probably be done higher up in the pipeline instead of as a post-generation step, which would be more performant, but it works well for me and could be helpful for others:
```python
def walk_schema(s):
    result = []
    for field in s:
        # Recurse into nested RECORD fields first.
        if field.get('fields', None):
            field['fields'] = walk_schema(field['fields'])
        # SchemaField's constructor expects 'field_type', not 'type'.
        if field.get('type', None):
            field['field_type'] = field.pop('type')
        field = bigquery.SchemaField(**field)
        result.append(field)
    return result
```
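For illustration, here is how the function above behaves on a small nested schema map. A lightweight stand-in class is used in place of bigquery.SchemaField so the sketch runs without the google-cloud-bigquery package installed; it keeps only the constructor arguments the function relies on (name, field_type, mode, fields), and the sample schema_map is hypothetical:

```python
from dataclasses import dataclass
from typing import Tuple

# Stand-in for bigquery.SchemaField (assumption: the real class takes
# name, field_type, mode, and fields as constructor arguments).
@dataclass(frozen=True)
class SchemaField:
    name: str
    field_type: str
    mode: str = "NULLABLE"
    fields: Tuple["SchemaField", ...] = ()

def walk_schema(s):
    """Recursively convert a list of schema dicts into SchemaField objects."""
    result = []
    for field in s:
        if field.get('fields', None):
            field['fields'] = tuple(walk_schema(field['fields']))
        if field.get('type', None):
            field['field_type'] = field.pop('type')
        result.append(SchemaField(**field))
    return result

# Hypothetical flattened schema map with one nested RECORD field.
schema_map = [
    {"name": "id", "type": "INTEGER", "mode": "REQUIRED"},
    {"name": "address", "type": "RECORD", "mode": "NULLABLE",
     "fields": [{"name": "city", "type": "STRING", "mode": "NULLABLE"}]},
]

converted = walk_schema(schema_map)
print(converted[0].field_type)      # INTEGER
print(converted[1].fields[0].name)  # city
```

With the real library, dropping the stand-in class and constructing bigquery.SchemaField instead yields a list suitable for assigning to a load job's schema.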
bxparks commented
Thanks for the note, but what's the action to take for this? I don't like leaving issues open.
- You can add the code into generate_schema.py as a helper function (with unit tests and documentation).
- Maybe we should create a wiki page.
- I suppose we could add it to the README.md, but that clutters up the README.md.
I don't use the Python client, and I don't do much with BigQuery these days, so I'm going to depend on the requester to push this forward.
bxparks commented
Closing this since it's not clear what action should be performed here. This issue is still searchable.