GoogleCloudDataproc/spark-bigquery-connector

`bigquery.tables.create` permission required when appending data to a BigQuery table

I am trying to restrict a service account's permissions so that it can only execute DML statements (INSERT, UPDATE, and DELETE) against a BigQuery table.

I created a custom IAM role derived from the predefined BigQuery Data Editor role and removed the permissions I considered unnecessary, including `bigquery.tables.create`.

I assigned this custom role to the service account, but when the job runs it fails with the error: "Permission bigquery.tables.create denied on dataset..."
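For anyone debugging something similar: a quick way to sanity-check which permissions the service account actually holds on the table is the IAM test call in the google-cloud-bigquery client library. This is a minimal sketch; the project, dataset, and table names are placeholders, and note that `bigquery.tables.create` itself is granted at the dataset or project level, so a table-level check only covers the table-scoped permissions.

from google.cloud import bigquery

# Hypothetical identifiers; substitute your own project.dataset.table.
table_ref = "my-project.my_dataset.my_table"

client = bigquery.Client()

# Ask BigQuery which of these table-level permissions the caller holds;
# anything missing from the response was not granted by the custom role.
needed = [
    "bigquery.tables.get",
    "bigquery.tables.getData",
    "bigquery.tables.updateData",
]
granted = client.test_iam_permissions(table_ref, needed)
print(granted.get("permissions", []))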

Here is the code snippet showing how I append data to the table:

save_df_stream = (
    df_stream.writeStream
    .outputMode("append")
    .format("bigquery")
    .options(**options_config)
    .trigger(availableNow=True)
    .start()
)
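(Aside: the connector also exposes a direct write path over the BigQuery Storage Write API via the `writeMethod` option, which skips the GCS staging step entirely. Whether it avoids the `bigquery.tables.create` requirement depends on the connector version and on the target table already existing, so the following is only a sketch to experiment with, not a confirmed fix; it assumes the same `options_config` minus `temporaryGcsBucket`, which the direct path does not use.)

save_df_stream = (
    df_stream.writeStream
    .outputMode("append")
    .format("bigquery")
    .options(**options_config)
    .option("writeMethod", "direct")  # Storage Write API instead of GCS staging + load job
    .trigger(availableNow=True)
    .start()
)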

Hi @smic-datalabs-jdcastro ,

Can you please share the options_config that you are using?

Hi @isha97,

Just a bunch of custom fields:

{
  "partitionType": ...,
  "partitionField": ...,
  "temporaryGcsBucket": ...,
  "project": ...,
  "dataset": ...,
  "table": ...,
  "checkpointLocation": ...,
  "allowFieldAddition": True
}
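For comparison, here is a hedged sketch of what a fully populated `options_config` might look like; every value below is a hypothetical placeholder, not taken from this issue. Note that setting `temporaryGcsBucket` selects the connector's indirect write path, where each micro-batch is staged as files in GCS and then loaded into the target table with a BigQuery load job.

# All values are hypothetical placeholders.
options_config = {
    "partitionType": "DAY",
    "partitionField": "event_date",
    "temporaryGcsBucket": "my-staging-bucket",  # indirect write: GCS staging + load job
    "project": "my-project",
    "dataset": "my_dataset",
    "table": "my_table",
    "checkpointLocation": "gs://my-staging-bucket/checkpoints/my_table",
    "allowFieldAddition": True,  # allow new fields to be added to the table schema
}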