GoogleCloudDataproc/spark-bigquery-connector

Permission bigquery.tables.create required on already materialised view

Closed this issue · 1 comment

Hi,

When trying to load data from an existing materialised view, the connector still seems to require the bigquery.tables.create permission.

Option 1:

spark.read.format("bigquery").option("table", <project.dataset.materiazed view name>).load()

Option 2:

spark.read.format("bigquery").option("dataset", <dataset>).load(<materiazed view name>)

Error:

{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "Access Denied: Dataset xxx:xxx: Permission bigquery.tables.create denied on dataset xxx:xxx (or it may not exist).",
      "reason": "accessDenied"
    }
  ]
}

Does it require a different configuration/option setup or is this a limitation of the connector?

Thanks!

@k-charalampous This is a limitation of the Read API, which can't read directly from the materialized view. The connector has to materialize the view's contents into a temporary table before reading, so it needs the create permission on the dataset.
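
If granting bigquery.tables.create on the view's own dataset isn't an option, one possible workaround (a sketch only, not something confirmed by the maintainers in this thread) is to point the connector's view materialization at a separate dataset where the service account does have create permission, using the connector's documented viewsEnabled and materializationDataset options. The dataset name scratch_dataset and the view name below are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-mv-read").getOrCreate()

df = (
    spark.read.format("bigquery")
    # tell the connector the target is a view, so it materializes it first
    .option("viewsEnabled", "true")
    # temporary tables are created here instead of the view's dataset;
    # the service account needs bigquery.tables.create on this dataset only
    .option("materializationDataset", "scratch_dataset")
    .option("table", "project.dataset.materialized_view_name")
    .load()
)

df.show()

Whether this applies to materialized views in your connector version is an assumption on my part; check the connector README for the exact option names supported by the release you are running.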