GoogleCloudDataproc/spark-bigquery-connector
BigQuery data source for Apache Spark: Read data from BigQuery into DataFrames, write DataFrames into BigQuery tables.
Java · Apache-2.0
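For context on what the connector does, here is a minimal PySpark sketch of the read and write paths described above. It is illustrative only: the dataset/table names are placeholders, and running it requires a Spark runtime with the spark-bigquery-connector jar on the classpath (e.g. via `--packages`) plus Google Cloud credentials.

```python
# Sketch of reading from and writing to BigQuery via the connector's
# Spark data source. Assumes the connector jar is available, e.g.
#   spark-submit --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:<version> ...
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-example").getOrCreate()

# Read a BigQuery table into a DataFrame (a public sample table, for illustration).
df = (
    spark.read.format("bigquery")
    .option("table", "bigquery-public-data.samples.shakespeare")
    .load()
)

# Write a DataFrame back to BigQuery. "direct" uses the BigQuery Storage Write
# API; the default indirect mode instead stages data in GCS and requires
# setting temporaryGcsBucket. "my_dataset" is a placeholder.
(
    df.write.format("bigquery")
    .option("writeMethod", "direct")
    .mode("append")
    .save("my_dataset.shakespeare_copy")
)
```

The `direct` write method avoids a staging bucket, which is why several of the issues below (e.g. those mentioning "direct" writes) are specific to that path.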
Issues
Flaky behavior when writing to BigQuery
#1170 opened · 4 comments
Bug: Enabling predicate pushdown fails
#1146 opened · 1 comment
Next release?
#1145 opened · 2 comments
Support Spark 3.5
#1142 opened · 1 comment
Does Spark read from BigQuery multiple times when joining?
#1137 opened · 2 comments
When writing to a BQ table with integer-range partitioning, it fails with a complaint about time partitioning
#1135 opened · 2 comments
Flaky behavior when writing to BigQuery
#1131 opened · 2 comments
Best practice to deal with query parameters?
#1128 opened · 6 comments
BQ labels are not inserted into the information schema
#1123 opened · 4 comments
Add read support for Linked datasets
#1120 opened · 3 comments
When writing to a BQ table with integer-range partitioning, it fails with a complaint about time partitioning
#1113 opened · 1 comment
Bad string encoding while reading from BQ
#1100 opened · 1 comment
Spark Write BIGNUMERIC issue
#1072 opened · 3 comments
HTTP proxy: support NTCredentials
#1071 opened · 2 comments
Add integer-based partitioning support
#1065 opened · 1 comment
Not able to partition based on a string column
#1064 opened · 0 comments
Getting "wrong column count" error when writing to an existing BigQuery table using `direct` write method
#1056 opened · 2 comments
Writing an empty DataFrame in direct mode produces an extra temp table with a random-number suffix
#1051 opened · 1 comment
BigQuery direct write error with "Could not convert Spark schema to protobuf descriptor"
#1050 opened · 0 comments
DATETIME parsing in PySpark
#1047 opened · 1 comment
Add a way to disable map type support.
#1046 opened · 2 comments
Clarify the pushAllFilters setting
#1045 opened · 1 comment
Indirect write drops policy tags
#1043 opened · 1 comment
Writer retry policy missing correct parameters
#1034 opened