ch10 - Spark SQL: INSERT INTO partitioned table fails with "Cannot safely cast 'count': string to bigint"
shanmugavel04 opened this issue · 0 comments
shanmugavel04 commented
The query in the book:

```sql
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME="UNITED STATES")
SELECT count, ORIGIN_COUNTRY_NAME FROM flights
WHERE DEST_COUNTRY_NAME='UNITED STATES'
LIMIT 12
```
In Spark 3.0, the above query fails with the following error:

```
AnalysisException: Cannot write incompatible data to table 'default.partitioned_flights':
- Cannot safely cast 'count': string to bigint
```
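For context, Spark 3.0 changed the default value of `spark.sql.storeAssignmentPolicy` from `LEGACY` to `ANSI`, and under the `ANSI` policy Spark rejects implicit casts on insert that may lose data (such as string to bigint). As a workaround (not a fix, since it silences the safety check rather than correcting the query), the Spark 2.x behavior can be restored for the session:

```sql
-- Session-scoped workaround: revert to Spark 2.x implicit-cast
-- behavior for INSERT statements (ANSI is the safer default)
SET spark.sql.storeAssignmentPolicy=LEGACY;
```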
So I modified the query as below, casting the `count` column to an integer and listing `ORIGIN_COUNTRY_NAME` first so that the selected columns line up positionally with the table's schema:

```sql
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME = "UNITED STATES")
SELECT ORIGIN_COUNTRY_NAME, cast(count AS int) count1 FROM flights
WHERE DEST_COUNTRY_NAME = "UNITED STATES"
LIMIT 12
```
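A minor variant of the same fix, assuming the target column is `BIGINT` as the error message indicates: casting directly to `BIGINT` matches the table's column type exactly, so Spark does not need a second implicit widening from `int` to `bigint` on write.

```sql
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME = "UNITED STATES")
-- Cast straight to the target column type reported in the error
SELECT ORIGIN_COUNTRY_NAME, CAST(count AS BIGINT) AS count FROM flights
WHERE DEST_COUNTRY_NAME = "UNITED STATES"
LIMIT 12
```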
Could you please check on this?