microsoft/sql-spark-connector

Unable to delete rows

Opened this issue · 1 comment

Hi team, could you please elaborate on whether it is possible to delete rows using the sql-spark-connector?

I tried to run a DELETE query but got the following error:
com.microsoft.sqlserver.jdbc.SQLServerException: A nested INSERT, UPDATE, DELETE, or MERGE statement must have an OUTPUT clause.

Then I added the OUTPUT clause, but got a different error:
com.microsoft.sqlserver.jdbc.SQLServerException: A nested INSERT, UPDATE, DELETE, or MERGE statement is not allowed in a SELECT statement that is not the immediate source of rows for an INSERT statement.

It seems there is a limitation when users try to delete rows. The errors are likely because the Spark JDBC read path wraps the supplied query in a subquery (roughly `SELECT * FROM (<query>) tbl`), so any DML statement nested inside that SELECT is rejected by SQL Server.

Overwrite mode seems to be a workaround, but rewriting the whole table just to remove rows is inefficient, especially for large tables.
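A more common workaround is to bypass the connector's read path entirely and execute the DELETE over a plain driver connection, where it runs as a top-level statement instead of being nested inside a SELECT. The sketch below illustrates the pattern using Python's built-in sqlite3 as a stand-in for a real SQL Server connection; the table name and predicate are hypothetical.

```python
# Sketch of the direct-connection workaround: run the DELETE against the
# database itself rather than through the Spark/connector read API, so it
# is never wrapped in a SELECT subquery. sqlite3 stands in for a JDBC or
# pyodbc connection to SQL Server here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "stale"), (2, "fresh"), (3, "stale")],
)

# Issued directly, DELETE is a top-level statement and succeeds;
# rowcount reports how many rows were removed.
deleted = conn.execute("DELETE FROM events WHERE status = 'stale'").rowcount
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(deleted, remaining)  # → 2 1
```

In an actual Spark job the same pattern would use, for example, a `java.sql.DriverManager` connection from Scala or a pyodbc connection from PySpark, opened with the same credentials the connector already uses; only writes (append/overwrite) go through the connector itself.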