microsoft/mssql-jdbc

Driver throws ArrayIndexOutOfBoundsException from internal code

qingwei91 opened this issue · 10 comments

Driver version

  • 12.4.1-jre11
  • 12.2.0-jre11
  • 11.2.1-jre11

SQL Server version

Microsoft SQL Server 2019 (RTM-CU16) (KB5011644) - 15.0.4223.1 (X64)

Client Operating System

Linux (Red Hat)

JAVA/JVM version

JDK 11

Table schema

Irrelevant

Problem description

When calling a stored procedure that uses Table-Valued Parameters, the driver throws an exception with the stack trace below.

The same code works when I write fewer rows, but I can't pinpoint whether the problem is due to data size; it happens when I pass 100k+ rows via table-valued parameters. A rough sketch of the call pattern is below.

If there's something wrong with the input data, I expect the driver to report something more actionable.
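
For reference, this is roughly how the call is made; the connection string, stored procedure, table type, and column names are placeholders, not the real ones:

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;

import com.microsoft.sqlserver.jdbc.SQLServerDataTable;
import com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement;

public class TvpCall {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=TestDb;encrypt=false", "sa", "<password>")) {

            // Build the table-valued parameter in memory.
            SQLServerDataTable rows = new SQLServerDataTable();
            rows.addColumnMetadata("id", java.sql.Types.INTEGER);
            rows.addColumnMetadata("amount", java.sql.Types.DECIMAL);

            for (int i = 0; i < 100_000; i++) {
                rows.addRow(i, new BigDecimal("123.45"));
            }

            // Pass the data table as a TVP to the stored procedure.
            try (SQLServerPreparedStatement ps = (SQLServerPreparedStatement) conn
                    .prepareStatement("EXEC dbo.usp_InsertRows ?")) {
                ps.setStructured(1, "dbo.RowTableType", rows);
                ps.execute();
            }
        }
    }
}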

Expected behavior

It should either not throw, or provide a clearer error message if there is indeed a user error.

Actual behavior

I get an error from System.arraycopy, raised deep inside the driver.

Error message/stack trace

Caused by: java.lang.ArrayIndexOutOfBoundsException: arraycopy: last destination index 26 out of bounds for byte[17]
	at java.base/java.lang.System.arraycopy(Native Method)
	at com.microsoft.sqlserver.jdbc.TDSWriter.writeInternalTVPRowValues(IOBuffer.java:5118)
	at com.microsoft.sqlserver.jdbc.TDSWriter.writeTVPRows(IOBuffer.java:4995)
	at com.microsoft.sqlserver.jdbc.TDSWriter.writeTVP(IOBuffer.java:4908)
	at com.microsoft.sqlserver.jdbc.DTV$SendByRPCOp.execute(dtv.java:386)
	at com.microsoft.sqlserver.jdbc.DTV.executeOp(dtv.java:1657)
	at com.microsoft.sqlserver.jdbc.DTV.sendByRPC(dtv.java:1902)
	at com.microsoft.sqlserver.jdbc.Parameter.sendByRPC(Parameter.java:1189)
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.sendParamsByRPC(SQLServerPreparedStatement.java:757)
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doPrepExec(SQLServerPreparedStatement.java:1158)
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:643)
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:567)
	at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7675)
	at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:4137)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:272)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:246)
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:544)


Hi @qingwei91,

We'll look into this. While we're doing so, do you have code we can use to reproduce the issue?

Hi @Jeffery-Wasty, I will try to make one.

From what I observe, it happens when we have a BigDecimal in a Table-Valued Parameter and its scale or precision is too large; I haven't found the exact threshold, though.

Regarding BigDecimal, we've done work around this recently and are interested in whether a previous version of the driver, 12.2, works for you. That version changed how precision and scale are computed from BigDecimal values, but the change was reverted due to performance issues we believe it may have caused. If you are able to test this, I would be interested in seeing the results.

Hi @Jeffery-Wasty, thanks for the tip; however, it does not fix my issue.

My colleague says he has seen this before and managed to fix it by forcing the scale to a lower number. I will try that out and see if I can find a way to trigger it.

Hi, I managed to reproduce this by setting the scale of the BigDecimal to 40, so it looks like there's some assumption about how large the number can be. If that's indeed by design, it would be nicer to throw a friendlier error if possible.

I will try to create a minimal repro, but it might take a while.
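
In the meantime, here is a rough sketch of the pattern that triggers it for me; the column setup is an assumption on my side, and the setScale(38) line is the workaround my colleague mentioned:

import java.math.BigDecimal;
import java.math.RoundingMode;

import com.microsoft.sqlserver.jdbc.SQLServerDataTable;

public class ScaleRepro {
    public static void main(String[] args) throws Exception {
        SQLServerDataTable rows = new SQLServerDataTable();
        rows.addColumnMetadata("amount", java.sql.Types.DECIMAL);

        // Scale 40 would need a SQL DECIMAL precision of at least 40, beyond SQL Server's max of 38.
        BigDecimal tooWide = new BigDecimal("0.1").setScale(40, RoundingMode.UNNECESSARY);
        rows.addRow(tooWide);
        // Adding the row succeeds; per the stack trace above, the failure happens later,
        // when the TVP is serialized during setStructured(...) + execute().

        // Workaround: cap the scale before adding the row so it stays within SQL Server's limit.
        BigDecimal capped = tooWide.setScale(38, RoundingMode.HALF_UP);
        rows.addRow(capped);
    }
}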

The maximum precision for a BigDecimal in SQL Server is 38. Setting a scale of 40 means the precision would need to be at least 40, which would cause an issue. If that is indeed the source of the issue (can you try with scale 38, i.e. BigDecimal(38,38)?), then the error produced should reflect that, and that's something we'll look into updating.

Hi @Jeffery-Wasty, so a precision of 38 is indeed the maximum the problematic code can handle:

https://github.com/microsoft/mssql-jdbc/blob/main/src/main/java/com/microsoft/sqlserver/jdbc/IOBuffer.java#L5115

DDC.convertBigDecimalToBytes produces a byte array whose length can exceed 19, which then causes the code to fail.

I believe SQL Server can't handle precision larger than 38; I guess that's where the hardcoded 17 bytes come from. The library should probably do some checking at a higher level. I am using SQLServerDataTable, so this region might be relevant: https://github.com/microsoft/mssql-jdbc/blob/main/src/main/java/com/microsoft/sqlserver/jdbc/SQLServerDataTable.java#L232-L236
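
For illustration, a guard along these lines at a higher level would turn the arraycopy failure into an actionable error. This is only my sketch of the idea, not the driver's actual code:

import java.math.BigDecimal;
import java.sql.SQLException;

final class DecimalGuard {
    private static final int MAX_SQL_DECIMAL_PRECISION = 38;

    // SQL Server DECIMAL supports at most precision 38, and scale can never exceed precision.
    static void validate(BigDecimal value) throws SQLException {
        // A value with scale s needs a SQL precision of at least s, so take the larger of the two.
        int required = Math.max(value.precision(), value.scale());
        if (required > MAX_SQL_DECIMAL_PRECISION) {
            throw new SQLException("DECIMAL value requires precision " + required
                    + ", which exceeds SQL Server's maximum of " + MAX_SQL_DECIMAL_PRECISION);
        }
    }
}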

Our thoughts as well; this should have been caught earlier to enforce the 38 limit. We'll make those changes when we can and will leave this issue open until we do.

Hi @qingwei91,

I'm still interested in whether you're able to provide a repro for this.

When I try to create a table with a BigDecimal column with precision/scale larger than 40, the driver successfully prevents this and errors out. When I try this in SSMS, I'm prohibited from creating columns with precision or scale larger than 38. I'm not clear on the process you are using to utilize BigDecimal objects with precision/scale greater than 38. A repro would help in this regard.
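
For reference, a sketch of the kind of rejection I mean, assuming a plain CREATE TABLE through the driver (connection string, table, and column names are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class CreateTableCheck {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=TestDb;encrypt=false", "sa", "<password>");
             Statement stmt = conn.createStatement()) {
            // SQL Server rejects a column definition whose precision exceeds 38.
            stmt.executeUpdate("CREATE TABLE dbo.TooWide (amount decimal(40, 40))");
        } catch (SQLException e) {
            // The server reports that the size given to 'decimal' exceeds the maximum allowed.
            System.out.println("Rejected as expected: " + e.getMessage());
        }
    }
}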

Hi @Jeffery-Wasty, can you try this?

https://github.com/qingwei91/mssql-debug

The README contains the DB setup (with Docker).

The main code reproduces the problem. It is a Scala project built with sbt; you can run it with sbt run.