FirebirdSQL/firebird-odbc-driver

EXECUTE STATEMENT changed for the worse in ODBC 3.0

Closed this issue · 22 comments

In version 2.0 I used a SQL "PREPARE" to prepare a statement for a SQL "EXECUTE USING DESCRIPTOR" executed in a loop, and it worked fine. But in version 3.0, with no code change, only the first row is successfully written and the subsequent executions generate different errors.

To avoid the problem I put the SQL "PREPARE" immediately before the SQL "EXECUTE USING DESCRIPTOR", and it worked. But the processing time became almost three times greater, so I went back to version 2.0.

My question is: is there some way to achieve the 2.0 performance?

Hi @Fabio-LV!

Well, I've done a small investigation and written a simple ODBC C test program (see the attachment). I cannot reproduce the bug you mentioned...
odbc_prepare_test.zip
This code does just the same things you described above:

  1. Case 1: Binds the params & prepares the statement only once and then runs an insert loop
  2. Case 2: Performs bind&prepare every time in a loop, before each insert

What I see is that both cases work OK without any errors. And, obviously, Case 2 is much slower than Case 1:

./a.out
=== Connected OK
=== Setting AUTOCOMMIT OFF
=== (Re)create test table...
(1) insert 100000 recs -> 1925274 us, ave = 51940.6588 rec/s
(2) insert 100000 recs -> 11253640 us, ave = 8886.0138 rec/s
=== Disconnected
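Schematically, the two cases differ only in where the prepare sits relative to the loop. A plain-C sketch with stub functions (invented for illustration; they only count calls, standing in for SQLPrepare/SQLExecute):

```c
#include <assert.h>

static int prepares;  /* number of statement compilations   */
static int executes;  /* number of parameterized executions */

/* Stand-ins for SQLPrepare / SQLExecute: they only count calls. */
static void stub_prepare(void) { prepares++; }
static void stub_execute(void) { executes++; }

/* Case 1: bind & prepare once, then run the insert loop. */
static void case1(int n) {
    stub_prepare();
    for (int i = 0; i < n; i++)
        stub_execute();
}

/* Case 2: bind & prepare again before every insert. */
static void case2(int n) {
    for (int i = 0; i < n; i++) {
        stub_prepare();
        stub_execute();
    }
}
```

With n = 100000, Case 1 compiles the statement once while Case 2 compiles it 100000 times, which is what the timing gap above reflects.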

Can you please give me more details? Is it program code (what language?..) or some tool/utility? What "different errors" are you getting?

Regards
Yuri

Hi, Yuri

Well, at last some time to be in the lab.
Thank you a lot for your sample - its simplicity lit up the room. The original tables involved in that peculiar process are a bit more complex, and we have done our tests using them.
The scenario is this: a group of 8 tables in a database goes through periodic examination, and those elected for processing must have some rows transferred to a mirrored table in another database. The problem is that the selected tables are known only at runtime, forcing us to process them dynamically.
We executed your code, and to our great surprise it worked fine! The general feeling was, "the more I pray, the more ghosts appear!" So we began to investigate the columns, paying attention to the various error messages that used to pop up, for the driver doesn't return one specific error, but different ones!
In our tests we added a column "CREATION_DATE DATE NOT NULL" after your "FVALUE", and voilà! the ghost showed up. The error message is:

"(SQLCODE: -625) [ODBC Firebird Driver] validation error for column "FORPAG_NEW"."CREATION_DATE", value "*** NULL ***".

We tested the same code under ODBC driver 2.0.5 and it worked fine, as always.

Environment:

OS: Windows 11, 64-bit
Firebird server: v5.0, 64-bit
Application/client: 32-bit
Firebird ODBC Driver: v2.0.5 & v3.0 installed

Hi @Fabio-LV

In our tests we added a column "CREATION_DATE DATE NOT NULL" after your "FVALUE," and voilà! the ghost showed up.
The error message is:
"(SQLCODE: -625) [ODBC Firebird Driver] validation error for column "FORPAG_NEW"."CREATION_DATE", value "*** NULL ***".

Well... If you just add a NOT NULL column to the table and do not bind it and fill its value, you are bound to get an error like this, because the server cannot insert a NULL into a NOT NULL constrained column. BTW it's not an ODBC driver error but an error forwarded from the Firebird server.

odbc_prepare_test2.zip

I wrote another test bundle of 5 simple tests:))) (with an additional CREATION_DATE DATE column):

  1. CREATION_DATE allows NULL, populated with some value, and every 10th value is NULL
  2. CREATION_DATE NOT NULL and is populated every time
  3. CREATION_DATE NOT NULL, not bound, not populated
  4. CREATION_DATE NOT NULL, populated with some value, and every 10th value is NULL
  5. CREATION_DATE allows NULL, not bound, not populated

I ran it with the ODBC 3.0 and 2.0.5 drivers and, as expected, test cases 1, 2 and 5 passed, while 3 and 4 failed with [ODBC Firebird Driver][Firebird]validation error for column "TEST_TABLE"."CREATION_DATE", value "* null *". I cannot see any difference between 3.0 and 2.0.5...

Maybe you can provide me with your table metadata & SQL query? Or maybe even the source code that does the inserts. I suspect there are some tricky conditions here that I cannot catch yet.

Regards

Hi Yuri

As we proceeded to test the column types we use in our databases, a pattern began to emerge:

  1. Column types CHAR, VARCHAR, SMALLINT, INTEGER, BIGINT, FLOAT and DOUBLE PRECISION don't present any problem: no errors, and the data is correctly written to the output table no matter how many times the "EXECUTE USING" is executed in the loop after a "PREPARE".

  2. For column types DATE, TIME, TIMESTAMP, NUMERIC and DECIMAL (except in the case where NULL is deliberately put into the host variable bound to a "NOT NULL" column), the first row is correctly recorded: no error, data precisely written. But the next time the command is executed, although the host variable remains intact, NULL inexplicably replaces its original value on the way to the DBMS, and the error 'validation error for column ... value "*** NULL ***"' is raised.

  3. This erroneous behavior occurs only in an "EXECUTE USING" that is processed more than once under the same prepared statement: nowhere else does this "phenomenon" happen, in any SQL input/output operation, be it static or dynamic - at least so far. We have tested our systems under ODBC driver 3.0 exhaustively.
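In ODBC terms, what item 2 describes would break the deferred-binding contract: binding stores pointers to the application's data buffer and to its length/indicator, and the driver must dereference both on every execution rather than snapshot them after the first. A plain-C mock of that contract (no real ODBC calls; the names below are invented for illustration):

```c
#include <assert.h>

#define MOCK_NULL_DATA (-1L)   /* stands in for ODBC's SQL_NULL_DATA */

/* A bound parameter holds POINTERS into the application's memory,
 * not copies of the values. */
typedef struct {
    const long *value;      /* deferred data buffer             */
    const long *indicator;  /* deferred length/indicator buffer */
} MockParam;

/* Correct behaviour: both pointers are re-read on every execution,
 * so whatever the host variable holds at execute time is what is
 * sent to the server. */
static long mock_execute(const MockParam *p) {
    if (*p->indicator == MOCK_NULL_DATA)
        return MOCK_NULL_DATA;   /* send NULL */
    return *p->value;            /* send the current buffer contents */
}
```

Under this contract, a host variable that stays intact must arrive unchanged on every iteration; a driver that sends NULL on the second execution while the application buffer is untouched has lost track of these pointers somewhere.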

We develop our systems with embedded SQL and we have never faced anything like this before. We are not database experts, but the problem looks like a driver failure; otherwise the first row could not have been written successfully, as described in item 2 above. When the column is nullable no error pops up, but the column value is written as NULL! We've checked the outputs.

We present a detailed account of the results we got from our tests:

-> Columns of type "SMALLINT, CHAR, VARCHAR": OK, as seen in the previous test

-> Column "CREATION_TIMESTAMP" (NOT NULL) = "2024-11-15 21:40:11"

First row OK, data correctly recorded; error on the 2nd row:
"(SQLCODE: -625) [ODBC Firebird Driver] validation error for column "FORPAG_NEW"."CREATION_TIMESTAMP", value "*** NULL ***".

-> Column "CREATION_TIMESTAMP" (Nullable) = "2024-11-15 21:40:11"

First row OK; the following 9 rows gave no error, but "CREATION_TIMESTAMP" = NULL was recorded

-> Column "CREATION_TIMESTAMP" (NOT NULL) = NULL

Error on the 1st row:
"(SQLCODE: -303) [ODBC Firebird Driver] Dynamic SQL error - SQL error code = -303 - conversion error from string ""

Note: the same happens with types "DATE" and "TIME".


-> Column "VALUE_DECIMAL" (NOT NULL) = 205.38

First row OK, data correctly recorded; error on the 2nd row:
"(SQLCODE: -625) [ODBC Firebird Driver] validation error for column "FORPAG_NEW"."VALUE_DECIMAL", value "*** NULL ***".

-> Column "VALUE_DECIMAL" (Nullable) = 205.38

First row OK; the following 9 rows gave no error, but "VALUE_DECIMAL" = NULL was recorded


-> Column "VALUE_NUMERIC" (NOT NULL) = 321.17

First row OK, data correctly recorded; error on the 2nd row:
"(SQLCODE: -625) [ODBC Firebird Driver] validation error for column "FORPAG_NEW"."VALUE_NUMERIC", value "*** NULL ***".

-> Column "VALUE_NUMERIC" (Nullable) = 321.17

First row OK; the following 9 rows gave no error, but "VALUE_NUMERIC" = NULL was recorded


-> Column "VALUE_INTEGER" (NOT NULL) = 222333444

10 rows OK, no error, data correctly recorded

-> Column "VALUE_INTEGER" (Nullable) = 222333444

10 rows OK, no error, data correctly recorded


-> Column "VALUE_BIGINT" (NOT NULL) = 555666777

10 rows OK, no error, data correctly recorded

-> Column "VALUE_BIGINT" (Nullable) = 555666777

10 rows OK, no error, data correctly recorded


-> Column "VALUE_FLOAT" (NOT NULL) = 999888777

10 rows OK, no error, data correctly recorded

-> Column "VALUE_FLOAT" (Nullable) = 999888777

10 rows OK, no error, data correctly recorded


-> Column "VALUE_DOUBLE_PRECISION" (NOT NULL) = 666555444

10 rows OK, no error, data correctly recorded

-> Column "VALUE_DOUBLE_PRECISION" (Nullable) = 666555444

10 rows OK, no error, data correctly recorded

Best regards

Hi Yuri

We've tested with this simple table:

ID SMALLINT NOT NULL
TEST_CODE SMALLINT NOT NULL
TEST_VARCHAR VARCHAR(20) NOT NULL
TEST_CHAR CHAR(1)

The first four columns never changed; a fifth column, the one under test, we systematically changed for each run.

We see that this is a driver problem because:

  1. Without a Firebird ODBC driver, be it 2.0 or 3.0 or whatever, we cannot connect to the database;
  2. Everything runs fine under ODBC driver v.2.0: no errors, data perfectly recorded;
  3. The preprocessor doesn't translate the "EXECUTE USING" to FBClient one way via driver v.2.0 and another way via driver v.3.0: it cannot be "obvious" with one version and "non-obvious" with the other. We're using v.3.0 to run all our applications except the one that uses "EXECUTE USING" in a loop under the same "PREPARE";
  4. We've tested our system against PostgreSQL and MySQL DBMSs via ODBC connections and this problem doesn't occur with their respective ODBC drivers;
  5. We're not database specialists nor ODBC experts, but we can see the obvious.

Having laid out the whole case: if it is outside your province, we would be very grateful if you could point us to somewhere we can get help with this problem, which occurs only under Firebird ODBC driver v.3.0.

We thank you a lot for your attention.

[And the second point - I understand you're using embedded SQL and some preprocessor like gpre to compile your pseudocode with all those EXECUTE USING etc. I'm just wondering how this relates to the ODBC driver. AFAIK the gpre preprocessor translates directly to FBClient API isc calls, not to ODBC. It seems you are using some non-obvious (to me) solution to access your database:))]

Hi Yuri

Aiming to simplify the tests, we've abandoned the use of a "descriptor" (an SQLDA structure specially designed for dynamic SQL operations) and used the simplest form:
"EXECUTE USING (list of host variables)"

[https://docwiki.embarcadero.com/InterBase/2020/en/EXECUTE]

The steps are as follows:

  1. Define the host variables with their appropriate values:
    (hostVar1, hostVar2, hostVar3, hostVar4, hostVar5, hostVar6);
  2. Create the string (InsertString) to be prepared:
    "INSERT INTO FORPAG_NEW VALUES (?, ?, ?, ?, ?, ?)"
  3. Prepare the statement using that string:
    "PREPARE InsertList FROM :InsertString"
  4. Process in a loop of ten iterations:
    increment hostVar1 (the primary key)
    "EXECUTE InsertList USING :hostVar1, :hostVar2, :hostVar3, :hostVar4, :hostVar5, :hostVar6"

Note: the binding of columns and host variables is done by the preprocessor before dispatching the "EXECUTE USING".
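The steps above can be condensed into a small plain-C mock (names like count_markers and run_insert_loop are invented for illustration; the real work is done by OpenESQL and the driver):

```c
#include <assert.h>

/* Count the '?' parameter markers in the statement text, as a
 * driver's SQL parser would (step 2). */
static int count_markers(const char *sql) {
    int n = 0;
    for (; *sql; sql++)
        if (*sql == '?')
            n++;
    return n;
}

/* Steps 3-4: "prepare" once, then execute ten times, advancing the
 * primary-key host variable before each execution. */
static int run_insert_loop(int *host_var1, int iterations) {
    const char *insert_string =
        "INSERT INTO FORPAG_NEW VALUES (?, ?, ?, ?, ?, ?)";
    assert(count_markers(insert_string) == 6); /* one marker per host variable */
    int rows = 0;
    for (int i = 0; i < iterations; i++) {
        (*host_var1)++;   /* step 4: advance the primary key */
        rows++;           /* stand-in for EXECUTE InsertList USING ... */
    }
    return rows;
}
```

The reported failure mode would correspond to the first pass through this loop inserting correctly and every later pass sending NULL for some parameters despite the host variables being unchanged.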

On the first iteration of the loop everything always goes fine: no error, and the data is correctly written. It can be seen that the binding is active and functional because, when the column under test is nullable, the operation completes with no error, but the column is recorded with the value NULL, even though the value in the host variable remains unchanged.

The platform we use for system development is NetExpress COBOL, and we know almost nothing about its preprocessor (OpenESQL), which translates the SQL commands into the appropriate calls to access a particular database.

We wonder how you could debug the work of the OpenESQL preprocessor on your workbench, and this is the reason we've tried to be so detailed and specific in our reports.

Best regards,
Fabio

Hi Yuri

Unfortunately, regarding OpenESQL - its workings or results - we cannot give you anything more than the source code before preprocessing, which we provided in our last comment. The platform does all the "low level" work.

But we will be at your disposal for anything within our reach.

Regards

Hi again, Yuri

I think I have good news, at least apparently.

After I received your request for the "source code after preprocessor", I searched my platform to see if I could get something like that. I found a compiler directive that prints a trace of the preprocessor's work during the execution of a program.

So I created 2 trace logs:

  1. File "SQLTrace_DATE_NOTNULL.LOG", covering the processing of a test table "FORPAG_NEW" with a column "CREATION_DATE NOT NULL", which writes the 1st row successfully and fails on the 2nd row;

  2. File "SQLTrace_DATE_Nullable.LOG", covering the processing of the same table with this column nullable: it writes the 1st row correctly, but on the next 4 rows, even though no error pops up, the value recorded in the column is, surprisingly, NULL.

I inserted comments marking the steps as they were passed during the debugging process. If the files are of some use to you, I'll be happy.

Regards

SQLTrace_DATE_NOTNULL.LOG
SQLTrace_DATE_Nullable.LOG

(Or maybe it's quite sufficient to look into the source code with native
ODBC calls after preprocessor. BTW if you can give me that source code
(after preprocessor) it will help.)

Well... If you just add a NOT NULL column to the table and do not bind it and fill its value, you are bound to get an error like this, because the server cannot insert a NULL into a NOT NULL constrained column. BTW it's not an ODBC driver error but an error forwarded from the Firebird server.

Maybe I missed the exact problem, but as a driver you need to take care to mark parameters as nullable on your side: the server will describe NOT NULL columns as not nullable, but that NULL state could be resolved by a trigger. As such, for flexibility, a driver should always change the description of parameters to be nullable.
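That policy - describing every parameter as nullable regardless of the column constraint, because a trigger may supply the value - amounts to ignoring the server-reported nullability. A plain-C mock of the idea (not the actual driver code; the names are invented for illustration):

```c
#include <assert.h>
#include <stdbool.h>

/* What the server reports about the column behind a parameter marker. */
typedef struct {
    bool column_not_null;   /* NOT NULL constraint present */
} ColumnDescribe;

/* What the driver should report for the parameter itself: always
 * nullable, because even for a NOT NULL column a BEFORE INSERT
 * trigger may supply the value, and the integrity check only runs
 * server-side. */
static bool param_nullable(const ColumnDescribe *col) {
    (void)col;   /* column nullability deliberately ignored */
    return true;
}
```

The design choice is to defer the NOT NULL check to the server, where trigger effects are visible, instead of rejecting NULL parameters client-side.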

Hi @Fabio-LV

The ODBC driver does not prohibit filling a NOT NULL column with NULL. All integrity checks are performed on the server side.
If you have a trigger that fills a NOT NULL column, that trigger runs in the same transaction as the INSERT statement. There will be no problem if you insert a NULL into a NOT NULL column, because the trigger will overwrite the NULL value with some other value, so at the commit stage the integrity check will pass.

For example, if I have a CREATION_DATE DATE NOT NULL in my table and an active trigger like:

CREATE OR ALTER TRIGGER TEST_TABLE_BI FOR TEST_TABLE
ACTIVE BEFORE INSERT POSITION 0
as
begin
  if (new.creation_date is NULL) then
  begin
    new.creation_date = 'now';
  end
end

then I can freely insert NULLs into TEST_TABLE.CREATION_DATE (or even not bind this column at all), and it should work OK.

Please try the build 3.0.0.14 from https://github.com/FirebirdSQL/firebird-odbc-driver/wiki

What about this?)

Hi Yuri

Congratulations! The result is 80% satisfactory. Now I'm able to complete a considerable part of the job that transfers rows from some tables to others. Before your fix I could only process the first row of the first table successfully.

But some errors keep showing up, and as they are few, at least so far (I haven't been able to run exhaustive tests yet), I submit a particular (and weird) one for your investigation right now.

In one table there is a row, the 14th, that inexplicably fails. I checked the data in the columns and it is correct, but the "EXECUTE USING" always fails. So I built a test where I create host variables corresponding to the table's columns, containing the same values, and process them in a loop of ten iterations. As in v.3.0.0.12, the first time the row is written it passes OK and the data is correctly recorded; but on the second iteration the execution raises the same error as reported before (validation error for column ... value "*** NULL ***"), and the column type is a simple SMALLINT.

The test created a SQL trace log file (SQLTraceTabCONTRATO_DOC_NEW_NotNull) showing the preprocessor steps, which I annotated for each phase of the process.
SQLTraceTabCONTRATO_DOC_NEW_NotNull.log

In advance I want to say that I've begun to receive a sporadic fatal message while debugging two other tables: "attempt to access item beyond bounds of memory" on execution of the "EXECUTE USING" command. It didn't happen before. The same test passes fine in v.2.0.5.

By the way: we don't use triggers of any kind - our inserts get the primary key value by advancing the sequence beforehand.

I think I'll be able to return to intensive tests soon.

Regards

Oops! Sorry, I forgot to include the values of the host variables in the log file for you to check. Now it is complete.

SQLTraceTabCONTRATO_DOC_NEW_NotNull.log

Hi @Fabio-LV

I'm sorry the last fix didn't solve the problem as I expected.

The memory corruption errors look scary and really are the worst thing for a C/C++ program)) But the good news is that all the previous misbehaviour was related to the memory issues, and all that "NOT NULL" stuff is just camouflage for a more serious error. Now the masks are off)))

Well, I definitely have some inaccurate code somewhere inside the driver. I will check it ASAP and get back to you in a few days.

Regards.

Hi @Fabio-LV

Can I please ask you to test the 3.0.0.15 build? I hope it will work)
from https://github.com/FirebirdSQL/firebird-odbc-driver/wiki

Sorry for the time it took.

Regards

Hi Yuri

Well, the job is done - I set my seal on it. I've done all possible tests involving a "PREPARE" with a subsequent loop of "EXECUTE USING", with and without "SQLDA" data descriptors and using all kinds of column types. No SQL errors were returned and no "out of memory" problems occurred.

Version 3.0.0.15 is OK. Congratulations!

Fabio

Hi @Fabio-LV !

It's really great to hear it)
However, if you encounter any issues, feel free to call me.

Regards
Yuri