averemee-si/oracdc

Some column values are getting encrypted

ankitsingh95 opened this issue · 6 comments

Hi Team,
I am facing an issue using oracdc: some column values arrive encrypted in the Kafka topic, but in the database they are plain text or numbers.

{"op":"c","ts_ms":1614254318558,"before":{"ORA_ROW_ID":"AAASIbAAFAAAAf0AAC"},"after":{"ORA_ROW_ID":"AAASIbAAFAAAAf0AAC","ID1":"ew==","NAME":"amit12"},"source":{"instance_number":1,"version":"19.0.0.0.0","instance_name":"ndli23","host_name":"BUILD-LMSPERF-APP-U2-63-92","dbid":683527118,"database_name":"NDLI23","platform_name":"Linux x86 64-bit","query":"insert into \"UNKNOWN\".\"OBJ# 74267\"(\"COL 1\",\"COL 2\") values ('c20218','616d69743132')","pdb_name":null,"owner":"BLANK_SETUP_CONFIG","table":"HELLO12","scn":5701118,"ts_ms":1614091240000}}

I have a table named HELLO12 with two columns: ID and NAME.

Data in the table:
123 amit12

In the topic:

"ID1":"ew==","NAME":"amit12"

Note: I have not enabled any encryption options.

Hello,

Could you please provide the output of the command

desc HELLO12;

Thanks,
Aleksei

desc BLANK_SETUP_CONFIG.HELLO12;
 Name            Null?    Type
 --------------- -------- ------------------
 ID                       NUMBER(38)
 NAME                     VARCHAR2(29)

Regards,
Ankit

Hi Ankit,

Thanks!

NUMBER(38) is too big even for the Java Long data type, so it is mapped to org.apache.kafka.connect.data.Decimal, which stores data in the Kafka topic as byte[]. In the JSON output those bytes appear base64-encoded, which is why the value looks "encrypted".
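For example, the "ew==" in the message above is just base64 over the Decimal's unscaled big-endian two's-complement bytes. A minimal sketch of decoding it back, assuming scale 0 (which NUMBER(38) columns have) and the standard JSON converter encoding:

```python
import base64

def decode_connect_decimal(b64: str) -> int:
    """Decode a scale-0 Kafka Connect Decimal from its base64 JSON form."""
    raw = base64.b64decode(b64)
    # The bytes are the big-endian two's-complement unscaled value
    # of the underlying Java BigDecimal.
    return int.from_bytes(raw, byteorder="big", signed=True)

print(decode_connect_decimal("ew=="))  # 123 - the original ID1 value
```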

Hope this answers your question

Regards,
Aleksei

Hi Aleksei,

I am facing this issue on multiple tables.

{
  "op": "c",
  "ts_ms": 1614264234618,
  "before": {
    "ID": "TEtD"
  },
  "after": {
    "ID": "TEtD",
    "CREATED_BY": "System",
    "CREATED_DATE": "AXdmmwOG",
    "ENTITY_STATE": "ACTIVE",
    "MODIFIED_BY": null,
    "MODIFIED_DATE": null,
    "UPDATE_COUNTER": null,
    "AUTH_CODE": "SERVICE_ACCOUNT_ACCESS"
  },
  "source": {
    "instance_number": 1,
    "version": "19.0.0.0.0",
    "instance_name": "ndli23",
    "host_name": "BUILD-LMSPERF-APP-U2-63-92",
    "dbid": 683527118,
    "database_name": "NDLI23",
    "platform_name": "Linux x86 64-bit",
    "query": "BLANK_SETUP_CONFIG.AUTHORITY",
    "pdb_name": null,
    "owner": "BLANK_SETUP_CONFIG",
    "table": "AUTHORITY",
    "scn": 0,
    "ts_ms": 1614264234618
  }
}

SQL> desc BLANK_SETUP_CONFIG.AUTHORITY;
 Name            Null?    Type
 --------------- -------- ------------------
 ID              NOT NULL NUMBER(19)
 CREATED_BY               VARCHAR2(255 CHAR)
 CREATED_DATE             NUMBER(19)
 ENTITY_STATE             VARCHAR2(255 CHAR)
 MODIFIED_BY              VARCHAR2(255 CHAR)
 MODIFIED_DATE            NUMBER(19)
 UPDATE_COUNTER           NUMBER(19)
 AUTH_CODE                VARCHAR2(255 CHAR)


ID      CREATED_BY CREATED_DATE  ENTITY_STATE MODIFIED_BY MODIFIED_DATE UPDATE_COUNTER AUTH_CODE
5000000 System     1612334170687 ACTIVE                                                ADMIN_ACCESS
5000001 System     1612334170986 ACTIVE                                                CONFIGURATION_ACCESS
5000002 System     1612334171000 ACTIVE                                                READ_ONLY_ACCESS
5000003 System     1612334171014 ACTIVE                                                SERVICE_ACCOUNT_ACCESS

Hi Ankit,

This is expected behaviour. The README contains the following data type mapping:

|Oracle RDBMS Type|JSON Type|Comment                                                       |
|:----------------|:--------|:-------------------------------------------------------------|
|NUMBER           |int8     |NUMBER(1,0) & NUMBER(2,0)                                     |
|NUMBER           |int16    |NUMBER(3,0) & NUMBER(4,0)                                     |
|NUMBER           |int32    |NUMBER(5,0) & NUMBER(6,0) & NUMBER(7,0) & NUMBER(8,0)         |
|NUMBER           |int64    |Other integers between 1,000,000,000 and 1,000,000,000,000,000,000|
|NUMBER           |float64  |Oracle NUMBER without specified SCALE and PRECISION           |
|NUMBER           |bytes    |org.apache.kafka.connect.data.Decimal - all other numerics    |

NUMBER(19) values can exceed 1,000,000,000,000,000,000, which is why these columns are serialized as org.apache.kafka.connect.data.Decimal. org.apache.kafka.connect.data.Decimal is a standard Kafka Connect data type and is supported by most Kafka Connect sink connectors.
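The byte[] values in the message above decode back to the numbers shown in your query results. A quick check, assuming the standard JSON converter's base64 encoding of the Decimal's unscaled big-endian bytes and scale 0:

```python
import base64

# Java's Long.MAX_VALUE is 2**63 - 1, but NUMBER(19) can hold up to
# 10**19 - 1, so the largest NUMBER(19) values do not fit in an int64.
assert 10**19 - 1 > 2**63 - 1

def decode_connect_decimal(b64: str) -> int:
    """Decode a scale-0 Kafka Connect Decimal from its base64 JSON form."""
    raw = base64.b64decode(b64)
    return int.from_bytes(raw, byteorder="big", signed=True)

print(decode_connect_decimal("TEtD"))      # 5000003 - ID of the SERVICE_ACCOUNT_ACCESS row
print(decode_connect_decimal("AXdmmwOG"))  # 1612334171014 - its CREATED_DATE
```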

In addition, could you please update this issue with the SCHEMA part of the Kafka message.

Regards,
Aleksei

Closed due to no response within two weeks