I can't catch the error when inserting data with limited RAM
AlekseyVerba opened this issue · 2 comments
Hello,
I limited my RAM to 40 MB.
I'm trying to insert stream data into my database, but I get the error 'Code: 241. DB::Exception: Memory limit (for query) exceeded: would use 41.64 MiB (attempt to allocate chunk of 4246320 bytes), maximum: 38.52 MiB.: (avg_value_size_hint = 34, avg_chars_size = 31.2, limit = 8192): (while reading column updated_date): (while reading from part /var/lib/clickhouse/store/c56/c567b5c9-f33a-46a8-80db-e28f28147b16/e21fe341116be3b31d9c043b1ad5926f_1_1_0/ in table 7edf24d4_8685_4935_8e2d_1ce4f58efe7e.test (c567b5c9-f33a-46a8-80db-e28f28147b16) located on disk default of type local, from mark 0 with max_rows_to_read = 8192): While executing MergeTreeThread. (MEMORY_LIMIT_EXCEEDED) (version 23.8.2.7 (official build))'
After that I get the error 'SyntaxError: Unexpected token C in JSON at position 0' from this code:
```ts
let sourceStream: internal.Readable;
try {
  // Read the source data as a stream.
  sourceStream = await sourceInstance.query(
    query,
    SourceQueryResultType.Stream,
  );
} catch (err) {
  throw err;
}

try {
  // Pipe the source stream into ClickHouse.
  await this.chc.insert({
    table: `"${this.tableName}"`,
    values: sourceStream,
    // database: this.dbName,
    clickhouse_settings: {
      date_time_input_format: 'best_effort',
    },
  });
} catch (err) {
  // This block is never reached when the stream fails mid-insert.
  console.log('GET ERROR');
  console.log(err);
  throw err;
}
```
I can't catch the error when inserting data into ClickHouse.
Node.js - 18.17.1
ClickHouse - 23.7.4.5
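For context, here is a minimal sketch of one defensive way such failures could be surfaced, assuming the insert client is `@clickhouse/client` and that an `'error'` event on the source stream may not reject the insert promise on its own (which is what the behaviour above suggests). The table name, the format, and the in-memory example stream are placeholders, not the original application code:

```ts
import { createClient } from '@clickhouse/client';
import { Readable } from 'stream';

type Client = ReturnType<typeof createClient>;

async function insertStream(client: Client, sourceStream: Readable): Promise<void> {
  // Turn an 'error' event on the source stream into a rejected promise so it
  // can be awaited together with the insert itself. `once` means the listener
  // fires at most one time.
  const streamFailed = new Promise<never>((_, reject) => {
    sourceStream.once('error', reject);
  });

  const insert = client.insert({
    table: 'test', // hypothetical table name
    values: sourceStream,
    format: 'JSONEachRow',
    clickhouse_settings: {
      date_time_input_format: 'best_effort',
    },
  });

  try {
    // Whichever fails first (the source stream or the insert request) rejects here.
    await Promise.race([insert, streamFailed]);
  } catch (err) {
    console.log('GET ERROR');
    console.log(err);
    throw err;
  }
}

// Example usage with an in-memory object-mode stream standing in for the real source.
async function main() {
  const client = createClient(); // defaults to http://localhost:8123
  const rows = Readable.from([{ id: 1, updated_date: '2023-09-01 00:00:00' }]);
  await insertStream(client, rows);
  await client.close();
}

main().catch((err) => console.error(err));
```

Racing the insert promise against a promise that rejects on the stream's `'error'` event means the catch block runs regardless of which side fails first.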
Can you please provide a minimal runnable reproduction for this case?
This includes the table definition, a sample dataset, a full code snippet (where is sourceInstance coming from?), and the names/values of any modified server settings.
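For reference, a reproduction of roughly this shape seems to be what is being asked for. The following is only a hypothetical skeleton (the table name, columns, row count, and the max_memory_usage value are invented for illustration, and it limits memory on the insert rather than on a source query), not a confirmed reproduction of the error above:

```ts
import { createClient } from '@clickhouse/client';
import { Readable } from 'stream';

async function main() {
  const client = createClient(); // defaults to http://localhost:8123

  // Table definition (hypothetical name and columns).
  await client.command({
    query: `
      CREATE TABLE IF NOT EXISTS repro_test
      (
        id           UInt32,
        updated_date DateTime
      )
      ENGINE = MergeTree
      ORDER BY id
    `,
  });

  // Sample dataset: a generator wrapped in an object-mode Readable stream.
  function* rows() {
    for (let i = 0; i < 1_000_000; i++) {
      yield { id: i, updated_date: '2023-09-01 00:00:00' };
    }
  }

  // Streaming insert with a deliberately low per-query memory limit (~40 MB)
  // in an attempt to trigger MEMORY_LIMIT_EXCEEDED.
  await client.insert({
    table: 'repro_test',
    values: Readable.from(rows()),
    format: 'JSONEachRow',
    clickhouse_settings: {
      date_time_input_format: 'best_effort',
      max_memory_usage: '40000000',
    },
  });

  await client.close();
}

main().catch((err) => {
  console.log('GET ERROR');
  console.log(err);
  process.exit(1);
});
```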
I'm sorry, the error was in my application.