dyedgreen/deno-sqlite

Add serialization / deserialization

Closed this issue · 3 comments

For some use cases, I/O performance problems could be alleviated if one were able to deserialize a database from a Uint8Array buffer using the sqlite3_deserialize function, work with the database in memory, and then serialize it back using the sqlite3_serialize function. For example, bun:sqlite does this and it is very useful. Would it be possible to add this capability to deno-sqlite?

Hm, this is definitely possible in principle, but I’m unsure what the API for this should be?

Maybe something like:

const read = new DB(someUint8Array);

const myNewData = read.serialize();

(You’d also lose some guarantees around reads / writes that are usually very nice to have; really, you should probably never need to do this, and if you need that kind of performance you might be better off with something like Rust. It would also increase the size of the WASM binary slightly.)

The API you describe is more or less how bun:sqlite does it.
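For reference (from memory, so the exact signatures may differ slightly), the bun:sqlite version of this pattern looks roughly like:

```ts
import { Database } from "bun:sqlite";

// Load a database from a byte buffer (e.g. read from disk or the network).
const db = Database.deserialize(someUint8Array);

// ... work with the in-memory database ...

// Dump the current contents back into a buffer that can be persisted.
const bytes = db.serialize();
```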

I think const read = new DB(someUint8Array); would be nice to have for completeness, but not essential. If it were not there, one could work around it by writing someUint8Array to a temporary file on disk and then opening that file as a database.
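A rough sketch of that workaround, assuming the current deno-sqlite DB constructor and Deno's temp-file APIs (note the database then lives in a file rather than purely in memory):

```ts
import { DB } from "https://deno.land/x/sqlite/mod.ts";

// Write the serialized database to a temporary file and open it from there.
const path = await Deno.makeTempFile({ suffix: ".db" });
await Deno.writeFile(path, someUint8Array);

const db = new DB(path);
// ... query as usual ...
db.close();
```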

On the other hand, const myNewData = read.serialize(); would be an amazing feature to have for apps that use SQLite more as an internal file format than as a transactional relational database. It would allow the quick reads/writes/updates of an in-memory database, with the added benefit of being able to persist the data to the filesystem when needed.

Take my use case, for example: I use JS/TS generators to model discrete event simulations (they allow me to write the actors of the simulation in a very easy way). I like Deno because it lets me run several instances of the simulations in parallel. I was thinking of using deno-sqlite to store structured log information and statistics that I would later chart/analyze with another tool. Writing every single log individually to disk would be slow. Writing them to an in-memory SQLite database that I could later dump to disk would solve that problem.

EDIT: I just realized that the lack of read.serialize() for the use case above can be worked around with an SQL statement: VACUUM INTO <filename>.
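With deno-sqlite, that workaround would look roughly like this (the table layout and filename are just examples for my logging use case):

```ts
import { DB } from "https://deno.land/x/sqlite/mod.ts";

// Keep everything in memory while the simulation runs.
const db = new DB(); // no path -> in-memory database
db.query("CREATE TABLE logs (time REAL, actor TEXT, message TEXT)");
db.query("INSERT INTO logs VALUES (?, ?, ?)", [0.0, "actor-1", "started"]);

// Persist a snapshot of the whole database to disk when needed.
db.query("VACUUM INTO 'snapshot.db'");
```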

I think having a good API to serialize / deserialize databases from memory can still be very useful.