Handle large results
- The submission size limit is 100kB
- For JavaScript, the heap limit is 20MB, so we can decompress the whole solution at once
- LZMA (best general compression algorithm) in JavaScript
- delta compression -- when it works, it works very well
- pruning of `null` values
- Base91 instead of Base64 -- haven't tried; uncertain whether the increased compression ratio is worth the additional code
Need to implement all of the above, simplify the code, and get rid of chunking.
My idea is to have the solver try to string together a compression pipeline for each submission; it can try all permutations, and pick the best one.
Portability is an issue. It may make sense to keep both LZString and LZMA-JS. A modular approach: as a compression algorithm gets implemented in a particular language, the solver can use it when assembling the pipeline for that language.
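To make that concrete, here is a minimal sketch of the selection step, assuming a table of synchronous stage functions (the stage names, bestPipeline, and the candidate orderings are illustrative assumptions, not existing code in this repo):

const LZString = require('lz-string')

// Sketch only: each stage maps one representation to the next.
const stages = {
    // delta-encode an integer array (same idea as compress() below)
    delta: buf => {
        const b = Array.from(buf)
        for (let i = b.length - 1; i > 0; i--) b[i] -= b[i - 1]
        return b
    },
    json: value => JSON.stringify(value),
    lzstring64: str => LZString.compressToBase64(str),
}

// Run every candidate ordering of stages and keep whichever yields the smallest payload.
function bestPipeline(buffer, candidateOrders) {
    let best = null
    for (const order of candidateOrders) {
        const payload = order.reduce((acc, name) => stages[name](acc), buffer)
        if (best === null || payload.length < best.payload.length) best = { order, payload }
    }
    return best
}

// e.g. bestPipeline(expected, [['json', 'lzstring64'], ['delta', 'json', 'lzstring64']])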
https://github.com/LZMA-JS/LZMA-JS/blob/master/demos/simple_node_demo.js
(main) Jans-Mac-mini:brute-lee rdancer$ pbpaste | wc -c
951790
(main) Jans-Mac-mini:brute-lee rdancer$ pbpaste| lzma | base91 | wc -c
51225
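For reference, the simple_node_demo.js linked above exercises the asynchronous LZMA-JS API roughly as below. This is a sketch assuming the lzma npm package exposes the same compress/decompress callbacks as the demo; the payload variable is a stand-in for an expected-results array.

// Sketch based on the LZMA-JS demo linked above; require path and export
// shape are assumptions (the demo loads src/lzma_worker.js directly).
const my_lzma = require('lzma')
const payload = JSON.stringify([1, 1, 2, 3, 5, 8])   // stand-in for expected results

// Mode 9 is the highest (slowest, smallest) compression level.
my_lzma.compress(payload, 9, (compressed, error) => {
    if (error) throw error
    // `compressed` is an array of bytes; it would still need Base64/Base91
    // encoding before it can be embedded in a submission.
    my_lzma.decompress(compressed, (roundTripped, error2) => {
        if (error2) throw error2
        console.log(roundTripped === payload)   // expected: true
    })
})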
Implement delta compression (see solutions/901/JavaScript.js for a proof of concept)
const LZString = require('lz-string')

function compress(buffer) {
    // Delta-encode: replace each element with its difference from the previous
    // one, then JSON-serialise and LZ-compress to a Base64 string.
    let _buffer = Array.from(buffer)
    for (let i = _buffer.length - 1; i > 0; i--) {
        _buffer[i] -= _buffer[i-1]
    }
    return LZString.compressToBase64(JSON.stringify(_buffer))
}
/**
 * This is the function that is included with the solution
 * @param {String} compressedString
 * @returns {Array}
 */
function decompress(compressedString) {
    // Reverse the pipeline: Base64 + LZ decompression, then undo the delta encoding.
    const buffer = JSON.parse(LZString.decompressFromBase64(compressedString))
    for (let i = 1; i < buffer.length; i++) {
        buffer[i] += buffer[i-1]
    }
    return buffer
}
function test(buffer) {
    // Round-trip check: compress then decompress must reproduce the input exactly.
    const _buffer = decompress(compress(buffer))
    return _buffer.length === buffer.length && _buffer.every((val, i) => val === buffer[i])
}
test(buffer) // true
/**
 * Given an expected result array, this function will return a compressed string
 *
 * Note that buffer is a global variable, and is appended to every time we get an additional expected result
 */
var buffer = []
function prep(expected_result) {
    expected_result = expected_result.filter(x => x !== null) // note: must not call mySolution() from null-returning methods
    buffer = buffer.concat(expected_result)
    return compress(buffer)
}
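A sketch of how the pieces might fit together end to end (the expected array literal and the cursor-based wiring of mySolution() are illustrative assumptions, not code from the repo):

// Illustrative only: `expected` stands for results captured from a reference
// solution; COMPRESSED is the string that gets embedded in the submission.
const expected = [1, 1, 2, 3, 5, 8, null, 13]   // the null is pruned by prep()
const COMPRESSED = prep(expected)

// Inside the submitted solution, reconstruct the answers once and serve them
// one at a time.
const answers = decompress(COMPRESSED)
let cursor = 0
function mySolution() {
    return answers[cursor++]
}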
Proof-of-concept LZMA solutions: solutions/1929/JavaScript.js and solutions/295/JavaScript.js, both using a modified version of the LZMA-JS library.
Even with LZMA, which is to my knowledge the best available general compression algorithm, most solutions are still too big and too random: 1310, 1451, 1632, 1707, 1720, 1721 -- all way over the submission size limit even with the compression and delta encoding.