Native browser CompressionStream (gzip) for saves #247
Zarniwoops wants to merge 4 commits into Ada18980:5.5
Conversation
Force-pushed d52125d to 34392eb
ah, I totally missed this

The main difference with this one is that I was trying to keep the changes as minimal as I could. It's also disabled by default; you have to enable it in the config options.

I'll make a few changes, I think, but @ewhac I do think a minimalist approach is preferred here
```js
let binary = '';
for (let i = 0; i < compressed.length; i++) {
	binary += String.fromCharCode(compressed[i]);
}
// return 'data:application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2;base64,' + btoa(binary);
return 'gzip:' + btoa(binary);
```
Couple of things:

- You're basically creating a second copy of the compressed data in the conversion from `Uint8Array` to a string. This copy is also twice as large, as JS strings are UTF-16. Recall that I have save codes as much as 7 MiB in size after gzip compression (I can share them with you for testing if you like). I suggest using `Uint8Array.prototype.toBase64()` to encode the compressed `Uint8Array` directly, as recommended by Mozilla's docs.
- Strongly recommend generating a data URL with a full MIME type. This will allow both the user and the code to immediately distinguish between save code types (yes, you will be expanding to outfits/wardrobe in the future), as well as add a version field letting you evolve the save format over time.
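To illustrate the two suggestions above, here is a minimal sketch (hypothetical helper names, not the PR's actual code) that prefers the native `toBase64()` encoder and emits a versioned data URL, with a chunked `btoa` fallback for engines that lack it:

```javascript
// Hypothetical sketch of the reviewer's suggestions; names are illustrative.
const SAVE_MIME = 'application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2';

function bytesToBase64(bytes) {
	// Prefer the native encoder: no intermediate UTF-16 string copy
	if (typeof bytes.toBase64 === 'function') return bytes.toBase64();
	// Fallback: build the binary string in chunks so large (multi-MiB) saves
	// don't blow past String.fromCharCode's argument-count limit
	let binary = '';
	for (let i = 0; i < bytes.length; i += 0x8000) {
		binary += String.fromCharCode(...bytes.subarray(i, i + 0x8000));
	}
	return btoa(binary);
}

function toSaveDataURL(compressed) {
	return 'data:' + SAVE_MIME + ';base64,' + bytesToBase64(compressed);
}
```

Either path produces the same base64 payload; the MIME prefix lets a loader reject foreign save codes before decompressing anything.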
…am (gzip). Slightly smaller, 10x faster
…liant data url save format
Use Uint8Array.toBase64() instead of String.fromCharCode loop + btoa(), avoiding a redundant UTF-16 copy of the compressed data. Loader validates save type from MIME when present, rejects non-game saves, and falls back to LZString for classic save codes.
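The loader behavior described above can be sketched as a small dispatcher (hypothetical `classifySave` name and tag strings; the PR's actual loader code is not shown here):

```javascript
// Hypothetical loader-side dispatch: validate MIME when present,
// reject non-game saves, fall back to LZString for classic codes.
function classifySave(code) {
	if (code.startsWith('data:')) {
		const mime = code.slice(5, code.indexOf(';'));
		// Reject non-game saves up front (e.g. a future outfit/wardrobe format)
		if (!mime.startsWith('application/vnd.straightlaced.kinkydungeon.save.game')) {
			throw new Error('Not a game save: ' + mime);
		}
		return 'gzip-dataurl';
	}
	if (code.startsWith('gzip:')) return 'gzip-legacy';
	return 'lzstring'; // classic LZString save codes
}
```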
Force-pushed 34392eb to 075f481
Alright, I updated it.

`toBase64()` doesn't work on the current Electron bundle of the game, so now it's auto-detecting and using the best method.
```js
// Use toBase64() where available (Chrome 137+), fall back to btoa for older engines
let b64;
if (typeof compressed.toBase64 === 'function') {
	b64 = compressed.toBase64();
```
Nice. But, as you point out, `toBase64()` is crazy-new (sorry; didn't know that when I suggested it).

However, `readAsDataURL()` has been around since Chrome 6 for `FileReader`, and Chrome 7 for `FileReaderSync`. In fact, I think I'm going to re-work my own version to use `FileReaderSync`, since it'll get rid of at least one `await`. Crap; `FileReaderSync()` is only available inside Web Workers.

Lemme noodle on this for a bit...
Here's my implementation, which is basically a copy of what I wrote:

```ts
/**
 * Compress a string using native browser gzip, returns a data: URI with MIME type
 */
async function KDCompressGzip (input: string): Promise<string> {
	const mime_type = 'application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2';
	const encoder = new TextEncoder();
	// b_js -- blob of JSON
	const save_b_js = new Blob ([encoder.encode (input)]);
	const save_pipe = save_b_js.stream().pipeThrough (new CompressionStream ('gzip'));
	const resp = new Response (save_pipe, { headers: [[ "Content-Type", mime_type ]]});
	// b_z -- blob compressed
	const save_b_z = await resp.blob();

	// Snarfed from MDN Web docs. We do not catch the failure case.
	async function toBase64DataURL (blob: Blob): Promise<string> {
		return await new Promise ((resolve, reject) => {
			const reader = Object.assign (new FileReader(), {
				onload: () => resolve (reader.result as string),
				onerror: () => reject (reader.error)
			});
			reader.readAsDataURL (blob);
		});
	}

	// Maybe return the Promise directly?
	return await toBase64DataURL (save_b_z);
}
```

Only significant difference from my version is that I catch errors. All the APIs used are sufficiently old that anyone using an even older browser has far more problems going on. That said, I have no wish to antagonize anyone by harping on this any further.
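For completeness, the inverse direction can be sketched the same way: decode the base64 payload of the data URL and pipe it through `DecompressionStream`. The `KDDecompressGzip` name is hypothetical (the loader side isn't shown in this thread), and error handling is omitted:

```javascript
// Hypothetical inverse of the compressor above; assumes a well-formed
// ';base64,' data URL. Sketch only -- no error handling.
async function KDDecompressGzip(dataUrl) {
	const b64 = dataUrl.slice(dataUrl.indexOf(';base64,') + ';base64,'.length);
	const bytes = Uint8Array.from(atob(b64), c => c.charCodeAt(0));
	const stream = new Blob([bytes]).stream()
		.pipeThrough(new DecompressionStream('gzip'));
	// Response.text() drains the stream and decodes UTF-8 back to a string
	return await new Response(stream).text();
}
```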
This uses the native CompressionStream (async pain in the ass) instead of doing compression in pure JS.
It's 10 times faster and, surprisingly, a bit smaller.
I don't think I made too much of a mess with async, but it's not as clean as I'd like.