
Native browser CompressionStream (gzip) for saves#247

Open
Zarniwoops wants to merge 4 commits into Ada18980:5.5 from Zarniwoops:feature/gzip-compression

Conversation

@Zarniwoops
Contributor

This uses the native CompressionStream API (the async plumbing is a pain) instead of doing compression in pure JS.

It's about 10× faster and, surprisingly, the output is a bit smaller.

I don't think I made too much of a mess with the async code, but it's not as clean as I'd like.

@Zarniwoops Zarniwoops force-pushed the feature/gzip-compression branch from d52125d to 34392eb Compare December 25, 2025 15:21
@YukiMonsai

Ah, I totally missed this. Is this different from #223?

@ewhac ewhac mentioned this pull request Feb 13, 2026
@Zarniwoops
Contributor Author

The main difference with this one is that I was trying to keep the changes as minimal as I could. It's also off by default: you have to enable it in the config options.

@Ada18980
Owner

I'll make a few changes, I think, but @ewhac I do think a minimalist approach is preferred here.

Contributor

@ewhac left a comment


Comment thread: Game/src/base/KinkyDungeon.ts (Outdated)
Comment on lines +7277 to +7282
let binary = '';
for (let i = 0; i < compressed.length; i++) {
	binary += String.fromCharCode(compressed[i]);
}
// return 'data:application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2;base64,' + btoa(binary);
return 'gzip:' + btoa(binary);
Contributor


Couple of things:

  • You're basically creating a second copy of the compressed data in the conversion from Uint8Array to a string. This copy is also twice as large, since JS strings are UTF-16. Recall that I have save codes as large as 7 MiB after gzip compression (I can share them with you for testing if you like). I suggest using Uint8Array.prototype.toBase64() to encode the compressed Uint8Array directly, as recommended by Mozilla's docs.
  • Strongly recommend generating a data URL with a full MIME type. This will allow both the user and the code to immediately distinguish between save code types (yes, you will be expanding to outfits/wardrobe in the future), as well as add a version field letting you evolve the save format over time.
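A minimal sketch of the two suggestions combined, assuming the MIME type from the commented-out line in the diff; the helper name and the feature detection are illustrative, not the PR's actual code:

```typescript
// Hypothetical helper combining both suggestions: encode the compressed bytes
// straight to base64 (no intermediate UTF-16 string where supported) and wrap
// the result in a data: URL carrying the full MIME type plus a version
// parameter. Only the MIME string comes from the diff; the rest is a sketch.
const SAVE_MIME = 'application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2';

function encodeSaveBytes(compressed: Uint8Array): string {
	// Uint8Array.prototype.toBase64() (newer engines) avoids the second,
	// twice-as-large UTF-16 copy of the data.
	const bytes = compressed as any;
	const b64: string = typeof bytes.toBase64 === 'function'
		? bytes.toBase64()
		// Fallback for older engines: classic fromCharCode + btoa.
		: btoa(Array.from(compressed, (b) => String.fromCharCode(b)).join(''));
	return 'data:' + SAVE_MIME + ';base64,' + b64;
}
```

Either branch yields the same URL, so a loader can dispatch on the MIME type before decompressing.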

Use Uint8Array.toBase64() instead of String.fromCharCode loop + btoa(),
avoiding a redundant UTF-16 copy of the compressed data.

Loader validates save type from MIME when present, rejects non-game
saves, and falls back to LZString for classic save codes.
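The loader behavior described in that commit message could be sketched as follows; the function name and return values are illustrative, and only the MIME type comes from the diff:

```typescript
// Hypothetical dispatch matching the commit message: validate the MIME type
// when a data: URL is present, reject non-game saves, and treat anything
// without a data: prefix as a classic LZString save code.
const GAME_SAVE_MIME = 'application/vnd.straightlaced.kinkydungeon.save.game+gzip';

function classifySaveCode(code: string): 'gzip-game' | 'classic' {
	// Capture the base MIME type, stopping before any ;parameters or the comma.
	const m = /^data:([^;,]+)/.exec(code);
	if (m) {
		// A data: URL with the wrong MIME type is some other kind of save.
		if (m[1] !== GAME_SAVE_MIME) throw new Error('Not a game save: ' + m[1]);
		return 'gzip-game';
	}
	// No data: URL prefix: assume a classic LZString save code.
	return 'classic';
}
```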
@Zarniwoops Zarniwoops force-pushed the feature/gzip-compression branch from 34392eb to 075f481 Compare February 21, 2026 21:44
@Zarniwoops
Contributor Author

Alright, I updated it.

@Zarniwoops
Contributor Author

toBase64() doesn't work in the current Electron bundle of the game, so it now auto-detects and uses the best available method.

Comment thread: out/saveworker.js
Comment on lines +29 to +32
// Use toBase64() where available (Chrome 137+), fall back to btoa for older engines
let b64;
if (typeof compressed.toBase64 === 'function') {
b64 = compressed.toBase64();
Contributor

@ewhac Feb 22, 2026


Nice. But, as you point out, toBase64() is crazy-new (sorry; didn't know that when I suggested it).

However, readAsDataURL() has been around since Chrome 6 for FileReader, and Chrome 7 for FileReaderSync. In fact, I think I'm going to rework my own version to use FileReaderSync, since it'll get rid of at least one await. Crap; FileReaderSync() is only available inside Web Workers.

Lemme noodle on this for a bit...

@ewhac
Contributor

ewhac commented Feb 22, 2026

Here's my implementation, which is basically a copy of what I wrote for super_saver (couldn't really find a way to simplify it):

/**
 * Compress a string using native browser gzip, returns a data: URI with MIME type
 */
async function KDCompressGzip (input: string): Promise<string> {
	const mime_type = 'application/vnd.straightlaced.kinkydungeon.save.game+gzip;version=2';

	const encoder = new TextEncoder();
	// b_js -- blob of JSON
	const save_b_js = new Blob ([encoder.encode (input)]);
	const save_pipe = save_b_js.stream().pipeThrough (new CompressionStream ('gzip'));

	const resp = new Response (save_pipe, { headers: [[ "Content-Type", mime_type ]]});
	// b_z -- blob compressed
	const save_b_z = await resp.blob();

	// Snarfed from MDN Web docs.  We do not catch the failure case.
	async function toBase64DataURL (blob: Blob): Promise<string> {
		return await new Promise ((resolve, reject) => {
			const reader = Object.assign (new FileReader(), {
				onload:  () => resolve (reader.result as string),
				onerror: () => reject (reader.error)
			});
			reader.readAsDataURL (blob);
		});
	}

	// Maybe return the Promise directly?
	return await toBase64DataURL (save_b_z);
}

The only significant difference from my version is that I catch errors from toBase64DataURL() and fall back to LZString.

All the APIs used are sufficiently old that anyone using an even older browser has far more problems going on. That said, I have no wish to antagonize anyone by harping on this any further.
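That error-catching fallback can be sketched generically; both encoder parameters here are placeholders standing in for KDCompressGzip and the LZString path, not the PR's actual API:

```typescript
// Hedged sketch of the fallback: try the gzip/data-URL encoder first, and
// fall back to an LZString-style encoder if anything throws or rejects
// (e.g. an engine without CompressionStream or FileReader).
async function encodeWithFallback(
	json: string,
	gzipEncode: (s: string) => Promise<string>,
	classicEncode: (s: string) => string,
): Promise<string> {
	try {
		return await gzipEncode(json);
	} catch {
		// Any failure in the modern path lands here.
		return classicEncode(json);
	}
}
```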
