
Update with feedback on page size

Added the Web Bloat Score Calculator and a snippet on how people on
trains experience connection speed drops.

Thanks, u/Snapstromegon from Reddit!
Rohan Kumar 2020-11-27 10:25:03 -08:00
parent 55543cf9b2
commit 46512c9046
2 changed files with 21 additions and 5 deletions


@@ -61,7 +61,9 @@ Ultimately, surveillance self-defense on the web is an arms race full of trade-offs
For users on slow connections, lazy loading is often frustrating. I think I can speak for some of these users: mobile data near my home has a number of "dead zones" with abysmal download speeds, and my home's Wi-Fi repeater setup occasionally results in packet loss rates above 60% (!!).
-Users on poor connections have better things to do than idly wait for pages to load. They might open multiple links in background tabs to wait for them all to load at once, or switch to another window/app and come back when loading finishes. They might also open links while on a good connection before switching to a poor connection; I know that I often open 10-20 links on Wi-Fi before going out for a walk in a mobile-data dead-zone.
+Users on poor connections have better things to do than idly wait for pages to load. They might open multiple links in background tabs to wait for them all to load at once, or switch to another window/app and come back when loading finishes. They might also open links while on a good connection before switching to a poor connection; I know that I often open 10-20 links on Wi-Fi before going out for a walk in a mobile-data dead-zone. A Reddit user reading an earlier version of this article described a similar experience when travelling by train:
+=> https://i.reddit.com/r/web_design/comments/k0dmpj/an_opinionated_list_of_best_practices_for_textual/gdmxy4u/ u/Snapstromegon's comment
Unfortunately, pages with lazy loading don't finish loading off-screen images in the background. To load this content ahead of time, users need to switch to the loading page and slowly scroll to the bottom to ensure that all the important content appears on-screen and starts loading. Website owners shouldn't expect users to have to jump through these ridiculous hoops.
@@ -126,7 +128,7 @@ It might seem odd to create a lossless WebP from a lossy PNG, but I've found that
The 250kb club gathers websites at or under 250kb, and also rewards websites that have a high ratio of content size to total size.
-=> https://250kb.club/ https://250kb.club/
+=> https://250kb.club/ The 250kb Club
Motherfucking Website generated a lot of buzz when it was created:
@@ -135,3 +137,7 @@ Motherfucking Website generated a lot of buzz when it was created:
Motherfucking Website inspired several unofficial sequels that tried to gently improve upon it. My favorite:
=> https://bestmotherfucking.website/ Best Motherfucking Website
+The WebBS calculator compares a page's size with the size of a PNG screenshot of the full page content, encouraging site owners to minimize the ratio of the two:
+=> https://www.webbloatscore.com/ Web Bloat Score Calculator


@@ -126,7 +126,10 @@ They might open multiple links in background tabs to wait for them all to load at
once, or switch to another window/app and come back when loading finishes. They might
also open links while on a good connection before switching to a poor connection; I
know that I often open 10-20 links on Wi-Fi before going out for a walk in a
-mobile-data dead-zone.
+mobile-data dead-zone. A Reddit user reading an earlier version of this article
+described a [similar
+experience](https://i.reddit.com/r/web_design/comments/k0dmpj/an_opinionated_list_of_best_practices_for_textual/gdmxy4u/)
+riding the train.
Unfortunately, pages with lazy loading don't finish loading off-screen images in the
background. To load this content ahead of time, users need to switch to the loading
@@ -196,9 +199,12 @@ Most of my images will probably be screenshots that start as PNGs. My typical flow
1. Lossy compression with `pngquant`
2. Losslessly optimize the result with `oxipng` and its Zopfli backend (slow)
3. Also create a lossless WebP from the lossy PNG, using `cwebp`
-4. Include the resulting WebP in the page, with a fallback to the PNG using a `<picture>` element.
+4. Include the resulting WebP in the page, with a fallback to the PNG using a
+`<picture>` element.
-It might seem odd to create a lossless WebP from a lossy PNG, but I've found that it's the best way to get the smallest possible image at the minimum acceptable quality for screenshots with solid backgrounds.
+It might seem odd to create a lossless WebP from a lossy PNG, but I've found that
+it's the best way to get the smallest possible image at the minimum acceptable
+quality for screenshots with solid backgrounds.
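
To make that flow concrete, here is a rough shell sketch (my own illustration, not taken from the diff above): the file names are placeholders and the specific flags are assumptions rather than the author's exact invocation.

```sh
# Hypothetical input: shot.png, a screenshot exported from the browser.

# 1. Lossy compression with pngquant
pngquant --force --output shot.lossy.png -- shot.png

# 2. Losslessly optimize the result with oxipng and its (slow) Zopfli backend
oxipng -o max --zopfli --strip safe shot.lossy.png

# 3. Create a lossless WebP from the lossy PNG with cwebp
cwebp -lossless shot.lossy.png -o shot.webp

# 4. Reference the WebP with a PNG fallback using a <picture> element
cat <<'EOF'
<picture>
  <source srcset="shot.webp" type="image/webp">
  <img src="shot.lossy.png" alt="Screenshot description">
</picture>
EOF
```

Browsers that understand WebP pick the `<source>` entry; everything else falls back to the plain `<img>` PNG.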
Other places to check out
-------------------------
@@ -209,3 +215,7 @@ rewards websites that have a high ratio of content size to total size.
Also see [Motherfucking Website](https://motherfuckingwebsite.com/). Motherfucking
Website inspired several unofficial sequels that tried to gently improve upon it. My
favorite is [Best Motherfucking Website](https://bestmotherfucking.website/).
+The [WebBS calculator](https://www.webbloatscore.com/) compares a page's size with
+the size of a PNG screenshot of the full page content, encouraging site owners to
+minimize the ratio of the two.