My web pages are larger. But not because of script junk and graphics. The texts simply are that long.
It’s basically 1% markup and 99% visible text.
14kB club: “Amateurs!!!”
https://dev.to/shadowfaxrodeo/why-your-website-should-be-under-14kb-in-size-398n
A 14kB page can load much faster than a 15kB page — maybe 612ms faster — while the difference between a 15kB and a 16kB page is trivial. This is because of the TCP slow start algorithm. This article will cover what that is, how it works, and why you should care.
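The 14kB figure falls out of TCP slow start: the server starts with a small congestion window and roughly doubles it every round trip. A rough back-of-envelope sketch, assuming the common defaults of a 10-segment initial window (RFC 6928) and ~1460-byte segments — real stacks vary:

```python
# Rough model of TCP slow start: the sender may transmit `cwnd` segments
# per round trip, and cwnd roughly doubles each round trip.
MSS = 1460          # typical maximum segment size in bytes (assumption)
INITIAL_CWND = 10   # RFC 6928 initial congestion window, in segments

def round_trips(page_bytes: int) -> int:
    """Round trips needed to deliver page_bytes under ideal slow start."""
    cwnd = INITIAL_CWND
    sent = 0
    trips = 0
    while sent < page_bytes:
        sent += cwnd * MSS
        cwnd *= 2
        trips += 1
    return trips

print(round_trips(14 * 1024))  # 14kB fits in the first ~14.6kB window -> 1
print(round_trips(15 * 1024))  # 15kB spills into a second round trip -> 2
```

On a high-latency link (the article's 612ms figure assumes satellite), that one extra round trip is the whole story.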
Is it just the HTML that should be under 14kB? I think scripts, CSS, and images (except embedded SVGs) are separate requests? So should each of those individually be under 14kB to get the benefit?
Those additional requests will reuse the existing connection, so they’ll have more bandwidth at that point.
Interesting, didn’t know that’s how modern browsers worked. Guess my understanding was outdated from the HTTP/1 standard.
In an ideal world, there’s enough CSS/JS inlined in the HTML that the page layout is consistent and usable without secondary requests.
Something something QUIC something something
I actually read the link and they mention QUIC
there is a notion that HTTP/3 and QUIC will do away with the 14kB rule — this is not true. QUIC recommends the same 14kB rule.
Damn I was actually gonna add more context to my original comment about how QUIC is an overrated in place UDP upgrade for HTTP, but I didn’t wanna open my mouth because I haven’t read the QUIC spec.
Thank you for this lol
spoiler
Sliding windows are for losers, spam packets at gigabit rates or go home /s
A small critique of that project: a large portion of the websites included are simply personal sites for developers, barely anything more technical than a business card or CV. I would exclude those or categorize them differently, as their “usefulness” seems like a relatively edge case to me.
I clicked on 6 sites:

- 4 were personal portfolio sites
- 1 was a personal blog
- 1 was a web design company

Pretty disappointing, and I’m not going to keep clicking on more in the hopes I find something interesting.
I clicked on random and got a tic-tac-toe game that’s apparently purpose-made. Works fine, too.
In the FAQ they actually do address that and mention they’re reviewing those sites for removal.
They are useful for those people though. You can put a QR code or URL on your business card and it will give people all the information they need for your business, or something.
I don’t think anyone is arguing that websites like that aren’t useful. But if they’re not particularly interesting, then they don’t really fit here.
The point of something like this is generally to come up with interesting/creative/useful things within arbitrary resource limits. Not just a bunch of really really limited boring stuff.
That’s not one of their requirements. You might want to look at a different competition.
Yeah, I don’t get this complaint. This is just a label that people can qualify for; it’s not a competition or a curated list of totally great websites. It’s literally just like an energy efficiency sticker on a TV.
That’s cute. Go check out some 64k intros.
Oh man, reminds me of that amazing 3D FPS demo from 20 years ago.
.kkrieger for those who want to look it up
I need a CDN free single GET request club
Why exactly? Do you know what a CDN does and why its there in the first place?
Best I can do is a Firefox extension (Decentraleyes)
512kb? So much bloat…
I remember being amazed at Yahoo!'s load times on 56K. Pulled the code, 79K.
That’s a pretty cool site. I always wanted to set up my own simple HTML page for a personal blog. Lots of inspiration in these pages. Crazy that there are functional web pages less than 2.03KB in size.
You can try Neocities!
I’ll look into it! Thanks!
It’s not hard to make a useful website that’s small. You just have to avoid using javascript libraries and keep images to a minimum. There are a number of static web page generators if you don’t want to write HTML yourself.
Keep in mind that a 50 kB page is about a 15 second load on a typical dial-up connection. Before high speed internet, almost everyone kept their web pages small.
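The dial-up arithmetic roughly checks out. A 56 kbit/s modem rarely sustained its nominal rate; ~3.3 kB/s is a plausible effective throughput after protocol overhead and line noise (that figure is an assumption, not a measurement):

```python
# Estimate page load time on dial-up. The effective throughput is an
# assumption: 56 kbit/s nominal rarely delivered more than 3-5 kB/s.
EFFECTIVE_BYTES_PER_SEC = 3_300

def load_seconds(page_kb: float) -> float:
    """Seconds to transfer page_kb kilobytes at the assumed throughput."""
    return page_kb * 1000 / EFFECTIVE_BYTES_PER_SEC

print(round(load_seconds(50)))  # ~15 seconds for a 50 kB page
print(round(load_seconds(14)))  # ~4 seconds for a 14 kB page
```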
First of all, I take a bit of umbrage at the author’s constant reference to “website size” without defining what this means until you dig into the FAQ. Just blithely referring to everything as “size” is a bit misleading, since I imagine most people would immediately assume size on disk which obviously makes no sense from a web browsing perspective. And indeed, they actually mean total data transferred on a page load.
Also, basically all this does is punish sites that use images. I run an ecommerce website (and no, I’m not telling you lunatics which one) and mine absolutely would qualify handily, except… I have to provide product images. If I didn’t, my site would technically still “work” in a broad and objective sense, but my customers would stage a riot.
A home page load on our site is just a shade over 2 megabytes transferred, the vast majority of which is product images. You can go ahead and run an online store that actually doesn’t present your customers any products on the landing page if you want to, and let me know how that works out for you.
I don’t use any frameworks or external libraries or jQuery or any of that kind of bullshit that has to be pulled down on page load. Everything else is a paltry (these days) 115.33 kB. I’mma go ahead and point out that this is actually less to transfer than this jabroni has got on his own landing page, which is 199.31 kB. That’s code and content only for both metrics, also not including his sole image — his favicon, which for some reason inexplicable given the circumstances is a 512x512 .png. (I used the Firefox network profiler to generate these numbers.)
Do you actually have to provide the image? Couldn’t you provide a pointer to the image? Like those thumbnails that are just links on the backends but appear as images when loaded
If you’re going to display pixels on the user’s screen, you have to send those pixels to the user. Magic still doesn’t exist. HTML img tags are indeed a “pointer,” but once the user’s browser has the path to that image file it will download the entire thing.
That said, there’s no reason to send an image that’s any bigger than it needs to be. Sending a scaled down thumbnail if you know it will be displayed small is sensible. Sending the entire 1200px wide or whatever image it is and just squashing it into a 100px wide box in the user’s browser is not.
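The waste from shipping an oversized image grows quadratically with dimension, since pixel count scales with area. A quick sketch (assuming the aspect ratio is preserved; actual byte savings depend on compression, so this is only the pixel-count ratio):

```python
def pixel_ratio(full_width: int, displayed_width: int) -> float:
    """How many times more pixels the full image carries than the display
    needs, assuming aspect ratio is preserved (area scales with width**2)."""
    return (full_width / displayed_width) ** 2

# Squashing a 1200px-wide image into a 100px box ships ~144x the pixels.
print(pixel_ratio(1200, 100))  # 144.0
```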
Once the user’s browser has the path to that image…
I dunno why that didn’t occur to me, that makes sense
That’s how it works.
You may be thinking of “lazy loading,” where some scriptwork is used to delay downloading images until some time after the initial page load completes. This still requires all the data to be sent to the user — all of the data always has to be sent to the user eventually — but just not right away. This can have perceptible load time benefits, especially if whatever content you’re loading won’t be visible in the viewport initially anyway.
Tbh I’m just new to the computer science scene - I’ve taken one class so far on the fundamentals of programming and have only used a real language in my free time as of yet.
It didn’t occur to me that the webpage which references another for an image would still be culpable for the space taken up by the image, because with on-disk memory management you can do tricks to reduce sizes with pointers and I just thought it would be analogous. It feels painfully obvious to me why that’s stupid now lol
It’s the same line of logic as when you see people post on a forum something like [img]c:\Users\Bob\Documents\My_Image.bmp[/img] and then wonder why it doesn’t work.
“But I can see it on my computer!”
Over the internet, the origin of all data is on someone else’s computer. All means all. And all of it needs to come down the wire to you at some point.
You’re on the right track in one regard, though, in a roundabout way with caching: Browsers will keep local copies of media or even the entire content of webpages on disk for some period of time, and refer to those files when the page is visited again without redownloading the data. This is especially useful for images that appear in multiple places on a website, like header and logo graphics, etc.
This can actually become a problem if an image is updated on the server’s side, but your browser is not smart enough to figure this out. It will blithely show the old image it has in its cache, which is now outdated. (If you force refresh a webpage, by holding Shift while clicking reload or pressing Ctrl+F5 in most current browsers, you’ll get a reload that explicitly ignores any files already in the cache, and all the images and content will be fully redownloaded. That’s how you get around this if it happens to you.)
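On the server side, the usual fix for stale caches is cache-busting: embed a hash of the file’s content in its filename, so an updated image gets a new URL and the old cached copy is simply never referenced again. A minimal sketch (the naming scheme and 8-character hash length are illustrative choices, not a standard):

```python
import hashlib

def busted_name(filename: str, content: bytes) -> str:
    """Return a cache-busting filename like 'logo.3f8a2c1d.png', derived
    from a hash of the file's content; changed content -> changed URL."""
    digest = hashlib.sha256(content).hexdigest()[:8]
    stem, dot, ext = filename.rpartition(".")
    return f"{stem}.{digest}.{ext}" if dot else f"{filename}.{digest}"

print(busted_name("logo.png", b"old image bytes"))
print(busted_name("logo.png", b"new image bytes"))  # different URL
```

Because each version has a unique URL, such files can be served with a very long cache lifetime and browsers never show a stale copy.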
And use SVG when you can; bitmaps which should be vectors are frequently big and ugly.
Thanks for sharing. Great inspiring collection.