How to build an absurdly backwards-compatible website

131 points by zachflower 3 years ago | 64 comments
  • taftster 3 years ago
    > Let's face it: the internet is broken.

    This is where I disconnect. The internet is not broken. Maybe arguably the _web_ is broken, and specifically, web pages are broken (the HTTP protocol is still a wonderful thing).

    I wish technical authors would stop repeating the internet-is-broken meme when they really mean the web is broken. Sure, there's plenty broken with the internet in other ways (DNS, encryption, governments, DDoSes, etc.), but let's not make the mistake of equating the internet with the web.

    • dusted 3 years ago
      The Internet is broken too. We've been slowly lulled into a scenario where there are first-class, second-class, and even third-class netizens, and the third class I would not even consider as having an Internet connection.

      First-class netizens are publicly routable nodes with statically allocated IP addresses. These have real, honest Internet connections; they get to participate in the global community of humans and their beloved machines.

      The second class is like the first, but the service provider has imposed restrictions on their ability to communicate freely with other machines, such as blocking any packets they may send on specific protocols and ports, notably TCP port 25, meaning these people cannot send their own email.

      The third class is like the second, but they do not have statically allocated IP addresses. They therefore cannot reliably and consistently participate in the global community and must jump through some number of hoops to even participate in a limited way.

      There is a hidden fourth class too. These cannot be considered as having a connection to the Internet at all: they cannot participate, only passively consume existing services. Their machines are not connected to the Internet but to another network, and their service provider will just barely allow them to make requests to services on the real Internet, but it will not route any new connections to them. They are cut off and isolated, the silent majority. And this is where the brokenness of the Internet truly shines. This mode of connection should be illegal; hiding yourself in this way should be an active choice on the part of the individual, not something imposed by their "service provider".

      • blueflow 3 years ago
        > TCP port 25, meaning these people cannot send their own email

        They cannot run a mail server, you mean. For sending mails, you have port 587 plus authentication, which also solves the reputation issue with dynamic pools.

        The reputation issues with dynamic pools likely stem from malware-infected user machines sending spam.

        • dusted 3 years ago
          No, I mean sending mail (outgoing). Running a mailserver and receiving mail is generally not a problem.

          Let's say your email server is running on blueflow.person. How do I send you an email on port 587 + authentication? I'd need an account on your server to do that. That's just silly! We can't expect everyone to have accounts on every mailserver just to send email to each other; that'd be like the postman having to have keys to every postbox to deliver letters!

          Port 25 is where your email server expects to receive emails from other mailservers (like mine).

        • southerntofu 3 years ago
          I agree with your analysis (thanks for sharing), but just to nitpick, I believe it's fine if an ISP blocks port 25 by default to help combat unintended spam, as long as you have the option to unblock it.
        • eimrine 3 years ago
          > The internet is not broken.

          > (the HTTP protocol is still a wonderful thing).

          But almost all websites require HTTPS, and you know what? I have a dozen dumbphones that cannot access web pages because HTTPS is no longer supported by the device vendor, or that cannot complete the HTTPS handshake at all because it takes far too long over GPRS (if there's no EDGE). If HTTP is so wonderful, then why can't I use it without all the stupid security?

          • charcircuit 3 years ago
            That's like asking why you can't use HTTP without the packet reliability of TCP or QUIC. Packet reliability, or security in your case, happens at a lower network layer. Just because HTTP itself lacks packet reliability doesn't mean it sucks.
            • taftster 3 years ago
              For reference, HTTPS is still HTTP. These are not two separate protocols. SSL/TLS doesn't change how HTTP works.

              Why the world has switched to HTTPS might be an interesting discussion. But the HTTP protocol itself still drives what's going on when sending/receiving data between client and server.

              • forgotmypw17 3 years ago
                A small portion of websites are still usable with HTTP, and I'm hoping this number actually grows.
              • mattigames 3 years ago
                "The internet is broken" he says; a moment later he greets his wife he met on tinder, then he opens a 3blue1brown video, a YouTube channel that he himself admits is the first teacher in his life that made math truly click in his brain, in the room right after his son is playing Minecraft with his best friend, a little girl from Japan who he has never met but already knows a bit of japanese thanks to their long chats over Discord. "Oh so broken" he sincerely lamented.
                • axiomsEnd 3 years ago
                  "The traffic infrastruture sucks", he lamented, stuck in traffic, but finally getting to the restaurant to meet his wife. Now, he is driving his kids every day to school, because low quality of infrastructure makes it dangerous to walk, but children are attending school, and even additional karate lessons. Then the children got their cars, and are now stuck in traffic, but they also get to their deatinations! And they also lament, that the infrastructure sucks.
                  • Cloudef 3 years ago
                    >wife he met on tinder

                    I chuckled on that part

                • paskozdilar 3 years ago
                  One organization that says "The Internet is broken" and really means it is GNUnet [0].

                  [0] https://www.gnunet.org/en/

                  • 123pie123 3 years ago
                    >The internet is not broken.

                    With BGP, it's just about limping along on trust:

                    https://www.bleepingcomputer.com/news/security/major-bgp-lea...

                    • taftster 3 years ago
                      Yes, BGP is a good example of brokenness on the internet. Lower in the stack than web pages, though, which was my rant that "internet broken" for many authors often means "web broken".
                    • eyelidlessness 3 years ago
                      This is a cute and cool dive into taking backwards compatibility to absurd extremes. There’s one bit I might take overly-serious issue with (I hope the author will take this as non-serious criticism and an amused engagement with the thought process):

                      > That said, browsers that predate CSS do not know what to do with <style> tags, and as a result simply print the styles out at the top of the page. It's pretty ugly and, depending on how much CSS you have, borderline unusable.

                      > […]

                      > Forgotten Link Tags

                      I know external vs. inline CSS is pretty much never going to be a settled question, but I feel like there's a missed opportunity here. In particular because:

                      - External CSS by link element addresses this much better for the backwards-compatible extremes: why even serve the contents of the CSS at all if the browser will just treat it as a comment? That’s just precious bandwidth wasted.

                      - Taken to extremes, external CSS relaxes the "use CSS sparingly" principle if you strategically break up the link tags to align well with the likelihood-of-support/value ratio. Using media attributes to split CSS payloads can improve UX for old/underpowered mobile devices and even desktops/laptops with lower-resolution screens. Detecting support for (ahem) @supports can open all kinds of CSS minimization doors for newer but lower-power devices. Segmenting stylesheets by capabilities and preferences allows compatibility improvements regardless of point in time! (A minimal sketch follows below.)
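
                      For instance (file names are hypothetical):

                        <!-- Pre-CSS browsers ignore <link> entirely, so old clients pay nothing for any of this -->
                        <link rel="stylesheet" href="base.css">
                        <!-- media controls which sheets get applied, and lets browsers deprioritize non-matching ones -->
                        <link rel="stylesheet" href="wide.css" media="screen and (min-width: 60em)">
                        <link rel="stylesheet" href="print.css" media="print">
                        <link rel="stylesheet" href="dark.css" media="(prefers-color-scheme: dark)">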

                      And a minor nit: if you’re dithering images for file size it’s worth comparing against a non-dithered version with a small color palette. The latter might actually be better for everyone!

                      Oof, and that prompted a larger one: you can improve UX for ~every real visitor with the <picture> tag and more efficient formats, and you could also sorta skirt the no-JS rule and upgrade pre-<picture> browsers to PNG with a simple inline onerror.
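
                      A rough sketch of that combination (file names hypothetical, and this is only one reading of the onerror trick):

                        <picture>
                          <!-- Capable browsers negotiate the most efficient format they support -->
                          <source srcset="photo.avif" type="image/avif">
                          <source srcset="photo.webp" type="image/webp">
                          <!-- Pre-<picture> browsers ignore <source> and load the <img> src directly;
                               if that format fails to decode, the inline onerror swaps in a PNG,
                               with no other JS needed on the page -->
                          <img src="photo.webp" alt="A photo" onerror="this.onerror=null; this.src='photo.png';">
                        </picture>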

                      • zachflower 3 years ago
                        This is great, thank you!

                        I’ve been living in backend-land for so long I’ve never actually used the <picture> tag. I will have to take a stab at it and see how the legacy browsers treat it, because if I don’t have to use GIFs, I won’t.

                        As for the @media tags, I do utilize them to a degree, just to make everything render nicely on mobile and to support dark-mode. But (to put it cheekily) I’m more concerned with backwards compatibility than forwards compatibility :p

                        • eyelidlessness 3 years ago
                          > As for the @media tags, I do utilize them to a degree, just to make everything render nicely on mobile and to support dark-mode. But (to put it cheekily) I’m more concerned with backwards compatibility than forwards compatibility :p

                          I definitely recognized that! My thought was take that backcompat focus further and relegate whatever forward compat you do choose to support not to @media queries, but the media attribute on link tags[1]. Why should HTML 4 browser users download dark mode CSS they can’t use? ;)

                          1: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/li...

                        • masswerk 3 years ago
                          Externally linked sources introduce a backward-compatibility problem of their own. E.g., I recall early versions of IE only supporting inline JS. So, if you want to support as much as possible as far back as possible, inlining is the way to go.

                          (Since every such extension had this problem when it was introduced, they all support being enclosed in HTML comments. Otherwise, they wouldn't have had any chance of becoming broadly adopted.)

                          Regarding dithering: I recall a few sites with duotone dithering, which was a nice effect. And, of course, to bring down file sizes and optimize results, you could always manually compose from dithered and undithered images using the same palette.

                          • zachflower 3 years ago
                            > Externally linked sources introduce a backward-compatibility problem of their own. E.g, I recall early versions of IE only supporting inline JS. So, if you want to support as much as possible as far back as possible, inlining is the way to go.

                            That was part of the reason for inlining the CSS. The other part (which I didn’t explain in the post) actually came about because of Mosaic.

                            Even though it didn’t support CSS, it was aware of link tags and added a button to the top of the window for each one (literally linking to the referenced file). I couldn’t dig up a way to disable that within the code, so I went with the commented-out inline method to get the experience I was looking for.
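
                            For reference, a minimal sketch of that commented-out inline method (the styles are hypothetical): browsers that predate <style> treat the contents as an HTML comment and print nothing, while CSS parsers ignore the <!-- and --> tokens.

                              <style>
                              <!--
                                body { background: #FFFFFF; color: #000000; }
                              -->
                              </style>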

                            • masswerk 3 years ago
                              Regarding Mosaic: One way around it would have been outputting the stylesheet links via JS `document.write()` plus user-agent filtering, since every browser that supports CSS also supports JS. Anyway, an interesting detail about Mosaic!

                              P.S.: Now I'm not sure if this would generally work in the head section, since in older browsers the `document` object only became available once the body tag was encountered. Or was that just for its properties, with `write()` available anyway? (This behavior changed with NS4/IE4.)
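
                              A minimal sketch of that idea (the user-agent check is illustrative only, and per the P.S., whether this runs in the head section on the oldest browsers is an open question). The HTML-comment wrapper keeps pre-<script> browsers from printing the code:

                                <script>
                                <!--
                                  // Every browser that supports CSS also supports JS, so just
                                  // filter out the ones known to mishandle link tags.
                                  if (navigator.userAgent.indexOf("Mosaic") === -1) {
                                    document.write('<link rel="stylesheet" href="style.css">');
                                  }
                                // -->
                                </script>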

                          • layer8 3 years ago
                            > color palette

                            TFA failed to mention web-safe colors. I still have that poster hanging on the wall: http://www.visibone.com/color/poster4x.html

                          • amiga-workbench 3 years ago
                            Regarding HTTPS, one thing I like to do on my personal websites is listen for whether the client actually wants to upgrade protocols, instead of forcing HTTPS on everyone.

                              # $https is empty on plain HTTP; $http_upgrade_insecure_requests is "1"
                              # when the browser sent the Upgrade-Insecure-Requests header, so the
                              # concatenation is exactly "1" only for a plain-HTTP request from a
                              # client that has asked to be upgraded.
                              set $need_http_upgrade "$https$http_upgrade_insecure_requests";

                              location / {
                                if ($need_http_upgrade = "1") {
                                  # Tell caches the response varies on this request header,
                                  # then send the willing client over to HTTPS.
                                  add_header Vary Upgrade-Insecure-Requests;
                                  return 301 https://$host$request_uri;
                                }

                                # Everyone else (old browsers included) stays on plain HTTP.
                                index index.php index.html;
                                try_files $uri $uri/ /index.php?$query_string;
                              }
                            
                            It's pretty straightforward to do in nginx, and my websites remain usable in IE5, Contiki, and various feature phones.
                            • RcouF1uZ4gsC 3 years ago
                              This enables man-in-the-middle attacks even for clients that want to upgrade protocols. An ISP or the owner of the Wi-Fi network can just quietly drop all upgrade-security headers.
                              • pabs3 3 years ago
                                One other trick for this is to include an HTTPS resource (image/CSS/JS) in the HTTP page, and on the HTTPS side send the header that forces HTTPS for the domain. Then, if the resource loads successfully, future loads of the site go to HTTPS. Browsers that don't support Upgrade-Insecure-Requests often support the HTTPS-only header.
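
                                A sketch of that trick (host name hypothetical; the HTTPS-only header being HSTS, i.e. Strict-Transport-Security):

                                  <!-- on the plain-HTTP page: -->
                                  <img src="https://example.com/pixel.png" alt="" width="1" height="1">
                                  <!-- the HTTPS site then responds with a header like
                                         Strict-Transport-Security: max-age=31536000
                                       so once the pixel loads, the browser remembers to use
                                       HTTPS for the whole domain on future visits. -->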
                                • zachflower 3 years ago
                                  Oooo I like this! I’ll have to steal it.

                                  Thanks for the tip!

                                  • goodoldneon 3 years ago
                                    Don’t do this. What if something strips the header between the client and your server? Always upgrade to HTTPS. Not doing so isn’t worth supporting 25-year-old browsers.
                                    • superkuh 3 years ago
                                      That depends on what your website is. If it's for something commercial or sensitive, then yeah, HTTPS-only is okay. But if it's something of your own (and isn't just done to get you hired), then the downsides of HTTPS-only outweigh the benefits. HTTP+HTTPS is perfect for human persons, even if it's not for corporate persons.

                                      You're basically making it so that people can only visit your site if you maintain an account with a third-party corporation. There are benign organizations like LetsEncrypt, but it still means giving up control to an entity that will eventually go bad. Just look at what happened to dot Org.

                                      And of course you prevent even moderately old systems from interacting with your web server. Depending on your accepted TLS cipher set, you're probably excluding software from as late as 2017 by going HTTPS-only.

                                      It's like wearing Level III body armor when you go out to the park to walk the dog. There are some people whose lives make that necessary, but it really isn't for most, and the downsides outweigh the admittedly very solid protection.

                                      • boomlinde 3 years ago
                                        > What if something strips the header between the client and your server?

                                        Then that something would be equally likely to intercept your initial HTTP request and serve you a TLS-stripped version of the website.

                                        The real solution to this problem is for browsers to never implicitly make plaintext HTTP requests via the address bar. In general, they have become too clever in interpreting the content of the address bar. Firefox, for example, will gladly change the name and try a variety of protocols for the sort-of-address I'm requesting if it doesn't get a response to its initial request. I don't know if it's still the case, but it even used to blindly append ".com" to the name you entered in some cases, going so far as to request an entirely different domain.

                                        I don't know what name will be resolved or what protocol will be used, and it may depend on network conditions (for example, Firefox will add "www." to the URL if the server happens to be down the moment I request it).

                                        This makes the address bar unpredictable, unreliable and unsafe. It is beyond me why it has been made such a complex problem. I guess it's more forgiving? I am wary of software that so readily trades security for convenience.

                                        • pabs3 3 years ago
                                          The thing could just strip the upgrade to HTTPS anyway. Lots of tricks of that sort are implemented in sslstrip:

                                          https://github.com/moxie0/sslstrip

                                        • layer8 3 years ago
                                          Reading the article, I was just thinking there should have been an Accept-Protocol header, but now I see that a limited version of that exists as Upgrade-Insecure-Requests.
                                        • jefftk 3 years ago
                                            I don't think the advice to offer non-HTTPS is good: it exposes your users to downgrade (SSL-stripping) attacks. Even extremely old browsers support HTTPS: it was added to Netscape in 1994 and to Internet Explorer in 1995 (IE2). You shouldn't have to give up security for users of modern browsers in pursuit of backwards compatibility.

                                            (It might be a bit tricky to find an HTTPS configuration that supports both modern and extremely old browsers, but it should be possible.)

                                          • 0xbadcafebee 3 years ago
                                            I don't think it's possible to use modern HTTPS with old browsers. All the old ciphers are now insecure and obsolete. Even if you supported the old ciphers, what would be the point, since they're insecure? So just provide plain HTTP.

                                            For the majority of users, man-in-the-middle attacks (by someone other than your ISP) will never be an issue. It's mostly a theoretical problem. Your connection at home (and your laptop) is as safe as your Wifi connection. Your mobile connection is probably more secure. And there is no hacker sitting in your coffee shop waiting to p0wn your connection to Facebook or send you a 0day. HTTPS is necessary for the whole world to trust e-commerce, but saying everything has to be encrypted is ridiculous.

                                            The most likely MitM anyone will ever experience is DNS cache poisoning, and that's pretty rare.

                                            • Cloudef 3 years ago
                                              There's already a huge MitM between you and the server called cloudflare
                                            • GuB-42 3 years ago
                                              The site is compatible with IE1; HTTPS would break that. Unacceptable.

                                              And I don't believe it is possible to have an HTTPS configuration that suits both old and new browsers, since new browsers regularly deprecate older versions of SSL/TLS. I think anything less than TLS 1.2 is deprecated in many browsers now, and TLS 1.2 is from 2008, way too modern for a website focused on compatibility.

                                              For compatibility, HTTP is your only choice; secure protocols will keep being deprecated as new vulnerabilities are found and stronger protocols are made.

                                              • forgotmypw17 3 years ago
                                                I disagree completely. Sometimes backwards compatibility is more important.

                                                There are applications where you want maximum security (e.g. banking), and there are others where it is not only unnecessary but also a hindrance (art, for example).

                                                • david422 3 years ago
                                                  > not only not necessary, but also a hindrance

                                                  It's always necessary. We've learned that with plain HTTP connections, middlemen can inject adware or other crap into the page: https://www.infoworld.com/article/2925839/code-injection-new...

                                                  • Wowfunhappy 3 years ago
                                                    Google, Apple, Microsoft, Raymond Hill, and others also have this ability, even with https, depending on your OS and browser. It all comes down to who you decide to trust.

                                                    You've made a judgement call that ISPs are inherently less trustworthy than every other party in the chain, but I don't think you should make that decision for everyone else, particularly given that you don't know what ISP they have.

                                                    • forgotmypw17 3 years ago
                                                      With HTTPS connections, compatibility problems or even a clock which is set wrong can keep someone from accessing important information.

                                                      Not to mention that you are at the mercy of the SSL authorities.

                                                  • Wowfunhappy 3 years ago
                                                    Is using TLS 1.0 any better than just using http?
                                                    • xdennis 3 years ago
                                                      > Even extremely old browsers supported HTTPS: it was added to Netscape in 1994, and Internet Explorer in 1995 (IE2)

                                                      Last year I traveled to my parents' house and booted up my old computer, which I hadn't used since 2014. It was running Ubuntu 12.04, I think.

                                                      I could barely browse any websites in Firefox, because of TLS issues.

                                                    • westcort 3 years ago
                                                      My key takeaways:

                                                       1. When I say "as backwards compatible as possible," I mean that this website will be usable on as many browsers, connections, and hardware as I can reasonably support.

                                                       2. Raw HTML is Your Friend: Remember the <FONT> tag? And <TABLE>-based layouts? What about <CENTER>? I do.

                                                       3. I won't go into the technical details here (Low Tech Magazine does a far better job explaining it than I can), but the thing to know is that dithering allows you to reduce the file size of your images by reducing the amount of information it takes to render them.

                                                       4. OldWeb.Today: The most frequently used tool in my arsenal, oldweb.today is a website that allows you to emulate a number of different retro browsers (from NCSA Mosaic to Netscape Navigator) directly within your own browser.

                                                       I practice a lot of these things on my website, locserendipity.com, and on the associated search engine, which is the only one you can download with a working index, though it is limited to around a million entries.

                                                      This experimental search engine indexes all of the .edu pages on DMOZ circa 2010. It also has a local index: https://locserendipity.com/edu.html?q=amateur%20radio

                                                      • rvieira 3 years ago
                                                        This is great. I'm not a front-end dev but I have a low-traffic personal site where I like to experiment. One of my hobbies is to find out how small, lean and compatible with old browsers I can make it, while still looking modern-ish (very subjective I know) and with _some_ interactivity that can degrade gracefully.

                                                        So far it's pretty much usable in IE6 and Mosaic 2 (1993).

                                                        • ulzeraj 3 years ago
                                                          > I use the Caddy webserver

                                                           I used to love Caddy. Not so much after the move to 2.0, when they got the JSON-formatted config thing going on. Of course there is still the Caddyfile, but most of the time I'm bashing my head against their JSON-focused documentation trying to find out how I can do the same thing from a config file. I'm pretty sure it's in there somewhere, but I don't want to read a book; I just need a nice quick web server that runs from a single binary like v1 did.

                                                          • mid-kid 3 years ago
                                                             Supporting old web standards is neat, and you make your goal with this clear in your article, but I can't help but wonder if just using a subset of the newer standards isn't a better idea in the long run. HTML5 standardized a lot of the browser-specific features that plagued websites and did away with a lot of ways to do the same thing with subtly different results. Supporting all these old methods and standards for compatibility purposes is part of the reason modern web browsers are so huge (the other part being overzealous standards). So I wonder if someone couldn't go and figure out a subset of the current standards that is simple to implement, so that simpler browsers can be made for older machines, and people have an easier time making sites that run on old machines without digging into the arcane behavior of old browsers and hackily supporting both.
                                                            • adamomada 3 years ago
                                                              Warning, off-topic

                                                              > My daily driver is a mid-2012 MacBook Pro that will stop receiving all updates (security included) from Apple by the end of this year. While I personally intend to keep it alive with Linux, this isn't a path that is readily available to most people. What are most people supposed to do in this situation to save some money and avoid adding more junk to our landfills?

                                                               I’m in the same boat with the same machine. Have you heard of OpenCore Legacy Patcher? [0] The 2012 machine is apparently the cut-off for ‘everything works’, but I’m really curious about the definition of ‘works’. Also the security implications of using it, of course.

                                                              (I’m assuming OP is the author)

                                                              [0] https://dortania.github.io/OpenCore-Legacy-Patcher/

                                                              • folli 3 years ago
                                                                A lot about the "how", but very little about the "why".
                                                                • viamedia 3 years ago
                                                                   Why not serve .txt and be done with it? Sometimes I wonder who the styling is for, especially on personal sites/blogs. Repeat viewers/readers come for the text, if they come at all (otherwise they use their favorite reader).

                                                                   If someone is using a screen reader, a low-contrast/high-contrast setting, or a translator, plain txt simply works. For more intricate content like schematics or equations, offer a link to a PDF.

                                                                  I am not sure if the lack of mention of Lynx was an oversight.

                                                                  • fjfaase 3 years ago
                                                                     I started writing my personal website in 1995 and since then have not changed much about my usage of HTML. So it is really no surprise that it still looked pretty okay when I opened it in NCSA Mosaic 2. Of course, some images are missing and some JavaScript animations do not work. I do use NOSCRIPT. I use a plain text editor (which has a function to follow local links and open them) for editing my website in HTML, and I also use a program for checking the HTML.
                                                                    • projektfu 3 years ago
                                                                      Don't forget to use web-safe colors and key your gifs to the system palette. 1995-era graphics cards usually supported 256 colors.
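
                                                                        A short illustration (attribute values hypothetical): web-safe colors use only 00, 33, 66, 99, CC, or FF per channel (6^3 = 216 colors), so they map cleanly onto a 256-color system palette:

                                                                          <body bgcolor="#003366" text="#FFFFCC" link="#66CCFF" vlink="#996699">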
                                                                      • jhoechtl 3 years ago
                                                                        > Let's face it: the internet is broken.

                                                                         Again and again: it's not the internet that's broken, it's only HTML plus the required interplay of a bazillion specs.

                                                                        • ajsnigrutin 3 years ago
                                                                          For most people, that's 99% of "the internet".

                                                                           Almost all "other" protocols have been replaced by HTML over HTTP (chat applications, web interfaces for mail, forums instead of newsgroups, etc.).

                                                                        • throwaway290 3 years ago
                                                                          For some reason I thought this would reveal some shortcut for ensuring unchanging URLs even after you switch stacks etc.
                                                                          • minroot 3 years ago
                                                                            Your site has broken CSS (on mobile)
                                                                            • adamddev1 3 years ago
                                                                              Very beautiful typography. Feels quite satisfying to read.
                                                                              • pingiun 3 years ago
                                                                                 I must say I disagree. There's a reason books and magazines don't use fixed-width fonts: they read slower and less comfortably.
                                                                                • taftster 3 years ago
                                                                                   White-on-black is hard for me to read and goes blurry; I had to switch to reader mode to read this. Are you on mobile or desktop?
                                                                                  • adamddev1 3 years ago
                                                                                    Desktop (macbook air M1)