I’m not especially on board with the idea that maintaining an independent website means that site has to look like something from the 2000s. The design of my site isn’t deliberately old-school; I don’t think it has massive 2024 vibes either, it’s just … whatever’s required to make it look reasonable. It has a built-in dark mode that follows device preferences, everything is mobile-friendly because most of my casual internet browsing happens on my phone, and I occasionally make use of bizarre nested :has() and :not() selectors – yes, I’m a CSS3 simp. That said, I try to avoid JavaScript where possible, and I try to remember to use <noscript> when I do implement it, but that’s more because I want to minimise resource usage than to make some kind of statement about the aesthetic of the old web (whatever that may consist of).
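For the curious, the dark mode is nothing cleverer than a media query plus custom properties, and the selector abuse looks something like this – a simplified sketch, not my actual stylesheet:

```css
/* Simplified sketch, not a verbatim excerpt from my stylesheet. */

/* Light palette by default… */
:root {
  --bg: #fdfdfd;
  --fg: #222;
}

/* …and a dark one when the device asks for it. */
@media (prefers-color-scheme: dark) {
  :root {
    --bg: #1c1c1c;
    --fg: #ddd;
  }
}

body {
  background: var(--bg);
  color: var(--fg);
}

/* The sort of nested :has()/:not() nonsense I mean:
   a border on figures that contain a caption, unless
   the figure sits inside an aside. */
figure:has(figcaption):not(aside figure) {
  border: 1px solid var(--fg);
}
```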
Overall, I suppose I want my website to offer a pleasant browsing experience to humans. I don’t want it to extend that same kindness to crawlers and scrapers, on the other hand; it’s like the old classic “I don’t want to see or be seen by straight people”, but instead of “straight people” it’s “search engines”. The location / block in my nginx configuration includes add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive"; now I just have to hope the robots will be nice to me and do as they’re told 😌. I also recently added add_header Referrer-Policy "no-referrer", which, if I’ve succeeded in understanding increasingly obscure StackOverflow posts, should mean outgoing links from my site won’t tell other sites’ analytics services that it’s the source of their traffic. I just think websites should be allowed to exist without informing others of their existence, just as humans should be allowed to look at websites without having their visits logged. If I ever put analytics on my site, please call the police, because I’m acting under duress.
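Put together, the relevant bit of the config looks roughly like this (a trimmed sketch; server names, TLS and everything else omitted):

```nginx
server {
    # listen, server_name, TLS and so on elided …

    location / {
        # ask well-behaved robots not to index, follow, snippet or archive
        add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
        # don't send a Referer header with outgoing navigations
        add_header Referrer-Policy "no-referrer";
    }
}
```

One nginx quirk worth knowing: add_header directives at a given level replace any inherited from the level above rather than adding to them, so both headers need to live in the same block.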
Mostly via YASnippet, I’m marking up new posts with microformats, which makes it easier to bridge them to the fediverse and generally comply with independent web principles; I also went through every page and manually (sob) added OpenGraph meta elements to the page headers, so link previews show the title and an image when I’m sharing links to my site elsewhere. Discord embeds include the theme colour, which is neat, but leave out the description, possibly because I’ve combined name="description" and property="description" into the same tag, although I’m told this is a normal thing to do. Pleroma only seems to work if the trailing slash is included, which I suppose isn’t too great an inconvenience. Facebook works very well (not that I would ever share a link to my site on Realname Internet); Tumblr and Bluesky are similar.
Re Tumblr … I’ve been experimenting with it a lot recently. The announcement that they’re moving it to the WordPress backend perhaps suggests that they’re finally going to come good on that promise of fediverse integration from like two years ago. I had a wee go at integrating microformats into a Tumblr theme and bridging it to the fediverse via Bridgy Fed, but switched it off after a few days, because it tends to obfuscate sources and doesn’t play well with the NPF (although, to be honest, what does). Obviously Tumblr is bad because it includes adverts and sells everyone’s data to A “I” developers and disables custom themes on new blogs by default; but the custom theme editor is much better than Dreamwidth’s, imo, the option to turn off reblogs on individual posts independently of whether those posts are public is a plus point over the fediverse services I’m familiar with, and the general customisation options are still far ahead of most other “sign up for a free account on this platform” services.
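The theme experiment, for what it’s worth, amounted to little more than sprinkling microformats classes over the standard Tumblr template tags – something in this vein, reconstructed from memory, so treat the specifics as approximate:

```html
<!-- Approximate reconstruction from memory, not my actual theme. -->
{block:Posts}
  <article class="h-entry">
    {block:Text}
      {block:Title}<h2 class="p-name">{Title}</h2>{/block:Title}
      <div class="e-content">{Body}</div>
    {/block:Text}
    <a class="u-url" href="{Permalink}">permalink</a>
  </article>
{/block:Posts}
```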