Large language models and their associated bots are bad for the indie web in at least three ways: 1) their logistical consequences are bad for bandwidth, 2) their social consequences are bad for guides, and 3) their citational consequences are bad for surfability. These consequences are worth highlighting in light of how LLM-based chatbots have been used and endorsed on the indie web. The indie web may mean different things to different people, but if we're thinking of it at all in terms of favoring small sites over corporate exploitation, then the indie web as a concept and a practice is fundamentally at odds with what LLMs are doing to the web.
Part of the inspiration for this post comes from a thread at the 32-Bit Cafe, but what has sustained the motivation is my repeated encounters with how LLMs have been put forward. Chatbots keep being suggested as a form of coding assistance in pieces like Welcome to The Web We Lost, The Internet's Hidden Creative Renaissance, and a certain website about HTML. Recently, the company that makes Firefox announced that it intends to join the corporate bandwagon by implementing all-new security hazards. By chance I found out that an indie web directory site has implemented bot-generated summaries. Then I found an upcoming indie web project and saw that it has accepted an LLM feature request from someone referring to LLMs and their ilk as "a basic need."
Running into stuff like this, repeatedly, has motivated me to put together this post.
Note that in order to distinguish itself, this post will try to avoid the more heavily-trod ground in LLM criticism. That means no descriptions of the environmental impact, no warnings about the looming economic consequences of the investment bubble, and no artistic, aesthetic, or spiritual appeals about the loss of "soul" or "humanity." As salient as those points may be, I expect you've already heard them before, and none of them are necessary to make the case that LLMs are bad for the indie web.
1) Bad For Bandwidth

LLMs are fed data from scraper bots that are notorious for overloading bandwidth, which means disrupting legitimate traffic from actual visitors and potentially driving up the cost of hosting. In extreme cases, they may even knock websites offline. Declaring your policies in a robots.txt file is not sufficient to stop them.
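For reference, here is a rough sketch of what such a robots.txt declaration looks like, using a few publicly documented crawler user-agent tokens (OpenAI's GPTBot, Common Crawl's CCBot, and Anthropic's ClaudeBot). The point stands regardless: compliance with these rules is entirely voluntary on the crawler's part.

```text
# Hypothetical robots.txt entries asking some known LLM-adjacent
# crawlers to stay out. Nothing enforces this; well-behaved bots
# honor it, and the badly-behaved ones simply ignore it.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```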
At this point there have been countless posts about this, so for those new to this issue, hereâs a selection for you:
On multiple occasions this problem has also impacted the IndieWeb wiki, which now has a dedicated page about LLM traffic.
Make no mistake, there is a distinct asymmetry at play here. Megacorporations can hammer the servers of smaller companies, hobbyist projects, public research efforts, and indie personal sites, but turnabout is not fair play. The disparity should be immediately noticeable to anyone acquainted with spurious DMCA takedowns or how Nintendo has responded to unauthorized emulators. Major IP holders get to be very fussy about policing whatever they claim as their turf, and yet now these megacorporations are being granted social license to run roughshod all over us, overloading bandwidth and chewing up the public web, regardless of permission or consent. They don't care about consent. Consent is for paupers.
To be clear, this point is not an invitation to litigate the complexities of copyright law. This is a point about inequity of interference. Even if a given website is entirely in the public domain, it still wouldnât be right for a megacorporation to scrape the thing so hard as to knock it offline. If indiscriminate scraping is a necessary condition of the industry, as the suits have claimed it is, then that means the fundamental logistics of the industry are bad for the logistics of the indie web.
2) Bad For Guides

Reliance on chatbots is bad for guides, by which I mean they undermine the living, breathing people who provide others with guidance. For many such people, developing the right frame of reference and maintaining motivation can be contingent upon connecting with and understanding their audience of learners. If those learners become more disconnected and elusive, then our guides will be the worse off for it.
Providing good guidance is not just about being knowledgeable, but about familiarizing yourself with the gap between what you know and what the learner knows, in order to identify a path between the two. Without a strong grasp of learner perspectives, a guide can end up creating a tutorial that falls short: the kind that says "it's very simple" about something that is not simple or "it's easy" about something that is not easy. This is the problem that Annie was parodying in How I, a non-developer, read the tutorial you, a developer, wrote for me.
See also the classic âdraw the rest of the owlâ:
To mitigate this problem, what you need is plenty of exposure to beginner perspectives, and beginner perspectives are what every community stands to lose out on when people are encouraged to turn to chatbots instead. Chatbots end up absorbing people's questions, obscuring them from living guides. In fact, avoiding real people can even be part of the bots' appeal, in that it means getting to dodge unpleasant exchanges with those who treat beginners poorly.
When learners overall turn elsewhere, that loss can be de-motivating to people who want to help. Plenty have spoken about how the expectation of chatbot use has undermined the sense of purpose behind writing reference materials. Take for instance the perspective of the culinary guides who are being discouraged from continuing to share their expertise:
When searching on Google for Chinese cooking traditions, a casual cook may be satisfied by the [Bot-Generated] Overview. But that may draw from The Woks of Life blog, a comprehensive English-language resource for Chinese cooking, according to Sarah Leung, one of its co-creators. Her family has spent years building out reference material on techniques, traditions and culture, she said. "[Bot] summaries have almost completely overtaken results about various Chinese ingredients, many of which had no information online in English before individual creators like us wrote about them."
The shift has her questioning whether it's worth publishing new reference guides at all. "In all likelihood, no one will ever discover those pages," she said.
Believing that no one will ever discover your articles, tutorials, walkthroughs, or reference materials can make the whole effort feel pointless, and under these conditions, people are more inclined to withdraw.
3) Bad for Surfability

Turning to chatbots for answers can result in a web that's increasingly disconnected and worse to browse. Good browsing comes from an abundance of link trails, and link trails are exactly what people are being cut off from discovering or creating when they rely on machine-generated summaries instead. This is especially detrimental for the part of the web that relies on links for surfability.
Surfability for the indie web can only come from a culture of links that allows you to click around. Reading one response post leads you to another. Opening a personal site leads you to a blogroll or a button wall. Finding a directory lets you discover a whole array of websites to explore. If exploring the indie web is what we want, as opposed to loading one single page as a novelty and then getting sucked back into a billionaire's feed, then the indie web needs this handcrafted surfability.
Surfability is exactly what we stand to lose to LLMs because LLMs are notorious for separating people from sources. The LLM-based chatbots tested in a study by the Tow Center mistook the source of a quote more than half of the time, and that's when they were directly prompted to find it. In practice, what's more likely is that synthetic text won't direct people to sources at all. Bot-generated "overviews" are reducing the click rate on search results, raising concerns about the prospect of less linking in our future. At scale, that would mean fewer trails and pathways to follow between different sites, replaced by more and more dead ends.
That looming possibility leads me to think of this segment from a video about plagiarism online:
Stephen Spinks' column is extremely moving to read and genuinely important... and no one watching [the plagiarist's] video had the chance to learn his name. [The plagiarist] made a lot of money repeatedly re-uploading a video about the erasure of queer people, and he did it by erasing queer people. [...]
Good writing about queer living is hard to find and easy to lose, and in obscurity, it becomes even easier to pretend it was yours. None of the money [the plagiarist] makes will go to the people who wrote the great lines his viewers enjoyed. They get to rot in the very obscurity he pretends to criticize.
– Harry Brewis, Plagiarism and You(Tube), "The Cost"
Compounding obscurity is one of the risks we face from an increasing reliance on chatbots. When people don't get told where things come from in the first place, they miss out on the chance to cite them, which means missing out on the chance to link them, which results in pages with fewer links, which means fewer pathways available to surf the web: a web that becomes less of a web, increasingly threadbare, disconnected, and frayed.
Handcrafted Overview

LLMs and scraper bots are detrimental to the indie web in many ways. They are bad for bandwidth, bad for guides, and bad for surfability. This isn't an exhaustive list of all their harms, just some of the ones most salient to the creation, maintenance, and exploration of personal websites. To the extent that the indie web aligns itself with collaborative values, small personal sites, and a DIY ethos of curiosity and exploration, it is conceptually at odds with extractive corporate technologies that sap our resources, obfuscate our guides, undermine link culture, and discourage us from sharing.
Reply via Dreamwidth (no account required), Pillowfort, Webmention, or Email.
Shared to This Week in the IndieWeb and Octothorpes: indieweb, llm, ai.
Have you linked to this elsewhere? Let me know.
Places where this has been linked:
Re: by Khürt Williams, sent in to IndieNews and shared by Nicholas Ferrell
On Defining the Indie Web By Its Enemies

Khürt Williams has written a response post noting that my post is worth reading and that it does contain substantive points, which I appreciate. Unfortunately that post also describes my opening paragraph as "redefining" the indie web, calls for foregrounding the positive, and argues against some more extreme takes that do not precisely reflect what appears in this post. What appears in this post is "if we're thinking of [the indie web] at all in terms of favoring small sites over corporate exploitation," a conceptualization which already foregrounds the positive and is too precedented to warrant being called a redefinition on my part.
For instance, some examples that predate this post:
Compared to some of these, my own choice of words is relatively circumspect.
Regardless, oppositional definitions do not in principle trouble me any, and for a concept like the indie web I don't think they warrant any particular handwringing.
For further discussion of how or how not to define the indie web, the more relevant post would be Which Part of the Indie Web Ethos is the Bigger Priority.
– Coyote, February 8th, 2026
Related reading:
The broken underline effect for links on this page was created by combining gradient underlines with gradient stripes. Did you notice that they can be repaired?
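As a rough illustration of the technique described above (a sketch under my own assumptions, not this page's actual stylesheet), a "broken" underline can be built by layering a striped gradient over a solid underline gradient:

```css
/* Hypothetical sketch: the bottom layer draws a solid underline,
   and a repeating striped gradient on top punches breaks into it.
   The class name, colors, and sizes here are all assumed. */
a.broken-underline {
  text-decoration: none;
  background-image:
    /* top layer: stripes of the page background create the gaps */
    repeating-linear-gradient(to right,
      transparent 0, transparent 0.5em,
      white 0.5em, white 0.7em),
    /* bottom layer: the underline itself */
    linear-gradient(to right, currentColor, currentColor);
  background-repeat: no-repeat;
  background-size: 100% 2px;
  background-position: bottom left;
}
```

Because the gaps come from an overlaid stripe layer rather than from the underline itself, "repairing" the line is as simple as hiding or recoloring the stripe layer, for example on hover.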
Coyote is a blogger and essayist interested in how to build a better web. You can find its contacts and more of its essays at Coyote's Link Hub.