AI is collapsing the cost of building for the web. When anyone can generate a website by describing what they want, the tools we use stop mattering. What matters is what we build them on. Here are eight assumptions about web content I believe will still be true in ten years — and they're the assumptions I built Capuzzella on.
The Authoring Layer Keeps Changing. The Output Never Does.
The web started with HTML. A researcher at CERN wrote angle brackets into a text file, pointed a server at it, and the web was born. That was 1991. In the thirty-five years since, the way we author web content has been rewritten half a dozen times. The way we deliver it has not.
In the early days, you wrote HTML by hand. Every page was a static file. If you wanted to change a heading, you opened the file, edited the text, and saved it. This worked until sites grew beyond a handful of pages and non-technical people needed to publish.
Server-side scripting arrived in the mid-1990s — PHP, Perl CGI, ASP. Pages were no longer static files but programs that assembled HTML on the fly. A database held the content, a template held the layout, and the server stitched them together on every request. The output was still an HTML file sent over HTTP. But now the authoring layer had a database, a programming language, and a deployment process.
Then came the CMS era. WordPress launched in 2003 and gave non-developers an admin panel to write and publish content without touching code. Drupal (which actually predates it), Joomla, and dozens of others crowded the space. The authoring layer now included user accounts, WYSIWYG editors, plugin ecosystems, and theme marketplaces. The output was still HTML over HTTP.
JavaScript frameworks changed the picture again. jQuery in 2006, then Angular, React, and Vue turned the browser into an application runtime. Pages became interactive. But the initial delivery mechanism remained: an HTML document, fetched over HTTP, rendered by a browser.
The headless CMS movement of the 2010s separated content from presentation entirely. Contentful, Strapi, Sanity — they stored content as structured data behind an API and let frontend teams consume it however they wanted: websites, mobile apps, digital signage. The authoring layer became an API. The output, when it reached a browser, was still HTML.
By the early 2020s, the machinery between "someone writes a sentence" and "a visitor sees a web page" had become staggeringly complex. A single content change could trigger a multi-stage CI/CD pipeline — linting, unit tests, integration tests, system tests, staging deployments, visual regression checks, accessibility audits, performance budgets, cache invalidation across a CDN — all orchestrated by YAML files longer than the content itself. Teams employed full-time DevOps engineers just to keep the publishing pipeline alive. The authoring layer had become a deployment monster. The output was still a handful of HTML files.
And now, AI. Large language models can read HTML, understand its intent, and rewrite it based on a natural-language instruction. The authoring interface becomes a conversation. You describe what you want, and the HTML changes. No admin panel, no template language, no drag-and-drop builder. Just intent in, markup out.
Six eras. Six completely different authoring paradigms. One constant output: an HTML file served over HTTP, rendered by a browser. Every generation of tooling was convinced it had found the definitive way to create web content. Every generation was replaced. The delivery layer never was.
That pattern made me think about what will actually remain long-term. Not which framework will win, or which AI model will be best, but what truths about web content will still hold a decade from now. If you're building something meant to last, you need to build on axioms, not trends.
Eight Axioms for the Future of Web Content
1. Entities will always need a place to publish
Companies, organizations, governments, individuals — any entity that wants to communicate with the world needs a place to put that communication. This has been true since the first business put up a sign outside its door. The web made it global.
Social media platforms offer reach, but they don't offer ownership. Your Instagram profile belongs to Meta. Your LinkedIn page operates under Microsoft's rules. Algorithms decide who sees what. A website is the one place on the Internet where you control the message, the design, the structure, and the availability. That need doesn't go away because AI can generate content faster. If anything, it intensifies — when everyone can publish, having a place that's distinctly yours matters more.
2. Trust is anchored by domains
When you visit apple.com, you trust that Apple controls what's on that page. That trust isn't based on the content itself — it's based on the domain name and the infrastructure behind it: DNS, TLS certificates, ICANN governance. This system is imperfect, but it's the best mechanism the Internet has for tying digital content to real-world identity.
AI-generated content makes this more important, not less. When anyone can produce professional-looking text and imagery in seconds, the question "who published this?" becomes the only reliable signal. Domains answer that question. They are the trust layer of the web, and no technology on the horizon replaces them.
3. The more ways content is accessible, the better
A web page viewed in a browser is one delivery channel. The same content served through an API can feed a mobile app, a voice assistant, a search engine snippet, an AI agent, or a system that hasn't been invented yet. Content locked inside a single rendering pipeline is content with an expiration date.
The headless CMS movement understood this. But you don't need a headless CMS to achieve it — you need content that's stored in a format machines can read and an interface that lets them request it. HTML is already machine-readable. An API on top of it extends the reach. The principle is simple: don't trap content behind a single interface.
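The principle can be shown in a few lines. This is a minimal sketch, not any particular CMS's implementation: the same stored HTML exposed through a second interface, a JSON representation built with nothing beyond Python's standard library. The JSON shape (`title` and `text` fields) is an illustrative assumption.

```python
from html.parser import HTMLParser
import json

class TextExtractor(HTMLParser):
    """Collect the <title> and the visible text of an HTML document."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.text_parts.append(data.strip())

def as_json(html: str) -> str:
    """Second delivery channel: the same content as structured data."""
    parser = TextExtractor()
    parser.feed(html)
    return json.dumps({"title": parser.title, "text": " ".join(parser.text_parts)})

page = "<html><head><title>Hello</title></head><body><p>Plain HTML travels well.</p></body></html>"
print(as_json(page))
```

The HTML file remains the canonical store; the JSON view is derived on demand, so a mobile app, a search indexer, or an AI agent can consume the same content without the browser's rendering pipeline.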
4. Speed is a feature, not an optimization
Users have always rewarded fast-loading pages — with attention, with trust, with conversions. Google has used page speed as a ranking signal since 2010. This isn't a trend; it's a consequence of human impatience, which is biological. The average visitor forms a judgment about your site in under a second. If the page hasn't loaded by then, the judgment is made for you.
Human attention is finite and getting more contested every year. There are still only 24 hours in a day and only a few seconds of initial attention per page visit. In an ocean of AI-generated content competing for that attention, the page that loads in 200 milliseconds will always beat the one that loads in 4 seconds. Every layer of abstraction between content and delivery is latency you have to justify. Most of the time, you can't.
5. When production costs fall, brand becomes the differentiator
This is an economic argument, and it applies far beyond the web. When the cost of producing something drops dramatically, supply increases and the product itself becomes commoditized. What differentiates one offering from another shifts from capability to identity.
AI is doing this to web content right now. Generating a competent landing page takes minutes, not weeks. The barrier to having a web presence is approaching zero. When everyone has a website, the website itself isn't the advantage. The brand is. The visual identity, the tone of voice, the consistency across touchpoints — these become the moat. Companies that invest in brand will stand out. Companies that treat their website as a commodity will look like everyone else.
6. Humans are visual. Design stays central.
We process visual information faster than text. We form impressions of a website in 50 milliseconds. We judge credibility by design before we read a single word. This isn't a cultural preference — it's biology. Visual processing occupies roughly 30% of the human cortex.
AI can generate layouts and color schemes, but it can't replace the strategic decisions behind a visual identity: what a brand should feel like, how it should differ from competitors, what emotions it should evoke. Corporate design — the systematic application of visual identity across every touchpoint — will remain a human-driven, strategically important discipline. The tools will get better. The need for intentional design won't diminish.
7. Brand communication requires collaboration
A company's website isn't written by one person. Marketing writes the copy. Design sets the visual direction. Legal reviews the claims. Product provides the specifications. Leadership approves the messaging. This is true at a five-person startup and at a fifty-thousand-person enterprise.
AI accelerates the creation of content, but it doesn't eliminate the need for review. If anything, it increases it — when content can be generated in seconds, the bottleneck shifts from production to approval. Workflows that let teams draft, review, and publish together aren't overhead. They're how organizations maintain quality and coherence when the speed of content creation outpaces the speed of human judgment.
8. Content outlives the system that created it
A blog post written in WordPress in 2005 might still be valuable in 2030. But the WordPress version that created it is long gone. The plugins it depended on have been abandoned. The PHP version it ran on is end-of-life. If that content was locked inside a proprietary database schema or a rendering engine that no longer exists, it's gone too.
This has happened before. Proprietary word processors died and took people's documents with them. Flash content vanished when browsers dropped the plugin. Every closed format eventually becomes a trap. Content stored as plain HTML in files on a filesystem has no such dependency. It can be read by any text editor, served by any web server, processed by any program. It's the most portable format the web has ever produced — and portability is what gives content a lifespan longer than the tools that made it.
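The portability claim is easy to demonstrate: plain HTML files on a filesystem stay useful to any text-capable program, with no database, plugin, or runtime in between. The sketch below, using only Python's standard library, indexes the titles of every `.html` file under a directory; the directory layout and the regex are illustrative assumptions.

```python
import pathlib
import re
import tempfile

def index_titles(root: pathlib.Path) -> dict[str, str]:
    """Map each .html file under root to its <title>, stdlib only."""
    index = {}
    for path in sorted(root.rglob("*.html")):
        text = path.read_text(encoding="utf-8")
        match = re.search(r"<title>(.*?)</title>", text, re.IGNORECASE | re.DOTALL)
        index[path.name] = match.group(1).strip() if match else "(untitled)"
    return index

# Demo: create two throwaway pages and index them.
with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp)
    (root / "a.html").write_text("<html><head><title>First post</title></head></html>")
    (root / "b.html").write_text("<html><body>No title here</body></html>")
    print(index_titles(root))
```

A shell one-liner, a static file server, or a script in any language could do the same job, which is exactly the point: the content carries no dependency on the tool that created it.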
Will Even These Axioms Hold?
Honest answer: I don't know. It's worth asking what could break them.
Could entities stop needing a place to publish? Only if communication itself becomes unnecessary. Could domains lose their trust function? Perhaps, if a better identity layer emerges — blockchain-based naming, decentralized identity, something we haven't imagined. Could brand stop mattering? Only if humans stop being influenced by aesthetics and identity, which would require a change in neurobiology, not technology.
The point isn't that these axioms are guaranteed to hold forever. The point is that they're more durable than any specific technology. Betting on "React will be the dominant framework" is a five-year bet at best. Betting on "entities need a trusted, branded place to publish accessible content" is a bet on human nature and economic fundamentals. Those change slowly.
Build on Truths, Not Trends
The web's authoring layer will be rewritten again. The AI tools we use today will be replaced by better ones. The frameworks, the runtimes, the deployment platforms — all of it will churn. That's not a risk. That's the pattern.
What doesn't churn is the underlying need: entities that want to publish, audiences that need to trust, content that should be accessible everywhere, brands that must differentiate, designs that must resonate, and teams that need to collaborate before hitting publish.
If you're building a product, a company, or even just a personal website, anchor it to the things that don't change. Let the tools evolve around you. The output is still the same — an HTML file, served over HTTP, rendered by a browser. It has been for thirty-five years. I'd bet on another thirty-five.