Engineering for the Agentic Web When 50% of Your Traffic is Robots — Janna Malikova at AI Engineer Melbourne 2026

Five years ago, a website's analytics were simple: count the humans who visited. Maybe worry about malicious bots, but mostly you were measuring human attention. Today, something fundamental has shifted. Half the traffic to a typical website is no longer human. It's AI systems: unknown browsers, automated agents, LLM pipelines ingesting content, and search crawlers that have become active participants in the internet rather than passive indexers.

This isn't a transition period that will pass. This is the new baseline. And it creates an entirely new engineering challenge: how do you present your content to these non-human agents? What do they need? What should they see? How do you speak to both humans and machines simultaneously?

The Death of Traditional SEO

The traditional web optimization story was about getting human attention. You wrote for search engines to some degree—keywords, metadata, structure—but ultimately you were optimizing for humans to find and read your content. SEO was about speaking the language that search algorithms understood so humans could find you.

But the agents visiting your site aren't just search engines anymore. They're AI systems making decisions. They're language models ingesting context for reasoning. They're automated agents trying to complete tasks on behalf of users. The metadata and structure you expose still matter, but not in the old way. You're not trying to convince a ranking algorithm to place you higher in results. You're trying to give a reasoning system the information it needs to make good decisions and take useful actions.

An LLM agent visiting your site to help a user complete a task doesn't care if it's ranked #1. It cares whether the information is clearly structured, unambiguous, and specific. It cares whether it can find exactly what it needs without wading through noise, and whether the content is actionable.

This is a fundamental reframing of how a business presents itself online. You're not writing for Google anymore. You're writing for GPT-4 and Claude and systems that don't exist yet. You're not optimizing for click-through rates. You're optimizing for reliable agent interactions.

Restructuring for Agents

What does this look like in practice? First, it means identifying what traffic is agent traffic. Not all unrecognized visitors are agents. Some are humans with unusual browser strings; some are legitimate tools you want to keep working. But understanding your traffic composition is the first step. Are you really at 50%? 30%? 70%? What kinds of agents are visiting?
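A rough first pass over your server logs gives you that baseline. The sketch below is illustrative only: the token list is incomplete, user-agent strings can be spoofed, and a real audit would combine this check with published crawler IP ranges and behavioural signals.

// Minimal sketch: estimate what share of logged requests come from known crawlers and AI agents.
// The token list is illustrative, not exhaustive, and UA strings alone are a weak signal.
const AGENT_TOKENS = ["gptbot", "claudebot", "perplexitybot", "ccbot", "bytespider", "googlebot", "bingbot"];

function isLikelyAgent(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return AGENT_TOKENS.some((token) => ua.includes(token));
}

function agentShare(loggedUserAgents: string[]): number {
  if (loggedUserAgents.length === 0) return 0;
  return loggedUserAgents.filter(isLikelyAgent).length / loggedUserAgents.length;
}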

Once you know, you can make deliberate choices about what information to expose and how. Consider a restaurant website. A human visitor wants to see pretty photos and read descriptions. An agent visiting to help a customer make a reservation needs precise hours, accurate phone numbers, clear pricing, and unambiguous location information. Those are different needs, and you have to serve both. You can structure the page so that the machine-readable information is clearly available rather than buried in styled text.
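One established way to do that is schema.org structured data in the page head. The sketch below is a minimal example using the Restaurant vocabulary; every value is a placeholder, and the properties worth publishing are the ones agents actually need: hours, phone, price range, location.

// Minimal sketch: the reservation-relevant facts as schema.org JSON-LD.
// All values are placeholders.
const restaurantFacts = {
  "@context": "https://schema.org",
  "@type": "Restaurant",
  name: "Example Bistro",
  telephone: "+61 3 0000 0000",
  priceRange: "$$",
  address: {
    "@type": "PostalAddress",
    streetAddress: "1 Example Street",
    addressLocality: "Melbourne",
    addressRegion: "VIC",
    postalCode: "3000",
    addressCountry: "AU",
  },
  openingHoursSpecification: [
    {
      "@type": "OpeningHoursSpecification",
      dayOfWeek: ["Wednesday", "Thursday", "Friday", "Saturday"],
      opens: "17:30",
      closes: "22:00",
    },
  ],
};

// Typically serialised into a <script type="application/ld+json"> tag alongside the human-facing markup.
const jsonLdTag = `<script type="application/ld+json">${JSON.stringify(restaurantFacts)}</script>`;

The human-facing page keeps its photos and prose; the structured block gives an agent the same facts without having to parse them out of styled text.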

This extends to every page and every interaction. When you expose your API documentation, agents will read it. When you publish pricing, agents will reference it. When you describe your product, agents might generate marketing copy from it. The question isn't whether they'll interact with your content—they will. The question is whether you help them interact with it accurately or whether you force them to guess and hallucinate.

The omni-channel aspect is crucial too. Content that works well for human readers might not work for agents. An image with beautiful alt-text might not help an agent that needs structured JSON. A marketing page written with personality might confuse a system trying to extract facts. You need multiple representations of the same information, optimized for different consumers. Some of that can be automated—extract structure, publish it clearly, let systems use it. Some of it requires deliberate content strategy.
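One way to keep those representations from drifting apart is to derive them from a single source of truth and choose the format per request. The handler below is a hypothetical, framework-agnostic sketch; the Accept-header check is deliberately simplistic, and a production version would do proper content negotiation.

interface ProductFacts {
  name: string;
  priceAud: number;
  inStock: boolean;
}

// Human-facing rendering of the facts.
function renderHtml(facts: ProductFacts): string {
  const availability = facts.inStock ? "in stock" : "out of stock";
  return `<h1>${facts.name}</h1><p>AUD ${facts.priceAud.toFixed(2)} (${availability})</p>`;
}

// Hypothetical handler: clients that ask for JSON get the raw facts,
// everyone else gets the rendered page. Both come from one object.
function respond(acceptHeader: string, facts: ProductFacts): { contentType: string; body: string } {
  if (acceptHeader.includes("application/json")) {
    return { contentType: "application/json", body: JSON.stringify(facts) };
  }
  return { contentType: "text/html", body: renderHtml(facts) };
}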

Learning from Reality

Janna Malikova has been living on the frontier of this shift. Working as a software engineer in Brisbane, she's observed these traffic changes happening in real time and built systems to handle the implications. Her work in frontend performance and security intersects directly with this challenge: how do you serve two very different audiences simultaneously without breaking your site for either one?

The lessons she's learned go beyond just technical implementation. They're about recognizing that the web is changing. The era of human-only traffic and traditional SEO is over. The new era is one where your website needs to be legible to autonomous agents—not instead of humans, but alongside them. It's about re-architecting how a business presents itself for an AI-driven internet.

This doesn't mean adding special robot-only pages or creating a hidden API-first interface. It means designing for transparency and structure throughout. It means thinking of your website as a data source as much as a communication medium. It means recognizing that agents and humans want different things, and good design serves both.

Janna Malikova will be sharing insights from this experience—including how to identify agent traffic, realign your context strategy, and design for the agentic web—at AI Engineer Melbourne 2026, June 3-4.
