Artificial Intelligence Archives - Analytics Platform - Matomo (https://matomo.org/blog/category/artificial-intelligence/)

Matomo announces new chatbot tracking in its AI Assistants suite, offering comprehensive insights into AI traffic
https://matomo.org/blog/2026/03/new-feature-matomo-ai-assistants-tracking/ (Wed, 18 Mar 2026)

With the 5.8.0 release of its platform, Matomo introduces new capabilities designed to help organisations better understand and measure the growing impact of AI assistants on website traffic. The new version provides efficient, privacy-first tools that help teams better understand their traffic in the age of AI and generate stronger insights for decision-making.

Wellington, 18 March 2026 – Matomo, the world’s leading privacy-first and ethical open-source analytics platform, announced the release of Matomo 5.8.0, the latest version of its web-analytics platform, bringing new capabilities to enhance analytics insights in the age of AI. With this release, Matomo introduces AI chatbot tracking, a new report within its AI Assistant tracking capabilities. This report helps organisations understand how AI chatbots interact with their websites. Combined with the existing AI Agents report, it provides a clearer view of how AI tools access and analyse website content.

AI chatbots are redefining web analytics

Nowadays, companies face a new challenge: distinguishing between human visitors and interactions generated by AI chatbots. Without this differentiation, web analytics data becomes increasingly unreliable, affecting executive decision-making.

AI web traffic is not limited to a single type of interaction. It includes AI agents, which can autonomously browse or perform actions on websites; AI chatbots, which access and analyse website content to generate responses within their interfaces; and AI referrals, where users visit a website after clicking on links suggested by AI tools.

Matomo users can identify AI referrals directly in their acquisition reports, helping them understand how AI platforms influence website traffic and search visibility.

With its AI Assistants solution, Matomo offers dedicated tracking for AI chatbots and AI agents, providing businesses with a clearer picture of how AI interactions impact website traffic. This prevents AI interactions from distorting marketing attribution, traffic metrics, and conversion reporting.

“Without visibility into AI-driven traffic, analytics data becomes less reliable and less actionable. This challenge affects every team that relies on website data, from marketing to leadership. It’s crucial to be able to distinguish between human visitors and AI assistants in order to make informed business decisions,” said Matthieu Aubry, CPO and Co-founder of Matomo.

This release is part of Matomo’s broader strategy to position itself as a leader in AI-driven web analytics. Built on a privacy-first foundation, Matomo enables organisations to understand both AI and human engagement on their websites while remaining fully compliant with data protection regulations. This privacy-by-design approach ensures that analytics insights remain accurate, trustworthy, and aligned with modern data expectations.

Privacy-first analytics built for the age of AI

As AI chatbots and automated agents increasingly interact with websites, many analytics tools struggle to interpret this traffic because they rely heavily on cookies and user identifiers, mechanisms that AI systems often reject or bypass. Built on first-party data and transparent measurement, Matomo’s privacy-first architecture allows organisations to detect and analyse these interactions more reliably. This provides clearer visibility into how AI systems access, read, and engage with website content.

“As a privacy-first analytics platform, Matomo is independent from third-party cookies and invasive tracking, which is totally adapted to the requirements of the AI era. This release reinforces our human-first analytics approach, enabling our clients to accurately detect AI traffic while maintaining trustworthy web analytics results,” said Adam Taylor, CEO of Matomo.

About Matomo:

Matomo is a leading web analytics platform that helps organisations understand their audience and gives them full control over their data. More than 1.4 million websites in over 190 countries use Matomo to improve digital experiences and make confident decisions. 

Available on-premise or in the cloud, Matomo combines powerful analytics with data ownership, flexibility, and privacy through its open-source foundation. 

More information: Matomo.org

Media contact

Elise Duchateau
Head of Marketing and Communications
Matomo (InnoCraft Ltd.)
marketing@matomo.org

From humans to AI agents: understanding the new web traffic
https://matomo.org/blog/2026/03/humans-agents-understanding-ai-web-traffic/ (Mon, 16 Mar 2026)

With AI Assistants being an integral part of our private and professional lives, many website owners and marketers wonder how these systems affect traffic.

Often, their organic traffic is flat. But their content keeps showing up in ChatGPT answers. Something is clearly happening, but it’s not reflected in their analytics.

This is the new normal for a lot of teams. AI systems are interacting with websites in fundamentally different ways: some send real visitors, some read your content quietly in the background, and some never send anyone at all.

Understanding the difference is the first step to making sense of what you’re seeing:

  1. The different types of AI systems interacting with websites
  2. The difference between human visitors and automated traffic

Once you know this, tools like Matomo can help you measure what’s happening.

Understand the different types of AI on the web

When people talk about “AI traffic,” they often mix very different technologies together.
Not all AI systems behave the same way — and they affect your website in different ways.

Understanding these categories already removes much of the confusion around “AI traffic.”

Here are four types you’re likely to encounter.

AI chatbots: answer engines for users

These are tools like:

  • ChatGPT
  • Gemini
  • Perplexity
  • Claude
  • AI-powered search assistants

Users type questions and receive answers written by the AI.

Sometimes these answers include links to sources. When a user clicks one of those links, they visit your website.
In analytics, this appears as referral traffic.

AI chatbots can also influence traffic when they’re not sending visitors. This happens when the AI provides a full answer inside its interface, and users don’t see the need to click the source link. In some cases, AI chatbots don’t even add a source link to their output. Both cases result in what is known as zero-click behaviour. Your content may still be used as a source, but no visit happens. And while technology can’t track human visits that aren’t happening, there are solutions to track non-human visits, performed by AI crawlers, scrapers and agents.

AI crawlers: automated content readers

AI companies also operate automated programs that read websites. These are called crawlers.

They visit pages automatically to:

  • Discover content
  • Collect information
  • Update AI systems

These visits are not human. They’re automated requests made by software.
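In practice, these automated requests can often be spotted in server access logs by their user-agent strings. The sketch below is a minimal illustration, assuming a combined-log-format access log; the list of crawler signatures is a small, illustrative sample of publicly documented AI crawlers, not an exhaustive one.

```python
# Sketch: flag likely AI-crawler requests in a web-server access log by
# user-agent substring. The signatures below are illustrative examples of
# publicly documented AI crawlers; real lists change and need maintenance.
AI_CRAWLER_SIGNATURES = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known AI crawler."""
    return any(sig.lower() in user_agent.lower() for sig in AI_CRAWLER_SIGNATURES)

def split_log_lines(lines):
    """Partition combined-log-format lines into (ai_requests, other_requests).

    Assumes the user agent is the last quoted field on each line.
    """
    ai, other = [], []
    for line in lines:
        # rsplit at the last two quotes isolates the final quoted field (the UA)
        ua = line.rsplit('"', 2)[-2] if line.count('"') >= 2 else ""
        (ai if is_ai_crawler(ua) else other).append(line)
    return ai, other
```

Because signatures drift over time, this kind of manual list is best treated as a quick diagnostic; dedicated tracking, such as Matomo’s AI Assistant reports, maintains this classification for you.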

AI scrapers: targeted data collectors

Scrapers are similar to crawlers but more selective. Instead of reading entire websites, they extract specific pieces of content, such as:

  • Article text
  • Headlines
  • Product details
  • Structured data

This data may be used for training AI models or generating answers. Again, these visits are automated.

AI agents: autonomous digital assistants

A newer category is AI agents. Agents are designed to perform actions on behalf of users.
For example, an AI agent might:

  • Search multiple websites
  • Compare products
  • Fill out forms
  • Complete tasks online

You might ask yourself how AI agents differ from AI chatbots. The difference is that AI chatbots require user prompts for each step, while AI agents can act autonomously once given an initial instruction.

One important detail: AI systems can play multiple roles.

The same AI ecosystem can behave in different ways. For example, a chatbot may send human visitors when users click links, the same company may run crawlers that read your content automatically, and some systems may fetch pages in real time while generating answers.

The key difference for analytics is simple: who initiated the visit — a human or an automated system?

Overview of AI types and what they do

| AI type  | What it does                | How it affects traffic                   |
|----------|-----------------------------|------------------------------------------|
| Chatbots | Answer user questions       | May send human visitors or reduce visits |
| Crawlers | Automatically read websites | Generate automated traffic               |
| Scrapers | Extract specific content    | Generate automated traffic               |
| Agents   | Perform tasks online        | May resemble human sessions              |

How AI changes website traffic

Imagine you run a blog about marketing tools. Over time, you might notice several subtle changes:

  • Some informational blog posts receive fewer visits because AI tools answer basic questions directly.
  • Traffic patterns shift, with different landing pages receiving visits compared with previous months.

These different interactions can make traffic patterns look unusual at first glance. But once you understand the different actors, the effects become easier to interpret.

AI influences website traffic in three main ways:

AI sending real visitors

When users click links inside AI chatbots, they arrive on your website like any other visitor.
In Matomo, this traffic is visible in the Acquisition report, appearing as a dedicated referrer channel type. In a dedicated report, you can even see the metrics for multiple chatbots.

AI reducing clicks (zero-click behaviour)

Sometimes AI tools answer a question completely inside their interface. Users get the information they need without visiting the website. This means your content still influences the answer, but the visit never happens.

As a website owner or marketing team, over time you may notice fewer visits to informational content or changes regarding which landing pages are visited.

While analytics can’t measure visits that never occur, you can monitor visit trends over time, to get an understanding of the shifts that are happening. And keep in mind that zero-click behaviour doesn’t necessarily mean your content is less relevant. In many cases, it means the content is summarised or referenced by AI systems instead of generating direct visits.

To understand these shifts, it’s useful to monitor changes in landing pages, queries, and referral sources over time.
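As a rough illustration of such trend monitoring, the sketch below compares monthly visit counts for two informational landing pages; the page paths and numbers are invented for the example.

```python
def pct_change(previous: float, current: float) -> float:
    """Percentage change from the previous period to the current one."""
    return (current - previous) / previous * 100.0

# Hypothetical monthly visits for two informational landing pages.
visits = {
    "/blog/what-is-web-analytics": {"2026-01": 1200, "2026-02": 980},
    "/blog/how-to-read-reports":   {"2026-01": 800,  "2026-02": 810},
}

for page, months in visits.items():
    change = pct_change(months["2026-01"], months["2026-02"])
    print(f"{page}: {change:+.1f}%")
```

A sustained decline on informational pages, while other pages hold steady, is the kind of pattern that can indicate zero-click behaviour rather than a general traffic problem.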

AI generating automated traffic

Crawlers, scrapers, and some agents generate non-human visits. In many popular analytics solutions, these visits remain untracked and invisible. This is where Matomo comes in: it offers visibility into AI traffic from several reporting angles.

How Matomo helps you stay oriented

When traffic patterns change, the goal is simple: separate signal from noise. To do this, start with the following quick check:

Quick check: how to spot AI-related traffic in Matomo

  1. Look for AI chatbot referrals: Go to Acquisition → Referrals and check whether AI platforms appear as traffic sources.
  2. Monitor landing page trends over time: If AI tools answer questions directly, visits to informational pages may decline. Compare traffic patterns over time.
  3. Inspect automated AI traffic: Use AI Assistant tracking to see visits and engagement metrics for AI chatbots and AI agents.
  4. Focus on long-term patterns: AI-related changes usually appear gradually. Comparing months or quarters helps reveal meaningful trends.
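For teams that prefer to automate the referral check, Matomo also exposes its reports through an HTTP Reporting API. The sketch below builds such a request; the host, site ID, and token are placeholders, and the exact labels returned depend on your Matomo version and configuration.

```python
# Sketch: query Matomo's HTTP Reporting API for referrer types.
# Host, site ID, and token below are placeholders, not real credentials.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

MATOMO_URL = "https://analytics.example.com/index.php"  # placeholder host

def build_referrers_query(id_site: int, token: str, period: str = "month",
                          date: str = "today") -> str:
    """Build a Reporting API URL for the Referrers.getReferrerType report."""
    params = {
        "module": "API",
        "method": "Referrers.getReferrerType",
        "idSite": id_site,
        "period": period,
        "date": date,
        "format": "JSON",
        "token_auth": token,
    }
    return f"{MATOMO_URL}?{urlencode(params)}"

def fetch_referrer_types(id_site: int, token: str):
    """Fetch and decode the report (requires access to your Matomo instance)."""
    with urlopen(build_referrers_query(id_site, token)) as resp:
        return json.load(resp)
```

Scanning the returned report labels then lets you check whether AI platforms appear as sources alongside channels such as search engines or social networks.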

If you want to explore these signals in more detail, the following sections explain how to investigate them in Matomo.

Keen to test Matomo’s AI tracking capabilities yourself? Start your 21-day free trial and make the invisible visible!

For real visitors coming from AI: identify referral sources

Look at referral reports in Acquisition to see whether new sources, including AI platforms, are sending visitors.

You can analyse things like:

  • How this traffic channel performs, compared with other channels like Organic or Social.
  • How human traffic coming from AI chatbots changes over time and contributes to goal conversions, and what happens in individual sessions originating from AI chatbots.
  • What visitors arriving from an AI chatbot do after they land on your website (e.g., which page transitions occur).

Learn more here: How to track and analyse traffic from AI Assistants (like ChatGPT) in Matomo reports

This helps answer questions like:

  • Is this traffic growing over time?
  • How are visitors from AI tools behaving?
  • Do they convert differently from traditional search visitors?

For automated traffic: inspect AI Assistant traffic

To gain visibility into non-human visits and to be able to act on it, you can use Matomo’s AI Assistant tracking. It offers a dedicated report for both AI Chatbots and AI Agents. And here’s what they do:

  • AI Chatbots: This report contains three sub-reports, which help you answer the following questions:
    • How many requests from AI chatbots does your website get, and how do the chatbots behave during these visits (e.g. the number of unique visited URLs, orphaned pages, or the click-through rate)?
    • How do metrics like visits and pageviews develop over time?
    • Which AI chatbots are accessing your website, and which pages is each of them visiting?
  • AI Agents: This report not only analyses AI traffic but also lets you compare it to human visits. It offers two sub-reports that provide insights into the following:
    • How many AI Agent visits are there, and how do the AI Agents behave? For example, how many actions are they performing, and what are their average visit duration and bounce rate?
    • How do these metrics develop over time?

With both detailed reports, and the ability to investigate behaviour over time, teams don’t need to worry about daily fluctuations. Instead, Matomo lets them analyse longer-term patterns, comparing months or quarters to see how traffic sources are shifting.

Making sense of the new traffic landscape

AI is not a single technology. It is an ecosystem of chatbots, crawlers, scrapers, and agents interacting with websites in different ways. Some bring visitors. Some reduce clicks. Some generate automated traffic.

In many cases, AI crawlers are discovering and analysing content that may later appear in AI-generated answers.

In that sense, AI systems can be seen as a new type of audience: not human readers, but systems that interpret and redistribute information across AI platforms.

That may sound complex, but the basics of analytics remain the same:

  • Know your traffic sources.
  • Separate humans from automation.
  • Monitor trends over time.
  • Make decisions based on your own data.

One advantage of privacy-first analytics platforms like Matomo is that they provide visibility into automated traffic.

Instead of hiding these signals behind aggressive filtering or opaque modelling, Matomo allows teams to observe how AI systems interact with their websites.

AI hasn’t made analytics more complicated. It has made the question more precise: are you looking at humans or machines? Once you can answer that, the rest of your analysis stays the same.

Matomo gives you the visibility to ask that question and answer it, whether it’s a chatbot sending referral traffic or a crawler reading your pages in silence.

How AI is reshaping web analytics and how to measure real human traffic in 2026
https://matomo.org/blog/2026/02/how-ai-is-reshaping-web-analytics-and-how-to-measure-real-human-traffic-in-2026/ (Tue, 17 Feb 2026)

Web analytics used to feel simple.

You installed a tracker, watched your traffic go up or down, checked conversions, and trusted that what you were seeing represented real people doing real things on your site. If sessions grew, you assumed you were winning. If they dropped, you assumed something was wrong. 

That mental model no longer works. 

As AI assistants increasingly replace traditional search and browsing, many marketers are reassessing their analytics stack. The challenge is no longer just collecting data; it is understanding whether your data reflects real human behaviour or AI traffic. This is where privacy-first web analytics is becoming strategically important.

Today, a growing share of what appears in dashboards isn’t human at all. It’s AI assistants, automated agents, scrapers and LLM crawlers that “visit” pages without ever intending to behave like users. 

From a server perspective, all of this looks like traffic. 
From a marketer’s perspective, it often looks like chaos. 

We now have more data than ever, and less reliable signals than ever. 

How AI is changing web analytics 

When many of us started working in analytics, the story was simple: people came to a site, they clicked around, and their behaviour told us something meaningful about intent. 

That story has quietly changed. 

We are no longer only measuring people. We are measuring other kinds of actors on the web, including AI tools and automated systems that interact with pages in ways that mimic users but don’t actually represent them. 

If we don’t separate human from automated behaviour, we end up making decisions based on noise while thinking we’re acting on insight. 

You’ve probably already seen this in your own data: sudden spikes from odd referrers, pages that rack up visits without meaningful engagement, or traffic patterns that don’t match what sales, support, or real customers are telling you. 

A lot of this isn’t “classic spam bots.” It’s AI systems pre-fetching pages, querying sites for structured data, or scanning content on behalf of users who never actually land on your website themselves. 

If you treat all of that as equal to human visits, your growth story starts to blur. 

You might celebrate “activity” while your real audience is quietly shrinking. In that case, you’re not optimising for people, you’re optimising for ghosts. 

Why traditional web analytics fails with AI traffic 

Most mainstream analytics platforms were designed in a cookie-based era where a “visit” mostly meant a person with a browser. 

AI doesn’t play by those rules. 

It often comes without typical identifiers, doesn’t interact with consent banners, accesses pages in unusual ways, and moves through sites without anything resembling a normal journey. It doesn’t scroll like a person, it doesn’t follow neat funnels, and it doesn’t “convert” in ways marketers expect. 

As a result, tools built primarily around identifiers and linear user journeys can misclassify activity in both directions, sometimes counting machines as people, and sometimes filtering out real users who behave in unexpected ways. 

That’s why a new, very practical question has become central for many teams: 

“How much of our traffic is actually human?” 

Why human-first analytics matters in an AI world 

Something deeper is changing in how serious analysts think about data. 

The goal today is clean, trustworthy, human traffic.

This is where privacy-first analytics platforms have gained unexpected relevance. Because they don’t depend heavily on third-party cookies or invasive tracking, they tend to focus more on real interactions, what people actually do on a site, rather than stitching together identity across the web. 

That approach turns out to be surprisingly well suited for the AI era. When your measurement is grounded in genuine behaviour rather than synthetic identifiers, it becomes easier to spot what looks like real engagement versus automated activity. 

In other words, tools built for privacy are increasingly becoming tools that help protect the meaning of your data. 

How Matomo separates AI traffic from human traffic 

A growing number of teams are now looking for analytics tools that can detect AI traffic rather than treating every visit the same. 

Rather than pretending AI activity doesn’t exist, Matomo allows you to identify and separate traffic coming from known AI assistants and tools as its own channel in reports. 

[Screenshot: the Matomo product navigation showing the “AI Assistants” menu.]

This isn’t just a cosmetic label. It changes how you interpret your data. 

Instead of staring at one blended traffic line and guessing what is real, you can compare what recognised AI tools do on your site, and what real humans actually do. 

You can see whether a spike came from people or from machines. You can tell whether a page is really engaging your audience or simply being read at scale by automated systems. 

For analysts, this moves the conversation from endless debate: “Is this real?” to evidence: “Here’s what humans did versus what AI did.” 

Many mainstream analytics platforms still blend human and automated visits together. They are powerful for reporting, but they don’t give teams a clear way to separate AI traffic from real users. By contrast, platforms that explicitly surface AI-assistant traffic, such as Matomo, provide clearer, more trustworthy insights in an AI-heavy web. 

When human traffic is under pressure, that clarity becomes more important, not less. 

The bigger shift marketers need to grasp 

For years, many organisations treated raw traffic as a proxy for success. More sessions felt like more attention. More pageviews felt like more impact. 

AI has broken that assumption. 

In a world where a growing share of “traffic” can be machine activity, and where many users now get answers without clicking, visit volume is no longer a reliable indicator of human interest. 

If your KPIs are still built mainly around total sessions, you risk optimising for activity that doesn’t represent your audience at all. 

Privacy-first platforms like Matomo have long emphasised meaningful behavioural signals over surveillance-style tracking. That perspective now feels less like a compliance requirement and more like a strategic advantage. 

If what you care about is understanding people, not just counting hits, that approach aligns better with today’s reality. 

AI and web analytics: what marketing teams have to consider 

Should we optimise for AI discoverability? (Yes, but separately) 

It is not smart to ignore AI discoverability. 

In fact, optimising for AI is becoming a legitimate marketing strategy in its own right. Still, it sits alongside human optimisation, and doesn’t replace it. 

You now effectively have two audiences: 

  • Human users who click, browse, compare, and convert. 
  • AI systems, which not only read, summarise, reference, and recommend, but increasingly act as agents that directly interact with websites: navigating pages, retrieving information, and completing tasks on behalf of users.

 Each requires its own optimisation and measurement approach.

For AI discoverability, you care about whether your content is clearly structured, factually precise, and easy for systems to interpret, and whether your brand is represented accurately inside AI responses. 

That’s a valid objective, but it is not the same as human engagement. 

The real mistake many teams make is mixing everything into one headline KPI called “traffic.” 

A better model is: 

  • One set of metrics for human performance 
  • One set of metrics for AI visibility and presence 

This is exactly where tools like Matomo become useful: they help you see these two worlds separately instead of mashed together. 

If your analytics tool can’t do that, you may not have the full visibility needed in an AI-first web. 

Is AI increasing or decreasing website traffic? 

For many websites, AI is more likely to reduce real human traffic over time. 

As more people get answers inside assistants, fewer will feel the need to click through, especially for informational queries. Gartner predicts that search engine volume will drop by 25% by 2026 as users increasingly rely on AI chatbots and other virtual agents instead of visiting websites. 

At the same time, AI systems may still generate background activity on your site, which traditional analytics tools may still record as visits, making dashboards look busy even as your real audience shrinks. 

You can therefore end up with a misleading picture: 

  • Analytics showing “activity,” 
  • But your actual human reach quietly declining. 

That’s why the key metric of the coming years won’t be total sessions; it will be human sessions. 

And that is exactly what your analytics tool needs to make visible. 

What to consider when choosing a modern analytics tool 

If AI is changing both how people use the web and how machines interact with websites, then the criteria for a good analytics tool must also change. 

You no longer just need a platform that counts visits. 

You need a platform that helps you understand who those visits really are. 

A modern analytics tool should provide:

  • Clear separation of human traffic from AI and automated activity. 
  • Focus on real behavioural signals, not just identifiers. 
  • No reliance on third-party cookies. 

Many mainstream tools are excellent at collecting data, but far less transparent about what that data actually represents. 

Platforms that explicitly surface AI-related traffic, like Matomo, give teams a clearer foundation for decision-making in an AI-heavy web. 

If your dashboards and your business reality no longer match, this distinction matters more than any fancy attribution model. 

The new reality for marketers and analysts 

As this settles in, the questions that actually matter are changing. 

The key question is now how much of your traffic represents real human behaviour: 

  • How much of our traffic is human? 
  • Are AI referrals ever leading to real conversions? 
  • Are we visible inside AI tools, even if fewer people click? 

Teams that can answer these questions clearly will make better decisions than teams chasing ever-higher session numbers. 

That is why privacy-first analytics are gaining credibility: they keep the focus on real people rather than artificial noise. 

Final take 

AI isn’t a distant disruption for web analytics, it’s already reshaping what our numbers mean. 

The organisations that will win in this environment won’t be those with the biggest dashboards or the highest visit counts. 

They will be the ones that can confidently say: 

“We know which of this traffic represents real humans, and we know how visible we are to AI as well.” 

In that sense, human traffic has become your most valuable metric, while AI discoverability has become a new strategic layer alongside it. 

To gain confidence in your data, your analytics tool needs to help you clearly distinguish between human visitors and automated traffic. 

If you are rethinking your analytics stack in light of AI, it makes sense to prioritise tools that let you see human and AI traffic separately rather than blending everything together. 

Because at the end of the day, analytics should help you understand real people, not just count visits.

Start a free Matomo trial and see how much of your traffic is truly human. 
