
Which AI toolbox for Digital Analysts in 2026?

On January 22nd 2026, I was lucky enough to attend the very first Data & IA Camp Lyon, brilliantly co-organized by Mehdi Oudjida and Lionel Cherpin 📊, two very active members of the AADF (the French-speaking Digital Analysts Association). The AADF also organizes MeasureCamp Paris every year, as well as regular meetups for the French-speaking digital analytics community across Europe.

It was a fantastic way to start the year, with one central question: Which AI tools will Digital Analysts really need in 2026?

The discussions were rich, pragmatic, and sometimes challenging – and you’ll find a summary of those exchanges below.


Workshop during the Data & IA Camp in Lyon

This event drew strong inspiration from the “unconference” format popularized by MeasureCamps around the world. It took place as an evening meetup – from 7:00 pm to 10:30 pm – with four session slots. I’ve written a full recap in French in this article.

I led a session whose goal was not to present a “miracle tool”, but to provide a very concrete state of play:

  • which AI tools are already being used today by data teams,
  • at which stages of the job,
  • with what real benefits… and what limitations.

🔄 The Digital Analyst role: transformation, not disappearance

The discussions I’ve been facilitating over the past few months (Paris, London, Brussels) all lead to the same conclusion:

👉 the Digital Analyst role is not disappearing,

👉 but it is changing profoundly.

The role is evolving towards:

  • more steering and orchestration,
  • more supervision of AI agents,
  • more governance, data quality, and critical thinking.

AI is not here to replace the analyst, but to augment them.


📊 Everyone uses AI… but still rarely at the core of data tasks

In the room, 100% of participants already use an AI tool at least once a day in their data tasks (tracking, reporting, analysis, QA).

Tools mentioned spontaneously:

  • ChatGPT
  • Claude
  • Gemini
  • Copilot
  • Cursor / Claude Code

But in practice:

  • for most teams, less than 25% of data tasks are actually assisted by AI,
  • usage often remains ad-hoc or exploratory.

🧩 The framework: Collect → Report → Analyze → Optimize

To structure the discussion, I relied on the classic Digital Analytics and Optimization lifecycle:

  • Collection
  • Reporting
  • Analysis
  • Optimization

At each step, we identified the AI tools actually used and their real added value.


🧱 Collection: KPIs, tracking plan, code, QA

On the collection side, AI is already very useful across four key areas:

🔹 KPI definition

LLMs (ChatGPT, Claude, Gemini) are often used to:

  • suggest KPI ideas,
  • challenge an existing framework,
  • accelerate business scoping (e.g. e-commerce, retail, food…).

Tools mentioned:

  • ChatGPT / Claude / Gemini
  • Jetmetrics (e-commerce KPI repositories)

🔹 Tracking plan & naming conventions

LLMs can help generate a first draft of a tracking plan, but ⚠️ human validation remains essential.

A specialized tool mentioned: Avo, for nomenclature management, cross-team consistency (web/app), and tracking plan generation and validation.
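To make naming-convention validation concrete, here is a minimal sketch of an automated check on a tracking plan draft. Everything in it is hypothetical (the snake_case rule and the event names are made up for illustration); it is not Avo's actual validation logic.

```python
import re

# Hypothetical convention: event names must be snake_case with at least
# two words, e.g. "product_viewed" or "checkout_started".
EVENT_NAME_PATTERN = re.compile(r"^[a-z]+(_[a-z]+)+$")

def validate_tracking_plan(event_names: list[str]) -> dict[str, list[str]]:
    """Split a list of event names into valid and invalid entries."""
    report: dict[str, list[str]] = {"valid": [], "invalid": []}
    for name in event_names:
        key = "valid" if EVENT_NAME_PATTERN.match(name) else "invalid"
        report[key].append(name)
    return report

# Made-up tracking plan draft, e.g. a first pass produced by an LLM
draft = ["product_viewed", "checkout_started", "AddToCart", "page view"]
result = validate_tracking_plan(draft)
```

A check like this is exactly the kind of task worth running automatically on every LLM-generated draft before a human reviews the semantics.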

🔹 Code (JS / SQL / tracking)

On the technical side, gains are often immediate:

  • JavaScript code generation,
  • SQL queries,
  • GTM configurations.

Tools mentioned:

👉 The gains are especially significant at scale (e.g. many GTM containers).

🔹 Quality Assurance (QA)

For QA, several tools can automatically detect tracking anomalies or risks (e.g. sensitive data).

Tools mentioned:

Goal: detect early rather than fix too late.
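As a rough illustration of "detect early", here is a minimal threshold-based check on daily event volumes, flagging sudden drops such as a broken tag. The numbers and the 50% drop threshold are made up; real QA tools apply far more sophisticated anomaly detection.

```python
def detect_volume_anomalies(daily_counts: dict[str, int],
                            drop_threshold: float = 0.5) -> list[str]:
    """Flag days where event volume drops by more than drop_threshold
    compared with the previous day."""
    anomalies = []
    days = sorted(daily_counts)
    for prev, curr in zip(days, days[1:]):
        if daily_counts[prev] > 0 and \
           daily_counts[curr] < daily_counts[prev] * (1 - drop_threshold):
            anomalies.append(curr)
    return anomalies

# Made-up daily "purchase" event counts: a tag breaks on January 4th
counts = {"2026-01-01": 1200, "2026-01-02": 1150,
          "2026-01-03": 1300, "2026-01-04": 90}
alerts = detect_volume_anomalies(counts)
```

Even a simple rule like this, run daily, catches the most costly failure mode: silently losing tracking for days before anyone opens a report.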

📈 Reporting: explore, narrate, automate

In reporting, AI is used at several levels:

🔹 Data exploration

AI mainly helps speed up initial exploration and structure a first version of a report more quickly.

🔹 Narrative & writing

LLMs are widely used to:

  • turn numbers into text,
  • structure analytical commentary,
  • produce a first draft of deliverables.

Tools mentioned:

🔹 Alerts & insights

For alerting and anomaly detection:

We are gradually moving from simple thresholds to automated insights.

🔹 Automation

To industrialize recurring tasks:

Typical use case:
➡️ retrieve analytics data,
➡️ send it to an LLM for a first analysis,
➡️ generate and distribute a report or recommendations.
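The retrieve → analyze → distribute loop can be sketched as follows. Everything here is a stand-in: the metrics are hard-coded, and `summarize_with_llm` is a placeholder for a real LLM API call (a production pipeline would also push the result to email or Slack).

```python
def fetch_analytics_data() -> dict[str, float]:
    """Stand-in for an analytics API call; returns made-up weekly metrics."""
    return {"sessions": 48210, "conversion_rate": 2.4, "revenue": 91350.0}

def summarize_with_llm(metrics: dict[str, float]) -> str:
    """Placeholder for a real LLM call. A production version would send a
    prompt containing the metrics and return the model's narrative."""
    lines = [f"- {name}: {value}" for name, value in metrics.items()]
    return "Weekly summary (draft):\n" + "\n".join(lines)

def run_report_pipeline() -> str:
    """Retrieve data, ask the 'LLM' for a first analysis, return the draft."""
    metrics = fetch_analytics_data()
    return summarize_with_llm(metrics)

report = run_report_pipeline()
```

Orchestration tools (n8n, Make, plain cron jobs) mainly wire these three steps together and schedule them; the logic stays this simple.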

🧠 Analysis: intelligent assistance, not full delegation

In analysis, two approaches coexist:

🔹 Native analysis within tools

Analytics platforms increasingly embed AI to detect patterns or frictions.

🔹 “DIY” analysis with LLMs

(Anonymized) data exports sent to:

👉 AI helps explore, formulate hypotheses, and structure thinking…

👉 but it does not replace analytical reasoning.


🎯 Optimization: where AI is already very mature

In optimization, AI is sometimes invisible… but everywhere:

🔹 Media & bidding

🔹 Testing

🔹 Personalization & search

🔹 Predictive

🧪 Always Be Testing: the real key skill

My conclusion is deliberately simple:

👉 test continuously,

👉 compare multiple LLMs,

👉 understand their strengths and limitations.

The Digital Analyst of tomorrow is less a “producer” and more a conductor of hybrid intelligence (human + artificial).

And in a world where everything evolves every 2–3 months, not testing already means falling behind.

A huge thank you to all the participants for making this session so interactive, engaging, and thought-provoking, as well as to ⚡️Laure de La Faye for the pictures!
