DEEP DIVE

Building Better BI Chatbots: Why Context and Triggers Matter More Than You Think

Kasia Zając

I've been fixated on AI chatbots in BI (and general SaaS) tools for the past year. Not in a "let's add AI to everything" way, but in a genuine "how can this actually help people understand their data better" way.

At Preset, we already shipped AI Assist in SQL Lab—it helps you write and iterate on queries, which is great. But here's what we learned: helping people write SQL isn't the same as helping them understand their data. It also became clear that we needed something that actually knows your data.

Some people just need to ask, "Why did this metric go down?" They don't care about the query—they care about the answer. Even SQL experts often want higher-level help: "What's interesting in this dataset?" or "Is the data quality good enough to trust?"

And here's the other thing: when you open our SQL Lab AI Assist, you still face that blank input box. You still have to figure out how to phrase your question. The AI doesn't know what table you're looking at or what query you just ran.

That's the gap we're closing.

The Blank Input Box Problem

After studying what GitHub Copilot, Tableau, Power BI, and ThoughtSpot are doing, a pattern emerged: the best AI implementations don't replace the existing interface—they augment it.

Dashboards aren't going away. Charts aren't going away. SQL Lab isn't going away. What's changing is that every interaction point becomes a potential conversation trigger.

Are you looking at a chart that confuses you? Ask about it right there.
Are you exploring a dataset? Get quality metrics without leaving the page.
Are you building a dashboard? Generate a chart and drop it in.

The AI lives alongside the work, not in a separate space you have to navigate to.

This is fundamentally different from a chat box in the corner that waits for you to come ask it things. The AI should be woven into the fabric of how you work with data—present when you need it, invisible when you don't.

Context Changes Everything

Here's what happens in most BI tools with AI chat: you open it, see an empty input box, and freeze. "I don't even know how to phrase this question."

And even when you do ask something, the AI doesn't know what you're looking at. It's like asking for directions from someone who's blindfolded—technically possible, but frustratingly inefficient.

This is why context matters. Imagine you're viewing a chart showing regional sales. North is crushing it, West is flat. You wonder why. You shouldn't have to leave that chart, open a separate chat, and describe which chart you mean, which dataset it uses, what time period it covers.

In our new implementation, when you open a conversation while viewing a chart, you see:

  • "Explain this chart"
  • "Identify trends"
  • "Show underlying data"

Simple, right? But powerful. The context is already there—no need to describe what you're looking at or figure out how to phrase the question.

I'll be honest—this first version won't read your mind. These triggers are straightforward. Not "explain why Q2 revenue dropped 15% in the West region compared to Q1 after adjusting for seasonality"—you can ask that question and we'll answer it, but we're not generating hyper-specific buttons yet.

As the system learns more about your data over time, the suggestions will get smarter. But even simple contextual triggers beat a blank box.

The same logic applies across the interface:

  • Open chat from a dashboard → see "Analyze this dashboard"
  • Open chat from a dataset → see "Check data quality"
  • Open chat after running a query → see "Explain these results"

The AI knows where you are. It meets you there.
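Under the hood, this idea can be pictured as a simple lookup from the user's current context to a set of suggested prompts. The sketch below is purely illustrative—the names and structure are assumptions, not Preset's actual implementation:

```python
# Hypothetical sketch: map the user's current context to suggested
# trigger prompts. All names here are illustrative, not Preset's API.
CONTEXT_TRIGGERS = {
    "chart": ["Explain this chart", "Identify trends", "Show underlying data"],
    "dashboard": ["Analyze this dashboard"],
    "dataset": ["Check data quality"],
    "query_result": ["Explain these results"],
}

def suggested_triggers(context_type: str) -> list[str]:
    """Return contextual suggestions, falling back to a generic prompt."""
    return CONTEXT_TRIGGERS.get(context_type, ["Ask a question about your data"])
```

The fallback matters: even with no recognized context, the user should see something more inviting than an empty box.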

Showing Data, Not Just Talking About It

When you ask our chatbot about something, you won't just get text. You'll get rich, interactive components that let you actually work with the information.

Click "Preview Data", and you see an actual table with the first 10 rows. Click "Show Columns" and you get every column with its data type. These aren't just chat responses—they're real UI components.

When the AI explains a chart, you see the chart preview right there. When it generates a chart to answer your question, you see it as an interactive visualization. In edit mode, you'll be able to drag it straight onto your dashboard.

Click "List Charts" from a dashboard conversation, and you see all 23 of its charts. Click "Identify trends" and the AI walks through the key insights across the entire dashboard.

See a query in the conversation? You can run it, modify it, or ask the AI to explain it, optimize it, or even visualize the results—all without leaving the chat.

This consistency matters. Whether you encounter a dataset in search results, in a conversation, or in a list view, it looks the same. You learn the pattern once, and it applies everywhere.

That last part is important: it’s about showing data, not just talking about it. You shouldn't have to remember what the AI showed you and manually recreate it. The insight should become part of your actual work.
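One way to picture that consistency is a single card model reused across every surface. This is a toy sketch only—the fields and actions are assumptions for illustration, not Preset's real schema:

```python
from dataclasses import dataclass, field

# Hypothetical "object card" for a dataset, rendered identically in search
# results, chat responses, and list views. Field names are illustrative.
@dataclass
class DatasetCard:
    name: str
    columns: dict[str, str]  # column name -> data type
    preview_rows: list[dict] = field(default_factory=list)  # first N rows
    actions: tuple[str, ...] = ("Preview Data", "Show Columns")

card = DatasetCard(
    name="regional_sales",
    columns={"region": "TEXT", "revenue": "NUMERIC"},
)
```

Because every surface renders the same card, the "learn the pattern once" promise falls out of the data model rather than being re-implemented per view.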

For Everyone, But Different Depths

Most Preset users are data scientists, analysts, and engineers—the ones building dashboards, setting up datasets, writing complex SQL queries. But they're not building for themselves. They're building for business users: sales teams tracking quarterly performance, marketing analyzing campaigns, executives prepping for Monday meetings.

Our AI needs to serve both:

Data practitioners want technical depth: "Show me the SQL," "What's the data quality score?", "Which dashboards use this dataset?" They need to verify assumptions and see how data flows through the system.

Business users want insights: "Why did sales drop?", "What's driving this trend?", "Explain this simply." They want the "so what," not the "how."

Same interface, different depth. You can set the tone explicitly: "I need technical details" or "I'm not a data scientist—keep it simple." Or let the interface guide you: click "Explain chart" for a high-level answer, then choose whether to dig into:

  • Show underlying data
  • Check data quality
  • View SQL

This matters because the same dashboard serves both audiences. The analyst builds it, the business user consumes it, and both should be able to ask questions about it at their own level of technical depth.

The Honest Truth About V1

Let me be real: this first version isn't magic.

The contextual triggers are basic. "Explain this chart" gives you a solid explanation, but won't automatically know you care specifically about the West region underperforming or that this is a recurring pattern from last quarter. You can ask about that—and we'll analyze it—but the AI won't proactively surface it yet.

The object cards won't be dynamically personalized. Everyone sees the same metadata and preview options, regardless of their role or past behavior.

Drag-and-drop from chat to dashboard? We're building it, but not for day one.

Here's why I'm not worried about that: the foundation matters more than the features.

If we build this right—with proper context awareness, rich object cards, clear triggers that help people get started—we can add sophistication over time. The AI will learn more about your data. The triggers will get smarter. The suggestions will become more relevant.

But if we build a disconnected chat box that doesn't understand what you're looking at, no amount of features will fix that fundamental problem.

The Bigger Vision

Eventually, this chatbot should know your data better than you do. Not in a creepy way—in a useful way.

It should notice when data quality drops. Surface interesting trends you might miss. Remember that you always analyze sales by region and offer to do it automatically. Learn the questions you ask most often and prepare answers before you even think to ask.

Imagine opening Preset in the morning and seeing: "Revenue dropped 8% overnight in the West region—want to investigate?" Or working on a dashboard and having the AI suggest: "The customers most at risk of churning right now are in this segment—create a chart?"

That's where we're headed. But we're not building that on day one. We're building what makes all of that possible:

  • Contextual awareness - The AI knows where you are and what you're looking at
  • Rich object representations - Cards that show real data, not just descriptions
  • Conversational depth - The ability to dig as deep as you need to go
  • Trust and transparency - Clear sources, data quality indicators, honest uncertainty

Once that foundation is solid, everything else becomes possible.

What We're Learning From You

We're rolling this out gradually, and we're watching closely how people actually use it. Not how we think they'll use it—how they really use it.

Some things we're curious about:

  • Do people prefer simple trigger buttons or do they want to type questions?
  • How often do they dig into the technical details vs. staying at a high level?
  • Which object cards are most useful?
  • What follow-up questions do they ask most?

Your feedback will shape where this goes. If everyone ignores the "Show data quality" button, we'll rethink it. If everyone clicks "Explain chart" but then immediately asks "why," we'll make "why" a first-class action.

This is very much a collaborative process.

Try It When It's Ready

We're planning to roll this out in phases over the next few months. If you're a Preset user, you'll see it appearing gradually in your workspace.

Not on Preset yet but curious where we're headed? Check out preset.io.

Building AI features into your own product—especially in data and analytics? I'd genuinely love to hear what you're learning. This field is moving fast, and none of us have all the answers yet.


A Quick Note on the Tech

We're building this on the Model Context Protocol (MCP), an emerging standard for how AI applications connect to data sources and tools. It's what lets our chatbot actually understand Apache Superset's data model, not just talk about it in the abstract.

Think of MCP like USB-C for AI—instead of every application building custom integrations with every data source (an N×M problem), you build once to the protocol (an N+M solution). As more tools adopt MCP, the entire ecosystem gets smarter.
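The N×M vs. N+M point is easy to see with a little arithmetic (a toy illustration, not MCP code):

```python
# Why a shared protocol shrinks integration work: without one, every AI
# client needs a custom adapter for every data source; with one, each
# side implements the protocol exactly once.
def integrations_without_protocol(n_clients: int, m_sources: int) -> int:
    return n_clients * m_sources  # N x M custom adapters

def integrations_with_protocol(n_clients: int, m_sources: int) -> int:
    return n_clients + m_sources  # N + M protocol implementations

# With 10 AI clients and 20 data tools:
# 10 * 20 = 200 custom adapters vs. 10 + 20 = 30 implementations.
```

The gap widens as the ecosystem grows, which is exactly why a shared protocol compounds in value.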

Interested in the technical details? The spec is publicly available and worth exploring.


About Preset

Preset is a managed analytics platform built on Apache Superset. We're on a mission to make data exploration accessible to everyone—not just people who love writing SQL. AI is a big part of that vision, but only if we build it thoughtfully, with real user needs at the center.

You get all the features of Superset (and more) without the hassle of scaling, managing, or upgrading, and plenty of room to grow into it. Free for five users, forever.
