AI you'll actually use

Ask questions and get answers you can trust. Powered by Omni’s semantic model.

Free Trial

Skip the query.
Start the conversation.

Ask a question

Type it in or ask for suggestions.

Refine

Filter, add fields, and calculate metrics.

Summarize

Get insights to make decisions.

Start a workbook

Go deeper with the full Omni UI.

Improve trust with a semantic model

Curate responses

Add instructions for datasets and fields.

Learn more →

Add context as you go

Adjust prompts in real time.

Learn more →

Use existing metrics, joins, and metadata

Build for BI, reuse for AI.

Learn more →

Launch a new AI product

Customize with Omni’s MCP server and APIs

Surface insights to your customers using AI

Create custom pricing tiers

Other ways to use AI

Excel calculations

Describe a calculation and get the right Excel formula.

Learn more →

AI summaries

Turn charts, tables, and KPIs into actionable takeaways.

Our approach to AI

"Our data team has aligned data properties with the terms we use every day, and now I can just open up Blobby and ask exactly what I'm looking for — like 'Who were our top suppliers last month based on GMV growth?' The level of detail is excellent, and honestly, it's actually fun to use."

Sophie Paulin, Chief Marketing Officer

FAQs

Does Omni use its own LLM, or does it leverage external LLMs?

Omni currently uses AWS Bedrock-hosted Claude models for most tasks and OpenAI models only for advanced AI visualizations. Omni does not have its own LLM. For more, see our docs.

How does Omni’s AI work behind the scenes? 

When a user enters a prompt, Omni shares the prompt with AWS Bedrock.

If the prompt requires data summarization, the metadata of the current query and a CSV of the results are shared with AWS Bedrock, which returns a summary.

If the prompt requires a new Omni query, then AWS Bedrock translates the prompt into an Omni semantic query. The semantic query is a collection of metadata – including field names, filters, sorts, pivots, topic name, and limit – that is translated into SQL. The SQL is run against the data warehouse, and the results are returned in the Omni interface.
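To make the flow concrete, here is a simplified, illustrative sketch of the idea – a semantic query as a small bundle of metadata that compiles down to SQL. The structure and field names below are hypothetical for illustration, not Omni's actual internals:

```python
# Illustrative sketch only: a "semantic query" as metadata that is
# compiled into SQL. Names and structure are hypothetical, not Omni's
# real implementation.

def compile_semantic_query(q):
    """Translate a semantic-query dict into a SQL string (simplified)."""
    sql = f"SELECT {', '.join(q['fields'])} FROM {q['topic']}"
    if q.get("filters"):
        sql += " WHERE " + " AND ".join(q["filters"])
    if q.get("sorts"):
        sql += " ORDER BY " + ", ".join(q["sorts"])
    if q.get("limit"):
        sql += f" LIMIT {q['limit']}"
    return sql

# A prompt like "top 10 days by revenue this year" might become:
query = {
    "topic": "orders",
    "fields": ["order_date", "gross_revenue"],
    "filters": ["order_date >= '2024-01-01'"],
    "sorts": ["gross_revenue DESC"],
    "limit": 10,
}
print(compile_semantic_query(query))
# SELECT order_date, gross_revenue FROM orders
#   WHERE order_date >= '2024-01-01' ORDER BY gross_revenue DESC LIMIT 10
```

Because the LLM only produces this metadata layer (never raw SQL against your warehouse), the generated query is constrained to fields and logic the semantic model already defines.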

How does Omni’s AI compare to other AI tools?

Omni’s semantic model lets you feed important context and nuance into the AI to promote reliable responses, instead of expecting LLMs to deliver precise answers based on your raw data.

In addition, while other tools are solely focused on enabling an “AI chatbot” experience, Omni’s AI lets you do more than just ask questions and get black-box answers. AI in Omni can accelerate every part of your data analysis workflow – including creating queries, filtering, writing Excel formulas, and more.

AI is also embedded into Omni’s workbook experience rather than living in a separate interface, making it interoperable with the rest of Omni’s analysis methods. As you query with AI, you can drop into the point-and-click UI, SQL, or spreadsheet formulas at any point. Users aren’t locked into AI, even if that’s their starting point.

Customers can also embed Omni’s AI interface directly into their own product, or use Omni’s APIs and MCP server to customize further.

Can I test the accuracy of Omni’s AI?

Omni currently doesn’t offer a way to “test” the accuracy of responses. However, we encourage data teams to optimize their data for AI and experiment with adding context to deliver more reliable results. By using this human-in-the-loop approach, teams can continually evolve their data model for AI to deliver more accurate, relevant answers.

Why is your semantic model important to AI?

Omni’s semantic model ensures AI responses aren’t a black box. For example, you might provide ChatGPT specific prompts like “You’re an expert on the economy. Summarize this article in 50 words” to get the results you want. Omni lets you do the same – you can provide instructions, context, and nuance to your data for more reliable responses.

For example, in your “Sales & Revenue” Topic, you might add context like: “Make sure your responses are easy for a busy sales leader to understand and act on.” You might get even more detailed on your Revenue field and specify: “This is the source of truth for our sales and finance reporting. It is the gross revenue from all of our sales channels. For any questions that ask for total sales, total revenue, total sold, or similar metrics, use this field.” Including these details helps AI better understand your data and deliver more helpful, trustworthy results.
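One way to picture how that context reaches the LLM: topic- and field-level instructions get assembled alongside the user's question. The sketch below is hypothetical – the function and dictionary names are illustrative, not Omni's actual schema – but it shows the mechanism:

```python
# Hypothetical sketch: how topic- and field-level context might be
# combined with a user question before it reaches the LLM. Names below
# are illustrative, not Omni's actual schema.

topic_context = {
    "topic": "Sales & Revenue",
    "instructions": (
        "Make sure your responses are easy for a busy sales leader "
        "to understand and act on."
    ),
    "fields": {
        "revenue": (
            "This is the source of truth for our sales and finance "
            "reporting. It is the gross revenue from all of our sales "
            "channels. For any questions that ask for total sales, "
            "total revenue, total sold, or similar metrics, use this field."
        ),
    },
}

def build_prompt(user_question, ctx):
    """Prepend semantic-model context so the LLM picks the right field."""
    lines = [f"Topic: {ctx['topic']}", ctx["instructions"]]
    for name, description in ctx["fields"].items():
        lines.append(f"Field `{name}`: {description}")
    lines.append(f"Question: {user_question}")
    return "\n".join(lines)

prompt = build_prompt("What was total revenue last month?", topic_context)
```

The point is that the curation lives in the model, not in each user's chat: every question asked against the topic automatically carries the same instructions.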

How do you recommend setting up your data model for accurate AI responses?

Making your datasets AI-ready doesn’t need to be complicated – we’ve found that adding context in a few key places makes a big difference, and you can continue tuning up the model as you go. For tips, check out this how-to guide from our VP of Product Arielle, and watch our CEO Colin go from zero to AI-ready live.

How does Omni handle data security?

Omni never shares your data without your permission. When AI features are toggled on, only necessary data is shared with external LLMs. For example, when you ask AI to generate queries or write formulas, only query metadata (column names, sorts, limits, etc.) is shared. If you ask AI to summarize a query, the result set is shared as well. Learn more in our docs.