Search-driven analytics is a new way to run ad-hoc queries and get answers simply by entering a natural language question. NLQ (natural language querying) is particularly effective for enabling non-IT users in your company to make data-driven decisions: they don't need to understand pivot tables or data visualization, as reports are configured automatically through natural language search.
The LLM-powered cube "Ask Data" function and the top-menu Search Box allow users to start data exploration with simple free-form queries. SeekTable automatically generates an appropriate tabular report or chart based on the natural language query, providing clear and understandable answers. The cube configuration may contain <prompt> tags with special instructions for the LLM, so you can fine-tune "Ask Data" recognition for your concrete cube.
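For illustration only (the exact placement of these tags in the cube configuration may differ, and the hint text below is entirely hypothetical), such an instruction could look like:

```xml
<prompt>
"region" always means sales region, not shipping region.
Treat "revenue" as a synonym for the "Sum of Total" measure.
</prompt>
```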
Search-driven reporting significantly lowers the barrier to entry for data analysis. Try these online demos:
average television by age range and gender
show sales sum in 2024 vs 2023 by region and month
Sometimes the result may differ from what you expect; in this case you can go back to your query by returning to the cube view ("Back" in a web browser, or click the cube name in the breadcrumbs) and clarify your question.
Search-driven reporting is enabled by default for all cubes but can be disabled if needed (via the cube's configuration form).
"Ask Report" is another LLM-powered AI function, designed to help users analyze their tabular reports.
Finding insights or anomalies in a large table requires concentration and attention; instead, users can simply open the AI menu to run predefined report prompts or ask their own specific questions.
The Ask Report function isn't just for data analysis: you can ask the LLM to process your tabular data and generate a desired output. This feature can be especially useful for reports that contain unstructured text content.
It's important to remember that report prompts only have access to the data you currently see on the screen. Therefore, it's pointless to ask questions that the current report can't answer.
On-prem SeekTable installations without activated "AI Functions" have a limited "Ask Data" that falls back to a built-in non-LLM search query recognizer (it doesn't use any external APIs). This implementation works like a text search engine: it simply tries to find the best matches of dimensions/measures for the keywords in the query.
Note that the non-LLM "Ask Data" does not really understand your query's intent like an LLM does; it just matches the keywords you entered and tries to suggest the most relevant reports.
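To illustrate the idea (a toy sketch only, not SeekTable's actual implementation; all names here are made up), a keyword-based matcher could score each cube member by how many query keywords appear in its name:

```python
# Toy illustration of keyword-based query recognition: score each cube
# member (dimension/measure name) by how many query keywords appear in
# its name, and return the best matches first.

def match_members(query: str, members: list[str]) -> list[str]:
    keywords = query.lower().split()
    scored = []
    for member in members:
        tokens = member.lower().replace("_", " ").split()
        score = sum(1 for kw in keywords if kw in tokens)
        if score > 0:
            scored.append((score, member))
    # Highest-scoring members first; ties broken alphabetically.
    scored.sort(key=lambda s: (-s[0], s[1]))
    return [m for _, m in scored]

members = ["Country", "Order Date", "Sum of Total", "Avg of Total"]
print(match_members("sum of total by country", members))
# → ['Sum of Total', 'Avg of Total', 'Country']
```

This is exactly why such a recognizer cannot capture intent: it ranks by surface overlap only, so "Avg of Total" still scores on a query asking for a sum.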
Keywords recognized by the non-LLM search query parser:

Measures: total sum or sum of total. To use the first "Sum" measure it is enough to specify just sum.

Dimension filters with a hint: city Berlin (or city:Berlin), name John. The hint is required if you want to filter by a high-cardinality column (when SeekTable cannot recognize a value).

Dimension filters by value: Canada, New York, John Smith, closed (which may refer to a 'state' or 'status' column).
Entity recognition works differently for CSV and database cubes. For CSV data, SeekTable performs a quick scan of the CSV file, so suggestions and by-value recognition work for all dimensions. This approach is not possible for DB cubes, where suggestions and by-value recognition work only for dimensions explicitly specified in the "Match Dimension" list on the cube configuration form. It is recommended to specify here only low-cardinality dimensions that can be loaded quickly.
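The CSV scan can be pictured as building a value-to-column index (a toy sketch under assumed behavior, not SeekTable's real code):

```python
import csv
import io

# Toy sketch of by-value recognition for CSV cubes: scan the file once
# and remember which column each distinct value came from, so a bare
# keyword like "Canada" can later be mapped to a dimension filter.

def build_value_index(csv_text: str) -> dict[str, str]:
    reader = csv.DictReader(io.StringIO(csv_text))
    index: dict[str, str] = {}
    for row in reader:
        for column, value in row.items():
            # Keep the first column a value was seen in.
            index.setdefault(value.lower(), column)
    return index

data = "country,status\nCanada,closed\nGermany,open\n"
index = build_value_index(data)
print(index["canada"])  # → country
print(index["closed"])  # → status
```

For a DB cube, the same index would have to be populated by querying each dimension's distinct values, which is why restricting it to low-cardinality dimensions matters.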
Comparison conditions:
=, equal, equals, not equal, not equals
<, before, below, less, less than, fewer, under, ending with
>, after, above, greater, greater than, more, more than, larger, over, starting with

Partial dates: May 1 or 2019 Mar. Note that these 'partial' dates are applied correctly only when your cube has separate dimensions for date parts (year, month, day).

Relative dates: yesterday, today, tomorrow, this month, current month, prev month, previous month, last month, next month, this year, current year, prev year, previous year, last year, next year.
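How a relative-date keyword turns into a concrete filter can be sketched like this (illustrative only; SeekTable's actual parser is not public, and this covers just a few of the keywords above):

```python
from datetime import date, timedelta

# Toy sketch: resolve a relative-date keyword into an inclusive
# (start, end) date range relative to a given "today".

def resolve_relative(keyword: str, today: date) -> tuple[date, date]:
    if keyword in ("this year", "current year"):
        return date(today.year, 1, 1), date(today.year, 12, 31)
    if keyword in ("prev year", "previous year", "last year"):
        return date(today.year - 1, 1, 1), date(today.year - 1, 12, 31)
    if keyword in ("prev month", "previous month", "last month"):
        first_of_this_month = today.replace(day=1)
        end = first_of_this_month - timedelta(days=1)  # last day of prev month
        return end.replace(day=1), end
    raise ValueError(f"unsupported keyword: {keyword}")

print(resolve_relative("last month", date(2024, 3, 15)))
# → (datetime.date(2024, 2, 1), datetime.date(2024, 2, 29))
```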
An LLM makes it possible to chat with connected databases or uploaded CSV data, ask various questions, and get text answers that require querying your data. This is known as RAG (Retrieval-Augmented Generation): additional data context is added to the LLM's prompt, which ensures that the answers you receive are accurate and come directly from your data.
SeekTable can act as a managed context provider: the LLM can run queries only in terms of the configured cubes, with applied RLS (row-level security) rules that guarantee a concrete user can access only allowed records. Compared to allowing the LLM to run any SQL it can generate, this is a much more convenient and safe way to work with business data.
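The "managed context" idea can be sketched as follows (a toy model under assumed behavior, not SeekTable's API; all names are illustrative): the LLM may request only aggregations over exposed cube members, and a per-user RLS filter is always applied server-side before aggregation.

```python
# Toy model of a managed context provider: the query interface exposes
# only whitelisted measures, and every query passes through an RLS
# predicate, so the LLM never sees rows the user isn't allowed to see.

def run_cube_query(rows, allowed_measures, rls_filter, measure, group_by):
    if measure not in allowed_measures:
        raise PermissionError(f"measure not exposed: {measure}")
    totals: dict[str, int] = {}
    for row in rows:
        if not rls_filter(row):  # RLS: drop rows this user may not access
            continue
        key = row[group_by]
        totals[key] = totals.get(key, 0) + row[measure]
    return totals

rows = [
    {"region": "EU", "owner": "alice", "sales": 100},
    {"region": "EU", "owner": "bob",   "sales": 50},
    {"region": "US", "owner": "alice", "sales": 70},
]
# RLS rule for this user: only rows they own are visible.
print(run_cube_query(rows, {"sales"}, lambda r: r["owner"] == "alice",
                     "sales", "region"))
# → {'EU': 100, 'US': 70}
```

Contrast this with letting the LLM generate raw SQL: there, nothing structurally prevents a query from touching tables or rows outside the user's permissions.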