Tap into the full value spectrum of generative AI in data and analytics by downloading the action guide, featuring key resources and resolutions for the new year.
Get the ebook
AI agents make insights actionable
Agentic AI is this season’s hot topic, and it’s especially sizzling in the data and analytics industry, where it promises to make insights actionable, seamless, and intuitive.
Picture it: You’re a merchandiser approaching the close of your fiscal year, and you need to close your sales gaps. Legacy BI brings you to a daily summary dashboard, where you run promotions and make markdowns, leaving you looking for sales in all the wrong places.
With agentic AI, you can now converse with a digital agent that taps real-time insights in granular data and quickly identifies exactly where you can make the most impact. It’s an exciting time for agentification, and now anyone can try ThoughtSpot’s AI Analyst, Spotter. Spotter makes AI insights accessible to everyone, not just experts.
Try Spotter
Businesses look to AI to do more with less
SaaS has led to an explosion of tools that business units can procure and deploy themselves, and the modern data stack has allowed teams to assemble best-of-breed components across the data, analytics, and AI ecosystems. GenAI has further fueled an explosion of start-ups, with a 42% increase in vendors, and yet CIOs and CFOs are saying, “Enough already!” In this unforgiving, unpredictable economy, belt-tightening abounds, and everyone is expected to do more with less.
AI must pass the trust test to deliver AI to everyone
In 2024, we all wanted our analytics tools to be as easy as ChatGPT—or as user-friendly as Claude.ai is for semi-structured content. But the truth is that text-to-SQL is too prone to hallucination to be trusted. And here’s the kicker: trust with business data, once lost, is hard to regain.
Data and analytics vendors who understand the need for guardrails, sophisticated reasoning, human-in-the-loop review, few-shot prompting, and retrieval-augmented generation (RAG) are further along in proving their trustworthiness. Many, however, are only at the demo stage and are ill-equipped to move their solutions into production. Others flamed out in the last year.
Expect to see a range of new solutions here, with different levels of robustness and trustworthiness. Never settle for AI that lies, particularly when you’re driving adoption of self-service analytics for business users.
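Guardrails like these can be made concrete. Below is a minimal, hypothetical sketch in Python (using the standard sqlite3 module and a made-up `sales` table) of two simple checks for model-generated SQL: allow read-only statements only, and reject queries that reference tables outside a governed schema. Production systems apply far more sophisticated validation; this only illustrates the idea.

```python
import re
import sqlite3

# A known, governed schema (hypothetical table and columns).
ALLOWED_TABLES = {"sales": {"region", "amount", "sold_at"}}

def guarded_query(conn, generated_sql: str):
    """Run model-generated SQL only if it passes basic guardrails."""
    sql = generated_sql.strip().rstrip(";")
    # Guardrail 1: read-only statements only.
    if not sql.lower().startswith("select"):
        raise ValueError("only SELECT statements are allowed")
    # Guardrail 2: every referenced table must be in the governed schema.
    for table in re.findall(r"\bfrom\s+(\w+)", sql, flags=re.IGNORECASE):
        if table not in ALLOWED_TABLES:
            raise ValueError(f"unknown table: {table}")
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL, sold_at TEXT)")
conn.execute("INSERT INTO sales VALUES ('EMEA', 100.0, '2024-11-01')")

print(guarded_query(conn, "SELECT region, amount FROM sales"))
```

A query like `DROP TABLE sales` or `SELECT * FROM secrets` is rejected before it ever touches the database, which is the kind of deterministic backstop that keeps a hallucinating model from doing damage.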
Spot the difference
Semi-structured data upends archaic data management practices
The data and analytics industry spends $103B annually storing structured data. Spending on semi-structured data—in the form of text, voice, images, slides, and documents—is unknown and often tracked separately from relational data, yet IDC estimates that semi-structured data accounts for 90% of the world’s data. GenAI promises to make this data more accessible and useful.
The evolution in data management practices is both organizational and technical. Vendors are taking different approaches to bring the different data types together. For example, Snowflake’s new Document AI essentially puts structure on top of PDF documents, and a number of analytics vendors have added support for Glean. To analyze all types of data more effectively, we must evolve our data management practices to include a variety of storage formats and the requisite security and governance policies. Analyzing each type of data on its own offers some value, but analyzing these data sets together is the game changer.
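To make the joint-analysis point concrete, here is a minimal sketch in plain Python, using only the standard library and entirely made-up field names, of joining structured order records with semi-structured support-ticket documents on a shared key:

```python
import csv
import io
import json
from collections import defaultdict

# Structured, relational-style data: order rows (hypothetical fields).
orders_csv = """order_id,customer,amount
1001,Acme,250.00
1002,Globex,990.00
"""

# Semi-structured data: support tickets as JSON documents, one per line.
tickets_jsonl = """\
{"ticket_id": "T-1", "order_id": 1001, "text": "Delivery arrived damaged"}
{"ticket_id": "T-2", "order_id": 1001, "text": "Requesting a refund"}
{"ticket_id": "T-3", "order_id": 1002, "text": "Where is my invoice?"}
"""

# Index the semi-structured side by the shared key.
tickets_by_order = defaultdict(list)
for line in tickets_jsonl.splitlines():
    doc = json.loads(line)
    tickets_by_order[doc["order_id"]].append(doc["text"])

# Join: enrich each structured row with its related documents.
enriched = []
for row in csv.DictReader(io.StringIO(orders_csv)):
    order_id = int(row["order_id"])
    enriched.append({
        "order_id": order_id,
        "customer": row["customer"],
        "amount": float(row["amount"]),
        "tickets": tickets_by_order.get(order_id, []),
    })

print(enriched[0]["customer"], len(enriched[0]["tickets"]))  # prints: Acme 2
```

Seeing that the $250 Acme order carries both a damage complaint and a refund request is insight that neither the relational table nor the ticket documents reveal on their own.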
You don’t find insights, insights find you
The analytics experience in business applications has progressed significantly beyond just embedding dashboards. In the past, embedded analytics primarily involved integrating static dashboards or reports into applications. These types of embedded content offered a "view-only" experience with limited interactivity, meaning users had to manually discover, interpret, and act on the data. Ultimately, it was up to the users to uncover insights and make decisions based on that information.
Today, there is a shift toward providing dynamic, proactive, and personalized analytics experiences that are seamlessly integrated into business and application workflows. These analytics experiences are contextual and adapt to your role, current tasks, and behaviors. You can now receive real-time, actionable insights through proactive alerts and conversational AI interfaces, ensuring insights are delivered at the right moment and eliminating the need to sift through static charts.
Open table formats promise interoperability – will they deliver?
With open table formats, you can store data once and then let different compute layers access that single source of data based on use case and analytical needs. In theory, this means less vendor lock-in for data storage, although the formats supported by various compute vendors do vary.
One of the biggest benefits is that data can be stored on cost-effective platforms such as Amazon S3, Google Cloud Storage, or Azure Blob Storage. While the vision for open table formats remains a work in progress, they began as an evolution of data lake concepts, with the open-source projects Apache Iceberg (created at Netflix, with early contributions from Apple), Delta Lake (from Databricks), and Apache Hudi (developed at Uber) being the most popular.
The fact that Databricks acquired Tabular, the company commercializing Iceberg, for $1B, and announced the acquisition during Snowflake’s annual user conference, further validates the importance of open table formats in modern data ecosystems.
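As an illustration of the store-once, query-anywhere idea, an Iceberg warehouse on S3 can be exposed to one compute layer, Apache Spark, with a few configuration properties. This is only a sketch: the catalog name `lake` and the bucket are placeholders, and the same warehouse could then be read by any other Iceberg-aware engine.

```properties
# Enable Iceberg's SQL extensions and register a catalog named "lake"
# (catalog name and bucket are placeholders)
spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql.catalog.lake=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.lake.type=hadoop
spark.sql.catalog.lake.warehouse=s3://example-bucket/warehouse
```

Because the table metadata and data files live in open formats in object storage, switching or adding a compute engine is a configuration change rather than a data migration.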
Large language models battle small language models
When OpenAI’s ChatGPT first gained the world’s attention in fall 2022, it was the large foundation models that carried the market share—think Google’s Gemini, Anthropic’s Claude, and Amazon Titan. Since then, Meta has open-sourced Llama, and Mistral has gone from an unknown open-source start-up in France to a company in which Microsoft has taken an equity stake.
Two years later, there are now thousands of models with trade-offs in price, performance, and accuracy. The costs of training such large models are both financial and, increasingly, environmental as hyperscalers race to add processing power. The accuracy of these models, however, is not necessarily getting any better. Customers, meanwhile, are leery of their private data being used to train public models, no matter the legal contracts and security controls. Small language models have also risen as an alternative, offering lower cost and greater accuracy for industry-specific use cases.
AI literacy is now a life skill
In 2024, we predicted that data fluency would evolve to include AI fluency—arguing that organizations with existing data fluency programs would be in a better position to leverage AI. Over the past year, we’ve seen that trend take hold in the market. According to a study from a workforce consortium that includes Cisco, Accenture, Google, IBM, Indeed, and others, AI literacy tied for first place with AI ethics and responsible AI in the list of top 10 technical skills expected to increase in relevance.
AI literacy is no longer just a job skill; it’s a life skill. Sure, businesses are running AI imagination workshops with their teams. But we’re also seeing AI literacy embraced at the elementary, middle, and high school levels.
Mark Cuban best explained it in a recent episode of The Data Chief podcast: “The path of least resistance to learning AI is simple. All you’ve got to do is use it… There is literally next to nothing you can't teach yourself using a large language model. You can even train and educate the model. It’s a virtuous cycle.” This is Generation AI, and it’s up to you to ensure you are equipped to use these tools in a safe and responsible way.
Watch the episode
Fed up with Feds, states and businesses take control of AI policy
Last year, we predicted that AI innovation would continue to outpace legislation, a trend we’re still seeing play out in the market. But this year, state legislators started to push back. We recently saw California pass a slate of AI legislation. In an act that came as no surprise to most, some businesses decided to push back as well. This is an all-too-familiar pattern in technology.
While a patchwork of laws isn’t perfect, some would argue that it is a start. Of course, states aren’t our only defense against the misuse of GenAI and lapses in data security. As business leaders, we also have an important role to play.
According to Gartner, major tech companies are leading the way in AI ethics by adopting more responsible AI guidelines and methods. For data leaders and administrators, the key is safety. What guardrails can you put in place, and how can you empower your team to take ownership of data security?
CDAOs face an existential threat while everyone battles for AI ownership
Let’s face it: While chief data officers have a C in their job titles, their seat at the table rarely carries the same weight as those of other executive leaders. Data is too often seen as an IT problem, and as exhaust from digital interactions rather than a competitive business asset.
With GenAI, everyone in the boardroom is now asking about AI, with some executives vying for the increased stature and budget that accompany AI ownership. In some cases, CIOs who feel their power waning in the post-cloud-migration era are seizing the moment.
You cannot do AI without a solid data foundation, so it makes sense that the CDAO role would evolve to assume leadership of the GenAI strategy. Who will rise to the occasion remains to be seen.
The 2025 Trends ebook comes with detailed resources for every category and offers tangible action plans for the year ahead.
Explore this year’s top trends with a panel of your peers and industry experts
Tune in to The Data Chief podcast for more talk on 2025 data and AI trends.