By Craxel Founder and CEO David Enga
August 13, 2025
LLMs are absolutely magical from a user experience point of view. Regardless of other issues, natural language interfaces (spoken or written) are here to stay. One major impact will be a paradigm shift in the enterprise around speed and data availability.
Once people experience a new level of speed, it is very difficult to go back. This fueled the upgrade cycle for personal computers for decades. Try using a ten-year-old computer - you'll absolutely hate it. Slow your child's Internet speed down to 1 megabit/second and see what happens! In the enterprise, Moore's law fueled an upgrade cycle that boosted productivity: spreadsheets recalculated faster, so people could explore more model variations.
So what happens in the enterprise when people become used to having real-time conversations with ChatGPT and Gemini? They'll want to have real-time conversations about what's happening in their business. The impact will be dramatically increased productivity, not simply because of the natural language interface, but because of the ability to rapidly explore contextualized information in far greater depth.
There is a massive impediment to such a future. Data technologies have struggled to deliver even timely business intelligence (BI). Can the same infrastructure that can't deliver insights to people quickly enough really deliver context to AI fast enough for real-time conversations? While AI is rapidly and fundamentally changing user expectations, none of today's prominent technology companies can deliver that kind of data infrastructure affordably, even for an organization with a massive budget like the U.S. government.
This problem can't be brute forced. While trillions may be spent on compute for AI factories, the same can't feasibly be spent to brute force enterprise data. Unlike the large corpora of text used to train foundational LLMs, enterprise data changes constantly. This makes the notion of each enterprise training an LLM on its own data, and keeping that model up to date, unrealistic.
No, LLMs are going to be fed enterprise data through their context windows, which are limited in size. This means the relevant enterprise data needs to be instantly available, or users will wait minutes, hours, or even days before AI spits out the next few words. That is NOT the experience users will expect, nor the experience that will drive productivity.
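The pattern described above - retrieve only the relevant facts, trim them to the context budget, and hand them to the model - can be sketched as follows. This is a minimal illustration, not anyone's production system: the record store, the keyword retrieval, and the token budget are all hypothetical stand-ins. The hard part the article points at is making the retrieval step instant over live enterprise data at scale.

```python
# Hypothetical sketch: feeding enterprise data to an LLM through a
# bounded context window. All names and numbers are illustrative.

MAX_CONTEXT_TOKENS = 50  # stand-in for the model's context-window limit

RECORDS = [  # stand-in for live enterprise data, which changes constantly
    "2025-08-12 west-region revenue: $1.2M",
    "2025-08-12 east-region revenue: $0.9M",
    "2025-08-11 open support tickets: 37",
]

def retrieve(question: str, records: list[str]) -> list[str]:
    """Naive keyword match; a real system needs indexed, real-time lookup,
    because this step is where users would otherwise sit waiting."""
    terms = question.lower().split()
    return [r for r in records if any(t in r.lower() for t in terms)]

def build_prompt(question: str, records: list[str]) -> str:
    """Pack retrieved facts into the prompt until the token budget is spent."""
    used, context = 0, []
    for r in retrieve(question, records):
        cost = len(r.split())  # crude word-count proxy for tokens
        if used + cost > MAX_CONTEXT_TOKENS:
            break  # the context window is finite; everything else is dropped
        context.append(r)
        used += cost
    return "Context:\n" + "\n".join(context) + f"\nQuestion: {question}"
```

For example, `build_prompt("revenue by region?", RECORDS)` includes only the two revenue records, leaving the ticket count out of the window entirely. Whatever the retrieval layer cannot surface in time simply never reaches the model.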
So AI in the enterprise will stumble until a high-performance, affordable, data-centric technology emerges that can deliver a step change in price/performance. Many projects, and lots of spending on neat toys, will wire point-to-point interfaces to data stovepipes to collect some facts to be delivered via AI. But the real impact awaits a new data foundation for the enterprise - we call it Black Forest.