
Why the World's Data Engines Are Broken—and How to Fix Them

Today’s dominant data engines were built for an earlier era and are struggling to meet the speed, efficiency, and scalability demands of AI. Craxel’s Black Forest offers a new knowledge infrastructure that unlocks the full potential of AI-powered decision making at massive scale.

By Craxel Founder and CEO David Enga

April 19, 2025

This chart captures just about everything that is wrong with how the world approaches data today. Notice that the four biggest market-share holders are also the big four cloud providers. Strange coincidence? I believe looking at incentives is always important to understanding the world. So what rewards the cloud providers' stock prices the most? The answer is growing consumption of compute, storage, and network. What incentive, then, do they have to make data access fast and efficient? Maybe market-share competition?

Let's look at the engines on this list for more insight. For the most part, we have the Oracle, Sybase, DB2, and Postgres engines, plus some brute-force offerings. The list is dominated by antiques. It would be very difficult for these organizations to risk those revenue lines by replacing or rebuilding their products to achieve massive price/performance improvements. I believe you will see them continue their brute-force approaches to performance at scale; it feeds their cloud-consumption business models. Their products are also fine for small-scale use cases, and they are very convenient. A strong argument could be made that the DBMS market today is driven entirely by convenience, not at all by price/performance, because all of these vendors offer the same class of technology. You would think competitive dynamics between cloud providers would push them to outclass each other, but frankly, I think they are all stuck with their current technology. These engines are highly complex and took years to build.

Given that data is the fuel for AI-Powered Decision Making, organizations that want cost efficiencies and productivity gains from AI are going to have to do something different to unlock their data.

We believe the answer is Craxel's Black Forest, the knowledge infrastructure for AI-Powered Decision Making. Powered by many patented O(1) algorithmic innovations, it lets the world move from tables to networks (e.g., multidimensional online transactional knowledge graphs) that scale to trillions upon trillions of graph nodes and edges. Enterprises can finally realize the power of ontology at scale. Unlike typical graph databases, Black Forest provides fully serializable ACID transactions on these nodes and edges, so the same graph can be used for both OLTP and analytics. Black Forest also connects graph nodes with vectors at scale, seamlessly bridging structured and unstructured data.
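To make the tables-to-networks idea concrete, here is a minimal toy sketch (not Craxel's API; all names and structures here are hypothetical illustrations) of a knowledge graph whose nodes carry both structured properties and vector embeddings. Structured questions are answered by following edges; unstructured questions are answered by similarity search over the vectors, so one data structure serves both kinds of access:

```python
import math

# Hypothetical sketch only - illustrates the concept of a knowledge graph
# whose nodes carry vectors, not Black Forest's actual implementation.
class KnowledgeGraph:
    def __init__(self):
        self.nodes = {}   # node_id -> {"props": dict, "vector": list[float] | None}
        self.edges = []   # (source_id, relation, destination_id)

    def add_node(self, node_id, props=None, vector=None):
        self.nodes[node_id] = {"props": props or {}, "vector": vector}

    def add_edge(self, src, relation, dst):
        self.edges.append((src, relation, dst))

    def neighbors(self, node_id):
        # Structured access: traverse outgoing edges from a node.
        return [(rel, dst) for (src, rel, dst) in self.edges if src == node_id]

    def nearest(self, query_vector, k=1):
        # Unstructured access: rank nodes by cosine similarity to a query vector.
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb) if na and nb else 0.0
        scored = [(cosine(query_vector, n["vector"]), nid)
                  for nid, n in self.nodes.items() if n["vector"] is not None]
        return [nid for _, nid in sorted(scored, reverse=True)[:k]]

g = KnowledgeGraph()
g.add_node("acct:1", {"name": "Acme"}, vector=[0.9, 0.1])
g.add_node("doc:7", {"title": "Q3 contract"}, vector=[0.8, 0.2])
g.add_node("acct:2", {"name": "Globex"}, vector=[0.1, 0.9])
g.add_edge("acct:1", "HAS_DOCUMENT", "doc:7")

print(g.neighbors("acct:1"))         # structured: [('HAS_DOCUMENT', 'doc:7')]
print(g.nearest([0.85, 0.15], k=2))  # unstructured: ['acct:1', 'doc:7']
```

The point of the sketch is the shape of the model, not the mechanics: what a real engine must add on top is exactly what the paragraph above describes, i.e. serializable ACID transactions over the node/edge store and indexing that keeps both traversal and vector lookup fast at trillion-scale.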

Black Forest: the future of data for AI.