Hello, everybody. Sue here, Field CTO at Quest. Joining me today is Yetkin Ozkucur, who heads our data practice here at Quest. We have some really great new capabilities and use cases to talk to you about, and they have everything to do with data readiness: streamlining your data management and embedding it into the creation of your data products.
Actually, that's the big, major use case we want to talk to you about. But really, it has everything to do with drawing on the domain knowledge you've been collecting for all these years. It acts more like an AI companion as you go about creating data products, data sets, and AI models. So with that, I'm going to turn it over to Yetkin for that first slide.
Yeah, thank you, Sue. We'd like to open with this quote from Einstein: "If you can't explain it simply, you don't understand it well enough." It's such a good quote, and it fits perfectly in the world we live in right now as far as data and AI go. We deal with very complex systems.
And how do you explain it simply? With data modeling. In summary, data modeling is the practice of organizing and abstracting complexity so that humans and machines can understand it and work efficiently. In other words, if you cannot clearly communicate the business meaning or structure of your data, it's a sign that you need a deeper understanding of it. That kind of sets the tone for what we want to talk about today.
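To make that idea concrete, here is a minimal, hypothetical sketch of what "organizing and abstracting complexity" can look like: a tiny logical model captured as Python dataclasses, so both people and programs can read the structure and meaning. The entity names (Customer, Order) and attributes are invented for illustration and are not tied to any specific tool or Quest product.

```python
# Hypothetical illustration only: a tiny "logical model" expressed as Python
# dataclasses, making entities, attributes, and relationships explicit.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class Customer:
    """One row per customer; the business meaning is explicit in the name and attributes."""
    customer_id: int
    name: str
    email: str


@dataclass
class Order:
    """Each order references exactly one customer (a one-to-many relationship)."""
    order_id: int
    customer_id: int  # relates back to Customer.customer_id
    order_date: date
    total_amount: float


@dataclass
class CustomerOrders:
    """A simple composite view that a human or a machine can reason over."""
    customer: Customer
    orders: List[Order] = field(default_factory=list)
```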
It sets the tone, and I think that's a challenge for us to explain things simply today on the webcast.
Oh, yeah, challenge accepted. We will make it simple. So let's talk about where we are as far as data modeling goes today. Some call it a second renaissance; some call it going from the backroom to the boardroom. Let's look at it. Data modeling has been a well-respected data management discipline for decades. From an erwin perspective, we have 50,000 data modeling users around the globe actively using data modeling today.
And all these people are doing God's work. Having nice, well-structured conceptual, logical, and physical data models helps you improve your development efficiency and your data quality, so you spend less time troubleshooting data quality issues. It also helps you optimize performance for the databases and data lakes you are deploying to.
It reduces your infrastructure costs, especially on modern platforms like Snowflake and Databricks, where pricing is compute-based. And it makes your maintenance and integration easier. We know all these things; none of it is new. Like I said, data modeling has been around for decades. Everybody knows that.
But now, in this new realm, we have all these boards and C-level people looking at AI. They are looking at 10x to 100x returns on investment, and they see those returns as easily achievable. But the key to the success of all these initiatives, what affects it the most, is the trustworthiness and accuracy of your AI outputs.
And data modeling has a direct impact on that. Why? Because data modeling helps you understand the semantics of your data and how the data relates to each other. We will talk about it in the upcoming slides, but that's pretty much the bottom line.
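As one hypothetical way to picture that link, a model's entity definitions and relationships can be rendered as plain-language context that an AI application is grounded with. The structures, names, and wording below are illustrative assumptions, not a Quest or erwin API.

```python
# Hypothetical sketch: turning model metadata (entities, definitions, relationships)
# into a plain-language context block an AI application could be grounded with.

model = {
    "entities": {
        "Customer": "A person or company that has purchased from us.",
        "Order": "A single purchase transaction placed by a customer.",
    },
    "relationships": [
        ("Customer", "places", "Order", "one-to-many"),
    ],
}


def model_to_context(model: dict) -> str:
    """Render the model's semantics as text an AI prompt or agent could consume."""
    lines = ["Business data context:"]
    for name, definition in model["entities"].items():
        lines.append(f"- {name}: {definition}")
    for parent, verb, child, cardinality in model["relationships"]:
        lines.append(f"- {parent} {verb} {child} ({cardinality}).")
    return "\n".join(lines)


if __name__ == "__main__":
    print(model_to_context(model))
```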
And just from a numbers perspective, like we say, second renaissance, backroom to boardroom: data modeling usage jumped 13% in 2024 alone. As of today, approximately 65% of organizations worldwide are using data modeling in one way or another to govern their data. And that's last year's data; this year, we expect it to go even further. Sue, why don't you explain to us a little more how it all fits in?
Yeah, let's talk about AI inside organizations today. What Gartner is saying is that if you have 10 or fewer AI POCs or AI initiatives going on, you can probably manage that from a governance perspective. You can probably handle it by having a human in the loop, for sure.
But the tipping point seems to be 10. Once you have more than 10 AI projects going on at the same time, your AI ambition starts to get quite high. The characteristics at that point are high ambitions from an AI perspective: maybe your industry is being reinvented with AI, and you need to move fast.
And what we're seeing is our clients coming to us and saying: they're bypassing the metadata, they're bypassing the data governance, they're bypassing the data modelers. Please help us.
I heard on a podcast that Walmart has every one of their engineers building one AI agent a week, iterating on an hourly basis. They've moved from waterfall to agile to ultra-agile, and we have to keep up. We can't let them skip the metadata, and we can't let them skip that data governance piece or the modeling.
So we feel that if we can embed that into the process on the fly, during the build, and, going back to that Einstein quote, if you can understand your data as you're building, you're going to be better able to align it, better able to trust that data, have more confidence in that AI model, and de-risk it a bit as well.
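A minimal sketch of what "embedding it into the build" could mean in practice is a lightweight check, run during the data product build, that the data actually matches what the model declares before it feeds an AI pipeline. The column names and rules here are invented for illustration and don't describe any specific Quest workflow.

```python
# Hypothetical sketch: fail the build early if a dataset drifts from the
# structure the model declares, instead of discovering it after the AI ships.

EXPECTED_COLUMNS = {"customer_id", "order_id", "order_date", "total_amount"}
REQUIRED_NON_NULL = {"customer_id", "order_id"}


def validate_rows(rows: list[dict]) -> list[str]:
    """Return human-readable issues; an empty list means the data is ready."""
    issues = []
    for i, row in enumerate(rows):
        missing = EXPECTED_COLUMNS - row.keys()
        if missing:
            issues.append(f"row {i}: missing columns {sorted(missing)}")
        for col in REQUIRED_NON_NULL & row.keys():
            if row[col] is None:
                issues.append(f"row {i}: {col} must not be null")
    return issues


if __name__ == "__main__":
    sample = [
        {"customer_id": 1, "order_id": 10, "order_date": "2024-05-01", "total_amount": 99.0},
        {"customer_id": None, "order_id": 11, "order_date": "2024-05-02", "total_amount": 12.5},
    ]
    for issue in validate_rows(sample):
        print(issue)  # a real build step could stop here instead of shipping bad data
```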
We want you to make smart and informed decisions about your data. And when you have data modeling, the logical representation, coupled with data intelligence, the physical representation of your inventory, you start to get a well-rounded understanding of your business and