
Who put the AI in my Database?


I’m not a mathematician; I can’t even spell mathematician without help from spell check… so needless to say I am not the person building AI models for a living and trying to calculate the standard deviation of an elliptic curve. That’s a thing, right? Actually, elliptic curves are part of the cyber security field, not AI… anyway, I digress…

What I DO know is that AI depends on data. Good data… clean data… and accessible data… it’s all about the data. And according to the DB-Engines Ranking, the most popular place to store enterprise data is an Oracle database.

The thing about AI is that models need to be able to access data in order to train on it. If, like me, you really want to be able to “do AI” but you are not a mathematician, then how do you bring the data and the algorithm together to train a model on your data? There are a whole host of answers to that question, and the right answer will depend on a lot of different factors. Also, while storage is cheap, having your data in multiple places raises questions about the source of truth. Wouldn’t it be nice if the same database that we trust to hold our enterprise data was also the place our AI models look to do training?

Well, as it turns out… Oracle has made it very easy for customers to launch into AI model training by putting that functionality directly into the database. Yes, I was skeptical… but I work for Oracle, so of course I had to give it a test drive… and well… it works!

The secret sauce…

So what’s the secret sauce to having AI just kind of work on your data? Oracle Autonomous Database 26ai. Now, I will admit that when I saw Oracle adding the “ai” moniker to a database I kind of rolled my eyes. It reminded me of the days when Microsoft attached .NET to everything, whether it had anything to do with .NET or not… Windows Server .NET anyone? (The initial release didn’t even have the .NET framework installed…)

So here I am working with some data for an upcoming project, and I want to showcase the use of the built-in AI features of Autonomous Database to address customer pain points. Customers have data, they want to be able to do something with that data, and in certain cases AI can help out. However, the average customer doesn’t have a data scientist on hand to train their models.

The Data…

I have created a sample dataset. The data is sample readings from IoT sensors on refrigerators:

| Timestamp | DeviceID | TemperatureC | Anomaly |
| --- | --- | --- | --- |
| 2025-09-01T00:00:00 | 1 | 4.1 | 0 |
| 2025-09-01T00:00:00 | 2 | 3.8 | 0 |
| 2025-09-01T00:00:00 | 3 | 5.0 | 0 |
| 2025-09-01T00:00:00 | 4 | 3.0 | 0 |
| 2025-09-01T00:00:00 | 5 | 4.5 | 0 |
| 2025-09-01T00:10:00 | 1 | 4.4 | 0 |
| 2025-09-01T00:10:00 | 2 | 4.2 | 0 |

I have a little over 65,000 rows of sample data generated. Given that this is refrigerator data, the logic is simple: any temperature between 3 and 5 degrees Celsius is good, and any reading above or below that range is an anomaly.

Note: You would never actually use AI to solve this problem, as the logic is extremely simple. However, I needed a use case that I could quickly articulate and train on in a very short period of time, so this is what I came up with. As a Solutions Architect I will tell you this: do not use AI where another technology, such as a simple if statement in this case, can do the job.
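For the curious, here is roughly how a dataset like this could be generated. This is my own illustrative sketch, not the exact script I used; the column names match the table above, and the 3 to 5 degree rule is the entire labeling logic:

```python
import csv
import random
from datetime import datetime, timedelta

def label_anomaly(temp_c: float) -> int:
    """The whole 'model': readings between 3 and 5 degrees C are
    normal (0); anything outside that band is an anomaly (1)."""
    return 0 if 3.0 <= temp_c <= 5.0 else 1

def generate_readings(devices=5, intervals=13000, start=datetime(2025, 9, 1)):
    """Yield (timestamp, device_id, temperature_c, anomaly) rows,
    one reading per device every 10 minutes."""
    for i in range(intervals):
        ts = start + timedelta(minutes=10 * i)
        for device_id in range(1, devices + 1):
            # Mostly in-range readings, with the occasional excursion.
            temp = round(random.uniform(2.0, 6.0), 1)
            yield ts.isoformat(), device_id, temp, label_anomaly(temp)

with open("sensor_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["TIMESTAMP", "DEVICE_ID", "TEMPERATURE_C", "ANOMALY"])
    writer.writerows(generate_readings())
```

Five devices reporting every 10 minutes over 13,000 intervals gives 65,000 rows, in the same ballpark as my dataset.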

The challenges…

Challenge 1: Load the data

Even though I work for Oracle, I am NOT a database guy. In fact, interestingly enough, most of the workloads that run on OCI today have nothing to do with Oracle Database. Because I am not a database guy, I need to be able to get my data into an Oracle Database without having to look up a bunch of scripts. Well… Autonomous DB has you covered with the tools built in.

The built-in data loader can get my data in from my CSV in a couple of clicks… with no additional tools required. Once in the Data Load panel, I drag and drop my file onto the load panel, then click the Edit button to review the mappings.

The data importer has automatically figured out that I need to create a new table, and created the mappings for me. I haven’t had to do any work to make this happen. This is a simple table, and the one field that could have needed manual attention is the timestamp, but Autonomous Database figured that one out for me too. You can specify your own table name, but I am just keeping the table name based on the import file name.

The rest of the process is just to hit Close, then tell the data importer to load your data.

And in less time than it actually took me to paste the screen grab and type the text, my data was imported, all 65,000 rows.
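If you ever want a repeatable, scripted alternative to the drag-and-drop loader, the same load could be done from Python. The sketch below parses the CSV with the standard library; the actual insert (shown commented out) would use the python-oracledb driver and a live connection, and the table and column names here are assumptions based on my import:

```python
import csv
from datetime import datetime

def read_rows(path):
    """Parse the CSV into bind-ready tuples; timestamp strings become
    datetime objects so the driver can bind them to a TIMESTAMP column."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for row in reader:
            yield (
                datetime.fromisoformat(row["TIMESTAMP"]),
                int(row["DEVICE_ID"]),
                float(row["TEMPERATURE_C"]),
                int(row["ANOMALY"]),
            )

INSERT_SQL = """
    INSERT INTO sensor_data (timestamp, device_id, temperature_c, anomaly)
    VALUES (:1, :2, :3, :4)
"""

# With a live connection (python-oracledb), the batched load would be:
# import oracledb
# with oracledb.connect(user=..., password=..., dsn=...) as conn:
#     with conn.cursor() as cur:
#         cur.executemany(INSERT_SQL, list(read_rows("sensor_data.csv")))
#     conn.commit()
```

The GUI is faster for a one-off load like mine; a script like this only earns its keep when the load has to happen on a schedule.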

Challenge 2: Figure out what model to use and train

One thing before you move on: you are discouraged from using the ADMIN user with OML (Oracle Machine Learning), and in fact, as we all know, you should NOT be using the ADMIN user for day-to-day work anyway, right? So use the admin tools to create a new user, grant that user access to the data, and make sure to give them OML permissions.
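The admin UI takes care of all of this, but for the database folks, the underlying SQL is roughly the following. The username and the exact privilege list here are illustrative assumptions on my part, not the exact statements Autonomous DB runs:

```python
# Illustrative SQL an admin might run to set up an OML-capable user.
# The username, password placeholder, and exact grants are assumptions;
# in practice the Autonomous DB admin UI generates the right statements.
SETUP_STATEMENTS = [
    "CREATE USER oml_user IDENTIFIED BY <strong-password>",
    "GRANT CREATE SESSION TO oml_user",
    "GRANT CREATE MINING MODEL TO oml_user",   # lets the user build ML models
    "GRANT SELECT ON admin.sensor_data TO oml_user",
]

def run_setup(cursor):
    """Execute each setup statement through a DB-API style cursor."""
    for stmt in SETUP_STATEMENTS:
        cursor.execute(stmt)
```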

Back in your database actions tab, click the Machine Learning button to get started with OML:

Make sure to log in with the OML user you created, NOT the ADMIN user.

There are a ton of ML options buried in this UI, but the one that we want is AutoML. This is the thing that is going to have Autonomous DB do all the hard work.

Experiments are your first stop in the world of AutoML. Experiments are what allow Autonomous Database to try a bunch of different models on your data and figure out which one works best. Select the option to create a new experiment.

Next, configure the parameters of your experiment. Select the data source, in this case the SENSOR_DATA table that we imported earlier. Select the prediction that you want to make, in this case the ANOMALY column, and select the features that may or may not contribute to the prediction.

Once you have done that, click Start and decide whether you want to train based on speed or accuracy. I’m choosing Accuracy for my training run.

And off Autonomous DB goes, figuring out how to make the AI work while I go grab a coffee.

Challenge 3: Choosing the right model

After about 5 minutes, which is less time than it would take for me to look up the names of the different algorithms, Autonomous Database has done the training, tested the models, and let me know which one has the best results.

From this I can see that a Neural Network appears to be a good choice for my predictions. What I really appreciate here is that the training looked at all the options and figured out which data had the most impact on the outcome. Now, in my case I know what that answer is, because I created the sample data and I know what I based the anomaly flag on. However, what if I had additional columns? This process used machine learning to figure it out for me.

Challenge 4: Deploying my model

So having a new model is great… but how do I make it available for use? Well, once again Autonomous DB has you covered.

Click on your model in the leaderboard, optionally rename it, and then click Deploy…

Provide a URI for your model. Note, this isn’t a complete URI; it’s just the part that specifies your model. Give it a version number, and then provide your tenancy namespace. Then click OK to publish your model.

That’s it! You have, in under 10 minutes, trained and deployed an AI model using Autonomous DB!
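Behind the scenes, the deployed model sits behind an OML Services REST endpoint. As a small teaser for the next post, here is a hypothetical sketch of how the pieces entered in the deploy dialog combine into a scoring URL; the host and model URI below are made-up placeholders, and the exact path may vary by OML Services version:

```python
def scoring_url(oml_host: str, model_uri: str) -> str:
    """Assemble the REST scoring endpoint for a deployed OML model.
    Both arguments are placeholders from the deploy dialog."""
    return f"https://{oml_host}/omlmod/v1/deployment/{model_uri}/score"

# Example with made-up values:
url = scoring_url("myadb.adb.us-ashburn-1.oraclecloudapps.com", "anomaly_model")
```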

Summary

So… who put the AI in my database? Oracle. I’ve gone from knowing absolutely nothing about the AI in Oracle Database 26ai to being able to write a blog post about how to use it, in less than 24 hours, and most of that time was spent on other things…

The best part: all of these tools are included in Autonomous Database. There is nothing to set up or configure externally. And Autonomous Database is included in the OCI Always Free tier!

Stay tuned for my next blog post, where I will actually call that model from some Python code to embed it as part of my overall solution.