Leveraging the AI in your database...
Yesterday I wrote a blog post called “Who put the AI in my Database?” The topic of that blog post was how you can leverage the AI capabilities that Oracle has built into your database to train and deploy models.
Being able to train and deploy an AI model is kind of cool, but it doesn’t do any good unless you can actually connect to it and use it. So… here we are, continuing with the model I built yesterday so we can fire off some queries at it and get it to infer some results.
Now I did mention in the blog post yesterday that this is not a use case that you should use AI for. The entire decision can be modeled with:
if (val > 3) and (val < 5):
    # do something useful, no AI required...
However, what we are demonstrating is the ability to easily leverage AI from our applications using the built-in capability of Autonomous Database, not the ability of AI to handle complex modelling, which we already know it can handle.
The Code
Let’s dive into the code required to connect to our model and start inferencing.
Preliminary code
The first thing we need to do is set up some values that we will use later on to actually make the connection. Many of these values are data that should be stored outside your code. Secrets like your username and password, for example, should be stored in OCI Vault and retrieved at runtime. However, as this is a blog post, do what I say… not what I do. :-)
import requests
# Replace these with your actual values
username = "{UserName}"  # The database user that was created in the last article.
                         # This user should have OML permissions.
password = "{Password}"  # The password for the database user
omlserver = "https://adb.us-ashburn-1.oraclecloud.com"
# The OML endpoint for your region. This is for Ashburn.
tenant = "{Tenancy OCID}"  # Your tenancy's unique OCID
database = "{Database Name}"
# Use the name, not the display name!
token = ""
endpointName = "nnanon"  # This is the URL snippet that you used when publishing your endpoint.
All of these values are going to be used to format the REST calls into your database. Your Tenancy OCID can be retrieved from the OCI console. The database name is the name, not the Display Name, that you used when you created your database by following the last blog post.

In my case the Display Name and the database name are the same, but that may not be true in every situation.
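As mentioned above, secrets like the password really belong in OCI Vault. A minimal sketch of what retrieving one at runtime might look like: the decoding helper below is runnable on its own, while the OCI Python SDK calls are shown commented out because they require a valid OCI configuration, and the secret OCID is a hypothetical placeholder.

```python
import base64

def decode_secret(bundle_content_b64: str) -> str:
    # Vault returns the secret bundle content base64-encoded
    return base64.b64decode(bundle_content_b64).decode("utf-8")

# Hypothetical usage with the OCI Python SDK:
# import oci
# client = oci.secrets.SecretsClient(oci.config.from_file())
# bundle = client.get_secret_bundle(secret_id="{your secret OCID}")
# password = decode_secret(bundle.data.secret_bundle_content.content)
```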
Authentication
The first thing we need to do in order to be able to call our model is to get authenticated. This involves getting a token from the database to authenticate the call we make later on.
# Prepare the authentication URL by injecting values into the URI
url = f"{omlserver}/omlusers/tenants/{tenant}/databases/{database}/api/oauth2/v1/token"

# Set up the necessary headers
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json"
}

# Authenticate with a username and password
data = {
    "grant_type": "password",
    "username": username,
    "password": password
}

# Request an access token
response = requests.post(url, headers=headers, json=data)

# Get the token from the response
if response.status_code == 200:
    token = response.json().get("accessToken")
else:
    print("Failed to retrieve access token")
This section of the code passes the username and password to the database and requests an OAuth token that will allow us to call into the published model. The call is made over HTTPS, so we are not sending the password across the wire in clear text!
With luck, assuming that everything has been set up correctly, you will receive a 200 response with an access token you can save for later.
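If you prefer failing loudly over a silent print, a small helper can pull the token out of the response body and raise when it is missing. This is a sketch of that pattern; the dictionary shape simply mirrors the accessToken field used above.

```python
def extract_token(payload: dict) -> str:
    # Pull the access token out of the auth response, failing loudly if absent
    token = payload.get("accessToken")
    if not token:
        raise RuntimeError(f"No accessToken in response: {payload}")
    return token

print(extract_token({"accessToken": "abc123"}))  # abc123
```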
Calling the model
Now that we have the authentication token, it’s time to call the model.
# Set up the scoring URL
scoring_url = f"{omlserver}/omlmod/v1/deployment/{endpointName}/score"

# Provide the authentication token as a header
scoring_headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json"
}

# Provide the requested data
scoring_data = {
    "inputRecords": [{"TEMPERATUREC": 7}]
}

# Call the model!
score_response = requests.post(scoring_url, headers=scoring_headers, json=scoring_data)

# Interpret the results
if score_response.status_code == 200:
    result = score_response.json()
    print(result)
else:
    print(f"Scoring failed with status code {score_response.status_code}")
    print(score_response.text)
In this case, we are predicting the value of Anomaly from the temperature, so the output of the call is how likely it is that the temperature passed in is an anomaly (below 3°C or above 5°C). If you remember from the experiments we ran in the last blog post, TEMPERATUREC was the feature that was determined to be of importance:

And for the specific model that we deployed, it was the only feature that was part of the predictions:

If other features were involved in the predictions, then we would have to add those additional features into the call for scoring.
scoring_data = {
    "inputRecords": [
        {
            "TEMPERATUREC": 7,
            "AdditionalFeature1": 1,
            "AdditionalFeature2": 2
        }
    ]
}
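Since inputRecords is a list, you can also build a payload that scores several readings in a single request, assuming your deployment accepts batches. A quick sketch of constructing such a payload:

```python
# Score several temperature readings in one call
temperatures = [1, 4, 7]
scoring_data = {
    "inputRecords": [{"TEMPERATUREC": t} for t in temperatures]
}
print(scoring_data)
```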
Testing the results
Calling the code with a value of 7:

Calling the code with a value of 4:

Calling the code with a value of 1:

As you can see from the results, a value less than 1 indicates that the value passed in is probably NOT an anomaly, while a value of 1 or greater indicates a higher likelihood that it is an anomaly. As with all AI, the result is not a yes/no outcome like an “if” statement, but rather a probability that your code will need to assess.
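Because the model hands back a score rather than a verdict, your application needs a decision rule of its own. A minimal sketch, using the 1.0 cutoff suggested by the results above (a threshold you would tune for your own data):

```python
def is_anomaly(score: float, threshold: float = 1.0) -> bool:
    # Scores at or above the threshold are treated as anomalous
    return score >= threshold

print(is_anomaly(0.2))  # False - low score, probably not an anomaly
print(is_anomaly(1.4))  # True - high score, likely an anomaly
```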
Summary
As I wrote yesterday, training an AI model in place on your data, leveraging the power of Autonomous Database, is incredibly easy. The good news is that taking advantage of those models in your code is just as easy.
With some simple Python scripting you can leverage your data and the AI capabilities inside Autonomous Database to bring AI to your applications. No mathematicians or data scientists required!
While this specific use case is not really suitable for AI, because the decision can be easily handled with a single line of code, you could use this same technique to predict failure from a larger set of variables. Maybe you could capture the external temperature and the fan speed in addition to the refrigerator temperature and predict from there? The possibilities are pretty much endless!

