Hot off the presses as of THIS morning –

We now have a lot of new skills available for working on all kinds of tasks around our Oracle databases. The categories include –

| Category | Files | Path |
|---|---|---|
| Database Design & Modeling | 4 | skills/design/ |
| SQL Development | 5 | skills/sql-dev/ |
| Performance & Tuning | 7 | skills/performance/ |
| Application Development | 8 | skills/appdev/ |
| Security | 6 | skills/security/ |
| Administration | 6 | skills/admin/ |
| Monitoring & Diagnostics | 5 | skills/monitoring/ |
| Architecture & Infrastructure | 5 | skills/architecture/ |
| DevOps & CI/CD | 5 | skills/devops/ |
| Migrations to Oracle | 14 | skills/migrations/ |
| PL/SQL Development | 12 | skills/plsql/ |
| Oracle-Specific Features | 6 | skills/features/ |
| SQLcl | 8 | skills/sqlcl/ |
| ORDS (Oracle REST Data Services) | 10 | skills/ords/ |

But wait, what are these things? I briefly talked about AI Skills in my last post, and I showed an example of one I built, for loading data.

Let’s peruse one or two of these –

If I ask my agent for help tuning my db – say, for the worst performing queries over the past day – it doesn’t need to consume tokens or time to come up with a query. It can simply take advantage of the query in this skill, and then execute it on the db via our MCP Tools.
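To make that concrete, here’s a sketch of the kind of canned query such a skill might carry – top statements by total elapsed time from the cursor cache. This is my own minimal illustration against V$SQLSTATS, not the skill’s actual (richer) query:

```sql
-- Minimal illustration: top 10 statements by total elapsed time.
-- A real monitoring skill would go further (AWR history, composite scoring, etc.).
SELECT sql_id,
       executions,
       ROUND(elapsed_time / 1e6, 1)                         AS elapsed_s,
       ROUND(elapsed_time / NULLIF(executions, 0) / 1e6, 3) AS avg_s
FROM   v$sqlstats
ORDER  BY elapsed_time DESC
FETCH FIRST 10 ROWS ONLY;
```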

Let’s play!

I’m lazy, so I asked my agent to sort these for me –

My buddy just released an awesome set of skills for all things oracle database at https://github.com/krisrice/oracle-db-skills

Can you help me import those as Skills here?

And it quite happily did the task.

Or it generated some code that I was able to ‘click’ into to make it so.

Let’s find the worst performing queries over the past 24 hrs in my Raptor database, and let’s be sure to use our new monitoring and top sql queries skill to do that!

From here, my Agent asks –

  • for permission to use my list-connections MCP Tool
  • via mcp, … to connect to my Raptor database
  • … to run several queries it found in the Skill

I confirm it’s not generating these queries on its own:

CONFIRMED

And then it runs a few more, and comes up with the answer and report, which looks like this:

Here’s the full rundown of what the Raptor database has been up to over the past 24 hours:


🔴 #1 Concern: dt26ufwkgd2qr – The Wait Monster

This SYS INSERT into UMF$_ATSK_HIST is by far the most alarming finding:

  • 1,168 executions, but avg 402 seconds per execution – nearly 7 minutes per call
  • 470,297 total elapsed seconds, with 100% wait time (essentially 0 CPU)
  • Something is blocking these task history inserts hard – locking, I/O contention, or a severely misconfigured segment

🟠 Top Impact Score Leaders (Overall Worst)

| SQL ID | Schema | What It Is | Elapsed | Impact |
|---|---|---|---|---|
| b39m8n96gxk7c | SYS | dbms_autotask_prvt.run_autotask | 110s | 51.68 |
| ampw9ddqufjd3 | SYS | Auto Index capture_sts | 105s | 46.75 |
| dt26ufwkgd2qr | SYS | UMF task history insert | 470,297s | 41.09 |
| 0qsph1wwqkgwg | APEX | BG Coordinator Job | 135s | 37.9 |

The autotask and auto-index jobs are Oracle’s background maintenance – they’re heavy but expected. The APEX coordinator at 5,239 executions is also worth keeping an eye on.


🟡 Logical I/O Hogs (Single Executions, Massive Buffer Gets)

These are one-time DBMS_FEATURE_* calls (feature usage tracking at startup), but they’re reading nearly 1 million buffer blocks each:

  • 6squbp467hs3k – DBMS_FEATURE_DBC – 1,056,636 buffer gets
  • 7qhrg6s7taf2r – DBMS_FEATURE_PARTITION_USER – 922,529 buffer gets
  • cqdxz10c2zsah – related partition query – 922,225 buffer gets

These fire once on startup/maintenance cycles, so not a runaway problem, but worth noting if you see them recurring.


🟢 High Frequency but Well-Behaved

The vector index lookups (0npm6czzaj44m at 482K executions) and APEX platform prefs (3y1cmrwb8r53k at 204K executions) are running at sub-millisecond averages – healthy.


Bottom line: The dt26ufwkgd2qr INSERT wait issue is the one that deserves immediate investigation. Want me to drill into its execution plan and check for blocking sessions or segment contention?
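If I took the agent up on that offer, the drill-down would presumably start with ASH data for that SQL_ID. A rough sketch of my own (ASH queries require the Diagnostics Pack license, and V$ACTIVE_SESSION_HISTORY only holds recent in-memory samples – DBA_HIST_ACTIVE_SESS_HISTORY covers the full 24 hours):

```sql
-- Where is dt26ufwkgd2qr spending its time? Top wait events from ASH samples.
SELECT NVL(event, 'ON CPU') AS event,
       COUNT(*)             AS samples
FROM   v$active_session_history
WHERE  sql_id = 'dt26ufwkgd2qr'
AND    sample_time > SYSTIMESTAMP - INTERVAL '24' HOUR
GROUP  BY NVL(event, 'ON CPU')
ORDER  BY samples DESC;
```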


My local DB isn’t doing anything, so it’s expected that the ‘worst’ offenders are background maintenance activities.

Let’s try a different one!

I’m concerned about database security in my Raptor database. Can you have a look at and advise what accounts have dangerously high role and privileges granted to them? 

This time I didn’t call out specifically to ‘use my skill’ – it’s smart enough to do that on its own.

Ok, so it was going pretty well until it got to its 2nd query, then it took a wrong turn in Albuquerque.

Me: Who is Miguel, and why are you filtering out HR and MIGUEL?

The good news is that my agent figured out it was an idiot on its own and went back to relying on the ORACLE_MAINTAINED property.

After it got back on track, things went as expected. It correctly identified that I use HR as my local ‘god’ account, and that’s not a good idea.
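For context, the ORACLE_MAINTAINED check it fell back to looks roughly like this – a minimal sketch of my own, not the skill’s actual query:

```sql
-- Who holds DBA, excluding Oracle-supplied internal accounts?
SELECT rp.grantee,
       rp.admin_option
FROM   dba_role_privs rp
JOIN   dba_users      u ON u.username = rp.grantee
WHERE  rp.granted_role     = 'DBA'
AND    u.oracle_maintained = 'N';  -- skip Oracle-maintained accounts
```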


🔴 Critical Findings

1. MIGUEL – DBA with ADMIN OPTION (most dangerous)

MIGUEL has the DBA role granted with ADMIN OPTION, meaning they can re-grant DBA to anyone else – creating uncontrolled privilege spread. They also have ALTER ANY ASSEMBLY and UNLIMITED TABLESPACE both with ADMIN OPTION. The account is currently EXPIRED, which limits immediate risk, but if reactivated this is your highest-severity issue.

Action: Revoke DBA and review whether this account is still needed.


2. HR – DBA role on an OPEN, actively-used account

The HR account holds the DBA role and was logged in today. This is an application schema owner – it should own objects, not hold near-total database control. This is a textbook violation of the schema-owner/app-user separation principle.

Action: Revoke DBA from HR immediately. The HR schema owner should ideally be locked (ACCOUNT LOCK) and objects accessed via a dedicated, least-privilege app account.
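For reference, those two actions are one-liners – just make sure a dedicated app account is in place before locking the schema owner:

```sql
-- Remove near-total control from the application schema owner,
-- then lock the account so apps must use a least-privilege login.
REVOKE DBA FROM hr;
ALTER USER hr ACCOUNT LOCK;
```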


3. HRREST – Massively over-privileged REST service account (LOCKED, but alarming)

Despite being locked, HRREST holds a frightening collection of ANY privileges:

  • SELECT ANY TABLE – can read every table in the database
  • DROP ANY TABLE – can destroy any schema’s data
  • CREATE ANY TABLE / ALTER ANY TABLE – can modify any schema’s structure
  • CREATE ANY DIRECTORY – can access OS filesystem paths
  • ALTER USER – can modify any user’s settings

For a REST service account, this is catastrophically over-privileged. Being locked is a band-aid, not a fix.

Action: Revoke all ANY privileges and ALTER USER. Grant only the specific object privileges on HR.* tables that ORDS actually needs.
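That right-sizing might look like this – the REVOKE matches the privileges listed above, while the GRANTs are hypothetical (you’d enumerate the actual tables your ORDS modules touch):

```sql
-- Strip the dangerous ANY privileges and ALTER USER from the REST account.
REVOKE SELECT ANY TABLE, DROP ANY TABLE, CREATE ANY TABLE,
       ALTER ANY TABLE, CREATE ANY DIRECTORY, ALTER USER
FROM   hrrest;

-- Re-grant only what the REST services actually need (table name illustrative).
GRANT SELECT, INSERT, UPDATE ON hr.employees TO hrrest;
```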


🟡 Moderate Findings

4. AV β€” DBA role on a locked account

Less urgent while locked, but a locked DBA account lying dormant is a risk if it gets unlocked. No record of recent use.

Action: Determine if this account is still needed. If not, drop it. If yes, revoke DBA.


5. WIND – DB_DEVELOPER_ROLE + RESOURCE on an open account

DB_DEVELOPER_ROLE (23ai) is a broad developer role. Combined with RESOURCE, this account can create tables, procedures, sequences, triggers, and more across the schema. Review whether WIND is a person or service account and right-size accordingly.


🟠 Systemic Issue: Everyone is on the DEFAULT profile

Every custom account uses Oracle’s DEFAULT profile, which has no password expiry, no failed login lockout, and no idle timeout. You should create a hardened application profile:

```sql
CREATE PROFILE app_profile LIMIT
  FAILED_LOGIN_ATTEMPTS  5
  PASSWORD_LOCK_TIME     1/24
  PASSWORD_LIFE_TIME     180
  IDLE_TIME              30;
```

Then assign it to all active accounts.
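Assigning it is one ALTER USER per account – for the open accounts surfaced in this audit, something like:

```sql
-- Attach the hardened profile to each active custom account.
ALTER USER hr   PROFILE app_profile;
ALTER USER wind PROFILE app_profile;
```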


Summary Table

| Account | Risk | Status | Key Issue |
|---|---|---|---|
| MIGUEL | 🔴 Critical | Expired | DBA + ADMIN OPTION – can re-grant DBA |
| HR | 🔴 Critical | Open/Active | DBA on schema owner, logged in today |
| HRREST | 🔴 Critical | Locked | ANY privs on REST account |
| AV | 🟡 Moderate | Locked | DBA role, purpose unclear |
| WIND | 🟡 Moderate | Open | Over-broad developer roles |
| All accounts | 🟠 Low-Med | – | DEFAULT profile, no password policy |

The most urgent action is revoking DBA from HR since it’s open and actively connecting.

Ok, but so what? I could do this already without these skills.

Yes, but it may have taken longer – more tool calls, more back-and-forth, and more tokens – to get to the same result. And if we did it again and again, slightly different paths might be explored.

For giggles, I asked my agent to guesstimate the delta.

For our previous two tasks, can you estimate how much faster or fewer tokens we consumed because of the availability of the skills employed vs if you would have had to do research and generate the SQL on your own?

And it came to this conclusion –

Great question – the two tasks were the security audit and the top SQL / worst performing queries work. Here’s my honest estimate of the efficiency gains:


Task 1: Security Audit (oracle-security skill)

What the skill provided: Ready-made query patterns for DBA role detection, dangerous ANY privileges, oracle_maintained filtering logic, and the DEFAULT profile risk framework.

Without the skill: I would have needed to reason from scratch about which DBA_* views to query, what constitutes a dangerous privilege grant, how to correctly exclude Oracle-internal accounts (the oracle_maintained column is non-obvious), and what a hardened profile looks like. I also would have likely needed a web search pass against Oracle docs.

Estimated savings:

  • Tokens: ~40–60% fewer input tokens (no need to fetch/reason through Oracle security docs)
  • Speed: Likely 1–2 fewer tool call roundtrips, saving maybe 30–45 seconds of wall time
  • Quality: The oracle_maintained = 'Y' filter and the profile hardening DDL came directly from the skill – I might have gotten there eventually but could have used a less precise exclusion pattern

Task 2: Worst Performing Queries (oracle-monitoring skill)

What the skill provided: The composite impact score formula (weighted 40/30/20/10 across elapsed, CPU, buffer gets, disk reads) – that’s the most valuable piece. Without it, I’d have had to either invent a scoring heuristic or run multiple separate queries ranked by different metrics.

Without the skill: I would have run simpler ORDER BY elapsed_time DESC queries, possibly missed the per-execution cost angle entirely, and almost certainly not produced the weighted composite score in one pass.

Estimated savings:

  • Tokens: ~30–50% fewer – the composite score query is dense and required domain knowledge to construct
  • Speed: The single impact-score query replaced what would have been 3–4 separate queries (by elapsed, by CPU, by buffer gets, by disk reads) to get the same picture
  • Quality uplift: Probably the bigger win here – the output was more analytically complete than what I’d likely have produced from scratch
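I don’t have the skill’s exact formula, but a weighted composite along those lines could be sketched like this – the 40/30/20/10 weights come from the agent’s description, while the max-normalization is my own assumption:

```sql
-- Composite impact score: each metric normalized to its maximum, then weighted.
SELECT sql_id,
       ROUND(100 * ( 0.40 * elapsed_time / GREATEST(MAX(elapsed_time) OVER (), 1)
                   + 0.30 * cpu_time     / GREATEST(MAX(cpu_time)     OVER (), 1)
                   + 0.20 * buffer_gets  / GREATEST(MAX(buffer_gets)  OVER (), 1)
                   + 0.10 * disk_reads   / GREATEST(MAX(disk_reads)   OVER (), 1)
                   ), 2) AS impact_score
FROM   v$sqlstats
ORDER  BY impact_score DESC
FETCH FIRST 10 ROWS ONLY;
```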

Overall

Across both tasks, rough ballpark: 35–55% token reduction and 2–4 fewer tool call roundtrips compared to a cold start with no skills. The quality benefit arguably exceeded the token savings – particularly the composite scoring formula and the oracle_maintained filtering, which are the kinds of precise Oracle-specific patterns that would have taken extra research iterations to land on correctly.

Questions from our webinar this morning

We discussed this new repo in our Oracle Database and OCI March Newscast, and we got LOTS of questions. Yes, the recording will be online, soon! Anyways, I think some of these might be interesting to share here.

In no specific order other than chronological…

  1. Is this compatible/easy-to-setup with github copilot agent mode ? I have my copilot connected to oracle db with sql developer extension and oracle mcp integration

    I’ve not played with Skills in Github Copilot, YET. But Copilot Studio does support this (Docs).
  2. “awesome skills – many more than I started to create 🙂
    Will those be maintained?
    Always for the latest DB version?”


    Yes, we plan to maintain these, add more, and yes they should be good for all supported versions of Oracle AI Database (19c, 23ai, 26ai).
  3. Do you suggest we make all these skills in the repo available to an agent or would we just pick the ones that seem relevant to our context?

    Kris suggested maybe not – you may want to load up a specific set of Skills for a task so as not to worry about consuming memory or context with irrelevant information. But I think the Agent I was using is pretty good about keeping the Skills in the background until it needs to pull one up for details. This is an area where I am actively learning for sure!
  4. “Hi Kris,
    The skills that were published are meant for MCP Server ? if not, can you please give me one or more example, how i can consume them ?”


    There’s a section of the Skills that deals directly with our MCP Server, but really the value here is that the Agent can take the knowledge and recipes from the Skills and implement them against an actual database.

    If you had a local git project with your oracle sql and plsql code, data model, and security rules in files, it could also look there, no MCP required, at least not a database mcp server.

Author

I'm a Distinguished Product Manager at Oracle. My mission is to help you and your company be more efficient with our database tools.
