Disclaimer: I have a hard time ‘unplugging’ from work. This post isn’t a humble-brag.
I’m frequently asked what AI is ‘good for’ when it comes to databases. And since we have an MCP Server for Oracle AI Database, I have some pretty good ideas on that.
Yesterday I heard this on SQL optimization –

And yes, it can be really good at SQL-type tasks. But it’s more than just making ‘SQL faster’; it’s also pretty good at recognizing patterns and applying that to Oracle SQL syntax to answer questions and perform tasks. Think… data quality, data analysis, and basic ETL work.
Another customer shared this anecdote with me –
Using AI with [sic] MCP for data validation when migrating systems is incredibly fast. Rather than having to write validation SQL after migrating data – we just ask Claude “show me this result” – it goes off and works out the SQL and runs the report. Doesn’t sound like much but it’s a huge time and accuracy win.
Let’s do some of that now, with my Slack data
Did you know you can download your Slack personal data? It’s CSV and looks a bit like this –
Date Accessed,User Agent - Simple,User Agent - Full,IP Address,Number of Logins,Last Date Accessed
Fri Jan 09 2026 10:00:25 GMT-0500 (Eastern Standard Time),Slack Web App,"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/143.0.0.0 Safari/537.36",136.61.72.197,1,Fri Jan 09 2026 10:00:25 GMT-0500 (Eastern Standard Time)
Tue Jun 10 2025 10:17:23 GMT-0400 (Eastern Daylight Time),Slack Web App,"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36",136.61.72.197,1,Tue Jun 10 2025 10:17:23 GMT-0400 (Eastern Daylight Time)
Mon Jun...
My data dump was about 3MB, or 11,374 lines of activity. Unfortunately it kinda sorta stops after 2023, so I’m not sure if that’s a glitch with Slack, or they stopped recording data, or what. But let’s analyze the data we do have.
Loading it into Oracle
I didn’t need help with this, per se. I was able to use SQL Developer to do it. I loaded everything up as plain strings rather than working out the dates.
But once it was loaded, I did ask Claude to do that for me.
Me to Claude:
in my Raptor database, I have a table tracking my slack activity logs
In that table I have two columns storing dates, but as strings (varchar2), vs actual oracle date data type
help me refactor that table to a new version of it using a create table as select strategy and use TO_DATE function to transform the two columns
Sample string which needs converted to an oracle date is
Mon Dec 11 2023 19:04:45 GMT-0500 (Eastern Standard Time)
Claude, via our MCP Server, did this just fine, with a simple CTAS and using two TO_DATE() transform functions, just as I asked.
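For the curious, here’s roughly the shape of that statement. This is a minimal sketch only: the table and column names are my guesses (assuming SQL Developer kept the column names close to the CSV headers), not the exact SQL Claude generated.

CREATE TABLE slack_activity_logs_v2 AS
SELECT user_agent_simple,
       user_agent_full,
       ip_address,
       number_of_logins,
       -- drop the trailing "GMT-0500 (Eastern Standard Time)" text and
       -- convert the first 24 characters, e.g. "Mon Dec 11 2023 19:04:45"
       TO_DATE(SUBSTR(date_accessed, 1, 24),
               'Dy Mon DD YYYY HH24:MI:SS',
               'NLS_DATE_LANGUAGE = AMERICAN') AS date_accessed,
       TO_DATE(SUBSTR(last_date_accessed, 1, 24),
               'Dy Mon DD YYYY HH24:MI:SS',
               'NLS_DATE_LANGUAGE = AMERICAN') AS last_date_accessed
  FROM slack_activity_logs;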

Something I really appreciate is Claude’s (and many other agents’) reliability when it comes to ‘checking their work.’
Like, did all my data come through? It did a rowcount on both versions of the table and sampled/compared the data to ensure we were still good.
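The check itself is nothing fancy. Something along these lines, using the same assumed table names as above:

SELECT (SELECT COUNT(*) FROM slack_activity_logs)    AS original_rows,
       (SELECT COUNT(*) FROM slack_activity_logs_v2) AS refactored_rows
  FROM dual;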
Instead of ending there, I asked it to keep going.
Help me analyze my work patterns. I primarily do my work from my two desktop machines, my personal mac min and my work supplied windows laptop. In addition i have a mobile device I work from quite a bit.
Over the past year, what did my activity on slack look like?
Note this was before I realized that basically all of 2024 and 2025 were missing from my data export from Slack.
So, what did it do?
- queried to get the date range of data in the table via min/max
- got a list of distinct device names/labels
- generated the report
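Those first two steps are quick exploration queries, roughly like these (assumed names again, not the literal SQL it ran):

SELECT MIN(date_accessed) AS earliest,
       MAX(date_accessed) AS latest
  FROM slack_activity_logs_v2;

SELECT DISTINCT user_agent_simple AS device
  FROM slack_activity_logs_v2
 ORDER BY device;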
I also asked it to apply the known US federal holiday schedule and to account for daily login/activity streaks, for additional commentary.
now apply us federal holiday calendar to my activity and let’s put together some streaks, how many days did i login consecutively and how many holidays did I end up spending time doing work
It then proceeded to:
- take the known holiday dates and build them up as a CTE that could be compared against the Slack data (a rough sketch follows this list)
- apply the ‘gaps and islands’ concept to our SQL to find the consecutive-day streaks of activity
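The holiday CTE is just a hand-built list of dates. Here’s a trimmed-down sketch; the real thing covered every federal holiday in the range, and the table and column names are my assumptions rather than Claude’s exact output:

WITH holidays (holiday_date, holiday_name) AS (
  SELECT DATE '2023-01-16', 'MLK Day'   FROM dual UNION ALL
  SELECT DATE '2023-09-04', 'Labor Day' FROM dual
  -- ...one row per US federal holiday in the data's date range
)
SELECT h.holiday_name,
       COUNT(*) AS slack_checks
  FROM holidays h
  JOIN slack_activity_logs_v2 a
    ON TRUNC(a.date_accessed) = h.holiday_date
 GROUP BY h.holiday_name
 ORDER BY slack_checks DESC;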
For the consecutive days portion, this is where I would generally go to google/stackoverflow/SQLDaily (thanks Chris!) for tips on how to do it. It’s not overly complicated, but it’s not something I do on a regular basis.
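If the technique is new to you, the general shape looks like this: within a run of consecutive days, the day minus its row number stays constant, so grouping on that difference finds the islands. Again, a sketch with assumed names, not the query from my session:

WITH activity_days AS (
  SELECT DISTINCT TRUNC(date_accessed) AS activity_day
    FROM slack_activity_logs_v2
),
grouped AS (
  SELECT activity_day,
         -- consecutive days share the same (day - row_number) value
         activity_day - ROW_NUMBER() OVER (ORDER BY activity_day) AS island_key
    FROM activity_days
)
SELECT MIN(activity_day) AS streak_start,
       MAX(activity_day) AS streak_end,
       COUNT(*)          AS streak_days
  FROM grouped
 GROUP BY island_key
 ORDER BY streak_days DESC
 FETCH FIRST 10 ROWS ONLY;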
And when it was done, it generated it all as HTML that I could embed in this blog post.
The Findings
My Slack Activity Deep Dive
Device Usage Distribution (2021-2023 Active Period)
Epic Login Streaks
Holiday Work Analysis
Most Active Holidays (Average Checks)
- Labor Day: 14.3 checks
- Columbus Day: 11.0 checks
- Presidents Day: 9.7 checks
- MLK Day: 9.0 checks
- Veterans Day: 6.0 checks
Key Insights
Fun postscript exercise: how good do you think you are, Claude?
I asked it for a summary of its technical work and an estimate of how much time it saved me. The only problem it had was assuming it knew the literal values for my device names; it then realized it needed to query a distinct list.
The other problem was mine. I saw it generating holiday records and time ranges prior to 2025, and this was before I realized I had effectively no data beyond 2023.
I think it’s being a bit generous in its appraisal of time saved. Either that, or it thinks I’m not very good at what I do. I got the MOST help from the HTML generation; beyond that, it saved me probably 20 minutes on the LISTAGG function and another 10 on the date refactoring. I would have wasted 5 minutes trying to account for the timezone text versus just nuking it altogether via SUBSTR().
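To give a sense of what that LISTAGG work looks like, here’s a generic sketch against this data set, rolling each day’s devices into a single row. Same assumed names as earlier, and not necessarily how Claude actually used it:

SELECT TRUNC(date_accessed) AS activity_day,
       -- LISTAGG with DISTINCT needs Oracle 19c or later
       LISTAGG(DISTINCT user_agent_simple, ', ')
         WITHIN GROUP (ORDER BY user_agent_simple) AS devices_used
  FROM slack_activity_logs_v2
 GROUP BY TRUNC(date_accessed)
 ORDER BY activity_day;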
SQL Query Performance:
Total SQL Queries Generated: 28
Successful on First Attempt: 26
Needed Correction: 2
- 1 rejected by you (missing 2025 holiday dates – my oversight)
- 1 had minor display issues (device names showing incorrectly)
Success Rate: 93%
Time Saved Estimate:
Based on the complexity of tasks performed:
- Data exploration & schema analysis: 30-45 minutes
- Date conversion logic & testing: 45-60 minutes
- Complex streak detection algorithm: 60-90 minutes
- Holiday calendar creation & joins: 45-60 minutes
- Multi-dimensional device analysis: 30-45 minutes
- HTML/CSS report with visualizations: 90-120 minutes
- Testing, debugging & iterations: 60-90 minutes
Total Time Saved: 6-8 hours
Key Technical Achievements:
- Used advanced SQL techniques (CTEs, window functions, LISTAGG, date arithmetic)
- Handled complex date format conversion from JavaScript-style strings
- Created gap-and-island solution for streak detection
- Built responsive, WordPress-compatible HTML with no external dependencies
- Maintained data integrity throughout (11,373 records preserved)