Signing up for an Autonomous Database is easy. You can have an instance up and running in just a few minutes. And now, you can even have one for FREE.

But one of the first things you’re going to want to do is shove some TABLEs into your schema, or just load some data.

We’re working on making this even easier, but let’s quickly recap what you can already do with our tools.

A Saucerful of “Secrets” – Ways to Load Data to the Cloud with our DB Tools

Taking Advantage of AUTO TABLE and ORDS

If you already have a TABLE in your schema, and you want to create a REST API for accessing said table, we make that easy.

It’s a right-click in SQL Developer (Desktop)

This UI is coming to SQL Developer Web, soon.

Or you could of course just run this very simple PL/SQL block –

<pre lang='plsql'>
BEGIN
    ORDS.ENABLE_OBJECT(p_enabled        => TRUE,
                       p_schema         => 'JEFF',
                       p_object         => 'HOCKEY_STATS',
                       p_object_type    => 'TABLE',
                       p_object_alias   => 'hockey_stats',
                       p_auto_rest_auth => TRUE);
    COMMIT;
END;
/
</pre>
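Once that block has run, the AUTO REST endpoints for the table are live. As a quick sanity check – the hostname and the ‘tjs’ schema alias below are placeholders for my service, yours will differ – an authenticated GET against the table alias returns the rows as JSON:

```shell
# Placeholder ORDS base URL and aliases - substitute your own
URL="https://ABCDEFGHIJK0l-somethingash.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/"

# p_auto_rest_auth => TRUE means the call must be authenticated,
# so pass the REST-enabled database user's credentials
curl --user 'JEFF:YourPasswordHere' "$URL" || echo "request failed (placeholder URL)"
```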

Another quick aside: if you need to catch up on these topics, I’ve talked about creating your application SCHEMA and REST enabling it for SQL Developer Web access.

And, I’ve talked about using the CSV Load feature available with the ORDS AUTO Table mechanism.

My TABLE

I have a HOCKEY_STATS table that I want to load from some CSV data I have on my PC. It’s an 8MB file (35,000 rows, 70 columns).

Now, I could use the Import from CSV feature in SQL Developer (Desktop) to populate the TABLE…

It took approximately 16 seconds to batch load 35,000 records to my Autonomous Database service running in our Ashburn Data Center from Cary, NC – using the MEDIUM Service.

That’s not super quick, but it was super easy.

But what if I need a process that can be automated? And my API du jour is HTTP and REST?

Let’s POST up my CSV to the TABLE API

Let’s find the URI first. Go into your Development Console page for your service – you’ll see we show you what all of your ORDS API calls will start with:

You don’t have to guess or reverse engineer your ORDS calls from the SQL Developer Web or APEX links anymore.

After the ‘/ords/’ I’m going to include my REST Enabled SCHEMA alias, which I have specified as ‘tjs’ in place of ‘JEFF’, and then my TABLE alias, which I’ve just left as ‘hockey_stats’.

So if I want to do a CSV load, I need to HTTPS POST to

<pre lang='text'>
https://ABCDEFGHIJK0l-somethingash.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/batchload?batchRows=1000
</pre>

The ‘/batchload?batchRows=1000’ at the end tells ORDS what we’re doing with the TABLE, and how to do it. This is documented here – and you’ll see there are quite a few options you can tweak.

Before I can exercise the API, I need to assign the ORDS Privilege for the TABLE API to the ‘SQL Developer’ ORDS Role. That will let me authenticate and authorize via my ‘JEFF’ Oracle database user account.

There’s a PL/SQL API for this as well as an interface in APEX.
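Here’s a sketch of the PL/SQL route, using the ORDS package. The privilege name, label, description, and URI pattern below are mine to invent for this example – only the ‘SQL Developer’ role name is the built-in one – so adjust them to your own aliases:

```sql
BEGIN
    -- Create a privilege tied to the built-in 'SQL Developer' role
    -- (privilege name, label, and description are hypothetical)
    ORDS.CREATE_PRIVILEGE(p_name        => 'hockey_stats.priv',
                          p_role_name   => 'SQL Developer',
                          p_label       => 'Hockey Stats TABLE API',
                          p_description => 'Protects the hockey_stats AUTO TABLE API');

    -- Map the privilege to the table alias's URI pattern
    ORDS.CREATE_PRIVILEGE_MAPPING(p_privilege_name => 'hockey_stats.priv',
                                  p_pattern        => '/hockey_stats/*');
    COMMIT;
END;
/
```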

If that sounds ‘icky’, then you can also take advantage of our built-in OAUTH2 Client (example).

Now, let’s make our call. I’m going to use a REST Client (Insomnia) but I could easily just use cURL.
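If you’d rather script it, the same call with cURL looks roughly like this – the URL, credentials, and file name are placeholders carried over from the example above, so swap in your own:

```shell
# Placeholder ORDS endpoint from earlier in the post - yours will differ
BASE="https://ABCDEFGHIJK0l-somethingash.adb.us-ashburn-1.oraclecloudapps.com/ords"
URL="$BASE/tjs/hockey_stats/batchload?batchRows=1000"

# POST the raw CSV as the request body, authenticating as the
# REST-enabled database user
curl --request POST "$URL" \
     --user 'JEFF:YourPasswordHere' \
     --header 'Content-Type: text/csv' \
     --data-binary '@hockey_stats.csv' \
  || echo "request failed (placeholder URL/credentials)"
```

The header row of the CSV is what ORDS uses to match the incoming data to the table’s columns, so keep it in the file.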

Almost 10 seconds… not blazing fast, but again, very easy (no code!), and it’s sending 8MB over HTTP.

I could tweak the batchRows parameter and see if I could get faster loads – I’m sure I could. But the whims of public internet latency and the nature of the data I’m sending up in 16 KB chunks make this a fun ‘it depends’ tech scenario.

thatjeffsmith
Author

I'm a Master Product Manager at Oracle for Oracle SQL Developer. My mission is to help you and your company be more efficient with our database tools.
