Updated Feb 2021

Signing up for an Autonomous Database is easy. You can have an instance up and running in just a few minutes. And now, you can even have one for Free.

But one of the first things you're going to want to do is shove some TABLEs into your schema, or just load some data.

We're working on making this even easier, but let's quickly recap what you can already do with our tools.

A Saucerful of "Secrets" Ways to Load Data to the Cloud with our DB Tools

  • Data Pump to Cloud
  • Cart to Cloud
  • Database Copy to the Cloud
  • Drag and Drop to Cloud
  • SQL Developer Web, CSV, Excel Imports

Taking Advantage of AUTO Table and ORDS

If you already have a TABLE in your schema, and you want to create a REST API for accessing said table, we make that easy.

It's a right-click in SQL Developer Web, or Desktop

In the SQL card in Database Actions (formerly known as SQL Developer Web) –

Go to your SQL worksheet, find your table (or view or PL/SQL program), right-click, and Enable.

Or you could of course just run this very simple PL/SQL block –

BEGIN
    ORDS.ENABLE_OBJECT(
        p_enabled        => TRUE,
        p_schema         => 'JEFF',          -- your schema here (has to be YOUR schema)
        p_object         => 'HOCKEY_STATS',  -- your table here
        p_object_type    => 'TABLE',         -- or VIEW or PLSQL
        p_object_alias   => 'hockey_stats',  -- how the object will be named in the URI
        p_auto_rest_auth => TRUE);           -- PROTECT!
    COMMIT;
END;
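Once that block has run, a quick sanity check is to hit the new endpoint with a GET – AUTOREST returns the first page of rows as JSON. A minimal sketch, assuming a hypothetical instance URL and that you've sorted out authentication (more on protecting the endpoint below):

# GET the first page of rows back as JSON (hypothetical instance URL and credentials)
curl --user JEFF:<password> \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/"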

Another quick aside: if you need to catch up on these topics, I've talked about creating your application SCHEMA and REST enabling it for SQL Developer Web access.

And I've talked about using the CSV Load feature available with the ORDS AUTO Table mechanism.

My Table

I have a HOCKEY_STATS table that I want to load from some CSV data I have on my PC. It's an 8MB file (35,000 rows, 70 columns).

Now, I could use the Import from CSV feature in SQL Developer (Desktop) to populate the TABLE…

It took approximately 16 seconds to batch load the 35,000 records to my Autonomous Database service running in our Ashburn Data Center from Cary, NC – using the MEDIUM service.

That's not super quick, but it was super easy.

And yes, you can do all of this in your browser as well!

But what if I need a process that can be automated? And my API du jour is HTTPS and REST?

Let's POST up my CSV to the TABLE API

Let's find the URI first. Go to your REST Workshop page in Database Actions.

Click on the REST Card

From there we'll see what we have available for RESTful Web Services in our schema.

Those cards up top are links; we want to go to AUTOREST.

AUTOREST

I can immediately see the tables and views I want to work with, and I can filter those, but I only have two at the moment. If I click on the button in the corner of the card, I can ask for the cURL information for all of the CRUD REST API endpoints – including BATCH LOAD.

Copy, paste, and go (almost!)

I'm going to take this string and fix it up a bit…

https://ABCDEFGHIJK0l-somethingash.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/batchload?batchRows=1000

The '/batchload?batchRows=1000' at the end tells ORDS what we're doing with the TABLE, and how to do it. This is documented here – and you'll see there's quite a few options you can tweak.
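Wiring that URI into an actual call is straightforward. Here's a sketch (hypothetical instance URL, credentials, and file name) – the CSV goes up as the raw request body, and ORDS should respond with a short summary of how many rows were processed:

# POST the CSV file as the request body to the table's batchload endpoint
curl -X POST \
  -H "Content-Type: text/csv" \
  --data-binary "@hockey_stats.csv" \
  --user <user>:<password> \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/batchload?batchRows=1000"

The --user flag is the database username/password route covered next; you'd only drop it if you left the endpoint unprotected.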

If you want to protect these endpoints (and you almost always will), you'll need to assign the table privilege to the 'SQL Developer' role – then you can use your database username/password with the cURL command.

That's done on the Security page of the REST Workshop.

The priv for the REST enabled table will be pretty obvious; you'll see the schema.tablename.

If using database credentials in your REST calls sounds 'icky', then you can also take advantage of our built-in OAuth2 client (example).

There's a full OAuth2 UI in the REST Workshop as well –

Using an OAuth(2!) client to access a protected REST endpoint is pretty easy.
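A sketch of that flow, assuming a hypothetical client_id/client_secret created in the REST Workshop and granted a role that carries the table's privilege: trade the client credentials for a bearer token at the schema's oauth/token endpoint, then send the token instead of a database login.

# 1) request a bearer token using the client_credentials grant (hypothetical client id/secret)
curl --user <client_id>:<client_secret> \
  --data "grant_type=client_credentials" \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/oauth/token"

# 2) use the access_token from that JSON response against the protected endpoint
curl -X POST -H "Content-Type: text/csv" \
  -H "Authorization: Bearer <access_token>" \
  --data-binary "@hockey_stats.csv" \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/tjs/hockey_stats/batchload?batchRows=1000"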

Now, let's make our call. I'm going to use a REST client (Insomnia), but I could easily just use cURL.

Almost 10 seconds… not blazing fast, but again, very easy (no code!), and it's sending 8MB over HTTP…

I could tweak the batchRows parameter and see if I could get faster loads – I'm sure I could. But the whims of public internet latency and the nature of the data I'm sending up in 16 KB chunks will make this a fun 'it depends' tech scenario.

30 September 2020 Update – 5M Rows, ~27 sec!

I decided to throw a slightly larger scenario at ORDS in Autonomous.

133 MB of CSV – 5,000,000 rows, including the one header row

And our data…

Live in Autonomous ~25 seconds after hitting POST in my local REST client

Note this scenario was also from Cary, NC to our Cloud Data Center in Virginia… and of course your results may vary based on current network loads.

I also increased the batchRows setting to 5,000 from 1,000. I tried 7,500 and 10,000 as well, but didn't see any additional performance improvements – though this is NOT a scientific test.

Some more notes on settings (Docs).

batchRows – The number of rows to include in each batch – so we're inserting/committing 5,000 rows at a time, for a total of 1,000 batches.

We could include additional settings…

errors and errorsMax in particular. You may want to set errorsMax to a reasonable number. In other words, do you want to give up on the load if, say, more than 10% of the rows fail to be inserted? For debugging, I would suggest setting it to 1 or say 10 so your test 'fails fast' and doesn't consume unnecessary server resources.
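For example, a fail-fast debugging run could just tack it onto the same (hypothetical) batchload URL used above:

.../ords/tjs/hockey_stats/batchload?batchRows=1000&errorsMax=10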

And of course, results may vary, but I got this to run as 'fast' as 25 seconds and as slow as 29 seconds.

Tried one more time before hitting the 'Update/Publish' button 🙂

I ran this scenario 10x each, and also for 10M and 20M rows of this same data. I found that running the loads as 5M row batches outperformed the larger ones. That is, I could kick off two 5M load requests and get them done MUCH faster than a single 10M load.

For example, running 10M rows, 10 times, my average load time was 90 seconds, vs 30 seconds for the 5M loads.

When I kicked off two 5M runs concurrently, both were finished in less than 27.4 seconds!
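A rough sketch of that concurrent approach from the shell (assuming a hypothetical instance URL and that the 10M-row file has already been split into two 5M-row halves, each with its own header row):

# fire off both 5M-row loads in parallel, then wait for both to finish
curl -X POST -H "Content-Type: text/csv" --data-binary "@5M_part1.csv" \
  --user <user>:<password> \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/<user>/huge_csv/batchload?batchRows=5000" &
curl -X POST -H "Content-Type: text/csv" --data-binary "@5M_part2.csv" \
  --user <user>:<password> \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/<user>/huge_csv/batchload?batchRows=5000" &
wait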

Mostly scientific, mostly.

The cURL

I hate cURL. It took me more than a few minutes, and I ended up having to Zoom with @krisrice to finally get this JUST right…

curl --write-out '%{time_total}' -X POST --data-binary "@5M.csv" -H "Content-Type:text/csv" --user <user>:<password> "https://....adb.us-ashburn-1.oraclecloudapps.com/ords/<user>/huge_csv/batchload?batchRows=5000&errorsMax=20"

Don't forget the Content-Type header! I was leaving it off and weird things were happening. I also had to hop over to Stack Overflow to figure out how to stream in the contents of the file, and along the way learned a cool trick for pulling the HTTP response time in the call. I'm sharing this for ME, because I'll forget it tomorrow.
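And since --write-out is the part I'll forget first: it can report more than the total time. A sketch (same hypothetical placeholders as above) showing a few of the timing variables cURL exposes:

# print connect time, time to first byte, and total time for the load
curl --write-out 'connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
  -X POST -H "Content-Type: text/csv" --data-binary "@5M.csv" \
  --user <user>:<password> \
  "https://<your-instance>.adb.us-ashburn-1.oraclecloudapps.com/ords/<user>/huge_csv/batchload?batchRows=5000&errorsMax=20"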