
[26ai] DBMS_CLOUD — Chapter One

Published • 5 min read

If you prefer to read in Spanish, see the Spanish version.

Today, we're going to talk about DBMS_CLOUD on 26ai and how we use it in our daily tasks, such as saving Data Pump backups or generating reports, because this package connects to OCI Object Storage.

By default, DBMS_CLOUD isn't set up on 26ai; we need to configure it with a valid user from our tenancy.

Before getting our hands dirty, we must be absolutely sure that DBMS_CLOUD is working in our database.

Just in case, below you can see Oracle's note on how to set it up in our database:

How To Setup And Use DBMS_CLOUD Package (Doc ID 2748362.1)

After that, we need to configure a token or an API Key for our OCI account. This step is necessary because our Bucket is private, so we need to create a credential. If our Bucket were public, this step would not be required.
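As a side note, if you prefer an API signing key over an auth token, DBMS_CLOUD.CREATE_CREDENTIAL also has an overload that takes the key directly. A minimal sketch follows; every OCID, the key body, and the fingerprint are placeholders, not values from this article:

```sql
-- Sketch: credential based on an OCI API signing key instead of an auth token.
-- All values below are placeholders; take them from your own tenancy and key pair.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DBA_IN_THE_SHADOW_API',
    user_ocid       => 'ocid1.user.oc1..<unique_id>',
    tenancy_ocid    => 'ocid1.tenancy.oc1..<unique_id>',
    private_key     => '<private_key_PEM_body>',
    fingerprint     => '<key_fingerprint>');
END;
/
```

In this article, though, we'll stick with the auth-token route described below.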

Token

First of all, we're going to generate a token for our OCI account.

Let's go to the top right of the console and click My profile:

Click the Tokens and keys tab:

Let's look at the three sections: API Keys, Auth Tokens and Customer Secret Keys.

Our option is Auth Tokens:

Here, click Generate token. We just need to fill in a name and click Generate token:

We mustn't close the next window before copying the token value.

After completing the previous step, let's define the credential in our database using the token we just created.

Creation of the Credential

Let's use DBMS_CLOUD.CREATE_CREDENTIAL.

This procedure receives the following parameters:

  • CREDENTIAL_NAME: Name of the credential.

  • USERNAME: <IdentityDomain>/<username_in_the_tenancy>.

  • PASSWORD: Token value that we generated in the previous step.

SQL> BEGIN
  dbms_cloud.create_credential(
    credential_name => 'DBA_IN_THE_SHADOW',
    username        => '<IdentityDomain>/<usuario_tenancy>',
    password        => '<token>');
END;
/

SQL> R
  1  select credential_name,
  2         username,
  3         enabled
  4  from   DBA_credentials
  5* order by credential_name

CREDENTIAL_NAME           USERNAME                                           ENABLED
------------------------- -------------------------------------------------- --------------------
DBA_IN_THE_SHADOW         <IdentityDomain>/<usuario_tenancy>       TRUE

Let's check our credential using DBMS_CLOUD.LIST_OBJECTS. With this function, we can list the contents of the Bucket, which lets us verify the connectivity with OCI.

This function needs several parameters, such as the name of the credential and the URL used to access our Bucket.

How do we get the URL of our Bucket? The easiest way is to access an object in our Bucket and then go to the details:

Here, we copy the "URL path (URI)".
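If you prefer to construct the URI by hand, the native Object Storage URL usually follows a predictable pattern; the region, namespace, bucket and object names below are placeholders, not values from this article:

```
https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket_name>/o/<object_name>
```

For bucket-level calls such as LIST_OBJECTS, everything up to and including /o/ is the part you need.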

After that, let's check the contents of our Bucket using SQL:

SQL> 
Select object_name
  From Dbms_cloud.list_objects('DBA_IN_THE_SHADOW','<location_uri>'); 

OBJECT_NAME
--------------------------------------------------------------------------------
XXXXXXXX.zip
XXXXXXXX.zip
XXXXXXXX.zip

Perfect, the access was successful.

Uploading files

In the next step, we're going to create a CSV file and then upload it to our Bucket.

Let's use DBMS_CLOUD.PUT_OBJECT in order to upload it. Before using it, we need to create a directory object in Oracle.

SQL> ! cat generate.sql
set heading off
set feedback off
set echo off
set verify off
set termout off
set markup csv on
spool myfile.csv
Select trunc(dbms_random.value(1, 1000)), dbms_random.string('x', 10)     
 From dual  
connect by level <= 10000;
spool off

SQL> @generate.sql
SQL> ! ls -lac *csv
-rw-r--r-- 1 oracle oinstall 168924 Feb 18 12:42 myfile.csv

SQL> CREATE DIRECTORY OCI_HOME as '/home/oracle';
SQL> GRANT READ, WRITE ON DIRECTORY OCI_HOME TO PUBLIC;
SQL> begin
  dbms_cloud.put_object (
    credential_name => 'DBA_IN_THE_SHADOW',
    object_uri      => '<location_uri>/myfile.csv',
    directory_name  => 'OCI_HOME',
    file_name       => 'myfile.csv');
end;
/ 
SQL>
SQL> Select object_name
  From 
   dbms_cloud.list_objects('DBA_IN_THE_SHADOW','<location_uri>');  

OBJECT_NAME
--------------------------------------------------------------------------------
XXXXXXXX.zip
XXXXXXXX.zip
XXXXXXXX.zip
myfile.csv

Perfect, we have uploaded the file to our Bucket.
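For completeness, the reverse operation is DBMS_CLOUD.GET_OBJECT, which downloads an object into a directory. A minimal sketch, reusing the credential and directory from above; <location_uri> is the same placeholder used throughout this article:

```sql
-- Sketch: downloading the object back from the Bucket into OCI_HOME.
-- The target file name myfile_copy.csv is just an illustrative choice.
BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'DBA_IN_THE_SHADOW',
    object_uri      => '<location_uri>/myfile.csv',
    directory_name  => 'OCI_HOME',
    file_name       => 'myfile_copy.csv');
END;
/
```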

As the file is in our Bucket, we can create an external table in order to read the file. The easiest way is to use DBMS_CLOUD.CREATE_EXTERNAL_TABLE, because it generates a table in our schema. This table reads the file directly.

SQL> 
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'EXT_MYFILE_CSV',
    credential_name => 'DBA_IN_THE_SHADOW',
    file_uri_list   => '<location_uri>/myfile.csv',
    format          => JSON_OBJECT(
                         'type' VALUE 'csv',
                         'delimiter' VALUE ',',
                         'quote' VALUE '"',
                         'skipheaders' VALUE '0'
                       ),
    column_list     => '
        id   NUMBER,
        code VARCHAR2(50)
    '
  );
END;
/

SQL> SELECT COUNT(*) FROM EXT_MYFILE_CSV;
10000
SQL> SELECT * FROM EXT_MYFILE_CSV FETCH FIRST 25 ROWS ONLY;
        ID CODE
---------- --------------------------------------------------
       224 3LU82RWW27
       654 LASYRZWUUE
        17 UW34WB3O4Y
       764 5Z2I01ITAX
       132 V4XCFDS9GA
       131 6PRK6G0VTC
       650 NST09S6RY9
       133 R8IIPLJZ95
       477 WK8Y1XC5XK
       930 O574M5RJK2
       371 P8G6PF91A9
       214 WRUYMY4BWY
       273 YD5NG441MV
       706 DQYEBYNF5H
       699 J3T3RHNHU0
       744 VVXATGVW0N
       945 U55X9EATLH
       371 RHW44MZ1DQ
       595 B1RU6VN33B
       864 P02I2A9Z8S
       218 R02XSI9MLY
       980 M0CSWHE4G7
       396 UZYFEL7UKI
       619 DYKCFI912X
       881 2TZ0GEQ6MM

25 rows selected.

We can also export the information to our Bucket using DBMS_CLOUD.EXPORT_DATA.

This procedure has the following parameters:

  • CREDENTIAL_NAME: Name of the credential.

  • FILE_URI_LIST: "URL path (URI)" of the target object.

  • FORMAT: In our case, the format is json.

  • QUERY: A valid SQL query.

SQL> 
BEGIN
      DBMS_CLOUD.EXPORT_DATA(  
        credential_name =>'DBA_IN_THE_SHADOW',
        file_uri_list =>'<location_uri>/config_system_parameter',
        format => json_object(  'type' VALUE 'json' ),
        query => 'SELECT NAME, VALUE FROM V$SYSTEM_PARAMETER'
      );
   END;
  / 

SQL> Select object_name
  From 
   dbms_cloud.list_objects('DBA_IN_THE_SHADOW','<location_uri>');  

OBJECT_NAME
--------------------------------------------------------------------------------
XXXXXXXX.zip
XXXXXXXX.zip
XXXXXXXX.zip
config_system_parameter_1_1_1.json
myfile.csv

And if we need to load the file into a table, we can use DBMS_CLOUD.COPY_DATA.

SQL> 
  Create table BACK_SYSTEM_PARAMETER ( 
    INFO_PARAMETER CLOB  CONSTRAINT INFO_PARAMETER_JSON CHECK ( INFO_PARAMETER IS JSON )
      );

Table created.

SQL>
begin
  DBMS_CLOUD.COPY_DATA (
    table_name      => 'BACK_SYSTEM_PARAMETER',
    credential_name => 'DBA_IN_THE_SHADOW',
    file_uri_list   => '<location_uri>/config_system_parameter_1_1_1.json'
     );
end;
/
PL/SQL procedure successfully completed.

Note that before calling the procedure, we created the BACK_SYSTEM_PARAMETER table with a single JSON-checked column, so COPY_DATA has somewhere to load the file.

Let's query the table in order to see the contents of the file:

SQL>
 select 
    JSON_VALUE(INFO_PARAMETER, '$.NAME') AS NAME, 
    JSON_VALUE(INFO_PARAMETER, '$.VALUE') AS VALUE 
   from BACK_SYSTEM_PARAMETER
 Where INSTR(UPPER(JSON_VALUE(INFO_PARAMETER, '$.NAME')),'OPTIMIZER') != 0
  FETCH FIRST 5 ROWS ONLY;

NAME                                     VALUE
---------------------------------------- ----------------------------------------
optimizer_ignore_hints                   FALSE
optimizer_secure_view_merging            TRUE
optimizer_use_pending_statistics         FALSE
optimizer_capture_sql_plan_baselines     FALSE
optimizer_use_sql_plan_baselines         TRUE

5 rows selected.

In the next article, we'll explore how to use Data Pump with DBMS_CLOUD.

Looking forward to seeing you in the next article :)