Autonomous Database

Oracle Autonomous Transaction Processing – Move your Data with Oracle Data Pump – Part 3

In this three-part blog post series, I describe how data is uploaded from an on-premises environment into the Oracle Autonomous Transaction Processing database using Oracle Data Pump.

Oracle Import Prerequisites

Credentials

To get access to the dump file in Oracle Object Storage, a credential has to be created in the Oracle Autonomous Transaction Processing database with the DBMS_CLOUD package. For more information about the package, see the blog post from Christian Antognini – DBMS_CLOUD Package – A Reference Guide.

The DBMS_CLOUD.CREATE_CREDENTIAL procedure needs a password value, which is the auth token of the user account. If you don’t know your token, create a new one: go to Identity – Users – your username and click on „Auth Tokens“ on the left side. Create a new token by clicking on „Generate Token“. The randomly generated string is the value for the DBMS_CLOUD password.

Enter a name for the token and click on „Generate Token“.

Note your generated token and „Close“ the window.

Log in to the Autonomous Transaction Processing database as the admin user and create a new credential called ATPCS_CRED.
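A minimal sketch of the credential creation; the OCI user name is a hypothetical example, and the password is the auth token generated above, not the console password:

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ATPCS_CRED',
    username        => 'oracle_cloud_user@example.com',  -- your OCI user name (example)
    password        => '<generated auth token>');        -- the auth token, not the login password
END;
/
```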

A new Database User called HRATP

In the ATP database, we create a new user called HRATP. The tablespace DATA is the default tablespace in an Autonomous Transaction Processing database and does not have to be defined.
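A minimal sketch of the user creation; the password and the granted privileges are examples, not the exact statement used here:

```sql
-- DATA is the default tablespace in ATP, so no DEFAULT TABLESPACE clause is needed
CREATE USER hratp IDENTIFIED BY "MySecurePassword#2018";
GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW, CREATE SEQUENCE, CREATE PROCEDURE TO hratp;
ALTER USER hratp QUOTA UNLIMITED ON data;
```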

Oracle Data Pump Import

The impdp tool is part of my Instant Client installation from Part 1 of this blog series. Oracle recommends setting parameters such as

  • partition_options=merge
  • transform=segment_attributes:n
  • transform=dwcs_cvt_iots:y
  • transform=constraint_use_default_index:y
  • exclude=index, cluster, indextype, materialized_view, materialized_view_log, materialized_zonemap, db_link

Two new Oracle Data Pump parameters for working with the Oracle cloud databases are credential and dumpfile.

  • credential: The credential created with DBMS_CLOUD
  • dumpfile: The URL where the dumpfile is located

Attention

The URL provided by the Object Storage menu cannot be accessed directly by impdp. The URL has to be mapped from the objectstorage endpoint to the swiftobjectstorage endpoint.
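As an illustration, the mapping can be done like this. The region, namespace, bucket and object names are hypothetical; note that besides the host name, the path layout also differs between the native and the Swift-style URL format (verify the exact format against your tenancy):

```shell
# Native Object Storage URL as shown in the console (all names are examples)
url="https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/mytenancy/b/atp_bucket/o/exp_hr.dmp"

# Rewrite host objectstorage -> swiftobjectstorage and
# path /n/<namespace>/b/<bucket>/o/<object> -> /v1/<namespace>/<bucket>/<object>
swift_url=$(echo "$url" | sed -E 's#//objectstorage#//swiftobjectstorage#; s#/n/([^/]+)/b/([^/]+)/o/#/v1/\1/\2/#')

echo "$swift_url"
# https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/mytenancy/atp_bucket/exp_hr.dmp
```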

Data Pump Import Execution with REMAP of Schema and Tablespace

Start of the Oracle Data Pump job from my Windows client:
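A sketch of a possible impdp call, assuming the ATPCS_CRED credential from above, a hypothetical Swift-style dump file URL, the high-service TNS alias from the wallet, and a remap of the HR schema and its tablespace (all names are examples, not the exact values from this setup):

```shell
impdp admin@atphrdata_high ^
  credential=ATPCS_CRED ^
  dumpfile=https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/mytenancy/atp_bucket/exp_hr.dmp ^
  partition_options=merge ^
  transform=segment_attributes:n ^
  transform=dwcs_cvt_iots:y ^
  transform=constraint_use_default_index:y ^
  exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link ^
  remap_schema=hr:hratp ^
  remap_tablespace=users:data
```

The `^` is the Windows command-line continuation character; on Linux it would be `\`.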

The message about the existing user can be ignored. 

Data Pump Logfile

The logfile of the import process cannot be accessed directly; it has to be moved into Object Storage with the DBMS_CLOUD package first.
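A sketch with DBMS_CLOUD.PUT_OBJECT, assuming the logfile was written to the default DATA_PUMP_DIR directory; the bucket URL and file names are hypothetical:

```sql
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ATPCS_CRED',
    object_uri      => 'https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/mytenancy/atp_bucket/import_hr.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/
```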

Now the file can be accessed in the Object Storage menu:

Connect as new User HRATP to verify the Data

Summary of Part 3

If all prerequisites are met, the data transfer with Oracle Data Pump is easy to configure and easy to handle. Take care of your tokens: only two tokens per user can be generated. If you lose one, you have to delete an existing token, generate a new one and re-create the credential before you can import data again.

Oracle Autonomous Transaction Processing – Move your Data with Oracle Data Pump – Part 2

In this three-part blog post series, I describe how data is uploaded from an on-premises environment into the Oracle Autonomous Transaction Processing database using Oracle Data Pump.

 

Oracle Data Pump Export on On-Premises Database

Oracle recommends the following settings for the Oracle Data Pump export job. If you are already using an 18c database, you have to set the version parameter to 12.2 to avoid this error during the ATP import process: ORA-39358: Export dump file version 18.0.0 not compatible with target version 12.2.0.

My expdp export job for the schema HR:
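As a sketch, an export call along these lines (the TNS alias and file names are hypothetical; the version parameter avoids the ORA-39358 error mentioned above):

```shell
expdp hr@pdb1 \
  schemas=hr \
  dumpfile=exp_hr.dmp \
  logfile=exp_hr.log \
  directory=data_pump_dir \
  version=12.2
```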

The example schema contains these objects – output from expdp job

Transfer the Dump File Set into the Oracle Cloud

The export dump file has to be transferred into Oracle Cloud Object Storage for later use. First we have to create an Object Storage bucket. Log in, go to the Object Storage menu and click on the “Create Bucket” button.

Enter a Bucket Name and click on „Create Bucket“.

Open the newly created bucket. Click on the bucket name link or the “three bullets” on the right side to view the details.

Upload Data

Upload the export dump file, click on „Upload Object“.

„Browse“ locally for the Oracle Data Pump export file and click on „Upload Object“.

Object Details

Click on object „Details“ to verify the object on the Object Storage. 

 

Note the URL. The URL will be used later for the import process.

Summary of Part 2

Uploading objects into Oracle Cloud Object Storage is very easy. If you don’t want to use the browser functionality, there are other possibilities to upload files, such as the API.

Now the export dump file is ready to import into the Autonomous Transaction Processing Database.

Oracle Autonomous Transaction Processing – Move your Data with Oracle Data Pump – Part 1

In this three-part blog post series, I describe how data is uploaded from an on-premises environment into the Oracle Autonomous Transaction Processing database using Oracle Data Pump.

Basics about ATP from the Using Oracle Autonomous Transaction Processing User Guide:

Autonomous Transaction Processing is designed to support all standard business applications and deliver scalable query performance.
Autonomous Transaction Processing provides all of the performance of the market-leading Oracle Database in an environment that is tuned and optimized for transaction processing workloads.
As a service Autonomous Transaction Processing does not require database administration. With Autonomous Transaction Processing you do not need to configure or manage any hardware, or install any software. Autonomous Transaction Processing handles creating the database, backing up the database, patching and upgrading the database, and growing or shrinking the database.

This is my setup:

  • Oracle Cloud Infrastructure account with its own Compartment
  • On-premises Oracle RDBMS 18c EE with HR example schema installed on Oracle Linux 7.4
  • Windows 10 64bit client

Creation of the Autonomous Transaction Processing Database

Log in to Oracle Cloud Infrastructure, go to the ATP menu and click on the „Create Autonomous Transaction Processing Database“ button.

Set DISPLAY NAME, DATABASE NAME, CPU CORE COUNT, STORAGE and set the Administrator Credentials. These credentials are used later to manage the Autonomous Transaction Processing database. Verify your license situation and click on the „Create Autonomous Transaction Processing Database“ button. 

My ATP database is called ATPHRDATA; it will later contain the data from schema HR.

Some minutes later, the database has the state AVAILABLE and is ready to use. Click on the ATP database name link or the „three bullets“ on the right side to view the details.

Click on „Service Console“.

Enter admin as the username and the password from the creation step. Click on „Sign In“.

This is the ATP main dashboard. Here you can manage the ATP database. Click on the „Administration“ link.

Download the login credentials by clicking on the „Download Client Credentials“ link. ATP uses an SSL-encrypted connection. The provided zip file contains all required files and configurations to connect a client via OCI, ODBC or JDBC with the Autonomous Transaction Processing Database. You have to protect this file with a password to prevent unauthorized database access.

 

The Client Credential Package

Content of the extracted „Client Credentials“ package; it will be used later for the connection configuration and verification. This file can also be used to configure a connection to the database with Oracle SQL Developer 17.4 or later.

Oracle Instant Client

Download the newest Oracle Instant Client by clicking on „Download Oracle Instant Client“. Clients older than 12.2 do not support ATP connections.

In my example, I use the 18c „Instant Client Downloads for Microsoft Windows (x64)“  and the additional package with the SQL*Plus and Oracle Data Pump components. 

Client Setup

To verify the connection, I have installed the Oracle Instant Client from above and configured the Windows environment like this:

  • Oracle Instant Client installation directory: C:\oracle\product\instantclient_18_3
  • TNS_ADMIN, which contains the extracted ATP database credentials / wallet: C:\oracle\network\admin
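On a Windows client, these settings can be made like this, using the paths from my setup above:

```shell
set PATH=C:\oracle\product\instantclient_18_3;%PATH%
set TNS_ADMIN=C:\oracle\network\admin
```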

 

The sqlnet.ora file which is provided with the Oracle ATP wallet has to be modified to point to the real location of the configuration files. If your TNS_ADMIN is not located in a subdirectory of your ORACLE_HOME, change the wallet directory path to your TNS_ADMIN path.
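The wallet entry in sqlnet.ora then changes from the shipped placeholder

```
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="?/network/admin")))
```

to the concrete TNS_ADMIN directory of my setup:

```
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="C:\oracle\network\admin")))
```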

If the path is not set correctly, you will get an ORA-28759: failure to open file error.

Connection Verification by SQL*Plus

SQL*Plus connection test with the user admin – the TNS alias comes from the provided tnsnames.ora.
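For example, assuming the wallet’s tnsnames.ora contains an alias named atphrdata_high (the aliases follow the pattern <dbname>_high/_medium/_low; verify against your own file):

```shell
sqlplus admin@atphrdata_high
```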

Links

docs.oracle.com – https://docs.oracle.com/en/cloud/paas/atp-cloud/index.html
ATP User Guide – https://docs.oracle.com/en/cloud/paas/atp-cloud/atpug/using-oracle-autonomous-transaction-processing.pdf
ATP Introduction – http://www.oracle.com/us/products/database/atp-brief-5029003.pdf

Summary of Part 1

Now the autonomous database is ready to use and the client connection works fine. The next steps are to export the on-premises data and import it into the Autonomous Transaction Processing Database.