In this three-part blog post series, I describe how data can be uploaded from an on-premises environment into an Oracle Autonomous Transaction Processing (ATP) database using Oracle Data Pump.
- Part 1: Creation of the ATP database and connection verification
- Part 2: Export of on-premises data with Oracle Data Pump and transfer into the Oracle cloud
- Part 3: Import of data into the ATP database with Oracle Data Pump
Oracle Import Prerequisites
Credentials
To get access to the dump file on the Oracle Object Storage, a credential has to be created in the Oracle Autonomous Transaction Processing database with the DBMS_CLOUD package. For more information about the package, see the blog post from Christian Antognini – DBMS_CLOUD Package – A Reference Guide.
The DBMS_CLOUD.CREATE_CREDENTIAL procedure needs a password value, which is the auth token of your user account. If you don't know your token, create a new one: go to Identity – Users – your username and click "Auth Tokens" on the left side. Create a new token by clicking "Generate Token". The randomly generated string is the value for the DBMS_CLOUD password.
Enter a name for the token and click on “Generate Token”.
Note your generated token and “Close” the window.
Log in to the Autonomous Transaction Processing database as the admin user and create a new credential called ATPCS_CRED.
C:\oracle\product\instantclient_18_3>sqlplus admin/***MY_PASSWORD***@atphrdata_high

SQL*Plus: Release 18.0.0.0.0 - Production on Fri Oct 5 13:43:58 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle.  All rights reserved.

Last Successful login time: Fri Oct 05 2018 10:47:04 +02:00

Connected to:
Oracle Database 18c Enterprise Edition Release 12.2.0.1.0 - 64bit Production

SQL> BEGIN
  2    DBMS_CLOUD.CREATE_CREDENTIAL(
  3      credential_name => 'ATPCS_CRED',
  4      username        => 'martin.berger@trivadis.com',
  5      password        => '***MY_TOKEN***'
  6    );
  7  END;
  8  /

PL/SQL procedure successfully completed.
A new Database User called HRATP
In the ATP database, we create a new user called HRATP. The tablespace DATA is the default tablespace in an Autonomous Transaction Processing database and does not have to be specified.
SQL> CREATE USER hratp IDENTIFIED BY "***MY_PASSWORD***";
SQL> GRANT connect, resource TO hratp;
SQL> ALTER USER hratp QUOTA UNLIMITED ON data;
Oracle Data Pump Import
The impdp tool is part of my Instant Client installation from Part 1 of this blog series. Oracle recommends setting parameters such as
- partition_options=merge
- transform=segment_attributes:n
- transform=dwcs_cvt_iots:y
- transform=constraint_use_default_index:y
- exclude=index, cluster, indextype, materialized_view, materialized_view_log, materialized_zonemap, db_link
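The recommended parameters above can be collected in a Data Pump parameter file and passed to impdp with the parfile option. This is a sketch; the file name and the dump file URL are placeholders you have to adapt to your own tenancy:

```text
# import.par -- sketch of a Data Pump parameter file (names and URL are placeholders)
directory=data_pump_dir
credential=ATPCS_CRED
dumpfile=https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/<identity_domain>/datapump_transfer/expdpHR05102018.dmp
remap_schema=HR:HRATP
remap_tablespace=HRDATA:DATA
partition_options=merge
transform=segment_attributes:n
transform=dwcs_cvt_iots:y
transform=constraint_use_default_index:y
exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link
```

The job would then be started with impdp admin/***MY_PASSWORD***@atphrdata_high parfile=import.par, which avoids quoting issues with the long exclude list on the command line.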
Two new Oracle Data Pump parameters for working with the Oracle cloud databases are credential and dumpfile.
- credential: the credential created with DBMS_CLOUD
- dumpfile: the URL where the dump file is located
Attention
The URL provided by the Object Storage menu cannot be accessed directly by impdp:
ORA-31640: unable to open dump file "https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/<identity_domain>/b/datapump_transfer/o/expdpHR05102018.dmp" for read
ORA-17503: ksfdopn:11 Failed to open file https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/<identity_domain>/b/datapump_transfer/o/expdpHR05102018.dmp
ORA-17500: ODM err:ODM HTTP Unauthorized
The URL has to be mapped from objectstorage to swiftobjectstorage:
https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/<identity_domain>/b/datapump_transfer/o/expdpHR05102018.dmp
to
https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/<identity_domain>/datapump_transfer/expdpHR05102018.dmp
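The mapping follows a fixed pattern: the host prefix objectstorage becomes swiftobjectstorage, /n/ and /b/ and /o/ disappear, and /v1/ is inserted after the host. A small Python sketch of this conversion (a hypothetical helper, not part of any Oracle tool, derived from the two example URLs above):

```python
def to_swift_url(url: str) -> str:
    """Convert an OCI Object Storage native URL to its Swift-style form.

    https://objectstorage.<region>.oraclecloud.com/n/<ns>/b/<bucket>/o/<object>
    becomes
    https://swiftobjectstorage.<region>.oraclecloud.com/v1/<ns>/<bucket>/<object>
    """
    host, rest = url.split(".oraclecloud.com/n/", 1)
    host = host.replace("https://objectstorage.", "https://swiftobjectstorage.")
    namespace, rest = rest.split("/b/", 1)
    bucket, obj = rest.split("/o/", 1)
    return f"{host}.oraclecloud.com/v1/{namespace}/{bucket}/{obj}"

print(to_swift_url(
    "https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/<identity_domain>"
    "/b/datapump_transfer/o/expdpHR05102018.dmp"
))
```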
Data Pump Import Execution with REMAP of Schema and Tablespace
Start of the Oracle Data Pump job from my Windows client:
$ impdp admin/***ADMIN_PASSWORD***@atphrdata_high directory=data_pump_dir credential=ATPCS_CRED dumpfile=https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/<identity_domain>/datapump_transfer/expdpHR05102018.dmp remap_schema=HR:HRATP remap_tablespace=HRDATA:DATA partition_options=merge transform=segment_attributes:n transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y exclude=index, cluster, indextype, materialized_view, materialized_view_log, materialized_zonemap, db_link

Import: Release 18.0.0.0.0 - Production on Fri Oct 5 14:17:17 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Master table "ADMIN"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "ADMIN"."SYS_IMPORT_FULL_01":  admin/********@atphrdata_high directory=data_pump_dir credential=ATPCS_CRED dumpfile=https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/<identity_domain>/datapump_transfer/expdpHR05102018.dmp remap_schema=HR:HRATP remap_tablespace=HRDATA:DATA partition_options=merge transform=segment_attributes:n transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y exclude=index, cluster, indextype, materialized_view, materialized_view_log, materialized_zonemap, db_link
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"HRATP" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "HRATP"."EMPLOYEES"        17.08 KB     107 rows
. . imported "HRATP"."LOCATIONS"        8.437 KB      23 rows
. . imported "HRATP"."JOB_HISTORY"      7.195 KB      10 rows
. . imported "HRATP"."JOBS"             7.109 KB      19 rows
. . imported "HRATP"."DEPARTMENTS"      7.125 KB      27 rows
. . imported "HRATP"."COUNTRIES"        6.367 KB      25 rows
. . imported "HRATP"."REGIONS"          5.546 KB       4 rows
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "ADMIN"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Fri Oct 5 12:19:29 2018 elapsed 0 00:00:27
The ORA-31684 message about the existing user can be ignored, since we created HRATP manually beforehand.
Data Pump Logfile
The logfile of the import process cannot be accessed directly; it has to be copied into the Object Storage with the DBMS_CLOUD.PUT_OBJECT procedure first.
SQL> BEGIN
  2    DBMS_CLOUD.PUT_OBJECT(
  3      credential_name => 'ATPCS_CRED',
  4      object_uri      => 'https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/<identity_domain>/datapump_transfer/import.log',
  5      directory_name  => 'DATA_PUMP_DIR',
  6      file_name       => 'import.log'
  7    );
  8  END;
  9  /
Now the file can be accessed in the Object Storage menu:
Connect as the new User HRATP to verify the Data
C:\oracle\product\instantclient_18_3>sqlplus hratp/***MY_PASSWORD***@atphrdata_high

SQL*Plus: Release 18.0.0.0.0 - Production on Fri Oct 5 14:37:27 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle.  All rights reserved.

Connected to:
Oracle Database 18c Enterprise Edition Release 12.2.0.1.0 - 64bit Production

SQL> select tname from tab;

TNAME
--------------------------------------------------------------------------------
COUNTRIES
DEPARTMENTS
JOB_HISTORY
REGIONS
LOCATIONS
EMPLOYEES
JOBS
EMP_DETAILS_VIEW

8 rows selected.
Summary of Part 3
If all prerequisites are met, the data transfer with Oracle Data Pump is easy to configure and easy to handle. Take care of your token: only two tokens can be generated per user. If you lose one, you have to delete an existing token, generate a new one, and re-create the credential before you can import data again.