Oracle OCI Data Transfer Service – A journey from Kestenholz/Jurasüdfuss/Switzerland to Frankfurt and back

The Oracle Data Transfer service is an offline data transfer method to migrate data to Oracle Cloud Infrastructure. Such a service is useful when your network bandwidth and connection are not sufficient to upload your migration data in a reasonable time. Oracle offers two methods: disk-based data transfer and appliance-based data transfer. The service is not one-way only: data can also be exported in an Oracle Cloud Infrastructure data center and shipped back to your data center.

In line with one of my company Trivadis' cultural values, curiosity, I was wondering how this service works. This is the story of a tiny USB hard disk drive full of data that went on a long journey from Kestenholz / Jurasüdfuss / Solothurn / Switzerland to the Oracle Cloud Infrastructure data center in Frankfurt and back.

Setup

  • The OCI Data Transfer Utility is Linux-based; the USB 3.0 HDD is attached to a VMware virtual machine running Oracle Linux (a quick check follows after this list)
  • The virtual machine has access to the Internet
  • Data is available – for this example I used some open data from the Swiss government (opendata.swiss)
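
A quick way to confirm that the attached USB disk is actually visible inside the VM (the device name is just how it shows up on my setup, /dev/sdb later on):

# lsblk -d -o NAME,SIZE,MODEL,TRAN
# dmesg | grep -i usb | tail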

Data Transfer Service Regions

Currently, the data transfer service is available in Frankfurt, Ashburn, Phoenix, London and Osaka. From Switzerland, Frankfurt is the nearest location.

How does your data get into the Oracle Cloud

  1. Enable the Data Transfer Service – Entitlement
  2. Prepare an Object Storage bucket
  3. Create a transfer job
  4. Attach an HDD to a Linux-based host, use the Data Transfer Utility to create and encrypt the device
  5. Copy data to the HDD
  6. Generate the disk manifest
  7. Lock the disk
  8. Attach the disk to the transfer package
  9. Shipping and shipping information update
  10. Tracking
  11. Data processing
  12. Data verification
  13. Object Storage replication policy (optional)
  14. Finally…

Note: Most of the steps above can also be done with the OCI CLI on the command line and are very well described in the Oracle documentation.

1. Enable Data Transfer Service – Entitlement

Before you can use this service, the Data Transfer service has to be enabled for your tenancy, so you have to request it. The OCI tenant administrator receives a document to sign digitally. It contains, for example, a description of how to bring data to OCI and, if you order an appliance, a note that the appliance has to be returned to Oracle within a maximum of 45 days. A few days later, the service is ready to use. You now also have the permissions to order a Data Transfer Appliance, but for this test I used the disk-based service.

2. Prepare an Object Storage Bucket

In the Frankfurt region, I created a new Object Storage bucket called data_transfer_usb. This is the bucket into which the shipped data will be uploaded.
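
I created the bucket in the OCI console; the same step could also be done with the standard OCI CLI, roughly like this (<compartment_ocid> is a placeholder):

# oci os bucket create --name data_transfer_usb --compartment-id <compartment_ocid>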

3. Create a Transfer Job

In Object Storage – Data Transfer Import, we create a new transfer job. It references the upload bucket from above, and the transfer device type is disk. For further processing, we need the OCID of the job. As you can see, no transfer disk is attached yet.
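
I created the job in the console. The Data Transfer Utility also has a job create command; the flags below follow the pattern of the other dts commands in this post, so verify them with dts job create --help before using it:

# dts job create --compartment-id <compartment_ocid> --bucket data_transfer_usb --display-name data_transfer_usb_job_01 --device-type disk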

4. Attach a HDD to a Linux based host, use the Data Transfer Utility to create and encrypt the Device

Prerequisites for the Data Transfer Utility according to the documentation:

  • An OCI account which has the IAM permissions for Data Transfer
  • A Linux machine with Oracle Linux 6 or greater, Ubuntu 14.04 or greater, SUSE 11 or greater
  • Java 1.8 or 11
  • hdparm 9.0 or later
  • cryptsetup 1.2.0 or later

Package Installation for my Oracle Linux 7 Machine

# yum install java-1.8.0-openjdk-devel
# yum install hdparm
# yum install cryptsetup
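
A quick check that the installed versions meet the prerequisites:

# java -version
# hdparm -V
# cryptsetup --version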

Download and Installation of the Data Transfer Utility

The current link to the file is in the online documentation.

# wget -P /tmp https://docs.cloud.oracle.com/tools/dts-rpm/3.0.2461/dts-3.0.2461.x86_64.rpm
# yum localinstall /tmp/dts-3.0.2461.x86_64.rpm

Test

# dts --version
3.0.2461 ( DEV_NO_HASH )

Configure IAM Credentials for Data Transfer Actions

The configuration follows the Oracle Cloud Infrastructure CLI configuration, with user, fingerprint, key_file, tenancy and region. Example configuration file:

[DEFAULT]
user=ocid1.user.oc1..aaaaaaaadlv5vwm3zfdnyanosmdjm7cyjzdhmgawre12345671234567
fingerprint=6a:ce:8e:a7:4a:a1:70:ad:4e:d7:e6:8f:d1:23:45:g6
key_file=/root/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..aaaaaaaaxuk4je4t3aorovuzmwyeaq5sftqv3nkyz64s12345671234567
region=eu-frankfurt-1
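
In case the API signing key referenced in key_file does not exist yet, it can be generated with OpenSSL. The public key must be uploaded to the OCI user in the console, and the last command prints the fingerprint for the configuration file:

# mkdir -p /root/.oci
# openssl genrsa -out /root/.oci/oci_api_key.pem 2048
# chmod 600 /root/.oci/oci_api_key.pem
# openssl rsa -pubout -in /root/.oci/oci_api_key.pem -out /root/.oci/oci_api_key_public.pem
# openssl rsa -pubout -outform DER -in /root/.oci/oci_api_key.pem | openssl md5 -c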

Verify Credentials

# dts job verify-upload-user-credentials --bucket data_transfer_usb
created object BulkDataTransferTestObject in bucket data_transfer_usb
overwrote object BulkDataTransferTestObject in bucket data_transfer_usb
inspected object BulkDataTransferTestObject in bucket data_transfer_usb
read object BulkDataTransferTestObject in bucket data_transfer_usb

Show Data Transfer Job Details – Status is PREPARING

Here you can see the shipping address of the Oracle Cloud Infrastructure data center in Frankfurt and the label. Both pieces of information are used later in the process.

# dts job show --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekvpmgdnnobcgrdb4awa7c5q
Transfer Job :
ID : ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekvpmgdnnobcgrdb4awa7c5q
CompartmentId : ocid1.compartment.oc1..aaaaaaaat5uo2xh77edws4huwvqorengp7x4xdv6x3giw3vryk36vyydwsdq
UploadBucket : data_transfer_usb
Name : data_transfer_usb_job_01
Label : J5Y0R5KAS
CreationDate : 2020/10/27 12:53:02 CET
Status : PREPARING
freeformTags : *** none ***
definedTags :
Oracle-Tags :
CreatedBy : oracleidentitycloudservice/martin.berger@trivadis.com
CreatedOn : 2020-10-27T11:53:02.320Z
Packages :
[1] :
Label : PDQ0QMLNT
TransferSiteShippingAddress : Telehouse c/o Oracle Data Transfer Service; Christos Panoudis; Phone: +49152 22882673 Job:J5Y0R5KAS Package:PDQ0QMLNT ; Kleyerstraße 75-87; 60326 Frankfurt am Main; Germany
DeliveryVendor : DHL
DeliveryTrackingNumber : *** none ***
ReturnDeliveryTrackingNumber : *** none ***
Status : PREPARING
Devices : [*** none ***]
UnattachedDevices : [*** none ***]
Appliances : [*** none ***]

Prepare USB Hard Disk Drive

The disk is attached as /dev/sdb – it is a Western Digital drive. Important: the disk does not need a partition (an optional way to clear the existing partition table is shown after the hdparm output below).

# fdisk -l /dev/sdb
Disk /dev/sdb: 931.5 GiB, 1000170586112 bytes, 1953458176 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: dos
Disk identifier: 0xe7ec0c10

Device Boot Start End Sectors Size Id Type
/dev/sdb1 2048 1953458175 1953456128 931.5G 7 HPFS/NTFS/exFAT

# hdparm -I /dev/sdb | grep Model
Model Number: WDC WD10JMVW-11S5XS0
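
The dts disk create command in the next step encrypts the whole block device anyway, but if you prefer to clear the old NTFS partition table beforehand, one way is wipefs – note that this irreversibly removes all filesystem and partition-table signatures on the disk:

# wipefs --all /dev/sdb
# partprobe /dev/sdb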

Create Transfer Disk for Data Copy

This command sets up the disk and mounts it immediately. We also need the disk label it reports for further processing.

# dts disk create --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --block-device /dev/sdb

Will prepare /dev/sdb
serial number WD-WX61E82FY633 as an encrypted block device.
=> Set up disk for encryption using cryptsetup.
=> Create file system.
=> Mount file system.

Continue?? [y/n]y

---> IMPORTANT: RECORD THE ENCRYPTION PASSPHRASE: [m, o, 3, 4, +, j, w, I, 3, +, s, M, =] <---


The passphrase cannot be retrieved again. It will be used to encrypt this disk
and normally will not be needed again. However, if the system is restarted before
all files are copied to the filesystem and the disk is then finalized through this
CLI, you will need to provide the passphrase.

Continue?? [y/n]y

The disk label is DAMOED7GH
DO NOT INTERRUPT THIS COMMAND WHILE THE DISK IS BEING PREPARED.
This can take several minutes ...

initialized device for encryption
set up device for encryption
creating ext4 filesystem on /dev/mapper/WD-WX61E82FY633 ...
Found a atari partition table in /dev/mapper/WD-WX61E82FY633
created filesystem

Successfully prepared the disk.
You can now copy files to the file system.
After all files are copied, run this command to create the manifest:

dts disk manifest --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --disk-label DAMOED7GH [--object-name-prefix <object_name_prefix>]

Transfer Disk :
Label : DAMOED7GH
SerialNumber : WD-WX61E82FY633
Status : PREPARING
EncryptionPassphrase : mo34+jwI3+sUaL/NORhXOKQU9k12345671234567

Mount point is /mnt/orcdts_DAMOED7GH.

# df -TH /mnt/orcdts_DAMOED7GH
Filesystem Type Size Used Avail Use% Mounted on
/dev/mapper/WD-WX61E82FY633 ext4 984G 80M 934G 1% /mnt/orcdts_DAMOED7GH

The transfer disk status has changed to PREPARING, and the disk serial number is now registered.

5. Copy Data to HDD

For the test run, I copied some open data files, an Oracle RMAN backup and Oracle Data Pump export files to the disk (see the copy sketch after the listing below).

# ll /mnt/orcdts_DAMOED7GH
total 1540788
-rw-r--r--. 1 root root 13816663 Oct 28 11:07 basel-opendata.tgz
-rw-r--r--. 1 root root 132009 Oct 28 11:07 bernmobil-opendata.tgz
-rw-r--r--. 1 root root 28493853 Oct 28 11:07 bundesverwaltung-opendata.tgz
-rw-r--r--. 1 root root 307998059 Oct 28 11:07 expdphr-28-10-20-01.dmp.tgz
-rw-r--r--. 1 root root 307998059 Oct 28 11:07 expdphr-28-10-20-02.dmp.tgz
drwx------. 2 root root 16384 Oct 28 11:03 lost+found
-rw-r--r--. 1 root root 695625592 Oct 28 11:08 RMAN_CDB118_28102020.tgz
-rw-r--r--. 1 root root 223666017 Oct 28 11:08 sbb-opendata.tgz
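
A minimal sketch of the copy step itself – /data/staging is just a hypothetical source directory on my VM, and the df check confirms that everything fits on the transfer disk:

# rsync -av --progress /data/staging/ /mnt/orcdts_DAMOED7GH/
# df -h /mnt/orcdts_DAMOED7GH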

6. Generate Manifest File

This generates a local file which contains a list of the files and their MD5 checksums, like an inventory. The disk label is required here. A local cross-check of these checksums is shown after the manifest listing below.

# dts disk manifest --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --disk-label DAMOED7GH
Walking the filesystem to gather stats and to check for errors
Reading all checkpoint data. This might take a while depending on checkpoint log size.
2020/10/28 11:09:18 Manifest generation started

Files processed : 2 / 8
Bytes processed : 41.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 103.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 164.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 228.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 284.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 353.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 412.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 476.15 MB / 1.47 GB

Files processed : 2 / 8
Bytes processed : 545.15 MB / 1.47 GB

Files processed : 3 / 8
Bytes processed : 616.88 MB / 1.47 GB

Files processed : 4 / 8
Bytes processed : 881.06 MB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 950.54 MB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1007.54 MB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.04 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.11 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.18 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.25 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.31 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.37 GB / 1.47 GB

Files processed : 6 / 8
Bytes processed : 1.41 GB / 1.47 GB

Files processed : 8 / 8
Bytes processed : 1.47 GB / 1.47 GB

The generated manifest file:

# cat /mnt/orcdts_DAMOED7GH/.bulk_data_transfer/metadata/manifest

{"path":"logFile.log","md5":"PZdMVbAydkbIo3JuJW5HKQ==","lastModifiedTime":1603879758896,"size":26710}
{"path":"bernmobil-opendata.tgz","md5":"J6VvwZemr5ayF+hZmEMvUQ==","lastModifiedTime":1603879648205,"size":132009}
{"path":"expdphr-28-10-20-02.dmp.tgz","md5":"tgtoAyh79EaAHnG2cMhJ4g==","lastModifiedTime":1603879668772,"size":307998059}
{"path":"basel-opendata.tgz","md5":"rRKvfnfw3kjRgp+yyJd9eg==","lastModifiedTime":1603879648180,"size":13816663}
{"path":"sbb-opendata.tgz","md5":"iggsOeCuvpE+zDlwoYmXcA==","lastModifiedTime":1603879699757,"size":223666017}
{"path":"bundesverwaltung-opendata.tgz","md5":"Pbs8W5rNpqs+x48bb/wVQw==","lastModifiedTime":1603879649038,"size":28493853}
{"path":"RMAN_CDB118_28102020.tgz","md5":"6ivSfbVKHceQ6FMy2xVSNA==","lastModifiedTime":1603879692329,"size":695625592}
{"path":"expdphr-28-10-20-01.dmp.tgz","md5":"tgtoAyh79EaAHnG2cMhJ4g==","lastModifiedTime":1603879658104,"size":307998059}

7. Lock the Disk

# dts disk lock --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --disk-label DAMOED7GH --block-device /dev/sdb
Copying upload user credentials.
created object BulkDataTransferTestObject in bucket data_transfer_usb
overwrote object BulkDataTransferTestObject in bucket data_transfer_usb
inspected object BulkDataTransferTestObject in bucket data_transfer_usb
read object BulkDataTransferTestObject in bucket data_transfer_usb
Scanning filesystem to validate manifest. If special files are encountered, they will be listed below.
validated manifest
/dev/sdb DAMOED7GH is encrypted and locked
Locked disk.

8. Attach the Disk to the Transfer Package

The status now changes to ACTIVE.

# dts disk attach --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --disk-label DAMOED7GH --package-label PDQ0QMLNT
Attached disk: DAMOED7GH to package: PDQ0QMLNT

9. Shipping and Shipping Information Update

As shipping company, I used DHL Switzerland; they have a pickup point nearby in Langenthal. At this point it is important to organize the return shipping too and to put the return shipping label in the box. I didn't realize this and forgot to organize the return shipping, so the disk was stranded in the Frankfurt data center. And then the story began: DHL and UPS don't allow private persons to re-import packages from outside Switzerland without a customer number – but private persons don't get such a number. In the end, I was able to organize the return shipping with FedEx. Thanks to Andrew and Christos from Oracle's OCI Data Transfer team for their patience!

Note: Companies like DHL have templates to create pro-forma commercial invoices – https://www.dhl.ch/exp-de/express/zollabwicklung/zollpapiere/proforma_rechnung.html#invoice

The disk was sent to the Oracle Cloud Infrastructure data center in Frankfurt.

Now the shipping information has to be updated with the vendor and the tracking numbers.

# dts package ship --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --package-label PDQ0QMLNT --package-vendor DHL --tracking-number 5542918614 --return-tracking-number 12345678
Update package: PDQ0QMLNT

10. Tracking

DHL needed two days to deliver the disk to Frankfurt. Oracle started the data import one day later.

11. Data Processing

As soon as Oracle starts uploading the data from the attached disk, the transfer job status changes to PROCESSING.

# dts package show --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --package-label PDQ0QMLNT
Transfer Package :
Label : PDQ0QMLNT
TransferSiteShippingAddress : Telehouse c/o Oracle Data Transfer Service; Christos Panoudis; Phone: +49152 22882673 Job:J5Y0R5KAS Package:PDQ0QMLNT ; Kleyerstraße 75-87; 60326 Frankfurt am Main; Germany
DeliveryVendor : DHL
DeliveryTrackingNumber : 5542918614
ReturnDeliveryTrackingNumber : 12345678
Status : PROCESSING
Devices :
[1] :
Label : DAMOED7GH
SerialNumber : WD-WX61E82FY633
UploadStatusLogURL : J5Y0R5KAS/DAMOED7GH/upload_summary.txt
Status : PROCESSING

12. Data Verification

Finally, the data has arrived in Oracle Cloud Infrastructure Object Storage and is ready for use. The file processing is logged in the newly created file upload_summary.txt. An additional client-side check with the OCI CLI follows after the output below.

# dts package show --job-id ocid1.datatransferjob.oc1.eu-frankfurt-1.antheljszffzlryauwznondp2zp6rztpbvzyekv12345671234567 --package-label PDQ0QMLNT
Transfer Package :
Label : PDQ0QMLNT
TransferSiteShippingAddress : Telehouse c/o Oracle Data Transfer Service; Christos Panoudis; Phone: +49152 22882673 Job:J5Y0R5KAS Package:PDQ0QMLNT ; Kleyerstraße 75-87; 60326 Frankfurt am Main; Germany
DeliveryVendor : DHL
DeliveryTrackingNumber : 5542918614
ReturnDeliveryTrackingNumber : 12345678
Status : PROCESSING
Devices :
[1] :
Label : DAMOED7GH
SerialNumber : WD-WX61E82FY633
UploadStatusLogURL : J5Y0R5KAS/DAMOED7GH/upload_summary.txt
Status : PROCESSING


########################################################################################################################################################################################################
######################################### SUMMARY FOR DEVICE [WD-WX61E82FY633] ############################################
Generated at 2020-11-02 18:27:35
TOTAL: 8
### P present: 8
### M missing: 0
### C name collision: 0
### U unreadable: 0
### N nameTooLong: 0
########################################################################################################################################################################################################
| STATUS | NAME | LAST_MODIFIED | SIZE(MB) | MD5 | ETag |
| present | RMAN_CDB118_28102020.tgz | Mon Nov 02 18:27:36 UTC 2020 | 663.40 | 6ivSfbVKHceQ6FMy2xVSNA== | 99baa855-b15c-4f79-bfaa-dbf16385a182 |
| present | logFile.log | Mon Nov 02 18:27:16 UTC 2020 | 0.05 | Aepiez79Y7/WZYweOWA40w== | 894a88ee-c2f2-454b-875a-0bc3339ff6ab |
| present | bernmobil-opendata.tgz | Mon Nov 02 18:27:16 UTC 2020 | 0.13 | J6VvwZemr5ayF+hZmEMvUQ== | 38aa09fa-46f7-4009-8f8d-b710a43fe74f |
| present | expdphr-28-10-20-02.dmp.tgz | Mon Nov 02 18:27:29 UTC 2020 | 293.73 | tgtoAyh79EaAHnG2cMhJ4g== | 78094b66-6366-464a-9c93-3d4a4e1be3ca |
| present | basel-opendata.tgz | Mon Nov 02 18:27:17 UTC 2020 | 13.18 | rRKvfnfw3kjRgp+yyJd9eg== | f70fb2e9-baf2-40b1-a1dc-7522e6005d66 |
| present | bundesverwaltung-opendata.tgz | Mon Nov 02 18:27:19 UTC 2020 | 27.17 | Pbs8W5rNpqs+x48bb/wVQw== | 20f46883-2e13-4655-8c63-ff455db81fba |
| present | sbb-opendata.tgz | Mon Nov 02 18:27:26 UTC 2020 | 213.30 | iggsOeCuvpE+zDlwoYmXcA== | 90b81b13-611d-440a-9915-35c53e46b10b |
| present | expdphr-28-10-20-01.dmp.tgz | Mon Nov 02 18:27:29 UTC 2020 | 293.73 | tgtoAyh79EaAHnG2cMhJ4g== | 258c26c8-6d2f-4fba-864f-b03062de699e |
########################################################################################################################################################################################################
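
As a client-side check, the uploaded objects can be listed and downloaded with the standard OCI CLI:

# oci os object list --bucket-name data_transfer_usb
# oci os object get --bucket-name data_transfer_usb --name basel-opendata.tgz --file /tmp/basel-opendata.tgz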

13. Object Storage Replication Policy (optional)

The files are now in the Frankfurt data center, but I want to have them in the Swiss region Zurich. Therefore, I set a replication policy at the Object Storage level. In Zurich, a new bucket called data_transfer_usb_from_FRA is created, and a few minutes later the files were available in Object Storage in Zurich. Of course, the replication time depends on the file size 😉
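
I set up the replication policy in the OCI console. A sketch of the same step with the OCI CLI – I have not verified the exact flag names, so treat them as assumptions and check oci os replication create-replication-policy --help:

# oci os replication create-replication-policy --bucket-name data_transfer_usb --name fra_to_zrh --destination-region-name eu-zurich-1 --destination-bucket-name data_transfer_usb_from_FRA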

Finally…

Detach the transfer disk so the data center guys can send it back to you.

And after a few days…welcome FedEx in Kestenholz / Jurasüdfuss / Solothurn / Switzerland!

Some words about Shipping and Costs

Shipping costs for DHL and FedEx:

| Vendor | From                     | To         | Costs      |
| DHL    | Langenthal / Switzerland | Frankfurt  | 79.50 CHF  |
| FedEx  | Frankfurt                | Kestenholz | 130.45 CHF |

Summary

Watching nice marketing slides and documents about cool features is not enough. To find out how a service works in the real world, a real test is required. How to migrate data into the data center of any cloud provider should be basic know-how for every consultant working with and on cloud topics. Moving data by disk or appliance opens up a lot of possibilities for data migrations into the cloud. For example, for a huge DWH: transfer the RMAN backup into the cloud, restore it, close the gap with an incremental backup and synchronize it with Oracle GoldenGate. #ilike
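
To make the DWH example a bit more concrete, here is a rough, generic RMAN sketch of the "close the gap" idea – not part of the transfer test above; the tag, format and catalog path are just examples. On the source database, create an incremental backup that covers the changes since the backup that was shipped on disk:

# rman target /
RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE FORMAT '/backup/inc1_%U' TAG 'CLOSE_GAP';

After transferring the level 1 backup pieces to the cloud host (for example via Object Storage), catalog and apply them to the restored database:

# rman target /
RMAN> CATALOG START WITH '/backup/' NOPROMPT;
RMAN> RECOVER DATABASE NOREDO;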