Oracle Cloud Infrastructure

Oracle 11g – let’s move that old stuff to the Oracle Cloud

This blog post describes the lift and shift of an on-prem Oracle 11g Enterprise Edition database to Oracle Cloud Infrastructure using Oracle RMAN paired with OCI Object Storage. It also works for Enterprise Edition versions newer than Oracle 11g (the tablespace encryption method may differ).

Architecture

Q&A

What’s the motivation for the lift and shift of a good old on-prem Oracle 11g database to the Oracle Database Service?

  • Hardware runs out of its lifecycle
  • Decommissioning of an on-prem database which is only used occasionally to compare old data
  • Regulatory requirements, for example keeping the data accessible for 10 years
  • Changing from on-prem licenses to the license-included model
  • Upgrade tests for 19c and conversion to the multitenant architecture
  • Part of the company’s cloud strategy

Why use the OCI database service instead of a cheaper compute instance?

Only the database service allows tablespace encryption without additional costs in the license-included model. From my point of view, this is a must when running Oracle databases outside the on-prem data center in any cloud. And for this case, 11g to 11g, it fits best without any configuration overhead. Finally, the root OS access gives me flexibility.

Are there other methods available than RMAN backup/restore/recovery with the Oracle Database Backup Service?

Sure, according to the Oracle docs:

  • Oracle GoldenGate
  • Data Transfer Service
  • Oracle Data Pump
  • Database Migration Service
  • RMAN Transportable Tablespaces

Why do I like the Oracle Database Backup Cloud Service?

  • The on-prem RMAN backups have to be encrypted: no encryption, no Database Backup Service.
  • The configuration of the Backup Service Module is not really complicated, is well documented and can be used 1:1 on-prem and in the cloud. For 11.2.0.4, I had to install a patch on-prem to encrypt the backups (18339044 – RMAN-06770: backup encryption requires Enterprise Edition).
  • The backup configuration can be done in advance without any pressure; the target database can be recovered at any time. Once the restore is done, this reduces the downtime for the final move. Backing up an Oracle database with RMAN locally and to the cloud can co-exist without any problems.
  • The backup is stored in Object Storage. With a replication policy, I can easily rebuild a new database in another region.
  • The Oracle Database Backup Module allows proxy configurations for backing up to the cloud.

Links and My Oracle Support Notes

Migration Steps

  1. Create OCI Object Storage
  2. Configure on-prem database to use the Oracle Database Backup Service
  3. Create a new OCI Database Virtual Machine
  4. Configure OCI cloud database to use the Oracle Database Backup Service
  5. Clean up OCI cloud database
  6. Restore OCI cloud database from Object Store
  7. Encrypt tablespace

1. OCI Object Storage

An Object Storage bucket called onprem-bucket is created in the region eu-zurich-1:

 

2. Configure on-prem database to use the Oracle Database Backup Service

The installation of the Backup Service is described in the link below, where you can also download the OCI Backup Service Module: www.oracle.com/technetwork/database/availability/oracle-cloud-backup-2162729.html

On-Prem Database – Specifications

  • 11.2.0.4 Enterprise Edition
  • Database name DB11
  • Oracle Linux 7.9
  • About 60GB size
  • Oracle Backup Service to Object Storage enabled and configured
  • Regular RMAN incremental level 0, cumulative incremental level 1 and archivelog backups to the cloud
  • RMAN compression and encryption enabled (the compression license is included when using the Backup Service)
  • Backup encrypted by password
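As a hedged sketch, the RMAN cloud backup configuration and a backup run can look like this – the library and configuration file paths, channel settings and password are assumptions, not the original values:

```shell
# configure RMAN to back up through the Backup Service Module (hypothetical paths)
rman target / <<'EOF'
CONFIGURE CHANNEL DEVICE TYPE sbt PARMS 'SBT_LIBRARY=/home/oracle/oci/lib/libopc.so,ENV=(OPC_PFILE=/home/oracle/oci/config/opcDB11.ora)';
CONFIGURE DEVICE TYPE sbt BACKUP TYPE TO COMPRESSED BACKUPSET;
CONFIGURE ENCRYPTION FOR DATABASE ON;
# password-based backup encryption, as used in this setup (hypothetical password)
SET ENCRYPTION IDENTIFIED BY "MyBackupPwd" ONLY;
BACKUP INCREMENTAL LEVEL 0 DEVICE TYPE sbt DATABASE PLUS ARCHIVELOG;
EOF
```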

On-Prem Database – Tablespaces and Datafiles

On-Prem Database – RMAN Backup to Oracle Cloud Infrastructure

Example RMAN Output where the media is the Oracle Cloud Infrastructure Object Storage in Zurich / Switzerland.

 

3. Create a new OCI Database Virtual Machine

A new 11g Oracle Database virtual machine is created, license included. Unfortunately, 11g is not available with the Logical Volume storage option, therefore we need to use Grid Infrastructure.

Cloud Database – Specifications

  • 11.2.0.4 Enterprise Edition Database Virtual Machine
  • Oracle Linux 7.9
  • Grid Infrastructure (11g is not available for Logical Volumes unfortunately)
  • Database Name is DB11
  • Connected from the on-prem data center either a) by VPN or b) by SSH tunnel via a bastion host

Cloud Database – Controlfile, Tablespaces and Datafiles

Login as OS user opc, sudo to oracle and set environment.

Login as SYSDBA.

Query controlfile information.

Query datafile information. The existing datafiles will be dropped later before the restore starts.

Query tablespace encryption information; the USERS tablespace is already encrypted. Encryption of SYSTEM, SYSAUX etc. was introduced in a later Oracle version and is not available in 11g.

4. Configure OCI Cloud Database to use the Oracle Database Backup Service

Note: When the database is located in a VCN private subnet, you have to configure these two resources to install and use the Oracle Database Backup Service Module:

  • Service Gateway for Object Storage access
  • NAT Gateway used by the Database Backup Service Module Library installation routine

For the installation, the SSH private key from the existing on-prem Database Backup Service configuration is required and has to be transferred to the cloud server. The fingerprint of the public SSH key is required as well.

A list of Object Storage endpoints is available here: Object Storage Service API | Oracle Cloud Infrastructure API Reference and Endpoints

Cloud Database – Oracle Database Backup Service Installation and Configuration

Login to cloud virtual machine and sudo to OS user oracle

Create directory for the Database Backup Module installation, transfer and extract it.

Create directories for wallet and library.

Transfer the SSH key from on-prem to the cloud virtual machine into oci_wallet directory.

Install the Oracle Backup Service Module using the private key and the public key fingerprint from the on-prem installation. Set the environment and start the installer. Use the name of the OCI Object Storage bucket created in step 1.
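As a sketch, the installer call can look like this; all OCIDs, the fingerprint, paths and the endpoint are placeholders for your own values, and the parameter names follow the Backup Module documentation:

```shell
# install the Backup Service Module (all values below are placeholders)
cd /home/oracle/oci_install
java -jar opc_install.jar \
  -host https://objectstorage.eu-zurich-1.oraclecloud.com \
  -pvtKeyFile /home/oracle/oci_wallet/oci_api_key.pem \
  -pubFingerPrint 11:22:33:44:55:66:77:88:99:aa:bb:cc:dd:ee:ff:00 \
  -uOCID ocid1.user.oc1..aaaa_placeholder \
  -tOCID ocid1.tenancy.oc1..aaaa_placeholder \
  -bucket onprem-bucket \
  -walletDir /home/oracle/oci_wallet \
  -libDir /home/oracle/oci/lib
```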

Installer Output.

A new parameter file is created which contains the OCI Object Storage information. This configuration file is used later in RMAN.

5. Clean up OCI Cloud Database and restart NOMOUNT

Login to cloud database virtual machine as OS user grid.

Set environment to database DB11.

Shut down the database instance using the database unique name.

Error message when environment for +ASM is set:

ASM Cleanup with environment +ASM – you have to add ORACLE_HOME manually

Log in to ASM and remove the existing controlfiles, datafiles, tempfiles and directories.
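A hedged asmcmd sketch of the cleanup; the disk group name +DATA is an assumption:

```shell
# set the +ASM environment (add ORACLE_HOME manually if oraenv does not)
. oraenv <<< +ASM
# remove the old database files from ASM (hypothetical disk group +DATA)
asmcmd <<'EOF'
ls +DATA/DB11
rm -rf +DATA/DB11/DATAFILE
rm -rf +DATA/DB11/CONTROLFILE
rm -rf +DATA/DB11/TEMPFILE
EOF
```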

Set environment to DB11.

Start database NOMOUNT.

6. Restore OCI cloud database from Object Store

Restore and recover the database; it has the same name as the on-prem database. The datafiles are migrated from the file system to ASM. Required information for restore and recovery:

  • On-prem database DBID
  • Encryption password

Cloud Database – Log in to RMAN, set the Decryption Password and the Source DBID

Cloud Database – Restore Controlfile from Object Storage

Set the library path and the path to the configuration file (OPC_PFILE) correctly.
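A sketch of the controlfile restore; the password, DBID and paths are placeholders, not the original values:

```shell
rman target / <<'EOF'
SET DECRYPTION IDENTIFIED BY "MyBackupPwd";   # on-prem backup password (placeholder)
SET DBID 1234567890;                          # on-prem DBID (placeholder)
RUN {
  ALLOCATE CHANNEL c1 DEVICE TYPE sbt PARMS 'SBT_LIBRARY=/home/oracle/oci/lib/libopc.so,ENV=(OPC_PFILE=/home/oracle/oci/config/opcDB11.ora)';
  RESTORE CONTROLFILE FROM AUTOBACKUP;
}
EOF
```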

Cloud Database – Mount Instance

Cloud Database – Restore Instance

Allocate channel for maintenance first.

Start restore.

Recover the database – the error in the last line of the incomplete recovery can be ignored.
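A sketch of the restore and recovery run, with the same placeholder channel parameters as in the controlfile restore:

```shell
rman target / <<'EOF'
RUN {
  ALLOCATE CHANNEL c1 DEVICE TYPE sbt PARMS 'SBT_LIBRARY=/home/oracle/oci/lib/libopc.so,ENV=(OPC_PFILE=/home/oracle/oci/config/opcDB11.ora)';
  RESTORE DATABASE;
  # the recovery ends with an error about a missing archived log -
  # expected for an incomplete recovery and can be ignored here
  RECOVER DATABASE;
}
EOF
```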

Cloud Database – Open RESETLOGS

7. Encrypt Tablespace

Login as OS user oracle / SYS as SYSDBA to verify the existing situation.

Cloud Database – Verify Tablespaces

Cloud Database – Tablespace Encryption

We use the existing wallet and add a new TDE master key to the configuration. Show the parameters for tablespace encryption.

Verify the current encryption status; no tablespaces are encrypted yet.

Take user tablespaces offline – a small syntax provider script.

As SYS AS SYSDBA, set new Master Key

Encrypt tablespaces – a small syntax provider script.
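The master key and the two syntax-provider scripts mentioned above can be sketched as follows; the wallet password and the excluded tablespace names are assumptions, and the exact ENCRYPT clause depends on version and patch level:

```shell
sqlplus / as sysdba <<'EOF'
-- add a new TDE master key to the existing wallet (11g syntax, hypothetical password)
ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "WalletPwd123";

-- syntax provider: generate the OFFLINE statements for the user tablespaces
SELECT 'ALTER TABLESPACE ' || tablespace_name || ' OFFLINE;'
FROM   dba_tablespaces
WHERE  tablespace_name NOT IN ('SYSTEM','SYSAUX','UNDOTBS1','TEMP');

-- syntax provider: generate the ENCRYPT statements
-- (the exact ENCRYPT clause is an assumption here, depending on patch level)
SELECT 'ALTER TABLESPACE ' || tablespace_name || ' ENCRYPTION OFFLINE ENCRYPT;'
FROM   dba_tablespaces
WHERE  tablespace_name NOT IN ('SYSTEM','SYSAUX','UNDOTBS1','TEMP');
EOF
```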

Take the encrypted tablespaces online; the encryption starts, and bringing them online takes some time (depending on CPU and I/O).

A few minutes later, the user tablespaces are shown as encrypted.

Verify encrypted tablespaces by DBVerify – as you can see here, Total Pages Encrypted is shown.

Summary and what’s next

Migrating an on-prem database to Oracle Cloud Infrastructure with RMAN and Object Storage is a very nice method to bring not only older databases into the Oracle Cloud. Once there, you can leverage OCI features like Data Safe, monitoring, backup to Object Storage and many more.

#ilikeit

Never stop Learning – why I love Oracle LiveLabs

After over one and a half years, this week I was back in an onsite training – live people, live teaching. With a motivated junior DBA class, we started with all about the Oracle architecture, based on our Trivadis training O-AI – Oracle Architecture and Internals. The training is a mix of slides, demos and labs. During the course, we therefore run the training environments in Oracle Cloud Infrastructure, built with Terraform (credits to Stefan Oehrli from oradba.ch, who ramped up the whole stack). After the course, at the end of the month, the environments will be cleaned up. And what’s next?

Training Environments

There are a lot of possibilities to get deeper knowledge of all this Oracle stuff like processes, data encryption, multitenancy, Data Pump and so on:

But my actual favorite is Oracle LiveLabs!

Oracle LiveLabs

This platform is not only for DBAs, it has a lot of workshops for Application Developers, Data Scientists and DevOps Engineers too. There are different workshop types available:

  • get a free Oracle Cloud Infrastructure training environment for a limited time period, like Oracle Database 19c New Features – run on LiveLabs
  • workshops which run in a free tenancy
  • workshops which you can do in your own paid tenancy

At the moment there are 21 workshops where you get a live environment with all the components you need – virtual machines or databases in Oracle Cloud Infrastructure – like Oracle Multitenant Fundamentals, Database 19c – Automatic Indexing, 21c New Features on Autonomous Database and many more. All workshops are very well described, from the access and the initial setup to the workshop itself.

In this case, I decided to start the Oracle Multitenant lab to learn more about how PDB Snapshot Copy works.

1st – Search for your training in the available workshops and press Launch

2nd – Define where the Workshop should run

In this case, I want to reserve an environment. This is not possible for all workshops; the workshop details show whether a reserved Oracle Cloud Infrastructure setup is possible.

3rd – Define the Start Date and provide your SSH Public Key

With this key, you can access the training servers by SSH. In this case, I want to start the workshop immediately; otherwise, define a start and end date. If you don’t start now, you will get a confirmation that the workshop is reserved, and on the day the workshop starts you receive an email with the credential information.

4th – View my Reservation

After some minutes, the status for the workshop is updated. As you can see here, in about three minutes from now, the environment should be ready. You will receive a confirmation mail.

5th – Launch Workshop

When the workshop is ready, the workshop can be launched.

6th – Workshop Details

All information you need is in the details like:

  • User name
  • Initial password for OCI
  • Compartment
  • Instance public IP

Here you also have the chance to extend your workshop reservation time. Follow the Get Started instructions to the bottom and push the button to move on to the introduction. Step by step, you are guided through the login and setup process. All labs contain a manual on how to connect and do the initial setup, like starting listeners or getting scripts from OCI Object Storage.

These are the connection options for interacting with the LiveLab:

  1. Connect using Cloud Shell
  2. Connect using a Mac or a Windows CYGWIN emulator
  3. Connect using PuTTY

Example code for the multitenancy lab preparation:
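A hedged sketch of what such a snapshot-copy preparation can look like; the PDB and path names are hypothetical, and snapshot copy requires snapshot-capable storage or the CLONEDB setting, as covered in the lab:

```shell
sqlplus / as sysdba <<'EOF'
-- create a snapshot-copy clone of an existing PDB (hypothetical names and paths)
CREATE PLUGGABLE DATABASE pdb_snap FROM pdb1 SNAPSHOT COPY
  FILE_NAME_CONVERT = ('/u01/oradata/CDB1/pdb1/', '/u01/oradata/CDB1/pdb_snap/');
ALTER PLUGGABLE DATABASE pdb_snap OPEN;
EOF
```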

Summary

Oracle LiveLabs is another great opportunity to learn and train new stuff. All you have to do is follow the workshop instructions and keep an eye on the limited time. Enjoy it, learn new stuff and have fun! Oracle LiveLabs are easy to join, easy to set up and well described. This is why I love it 🙂

Oracle Cloud Infrastructure – A short Blog Post about a secure and small Development Setup

For an internal project I had the pleasure to setup a new Oracle Cloud Infrastructure environment for an APEX development team. Here is a short overview about the setup.

Requirements

  • VPN access from everywhere – a maximum of 2 people work on the environment at the same time
  • Oracle Standard Edition 2 – no license available in project
  • Small monitoring to verify server stats
  • Instances can be started and stopped by the developers to save costs, for example overnight, on weekends, during holidays etc.

Architecture Diagram

Resource overview (resource – network – usage – remarks):

  • OpenVPN Access Server – public subnet – VPN client access and traffic routing – OCI Cloud Marketplace image: OpenVPN Access Server (2 FREE VPN Connections) – OpenVPN Inc. – Oracle Cloud Marketplace
  • Management Server – private subnet – OCI-CLI, monitoring – application server and database node start/stop with OCI-CLI, Grafana and Prometheus for monitoring
  • Application Server – private subnet – Tomcat, ORDS, APEX
  • Database System – private subnet – OCI Database Standard Edition 2, backup to Object Store enabled

Network Components

  • Regional private and public subnet
  • Security lists and network security groups
  • Private and public routing table
  • NAT gateway for regional private subnet

Monitoring

Grafana and Prometheus run on the management server. The free shape VM.Standard.E2.1.Micro fits perfectly for this small setup! The Prometheus node exporter runs on the database and the application server. I used this Grafana dashboard: Prometheus Node Exporter Full dashboard for Grafana | Grafana Labs
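A minimal Prometheus scrape configuration for the two node exporters could look like this; the host names and the file location are assumptions (the demo writes to /tmp, normally it would be the Prometheus config directory):

```shell
# write a minimal prometheus.yml scraping the two node exporters on port 9100
cat > /tmp/prometheus.yml <<'EOF'
scrape_configs:
  - job_name: 'node'
    static_configs:
      # hypothetical host names of the application and database server
      - targets: ['appserver:9100', 'dbnode:9100']
EOF
cat /tmp/prometheus.yml
```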

Links

Next Steps

  • Adding Influx DB for persistence
  • Adding the Oracle database to Grafana monitoring
  • Optimizing the shape size for the database server according to usage

Other Ideas

  • Create a blueprint for internal developer environments
  • Automate the setup with Terraform and Ansible

Summary

Setting up this infrastructure in Oracle Cloud Infrastructure was fun, and all developer requirements are fulfilled. I started with the network and OpenVPN configuration – I really like their Marketplace instance – and then moved on to the application and database servers, step by step. There are many other ideas for what we can do based on this setup; the work will not run out. #ilike

Oracle Cloud Infrastructure Data Safe – How to burn down 201.44 Swiss Francs in 30 Seconds…

Is Data Safe really for free?

Last autumn, the new Oracle Cloud Infrastructure feature called Data Safe was released. For sure, new features have to be tested. I tested the Data Safe feature too and added a cloud database to Data Safe. But in my enthusiasm about this cool feature – or maybe it was just too late in the evening – I made a mistake when adding the database target. Four days later, I recognized that Data Safe was being charged to my account. Hmm, but shouldn’t it be free? First reaction: I raised an SR and described the case. The nice guy from My Oracle Support recognized the situation quickly:

Dear Mister Berger, you have used the wrong target type when adding the Oracle Cloud Infrastructure database as a new Data Safe target.

From the Service Request:

  • B91632 – Oracle Cloud Infrastructure – Data Safe for Database Cloud Service – Each (Includes 1 million audit records per target per month) – Free
  • B91631 – Oracle Cloud Infrastructure – Data Safe for Database Cloud Service – Audit Record Collection Over 1 Million Records (over 1 million audit records per target per month) – 0.0800 / 10,000 Audit Records Per Target Per Month
  • B92733 – Oracle Cloud Infrastructure – Data Safe for On-Premises Databases – Target Database Per Month – 200.00 Target Database Per Month + Includes 1 million audit records per target per month (pre-requisite under B91632)

Indeed, indeed. According to My Oracle Support, I had used the wrong target type: instead of Oracle Cloud Database, I used Oracle Database on Compute. I did not realize the mistake and ignored the text below the dropdown box. Shame on me 😉 – here is the small but important difference:

So far so good, the mistake was identified. I deleted the target and added it from scratch with the correct target type. But this didn’t help; the charging went on.

Oracle Cloud Infrastructure Price List

Adding another target type than Oracle Cloud Database is charged on a monthly-fee basis, as described here: Cloud Price List | Oracle

Cost and Usage Report

In the detailed cost and usage report, the target is marked as deleted (suffix DELETED plus deletion date) – and charged.

All you can do is get angry about the mistake and wait. After a month, the money was burned, and no more Oracle Cloud Infrastructure Data Safe costs were charged. As you can see, 201.44 CHF were charged for one month.

I don’t know what currency converter Oracle uses, but currently 200 USD is less than 180 CHF 😉

Lessons learned

Pity about the beautiful money – and for my next test run: RTFM.

Oracle OCI Data Transfer Service – A journey from Kestenholz/Jurasüdfuss/Switzerland to Frankfurt and back

The Oracle Data Transfer service is an offline data transfer method to migrate data to Oracle Cloud Infrastructure. A transfer service is useful when your network bandwidth and connection are not sufficient to upload your migration data in a meaningful time. Oracle offers two methods: disk-based data transfer and appliance-based data transfer. The service is not only one-way; data can also be exported in an Oracle Cloud Infrastructure data center and shipped to your data center.

In line with one of my company Trivadis’ cultural values, curiosity, I was wondering how this service works. This is the story of a tiny USB hard disk drive full of data, which went on a long journey from Kestenholz / Jurasüdfuss / Solothurn / Switzerland to the Oracle Cloud Infrastructure data center in Frankfurt and back.

Setup

  • The OCI Data Transfer utility is Linux based, the USB 3.0 HDD is attached to a VMware Virtual machine where Oracle Linux is running
  • The virtual machine has access to the Internet
  • Data is available – for this example I used some open data from Swiss government (opendata.swiss)

Data Transfer Service Regions

Currently, data transfer is available in Frankfurt, Ashburn, Phoenix, London and Osaka. From Switzerland, Frankfurt is the nearest location.

How your data gets into the Oracle Cloud

  1. Enable Data Transfer Service – Entitlement
  2. Prepare an Object Storage bucket
  3. Create a Transfer Job
  4. Attach a HDD to a Linux based host, use the Data Transfer Utility to create and encrypt the device
  5. Copy data to the HDD
  6. Generate and upload the disk manifest
  7. Lock the disk
  8. Attach the disk to the transfer label and create the package
  9. Shipping and update shipping information
  10. Tracking
  11. Data Processing
  12. Data Verification
  13. Object Storage Replication Policy (optional)
  14. Finally…

Note: Most of the steps above can also be done with the OCI CLI on the command line and are very well described in the Oracle documentation.

1. Enable Data Transfer Service – Entitlement

Before you can use this service, the Data Transfer service has to be enabled in general, so you have to request it. The OCI tenant administrator gets a document which he has to sign digitally. It describes, for example, how to bring data to OCI and, if you order an appliance, the 45-day maximum period within which the appliance has to be returned to Oracle. A few days later, the service is ready to use. You now have the permissions to order a Data Transfer Appliance, but in this test I used the disk service.

2. Prepare an Object Storage Bucket

In the Frankfurt region, I created a new Object Storage bucket called data_transfer_usb. This is the bucket the shipped data will be transferred into.

3. Create a Transfer Job

In Object Storage – Data Transfer Import, we create a new transfer job. It contains the upload bucket from above, and the transfer device type is disk. For further processing, we need the OCID of the job. As you can see, no transfer disk is attached yet.

4. Attach a HDD to a Linux based host, use the Data Transfer Utility to create and encrypt the Device

Prerequisites for the Data Transfer Utility according the documentation:

  • An OCI account which has the IAM permissions for Data Transfer
  • A Linux machine with Oracle Linux 6 or greater, Ubuntu 14.04 or greater, or SUSE 11 or greater
  • Java 1.8 or 11
  • hdparm 9.0 or later
  • cryptsetup 1.2.0 or later

Package Installation for my Oracle Linux 7 Machine

Download and Installation of the Data Transfer Utility

The actual link to the file is in the online documentation.

Test

Configure IAM Credentials for Data Transfer Actions

The configuration is analogous to configuring the Oracle Cloud Infrastructure CLI, with user, fingerprint, key_file, tenancy and region. Example configuration file:
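A sketch of such a configuration file; all OCIDs, the fingerprint and the key path are placeholders, and the demo writes to /tmp instead of the utility’s real configuration path:

```shell
# write a demo credentials file in the OCI CLI ini format (placeholder values)
mkdir -p /tmp/oci_demo
cat > /tmp/oci_demo/config <<'EOF'
[DEFAULT]
user=ocid1.user.oc1..aaaa_placeholder
fingerprint=11:22:33:44:55:66:77:88:99:aa:bb:cc:dd:ee:ff:00
key_file=/home/oracle/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..aaaa_placeholder
region=eu-frankfurt-1
EOF
cat /tmp/oci_demo/config
```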

Verify Credentials

Show Data Transfer Job Details – Status is PREPARED

Here you can see the shipping address of the Oracle Cloud Infrastructure data center Frankfurt and the label. Both pieces of information are used later in the process.

Prepare USB Hard Disk Drive

The disk is attached as /dev/sdb – it is a Western Digital drive. Important: the disk must not contain any partitions.

Create Transfer Disk for Data Copy

This command sets up the disk and mounts it immediately. As additional information, we need the disk label for further processing.

Mount point is /mnt/orcdts_DAMOED7GH.

The Transfer Disk status has changed to PREPARING and the disk serial number is registered now.

5. Copy Data to HDD

For the test run I have copied some Open Data stuff, an Oracle Backup and Oracle Data Pump export files to the disk.

6. Generate Manifest File

This generates a local file which contains a list of the files and their MD5 checksums, like an inventory file. The disk label is required here.
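Conceptually, the manifest resembles what a find/md5sum combination produces. A small self-contained sketch with a demo directory and one file (paths and content are made up for illustration):

```shell
# build a demo directory with one file and create an MD5 inventory of it
mkdir -p /tmp/dts_demo
printf 'hello' > /tmp/dts_demo/sample.txt
find /tmp/dts_demo -type f -exec md5sum {} \; > /tmp/dts_manifest.txt
cat /tmp/dts_manifest.txt
```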

The file:

7. Lock the Disk

8. Attach the Disk to the Transfer Label and create the Package

The status now changes to ACTIVE.

9. Shipping and Shipping Information Update

As shipping company I used DHL Switzerland; they have a pickup point nearby in Langenthal. At this point, it’s important to organize the return shipping too and put the return shipping label in the box. I didn’t realize this and forgot to organize the return shipping, so the disk was stranded in the Frankfurt data center. And then the story began: the delivery companies DHL and UPS don’t allow private persons to re-import packages from outside Switzerland without a customer number. But private persons don’t get such a number. Finally, with FedEx, I was able to organize the return shipping. Thanks to Andrew and Christos from Oracle’s OCI Data Transfer team for their patience!

Note: Companies like DHL have templates to create pro-forma commercial invoices – https://www.dhl.ch/exp-de/express/zollabwicklung/zollpapiere/proforma_rechnung.html#invoice

This disk was sent to the Oracle Cloud Infrastructure data center Frankfurt.

Now the shipping information has to be updated with vendor and the tracking numbers.

10. Tracking

DHL required two days until delivery in Frankfurt. Oracle started one day later with the data import.

11. Data Processing

When Oracle uploads the data and the disk is attached, the transfer job status changes to PROCESSING.

12. Data Verification

Finally, the data has arrived in Oracle Cloud Infrastructure Object Storage and is ready for use. The file processing is logged in the newly created file upload_summary.txt.

13. Object Storage Replication Policy (optional)

The files are now in the Frankfurt data center, but I want to have them in the Swiss region Zurich too. Therefore I set a replication policy at the Object Storage level. In Zurich, a new bucket called data_transfer_usb_from_FRA is created, and a few minutes later the files were available in Object Storage Zurich. Sure, the duration depends on the file size 😉

Finally…

Detach the transfer disk so the data center guys can send it back to you.

And after a few days…welcome FedEx in Kestenholz / Jurasüdfuss / Solothurn / Switzerland!

Some words about Shipping and Costs

Shipping costs from DHL and Fedex:

Vendor – from – to – costs:

  • DHL – Langenthal / Switzerland – Frankfurt – 79.50 CHF
  • FedEx – Frankfurt – Kestenholz – 130.45 CHF

Links

Summary

Watching nice marketing slides and documents about cool features is not enough; to find out how something works in the real world, a real test is required. How to migrate data into the data center of any cloud provider should be basic know-how for every consultant working with and on cloud topics. Moving data on a disk or an appliance opens up a lot of possibilities for data migrations into the cloud. For example, a huge DWH: transfer the RMAN backup into the cloud, restore it, close the gap with an incremental backup and synchronize it with Oracle GoldenGate. #ilike