Autonomous Database

Never stop Learning – why I love Oracle LiveLabs

After more than one and a half years, this week I was back in an onsite training: live people, live teaching. With a motivated junior DBA class, we started with everything about Oracle architecture, based on our Trivadis training O-AI – Oracle Architecture and Internals. The training is a mix of slides, demos and labs. During the course we therefore run the training environments in Oracle Cloud Infrastructure, built by Terraform (credits to Stefan Oehrli from oradba.ch, who set up the whole stack). After the course, at the end of the month, the environments will be cleaned up. And what’s next?

Training Environments

There are a lot of possibilities to gain deeper knowledge of all this Oracle stuff like processes, data encryption, multitenancy, Data Pump and so on:

But my current favorite is Oracle LiveLabs!

Oracle LiveLabs

This platform is not only for DBAs; it also offers a lot of workshops for Application Developers, Data Scientists and DevOps Engineers. There are different workshop types available:

  • workshops that provide a free Oracle Cloud Infrastructure training environment for a limited time period, like Oracle Database 19c New Features – run on LiveLabs
  • workshops which run in a free tenancy
  • workshops which you can run in your own paid tenancy

At the moment there are 21 workshops where you get a live environment with all the components you need – virtual machines or databases in Oracle Cloud Infrastructure – like Oracle Multitenant Fundamentals, Database 19c – Automatic Indexing, 21c New Features on Autonomous Database and many more. All workshops are very well described, from the access and the initial setup to the workshop itself.

In this case I decided to start the Oracle Multitenant lab to learn more about how PDB Snapshot Copy works.

1st – Search for your training in the available workshops and press Launch

2nd – Define where the Workshop should run

In this case, I want to reserve an environment. This is not possible for all workshops; the workshop details show whether a reserved Oracle Cloud Infrastructure setup is available.

3rd – Define the Start Date and provide your SSH Public Key

With this key, you can access the training servers via SSH. In this case I want to start the workshop immediately. Otherwise, define a start and an end date. If you don’t start now, you will get a confirmation that the workshop is reserved, and on the day the workshop starts, an email with the credential information.

4th – View my Reservation

After some minutes, the status of the workshop is updated. As you can see here, in about three minutes from now the environment should be ready. You will receive a confirmation mail.

5th – Launch Workshop

When the workshop is ready, it can be launched.

6th – Workshop Details

All the information you need is in the details, like:

  • User name
  • Initial password for OCI
  • Compartment
  • Instance public IP

Here you also have the chance to extend your workshop reservation time. Follow the Get Started instructions down to the bottom and push the button to move on to the introduction. Step by step you are guided through the login and setup process. All labs contain a manual on how to connect and how to do the initial setup, like starting listeners or getting scripts from OCI Object Storage.

These are the connection options for interacting with the LiveLab:

  1. Connect using Cloud Shell
  2. Connect using macOS or a Windows Cygwin emulator
  3. Connect using Putty
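For the SSH-based options, the connection is a one-liner. A minimal sketch – the IP address and the key path are placeholders you take from the workshop details:

```shell
# Connect to the lab compute instance as the opc user,
# using the private key that matches the uploaded public key
ssh -i ~/.ssh/id_rsa opc@<instance-public-ip>
```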

Example code for the multitenancy lab preparation:
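The original listing is not reproduced here; as a hedged sketch of what the multitenant lab works toward (the PDB names are assumptions, not the lab’s actual values, and depending on the storage a read-only source or CLONEDB=TRUE may be required):

```shell
# Connect to the CDB root and create a snapshot copy of an existing PDB
sqlplus / as sysdba <<'EOF'
SHOW PDBS
-- The snapshot copy shares unchanged blocks with the source PDB
CREATE PLUGGABLE DATABASE pdb1_snap FROM pdb1 SNAPSHOT COPY;
ALTER PLUGGABLE DATABASE pdb1_snap OPEN;
SHOW PDBS
EOF
```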

Summary

Oracle LiveLabs is another great opportunity to learn and train new stuff. All you have to do now is follow the workshop instructions and keep an eye on the limited time. Enjoy it, learn new stuff and have fun! Oracle LiveLabs is easy to join, easy to set up and well described. This is why I love it 🙂

The Grafana Plugins for Oracle Cloud Infrastructure Monitoring are back!

In September 2019 I wrote a blog post about how to monitor an Oracle Cloud Infrastructure Autonomous Database with the Grafana plugin oci-datasource. But some weeks after publication, the plugin was not available on the Grafana page anymore. And only Oracle and Grafana had a clue why.

Now everything is fine again. Since the 6th of October, there are two new Grafana plugins available for download. Neither of them requires a Grafana Enterprise account.

The first one is the successor of the former oci-datasource plugin; the second allows you to get logs from OCI resources like Compute or Storage. As an infrastructure guy, let’s install the Oracle Cloud Infrastructure Metrics plugin on a local Oracle Enterprise Linux 8 installation!

Install and configure the OCI CLI

Link: https://docs.cloud.oracle.com/en-us/iaas/Content/API/SDKDocs/cliinstall.htm

OS User oci and Installer

As OS user root, create a new user named oci, then switch to the newly created user oci.

Run the installer script.

In this demo case, I use the default settings and tab completion. After some seconds, all packages are installed and the OCI CLI is ready to be configured.
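The installer can be fetched and run straight from GitHub – this is the quickstart command from the OCI CLI documentation; accepting the defaults installs into the oci user’s home directory:

```shell
# Download and run the OCI CLI installer as the oci user
bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

# Reload the profile so oci is in the PATH, then check the version
source ~/.bashrc
oci --version
```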

Configure the OCI CLI

If you already have an API signing key pair from a former OCI action, you can use it here. Otherwise, this setup process creates a new private and public key for you. Take care: the public key has to be in PEM format!

Required values to finish the setup:

  • config location: /home/oci/.oci/config
  • user OCID: OCI > Identity > Users > [YOUR_USER] > OCID
  • tenancy OCID: OCI > Administration > Tenancy Details > [YOUR_TENANCY] > OCID
  • region: choose your region, e.g. eu-zurich-1
  • generate a new API signing RSA key pair: Y – only if you don’t already have an existing key pair
  • key directory: /home/oci/.oci
  • key name: oci_api_key_07102020

Run the setup.
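The interactive setup dialog is started like this; it prompts for exactly the values from the list above:

```shell
# Start the interactive OCI CLI configuration as user oci;
# it asks for config location, user OCID, tenancy OCID, region
# and whether to generate a new API signing key pair
oci setup config
```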

OCI Console API Key

The content of the created public key has to be added in the OCI Console as an API key – just copy and paste it. OCI Console >> Identity >> Users >> User Details >> API Keys >> Add Public Key.

How to: https://docs.cloud.oracle.com/Content/API/Concepts/apisigningkey.htm#How2

OCI CLI Configuration Test

Verify the configuration by executing a CLI command, for example listing images based on Oracle Linux.
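A minimal sketch of such a verification call – the tenancy OCID in the variable is a placeholder:

```shell
# List the available Oracle Linux platform images in the tenancy;
# a JSON result proves that authentication and signing work
oci compute image list \
  --compartment-id "$TENANCY_OCID" \
  --operating-system "Oracle Linux"
```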

OCI Console Group Policy

If your user is not part of the Administrators group, a new group and a group policy are needed which grant the permission to read tenancy metrics. OCI Console >> Identity >> Groups >> Create Group.

Create the policy in the root compartment of your tenancy. OCI Console >> Identity >> Policies >> Create Policy.
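The policy statement itself is a single line; a sketch, assuming the group is called GrafanaAccess (the group name is an assumption):

```
Allow group GrafanaAccess to read metrics in tenancy
```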

Install and configure Grafana and the Oracle Cloud Infrastructure Metrics Data Source Plugin

Grafana

Link: https://docs.cloud.oracle.com/en-us/iaas/Content/API/SDKDocs/grafana.htm

Start and enable the service.

Don’t forget to open the firewall port 3000 for the Grafana UI.
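Taken together, the install, service and firewall steps on Oracle Enterprise Linux 8 can be sketched as follows – the RPM version in the URL is an assumption, check the Grafana download page for the current one:

```shell
# Install Grafana from the official release RPM
dnf install -y https://dl.grafana.com/oss/release/grafana-7.2.1-1.x86_64.rpm

# Start the service now and enable it at boot
systemctl enable --now grafana-server

# Open port 3000 for the Grafana UI
firewall-cmd --permanent --add-port=3000/tcp
firewall-cmd --reload
```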

Oracle Cloud Infrastructure Metrics Data Source Plugin

List the available OCI Grafana plugins.


Install the metric plugin.

Restart Grafana Server.
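Listing and installing the plugin with grafana-cli looks like this; oci-metrics-datasource is the plugin ID of the new metrics data source:

```shell
# Show the OCI plugins available in the Grafana catalog
grafana-cli plugins list-remote | grep -i oci

# Install the metrics data source plugin and restart Grafana
grafana-cli plugins install oci-metrics-datasource
systemctl restart grafana-server
```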

Grafana Data Source Configuration

RSA Key Configuration

Grafana needs the configuration file and the RSA Key from the user oci. One solution: as user root, copy the files and set the ownership to OS user grafana.

Change the path to the key file in /usr/share/grafana/.oci/config.

from:

to:
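A sketch of the whole copy-and-repoint step as user root – the key file name matches the one generated during the OCI CLI setup above:

```shell
# Copy config and API key from the oci user to a directory Grafana can read
mkdir -p /usr/share/grafana/.oci
cp /home/oci/.oci/config /home/oci/.oci/oci_api_key_07102020.pem /usr/share/grafana/.oci/
chown -R grafana:grafana /usr/share/grafana/.oci

# Repoint key_file in the copied config to the new location
sed -i 's|^key_file=.*|key_file=/usr/share/grafana/.oci/oci_api_key_07102020.pem|' \
  /usr/share/grafana/.oci/config
```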

Add a new Data Source

Log into the Grafana server at address <server-ip>:3000. The initial username and password are admin. It’s recommended to change the password at the first login. Add a new data source: Configuration >> Data Sources >> Add data source.

Filter by oracle and select the Oracle Cloud Infrastructure Metrics plugin.

Set Tenancy OCID, select your Default Region and set the Environment to local. Press Save & Test to verify the functionality.

Create a new Dashboard and add a new panel.

Now you can query the data, for example the VPN bandwidth for region eu-zurich-1 in my personal compartment. Feel free to add new panels based on the available metrics.

Example

Summary

Great to have the Oracle Cloud Infrastructure Grafana plugins back. To get an idea of which metrics are available, check the OCI Console >> Monitoring >> Metrics Explorer. The free ADB is not included in the collected metrics. But this is a general issue.

This was a review of the first OCI plugin. Next week I will take a deeper look into the Oracle Cloud Infrastructure Logging Data Source plugin.

Oracle Cloud Infrastructure – Network Troubleshooting with VCN Flow Logs

Do you have a problem with a connection from or to your private/public subnet? There is a new functionality called VCN Flow Logs available. It collects information about network traffic (source/target) in an Oracle Cloud Infrastructure VCN subnet. At the moment (05/03/2020), this functionality is not available in all regions and I did not find any command for it in the OCI CLI, but it will be rolled out. There is no documentation available at docs.cloud.oracle.com yet.

Link to the OCI blog announcement and demo: https://blogs.oracle.com/cloud-infrastructure/announcing-vcn-flow-logs-for-oracle-cloud-infrastructure

Limited Availability

I have registered our company tenancy for the Cloud Native Limited Availability Program to get this brand new feature. Watch here: https://blogs.oracle.com/cloud-infrastructure/announcing-limited-availability-of-oracle-cloud-infrastructure-logging-service

Use Case

A public compute instance with private IP 10.92.10.2 is no longer able to connect to the private database server with IP 10.92.100.2 via SSH/22 – the data center is Switzerland North (Zurich).

Create a new Log Group in your Compartment

Fill in name and description for the Log Group

The Log Group is created – Enable Log

Enable Resource Log

Define the service and resource for VCN Flow Logs and enable logging. For the private subnet investigation I used:

  • Service: Flow Logs
  • Resource: My Private Subnet Name

Flow Log

The Flow Log is created, now we can explore the log. You can also disable logging or indexing, or edit the name.

Log Search

Basically you see all log entries; with Explore with Log Search we can add filters, for example for a source IP address or a log content text like REJECTED.

Modify Filters & Columns

Now we add a filter to find out REJECTED connections. Wildcards are allowed in search terms.

  • Log Field: msg
  • Value: *REJECT* 

Apply.

Now we see the connections with state REJECT.

The solution – Add the IP to the Security List

There was a missing entry in the private subnet security list. After adding the source IP address range to the list, the connection works again. There are no more REJECT message entries in the VCN Flow Logs for this source IP address.

Object Storage

Flow logs are also stored in Object Storage. The bucket is created automatically.

OCI Object Storage Lifecycle Rule

You can remove old log files by a lifecycle rule or by CLI. Take a look at the OCI documentation section Using Object Lifecycle Management to avoid permission errors when you want to create a lifecycle rule: you have to create a service permissions policy for the Object Storage service first before you can create a rule.

Missing permissions error message:

Example Policy Statement to allow actions on object store:
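The required statement grants the Object Storage service itself access to the buckets; a sketch for the Zurich region (the compartment name is a placeholder):

```
Allow service objectstorage-eu-zurich-1 to manage object-family in compartment <your-compartment>
```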

OCI CLI example command to remove old files – for example with date pattern 2020-03-05T07 (7 AM):
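A sketch with the oci os object bulk-delete command – the bucket name is an assumption; --prefix restricts the deletion to objects from that hour, and --force skips the confirmation prompt:

```shell
# Delete all flow log objects written at 7 AM on 2020-03-05
oci os object bulk-delete \
  --bucket-name oci-logs-flowlogs \
  --prefix "2020-03-05T07" \
  --force
```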

OCI Command Line Interface starter page: https://docs.cloud.oracle.com/en-us/iaas/Content/API/Concepts/cliconcepts.htm

What’s next

Try out the new logging feature for other OCI components like Functions, the Event Service and Object Storage. And why not integrate the logs into your existing Splunk environment? There is a Splunk OCI Object Storage plugin available. Take a look here: https://blogs.oracle.com/cloud-infrastructure/announcing-the-object-storage-plugin-for-splunk

MV2ADB – One-Click Move of your Data into OCI Autonomous Databases – Auto Operation

In the previous blog post MV2ADB – One-Click Move of your Data into OCI Autonomous Databases – Step by Step I wrote about the new Oracle Cloud Infrastructure tool to transfer local data into Autonomous Databases. There you can see how to install and configure mv2adb and how to transfer your data to ADB step by step.

The auto operation parameter is the “all in one”: one parameter, and all required steps like export, transfer etc. are done fully automated.

Prerequisites

  • mv2adb rpm package installed, always get the newest version from My Oracle Support (Doc ID 2463574.1)
  • HTTP/SQL*Net Connectivity from the on premises server to the Autonomous Database
  • Autonomous Database Wallet (can be downloaded from the ATP main page)
  • Instant Client with Basic Package, SQL*Plus Package and Data Pump, SQL*Loader and Workload Replay Client – if there is an existing RDBMS installation 18.3 or higher, you can use it
  • Java executable – same as above, you can use the RDBMS installation too
  • Perl Release 5.10 or above
  • Optional: Oracle OCI Command Line Interface – https://docs.cloud.oracle.com/iaas/Content/API/Concepts/cliconcepts.htm installed and configured

Let’s go – we transfer the local HR Schema to ADB fully automated

Example with parameter OCIC=true – visible in the output lines where the OCI bucket upload is done.
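The whole run is a single invocation; a sketch, assuming a configuration file like the one from the step-by-step post (the path and file name are assumptions):

```shell
# One command: advisor, export, upload and import in one automated run
/opt/mv2adb/mv2adb auto -conf /opt/mv2adb/conf/hr2adb.conf
```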

Verification

Summary

The auto function completely eliminates the manual steps for uploading your data into an Autonomous Database. And in case of any error, you have the same logfiles as when you do it step by step. Great!

#ilikeit

MV2ADB – One-Click Move of your Data into OCI Autonomous Databases – Step by Step

There is a new Oracle Cloud Infrastructure tool available called MV2ADB: move data to Autonomous Database in “one-click” (Doc ID 2463574.1). All steps which have to be executed manually to transfer data into an Autonomous Database are now automated:

  • Advisor for local schemas
  • Oracle Data Pump local export
  • Transfer into Oracle Cloud Infrastructure Object Store
  • Create Autonomous Database Credentials to get access to the Object Store
  • Oracle Data Pump import into the Autonomous Database
  • Verify Oracle Data Pump import logfile

The data transfer job can be done fully automated or step by step (autonomous schema advisor, export data, create bucket, upload dump files etc.). In this blog post I describe the manual steps.

How it works

Image from My Oracle Support Note 2463574.1:


Prerequisites

  • mv2adb rpm package installed, always download the newest version from My Oracle Support (Doc ID 2463574.1)
  • HTTP/SQL*Net Connectivity from the on premises server to the Autonomous Database
  • Autonomous Database Wallet (can be downloaded from the ATP main page)
  • Instant Client with Basic Package, SQL*Plus Package and Data Pump, SQL*Loader and Workload Replay Client – if there is an existing RDBMS installation 18.3 or higher, you can use it
  • Java executable – same as above, you can use the RDBMS installation too
  • Perl Release 5.10 or above
  • Optional: Oracle OCI Command Line Interface – https://docs.cloud.oracle.com/iaas/Content/API/Concepts/cliconcepts.htm installed and configured

mv2adb – Options

Configuration File

The mv2adb install process provides an example configuration file – here is my version with OCI CLI enabled. Take care and read the example and the comments. At this point, thanks to Ruggero Citton from Oracle’s Cloud Innovation and Solution Engineering Team for his great support in finding my configuration mistake. If you don’t want to use the configuration file, all parameters can be attached to the mv2adb command.

All passwords have to be encrypted with the mv2adb encpass command in advance.
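The helper prompts for the clear-text password and prints the encrypted string, which then goes into the configuration file:

```shell
# Encrypt a password for use in the mv2adb configuration file
/opt/mv2adb/mv2adb encpass
```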

For the parameter OCI_PASSWORD, you have to create an OCI authentication token first in the console or by CLI, and encrypt the provided password.

In this configuration file, I use the OCI CLI. In this example we transfer the Oracle demo schema HR. Take care with the Expdp/Impdp parameters: if you want to encrypt the Data Pump export files, you need the additional Advanced Security Option (ASO). No license? Just comment the parameters out or leave them blank.


Let’s go – we transfer the local HR Schema to ADB

18/12/2019: At the moment I have some trouble with the automated function which does all steps at once (option auto) – this is under investigation.

0. Advisor

This executes the ADB Schema Advisor. The advisor provides information on whether your data can be transferred into the cloud and which database objects are problematic. If you want to know more, take a look at this My Oracle Support note: Oracle Autonomous Database Schema Advisor (Doc ID 2462677.1). Excerpt from the output with the hint that user-defined tablespaces are not allowed in an ADB environment (if you want to verify it: a manually executed CREATE TABLESPACE command results in ORA-01031: insufficient privileges).

In the background, a temporary user called ADB_ADVISOR is created to analyse the data (script @/opt/mv2adb/utils/install_adb_advisor.sql). The user is dropped automatically after the run.

1. Create an OCI Object Storage Bucket called ocibucket01
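If you want to pre-create the bucket yourself, this can also be done with the OCI CLI – the compartment OCID variable is a placeholder:

```shell
# Create the target bucket for the Data Pump export files
oci os bucket create --name ocibucket01 --compartment-id "$COMPARTMENT_OCID"
```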

2. Execute the local Oracle Data Pump Export

3. Upload the Data Pump Export Files into the OCI Bucket

4. Import the Data

5. Verification

X. Troubleshooting

Logs for all steps are available in the installation sub folder. There you can find all executed commands and detailed error messages.

My ToDo List for next MV2ADB Blog Post

  • Clarification of the license situation: if the export to the cloud has to be encrypted, the Advanced Security Option is required – maybe a special license solution like compression for the OCI backup service is planned.
  • Execution of steps without a parameter file.
  • Transfer data without OCI CLI pre-installed.

Summary

The Oracle Cloud Infrastructure tool MV2ADB makes data moves into the OCI Autonomous Database much easier. I like the concept: a small configuration file, passwords are all encrypted, and the logs are very detailed. The advisor is helpful to identify conflicts in advance.

#ilikeit