User Community for HCL Informix
HCL Informix Backup in the Cloud - Direct Backups to Amazon S3 Using PSM

11/13/2017

Updated: 8/30/18

With the release of HCL Informix Dynamic Server 12.10.xC10 we are adding a capability to our Primary Storage Manager to store and retrieve backups directly in the object stores of selected cloud providers, namely Amazon S3 and IBM SoftLayer Object Storage.


Amazon S3

Amazon S3 is a very powerful offering for storing vast amounts of data in a cost-effective manner. HCL Informix Dynamic Server already has a good integration with this service through our ontape utility, which can be used to send an offsite copy of a backup to S3.

In HCL Informix Dynamic Server 12.10 we have expanded this capability to take direct backups to Amazon S3 using our Primary Storage Manager and On-Bar. This document walks through the configuration step by step.

In order to use S3 you will, of course, need an account with Amazon Web Services. Some of the steps are done in the AWS console, and others can be done directly from the command line on the machine running HCL Informix.


Disclaimer

In this tutorial we are granting our user full, unrestricted access to S3, which in all likelihood is not what a secure, production-level system will need. You will need to design your own permissions and access according to your company’s policies and needs. This is just a simple example of how to get a user ready for use with the Primary Storage Manager.

Groups and Security in AWS

The first step is to log into the AWS console. The objective is to create a user that will be used to transfer data in and out of S3, which first requires creating a group of users and assigning it enough permissions to access S3.

Create a Group to access S3

Enter the Groups tab and then click “Create New Group”.

Name the group and then click “Next Step”.

The next screen will ask you to attach a policy to the group. In this case we will select “AmazonS3FullAccess”. As its name implies, this policy will allow any member of this group to do everything in every bucket in S3.

Again, this is probably not what you want if you use S3 for other purposes or for multiple instances. You can change this by going into the “Policies” tab before creating the group and creating a customized policy that suits your needs.
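If you do go that route, the sketch below shows the general idea: scope access to the single backup bucket (ifmx-s3-dev, the bucket created later in this tutorial). The action list is an educated guess rather than a verified minimum set of permissions for PSM, and the policy name is a placeholder, so test a full backup and restore before adopting it. The same JSON can be pasted into the console policy editor instead of using the AWS CLI.

    # Hypothetical scoped-down policy; the action list is a starting point, not a
    # verified minimum for PSM, so validate it with a test backup and restore.
    aws iam create-policy --policy-name ifmx-s3-minimal --policy-document '{
      "Version": "2012-10-17",
      "Statement": [
        { "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": "arn:aws:s3:::ifmx-s3-dev" },
        { "Effect": "Allow",
          "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
          "Resource": "arn:aws:s3:::ifmx-s3-dev/*" }
      ]
    }'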

The following screen asks you to review your choices; if everything looks correct, click “Create Group”.


Create a User to Access S3

The next step is to create a user that belongs to this group, so click on the “Users” tab.

Then click “Create New Users”, provide a user name for the new user, in this case “ifmx_s3_user”, and click “Create”.
The screen will display two strings: one is the Access Key and the other is the Secret Access Key. Store these strings in a safe location and do not let anybody who is not authorized to access your data have them. These strings are the equivalent of a username and password that can be used to store and retrieve data from S3 programmatically through APIs. Anybody with access to these strings can retrieve or steal your information.

Additionally, you can download these credentials as a text file that you can store in a safe location. The file is NOT encrypted, therefore you must be very careful where you place it.

This is the only place where you will be able to download or copy these credentials. If you do not save them, or if you lose them, you will need to create a new user, as the former one will be unusable.

Then click “Close”.
You will be sent back to the users screen, where you will see the user you just created.


Assign your user to the group

Click the check box next to the user, then click “User Actions” and select “Add User to Groups”.
Add the user to the group we created earlier by clicking its check box, then click “Add to Groups”.
After this we will have a user named “ifmx_s3_user” that belongs to the group “ifx_s3_group”, which has unrestricted access to all S3 resources.
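For reference, the same group and user setup can also be scripted with the AWS CLI instead of the console. This is only a sketch, assuming the AWS CLI is installed and configured with administrative credentials; the names mirror the console steps above.

    # Create the group and attach the AmazonS3FullAccess managed policy
    aws iam create-group --group-name ifx_s3_group
    aws iam attach-group-policy --group-name ifx_s3_group --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

    # Create the user, add it to the group, and generate its Access Key / Secret Access Key
    aws iam create-user --user-name ifmx_s3_user
    aws iam add-user-to-group --user-name ifmx_s3_user --group-name ifx_s3_group
    aws iam create-access-key --user-name ifmx_s3_user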

The next step is to create a place to put the HCL Informix backups. In S3 this is called a “Bucket”, and it is the equivalent of a directory in a regular file system.

The bucket name will be needed later when creating a PSM device, as the value of the “--container” parameter.

The next step is to go to “Services”, then “Amazon S3”, and then click “Create Bucket”.
Give the bucket a name and then select the appropriate region, in this case US Standard. Then click “Create”.
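The bucket can also be created from the command line with the AWS CLI; this is a sketch, assuming your profile already points at the region you want. Bucket names are globally unique, so substitute your own.

    # Create the backup bucket and confirm it exists
    aws s3 mb s3://ifmx-s3-dev
    aws s3 ls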


Create a Primary Storage Manager (PSM) device

At this point the configuration in S3 is done; you just need to configure the Primary Storage Manager to use the bucket and credentials you just created.

In order to use PSM with the S3 bucket you just created, you need to create a PSM device of type CLOUD with S3 as the provider.
Primary Storage Manager (PSM) Device
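Below is a sketch of the device-creation command, assembled from the parameters explained in the list that follows. The -g DBSPOOL option, which places the device in the DBSPOOL pool, is an assumption based on the usual onpsm -D add syntax, so check it against the documentation for your version.

    # Create a CLOUD device named AMAZON1 in the DBSPOOL pool that points at the ifmx-s3-dev bucket
    onpsm -D add AMAZON1 -g DBSPOOL -t CLOUD \
          --provider S3 \
          --url https://ifmx-s3-dev.s3.amazonaws.com \
          --user AKIAIT1111155555X4PA \
          --password A2nB21111155555nvTI0X9ZxGzUJNJivoBQY9MrD \
          --container ifmx-s3-dev \
          --max_part_size 25600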

    
In this command line:
  1. AMAZON1 is the arbitrary name you will give to this device. With FILE type devices this is actually the full path of the directory that will store the data, but for CLOUD type devices it is just a name that helps you organize your devices. This name plus the pool (DBSPOOL in this case) must be unique.
  2. ‘-t CLOUD’ is the device type that tells PSM to store/retrieve the data to/from a cloud infrastructure.
  3. ‘--provider S3’ is the target cloud provider, Amazon S3 in this case. At this moment only S3 and SWIFT (OpenStack Swift) are supported.
  4. ‘--url https://ifmx-s3-dev.s3.amazonaws.com’ is the URL where your backups will go. In the specific case of S3 it just so happens that the bucket name is part of the URL provided.
  5. ‘--user AKIAIT1111155555X4PA’, for S3 this is the Access Key provided to you when the user was created.
  6. ‘--password A2nB21111155555nvTI0X9ZxGzUJNJivoBQY9MrD’, for S3 this is the Secret Access Key provided to you when the user was created.
  7. ‘--container ifmx-s3-dev’ is the Amazon S3 bucket name.
  8. ‘--max_part_size 25600’ will fragment your objects into 25 MB (25,600 KB) pieces; for S3 we suggest a part size between 25 and 100 MB.
Check the device was created
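Listing the PSM devices should now show the new device, for example:

    # The new CLOUD device should appear under the DBSPOOL pool
    onpsm -D list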

    
Take a Level Zero Backup
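With the device defined, a regular On-Bar level-0 backup will be written straight to the S3 bucket, for example:

    # Level-0 (full) backup of all storage spaces through On-Bar, which hands the data to PSM
    onbar -b -L 0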

    

Check Backup Data is in your Bucket

The backup data is organized by INFORMIXSERVER; in this example the bucket contains backup data from six different HCL Informix instances: cloudbkp, gaccurl_lin32, gaccurl_shm, gaccurl_tcp, ol_informix1210_2 and win32curl.
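If you prefer the command line to the S3 console, a quick listing of the bucket with the AWS CLI shows the same thing (a sketch; the exact object layout under each INFORMIXSERVER prefix is managed by PSM):

    # Top-level listing of the backup bucket; expect one prefix per INFORMIXSERVER
    aws s3 ls s3://ifmx-s3-dev/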

Gustavo Castro
Senior Solutions Architect at HCL
Connect with me on LinkedIn

Informix is a trademark of IBM Corporation in at least one jurisdiction and is used under license.
5 Comments
Deen Murad
11/14/2017 01:09:13 pm

Good work in the right direction.
I have a question on encryption. Is this backup encrypted?
Does the encryption apply in transit, or just at rest where the data resides?

Thank you.

Gustavo A. Castro
11/14/2017 01:59:45 pm

Hi Deen,
The example used will send the data using the HTTPS protocol, therefore the data is encrypted on the wire. However, the data will not be encrypted while in the cloud, and anybody with your cloud username and password will be able to retrieve it and use it.

If you want to have the data encrypted at rest at the cloud provider, one option is to use BACKUP_FILTER and RESTORE_FILTER in the Informix configuration file. I will write another article addressing this.

regards

Gustavo.

DEEN MURAD
3/6/2018 01:42:30 pm

The above implies that the logical logs can also be set up to back up to the cloud device. Is that correct?

Would that be recommended, as I suspect it will affect database operation performance?

Thank you
- Deen

Jack Naguib
7/15/2018 06:09:53 pm

Hi Gustavo,

Thank you for the post it was very helpful. I have two concerns that I was hoping you could address:

1- We obviously cannot use AmazonS3FullAccess in production. What is the minimum set of permissions required to make it work? Are ListBucket, GetObject and PutObject enough?

2- Where can I find the configuration for ONPSM? I could not find what I need in the ONCONFIG file. For example, I need to know which S3 URL the cloud device name in the $ onpsm -D list command output refers to, the SSL/TLS settings, etc.

Cheers

Jack Naguib
7/17/2018 06:16:04 am

What about recovery from that cloud backup to a new blank db server?



