Informatica® PowerExchange for Amazon
Redshift
10.2

User Guide for PowerCenter
Informatica PowerExchange for Amazon Redshift User Guide for PowerCenter
10.2
September 2017
© Copyright Informatica LLC 2014, 2019

This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.

Informatica, the Informatica logo, PowerCenter, and PowerExchange are trademarks or registered trademarks of Informatica LLC in the United States and many
jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://www.informatica.com/trademarks.html. Other company
and product names may be trade names or trademarks of their respective owners.

U.S. GOVERNMENT RIGHTS Programs, software, databases, and related documentation and technical data delivered to U.S. Government customers are "commercial
computer software" or "commercial technical data" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such,
the use, duplication, disclosure, modification, and adaptation is subject to the restrictions and license terms set forth in the applicable Government contract, and, to the
extent applicable by the terms of the Government contract, the additional rights set forth in FAR 52.227-19, Commercial Computer Software License.

Portions of this software and/or documentation are subject to copyright held by third parties. Required third party notices are included with the product.

The information in this documentation is subject to change without notice. If you find any problems in this documentation, report them to us at
infa_documentation@informatica.com.

Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.

Publication Date: 2019-04-23
Table of Contents
     Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
     Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
           Informatica Network. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
           Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
           Informatica Product Availability Matrixes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
           Informatica Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
           Informatica Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

     Chapter 1: Introduction to PowerExchange for Amazon Redshift. . . . . . . . . . . . . . . 7
     PowerExchange for Amazon Redshift Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
     PowerCenter Integration Service and Amazon Redshift Integration. . . . . . . . . . . . . . . . . . . . . . . 8
     Introduction to Amazon Redshift. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

     Chapter 2: PowerExchange for Amazon Redshift Configuration. . . . . . . . . . . . . . 10
     PowerExchange for Amazon Redshift Configuration Overview. . . . . . . . . . . . . . . . . . . . . . . . . 10
     Prerequisites. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
     Configuring Custom Property. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
     IAM Authentication. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
           Create Minimal Amazon S3 Bucket Policy. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
     Registering the Plug-in. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

     Chapter 3: Amazon Redshift Sources and Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . 14
     Amazon Redshift Sources and Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
     Import Amazon Redshift Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
     Amazon Redshift Lookup Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

     Chapter 4: Amazon Redshift Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
     Amazon Redshift Mappings Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
     Configuring the Source Qualifier. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
     Amazon Redshift Mapping Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

     Chapter 5: Amazon Redshift Pushdown Optimization. . . . . . . . . . . . . . . . . . . . . . . . 19
     Amazon Redshift Pushdown Optimization Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
     Pushdown Optimization Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
     Configuring Amazon Redshift ODBC Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
           Configuring Amazon Redshift ODBC Connection on Windows. . . . . . . . . . . . . . . . . . . . . . 22
           Configuring Amazon Redshift ODBC Connection on Linux. . . . . . . . . . . . . . . . . . . . . . . . . 26
           Creating an Amazon Redshift ODBC Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
     Rules and Guidelines for Functions in Pushdown Optimization. . . . . . . . . . . . . . . . . . . . . . . . . 29

Chapter 6: Amazon Redshift Sessions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
             Amazon Redshift Sessions Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
             Amazon Redshift Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
                  Amazon Redshift Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
                  Configuring an Amazon Redshift Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
                  Configuring the Source Qualifier. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
             Amazon Redshift Source Sessions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
                  Client-side Encryption . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
                  Identity Columns. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
                  Unload Command. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
                  Partitioning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
                  Amazon Redshift Source Session Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
             Amazon Redshift Target Sessions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
                  Server-side Encryption for Amazon Redshift Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
                  Amazon Redshift Staging Directory. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
                  Vacuum Tables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
                  Retain Staging Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
                  Copy Command. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
                  Amazon Redshift Target Session Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
             Working with Large Tables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
             Octal Values as DELIMITER and QUOTE. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
             Success and Error Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
                  Success Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
                  Error Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

             Appendix A: Amazon Redshift Data Type Reference. . . . . . . . . . . . . . . . . . . . . . . . . . 50
             Data Type Reference Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
             Amazon Redshift and Transformation Data Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

             Appendix B: Troubleshooting. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
             Troubleshooting Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
             Troubleshooting for PowerExchange for Amazon Redshift. . . . . . . . . . . . . . . . . . . . . . . . . . . 52

             Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

Preface
     The Informatica PowerExchange® for Amazon Redshift User Guide for PowerCenter® describes how to read
     data from and write data to Amazon Redshift. The guide is written for database administrators and
     developers who are responsible for moving data from a source to an Amazon Redshift target, or from an
     Amazon Redshift source to another target. This guide assumes that you have knowledge of database engines,
     Amazon Redshift, and PowerCenter.

Informatica Resources

  Informatica Network
     Informatica Network hosts Informatica Global Customer Support, the Informatica Knowledge Base, and other
     product resources. To access Informatica Network, visit https://network.informatica.com.

     As a member, you can:

     •   Access all of your Informatica resources in one place.
     •   Search the Knowledge Base for product resources, including documentation, FAQs, and best practices.
     •   View product availability information.
     •   Review your support cases.
     •   Find your local Informatica User Group Network and collaborate with your peers.

  Informatica Documentation
     To get the latest documentation for your product, browse the Informatica Knowledge Base at
     https://kb.informatica.com/_layouts/ProductDocumentation/Page/ProductDocumentSearch.aspx.

     If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation
     team through email at infa_documentation@informatica.com.

  Informatica Product Availability Matrixes
     Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types
     of data sources and targets that a product release supports. If you are an Informatica Network member, you
     can access PAMs at
     https://network.informatica.com/community/informatica-network/product-availability-matrices.

Informatica Velocity
              Informatica Velocity is a collection of tips and best practices developed by Informatica Professional
              Services. Developed from the real-world experience of hundreds of data management projects, Informatica
              Velocity represents the collective knowledge of our consultants who have worked with organizations from
              around the world to plan, develop, deploy, and maintain successful data management solutions.

              If you are an Informatica Network member, you can access Informatica Velocity resources at
              http://velocity.informatica.com.

              If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional
              Services at ips@informatica.com.

     Informatica Marketplace
              The Informatica Marketplace is a forum where you can find solutions that augment, extend, or enhance your
              Informatica implementations. By leveraging any of the hundreds of solutions from Informatica developers
              and partners, you can improve your productivity and speed up time to implementation on your projects. You
              can access Informatica Marketplace at https://marketplace.informatica.com.

Chapter 1

Introduction to PowerExchange
for Amazon Redshift
     This chapter includes the following topics:

     •   PowerExchange for Amazon Redshift Overview, 7
     •   PowerCenter Integration Service and Amazon Redshift Integration, 8
     •   Introduction to Amazon Redshift, 9

PowerExchange for Amazon Redshift Overview
     You can use PowerExchange for Amazon Redshift to read data from or write data to Amazon Redshift. You
     can also use PowerExchange for Amazon Redshift to read data from Amazon Redshift views.

     You can also read data from or write data to Amazon Redshift clusters that reside in a Virtual Private
     Cloud (VPC).

     Amazon Redshift views contain information about the functioning of the Amazon Redshift system. You can
     run a query on a view just as you run a query on a database table.

     You can use Amazon Redshift objects as sources and targets in mappings. When you use Amazon Redshift
     objects in mappings, you must configure properties specific to Amazon Redshift.

     You can configure an HTTPS proxy to connect to Amazon Redshift. You can also configure an SSL connection
     to connect to Amazon Redshift. The PowerCenter Integration Service uses the Amazon driver to communicate
     with Amazon Redshift.

     Note: PowerExchange for Amazon Redshift does not support real-time processing.

     Example
     You work for an organization that stores purchase order details, such as customer ID, item codes, and item
     quantity, in an on-premises MySQL database. You need to analyze purchase order details and move data from
     the on-premises MySQL database to an affordable cloud-based environment. Create a mapping to read all the
     purchase records from the MySQL database and write them to an Amazon Redshift target for data analysis.

PowerCenter Integration Service and Amazon
Redshift Integration
             The PowerCenter Integration Service uses the Amazon Redshift connection to connect to Amazon Redshift.

             The following image shows how PowerCenter connects to Amazon Redshift to read data:

              When you run the Amazon Redshift session, the PowerCenter Integration Service reads data from Amazon
              Redshift based on the workflow and Amazon Redshift connection configuration. The PowerCenter Integration
              Service connects to and reads data from Amazon Simple Storage Service (Amazon S3) through a TCP/IP
              network, and then stores the data in a staging directory on the PowerCenter machine. Amazon S3 is a
              storage service in which you can copy data from a source and simultaneously move the data to any target.
              The PowerCenter Integration Service issues a copy command that copies the data from Amazon S3 to the
              target.

             The following image shows how PowerCenter connects to Amazon Redshift to write data:

              When you run the Amazon Redshift session, the PowerCenter Integration Service writes data to Amazon
              Redshift based on the workflow and Amazon Redshift connection configuration. The PowerCenter Integration
              Service stores the data in a staging directory on the PowerCenter machine, and then connects to and
              writes the data to Amazon Simple Storage Service (Amazon S3) through a TCP/IP network. Amazon S3 is a
              storage service in which you can copy data from a source and simultaneously move the data to Amazon
              Redshift clusters. The PowerCenter Integration Service issues a copy command that copies the data from
              Amazon S3 to the Amazon Redshift target table.
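The two-stage write path described above can be sketched in Python. This is a conceptual illustration only, not PowerCenter code: the `upload_to_s3` and `issue_copy` callables are hypothetical stand-ins for the S3 transfer and the Redshift copy command that the PowerCenter Integration Service performs internally.

```python
import csv
import os
import tempfile

def write_to_redshift(rows, header, upload_to_s3, issue_copy, staging_dir=None):
    """Sketch of the write path: stage locally, upload to S3, then copy.

    upload_to_s3 and issue_copy are hypothetical callables standing in
    for the S3 transfer and the Redshift COPY command.
    """
    staging_dir = staging_dir or tempfile.mkdtemp(prefix="rs_stage_")
    # Step 1: the Integration Service stages data on the PowerCenter machine.
    staged = os.path.join(staging_dir, "part-0000.csv")
    with open(staged, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    # Step 2: the staged file is written to Amazon S3 over TCP/IP.
    s3_uri = upload_to_s3(staged)
    # Step 3: a copy command loads the data from S3 into the target table.
    return issue_copy(s3_uri)
```

The read path mirrors this flow in reverse: data is unloaded from Amazon Redshift to Amazon S3, staged locally, and then written to the target.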

Introduction to Amazon Redshift
     Amazon Redshift is a cloud-based petabyte-scale data warehouse service that organizations can use to
     analyze and store data.

     Amazon Redshift uses columnar data storage, parallel processing, and data compression to store data and to
     achieve fast query execution. Amazon Redshift uses a cluster-based architecture that consists of a leader
     node and compute nodes. The leader node interacts with client applications, manages the compute nodes, and
     communicates with them. A compute node stores data and runs queries for the leader node. Any client that
     uses a PostgreSQL driver can communicate with Amazon Redshift.

Chapter 2

PowerExchange for Amazon
Redshift Configuration
     This chapter includes the following topics:

     •    PowerExchange for Amazon Redshift Configuration Overview, 10
     •    Prerequisites, 10
     •    Configuring Custom Property, 11
     •    IAM Authentication, 11
     •    Registering the Plug-in, 13

PowerExchange for Amazon Redshift Configuration
Overview
     You can use PowerExchange for Amazon Redshift on Windows or Linux. You must configure PowerExchange
     for Amazon Redshift before you can extract data from or load data to Amazon Redshift.

Prerequisites
     Before you can use PowerExchange for Amazon Redshift, perform the following tasks:

     1.    Install or upgrade to PowerCenter 10.2.
     2.    Verify that you can connect to Amazon Redshift with an SQL client that uses the PostgreSQL driver.
           For example, you can use SQL Workbench/J to connect to Amazon Redshift.
     3.    Verify that you have read, write, and execute permissions on the following directory:
           /server/bin
     4.    Verify that you have read and write permissions on the following directory:
           /Clients/PowerCenterClient/client/bin
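Steps 3 and 4 can be verified with a short script. This is a sketch under stated assumptions: `infa_home` and `client_home` are hypothetical installation roots, so substitute the actual paths of your PowerCenter server and client installations.

```python
import os

def check_prerequisite_permissions(infa_home, client_home):
    """Sketch: verify the directory permissions listed in the steps above.

    infa_home and client_home are hypothetical installation roots.
    """
    required = {
        # Step 3: read, write, and execute on the server bin directory.
        os.path.join(infa_home, "server", "bin"): os.R_OK | os.W_OK | os.X_OK,
        # Step 4: read and write on the client bin directory.
        os.path.join(client_home, "Clients", "PowerCenterClient",
                     "client", "bin"): os.R_OK | os.W_OK,
    }
    # os.access returns True only if the process has every requested mode.
    return {path: os.access(path, mode) for path, mode in required.items()}
```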
     The organization administrator must also perform the following tasks:

     •    Mandatory. Get the Amazon Redshift JDBC URL.

     •    Mandatory. Manage authentication. Use either of the following methods:
          - Create an Access Key ID and Secret Access Key. Applicable when the PowerCenter Client is not
            installed on an Amazon Elastic Compute Cloud (EC2) system. Provide the values for the Access Key ID
            and Secret Access Key when you configure the Amazon Redshift connection.
          - Configure AWS Identity and Access Management (IAM) authentication for enhanced security.
            Applicable when you install the PowerCenter Client on an Amazon EC2 system and you want to run
            sessions on the EC2 system. If you use IAM authentication, do not provide the Access Key ID and
            Secret Access Key explicitly in the Amazon Redshift connection. Instead, create a Redshift Role
            Amazon Resource Name (ARN), add the minimal Amazon S3 bucket policy to the Redshift Role ARN, and
            add the Redshift Role ARN to the Redshift cluster. For more information, see “IAM Authentication”
            on page 11. Provide the Redshift Role ARN in the AWS_IAM_ROLE option of the UNLOAD and COPY
            commands when you create a session.
            If you specify both the Access Key ID and Secret Access Key in the connection properties and
            AWS_IAM_ROLE in the UNLOAD and COPY commands, AWS_IAM_ROLE takes precedence.
            When you use IAM authentication with server-side encryption that uses a customer master key, you
            must add the IAM EC2 role and the IAM Redshift role to the customer master key.
     •   Optional. Create an Amazon Redshift master symmetric key to enable client-side encryption.
     •   Optional. Create an AWS Key Management Service (AWS KMS)-managed customer master key to enable
         server-side encryption.
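The precedence rule described above, where AWS_IAM_ROLE overrides explicit access keys, can be illustrated with a small sketch. The function and dictionary keys below are illustrative only and are not part of the product or of any AWS API.

```python
def resolve_credentials(access_key_id=None, secret_access_key=None,
                        aws_iam_role_arn=None):
    """Sketch of the precedence rule: when both explicit keys and an
    AWS_IAM_ROLE are supplied, the IAM role wins."""
    if aws_iam_role_arn:
        # AWS_IAM_ROLE in the UNLOAD/COPY commands takes precedence.
        return {"method": "iam_role", "role_arn": aws_iam_role_arn}
    if access_key_id and secret_access_key:
        return {"method": "access_keys", "access_key_id": access_key_id}
    raise ValueError("No authentication method configured")
```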

Configuring Custom Property
     You can configure the custom properties for Amazon Redshift targets.

     Set the value of the explicitCopyCommit custom property to true to run copy commands serially. When this
     property is set, the PowerCenter Integration Service waits for the previous copy command to run
     successfully before running another copy command.
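Conceptually, serial execution means each copy command is issued only after the previous one completes, as in this sketch. The `execute` parameter is a hypothetical blocking callable, not a PowerCenter API.

```python
def run_copy_commands_serially(copy_commands, execute):
    """Sketch: issue each copy command only after the previous one
    completes successfully. execute is a hypothetical callable that
    blocks until the command finishes and returns True on success."""
    results = []
    for command in copy_commands:
        ok = execute(command)   # blocks until this copy command finishes
        results.append(ok)
        if not ok:
            break               # stop at the first failed copy command
    return results
```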

IAM Authentication
     Optional. You can configure IAM authentication when the PowerCenter Integration Service runs on an
     Amazon Elastic Compute Cloud (EC2) system. Use IAM authentication for secure and controlled access to
     Amazon Redshift resources when you run a session.

     Use IAM authentication when you want to run a session on an EC2 system. Perform the following steps to
     configure IAM authentication:

     •   Step 1: Create Minimal Amazon S3 Bucket Policy. For more information, see “Create Minimal Amazon S3
         Bucket Policy” on page 12.
     •   Step 2: Create the Amazon EC2 role. The Amazon EC2 role is used when you create an EC2 system in the
         Redshift cluster. For more information about creating the Amazon EC2 role, see the AWS documentation.
     •   Step 3: Create an EC2 instance. Assign the Amazon EC2 role that you created in step 2 to the EC2
         instance.

•   Step 4: Create the Amazon Redshift Role ARN for secure access to Amazon Redshift resources. You can
                 use the Amazon Redshift Role ARN in the UNLOAD and COPY commands. For more information about
                 creating the Amazon Redshift Role ARN, see the AWS documentation.
             •   Step 5: Add the Amazon Redshift Role ARN to the Amazon Redshift cluster to successfully perform the
                 read and write operations. For more information about adding the Amazon Redshift Role ARN to the
                 Amazon Redshift cluster, see the AWS documentation.
             •   Step 6: Install the PowerCenter Integration Service on the EC2 system.

      Create Minimal Amazon S3 Bucket Policy
             The minimal Amazon S3 bucket policy restricts user operations and user access to particular Amazon S3
             buckets by assigning an AWS IAM policy to users. You can configure the AWS IAM policy through the AWS
             console.

              Users need the following minimum Amazon S3 actions to successfully read data from and write data
              to Amazon Redshift resources:

             •   PutObject
             •   GetObject
             •   DeleteObject
             •   ListBucket
             •   GetBucketPolicy
              Sample policy:

              {
                  "Version": "2012-10-17",
                  "Statement": [
                      {
                          "Effect": "Allow",
                          "Action": [
                              "s3:PutObject",
                              "s3:GetObject",
                              "s3:DeleteObject",
                              "s3:ListBucket",
                              "s3:GetBucketPolicy"
                          ],
                          "Resource": [
                              "arn:aws:s3:::<bucket_name>/*",
                              "arn:aws:s3:::<bucket_name>"
                          ]
                      }
                  ]
              }

             You must make sure that the Amazon S3 bucket and Amazon Redshift cluster reside in the same region to
             run a session successfully.

             The supported regions are:

             •   Asia Pacific (Mumbai)
             •   Asia Pacific (Seoul)
             •   Asia Pacific (Singapore)
             •   Asia Pacific (Sydney)
             •   Asia Pacific (Tokyo)
             •   AWS GovCloud
             •   Canada (Central)
              •   China (Beijing)
             •   EU (Ireland)
             •   EU (Frankfurt)
             •   South America (Sao Paulo)

              •   US East (N. Virginia)
              •   US East (Ohio)
              •   US West (N. California)
              •   US West (Oregon)
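The same-region requirement above can be expressed as a simple pre-flight check. The mapping of the region display names listed above to AWS region codes is the author's assumption; verify the codes against the AWS documentation for your account.

```python
# Region codes assumed to correspond to the supported regions listed above.
SUPPORTED_REGIONS = {
    "ap-south-1", "ap-northeast-2", "ap-southeast-1", "ap-southeast-2",
    "ap-northeast-1", "us-gov-west-1", "ca-central-1", "cn-north-1",
    "eu-west-1", "eu-central-1", "sa-east-1",
    "us-east-1", "us-east-2", "us-west-1", "us-west-2",
}

def check_session_regions(s3_bucket_region, redshift_cluster_region):
    """Pre-flight sketch: the Amazon S3 bucket and the Amazon Redshift
    cluster must reside in the same, supported region."""
    if s3_bucket_region != redshift_cluster_region:
        raise ValueError("Amazon S3 bucket and Amazon Redshift cluster "
                         "must reside in the same region")
    if s3_bucket_region not in SUPPORTED_REGIONS:
        raise ValueError("Unsupported region: " + s3_bucket_region)
    return True
```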

Registering the Plug-in
      After you install or upgrade to PowerCenter 10.2, you must register the plug-in with the PowerCenter
      repository.

      A plug-in is an XML file that defines the functionality of PowerExchange for Amazon Redshift. To register the
      plug-in, the repository must be running in exclusive mode. Use the Administrator tool or the pmrep
      RegisterPlugin command to register the plug-in.

      The plug-in file for PowerExchange for Amazon Redshift is AmazonRSCloudAdapterPlugin.xml. When you
      install PowerExchange for Amazon Redshift, the installer copies the AmazonRSCloudAdapterPlugin.xml file
      to the following directory:

      \server\bin\Plugin

      Note: If you do not have the correct privileges to register the plug-in, contact the user who manages the
      PowerCenter Repository Service.

Chapter 3

Amazon Redshift Sources and
Targets
     This chapter includes the following topics:

     •    Amazon Redshift Sources and Targets, 14
     •    Import Amazon Redshift Objects, 14
     •    Amazon Redshift Lookup Transformation, 16

Amazon Redshift Sources and Targets
     Create a mapping with an Amazon Redshift source to read data from Amazon Redshift. Create a mapping
     with any source and an Amazon Redshift target to write data to Amazon Redshift. You cannot use the Create
     Target option from the advanced target session properties on an Amazon Redshift target using the
     PowerCenter Client.

     Note: Dynamic partitioning does not work for PowerExchange for Amazon Redshift.

Import Amazon Redshift Objects
     You can import Amazon Redshift source and target objects before you create a mapping.

     1.    Start PowerCenter Designer, and connect to a PowerCenter repository configured with an Amazon
           Redshift instance.
     2.    Open a source or target folder.
     3.    Select Source Analyzer or Target Designer.

4.   Click Sources or Targets, and then click Import from AmazonRSCloud Adapter.

     The Establish Connection dialog box appears.
5.   Specify the following information and click Connect.

      Connection Property     Description

      Username                User name of the Amazon Redshift account.

      Password                Password for the Amazon Redshift account.

       Schema                  Amazon Redshift schema name.
                               When you import objects from AmazonRSCloudAdapter in the PowerCenter Designer,
                               the table names are listed in alphabetical order.
                               Default is public.

      AWS Access Key ID       Amazon S3 bucket access key ID.

      AWS Secret Access       Amazon S3 bucket secret access key ID.
      Key

      Master Symmetric Key    Optional. Amazon S3 encryption key.
                              Provide a 256-bit AES encryption key in the Base64 format.

      Customer Master Key     Optional. Specify the customer master key ID or alias name generated by AWS Key
      ID                      Management Service (AWS KMS).
                              You must generate the customer master key ID for the same region where the
                              Amazon S3 bucket resides. You can specify either the customer-generated
                              customer master key ID or the default customer master key ID.

                   Cluster Node Type         Node type of the Amazon Redshift cluster.
                                             You can select the following options:
                                             - ds1.xlarge
                                             - ds1.8xlarge
                                             - dc1.large
                                             - dc1.8xlarge
                                             - ds2.xlarge
                                             - ds2.8xlarge
                                             For more information about nodes in the cluster, see the Amazon Redshift
                                             documentation.

                   Number of Nodes in        Number of nodes in the Amazon Redshift cluster.
                   the Cluster               For more information about nodes in the cluster, see the Amazon Redshift
                                             documentation.

                   JDBC URL                  Amazon Redshift connection URL.
                                             If you configure the Amazon Redshift cluster for SSL, you can specify the secure URL.

                   Number of bytes           Not Applicable.
                   needed to support         This property is not supported as you cannot use the Create Target option from the
                   multibytes for varchar    advanced target session properties on an Amazon Redshift target using the
                                             PowerCenter Client.

             6.   Click Next.
              7.   Select the table that you want to import, and then click Finish. To view the table metadata,
                   select the table and click the table name.

Amazon Redshift Lookup Transformation
             You can use the imported Amazon Redshift source in a lookup transformation.

             You cannot configure the Lookup Caching Enabled option. It is selected by default.

             For more information, see "Lookup Transformation" in the PowerCenter Transformation Guide.

16   Chapter 3: Amazon Redshift Sources and Targets
Chapter 4

Amazon Redshift Mappings
     This chapter includes the following topics:

      •   Amazon Redshift Mappings Overview, 17
      •   Configuring the Source Qualifier, 17
      •   Amazon Redshift Mapping Example, 17

Amazon Redshift Mappings Overview
     After you import an Amazon Redshift source or target definition into the PowerCenter repository, you can
     create a mapping to read data from an Amazon Redshift source or write data to an Amazon Redshift target.

      You can read data from a single Amazon Redshift source and write data to multiple Amazon Redshift
      targets.

     For information on the Performance Tuning and Sizing Guidelines, see
     https://kb.informatica.com/h2l/HowTo%20Library/1/1110-
     PerformanceTuningandSizingGuidelinesforPowerExchangeforAmazonRedshiftforPC-H2L.pdf

Configuring the Source Qualifier
      When you import a source to create a mapping for an Amazon Redshift source, you must configure the
      source qualifier.

      1.    In the mapping, click the Source Qualifier.
      2.    Select the Configure tab.
      3.    Specify the Amazon Redshift connection details.
      4.    Save the mapping.

Amazon Redshift Mapping Example
     Your e-commerce organization stores sales order details in an Oracle database. Your organization needs to
     move the data from the Oracle database to an Amazon Redshift target.

                                                                                                                 17
The following procedure shows how to move data from the Oracle database to Amazon Redshift:

             1.   Import the Oracle source.
             2.   Import an Amazon Redshift target.
             3.   Create a mapping with a source and an Amazon Redshift target.
                  The following image shows the example mapping:

             4.   Create a session and configure it to write the data to the Amazon Redshift target.

             The mapping contains the following objects:

             Source Definition

                  The mapping source definition is a relational Oracle database. In the Source Analyzer, import the Oracle
                  source. The PowerCenter Integration Service reads the sales order details from the Oracle source.

                  The following table describes the structure of the source definition called Sales_Order_Details:

                   Field                                                Data Type

                   Order_No                                             Varchar

                   Item_Codes                                           Varchar

                   Item_Quantity                                        Number

                   Price                                                Number (p,s)

             Mapping Target

                  The mapping contains an Amazon Redshift target definition.

                  In the Target Designer, import an Amazon Redshift target definition.

                  The following image shows the Amazon Redshift target definition t_Sales_Order_Details:

18   Chapter 4: Amazon Redshift Mappings
Chapter 5

Amazon Redshift Pushdown
Optimization
     This chapter includes the following topics:

     •   Amazon Redshift Pushdown Optimization Overview, 19
     •   Pushdown Optimization Functions, 19
     •   Configuring Amazon Redshift ODBC Connection, 22
     •   Rules and Guidelines for Functions in Pushdown Optimization, 29

Amazon Redshift Pushdown Optimization Overview
      You can use pushdown optimization to push transformation logic to the source or target database. Use
      pushdown optimization to use database resources and improve mapping performance.

      When you run a mapping configured for pushdown optimization, the PowerCenter Integration Service
      converts the transformation logic to an SQL query and sends the query to the database, which executes it.

      Amazon Redshift supports source-side, target-side, and full pushdown optimization for mappings. You can
      perform insert, update, or delete operations in a pushdown optimization.

      Note: PowerExchange for Amazon Redshift does not support the upsert operation in a pushdown optimization.

     When you configure full pushdown optimization for a mapping, you can include all the commands in one
     transaction and then use a single commit command to increase the performance. Set the value of the
     FullPushdownInOneTransaction custom property to yes to avoid multiple commit commands. For example,
     instead of running separate transactions for create view, insert, update, delete, and drop view commands, you
     can include all the commands in a single transaction and commit the transaction.
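As an illustration, the single-transaction behavior resembles the following hedged SQL sketch. The view, table, and column names are hypothetical examples; the actual SQL that the PowerCenter Integration Service generates depends on the mapping logic.

```sql
-- Hedged sketch: with FullPushdownInOneTransaction set to yes, the pushdown
-- commands run in one transaction with a single commit instead of separate
-- transactions per command. All object names below are hypothetical.
BEGIN;
CREATE VIEW pdo_src_view AS
    SELECT order_no, item_quantity FROM sales_order_details;
INSERT INTO t_sales_order_details (order_no, item_quantity)
    SELECT order_no, item_quantity FROM pdo_src_view;
DROP VIEW pdo_src_view;
COMMIT;
```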

Pushdown Optimization Functions
      The following table summarizes the availability of pushdown functions in an Amazon Redshift database.
      Columns marked with an X indicate that the function can be pushed to the Amazon Redshift database by
      using source-side, target-side, or full pushdown optimization. Columns marked with an S indicate that the
      function can be pushed to the Amazon Redshift database by using source-side pushdown optimization.
      Columns marked with a dash (-) indicate that the function cannot be pushed to the database.

              Function               Pushdown       Function           Pushdown    Function              Pushdown

              ABORT()                -              INITCAP()          X           REG_MATCH()           -

              ABS()                  X              INSTR()            X           REG_REPLACE           -

              ADD_TO_DATE()          X              IS_DATE()          -           REPLACECHR()          -

              AES_DECRYPT()          -              IS_NUMBER()        -           REPLACESTR()          -

              AES_ENCRYPT()          -              IS_SPACES()        -           REVERSE()             -

              ASCII()                -              ISNULL()           S           ROUND(DATE)           -

              AVG()                  S              LAST()             -           ROUND(NUMBER)         X

              CEIL()                 X              LAST_DAY()         X           RPAD()                X

              CHOOSE()               -              LEAST()            -           RTRIM()               X

              CHR()                  X              LENGTH()           X           SET_DATE_PART()       -

              CHRCODE()              -              LN()               X           SIGN()                X

              COMPRESS()             -              LOG()              -           SIN()                 X

              CONCAT()               X              LOOKUP             -           SINH()                -

              COS()                  X              LOWER()            X           SOUNDEX()             -

              COSH()                 -              LPAD()             X           SQRT()                X

              COUNT()                S              LTRIM()            X           STDDEV()              S

              CRC32()                -              MAKE_DATE_TIME()   -           SUBSTR()              X

              CUME()                 -              MAX()              S           SUM()                 S

              DATE_COMPARE()         X              MD5()              -           SYSTIMESTAMP()        S

              DATE_DIFF()            X              MEDIAN()           -           TAN()                 S

              DECODE()               X              METAPHONE()        -           TANH()                -

              DECODE_BASE64()        -              MIN()              S           TO_BIGINT             X

              DECOMPRESS()           -              MOD()              S           TO_CHAR(DATE)         S

              ENCODE_BASE64()        -              MOVINGAVG()        -           TO_CHAR(NUMBER)       X

              EXP()                  X              MOVINGSUM()        -           TO_DATE()             X

              FIRST()                -              NPER()             -           TO_DECIMAL()          X


 FLOOR()               X            PERCENTILE()           -             TO_FLOAT()             X

 FV()                  -            PMT()                  -             TO_INTEGER()           X

 GET_DATE_PART()       X            POWER()                X             TRUNC(DATE)            S

 GREATEST()            -            PV()                   -             TRUNC(NUMBER)          S

 IIF()                 X            RAND()                 -             UPPER()                X

 IN()                  S            RATE()                 -             VARIANCE()             S

 INDEXOF()             -            REG_EXTRACT()          -

The following table lists the pushdown operators that can be used in an Amazon Redshift database. Columns
marked with an X indicate that the operator can be pushed to the Amazon Redshift database by using source-
side, target-side, or full pushdown optimization. Columns marked with an S indicate that the operator can be
pushed to the Amazon Redshift database by using source-side pushdown optimization.

 Operator                                           Pushdown

 +                                                  X

 -                                                  X

 *                                                  S

 /                                                  X

 %                                                  X

 ||                                                 X

 >                                                  S

 =                                                  S

 >=                                                 S
Configuring Amazon Redshift ODBC Connection
             You can set pushdown optimization for the ODBC connection type that uses the Amazon Redshift ODBC
             driver to enhance the mapping performance. To use an ODBC connection to connect to Amazon Redshift,
             you must configure the ODBC connection.

             After you configure the ODBC connection, select the value of the Pushdown Optimization property as Full, To
             Source, or To Target accordingly in the session properties.

             Note: If you want to perform an update operation, you must select the value of the Allow Temporary View for
             Pushdown as Yes in the session properties.

             Amazon Redshift supports the Amazon Redshift ODBC driver on Windows and Linux systems. You must
             install the 32-bit or 64-bit Amazon Redshift ODBC driver based on your system.

      Configuring Amazon Redshift ODBC Connection on Windows
             Before you establish an ODBC connection to connect to Amazon Redshift on Windows, you must configure
             the ODBC connection.

             Perform the following steps to configure an ODBC connection on Windows:

             1.   Download the Amazon Redshift ODBC drivers from the AWS website.
                  You must download the 32-bit or 64-bit driver based on your Windows system.
             2.   Install the Amazon Redshift ODBC drivers on the machine that hosts the PowerCenter Integration
                  Service.
             3.   Open the folder in which the ODBC data source file is installed:
                  •   For 32-bit driver: C:\WINDOWS\syswow64
                  •   For 64-bit driver: C:\WINDOWS\system32
             4.   Run the odbcad32.exe file.
                  The ODBC Data Sources Administrator box appears.
             5.   Click System DSN.

The System DSN tab appears. The following image shows the System DSN tab on the ODBC Data
     Sources Administrator box:

6.   Click Configure.

The Amazon Redshift ODBC Driver DSN Setup box appears. The following image shows the Amazon
                  Redshift ODBC Driver DSN Setup box where you can configure the connection settings and
                  authentication:

             7.   Specify the following connection properties in the Connection Settings section:

                   Property                             Description

                   Data Source Name                     Name of the data source.

                   Server                               Location of the Amazon Redshift server.

                   Port                                 Port number of the Amazon Redshift server.

                   Database                             Name of the Amazon Redshift database.

                  Note: You must specify the Server, Port, and Database values from the JDBC URL.

24   Chapter 5: Amazon Redshift Pushdown Optimization
8.   Specify the following authentication properties in the Authentication section:

       Property                         Description

        Auth Type                        Type of authentication.
                                         Default is Standard.

       User                             User name to access the Amazon Redshift database.

       Password                         Password for the Amazon Redshift database.

       Encrypt Password For             Encrypts the password for the following users:
                                        - Current User Only
                                        - All Users of This Machine
                                        Default is Current User Only.

 9.   Click SSL Options in the Amazon Redshift ODBC Driver DSN Setup box.
      The SSL Configuration box appears. The following image shows the SSL Configuration box:

 10.   In the Authentication Mode field, select disable to disable SSL authentication.

11.   Click OK in the SSL Configuration box.
      The SSL Configuration box closes.
12.   Click Test to test the connection in the Amazon Redshift ODBC Driver DSN Setup box.
13.   Click OK.
      The Amazon Redshift ODBC connection is configured successfully on Windows.
 After you configure the Amazon Redshift ODBC connection, you must create an ODBC connection to connect
 to Amazon Redshift.
  For more information about how to create an ODBC connection to connect to Amazon Redshift, see “Creating
  an Amazon Redshift ODBC Connection” on page 26.

Configuring Amazon Redshift ODBC Connection on Linux
             Before you establish an ODBC connection to connect to Amazon Redshift on Linux, you must configure the
             ODBC connection.

             Perform the following steps to configure an ODBC connection on Linux:

             1.   Download the Amazon Redshift ODBC drivers from the AWS website.
                  You must download the 32-bit or 64-bit driver based on your Linux system.
             2.   Install the Amazon Redshift ODBC drivers on the machine that hosts the PowerCenter Integration
                  Service.
             3.   Configure the odbc.ini file properties in the following format:
                      [ODBC Data Sources]
                      dsn_name=driver_name

                      [dsn_name]
                      Driver=path/driver_file
                      Host=cluster_endpoint
                      Port=port_number
                      Database=database_name
             4.   Specify the following properties in the odbc.ini file:

                   Property                         Description

                   ODBC Data Sources                Name of the data source.

                   Driver                           Location of the Amazon Redshift ODBC driver file.

                   Host                             Location of the Amazon Redshift host.

                   Port                             Port number of the Amazon Redshift server.

                   Database                         Name of the Amazon Redshift database.

                  Note: You must specify the Host, Port, and Database values from the JDBC URL.
             5.   Add the odbc.ini file path to your source file in the following format:
                  ODBCINI=<odbc.ini file path>/odbc.ini
             6.   Restart the PowerCenter Server.
                  The Amazon Redshift ODBC connection on Linux is configured successfully.
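As an illustration, a completed odbc.ini might look like the following sketch. The DSN name, driver path, and cluster endpoint are hypothetical values; replace them with your own, taking the Host, Port, and Database values from the JDBC URL.

```ini
; Hypothetical example values for illustration only.
; Replace the DSN name, driver path, and endpoint with your own.
[ODBC Data Sources]
Redshift_DSN=Amazon Redshift ODBC Driver

[Redshift_DSN]
Driver=/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so
Host=examplecluster.abc123xyz789.us-west-2.redshift.amazonaws.com
Port=5439
Database=dev
```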
             After you configure the Amazon Redshift ODBC connection, you must create an ODBC connection to connect
             to Amazon Redshift.
             For more information about how to create an ODBC connection to connect to Amazon Redshift, see “Creating
             an Amazon Redshift ODBC Connection” on page 26.

      Creating an Amazon Redshift ODBC Connection
             You must create an ODBC connection to connect to Amazon Redshift after you configure the ODBC
             connection.

             Perform the following steps to create an Amazon Redshift ODBC connection on the Connections page:

             1.   In the Workflow Manager, click Connections.

2.   Select Relational from the list.
     The Relational Connection Browser box appears. The following image shows the Relational Connection
     Browser box:

3.   Select Type as ODBC.
4.   Click New.

The Connection Object Definition box appears. The following image shows the Connection Object
                  Definition box:

             5.   Configure the following relational connection properties:

                   Relational Connection    Description
                   Property

                   Name                     Enter a name for the connection.

                   Type                     The connection type is set by default. You cannot edit this value.

                   User Name                Enter the user name to connect to the Amazon Redshift database.

                   Password                 Enter the password to connect to the Amazon Redshift database.

                   Connect String           Enter the name of the ODBC data source that you created for the Amazon Redshift
                                            database.

                   Code Page                Select the code page that the PowerCenter Integration Service must use to read or
                                            write data.

                   Attributes               Enter the ODBC Subtype attribute value as AWS Redshift.

                  The Amazon Redshift ODBC connection is created successfully.

Rules and Guidelines for Functions in Pushdown
Optimization
     Use the following rules and guidelines when pushing functions to an Amazon Redshift database:

     •   To push TRUNC(DATE) to Amazon Redshift, you must define the date and format arguments. Otherwise,
         the PowerCenter Integration Service does not push the function to Amazon Redshift.
      •   The aggregate functions for Amazon Redshift accept only one argument, the field on which the function
          operates. The PowerCenter Integration Service does not honor the filter condition argument. In addition,
          make sure that all fields mapped to the target are listed in the GROUP BY clause.
     •   For Amazon Redshift, when you define only a string argument for TO_DATE() and TO_CHAR(), the
         PowerCenter Integration Service considers the default date format present in the session property. The
         default date format in the session property is: MM/DD/YYYY HH24:MI:SS.US
     •   Do not specify a format for SYSTIMESTAMP() to push the SYSTIMESTAMP to Amazon Redshift. The
         Amazon Redshift database returns the complete time stamp.
     •   To push INSTR() to Amazon Redshift, you must only define string, search_value, and start arguments.
         Amazon Redshift does not support occurrence and comparison_type arguments.
     •   The flag argument is ignored when you push TO_BIGINT and TO_INTEGER to Amazon Redshift.
     •   The CaseFlag argument is ignored when you push IN() to Amazon Redshift.
     •   If you use the NS format as part of the ADD_TO_DATE() function, the PowerCenter Integration Service
         does not push the function to Amazon Redshift.
     •   If you use any of the following formats as part of the TO_CHAR() and TO_DATE() functions, the
         PowerCenter Integration Service does not push the function to Amazon Redshift:
         - NS

         - SSSS

         - SSSSS

         - RR
     •   To push TRUNC(DATE), GET_DATE_PART(), and DATE_DIFF() to Amazon Redshift, you must use the
         following formats:
         - D

         - HH24

         - MI

         - MM

         - MS

         - SS

         - US

         - YYYY
     •   To push GET_DATE_PART() to Amazon Redshift, you must use the following formats:
         - D

         - DDD

         - HH24

         - MI

         - MM

- MS

                - SS

                - US

                - YYYY
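As an illustration of the format rules above, a TRUNC(DATE) expression that defines both the date and format arguments, such as TRUNC with the MM format, can be pushed down. On Amazon Redshift, such an expression roughly corresponds to a DATE_TRUNC call; the column and table names below are hypothetical, and the exact SQL that the PowerCenter Integration Service generates depends on the mapping.

```sql
-- Hedged illustration: TRUNC(ORDER_DATE, 'MM') in the mapping expression
-- roughly corresponds to the following Amazon Redshift SQL.
-- Column and table names are hypothetical examples.
SELECT DATE_TRUNC('month', order_date)
FROM sales_order_details;
```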

30   Chapter 5: Amazon Redshift Pushdown Optimization
Chapter 6

Amazon Redshift Sessions
     This chapter includes the following topics:

     •   Amazon Redshift Sessions Overview, 31
     •   Amazon Redshift Connections, 31
     •   Amazon Redshift Source Sessions, 33
     •   Amazon Redshift Target Sessions, 39
     •   Working with Large Tables, 47
     •   Octal Values as DELIMITER and QUOTE, 47
     •   Success and Error Files, 47

Amazon Redshift Sessions Overview
     You must configure an Amazon Redshift connection in the Workflow Manager to read data from or write data
     to an Amazon Redshift table.

     The PowerCenter Integration Service writes the data to a staging directory and then to an Amazon S3 bucket
     before it writes the data to Amazon Redshift. You must specify the location of the staging directory in the
     session properties. You must also specify an Amazon S3 bucket name in the session properties. You must
     have write access to the Amazon S3 bucket.
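Conceptually, the staged load resembles the following hedged sketch of a COPY command that loads the staged Amazon S3 files into the Amazon Redshift table. The table name, bucket path, and IAM role are hypothetical values; the actual command that the PowerCenter Integration Service issues depends on the session properties.

```sql
-- Hedged sketch of the staged load: the data lands in the Amazon S3 bucket
-- first, then a COPY command loads it into the Amazon Redshift table.
-- The table, bucket path, and IAM role values are hypothetical examples.
COPY t_sales_order_details
FROM 's3://example-staging-bucket/session_staging/'
IAM_ROLE 'arn:aws:iam::123123456789:role/redshift_read'
DELIMITER '\036';
```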

Amazon Redshift Connections
     Use an Amazon Redshift connection to connect to the Amazon Redshift database. The PowerCenter
     Integration Service uses the connection when you run an Amazon Redshift session.

Amazon Redshift Connection Properties
             When you configure an Amazon Redshift connection, you define the connection attributes that the
             PowerCenter Integration Service uses to connect to the Amazon Redshift database.

             The following table describes the application connection properties:

              Property               Description

              Name                   Name of the Amazon Redshift connection.

              Type                   The AmazonRSCloudAdapter connection type.

              User Name              User name to access the Amazon Redshift database.

              Password               Password for the Amazon Redshift database user name.

             The following table describes the Amazon Redshift connection attributes:

              Property                 Description

              Schema                   Schema name for the Amazon Redshift tables.
                                       When you import objects from AmazonRSCloudAdapter in the PowerCenter Designer, the
                                       table names are listed in alphabetical order.
                                       Default is public.
                                       Note: The public schema might not work for all the Amazon Redshift tables.

              AWS Access Key ID        Amazon S3 bucket access key ID.

              AWS Secret Access        Amazon S3 bucket secret access key.
              Key

              Master Symmetric Key     Optional. Amazon S3 encryption key.
                                       Provide a 256-bit AES encryption key in the Base64 format.

              Customer Master Key      Optional. Specify the customer master key ID or alias name generated by AWS Key
              ID                       Management Service (AWS KMS).
                                       You must generate the customer master key ID for the same region where the Amazon S3
                                       bucket resides. You can specify either the customer-generated customer master key ID or
                                       the default customer master key ID.

              Cluster Node Type        Node type of the Amazon Redshift cluster.
                                       You can select the following options:
                                       - ds1.xlarge
                                       - ds1.8xlarge
                                       - dc1.large
                                       - dc1.8xlarge
                                       - ds2.xlarge
                                       - ds2.8xlarge
                                       For more information about clusters, see the Amazon Redshift documentation.

              Number of Nodes in       Number of nodes in the Amazon Redshift cluster.
              the Cluster              For more information about nodes in the cluster, see the Amazon Redshift documentation.


      JDBC URL                 Amazon Redshift connection URL.

       Number of bytes          Not applicable.
       needed to support        This property is not supported because you cannot use the Create Target option from
       multibytes for varchar   the advanced target session properties on an Amazon Redshift target in the
                                PowerCenter Client.

  Configuring an Amazon Redshift Connection
      Configure an Amazon Redshift connection in the Workflow Manager to define the connection attributes that
      the PowerCenter Integration Service uses to connect to the Amazon Redshift database.

     1.   In the Workflow Manager, click Connections > Application.
          The Application Connection Browser dialog box appears.
     2.   Click New.
          The Select Subtype dialog box appears.
     3.   Select AmazonRSCloudAdapter and click OK.
          The Connection Object Definition dialog box appears.
     4.   Enter a name for the Amazon Redshift connection.
     5.   Enter the application properties for the connection.
     6.   Enter the Amazon Redshift connection attributes.
     7.   Click OK.


Amazon Redshift Source Sessions
     Create a mapping with an Amazon Redshift source and a target to read data from Amazon Redshift.

     You can encrypt data, specify the location of the staging directory, and securely unload the results of a query
     to files on Amazon Redshift.

Client-side Encryption
             Client-side encryption is a technique to encrypt data while writing the data to Amazon S3.

             Create a Master Symmetric Key, which is a 256-bit AES encryption key in Base64 format. To enable client-
             side encryption, provide a Master Symmetric Key or customer master key you created in the connection
             properties. The PowerCenter Integration Service encrypts the data by using the Master Symmetric Key or
             customer master key. PowerExchange for Amazon Redshift uploads the data to the Amazon S3 server by
             using the Master Symmetric Key or customer master key, and then loads the data by using the copy
             command with the Encrypted option and a private encryption key for additional security.

             The PowerCenter Integration Service encrypts the files that are uploaded to Amazon S3 on the client side. If
             you enable both server-side and client-side encryption for an Amazon Redshift target, client-side encryption
             is used for the data load. If you provide the customer master key ID generated by AWS Key Management
             Service in the Amazon Redshift connection properties, server-side encryption is used for the data load.

             To support encryption with maximum security, you must update the security policy .jar files
             local_policy.jar and US_export_policy.jar, which are located in the java\jre\lib\security directory of the
             Informatica installation. You can download the .jar files supported by your Java environment from the
             Oracle website.

      Identity Columns
             An identity column contains unique values that are automatically generated.

             Rules and Guidelines for Identity Columns
             •   The data type for an identity column must be either int or bigint.
             •   When you create a mapping for an insert operation, you must link either all the source and target identity
                 columns or none.
             •   When you create a mapping for an update, upsert or delete operation, you cannot map the identity
                 columns that are not part of the primary key.
             •   If an identity column is part of the primary key, you must map the column for update, upsert, and delete
                 operations, or the session fails. However, you cannot set a source value for these columns.
             •   The ExplicitID and MaxError count options are removed for the upsert, update, and delete operations.

      Unload Command
             You can use the Unload command to extract data from Amazon Redshift and create staging files on Amazon
             S3. The Unload command uses a secure connection to load data into one or more files on Amazon S3.

             You can specify the Unload command options directly in the UnloadOptions Property File field. Enter the
             options in uppercase and delimit the options by using a space. The Unload command has the following
             options and default values:

             DELIMITER=\036 ESCAPE=OFF PARALLEL=ON AWS_IAM_ROLE=arn:aws:iam::<account number>:role/<role name>

             You can also create a property file. The property file contains the Unload command options. Include the
             property file path in the UnloadOptions Property File field. For example:

             C:\Temp\Redshift\unloadoptions.txt

             In the property file, delimit the options by using a new line. For example:

             DELIMITER=\036

ESCAPE=OFF

PARALLEL=ON

AWS_IAM_ROLE=arn:aws:iam:::role/

We recommend that you use the octal representation of non-printable characters as the DELIMITER and QUOTE values.

If you run the Unload command as a pre-SQL or post-SQL command, specify the ALLOWOVERWRITE option to
overwrite the existing objects.
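The property-file layout described above can also be generated programmatically. The following is a minimal Python sketch, not part of the product: it writes one KEY=VALUE option per line and parses the file back. The file path is illustrative, and the role ARN reuses the example from this guide.

```python
import os
import tempfile

# Unload command options, one per line in the property file.
# The role ARN is the example ARN from this guide; the path is illustrative.
options = {
    "DELIMITER": r"\036",   # octal escape for the record-separator character
    "ESCAPE": "OFF",
    "PARALLEL": "ON",
    "AWS_IAM_ROLE": "arn:aws:iam::123123456789:role/redshift_read",
}

def write_property_file(path, opts):
    """Write KEY=VALUE pairs, one per line, as the property file expects."""
    with open(path, "w") as f:
        for key, value in opts.items():
            f.write(f"{key}={value}\n")

def read_property_file(path):
    """Parse the property file back into a dict, splitting on the first '='."""
    parsed = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                key, _, value = line.partition("=")
                parsed[key] = value
    return parsed

path = os.path.join(tempfile.gettempdir(), "unloadoptions.txt")
write_property_file(path, options)
assert read_property_file(path) == options
```

Splitting on the first "=" matters because option values such as the role ARN contain colons and slashes but never an embedded option separator.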

Unload Command Options
The Unload command options extract data from Amazon Redshift and load data to staging files on Amazon
S3 in a particular format. You can delimit the data with a particular character or load data to multiple files in
parallel.

To add options to the Unload command, use the UnloadOptions Property File option. You can set the
following options:
DELIMITER

    A single ASCII character that separates fields in the unloaded files. You can use characters such as
    pipe (|), tilde (~), or tab (\t). The delimiter that you specify must not be part of the data. If the
    delimiter is part of the data, use ESCAPE to read the delimiter character as a regular character.
    Default is \036, the octal representation of the non-printable record separator character.

ESCAPE

    You can add an escape character for CHAR and VARCHAR columns in delimited unload files before
    occurrences of the following characters:

     •   Linefeed \n
     •   Carriage return \r
     •   Delimiter character specified for the unloaded data
     •   Escape character \
     •   Single- or double-quote character

    Default is OFF.

PARALLEL
    The Unload command writes data in parallel to multiple files, according to the number of slices in the
    cluster. Default is ON. If you turn the Parallel option off, the Unload command writes data serially. The
    maximum size of a data file is 6.5 GB.

AWS_IAM_ROLE

    Specify the Amazon Resource Name (ARN) of the Amazon Redshift role to run the session on a PowerCenter
    Integration Service installed on an Amazon EC2 system, in the following format:
    AWS_IAM_ROLE=arn:aws:iam:::role/

    For example: arn:aws:iam::123123456789:role/redshift_read

ADDQUOTES

    ADDQUOTES is implemented with the UNLOAD command by default. Do not specify the ADDQUOTES
    option in the advanced source properties. The Unload command adds quotation marks to each data
    field. With added quotation marks, the UNLOAD command can read data values that contain the
    delimiter. If a double quote (") is part of the data, use ESCAPE to read the double quote as a regular
    character.
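The effect of ADDQUOTES can be illustrated outside Amazon Redshift: quoting a field lets the delimiter appear inside the data without splitting it. A minimal Python sketch using the standard csv module (the sample row and field values are hypothetical):

```python
import csv
import io

# A pipe-delimited row as an unload with added quotation marks might write it:
# the second field contains the delimiter, so quoting keeps it as one value.
row = '"1"|"OBrien|Smith"|"2017-09-01"\n'

reader = csv.reader(io.StringIO(row), delimiter="|", quotechar='"')
fields = next(reader)
assert fields == ["1", "OBrien|Smith", "2017-09-01"]
```

Without the quotation marks, the same row would parse into four fields instead of three, which is why the Unload command adds quotes by default.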

Partitioning
             If you need to extract a large amount of source data, you can partition the sources to improve session
             performance. Partitioning sources allows the PowerCenter Integration Service to create multiple connections
             to sources and process partitions of source data concurrently. You can partition sources if the PowerCenter
             Integration Service can maintain data consistency when it processes the partitioned data.

             By default, the Workflow Manager sets the partition type to pass-through for Amazon Redshift tables. In
             pass-through partitioning, the PowerCenter Integration Service passes all rows at one partition point to the
             next partition point without redistributing them.

             If you create multiple partitions for an Amazon Redshift source session, the PowerCenter Integration Service
             evaluates the session properties in the following order to run the session:

             1.   SQL Query
             2.   INFA ADVANCED FILTER
             3.   Slices on Amazon Redshift Nodes

             Configuring Partitioning
             When you create or edit a session, you can add partitions to the sources to improve session performance.

             Perform the following steps to configure partitioning:

             1.   In the Workflow Manager, open the session.
             2.   Double-click the session.
                  The Session Properties dialog box appears.
             3.   Click Mapping.
                  The Mapping tab appears.
             4.   Click Partitions and select the partition point where you want to add the partition.
             5.   Click Edit Partition Point.
                  The Edit Partition Point dialog box appears.
             6.   Click Add to add a partition.
                  Note: You can add multiple partitions as required.

The following image shows the Edit Partition Point dialog box where you can add partitions:

 7.   Under the Partition Type section, select the Pass Through partition.
 8.   Click OK.
      The Edit Partition Point dialog box closes.
 9.   In the Properties section on the Mapping tab, navigate to the INFA ADVANCED FILTER attribute.
10.   Add the filter conditions for each partition to optimize the search.
      For example, id>0 AND id
The following image shows the INFA ADVANCED FILTER attribute where you can add the filter
                  conditions:

            11.   Click OK.
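The filter conditions in step 10 must cover the key range without overlapping, so that each partition reads a distinct slice of the source. As a sketch, the following Python helper derives one condition per partition from a numeric key range. The column name and bounds are hypothetical, not from the guide:

```python
def partition_filters(column, low, high, partitions):
    """Split [low, high) into equal, non-overlapping filter conditions."""
    step = (high - low) // partitions
    conditions = []
    for i in range(partitions):
        start = low + i * step
        # The last partition absorbs any remainder of the range.
        end = high if i == partitions - 1 else start + step
        conditions.append(f"{column}>={start} AND {column}<{end}")
    return conditions

# Three partitions over ids 0 to 3000 (illustrative values):
for condition in partition_filters("id", 0, 3000, 3):
    print(condition)
```

Each generated condition can be pasted into the INFA ADVANCED FILTER attribute for the corresponding partition.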

      Amazon Redshift Source Session Configuration
             You can configure a session to read data from Amazon Redshift. Define the properties for each source in the
             session.

             The following table describes the session properties:

              Advanced Property        Description

              S3 Bucket Name           Amazon S3 bucket name for the Amazon Redshift source data.
                                       Use an S3 bucket in the same region as your Amazon Redshift cluster.

              Enable Compression       Compresses staged files before writing the files to Amazon Redshift.
                                       Session performance improves when the PowerCenter Integration Service compresses the
                                       staged files.
                                       Default is selected.

              Staging Directory        Amazon Redshift staging directory.
              Location                 Specify a directory on the machine that hosts the PowerCenter Integration Service.


      UnloadOptions            Path to the property file.
      Property File            Enables you to add options to the unload command to write data from an Amazon Redshift
                               object to an S3 bucket.
                               You can add the following options:
                               - DELIMITER
                               - PARALLEL
                               - ESCAPE
                               - AWS_IAM_ROLE
                               Either specify the path of the property file that contains the unload options or specify the
                               unload options directly in the UnloadOptions Property File field. Specify a directory on the
                               machine that hosts the PowerCenter Integration Service.

      Turn on S3 Client Side   Indicates that the PowerCenter Integration Service encrypts data before writing the data to
      Encryption               Amazon S3 by using a private encryption key.

      Enable Downloading       Downloads large Amazon S3 objects in multiple parts.
      S3 Files in Multiple     When the file size of an Amazon S3 object is greater than 8 MB, you can choose to download
      Parts                    the object in multiple parts in parallel.

      Infa Advanced Filter     SQL filter command to divide the source database into multiple segments.

      Pre-SQL                  The UNLOAD or COPY commands to read from or write to Amazon Redshift. The command
                               you specify here is processed as plain text.

      Post-SQL                 The UNLOAD or COPY commands to read from or write to Amazon Redshift. The command
                               you specify here is processed as plain text.

      SQL Query                Overrides the default query. Enclose column names in double quotes. The SQL query is case
                               sensitive. Specify an SQL statement supported by the Amazon Redshift database.

      Number of Sorted         Number of columns used when sorting rows queried from the source. The PowerCenter
      Ports                    Integration Service adds an ORDER BY clause to the default query when it reads source
                               rows. The ORDER BY clause includes the number of ports specified, starting from the top of
                               the transformation. When you specify the number of sorted ports, the database sort order
                               must match the session sort order.
                               Default is 0.

      Select Distinct          Selects unique values. The PowerCenter Integration Service includes a SELECT DISTINCT
                               statement if you choose this option. Amazon Redshift ignores trailing spaces. Therefore, the
                               PowerCenter Integration Service might extract fewer rows than expected.

      Source Table Name        You can override the default source table name.

Amazon Redshift Target Sessions
     Create a session and associate it with the mapping that you created to move data to an Amazon Redshift
     table. Change the connection to an Amazon Redshift connection, and define the session properties to write
     data to Amazon Redshift.

     You can perform insert, update, delete, and upsert operations on an Amazon Redshift target.

In the session properties, if you set Update as the value of the Treat Source Rows As property and
             select the Update as Insert option in the target session properties, the session runs successfully. However,
             the PowerCenter Integration Service rejects the data.

             Note: If the distribution key column in a target table contains null values and you configure a task with an
             Update as Insert operation for the same target table, the task might create duplicate rows. To avoid creating
             duplicate rows, you must perform one of the following tasks:

             •   Replace the null value with a non-null value when you load data.
             •   Do not configure the column as a distribution key if you expect null values in the distribution key column.
             •   Remove the distribution key column from the target table temporarily when you load data. You can use the
                 Pre-SQL and Post-SQL properties to remove and then add the distribution key column in the target table.

      Server-side Encryption for Amazon Redshift Targets
             If you want Amazon S3 to encrypt data while you upload the .csv files to Amazon Redshift, you must
             enable server-side encryption. To enable server-side encryption, select Server Side Encryption as the
             encryption type in the target session properties.

             You can configure the customer master key ID generated by AWS Key Management Service (AWS KMS) in the
             connection properties for server-side encryption. When you use IAM authentication and server-side
             encryption with a customer master key, you must add the IAM EC2 role and the IAM Redshift role to the
             customer master key. If you select server-side encryption in the target session properties but do not specify
             the customer master key ID in the connection properties, Amazon S3-managed encryption keys are used to encrypt data.

      Amazon Redshift Staging Directory
             The PowerCenter Integration Service creates a staging file in the directory that you specify in the session
             properties. The PowerCenter Integration Service writes the data to the staging directory before it writes the
             data to Amazon Redshift.

             The PowerCenter Integration Service deletes the staged files from the staging directory after it writes the
             data to Amazon S3. Specify a staging directory in the session properties with an appropriate amount of disk
             space for the volume of data that you want to process. Specify a directory on the machine that hosts the
             PowerCenter Integration Service.

             The PowerCenter Integration Service creates subdirectories in the staging directory. Subdirectories use the
             following naming convention:
                  /infaRedShiftStaging

      Vacuum Tables
             You can vacuum tables to recover disk space and sort rows in a specified table or in all tables in the
             database.

             After you run bulk operations, such as delete or load, or after you run incremental updates, you must clean
             the database tables to recover disk space and to improve query performance on Amazon Redshift. Amazon
             Redshift does not reclaim and reuse free space when you delete and update rows.

             You can configure vacuum table recovery options in the session properties. You can choose to recover disk
             space for the entire database or for individual tables in a database. Vacuum databases or tables often to
             maintain consistent query performance. You must run vacuum when you expect minimal activity on the
             database or during designated database administration schedules. Long durations of vacuum might impact
             database operations. Run vacuum often because large unsorted regions result in longer vacuum times.
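The recovery options described above correspond to Amazon Redshift VACUUM statements. As a sketch, the following Python helper assembles the statement text; the table name is hypothetical, and running the statement against a cluster is left out:

```python
def vacuum_sql(table=None, mode="FULL"):
    """Build an Amazon Redshift VACUUM statement.

    mode can be FULL, SORT ONLY, or DELETE ONLY; omitting the table
    name vacuums every table in the database.
    """
    statement = f"VACUUM {mode}"
    if table:
        statement += f" {table}"
    return statement + ";"

print(vacuum_sql())                     # entire database
print(vacuum_sql("public.orders"))      # a single table (hypothetical name)
print(vacuum_sql(mode="DELETE ONLY"))   # reclaim space without sorting
```

Statements like these can also be issued through the Pre-SQL or Post-SQL session properties during designated maintenance windows.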
