Integrating PostgreSQL on Google Cloud SQL with Redshift is an essential step in unlocking the power of data for modern businesses. By centralizing data in Redshift, a fully managed data warehousing service that provides high-performance analytical capabilities, you can expedite the analysis of large datasets. 

This integration not only enhances data analysis capabilities but also streamlines data management, offering access to comprehensive datasets in real time. It empowers you to extract actionable insights to improve your sales strategies and customer experiences. 

Methods to Connect PostgreSQL on Google Cloud SQL to Redshift

In this article, we will cover two straightforward methods of PostgreSQL on Google Cloud SQL to Redshift integration.

Let’s dive in!

Method 1: Manually Load Data from PostgreSQL on Google Cloud SQL to Redshift

Moving data from PostgreSQL on Google Cloud SQL to Amazon Redshift involves several steps. Here’s a general outline of the process.

Step 1: Export Data into CSV Files

You can use the Google Cloud Console to export data from your PostgreSQL database in Google Cloud SQL into CSV files. To export the data as CSV using the console, follow these steps (a gcloud CLI alternative appears after the list):

  • Go to the Google Cloud Console and sign in to your Google Cloud account.
  • From the navigation menu, select SQL under the Storage section.
  • Click on the name of your Google Cloud SQL PostgreSQL instance from the list.
  • On the PostgreSQL instance overview page, click on the Export tab.
  • Configure the export options:
      • Select the database that you want to export.
      • Choose CSV as the export format.
      • Specify the tables that you want to export. You can export all of them or select specific ones.
      • For the export location, choose the Cloud Storage bucket as the destination for your CSV files.
  • To initiate the export process, click on the Export button.
  • Google Cloud SQL will start exporting the data from your PostgreSQL database into CSV files and save them in the specified Cloud Storage bucket.
  • The CSV files are now temporarily stored in the Google Cloud Storage bucket before being transferred to the Amazon S3 bucket. The Amazon S3 bucket acts as a staging area where you can store and organize your data before loading it into the Redshift table.
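
If you prefer the command line, the same export can be scripted with the gcloud CLI. Here is a minimal sketch; the instance name my-postgres-instance, database sales_db, table orders, and bucket my-export-bucket are placeholders for your own values:

gcloud sql export csv my-postgres-instance gs://my-export-bucket/orders.csv \
  --database=sales_db \
  --query="SELECT * FROM orders"

Note that gcloud sql export csv exports the result of a single query, so you would run one command per table.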

Step 2: Upload CSV Files to Amazon S3

  • Under the Cloud Storage section, open the Google Cloud Storage bucket where the CSV files are placed.
  • Select the checkboxes next to the CSV files, then click the Download button. The files will be downloaded to your local machine.
  • Once the files are downloaded, you can upload them from your local machine to your S3 bucket using either the AWS CLI or the AWS console.
  • Using AWS CLI:
  1. Configure the AWS CLI and use the aws s3 cp command to upload files from the local machine to the Amazon S3 bucket.
aws s3 cp /path_to_local_folder/ s3://s3_bucket_name/folder_name/ --recursive

Replace path_to_local_folder with the path to the folder on your local machine that you want to copy and s3_bucket_name/folder_name with your S3 bucket and folder name. The --recursive flag is required when copying a folder; omit it if you are copying a single file.
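
If you also want to script the download from Cloud Storage instead of using the console, gsutil can pull the exported files to your machine first. A minimal sketch, using the same placeholder names plus a hypothetical gcs_bucket_name:

gsutil cp gs://gcs_bucket_name/folder_name/*.csv /path_to_local_folder/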

  • Using AWS Console:
  1. Select the S3 bucket and open the folder where you want to upload the CSV files.
  2. Click on the Upload button and choose Add Files.
  3. Browse your local machine to select the CSV files you want to upload and configure any necessary settings.
  4. Click on Start Upload.
  • Once the transfer is complete, verify that the CSV files were copied correctly to your S3 bucket using the AWS Management Console.

Step 3: Load Data into Redshift

  • In the Amazon Redshift cluster, create a new table or use an existing one. Ensure that the column names, data types, and constraints align with the source PostgreSQL tables; an example table definition follows this list.
  • In Amazon Redshift, the COPY command is used to load data into Redshift tables from various sources, including Amazon S3, Amazon DynamoDB, Amazon EMR, and remote hosts over SSH. This command supports various data formats, including JSON, CSV, Avro, Parquet, and more.
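
As an illustration, here is a sketch of a Redshift table definition for a hypothetical orders table; adjust the columns and types to mirror your actual PostgreSQL schema:

CREATE TABLE orders (
    order_id     BIGINT NOT NULL,
    customer_id  BIGINT,
    order_date   DATE,
    amount       DECIMAL(12,2),
    PRIMARY KEY (order_id)
);

Note that Redshift accepts but does not enforce primary key constraints; they still serve as useful hints to the query planner.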

Use the COPY command to load data from the S3 bucket to Amazon Redshift.

COPY redshift_table_name
FROM 's3://s3-bucket_path/data-prefix/' 
CREDENTIALS 'aws_access_key_id=ACCESS_KEY_ID;aws_secret_access_key=SECRET_ACCESS_KEY' 
CSV;

Replace redshift_table_name, the S3 bucket path, and the AWS credentials with your actual values. The CSV keyword indicates that the data is in CSV format.
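
If your cluster has an IAM role with S3 read access attached, you can avoid embedding access keys by using role-based authorization instead. The role ARN below is a placeholder:

COPY redshift_table_name
FROM 's3://s3-bucket_path/data-prefix/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
CSV;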

These steps complete the process of transferring data from PostgreSQL on Google Cloud SQL to Redshift using CSV files, the Google Cloud Console, Amazon S3, and the COPY command.
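
After the COPY finishes, a quick sanity check helps confirm the load succeeded. The stl_load_errors system table records any rows Redshift rejected:

SELECT COUNT(*) FROM redshift_table_name;

SELECT filename, line_number, err_reason
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 5;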

The above approach offers several advantages:

  • Ease of Use: The Google Cloud Console provides a user-friendly interface to export data from Cloud SQL in CSV format. It also eliminates the need for coding, simplifying the export process for both technical and non-technical users.
  • Infrequent Backups: The manual approach using CSV files is best suited for creating backups of small to moderate datasets, particularly in scenarios where you don’t need continuous data replication.

Limitations of using CSV Files, Google Cloud Console, and AWS S3 for PostgreSQL on Google Cloud SQL to Redshift Data Migration

  • Manual Intervention: The process of exporting Google Cloud SQL PostgreSQL data into CSV files and then transferring them from S3 to Redshift requires manual intervention at several stages. This can be time-consuming, especially for larger datasets, and requires continuous monitoring and management at each step of the data transfer.
  • File Size Limitations: Google Cloud Storage supports a maximum single-object size of 5 TB. If you try to upload files larger than this limit, the transfer for those files will fail. In such situations, you need to divide larger files into smaller segments to ensure successful uploads, as shown in the sketch below.
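
One way to segment a large CSV is the Unix split utility, which here splits by line count so individual rows are never broken across files. The file name and line count are placeholders, and this assumes the export has no header row, which is the default for Cloud SQL CSV exports:

split -l 1000000 large_export.csv orders_part_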

Method 2: Using a No-Code Tool like Hevo Data to Build PostgreSQL on Google Cloud SQL to Redshift ETL Pipeline

Using a no-code tool enables quick and accurate data integration from various sources, reducing the time required to set up data pipelines. Such tools also eliminate the limitations of the manual method above, significantly reducing manual tasks and effort. With streamlined, no-code data integration, you can make informed decisions quickly.

Hevo Data is a cost-effective no-code data replication tool that helps you seamlessly transfer data from Google Cloud PostgreSQL and 150+ other data sources to Redshift. Its Kafka-based architecture enables real-time streaming and offers flexibility for pre-load transformations and schema management.

Step 1: Specify Google Cloud PostgreSQL Connection Settings 

In the Hevo dashboard, configure Google Cloud PostgreSQL as the source by providing the connection details for your instance, such as the host, port, database name, and user credentials.

Step 2: Configure Redshift as a Destination 

Next, configure Amazon Redshift as the destination by providing your cluster’s connection details and the target database and schema.

With these two simple steps, you’ve completed PostgreSQL on Google Cloud SQL to Redshift integration using the Hevo data replication platform.

Some of the notable features of Hevo Data:

  • Intuitive Interface: Hevo offers a user-friendly interface that allows both technical and non-technical users to set up and manage data pipelines quickly.
  • Drag-and-Drop Transformation: Through Hevo’s drag-and-drop interface, you can apply simple transformations without the need for coding. This allows you to clean and enrich data before loading it into the Redshift table. However, for more complex transformations, you can use a Python-based console.
  • Automated Schema Mapper: Hevo’s Automapper automatically analyzes the source schema and updates the changes in the destination schema. This feature will create compatible data types in your Redshift table without any human intervention.
  • Monitoring and Alerts: You can use the alert features to stay updated on the status of your PostgreSQL on Google Cloud SQL to Redshift data pipeline. This allows you to proactively monitor and address any issues, ensuring uninterrupted data flow.

What Can You Achieve with PostgreSQL on Google Cloud SQL to Redshift Integration?

Migrating data from PostgreSQL on Google Cloud SQL to Redshift can assist you with the following scenarios:

  • What are the overall sales trends over time, and which products or services generate the highest revenue?
  • Which marketing campaigns correlate with increased sales?
  • Is there any correlation between customer demographics and purchasing habits?
  • Does customer experience or feedback influence buying patterns?
  • What are the Key Performance Indicators (KPIs) for each team, and how have they improved over time?
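
As a concrete example, the first question maps to a simple aggregate query once the data is in Redshift. The table and column names below are hypothetical:

SELECT DATE_TRUNC('month', order_date) AS month,
       product_category,
       SUM(amount) AS revenue
FROM orders
GROUP BY 1, 2
ORDER BY month, revenue DESC;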

Conclusion

The integration of PostgreSQL on Google Cloud SQL with Redshift can enhance your business’s data management. Although the manual approach offers certain benefits, such as simplicity and suitability for infrequent backups, it comes with limitations. One of these is the lack of real-time synchronization due to the manual intervention required.

However, leveraging a no-code solution like Hevo Data simplifies the process of transferring data from PostgreSQL on Google Cloud SQL to Redshift. Hevo provides real-time data synchronization, multiple connectors, and a completely automated solution to streamline the integration process.

If you want to integrate data into your desired database or destination, Hevo Data is the right choice for you! It helps simplify the ETL process and the management of both your data sources and destinations. 

Visit our Website to Explore Hevo

Offering 150+ plug-and-play integrations and saving countless hours of manual data cleaning and standardizing, Hevo Data also provides in-built pre-load data transformations that get the job done in minutes via a simple drag-and-drop interface or your custom Python scripts. 

Want to take Hevo Data for a spin? SIGN UP for a 14-day free trial and experience the feature-rich Hevo suite firsthand. Check out the pricing details to understand which plan fulfills all your business needs.

Tejaswini Kasture
Freelance Technical Content Writer, Hevo Data

Tejaswini's profound enthusiasm for data science and passion for writing drive her to create high-quality content on software architecture and data integration.
