Migrating data from Google Cloud PostgreSQL to SQL Server is a strategic move that addresses several business needs. First, it enables seamless integration within the Microsoft ecosystem, facilitating efficient data sharing and management across platforms.

Second, it unlocks scalability, ensuring that your data infrastructure can expand with growing demand. Beyond these benefits, the integration gives organizations access to SQL Server’s enhanced security features, disaster recovery mechanisms, and high availability. This not only streamlines your data operations but also supports in-depth analysis.

In this article, you’ll learn two popular ways to replicate data from Google Cloud PostgreSQL to SQL Server.

Method 1: Load Data from PostgreSQL on Google Cloud SQL to SQL Server using CSV Files

To connect PostgreSQL on Google Cloud SQL to SQL Server using CSV files, you can follow the steps given below. Before you begin, make sure you have the following prerequisites in place:

  • Access to Google Cloud SQL: You need access to the Google Cloud SQL instance containing the PostgreSQL database, with permission to export data from it.
  • SQL Server: You need the necessary permissions and access on the SQL Server instance where you want to import the data.

Now, let’s walk through the detailed steps to migrate data.

Step 1: Export Data from Google Cloud SQL PostgreSQL

There are various ways to export data from Google Cloud SQL PostgreSQL into CSV files. Common approaches include the psql \copy command, the Google Cloud Console, and the pg_dump command.

Using the COPY Command
  • You can use psql’s \copy command, a client-side wrapper around PostgreSQL’s COPY, to export a table to a CSV file on your machine. (A plain COPY ... TO '<file>' writes to the database server’s filesystem, which you cannot access on a managed Cloud SQL instance.)
  • Connect to your PostgreSQL database using the CLI:
psql -h <google-sql-host> -U <username> -d <database-name>
  • Now, use the \copy command to export data from the Google Cloud SQL PostgreSQL table into a CSV file:
\copy table_name TO '/path_to_your_directory/file.csv' WITH (FORMAT csv, HEADER)

Replace the placeholders with your PostgreSQL database details and specify the name and path where you want to save the CSV file.
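If you prefer to script this export, the same extraction can be done with Python’s psycopg2 driver, which streams the COPY output to a client-side file. Here is a minimal sketch; the connection details and table name are placeholders to replace with your own:

import psycopg2

# Placeholder connection details -- replace with your Cloud SQL values.
conn = psycopg2.connect(
    host="<google-sql-host>",
    user="<username>",
    password="<password>",
    dbname="<database-name>",
)

# copy_expert streams the COPY output to a file on the client machine,
# so no access to the managed instance's filesystem is needed.
with conn, conn.cursor() as cur, open("file.csv", "w") as f:
    cur.copy_expert("COPY table_name TO STDOUT WITH (FORMAT csv, HEADER)", f)

conn.close()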

Using Google Cloud Console
  • Open the Google Cloud Console and select the Google Cloud project where your Cloud SQL instance is located.
  • Select your PostgreSQL instance by clicking its name in the instance list, then click Export.
  • Choose Offload export to enable concurrent operations while the export is in progress.
  • In the Cloud Storage export location section, enter the bucket, folder, and file name for your export, or use the Browse option to locate them:
    • In the Location section, select a Cloud Storage bucket or folder for the export.
    • In the Name field, enter a name for the CSV file, or choose an existing file from the options in the Location section.
    • Click Select.
  • Select CSV in the Format section.
  • From the Database for export section, select the database name from the drop-down menu.
  • Now, enter a SQL query to specify the data you want to export. For example, a query that exports a specific table would look like:
SELECT * FROM Company.employees;

where employees is the name of the table in the Company schema.

  • Finally, click on the Export button to initiate the transfer process.
  • Now, open the Cloud Storage bucket where your CSV files are stored and download them to your local machine, either from the console or with a script, as shown below.
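If the export produced several files, the download can be scripted with the google-cloud-storage client library. A minimal sketch, assuming hypothetical bucket and object names:

from google.cloud import storage

# Hypothetical bucket and object names -- replace with your own.
client = storage.Client()
bucket = client.bucket("my-export-bucket")
bucket.blob("exports/employees.csv").download_to_filename("employees.csv")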

Using the pg_dump Command
  • You can use the pg_dump utility to export a specific table from your PostgreSQL database into a file. With the --column-inserts flag shown below, the file contains plain SQL: the table definition plus one INSERT statement per row.
  • Open a command prompt on your local machine or on a server with access to your Google Cloud SQL instance, and run:
pg_dump -h <google-sql-host> -U <username> -d <database-name> -t <table-name> --column-inserts --file=/path_to_your_directory/data.sql

Replace <google-sql-host> with the hostname or IP address of your Google Cloud SQL instance, and <username>, <database-name>, and <table-name> with the respective PostgreSQL database details. The --file option specifies the path and name of the output file.

  • The output file is in SQL format, so you need to convert it into a CSV file using a Bash or Python script, such as the sketch below.
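As a rough illustration, the following Python sketch extracts the values from the --column-inserts statements with a regular expression and writes them to a CSV file. It assumes simple scalar values; strings containing commas or escaped quotes would need a proper SQL parser.

import csv
import re

# Matches lines like: INSERT INTO public.employees (id, name) VALUES (1, 'Alice');
pattern = re.compile(r"INSERT INTO \S+ \((?P<cols>.*?)\) VALUES \((?P<vals>.*)\);")

with open("data.sql") as src, open("data.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    header_written = False
    for line in src:
        match = pattern.match(line)
        if not match:
            continue
        if not header_written:
            writer.writerow(col.strip() for col in match["cols"].split(","))
            header_written = True
        # Strip the quotes around string literals; naive for values that
        # themselves contain commas or escaped quotes.
        writer.writerow(val.strip().strip("'") for val in match["vals"].split(","))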

By using any of these methods, you’ll have Google Cloud SQL PostgreSQL data in CSV file format on your local machine.

Step 2: Clean the CSV Files

After downloading the CSV files, perform data mapping and transformation, accounting for the differences in data types and schema structure between PostgreSQL and SQL Server. You also need to check for duplicate rows, handle missing or null values, and review the data for accuracy.

Before uploading to SQL Server, ensure the data is in the correct format.
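A minimal cleaning sketch using pandas is shown below; the file name, column names, and date format are hypothetical stand-ins for your own schema.

import pandas as pd

df = pd.read_csv("employees.csv")

# Drop exact duplicate rows and rows missing required fields
# ("id" and "name" are hypothetical required columns).
df = df.drop_duplicates()
df = df.dropna(subset=["id", "name"])

# Normalize a date column to a format SQL Server parses reliably.
df["hire_date"] = pd.to_datetime(df["hire_date"]).dt.strftime("%Y-%m-%d")

df.to_csv("employees_clean.csv", index=False)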

Step 3: Import Data into SQL Server

You can use SQL Server Management Studio (SSMS) or a T-SQL command to import the CSV data into SQL Server. (A scripted Python alternative is sketched after the steps below.)

Below are the SSMS steps for importing data from a CSV file into SQL Server:

  • Open SSMS and connect to your SQL Server instance. Create a target table in your SQL Server database to hold the CSV data, or use an existing one.
  • Select your database from the Object Explorer.
  • Right-click on the target database > Tasks > Import Data. In the SQL Server Import and Export Wizard window, click Next to start the process.
  • In the Data Source section, select the source of your data. Choose Flat File Source, as your data is in a CSV format. Click Browse to specify the path of your CSV file and configure options like delimiter and text qualifier as needed. Click Next.
  • In the Destination section, select the SQL Server database and table where you want to import the data. Provide the necessary details and click Next.
  • Use Edit Mappings to check that the columns from the source match the columns of the target table.
  • In the Run Package section, review your settings and click on Next.
  • Wait for SSMS to complete the import. You’ll see a summary of the transfer process. Finally, click on Finish.
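If you would rather script the load than click through the wizard (for example, to repeat it for many tables), here is a minimal Python sketch using the pyodbc driver. The connection details, driver name, and target table are assumptions to replace with your own; values are inserted as strings and rely on SQL Server's implicit conversion.

import csv
import pyodbc

# Assumed connection details and target table -- replace with your own.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server;DATABASE=your-db;UID=your-user;PWD=your-password;"
    "TrustServerCertificate=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch the inserts for speed

with open("employees_clean.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    cursor.executemany(
        "INSERT INTO dbo.employees (id, name, hire_date) VALUES (?, ?, ?)",
        list(reader),
    )

conn.commit()
conn.close()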

These three steps complete the data migration process from PostgreSQL on Google Cloud SQL to SQL Server.

While the CSV-based method above can be time-consuming and resource-intensive, it offers several advantages:

  • Minimal Dependencies: Working with CSV files often requires minimal software or infrastructure dependencies, making it accessible even in resource-constrained environments. This minimal dependency contributes to lower complexity and resource demands during data migration.
  • One-Time Transfer: This approach is particularly suitable when you only need an infrequent, one-off data transfer from Google Cloud SQL for PostgreSQL to SQL Server. It avoids the complexity of building and maintaining a robust data pipeline.

On the other hand, using CSV files does have some drawbacks:

  • Manual Process: Exporting and importing CSV files is a time-consuming process as you need to repeat the same steps for each table. This can be a burdensome task for larger datasets.
  • Lack of Incremental Updates: Using CSV files does not inherently support incremental updates. To keep data up-to-date in SQL Server, you would need to repeat the export and import process for the entire dataset.

Method 2: Using a No-Code Tool like Hevo Data to Build PostgreSQL on Google Cloud SQL to SQL Server ETL Pipeline

A no-code tool simplifies the data migration process by eliminating the need for manual coding, reducing errors, and saving time. It provides a user-friendly interface for extracting data from various sources and loading it into your destination. No-code tools often offer built-in automation and data transformation features, streamlining the entire migration process.

Hevo Data is a robust, no-code, cloud-based data replication solution designed to streamline the process of collecting, transforming, and loading data. 

Here’s a step-by-step guide to complete PostgreSQL on Google Cloud SQL to SQL Server migration, but before that, let’s take a quick look at the prerequisites.

PostgreSQL on Google Cloud SQL:

  • PostgreSQL version should be 9.4 or higher.
  • You should have the hostname or IP address of your PostgreSQL server.
  • All of Hevo’s IP addresses should be whitelisted.
  • SELECT, USAGE, and CONNECT permissions must be granted to the database user. 
  • To create the source pipeline in Hevo, you need to have the Team Administrator, Team Collaborator, or Pipeline Administrator role.

SQL Server:

  • Make sure your SQL Server is running.
  • The SQL Server host’s IP address or DNS name, and its port, must be available.
  • Necessary user permissions on the database should be granted.
  • To create the destination in Hevo, you must be assigned the Team Collaborator or any administrator role except the Billing Administrator role.

Step 1: Configure Google Cloud PostgreSQL as a Source

[Image: Configuring PostgreSQL on Google Cloud SQL as a Source in Hevo]

Step 2: Configure SQL Server as a Destination

[Image: Configuring SQL Server as a Destination in Hevo]

That’s it! With these two simple steps, you’ve completed the PostgreSQL on Google Cloud SQL to SQL Server migration process.

Some of the key features of Hevo Data include:

  • Pre-built Connectors: With its 150+ pre-built connectors, you can set up data pipelines in just two steps without manual interventions. This feature makes Hevo accessible to a wide range of technical as well as non-technical users.
  • Schema Mapping: Hevo automates schema mapping by detecting changes in the source schema and applying them to SQL Server, keeping your PostgreSQL on Google Cloud SQL pipeline in sync without manual intervention.
  • Customer Support: If you get stuck while creating a pipeline, the Hevo team is available to provide support, ensuring a smooth data integration experience. These features make Hevo a preferred choice for worry-free, data-driven decision-making.

What Can You Achieve with PostgreSQL on Google Cloud SQL to SQL Server Integration?

Centralizing data in SQL Server gives you access to comprehensive customer data and helps answer questions like the following (an example query appears after the list):

  • Are there any specific geographic regions with exceptional sales performance?
  • What are the trends that can help identify sales conversion opportunities?
  • Who are your most prominent customers, and how frequently do they purchase?
  • What are the conversion rates for leads generated by the marketing team?
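As an illustration, a query like the one below would answer the first question once the data lands in SQL Server. The table and column names are hypothetical, and the connection string reuses the assumed details from the import step.

import pyodbc

# Same assumed connection details as in the import step.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server;DATABASE=your-db;UID=your-user;PWD=your-password;"
    "TrustServerCertificate=yes;"
)
cursor = conn.cursor()

# Hypothetical schema: a sales table with region and amount columns.
cursor.execute(
    "SELECT region, SUM(amount) AS total_sales "
    "FROM dbo.sales GROUP BY region ORDER BY total_sales DESC"
)
for region, total_sales in cursor.fetchall():
    print(region, total_sales)

conn.close()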

Conclusion

Connecting PostgreSQL on Google Cloud SQL to SQL Server elevates your ability to analyze data and extract actionable insights quickly. The CSV-based approach is suitable for infrequent transfers, but it requires manual intervention and offers no real-time updates, which can delay insights because your SQL Server copy won’t stay current.

On the other hand, Hevo simplifies the data replication process from PostgreSQL on Google Cloud SQL to SQL Server with two straightforward steps. It offers 150+ in-built connectors, ensures real-time data synchronization, and automates schema mapping. These functionalities make Hevo an efficient solution to streamline various data integration needs.

If you don’t want SaaS tools with unclear pricing that burn a hole in your pocket, opt for a tool that offers a simple, transparent pricing model. Hevo has 3 usage-based pricing plans starting with a free tier, where you can ingest up to 1 million records.

Schedule a demo to see if Hevo would be a good fit for you, today!

Tejaswini Kasture
Freelance Technical Content Writer, Hevo Data

Tejaswini's profound enthusiasm for data science and passion for writing drive her to create high-quality content on software architecture and data integration.
