Azure Database for MySQL (Azure MySQL) is a managed MySQL service from Microsoft. It is a cost-effective relational database platform that handles transactional workloads well. However, it offers limited scalability and analytics features. This is where Google BigQuery becomes a suitable option.

BigQuery can handle data at a petabyte scale. It also offers built-in machine learning capabilities, geospatial analysis functions, and support for complex SQL queries. By migrating your data from Azure MySQL to Google BigQuery, you can perform complex data analytics.

In this article, you will learn about two methods for data migration from Azure MySQL to BigQuery. 

Why Integrate Azure MySQL to BigQuery?

You should integrate Azure MySQL with BigQuery for the following reasons:

  • By integrating with BigQuery, you can analyze data at a petabyte scale, which is not possible with Azure MySQL.
  • You can utilize the machine learning capabilities of BigQuery to perform complex analyses that are not supported by Azure MySQL.
  • BigQuery enables you to integrate your data with business intelligence tools like Tableau. This can help you visualize your data and gain better business insights.

Overview of Azure MySQL

Azure Database for MySQL is Microsoft’s cloud-based relational database service that hosts and runs MySQL relational databases. It automates database management and maintenance tasks like routine updates, backups, and security. Thus, you can deploy, manage, and scale MySQL database workloads in Azure without worrying about infrastructure management. 

Some key features of Azure MySQL are:

  • Automatic Backup: Azure MySQL performs daily automated backups of your databases and supports point-in-time restoration.
  • Security: Azure MySQL allows you to restrict data access using Virtual Network Service Endpoints. It also supports SSL/TLS data encryption for data in transit. 
  • Monitoring: Azure MySQL integrates with Azure Monitor to generate alerts for monitoring databases. 

Overview of Google BigQuery

Google BigQuery is a fully managed, serverless data warehouse that lets you manage and analyze data with SQL queries. It is highly scalable, allowing you to analyze huge datasets within minutes.

Here are some key features of BigQuery: 

  • Optimized Storage Solution: Google BigQuery can store large amounts of data through its columnar storage approach, accelerating data retrieval time compared to traditional row-based storage systems.
  • Real-Time Data Streaming: In addition to having batch processing capabilities, BigQuery supports real-time data streaming, enabling data analysis as it arrives. 
  • Security: Google BigQuery provides data encryption at rest and in transit, fine-grained access management, and audit logging capabilities to control who can access your data. 
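To illustrate the real-time streaming feature, here is a minimal Python sketch of shaping rows for BigQuery's streaming API. The table name and column names are placeholders, and the actual insert call (shown in comments) assumes the google-cloud-bigquery package and configured credentials:

```python
def rows_for_streaming(records, columns):
    """Convert tuples (e.g. fetched from MySQL) into the list-of-dicts
    shape that BigQuery's streaming insert API expects."""
    return [dict(zip(columns, rec)) for rec in records]

# With google-cloud-bigquery installed and credentials configured, the
# rows can then be streamed (the table ID is a placeholder):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   errors = client.insert_rows_json("project.dataset.table",
#                                    rows_for_streaming(records, columns))
#   # insert_rows_json returns an empty list when all rows are accepted
#
records = [(1, "Alice"), (2, "Bob")]
print(rows_for_streaming(records, ["id", "name"]))
```

The shaped rows become available for querying almost immediately after the streaming insert succeeds.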

Methods to Migrate Data from Azure MySQL to BigQuery

Here are two methods to export data from Azure MySQL to BigQuery:

  • Method 1: Using Hevo to Migrate Data from Azure MySQL to BigQuery
  • Method 2: Using CSV File to Migrate Data from Azure MySQL to BigQuery

Method 1: Using Hevo to Integrate Azure MySQL to BigQuery

Hevo Data is a no-code ELT platform that provides real-time data integration and a cost-effective way to automate your data pipeline workflow. With over 150 source connectors, you can integrate your data into multiple platforms, conduct advanced analysis on your data, and produce useful insights. Here are some of the most important features of Hevo Data:

  • Data Transformation: Hevo Data allows you to transform your data for analysis using simple Python-based scripts or a drag-and-drop interface.
  • Automated Schema Mapping: Hevo Data automatically arranges the destination schema to match the incoming data. It also lets you choose between Full and Incremental Mapping. 
  • Incremental Data Load: It ensures proper bandwidth utilization at both the source and the destination by allowing real-time data transfer of the modified data.


You can use Hevo to load data from Azure MySQL to BigQuery by following the steps below: 

Step 1: Configure Azure MySQL as Your Source

Prerequisites

After meeting the prerequisites, follow the steps below to configure Azure MySQL as the source:

  • Click on the Pipelines tab in the Navigation Bar.
  • Click the + CREATE button from Pipeline List View.
  • Select Azure MySQL on the Select Source Type page.
  • On the Configure your Azure MySQL Source page, specify details like the Pipeline Name, Database Host, Port, User, and Password.

Click the Test & Continue button to complete the source setup.

Azure MySQL to BigQuery: Configure Source Settings


For more information on the configuration of Azure MySQL as a source, refer to the Hevo documentation.

Step 2: Configure Google BigQuery as Your Destination

Prerequisites

  • Create a Google Cloud Project if you do not have one already. 
  • Assign the essential roles for the GCP project to the connecting Google account, in addition to the Owner or Admin role.
  • Ensure that the active billing account is associated with the GCP project. 
  • To create a destination, you must be assigned the Team Collaborator role or any administrative role except Billing Administrator. 

Here are the steps to configure BigQuery as a destination:

  • Click DESTINATIONS from the Navigation Bar.
  • In the Destinations List View, click + CREATE.
  • In the Add Destination page, select Google BigQuery as the Destination type.
  • In the Configure your Google BigQuery Destination page, specify the following details:
Azure MySQL to BigQuery: Configure Destination Settings


For more information on the configuration of Google BigQuery as the destination, refer to Hevo documentation.  

The resulting Hevo Data pipeline will copy data from Azure MySQL to BigQuery, allowing you to conduct further analysis.

Method 2: Using CSV file to Integrate Azure MySQL to BigQuery

To export data from Azure MySQL to Google BigQuery, you can use a CSV file. Here are the steps for it:

Step 1: Loading Data from Azure MySQL to CSV File Using MySQL Workbench

MySQL Workbench exports data in two ways: using the object browser context menu and the Navigator Pane. You can use either of these as per your requirements.

Using Object Browser Context Menu to Export Data to CSV from Azure MySQL
Azure MySQL to BigQuery: Object Browser Context Menu for Azure MySQL to CSV Export


  • Right-click the table of the database that you want to export.
  • Select Table Data Export Wizard. Choose the columns you want to export, along with the row offset and count (if any).
  • Click Next on the Select data for export pane. Choose the file path and the CSV file type.
  • Click Next on the Select output file location pane.
  • Click Next on the Export data pane to start the export.
Using Navigator Pane to Export Data to CSV from Azure MySQL
Azure MySQL to BigQuery: Navigation Pane to Export Azure MySQL to CSV


  • From the Navigator Pane in MySQL Workbench, select Data Export.
  • Select the schema you want to export from the Data Export pane. You can select the schema objects or tables you want to export for each schema. Alternatively, you can use Export a Result Set to export a particular result set in the SQL editor to a different format like CSV, JSON, or HTML.
  • Select the database object that you want to export and configure the options related to it. 
  • Click on Refresh to load the current objects. You can also select Advanced Options at the upper right to refine the export operation. 
  • Select Start Export to begin the CSV export process. 
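If you prefer a scripted export over the Workbench UI, the same result can be sketched in Python. The helper below uses only the standard library; the commented connection example assumes the mysql-connector-python package, and the host, credentials, and table names are placeholders:

```python
import csv

def write_rows_to_csv(path, header, rows):
    """Write a header row followed by data rows to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

# In practice the header and rows would come from Azure MySQL, e.g. with
# the mysql-connector-python package (all connection details below are
# placeholders):
#
#   import mysql.connector
#   conn = mysql.connector.connect(
#       host="<server-name>.mysql.database.azure.com",
#       user="<user>", password="<password>", database="<database>",
#   )
#   cur = conn.cursor()
#   cur.execute("SELECT id, name FROM customers")
#   write_rows_to_csv("customers.csv",
#                     [col[0] for col in cur.description], cur.fetchall())

write_rows_to_csv("sample.csv", ["id", "name"], [(1, "Alice"), (2, "Bob")])
```

The resulting CSV file can then be uploaded to Google Cloud Storage in the next step.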

Step 2: Load CSV File to Google Cloud Storage

You can upload the CSV file containing data extracted from Azure MySQL to Google Cloud Storage using the following steps:


  • Log in to your Google Cloud account and click on Go to Console.
Azure MySQL to BigQuery: CSV to GCS Export


  • From the Navigation Menu on the left side, click Storage > Browser.
Azure MySQL to BigQuery: CSV to GCS Export


  • Click on Create Bucket to create a new bucket that acts like a folder to store your files. 
Azure MySQL to BigQuery: CSV to GCS Export


  • Enter a unique name for your bucket in the Name Your Bucket section and click on CREATE at the bottom of the page.
Azure MySQL to BigQuery: CSV to GCS Export


  • You can either Upload files or Drop them in the drop zone.

Azure MySQL to BigQuery: CSV to GCS Export

  • You can now access your CSV file from the dashboard once it is uploaded.
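The upload in this step can also be done programmatically. Below is a minimal sketch, assuming the google-cloud-storage package and application-default credentials; the bucket and file names are placeholders. The cloud import is deferred inside the function so the helpers can be defined without the package installed:

```python
def gcs_uri(bucket_name, object_name):
    """Build the gs:// URI that BigQuery will read from in the next step."""
    return f"gs://{bucket_name}/{object_name}"

def upload_csv_to_gcs(bucket_name, source_path, object_name):
    """Upload a local CSV file to a Cloud Storage bucket."""
    from google.cloud import storage  # deferred import, see note above
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_filename(source_path)
    return gcs_uri(bucket_name, object_name)

# Example (performs a real upload when credentials are configured):
# print(upload_csv_to_gcs("my-bucket", "customers.csv", "exports/customers.csv"))

print(gcs_uri("my-bucket", "exports/customers.csv"))
```

The returned gs:// URI is exactly what the BigQuery load job in Step 3 takes as its source.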

Step 3: Load Data from CSV File in Google Cloud Storage to Google BigQuery

You can transfer the CSV data stored in Google Cloud Storage to BigQuery using the following Python code:

from google.cloud import bigquery
# Construct a BigQuery client object.
client = bigquery.Client()

# Set table_id to the ID of the table to create.
table_id = "your-project.your_dataset.your_table_name"

job_config = bigquery.LoadJobConfig(
    # Replace the sample schema below with the columns of your exported table.
    schema=[
        bigquery.SchemaField("name", "STRING"),
        bigquery.SchemaField("post_abbr", "STRING"),
    ],
    skip_leading_rows=1,
    # The source format defaults to CSV, so the line below is optional.
    source_format=bigquery.SourceFormat.CSV,
)

# Replace this sample URI with the gs:// path of the CSV file you uploaded
# to your bucket in Step 2.
uri = "gs://cloud-samples-data/bigquery/us-states/us-states.csv"

load_job = client.load_table_from_uri(
    uri, table_id, job_config=job_config
)  # Make an API request.

load_job.result()  # Waits for the job to complete.

destination_table = client.get_table(table_id)  # Make an API request.
print("Loaded {} rows.".format(destination_table.num_rows))

Thus, with these steps, you can move your Azure MySQL data into a BigQuery table for better data analytics. 

Limitations of Using CSV Files for Data Migration from Azure MySQL to BigQuery

The custom method of using CSV files to load data from Azure MySQL to BigQuery has some limitations, such as: 

  • Complexity: Using a custom method with a CSV file is complicated as you export data through various intermediaries before finally loading it to BigQuery. This makes the integration process complex and time-consuming. 
  • Error-Prone: Custom scripts usually lack robust error handling, which may result in data inconsistencies. Debugging and troubleshooting can also become complicated for large datasets. 
  • Dependency on Expertise: You need a strong team of experts to identify and troubleshoot issues quickly. However, hiring a skilled team of experts can be expensive. 
  • Limited Functionality of CSV Files: CSV files do not support complex data types like geospatial data. They are prone to errors with large datasets and offer limited capabilities for data transformation during migration. 

Use Cases of Integrating Azure MySQL to BigQuery

Some of the applications of data migration from Azure MySQL to BigQuery are: 

  • Business Intelligence: BigQuery’s BI engine is an in-memory data analysis service that allows you to do faster data analysis with high concurrency. It quickly executes SQL queries irrespective of their source and also manages cached tables for optimization. Further, through its SQL engine, BigQuery can also interact with various business intelligence tools, including Looker, Power BI, and Tableau. You can use them to create impactful reports and visualizations.
  • Machine Learning Integration: With BigQuery’s built-in ML capabilities, you can build machine learning models using simple SQL commands within the platform. Easier access to ML systems lets you innovate and make advancements across domains. 
  • Geospatial Analytics: BigQuery has a Geographic Information System (GIS) that allows you to do geospatial data analysis. It can work with BigQuery Geo Viz, Google Earth Engine, Jupyter Notebooks, and other such applications to convert latitude and longitude data into accurate geographic locations.
  • Real-Time Data Analytics: BigQuery can unlock valuable insights by monitoring user activities to understand behavior and preferences. Moreover, in sectors like finance, where timely response is critical, BigQuery’s real-time analytics allow you to track events as they unfold.
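As an illustration of the machine learning use case, BigQuery ML models are created with plain SQL. The sketch below builds a CREATE MODEL statement; the dataset, table, and column names are hypothetical, and running the statement (shown in comments) assumes the google-cloud-bigquery package and configured credentials:

```python
def create_model_sql(model_id, source_table, label_column,
                     model_type="LOGISTIC_REG"):
    """Build a BigQuery ML CREATE MODEL statement.

    All identifiers passed in are placeholders chosen by the caller.
    """
    return (
        f"CREATE OR REPLACE MODEL `{model_id}` "
        f"OPTIONS(model_type='{model_type}', "
        f"input_label_cols=['{label_column}']) AS "
        f"SELECT * FROM `{source_table}`"
    )

# With google-cloud-bigquery installed and credentials configured, the
# statement runs like any other query (names below are hypothetical):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   client.query(create_model_sql("my_dataset.churn_model",
#                                 "my_dataset.customers", "churned")).result()

print(create_model_sql("my_dataset.churn_model", "my_dataset.customers", "churned"))
```

Once trained, such a model can be evaluated and used for prediction with the ML.EVALUATE and ML.PREDICT SQL functions.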

Conclusion

This blog comprehensively explains how to connect Azure MySQL to BigQuery using two data migration methods: one a custom method using CSV files, and the other using Hevo. The no-code configuration, easy pipeline setup, flexible replication, and transparent pricing of Hevo make it a suitable tool for data migration. You can schedule a demo today to take advantage of Hevo’s benefits!

FAQs

  1. What are the disadvantages of BigQuery?

Streaming data into BigQuery, especially in large volumes, can be more expensive than batch loading. Also, since BigQuery is designed for analytical workloads, it might not support extensive ETL operations. Further, BigQuery offers limited built-in transformation features, so complex data transformations may require an external tool or process.

  2. What is the difference between Azure MySQL Flexible Server and Single Server? 

Azure MySQL Flexible Server is a production-ready database service that offers more granular control and flexibility over database management functions. It is best suited for new developments and for migrating production workloads to the Azure Database for MySQL service. Single Server is an Azure MySQL database service that offers minimal customization.

Shuchi Chitrakar
Technical Content Writer

With a strong background in physics and journalism, Shuchi has an avid interest in writing about the latest happenings in the technological world, especially data science. She has written several articles related to data engineering and data analytics. To broaden the scope of her knowledge, Shuchi constantly strives to learn new skills related to data science and AI.
