Ways of Loading Data (PostgreSQL) to Synapse SQL Database

Park Sehun
2 min read · May 30, 2023

Using Data Factory

You can create a linked service to an on-premises PostgreSQL database in Azure Data Factory (connecting through a self-hosted integration runtime).

Once the linked service is connected, use a Copy activity to move (copy) the data to Blob Storage. (The Synapse SQL pool can then create a table from an external table over that blob storage.)
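As a rough sketch, the linked service definition might look like the following JSON fragment. All names here (the linked service name, host, database, credentials, and the integration runtime reference) are placeholders, not values from the article:

```json
{
  "name": "PostgreSqlLinkedService",
  "properties": {
    "type": "PostgreSql",
    "typeProperties": {
      "connectionString": "Server=onprem-host;Database=mydb;Port=5432;UID=loader;Password=<secret>"
    },
    "connectVia": {
      "referenceName": "SelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

The `connectVia` reference is what routes traffic through the self-hosted integration runtime so Data Factory can reach the on-premises server.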

Using PolyBase

(1) Export the table to a flat file

Ref: Link

You can use any SQL development tool or the command line to export flat files from the relational database. For instance, you can use the COPY command (or psql's \copy) to export CSV files from PostgreSQL tables, or pg_dump for a SQL-format dump.
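A minimal sketch of the client-side export, assuming a reachable server and illustrative names (host `pg-host`, user `loader`, database `mydb`, table `public.orders`):

```shell
# Export a table to a local CSV file using psql's client-side \copy
# (unlike server-side COPY, \copy writes to the client machine and
# needs no superuser or server filesystem access)
psql -h pg-host -U loader -d mydb \
  -c "\copy public.orders TO 'orders.csv' WITH (FORMAT csv)"
```

Omitting a header row keeps the file simplest to load later; if you do include one, the external table definition has to account for it.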

(2) Upload the flat file to blob storage

There are many ways to upload files to Blob Storage, but the most common are the az CLI and Azure Storage Explorer (provided by Microsoft).
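With the az CLI, the upload might look like this sketch (the storage account `mystorageacct`, container `staging`, and blob path are assumed names, not values from the article):

```shell
# Upload the exported CSV into a blob container,
# authenticating with the signed-in Azure AD identity
az storage blob upload \
  --account-name mystorageacct \
  --container-name staging \
  --name orders/orders.csv \
  --file orders.csv \
  --auth-mode login
```

Uploading under a folder-like prefix (`orders/`) is convenient because PolyBase can later point an external table at that whole prefix.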

Ref: Link

(3) Run the PolyBase load

PolyBase is a technology that accesses external data stored in Azure Blob Storage, Hadoop, or Azure Data Lake using the T-SQL language. It is the most scalable and fastest way of loading data into an Azure Synapse SQL pool.
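The PolyBase load boils down to defining the external objects and then materializing the data with CTAS. A sketch in T-SQL follows; the credential, data source, storage account, and table names are all illustrative assumptions, and the column list must match your exported CSV:

```sql
-- Credential for the storage account (identity string is arbitrary for key auth)
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

-- External data source pointing at the blob container
CREATE EXTERNAL DATA SOURCE StagingBlob
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://staging@mystorageacct.blob.core.windows.net',
    CREDENTIAL = BlobCred
);

-- File format matching the exported CSV
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- External table over the uploaded files
CREATE EXTERNAL TABLE dbo.ext_orders (
    order_id INT,
    amount   DECIMAL(10, 2)
)
WITH (
    LOCATION = '/orders/',
    DATA_SOURCE = StagingBlob,
    FILE_FORMAT = CsvFormat
);

-- CTAS materializes the data into the dedicated SQL pool in parallel
CREATE TABLE dbo.orders
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.ext_orders;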
