I want to copy tables in various schemas from SQL Server to Snowflake. I understand that Snowflake's COPY command works well for loading huge amounts of data, provided I have CSV data as input.
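For context, the load I have in mind on the Snowflake side looks roughly like this (a minimal sketch; the connection name, stage, table, and file layout are all placeholders):

```powershell
# Rough sketch of the load side: COPY a staged CSV into a Snowflake table.
# The connection name, database/schema/table, and stage are placeholders.
$copySql = @'
COPY INTO my_db.my_schema.my_table
FROM @my_stage/my_table/
FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);
'@
Set-Content -Path copy_table.sql -Value $copySql
snowsql -c my_conn -f copy_table.sql   # snowsql is Snowflake's CLI client
```

FIELD_OPTIONALLY_ENCLOSED_BY matters here because a well-formed CSV quotes any field that contains a comma.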
However, I am unable to figure out an efficient way to export SQL Server data in CSV format. I went through some threads in this forum on the topic and found that the PowerShell cmdlet Export-Csv is a good option. But does it scale to thousands of tables in SQL Server? Roughly what I have in mind is sketched below.
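A minimal sketch of that approach, assuming the SqlServer module's Invoke-Sqlcmd is available (the server, database, and output directory are placeholders I made up):

```powershell
# Sketch: export every user table in one database to its own CSV file.
# Needs the SqlServer PowerShell module for Invoke-Sqlcmd; server, database,
# and output directory are placeholders.
$server = 'MYSERVER'
$db     = 'MyDatabase'
$outDir = 'C:\csv_export'

$tables = Invoke-Sqlcmd -ServerInstance $server -Database $db -Query `
    "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"

foreach ($t in $tables) {
    $csv  = Join-Path $outDir ('{0}.{1}.csv' -f $t.TABLE_SCHEMA, $t.TABLE_NAME)
    # Each table is buffered in memory here, which is the scaling worry.
    $rows = Invoke-Sqlcmd -ServerInstance $server -Database $db `
        -Query ("SELECT * FROM [{0}].[{1}]" -f $t.TABLE_SCHEMA, $t.TABLE_NAME)
    if ($rows) {  # skip empty tables
        # Drop the DataRow bookkeeping properties so only real columns are written.
        $rows | Select-Object -Property * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
            Export-Csv -Path $csv -NoTypeInformation   # quotes fields, writes a header row
    }
}
```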
If Export-Csv does not scale, what other option should I try to move the data from SQL Server to Snowflake? Please note that this is not a one-time data load; I am looking for a process that can run daily to load data from SQL Server to Snowflake.
Thanks in advance!
P.S.: I tried the SQL Server bcp tool, but it doesn't generate a standardized CSV file: the output has no header row, and fields containing the delimiter are not quoted.
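For reference, the invocation I tried was roughly this (server, database, and table names are placeholders):

```powershell
# The plain bcp export I tried (server, database, and table are placeholders).
# -c = character mode, -t, = comma field terminator, -T = Windows authentication.
bcp MyDatabase.dbo.MyTable out C:\csv_export\MyTable.csv -S MYSERVER -T -c -t,
```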
1 Answer
nbysray51#
BCP is usually the fastest tool for getting data out of SQL Server into a CSV file, and it is really simple to use. If your data contains commas, you can qualify the text with double quotes so the import sees the commas as part of the data. In the past, I have also used BCP with the queryout option to replace those troublesome characters on export.
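A rough sketch of that queryout pattern (server, database, table, and column names are placeholders; here the query swaps embedded commas for spaces before they reach the file, which is cruder than quote-qualifying but keeps the command simple):

```powershell
# Sketch: bcp queryout, letting the query scrub the troublesome characters.
# Server, database, table, and column names are placeholders.
$query = @'
SELECT CAST(id AS varchar(20)), REPLACE(name, ',', ' ') FROM MyDatabase.dbo.MyTable
'@
bcp $query queryout C:\csv_export\MyTable.csv -S MYSERVER -T -c -t,
```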
Going from a CSV file into Snowflake has been problematic for me, though. I have found that even sending CSV files to an S3 bucket and loading them into Snowflake from there is too slow for our needs. I have since found a tool called SQLpipe that does most of that work for me over a direct connection between one data source and another. I have been very successful using SQLpipe to copy data from SQL Server into Snowflake, and it works well for PostgreSQL to Snowflake as well. I would skip trying to export to a CSV and then copy and import into Snowflake.
SQLpipe - https://www.sqlpipe.com/. Here is a statement from their website: “Our original product, SQLpipe, is an open-source tool that allows you to move the results of a SQL query from any ODBC compliant data source, to a target data system.”
I hope that helps! Good luck!