Migrating a table from MS SQL Server to MySQL

Date: 2022-06-30 20:14:55

I have to migrate a table from MS SQL Server to MySQL. The problem is that the table is quite big (65 million records) and the whole process takes too much time. Does anyone have any idea how to speed things up? Any useful tools that could help with this?

7 solutions

#1


0  

You could export the data to a text file and then use the MySQL LOAD DATA statement:

LOAD DATA LOCAL INFILE '/somefolder/text_file_with_data.txt'
INTO TABLE some_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

Or, if you put the data file on the MySQL server itself, you can run:

LOAD DATA INFILE '/somefolder_on_the_mysql_server/text_file_with_data.txt'
INTO TABLE some_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

I am not sure what the syntax is to export from MSSQL.

You can always export in sets of 10,000 or 100,000.

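For the export side on MSSQL, which this answer leaves open, one option is the bcp command-line utility, which can write a query result as tab-delimited text. A rough sketch, assuming a source table dbo.some_table in a database named some_db and a trusted connection (these names are placeholders, not from the question):

bcp "SELECT * FROM some_db.dbo.some_table" queryout C:\somefolder\text_file_with_data.txt -c -t"\t" -r"\n" -S your_server_name -T

The resulting tab-delimited file can then be loaded with the LOAD DATA statements above.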

#2


1  

Need to do it only once? Don't waste too much time optimizing; wait until it's through and move on.

Need to do it more often? Then elaborating on what tools/techniques you currently use would be helpful.

#3


1  

Here is how I migrated an 800K-record table from MS SQL Server to MySQL.

Create a query to display the data in a tabular format:

SELECT [PostalCode] + ' ' +
  [StateCode] + '   ' +
  [Latitude] + '    ' +
  [Longitude] + '   ' +
  [CityName]  
FROM [dbo].[PostalCode]

Execute the query in SQL Server Management Studio, and choose to output the results to a file (menu: Query -> Results To -> Results to File).

The filename must be the name of the table in MySQL. The file extension doesn't matter.

Then use mysqlimport.exe (on Windows) to import the data (the table must exist in the MySQL database):

mysqlimport.exe --user=user_name
  --columns=postalcode,statecode,latitude,longitude,cityname 
  --ignore-lines=2 databaseName pathToFile

After the import, I had to delete the last 2 records of the table, because the end of the file contained some garbage: "(818193 row(s) affected)".

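For reference, a minimal sketch of a target table such an import could load into; the column names follow the --columns list above, but the types are assumptions, since the original answer does not show the table definition. mysqlimport derives the table name from the data file name (the extension is stripped), so the results file would be named e.g. PostalCode.txt:

CREATE TABLE PostalCode (
  postalcode VARCHAR(16),
  statecode  VARCHAR(8),
  latitude   VARCHAR(32),   -- kept as text here; adjust to DECIMAL if the source columns are numeric
  longitude  VARCHAR(32),
  cityname   VARCHAR(128)
);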

For 800K records it is pretty quick: 10 seconds to export, and then 10 seconds to import.

Hope this helps.

#4


0  

Make sure that the MySQL tables initially have no indexes; add them once the loads are finished.

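A hedged sketch of what that can look like in MySQL; the table, index, and column names are placeholders, not from this answer:

ALTER TABLE some_table DROP INDEX idx_postalcode;   -- before the bulk load, if the index already exists
-- ... run LOAD DATA / mysqlimport here ...
ALTER TABLE some_table ADD INDEX idx_postalcode (postalcode);   -- rebuild the index once the load is done

Building the index once over the fully loaded table is generally much cheaper than maintaining it row by row during the insert.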

#5


0  

Make sure the MySQL tablespace file is big enough before starting the insertion, for example by putting a setting like this in your my.ini file:

innodb_data_file_path=ibdata1:1000M:autoextend

#6


0  

See whether you can get the data out of MSSQL and into MySQL by using the SSIS/DTS packages you can generate with the SQL Server Import/Export Wizard. (Connect to MySQL with the appropriate OLEDB/ADO provider.)

Start it when you head home on Friday, and check back after the weekend ;)

#7


0  

Make sure no columns have an index on them, and drop any foreign keys (if you are using InnoDB as the table type) for the duration of the import.

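A rough sketch for an InnoDB table; the constraint and column names are placeholders, and the session-level settings are an addition not mentioned in the answer:

-- drop the constraint before the load and recreate it afterwards
ALTER TABLE some_table DROP FOREIGN KEY fk_some_table_other;

-- optionally relax checks for the loading session only
SET foreign_key_checks = 0;
SET unique_checks = 0;
-- ... run the import here ...
SET unique_checks = 1;
SET foreign_key_checks = 1;

ALTER TABLE some_table
  ADD CONSTRAINT fk_some_table_other FOREIGN KEY (other_id) REFERENCES other_table (id);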
