Export a large MySQL table as multiple smaller files

Date: 2021-01-31 23:18:58

I have a very large MySQL table on my local dev server: over 8 million rows of data. I loaded the table successfully using LOAD DATA INFILE.

I now wish to export this data and import it onto a remote host.

I tried LOAD DATA LOCAL INFILE to the remote host. However, after around 15 minutes the connection to the remote host fails. I think that the only solution is for me to export the data into a number of smaller files.

The tools at my disposal are PhpMyAdmin, HeidiSQL and MySQL Workbench.

I know how to export as a single file, but not multiple files. How can I do this?

7 Answers

#1


15  

I just did an import/export of a (partitioned) table with 50 million records; it took just 2 minutes to export it from a reasonably fast machine and 15 minutes to import it on my slower desktop. There was no need to split the file.

mysqldump is your friend, and since you have a lot of data it's better to compress it:

 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip > output.sql.gz
 @host1:~ $ scp output.sql.gz host2:~/
 @host1:~ $ rm output.sql.gz
 @host1:~ $ ssh host2
 @host2:~ $ gunzip < output.sql.gz | mysql -u <username> -p <database>
 @host2:~ $ rm output.sql.gz
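
If you would rather skip the intermediate file entirely, the same transfer can be done as a single pipeline. A minimal sketch, assuming the mysql client on host2 can read its credentials non-interactively (e.g. from a ~/.my.cnf on host2):

 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip | ssh host2 'gunzip | mysql <database>'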

#2


4  

Take a look at mysqldump

Your commands should be (from a terminal):

Export db_name from your MySQL server to backupfile.sql:

mysqldump -u user -p db_name > backupfile.sql

Import backupfile.sql into db_name on your MySQL server:

mysql -u user -p db_name < backupfile.sql

You have two options for splitting the information:

  1. Split the output text file into smaller files (as many as you need; there are many tools to do this, e.g. split, as in the sketch after this list).

  2. Export one table at a time, using the option to add a table name after the db_name, like so:

     mysqldump -u user -p db_name table_name > backupfile_table_name.sql
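
For the first option, a minimal sketch using the standard split utility (chunk size and file names are just examples). Since mysqldump writes each statement on a single line, splitting on line boundaries keeps statements intact, though only the first chunk carries the dump's header:

 # produces backupfile_part_aa, backupfile_part_ab, ... of 100,000 lines each
 split -l 100000 backupfile.sql backupfile_part_
 # import the chunks in order on the target host (prompts for the password per file)
 for f in backupfile_part_*; do mysql -u user -p db_name < "$f"; done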

Compressing the file(s) (plain text) is very efficient and can reduce them to about 20%-30% of their original size.

Copying the files to remote servers should be done with scp (secure copy), and interaction should usually take place over ssh.

Good luck.

#3


3  

I found that the advanced options in phpMyAdmin allow me to select how many rows to export, plus the start point. This allows me to create as many dump files as required to get the table onto the remote host.

I had to adjust my php.ini settings, plus the phpMyAdmin 'ExecTimeLimit' config setting, as generating the dump files takes some time (500,000 rows in each).

I use HeidiSQL to do the imports.
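
For reference, a rough command-line equivalent of those row-ranged exports is to abuse mysqldump's --where option with LIMIT/OFFSET. This is only a sketch (chunk size and file names are placeholders): note the --no-create-info on every chunk after the first, since each dump otherwise includes DROP TABLE/CREATE TABLE statements that would wipe the rows imported so far, and note that without an explicit ORDER BY the chunk boundaries rely on the engine's default row order:

 mysqldump -u user -p db_name table_name --where="1 LIMIT 500000 OFFSET 0" > chunk_0.sql
 mysqldump -u user -p db_name table_name --no-create-info --where="1 LIMIT 500000 OFFSET 500000" > chunk_1.sql
 mysqldump -u user -p db_name table_name --no-create-info --where="1 LIMIT 500000 OFFSET 1000000" > chunk_2.sql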

#4


2  

As an example of the mysqldump approach for a single table:

mysqldump -u root -ppassword yourdb yourtable > table_name.sql

Importing is then as simple as:

mysql -u username -ppassword yourotherdb < table_name.sql

#5


1  

Use mysqldump to dump the table into a file. Then use tar with the -z option to compress it. Transfer it to your remote server (with ftp, sftp or another file transfer utility). Then untar the file on the remote server and use mysql to import it.

There is no reason to split the original file or to export it in multiple files.
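
Spelled out, that workflow might look like the following sketch (hosts and file names are placeholders; tar's -z option gzip-compresses the archive):

 @local:~ $ mysqldump -u user -p db_name table_name > table_name.sql
 @local:~ $ tar -czf table_name.tar.gz table_name.sql
 @local:~ $ scp table_name.tar.gz user@remote:~/
 @local:~ $ ssh user@remote
 @remote:~ $ tar -xzf table_name.tar.gz
 @remote:~ $ mysql -u user -p db_name < table_name.sql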

#6


1  

How do I split a large MySQL backup file into multiple files?

You can use mysql_export_explode https://github.com/barinascode/mysql-export-explode

<?php
# Include the class
include 'mysql_export_explode.php';

$export = new mysql_export_explode;

$export->db = 'dataBaseName'; # -- Set your database name
$export->connect('host','user','password'); # -- Connect to the database
$export->rows = array('Id','firstName','Telephone','Address'); # -- Set which fields you want to export
$export->exportTable('myTableName',15); # -- Table name and the number of parts to split the table into
?>

When it finishes, the SQL files are created in the directory where the script was executed, in the following format:
---------------------------------------
myTableName_0.sql
myTableName_1.sql
myTableName_2.sql
...
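
Importing the resulting chunks on the remote host could then be as simple as the line below (a sketch, assuming each chunk is a self-contained batch of INSERT statements; for plain row data the chunk order does not matter):

 cat myTableName_*.sql | mysql -u user -p db_name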

#7


0  

If you are not comfortable with using the mysqldump command line tool, here are two GUI tools that can help you with that problem, although you have to be able to upload them to the server via FTP!

Adminer is a slim and very efficient DB manager tool that is at least as powerful as phpMyAdmin, and it is only ONE SINGLE FILE that has to be uploaded to the server, which makes it extremely easy to install. It works way better with large tables/DBs than PMA does.

MySQLDumper is a tool developed especially to export/import large tables/DBs, so it will have no problem with the situation you describe. The only downside is that it is a bit more tedious to install, as there are more files and folders (~350 files in ~1.5MB), but it shouldn't be a problem to upload it via FTP either, and it will definitely get the job done :)

So my advice would be to first try Adminer, and if that one also fails, go the MySQLDumper route.
