How to import a large SQL file into a MySQL table

Date: 2021-10-22 02:46:46

I have a php script that parses XML files and creates a large SQL file that looks something like this:


INSERT IGNORE INTO table(field1,field2,field3...)
VALUES ("value1","value2",int1...),
("value1","value2",int1)...etc

This file adds up to be over 20GB (I've tested on a 2.5GB file but it fails too).


I've tried commands like:


mysql -u root -p table_name < /var/www/bigfile.sql

This works on smaller files, say around 50MB, but it fails on the larger ones.


I tried:


mysql> source /var/www/bigfile.sql

I also tried mysqlimport but that won't even properly process my file.


I keep getting an error that says


ERROR 2006 (HY000): MySQL server has gone away

This happens approximately 30 seconds after execution starts.


I set max_allowed_packet to 4GB, but when I verify it with SHOW VARIABLES it only shows 1GB.
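For reference, a sketch (not from the original thread): max_allowed_packet must be raised on the server side, and MySQL silently clamps it to a hard maximum of 1GB, which matches what SHOW VARIABLES reports here. Assuming a typical my.cnf location:

```ini
# /etc/mysql/my.cnf -- path varies by distribution (assumption)
[mysqld]
# 1G is the hard upper limit; larger values are silently clamped
max_allowed_packet = 1G
```

The same value can be set at runtime with `SET GLOBAL max_allowed_packet = 1073741824;`, but even at the maximum, a 20GB single statement still cannot fit in one packet.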


Is there a way to do this without wasting another 10 hours?


1 solution

#1

Try splitting the file into multiple smaller INSERT statements. Each statement is sent to the server as a single packet, and max_allowed_packet is hard-capped at 1GB (which is why the 4GB setting was clamped), so one multi-gigabyte statement can never fit.

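The splitting could be sketched in Python. This is a hypothetical helper (`split_dump` is not a standard tool), assuming the dump's layout matches the question: an `INSERT ... VALUES` header followed by one `(...)` row tuple per line:

```python
import re

def split_dump(src_path, dst_path, rows_per_insert=1000):
    """Rewrite one huge multi-row INSERT into many smaller statements."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        header_lines = []   # lines of the INSERT ... VALUES header
        header = None       # full header once VALUES has been seen
        batch = []          # row tuples collected for the current statement

        def flush():
            if batch:
                dst.write(header + "\n" + ",\n".join(batch) + ";\n")
                batch.clear()

        for raw in src:
            line = raw.strip()
            if not line:
                continue
            if header is None:
                # Accumulate header lines until VALUES appears; anything
                # after VALUES on the same line is already the first row.
                m = re.match(r"(.*?VALUES)\s*(.*)$", line, flags=re.I)
                if not m:
                    header_lines.append(line)
                    continue
                header_lines.append(m.group(1))
                header = " ".join(header_lines)
                line = m.group(2)
                if not line:
                    continue
            # Strip the trailing "," or ";" so tuples can be re-joined.
            batch.append(line.rstrip(",;"))
            if len(batch) >= rows_per_insert:
                flush()
        flush()
```

Streaming line by line keeps memory flat even for a 20GB file; the resulting file can then be fed to `mysql` as before, since each statement stays well under the packet limit. (A naive regex like this would misfire if a quoted value contained the word VALUES on the header line, so treat it as a sketch, not a robust SQL parser.)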
