How do I insert a large number of rows in MySQL?

Posted: 2021-10-13 09:08:29

How do I insert, for example, 100,000 rows into a MySQL table with a single query?

4 Solutions

#1 (score 7)

insert into $table values (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');

That will insert 3 rows. Continue adding value tuples as needed to reach 100,000. I do blocks of ~1,000 rows that way when doing ETL work.
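As a concrete sketch (the table and column names here are made up for illustration), each chunk is simply one multi-row statement; wrapping several chunks in a single transaction also avoids a commit per statement on InnoDB:

-- hypothetical table and columns, shown only to illustrate the pattern
START TRANSACTION;

INSERT INTO mytable (id, col1, col2) VALUES
  (1, 'a', 'b'),
  (2, 'c', 'd'),
  (3, 'e', 'f');
-- ... continue to ~1,000 tuples per statement

INSERT INTO mytable (id, col1, col2) VALUES
  (1001, 'g', 'h'),
  (1002, 'i', 'j');
-- ... next chunk

COMMIT;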

If your data is sitting statically in a file, transforming it and using LOAD DATA INFILE will be the best method, but I'm guessing you're asking this because you're doing something similar.
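A minimal LOAD DATA INFILE sketch for a comma-separated file (the file path, table, and column list are placeholders):

-- the file must live on, and be readable by, the database server;
-- the server's secure_file_priv setting may restrict which directories are allowed
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, col1, col2);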

Also note what somebody else said about the max_allowed_packet size limiting the length of your query.
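One way to check the current limit and, if needed, raise it (changing the global value requires the appropriate privilege, and only connections opened afterwards see the new value):

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB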

#2 (score 3)

You can do a batch insert with the INSERT statement, but your query can't be bigger than (slightly less than) max_allowed_packet.

For 100k rows, depending on the size of the rows, you'll probably exceed this.
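As a rough, back-of-the-envelope estimate: at around 100 bytes of SQL text per row, 100,000 rows come to roughly 10 MB for a single statement, which already exceeds the 4 MB default max_allowed_packet of older MySQL versions (MySQL 8.0 defaults to 64 MB).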

One way would be to split it up into several chunks. This is probably a good idea anyway.

Alternatively you can use LOAD DATA INFILE (or LOAD DATA LOCAL INFILE) to load from a tab-delimited (or other delimited) file. See docs for details.
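If the file sits on the client machine rather than the server, the LOCAL variant works the same way, but local_infile has to be enabled on both the server and the client; a sketch with placeholder names:

-- on the server (or in my.cnf): permit loading from client-side files
SET GLOBAL local_infile = 1;

-- fields default to tab-delimited; spelled out here for clarity
LOAD DATA LOCAL INFILE '/path/on/client/data.tsv'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
(id, col1, col2);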

LOAD DATA isn't subject to the max_allowed_packet limit.

#3 (score 0)

Try using LoadFile(), or convert your data into an XML file and then use the Load and Extract() functions to load the data into the MySQL database.

This is the single-query and fastest option.

I'm doing the same myself; I had files of around 1.5 GB with millions of rows, and I have used both options in my case.
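For XML input, the closest built-in mechanism is probably MySQL's LOAD XML statement (available since 5.5), which may be what this answer means by Load and Extract(); a hedged sketch with placeholder names:

-- each <row> element in the file becomes one table row
LOAD XML INFILE '/path/to/data.xml'
INTO TABLE mytable
ROWS IDENTIFIED BY '<row>';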

#4 (score -1)

You can't as far as I know. You will need a loop.
