I just created an empty table document_test, a copy of the document table. I then added a new column host to document_test and copied everything else over from document. The statement is:
INSERT INTO document_test (id,col1,col2,col3,col4) SELECT * FROM document;
The document table contains about 5.5 million rows. I have disabled the keys on document_test and set both foreign_key_checks and unique_checks to 0. This INSERT command has been running on a Linux server for two days and has not finished. SHOW PROCESSLIST shows its state as "Sending data". SELECT COUNT(*) FROM document_test; returns 0, yet SHOW TABLE STATUS reports a non-zero row count for document_test. Curiously, although the row count should presumably increase monotonically, it does not: four consecutive queries returned 3299583, 3325775, 3299730 and 3341818!
This completely confuses me. Could someone help me with: (1) is it actually inserting data? (2) how can I finish this insert in a timely manner? (3) is there any way to check the status of this INSERT INTO ... SELECT command? (What comes after "Sending data"?)
Unfortunately, I did not turn on the slow query log or profiling before running this command.
1 Answer
#1
Maybe try exporting the data using mysqldump and then importing it via LOAD DATA? That should be much faster.
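A minimal sketch of that approach (assuming the FILE privilege, a server-writable /tmp/dump directory, a database name of mydb, and the table/column names from the question):

```sql
-- Step 1 (shell, not SQL): dump the data as a tab-separated file:
--   mysqldump --tab=/tmp/dump mydb document
-- This writes /tmp/dump/document.sql (schema) and
-- /tmp/dump/document.txt (raw data via SELECT ... INTO OUTFILE).
--
-- Step 2: bulk-load the file into the new table. The explicit column
-- list leaves the extra host column at its default value:
SET foreign_key_checks = 0, unique_checks = 0;
LOAD DATA INFILE '/tmp/dump/document.txt'
INTO TABLE document_test (id, col1, col2, col3, col4);
SET foreign_key_checks = 1, unique_checks = 1;
```

LOAD DATA avoids the long-lived implicit transaction and SELECT overhead of INSERT ... SELECT, which is why it tends to be much faster for multi-million-row copies.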