I have a PHP script set up that parses a JSON file which is split across multiple pages.
This PHP script parses the JSON and inserts it into a MySQL database.
On a single query (without the TRUNCATE statement):
if ($count > 0) {
    // check to perform operation
    foreach ($jsondecode as $entries) {
        // getting variables here
        $sql = "INSERT INTO table (title, handle, imagesrc)
                VALUES ('".$title."', '".$handle."', '".$imagesrc."')";
        if ($connect->query($sql) === TRUE) {
            echo "New record created successfully";
        } else {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
}
Runs successfully, with a script execution time of 16.451724052429 seconds.
On a multi_query:
if ($count > 0) {
    $sql = "TRUNCATE table;";
    foreach ($jsondecode as $entries) {
        // getting variables here
        $sql.= "INSERT INTO table (title, handle, imagesrc)
                VALUES ('".$title."', '".$handle."', '".$imagesrc."')";
        if (!$mysqli->multi_query($sql)) {
            echo "Multi query failed: (" . $mysqli->errno . ") " . $mysqli->error;
        }
        do {
            if ($res = $mysqli->store_result()) {
                var_dump($res->fetch_all(MYSQLI_ASSOC));
                $res->free();
            }
        } while ($mysqli->more_results() && $mysqli->next_result());
    }
}
Runs successfully, with a script execution time of 278.05182099342 seconds, almost 5 minutes.
All I am trying to do is TRUNCATE the table before the INSERT.
I am going to run this as a CRON job on the web server, which will execute the script every 12 hours.
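For reference, a 12-hour schedule in crontab could look roughly like the line below; the PHP binary, script path and log path are hypothetical placeholders, not taken from the question:

# run the import script at 00:00 and 12:00 every day
0 0,12 * * * /usr/bin/php /path/to/import-script.php >> /var/log/import-script.log 2>&1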
There is obviously a huge execution time difference between the single-query and multi-query versions... is there anything I can do?
My only thought is to set up another CRON job script that simply runs the TRUNCATE statement every 12 hours, one minute before this main one runs. That seems like it should work, but it is of course not ideal, since I would then have to maintain multiple scripts instead of just one.
1 solution
#1
The reason for such a huge difference is the presence of one extra character!
$sql.= "INSERT INTO table (title, handle, imagesrc)
VALUES ('".$title."', '".$handle."', '".$imagesrc."')";
You keep appending a new query to the existing query string, yet you still execute that ballooning query inside the loop. It may not have been obvious because the code is not properly indented, so the crucial mistake here is the indentation!
You are truncating the table for each line of input in your JSON, and then inserting the whole thing all over again.
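If you did want to keep the multi_query approach, the fix would be to keep building the string inside the loop but execute it only once, after the loop. This is only a rough sketch, assuming the same $mysqli connection and the same $title, $handle and $imagesrc variables as in the question:

if ($count > 0) {
    // build ONE combined statement: the TRUNCATE followed by every INSERT,
    // each terminated by a semicolon
    $sql = "TRUNCATE `table`;";
    foreach ($jsondecode as $entries) {
        // getting variables here
        $sql .= "INSERT INTO `table` (title, handle, imagesrc)
                 VALUES ('".$title."', '".$handle."', '".$imagesrc."');";
    }

    // execute the whole batch once, outside the loop
    if (!$mysqli->multi_query($sql)) {
        echo "Multi query failed: (" . $mysqli->errno . ") " . $mysqli->error;
    }

    // drain every result set so the connection can be reused afterwards
    do {
        if ($res = $mysqli->store_result()) {
            $res->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}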
Besides, this really isn't a case where you ought to be using multi_query. Run the TRUNCATE query outside the loop, then run the INSERT query inside the loop.
As others have pointed out, building a single INSERT query with multiple VALUES sets might be a bit faster. Alternatively, turn autocommit off and switch it back on at the end (see the sketches after the corrected code below).
if ($count > 0) {
    $connect->query('TRUNCATE `table`');
    // check to perform operation
    foreach ($jsondecode as $entries) {
        // getting variables here
        $sql = "INSERT INTO `table` (title, handle, imagesrc)
                VALUES ('".$title."', '".$handle."', '".$imagesrc."')";
        if ($connect->query($sql) === TRUE) {
            echo "New record created successfully";
        } else {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
}
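Neither snippet below comes from the original answer; they are rough sketches of the two suggestions above, assuming the same $connect mysqli connection and that $title, $handle and $imagesrc are pulled out of each $entries exactly as in the original code.

Sketch 1, one INSERT with many VALUES sets, sent in chunks so a single statement never grows unreasonably large:

if ($count > 0) {
    $connect->query('TRUNCATE `table`');

    $values = array();
    foreach ($jsondecode as $entries) {
        // getting variables here
        $values[] = "('" . $connect->real_escape_string($title) . "', '"
                         . $connect->real_escape_string($handle) . "', '"
                         . $connect->real_escape_string($imagesrc) . "')";
    }

    // one INSERT per 500 rows instead of one per row
    foreach (array_chunk($values, 500) as $chunk) {
        $sql = "INSERT INTO `table` (title, handle, imagesrc) VALUES " . implode(', ', $chunk);
        if ($connect->query($sql) !== TRUE) {
            echo "Error: " . $connect->error;
        }
    }
}

Sketch 2, one INSERT per row but wrapped in a single transaction, so the work is committed once at the end instead of once per statement:

if ($count > 0) {
    $connect->query('TRUNCATE `table`');  // TRUNCATE commits implicitly in MySQL, so run it first
    $connect->autocommit(false);          // stop committing after every single statement

    foreach ($jsondecode as $entries) {
        // getting variables here
        $sql = "INSERT INTO `table` (title, handle, imagesrc)
                VALUES ('".$title."', '".$handle."', '".$imagesrc."')";
        if ($connect->query($sql) !== TRUE) {
            echo "Error: " . $connect->error;
        }
    }

    $connect->commit();          // flush all inserts in one go
    $connect->autocommit(true);  // restore the default behaviour
}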