I am trying to insert data from an Excel pivot table into MySQL using C# (I administer the database through phpMyAdmin). Plain INSERT statements work, but their performance is far too slow for my purposes: each data file contains at least 15000 rows, and this operation will run frequently. Currently I hold the data in an ArrayList before insertion. From reading online, it seems I should export the data to a CSV file and use LOAD DATA INFILE. However, my CSV file does not have the same number of columns as my SQL table, I am not sure how to pass the CSV file's path to MySQL, and I don't want the whole operation to fail if one of the rows is a duplicate. I would like to use the INFILE method if it can help my situation.
The error I am getting now is: File not found (Errcode: 22 "Invalid argument")
Here is my code so far for attempting the in file method.
OpenFileDialog ofd3 = new OpenFileDialog();
if (ofd3.ShowDialog() == DialogResult.OK)
{
    // MySQL treats backslashes in the path as escape sequences, which is the
    // usual cause of "File not found (Errcode: 22)" on Windows.
    // Convert the path to forward slashes before building the statement.
    string hours = ofd3.FileName.Replace(@"\", "/");
    // LOCAL tells the server to read the file from the client machine;
    // IGNORE skips duplicate-key rows instead of aborting the whole load.
    string cmd = "LOAD DATA LOCAL INFILE '" + hours + "' IGNORE INTO TABLE timesheet " +
                 "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'";
    mysql_insert_infile(cmd);
}
private void mysql_insert_infile(string query)
{
    // AllowLoadLocalInfile=true is required by recent MySQL connectors
    // before LOAD DATA LOCAL INFILE is permitted.
    using (var connection = new MySqlConnection("Server=localhost;Database=projectdash;Uid=root;Pwd=;allow zero datetime=yes;Allow User Variables=True;AllowLoadLocalInfile=true"))
    using (var cmd = connection.CreateCommand())
    {
        connection.Open();
        cmd.CommandText = query;
        try
        {
            cmd.ExecuteNonQuery();
        }
        catch (Exception w)
        {
            // Don't swallow the exception silently, or failed loads are invisible.
            MessageBox.Show(w.Message);
        }
    }
}
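Because the CSV has fewer columns than the table, the LOAD DATA statement can also name the target columns explicitly so MySQL knows which table columns the CSV fields map to. This is a sketch only; the column names `emp_id`, `work_date`, and `hours` are placeholders for whatever your `timesheet` schema actually uses:

```sql
LOAD DATA LOCAL INFILE 'C:/path/to/hours.csv'
IGNORE INTO TABLE timesheet
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(emp_id, work_date, hours);  -- list only the columns present in the CSV
```

Columns of `timesheet` that are not listed are left to their defaults, and `IGNORE` keeps a duplicate-key row from aborting the rest of the load.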
1 solution

#1
My bet would be to load the entire file in memory and then create a "bulk" INSERT command containing at least 1000 rows.
You can clean your data to avoid duplicates and then build the command using only the columns you need.
Just remember to insert multiple rows as described here:
Inserting multiple rows in mysql
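The multi-row approach the answer describes can be sketched as below. This is a minimal sketch, not the asker's actual schema: the table name `timesheet` comes from the question, but the column names `emp_id` and `hours` and the helper `BuildBulkInsert` are hypothetical. `INSERT IGNORE` handles the duplicate-row concern by skipping duplicate-key rows rather than failing the whole batch.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

static class BulkInsertBuilder
{
    // Builds "INSERT IGNORE INTO table (cols) VALUES (@p0_0, ...), (@p1_0, ...), ..."
    // for rowCount rows. Each @pR_C placeholder is later bound with
    // cmd.Parameters.AddWithValue so values are never concatenated into the SQL.
    public static string BuildBulkInsert(string table, string[] columns, int rowCount)
    {
        var sb = new StringBuilder();
        sb.Append("INSERT IGNORE INTO ").Append(table)
          .Append(" (").Append(string.Join(", ", columns)).Append(") VALUES ");

        var rows = new List<string>();
        for (int r = 0; r < rowCount; r++)
        {
            // One "(@pR_0, @pR_1, ...)" group per row.
            var ps = columns.Select((c, i) => "@p" + r + "_" + i);
            rows.Add("(" + string.Join(", ", ps) + ")");
        }
        sb.Append(string.Join(", ", rows));
        return sb.ToString();
    }
}
```

In practice you would split the 15000 rows into batches of roughly 1000, build one such command per batch, bind the parameters, and execute it inside a single transaction; one round trip per 1000 rows is far cheaper than 15000 single-row INSERTs.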