I have a stored procedure that does a bulk insert into a SQL Server 2005 database. When I call this stored procedure from SQL (passing in the names of a local format file and data file), it works fine. Every time.
However, when this same stored procedure gets called from C# .NET 3.5 code using SqlCommand.ExecuteNonQuery, it works intermittently.
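For reference, the C# call is along these lines (a simplified sketch; the procedure name, parameter names and data-file path are placeholders, and connectionString is assumed to be defined elsewhere):

// Requires: using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("usp_BulkInsertFile", conn))   // placeholder proc name
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@DataFile", @"c:\bulkinsert\MyFile.dat");     // placeholder path
    cmd.Parameters.AddWithValue("@FormatFile", @"c:\bulkinsert\MyFile.fmt");

    conn.Open();
    cmd.ExecuteNonQuery();   // this is the call that intermittently throws the SqlException
}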
When it fails, a SqlException is generated stating:
Cannot bulk load. Invalid column number in the format file "c:\bulkinsert\MyFile.fmt"
I don't think this error message is correct.
Has anyone experienced similar problems with calling bulk insert from code?
Thanks.
3 Answers
#1
How are you doing the bulk insert? Usually, the problem (in this scenario) is whether "c:\" is the server's "c:\", or the client's "c:\".
However, from C# code, the simplest approach is to use SqlBulkCopy. This class provides direct access to bulk-insert functionality from managed code, including column mappings (although I never bother with them).
If the file is something like CSV / TSV / similar, then CsvReader is highly recommended. It provides the IDataReader interface that WriteToServer uses most efficiently.
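A minimal sketch of the SqlBulkCopy approach, assuming the destination table is TestOutputPretest and the file has already been loaded into a DataTable (or exposed through an IDataReader such as a CsvReader); connectionString is assumed to be defined elsewhere:

// Requires: using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "TestOutputPretest";   // assumed destination table
        // Optional explicit mappings; not needed when source and destination columns line up:
        // bulk.ColumnMappings.Add("SourceColumn", "DestinationColumn");
        bulk.WriteToServer(sourceTable);                   // a DataTable, or pass an IDataReader
    }
}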
#2
I think the problem was to do with the format file. I am no longer using the format file and it seems to work 100% of the time now. I specify field and row terminators in the SQL instead.
Declare @Sql Nvarchar(2000);
SET @Sql =
    'Bulk Insert TestOutputPretest
     From ''c:\rawdata\bulkinsert\Segment1839204.dat''
     WITH
     (
         FIELDTERMINATOR = '','',
         ROWTERMINATOR = ''\n''
     )';
Exec sp_ExecuteSql @Sql;
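The same statement can also be sent from the C# side with SqlCommand.ExecuteNonQuery (a sketch; the connection string is assumed, and in a C# verbatim string the \n row terminator stays literal, which is what BULK INSERT expects):

// Requires: using System.Data.SqlClient;
var sql = @"Bulk Insert TestOutputPretest
            From 'c:\rawdata\bulkinsert\Segment1839204.dat'
            WITH
            (
                FIELDTERMINATOR = ',',
                ROWTERMINATOR = '\n'
            )";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}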
#3
Use

Exec sp_ExecuteSql @Sql;

which works 100% of the time. exec(@sql) is not as powerful, as it has some limitations (for example, it cannot take parameters the way sp_executesql can).