What is the fastest way to save data to SQL Server using C#?

Posted: 2022-10-01 16:59:24

I am currently working on a small .NET app in C# that fetches data through a web service.


The data is represented as objects, so it would have been logical to store it in a document-based database, but there is a requirement to use SQL Server.


So what might be the fastest way to insert many thousands, perhaps millions, of rows into a database?


I am open to any framework that might support this, but I haven't been able to find any benchmarks on this, e.g. for Entity Framework.


Iterating over the data and doing an insert per row is simply too slow. It would be quicker to dump the data to a file and then do a bulk import using SSIS, but for this scenario I would rather avoid that and keep all the logic in the C# app (a sketch of the per-row approach is shown below).

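For reference, a rough sketch of the row-by-row insert approach described above. The table dbo.Measurements and its columns are hypothetical placeholders; the one network round trip per row is what makes this approach slow for millions of rows.

```csharp
// Rough sketch of the per-row insert pattern that becomes too slow at scale.
// dbo.Measurements, Value and RecordedAt are placeholder names.
using System;
using Microsoft.Data.SqlClient;

class RowByRowInsert
{
    static void Main()
    {
        var connectionString =
            "Server=.;Database=MyDb;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        for (int i = 0; i < 100_000; i++)
        {
            using var cmd = new SqlCommand(
                "INSERT INTO dbo.Measurements (Value, RecordedAt) VALUES (@v, @t)",
                connection);
            cmd.Parameters.AddWithValue("@v", i * 0.5);
            cmd.Parameters.AddWithValue("@t", DateTime.UtcNow);
            cmd.ExecuteNonQuery(); // one round trip per row -- this is the bottleneck
        }
    }
}
```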

1 Solution

#1



You might want to use the SqlBulkCopy class. It is quite efficient for large volumes of data.

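A minimal sketch of loading in-memory data with SqlBulkCopy. The connection string, table name (dbo.Measurements) and columns are assumptions; map them to your actual schema.

```csharp
// Minimal SqlBulkCopy sketch: stage rows in a DataTable whose columns match
// the destination table, then stream them to SQL Server in batches.
using System;
using System.Data;
using Microsoft.Data.SqlClient;

class BulkInsertExample
{
    static void BulkInsert(string connectionString, DataTable rows)
    {
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var bulkCopy = new SqlBulkCopy(connection);
        bulkCopy.DestinationTableName = "dbo.Measurements"; // placeholder table name
        bulkCopy.BatchSize = 10_000;   // send rows to the server in batches
        bulkCopy.BulkCopyTimeout = 0;  // disable the timeout for very large loads
        bulkCopy.WriteToServer(rows);
    }

    static void Main()
    {
        // Column names and types must match the destination table.
        var table = new DataTable();
        table.Columns.Add("Value", typeof(double));
        table.Columns.Add("RecordedAt", typeof(DateTime));

        for (int i = 0; i < 100_000; i++)
            table.Rows.Add(i * 0.5, DateTime.UtcNow);

        BulkInsert(
            "Server=.;Database=MyDb;Integrated Security=true;TrustServerCertificate=true",
            table);
    }
}
```

If the objects are too numerous to hold in a single DataTable, an IDataReader (or repeated WriteToServer calls per batch) avoids materializing everything in memory at once.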
