Update a SQL Server database table using SQL with values from a C# DataTable that is not in SQL Server

Time: 2021-04-19 09:46:40

I have a SQL Server database table that I need to update with values that I have in a C# application DataTable retrieved from an outside source (it could be from file, external API call, etc I guess it is irrelevant).


Is there a way to do a "mass" update using SQL similar to:


UPDATE t1
SET    t1.custAddress = t2.address
FROM   myTable1 t1
       INNER JOIN myTable2 t2 ON t1.id = t2.id

I suppose I could use the above example as long as both tables are in the same SQL Server database; but in this case, myTable2 is an application (C#) DataTable and not a SQL Server table.


Is there a way to accomplish the update similar to the one above?


I know I could "loop" through each of the application table rows and trigger a SQL update for each row, but if my data table has thousands of rows, I would not want (whenever possible) to have to execute one SQL update for each row that I need to update.


Any help would be appreciated. I am open to solutions or suggestion that involve LINQ or any other C# or SQL logic.


2 Solutions

#1 (2 votes)

  1. Take the data-table (C#) and convert it to xml. Hint: if you add the data-table to a data-set, you can call ds.GetXml(). (A sketch of steps 1-4 follows this list.)


  2. Push the xml to sql-server.


  3. "Shred" the xml into #temp or @variable table(s) using sql-server-xml functionality.


  4. Perform CUD (create/update/delete) operations using the data in the #temp or @variable tables.

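Putting the four steps together, here is a minimal C# sketch of the idea. It reuses the question's myTable1 / id / custAddress names on the SQL Server side; the DataTable element name "Customers", the id/address column names, and the @incoming table variable are illustrative assumptions, not part of the original answer.

using System.Data;
using Microsoft.Data.SqlClient;   // or System.Data.SqlClient on older stacks

public static class XmlUpdateSketch
{
    public static void UpdateAddresses(DataTable externalData, string connectionString)
    {
        // Step 1: wrap a copy of the DataTable in a DataSet so GetXml() is available.
        DataTable copy = externalData.Copy();
        copy.TableName = "Customers";                 // element name used in the generated xml
        var ds = new DataSet("NewDataSet");
        ds.Tables.Add(copy);
        string xml = ds.GetXml();

        // Steps 2-4: push the xml down, shred it, and run one set-based UPDATE.
        const string sql = @"
            DECLARE @incoming TABLE (id int PRIMARY KEY, address nvarchar(200));

            -- Step 3: shred the element-based xml into a table variable.
            INSERT INTO @incoming (id, address)
            SELECT  x.r.value('(id/text())[1]',      'int'),
                    x.r.value('(address/text())[1]', 'nvarchar(200)')
            FROM    @xml.nodes('/NewDataSet/Customers') AS x(r);

            -- Step 4: one set-based UPDATE instead of one UPDATE per row.
            UPDATE  t1
            SET     t1.custAddress = t2.address
            FROM    myTable1 t1
                    INNER JOIN @incoming t2 ON t1.id = t2.id;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@xml", SqlDbType.Xml).Value = xml;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

The same T-SQL could just as easily live in a stored procedure that accepts an xml parameter.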

The basic idea is here:


HOW TO: Perform Bulk Updates and Inserts Using OpenXML with .NET Providers in Visual C# .NET


But you don't want to use OPENXML (SQL Server 2000 syntax); instead, use the examples for "shredding" found here:


http://pratchev.blogspot.com/2007/06/shredding-xml-in-sql-server-2005.html


IMHO: This is a great way to do "set based" operations for CUD (create/update/delete). One of the biggest bonuses of doing it this way is that all the index updating happens after the set-based operation, whereas if you do it RBAR (row by agonizing row), the index(es) have to be updated after each single operation. :(


Note, the #temp or @variable table part of step 3 is optional. You have to shred, BUT you could go straight from the xml-shredding to the "real" tables (and skip the #temp or @variable table). I usually test a real-world "load" (amount of xml data), and if it doesn't make much difference, I stick with #temp or @variable tables. It's easier to debug later, because you can put in a temporary "select * from #myTempTable" and see whether the data got shredded correctly. You should also test #temp vs @variable tables. Sometimes it makes a difference.


Please note there is a very old "performance" bug on element-based XML.


https://connect.microsoft.com/SQLServer/feedback/details/250407/insert-from-nodes-with-element-based-xml-has-poor-performance-on-sp2-with-x64


It may no longer apply, but it is something to be aware of. "Back in the day" (because of that bug) my team and I had to change our element-based XML into attribute-based XML (in C#) before sending it down to SQL Server.


#2 (1 vote)

I think the most straightforward way to accomplish this is to do a bulk insert of your application data using SqlBulkCopy into a staging table. There is also an option to set the batch size if you want to do this in batches.


After the bulk insert, run a stored procedure to do the update. Another option is to use an AFTER INSERT trigger, but that only seems to make sense if you bulk insert in batches.

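A rough C# sketch of that approach, again using the question's myTable1 / id / custAddress names. The staging table dbo.myTable1_Staging, its id/address columns, and the batch size are assumptions; the table would normally be created up front or by the stored procedure.

using System.Data;
using Microsoft.Data.SqlClient;   // or System.Data.SqlClient on older stacks

public static class BulkCopySketch
{
    public static void UpdateAddresses(DataTable externalData, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Assumes a staging table shaped like the DataTable already exists, e.g.
            // CREATE TABLE dbo.myTable1_Staging (id int PRIMARY KEY, address nvarchar(200));
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.myTable1_Staging";
                bulk.BatchSize = 5000;                          // optional: send rows in batches
                bulk.ColumnMappings.Add("id", "id");
                bulk.ColumnMappings.Add("address", "address");
                bulk.WriteToServer(externalData);
            }

            // One set-based UPDATE from the staging table (this could live in a stored procedure).
            const string updateSql = @"
                UPDATE  t1
                SET     t1.custAddress = s.address
                FROM    myTable1 t1
                        INNER JOIN dbo.myTable1_Staging s ON t1.id = s.id;

                TRUNCATE TABLE dbo.myTable1_Staging;";

            using (var cmd = new SqlCommand(updateSql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}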
