DataContext performance is very poor when the amount of data is large

Time: 2021-01-06 16:53:57

I am using a DataContext database to store a large amount of data, and I have found that performance becomes very slow as the number of records grows.

The dataset is about 6000 records.

If I insert one record and call SubmitChanges, the SubmitChanges call takes about 1.x seconds.
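A minimal sketch of the pattern being described (the ItemDataContext and Item types and the connection string are hypothetical placeholders, not taken from the original post):

// Hypothetical sketch: the table already holds about 6000 rows, and a single
// insert followed by SubmitChanges() takes roughly a second.
using (var db = new ItemDataContext(@"isostore:/items.sdf"))
{
    var item = new Item { Name = "new record" };
    db.Items.InsertOnSubmit(item);
    db.SubmitChanges();   // this one call is what takes ~1.x seconds
}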

Is there any way to improve the performance, or is this a limitation?

Thanks.


2 solutions

#1


2  

I didn't test it myself, but try not to call SubmitChanges() after each insert. Perform all 6000 inserts and then call SubmitChanges() just once. The DataContext should be aware of all the changes you have made.

// Queue all 6000 inserts first, then flush them in a single SubmitChanges call.
var _db = new YourDbContext();

for (int i = 0; i < 6000; i++)
{
    FoobarObj newObject = new FoobarObj()
    {
        Name = "xyz_" + i.ToString()
    };

    _db.FoobarObjects.InsertOnSubmit(newObject);
}

// One round trip for the whole batch.
_db.SubmitChanges();

#2


1  

One second sounds like too long, and much longer than I have measured. In a trivial test I just did:

using (var dc = new MyDc(@"isostore:/stuff.sdf"))
{
    if (!dc.DatabaseExists())
        dc.CreateDatabase();

    // Queue 6000 rows, then submit them as one batch.
    dc.Data.InsertAllOnSubmit(
        Enumerable.Range(0, 6000).Select(i => new MyData { Data = "Hello World" }));
    dc.SubmitChanges();
}

I can insert at a rate of 1-2 items per millisecond when batching all 6000 at once, and this stayed fairly steady as the data size continued to grow. If I switch to much smaller batches (say 5 items each), it drops to about 10 ms per item, since there is quite a bit of overhead involved in initializing the DataContext, and that dominates the execution time.
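For illustration, a sketch of the small-batch variant being described, assuming the DataContext is re-created for each batch and reusing the MyDc/MyData types from the snippet above:

// Hypothetical small-batch variant: 1200 batches of 5 items each.
// Re-initializing the DataContext per batch is the overhead referred to above.
for (int batch = 0; batch < 1200; batch++)
{
    using (var dc = new MyDc(@"isostore:/stuff.sdf"))
    {
        dc.Data.InsertAllOnSubmit(
            Enumerable.Range(0, 5).Select(i => new MyData { Data = "Hello World" }));
        dc.SubmitChanges();   // roughly 10 ms per item in this mode
    }
}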

So there must be something else going on. Can you give some details about what you are inserting? Maybe a code sample which demonstrates the problem end to end?

