Exception of type 'System.OutOfMemoryException' was thrown

Time: 2022-07-14 20:28:35

Basically I use Entity Framework to query a huge database. I want to return a string list and then log it to a text file.

List<string> logFilePathFileName = new List<string>();
var query = from c in DBContext.MyTable where condition == something select c;
foreach (var result in query)
{
    string filePath = result.FilePath;
    string fileName = result.FileName;
    string temp = filePath + "." + fileName;
    logFilePathFileName.Add(temp);
    if (logFilePathFileName.Count % 1000 == 0)
        Console.WriteLine(temp + "." + logFilePathFileName.Count);
}

However, I got an exception when logFilePathFileName.Count = 397000. The exception is:

Exception of type 'System.OutOfMemoryException' was thrown.

A first chance exception of type 'System.OutOfMemoryException' occurred in System.Data.Entity.dll

UPDATE:

What I want is to use a different query, say: select the top 1000 and add them to the list, but after those 1000, then what?

6 Answers

#1


13  

Most probably it's not about RAM as such, so increasing your RAM or even compiling and running your code on a 64-bit machine will not have a positive effect in this case.

I think it's related to the fact that .NET collections are limited to a maximum of 2GB of RAM space (no difference whether 32 or 64 bit).

To resolve this, split your list into much smaller chunks, and most probably your problem will be gone.

Just one possible solution:

foreach (var result in query)
{
    ....
    if (logFilePathFileName.Count % 1000 == 0)
    {
        Console.WriteLine(temp + "." + logFilePathFileName.Count);
        // WRITE THE BATCH WHEREVER YOU NEED IT
        logFilePathFileName = new List<string>(); // RESET THE LIST!
    }
}

EDIT

If you want to fragment the query, you can use Skip(...) and Take(...).

Just an explanatory example:

var first1000 = query.Skip(0).Take(1000);
var second1000 = query.Skip(1000).Take(1000);

... and so on.

Naturally, put it in your iteration and parameterize it based on the bounds of the data you know or need.
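
For example, a minimal paging loop might look like the sketch below (the page size and the ordering column are assumptions, not from the original post; note that Entity Framework requires an OrderBy before Skip can be used):

const int pageSize = 1000;                    // assumed batch size
var ordered = query.OrderBy(c => c.FileName); // Skip/Take need a stable ordering
for (int page = 0; ; page++)
{
    // materialize only one page of results at a time
    var batch = ordered.Skip(page * pageSize).Take(pageSize).ToList();
    if (batch.Count == 0)
        break;

    foreach (var result in batch)
    {
        string temp = result.FilePath + "." + result.FileName;
        // log temp here instead of keeping everything in one huge list
    }
}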

#2


3  

Why are you collecting the data in a List<string> if all you need to do is write it to a text file?

You might as well just:

  • Open the text file;
  • Iterate over the records, appending each string to the text file (without storing the strings in memory);
  • Flush and close the text file.

You will need far less memory than now, because you won't be keeping all those strings unnecessarily in memory.
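
A minimal sketch of that approach, using a StreamWriter and a hypothetical log file path:

using (var writer = new StreamWriter(@"C:\logs\filelist.txt", append: true)) // needs System.IO
{
    foreach (var result in query)
    {
        // each record goes straight to disk; nothing accumulates in memory
        writer.WriteLine(result.FilePath + "." + result.FileName);
    }
} // disposing the writer flushes and closes the file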

#3


1  

You probably need to set some vmargs for memory! Also... look into writing it straight to your file and not holding it in a List

#4


1  

What Roy Dictus says sounds like the best way. You can also try to add a limit to your query, so your database result won't be so large.

For more info, see: Limiting query size with Entity Framework

#5


0  

You shouldn't read all the records from the database into a list; that requires a lot of memory. You can combine reading records and writing them to the file: for example, read 1000 records from the db into a list, save (append) them to the text file, clear the used memory (list.Clear()), and continue with the next records.
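
A rough sketch of that batching idea (the batch size and output file name are assumptions):

var buffer = new List<string>();
using (var writer = new StreamWriter("log.txt", append: true)) // hypothetical output file
{
    foreach (var result in query)
    {
        buffer.Add(result.FilePath + "." + result.FileName);
        if (buffer.Count == 1000)
        {
            buffer.ForEach(writer.WriteLine); // append the batch to the file
            buffer.Clear();                   // release the memory used by the batch
        }
    }
    buffer.ForEach(writer.WriteLine);         // write whatever is left over
}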

#6


0  

From several other topics on * I read that Entity Framework is not designed to handle bulk data like that. EF will cache/track all the data in the context, which causes the exception with huge amounts of data. The options are to use SQL directly or to split up your records into smaller sets.
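
If you stay with EF, one related knob is AsNoTracking(), which stops the context from tracking every returned entity; this only reduces the tracking overhead mentioned above and does not remove the need to process the data in smaller sets:

var query = from c in DBContext.MyTable.AsNoTracking()
            where condition == something
            select c;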
