How much memory does JSON.stringify use? Can stringification memory on the client be reduced?

Date: 2022-09-26 06:49:25

Is there a way to reduce memory usage on the client when converting a large JavaScript object to a string using JSON.stringify?

I'm looking for something that addresses the question below, but for JavaScript on the client.

Writing JSON to a stream without buffering the string in memory

When I try a simple JSON.stringify(big_object), it quickly takes up all the RAM and freezes my computer.
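
One pattern that may help, assuming big_object is a flat map of top-level entries, is to serialize it one entry at a time so that no single giant JSON string ever exists; the resulting array of small fragments can be handed directly to the Blob constructor. A minimal sketch, not code from the thread:

```javascript
// Hedged sketch: big_object is assumed to be a flat map of records.
// Serializing one entry at a time means the full JSON text is never
// one giant string; only the parts array grows.
function stringifyInParts(obj) {
  const parts = ['{'];
  let first = true;
  for (const key of Object.keys(obj)) {
    if (!first) parts.push(',');
    first = false;
    parts.push(JSON.stringify(key), ':', JSON.stringify(obj[key]));
  }
  parts.push('}');
  return parts;
}

// The Blob constructor accepts the array of fragments as-is.
const blob = new Blob(stringifyInParts(big_object), { type: 'application/json' });
```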

The same memory-usage issue takes place when I try to write a large object to IndexedDB, as described in detail here.
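
One workaround, if the data can be partitioned, is to store many small records rather than one monolithic value, so each write only has to serialize a small piece. A hedged sketch, where `db` and the 'chunks' store name are assumptions:

```javascript
// Assumed: `db` is an open IDBDatabase with an object store named 'chunks'
// created without a keyPath, so put(value, key) supplies the key explicitly.
function writeInChunks(db, records) {
  return new Promise((resolve, reject) => {
    const tx = db.transaction('chunks', 'readwrite');
    const store = tx.objectStore('chunks');
    // Each put only structured-clones one small record, not the whole dataset.
    records.forEach((record, i) => store.put(record, i));
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```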

Example of memory leak in indexedDB at store.add (see Example at Edit)

These two questions from three years ago seem to describe the same problem, but I can't find that a solution was ever found.

How can I make a really long string using IndexedDB without crashing the browser?

JSON.stringify optimization

The larger question is this: in an offline web app in which the user can accumulate a large amount of data in an IndexedDB database, the process to back that data up to the hard disk appears to be to write the data to an object, convert the object to a string, the string to a blob of text, and download the blob to disk; to restore, upload the file and perform the reverse. However, JSON.stringify and JSON.parse on the large object grab all the memory and crash the browser or the entire computer.
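
If the backup is assembled per record rather than via one JSON.stringify call on the whole object, the giant intermediate string disappears entirely, because the Blob constructor takes an array of small fragments. A sketch of the download step under that assumption, where `records` stands in for whatever iteration over the database rows is available:

```javascript
// Build the backup Blob from many small JSON fragments; only one record
// is ever stringified at a time.
function downloadBackup(records) {
  const parts = ['['];
  records.forEach((record, i) => {
    if (i > 0) parts.push(',');
    parts.push(JSON.stringify(record));
  });
  parts.push(']');
  const blob = new Blob(parts, { type: 'application/json' });
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = 'backup.json';
  a.click();
  // Revoke after the click has started the download.
  setTimeout(() => URL.revokeObjectURL(url), 0);
}
```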

This link appears to state that the large-blob issue in IndexedDB has been resolved, but that doesn't appear to solve this problem, does it? The object can't be directly converted to a blob, can it? And, if so, can the structured object be recovered from a blob?
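
The object itself can't be handed to the Blob constructor, but an array of JSON text fragments can, as sketched above; and recovering the data from an uploaded Blob can likewise be done piecewise if the backup format is line-oriented. A sketch assuming newline-delimited JSON (one record per line) rather than a single JSON document; TextDecoderStream support varies by browser:

```javascript
// Stream the uploaded File/Blob through a text decoder so only small
// chunks are held in memory, parsing one record (one line) at a time.
async function restoreFromFile(file, onRecord) {
  const reader = file.stream().pipeThrough(new TextDecoderStream()).getReader();
  let leftover = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    const lines = (leftover + value).split('\n');
    leftover = lines.pop(); // the last line may be incomplete
    for (const line of lines) {
      if (line) onRecord(JSON.parse(line)); // one small record at a time
    }
  }
  if (leftover) onRecord(JSON.parse(leftover));
}
```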

Erratic IndexedDB large file store operations cause IndexedDB to consume a large amount of memory that is not freed: https://bugzilla.mozilla.org/show_bug.cgi?id=1223782

Apart from having the user download and upload several files to back up and restore their work saved in the database, is there another way to accomplish this when it's all on the client and offline?
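
One alternative that avoids multiple files is the File System Access API available in Chromium-based browsers, which lets a page write a backup file incrementally so no full string or full Blob is ever held. A hedged sketch; support varies, and `records` is again an assumed iteration over the database:

```javascript
// Chromium-only sketch: stream each record to disk as it is serialized.
async function saveBackupStreaming(records) {
  const handle = await window.showSaveFilePicker({ suggestedName: 'backup.ndjson' });
  const writable = await handle.createWritable();
  for (const record of records) {
    await writable.write(JSON.stringify(record) + '\n'); // one line per record
  }
  await writable.close(); // the data is committed to disk on close
}
```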

Thank you for any direction you can provide.

1 Answer

#1

"The same memory-usage issue takes place when I try to write a large object to indexedDB, as described in detail here."

“当我试图向indexedDB编写一个大型对象时,也发生了相同的内存使用问题,本文将对此进行详细描述。”

IndexedDB has a limit of 5 MB on mobile / 50 MB on desktop, so if your object exceeds those device-based bounds, that is your issue with IndexedDB.
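
The actual quota is origin- and browser-dependent rather than a fixed number, and it can be queried at runtime where the Storage API is supported; a minimal check:

```javascript
// Report the origin's current usage against its quota (where supported).
if (navigator.storage && navigator.storage.estimate) {
  navigator.storage.estimate().then(({ usage, quota }) => {
    console.log(`Using ${usage} of ${quota} bytes`);
  });
}
```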

As to creating a massive string from a massive object, you may be running up against the V8 string length limitation, which is currently 512 MB. So you will need to use a stream-based parser/serializer like big-json.
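
For reference, a minimal big-json sketch following the package's documented stream API; note that big-json is a Node-style module, so using it on the client assumes a bundler that provides Node stream shims:

```javascript
const json = require('big-json');

const bigObject = {}; // placeholder for the large object to serialize
const parts = [];

// createStringifyStream emits the JSON text as a series of small chunks,
// so the full string never has to exist in memory at once.
const stringifyStream = json.createStringifyStream({ body: bigObject });
stringifyStream.on('data', (chunk) => parts.push(chunk));
stringifyStream.on('end', () => {
  // The collected pieces can go straight into a Blob without joining them.
  const blob = new Blob(parts, { type: 'application/json' });
});
```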
