We manage HTML content from a data source and write it directly to our web pages using ASP.NET C#.
The problem we are facing: the complete content does not display on the page, but when we check the page source and copy/paste it into a static HTML page, all of the content displays.
Is there any browser limitation on the maximum length of a web page?
I googled and found claims that a web page should be limited to 10-30 KB, but in the same project we have pages up to 55 KB long.
Can anyone help me out?
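For context, the pattern in question is roughly the following (a simplified sketch; LoadHtmlFromDataSource and the Literal control are illustrative stand-ins, not the exact code):

// Code-behind sketch: stored HTML is written straight into the page.
using System;
using System.Web.UI;

public partial class ContentPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Hypothetical helper standing in for the real data access.
        string html = LoadHtmlFromDataSource("home-page");

        // The fragment is emitted as-is (no encoding) through a Literal
        // declared in the .aspx markup as:
        //   <asp:Literal ID="ContentLiteral" runat="server" Mode="PassThrough" />
        ContentLiteral.Text = html;
    }

    private string LoadHtmlFromDataSource(string key)
    {
        // Placeholder: in reality this reads the fragment from the database.
        return "<div>...page content...</div>";
    }
}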
3 Solutions
#1
I've recently been benchmarking browser load times for very large text files. Here are some data:
IE -- dies around 35 megs
Firefox -- dies around 60 megs
Safari -- dies around 60 megs
Chrome -- dies around 50 megs
Again, this is the simple browser load time of a basic (if large) English text file. One strange note: Firefox seems to handle close to 60 megs before becoming non-responsive, but it only puts 55.1 megs out on the viewport. (However, I can Ctrl-A to get all 60 megs onto the clipboard.)
Naturally your mileage will vary; this all interacts with network latency, and we'd probably see vast differences if you're talking about downloading pictures, etc. This is just for a single very large file of English text.
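For anyone who wants to reproduce a measurement like this, generating the test file is simple; a sketch in C# (my own approach, not necessarily what was used above):

using System;
using System.IO;
using System.Text;

class MakeTestFile
{
    static void Main()
    {
        // Write roughly 60 MB of plain English text so a browser can
        // load it via file:// or a local web server.
        const long targetBytes = 60L * 1024 * 1024;
        string line = "The quick brown fox jumps over the lazy dog. ";
        using (var writer = new StreamWriter("bigfile.txt", false, Encoding.ASCII))
        {
            long written = 0;
            while (written < targetBytes)
            {
                writer.WriteLine(line);
                written += line.Length + 2; // account for CR/LF
            }
        }
        Console.WriteLine("Done: bigfile.txt");
    }
}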
#2
The limits (if they exist at all) are higher than 50 KB:
$ wget --quiet "http://www.cnn.com" -O- | wc -c
99863
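The same check can be run against your own page from C#; a quick sketch (the URL is a placeholder for one of your pages):

using System;
using System.Net;

class PageSizeCheck
{
    static void Main()
    {
        // Download the page and report its size in bytes, mirroring the
        // wget | wc -c check above.
        using (var client = new WebClient())
        {
            byte[] data = client.DownloadData("http://localhost/yourpage.aspx");
            Console.WriteLine(data.Length + " bytes");
        }
    }
}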
I don't believe there is any particular constant limit on page size. I would guess it depends instead on how much memory the web browser process can allocate.
#3
Install Firefox and Firebug and try to examine any factors that could be affecting the source code. Unless you are doing something odd in the C#, it shouldn't be cut off.
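Since the full markup shows up in view-source but rendering stops early, one C#-side thing worth ruling out is malformed HTML in the stored content: an unclosed tag or a stray '<' can make the browser swallow everything after it. A rough sanity check, offered as a sketch only (the heuristic and names are mine):

using System;
using System.Text.RegularExpressions;

class HtmlSanityCheck
{
    // Heuristic only: compares counts of opening and closing tags for a
    // few common block elements. A mismatch suggests the fragment is
    // malformed and may render truncated even though view-source is complete.
    static void Warn(string html)
    {
        foreach (string tag in new[] { "div", "table", "span", "p" })
        {
            int opens = Regex.Matches(html, "<" + tag + @"[\s>]", RegexOptions.IgnoreCase).Count;
            int closes = Regex.Matches(html, "</" + tag + ">", RegexOptions.IgnoreCase).Count;
            if (opens != closes)
                Console.WriteLine("Unbalanced <" + tag + ">: " + opens + " open vs " + closes + " close");
        }
    }

    static void Main()
    {
        Warn("<div><p>hello</div>"); // the missing </p> gets flagged
    }
}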