Setting aside the heap's capacity, are there ways to go beyond Integer.MAX_VALUE constraints in Java?
Examples are:
- Collections limit themselves to Integer.MAX_VALUE.
- StringBuilder / StringBuffer limit themselves to Integer.MAX_VALUE.
6 Answers
#1
If you have a huge Collection you're going to hit all sorts of practical limits before you ever have 2^31 − 1 items in it. A Collection with a million items in it is already pretty unwieldy, let alone one more than a thousand times larger.
Similarly, a StringBuilder can build a String that's 2 GB in size before it hits the Integer.MAX_VALUE limit, which is more than adequate for any practical purpose.
If you truly think that you might be hitting these limits, your application should be storing your data in a different way, probably in a database.
#2
With a long? Works for me.
Edit: Ah, clarification of the question. Cool. My new and improved answer:
With a paging algorithm.
Coincidentally, somewhat recently for another question (Binary search in a sorted (memory-mapped ?) file in java), I whipped up a paging algorithm to get around the int parameters in the java.nio.MappedByteBuffer API.
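A minimal sketch of the paging idea, assuming a read-only file: the file is mapped in chunks of at most Integer.MAX_VALUE bytes, and a single long index is split into a page number and an int offset. The class name PagedFile and the chunking scheme are illustrative, not the code from the answer referenced above.

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Hypothetical sketch: page a file into Integer.MAX_VALUE-sized mappings so
// a single long index can address more than 2^31 - 1 bytes.
public class PagedFile {
    private static final long PAGE_SIZE = Integer.MAX_VALUE; // max bytes per mapping
    private final MappedByteBuffer[] pages;

    public PagedFile(Path path) throws IOException {
        try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
            long size = ch.size();
            int pageCount = (int) ((size + PAGE_SIZE - 1) / PAGE_SIZE);
            pages = new MappedByteBuffer[pageCount];
            for (int i = 0; i < pageCount; i++) {
                long pos = i * PAGE_SIZE;
                long len = Math.min(PAGE_SIZE, size - pos);
                // Mappings stay valid even after the channel is closed.
                pages[i] = ch.map(FileChannel.MapMode.READ_ONLY, pos, len);
            }
        }
    }

    // Split the long index into a page number and an in-page int offset.
    public byte get(long index) {
        int page = (int) (index / PAGE_SIZE);
        int offset = (int) (index % PAGE_SIZE);
        return pages[page].get(offset);
    }
}
```

The same page/offset split works for any int-bounded API, not just MappedByteBuffer.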
#3
You can create your own collections which have a long size(), based on the source code for those collections. To have larger arrays of Objects, for example, you can have an array of arrays (and stitch these together).
This approach will allow almost 2^62 elements.
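A sketch of the array-of-arrays approach, assuming the generic class below (the name BigArray and the chunk size are illustrative): an outer Object[][] holds inner arrays, and a long index is split into a chunk number and an in-chunk offset. With up to ~2^31 inner arrays of up to ~2^31 elements each, the addressable range approaches 2^62 elements (memory permitting); with the 2^30 chunk size used here it is ~2^61.

```java
// Hypothetical sketch of a long-indexed "array" built from an array of arrays.
public class BigArray<T> {
    private static final int CHUNK_SIZE = 1 << 30; // elements per inner array
    private final Object[][] chunks;
    private final long size;

    public BigArray(long size) {
        this.size = size;
        int chunkCount = (int) ((size + CHUNK_SIZE - 1) / CHUNK_SIZE);
        chunks = new Object[chunkCount][];
        for (int i = 0; i < chunkCount; i++) {
            // The last chunk may be shorter than CHUNK_SIZE.
            long remaining = size - (long) i * CHUNK_SIZE;
            chunks[i] = new Object[(int) Math.min(CHUNK_SIZE, remaining)];
        }
    }

    public long size() { return size; } // long, unlike Collection.size()

    @SuppressWarnings("unchecked")
    public T get(long index) {
        return (T) chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)];
    }

    public void set(long index, T value) {
        chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)] = value;
    }
}
```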
#4
Array indexes are limited by Integer.MAX_VALUE, not the physical size of the array.
Therefore the maximum size of an array is linked to the size of the array's element type:
- byte = 1 byte => max 2 GB of data
- char = 2 bytes => max 4 GB of data
- int = 4 bytes => max 8 GB of data
- long = 8 bytes => max 16 GB of data
Dictionaries are a different story because they often use techniques like buckets, or lay out their data internally as a tree. Therefore these "limits" usually don't apply, or you will need even more data to reach them.
In short: Integer.MAX_VALUE is not really a limit, because you need lots of memory to actually reach it. If you should ever reach this limit, you might want to think about improving your algorithm and/or data layout :)
#5
Yes, with the BigInteger class.
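To be clear, BigInteger lifts the limit only for numeric values, not for collection or array sizes. A minimal illustration:

```java
import java.math.BigInteger;

public class BeyondIntMax {
    public static void main(String[] args) {
        // BigInteger values are bounded only by available heap, so
        // arithmetic can go well past Integer.MAX_VALUE (and Long.MAX_VALUE).
        BigInteger big = BigInteger.valueOf(Integer.MAX_VALUE).add(BigInteger.ONE);
        System.out.println(big); // 2147483648
    }
}
```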
#6
A memory upgrade is necessary... :)