I'm fairly new to Python, and I have 5 big arrays A, B, C, D, E with shapes:
((1000000, 8), (1000000, 7), (1000000, 13840), (1000000, 204), (1000000, 3))
dtypes:
(dtype('float64'), dtype('float64'), dtype('int64'), dtype('int64'), dtype('float64'))
Now I would like to join them all into a single array with a shape of
(1000000, 8+7+13840+204+3) = (1000000, 14062)
I have tried all the ways I could find (hstack / concatenate):
data_feature = np.concatenate((A,B,C,D,E), axis=1)
data_feature = np.hstack([A,B,C,D,E])
data_feature = np.hstack((A,B,C,D,E))
data_feature = np.column_stack([A,B,C,D,E])
but every one of them kills my system (MacBook Pro 2017 / 2.8 GHz Intel Core i7 / 16 GB 2133 MHz LPDDR3). I think this could be a kernel problem. Any suggestions for how I can do this on my machine?
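Before attempting any of the stacking calls, it can help to estimate the memory footprint up front. This is a minimal sketch using scaled-down stand-ins (n = 1000 rows instead of 1,000,000) with the same column widths and dtypes as the arrays above; the row count and the total are then just scaled by 1000:

```python
import numpy as np

# Scaled-down stand-ins for the real arrays (1000 rows instead of 1000000)
n = 1000
A = np.zeros((n, 8))                      # float64
B = np.zeros((n, 7))                      # float64
C = np.zeros((n, 13840), dtype=np.int64)
D = np.zeros((n, 204), dtype=np.int64)
E = np.zeros((n, 3))                      # float64

inputs = A.nbytes + B.nbytes + C.nbytes + D.nbytes + E.nbytes
# np.concatenate allocates a brand-new output array of the same total size,
# so peak memory usage is roughly inputs + output = 2x the inputs.
peak = inputs * 2
print(f"inputs: {inputs} bytes, peak during concatenate: ~{peak} bytes")
```

With n = 1,000,000 the peak works out to about 225 GB, which is why the calls above exhaust 16 GB of RAM.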
1 Answer
Given 64-bit (8 byte) values, you are trying to process:
1000000 * 14062 * 8 * 2 = 224'992'000'000 bytes
The 2 on the end is because you have inputs plus equal-size outputs.
That is 209 GiB of data. You have 16 GiB of RAM. It is not feasible. You'll need to think harder about how you're processing your data, and how you can reduce it by a factor of 10. Or buy a machine with 192 GiB of RAM (which is very possible these days, just not on a laptop).
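If the concatenated array only needs to be read in chunks afterwards (rather than held in RAM), one workaround is to write the result column-block by column-block into a disk-backed `np.memmap`, so no in-memory copy of the full 14062-column array is ever created. This is a sketch, not the answerer's suggestion: it assumes a common `float64` dtype (the int64 blocks get cast) and a hypothetical output file name `features.dat`, shown here with a scaled-down n = 1000:

```python
import numpy as np

n = 1000  # use 1000000 for the real data
widths = (8, 7, 13840, 204, 3)
parts = [np.random.rand(n, w) for w in widths]  # stand-ins for A, B, C, D, E

# Disk-backed output: only the pages currently being written sit in RAM.
out = np.memmap('features.dat', dtype=np.float64, mode='w+',
                shape=(n, sum(widths)))
col = 0
for p in parts:
    out[:, col:col + p.shape[1]] = p  # written through to the file
    col += p.shape[1]
out.flush()
```

Casting everything to `float32` instead would additionally halve the footprint, if the values tolerate the precision loss; even so, the full-size file is still ~112 GB on disk, so reducing the 13840-column block remains the better fix.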