I am wondering if you have any data on concurrent WebSocket connections. I am using Socket.io on a Node.js server. How many clients can connect to the socket and receive data without bringing my server down? 1000? 1000.0000?
Thanks!
2 Answers
#1
22
This depends heavily on your hardware configuration, on what exactly you are doing/processing on the server side, and on whether your system is optimized for many concurrent connections. For example, on a Linux machine you would by default probably hit the maximum number of open files or other limits (which can be increased) before running into hardware resource exhaustion or similar scalability issues. A key resource is the amount of RAM your Node.js program can allocate to keep concurrent connections open and still accept new ones.
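For reference, here is a minimal sketch (assuming Socket.IO v4 on Node.js; the port and counter are illustrative, not part of the original answer) that counts concurrent connections and logs resident memory, so you can watch where your own OS and hardware limits kick in. On Linux, each connection also consumes a file descriptor, so the open-files limit (`ulimit -n`) may need to be raised as the answer notes.

```typescript
// Minimal sketch: track concurrent Socket.IO connections and memory usage.
// Assumes Socket.IO v4; install with `npm install socket.io`.
import { createServer } from "http";
import { Server } from "socket.io";

const httpServer = createServer();
const io = new Server(httpServer);

let connected = 0;

io.on("connection", (socket) => {
  connected++;
  // Log how many clients are connected and how much RAM the process uses.
  console.log(`clients: ${connected}, rss bytes: ${process.memoryUsage().rss}`);

  socket.on("disconnect", () => {
    connected--;
  });
});

httpServer.listen(3000, () => {
  console.log("listening on :3000");
});
```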
#2
15
http://blog.caustik.com/2012/08/19/node-js-w1m-concurrent-connections/
Check out this blog post. We use the same principle. Previously, our Node.js server would crash after 100 concurrent connections due to hardware constraints, but after moving to Amazon EC2 it is now highly scalable.