I am getting a "too many open files" error once the number of users exceeds a certain threshold (around 1200 concurrent users).
I increased the limit using this, but I was still getting the same error.
Then I followed this, and nothing changed; I still get the same error.
To create the connection, I use the following in my Django settings, and I use REDIS wherever I need it.
REDIS = redis.StrictRedis(host='localhost', port=6379, db=0)
I did it that way because it was suggested on the Redis mailing list, as below:
a. create a global redis client instance and have your code use that.
Is that approach right for connection pooling? Or how do I avoid this "too many open files" error? In the Django response I am getting:
Connection Error (Caused by : [Errno 24] Too many open files)",),)'
Thanks.
1 Answer
#1
You are creating a ConnectionPool per connection; depending on where you create the REDIS connection, you might end up creating a new connection pool every time (e.g. if it is in a view function).
You should make sure you create connections from a single long-lived connection pool; if you define the connection pool instance at module level and reuse it when you initialize connections, you can be sure only one pool is created (one per Python process, at least).
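In code, that pattern looks roughly like this (a minimal sketch; REDIS_POOL is just a placeholder name):

import redis

# One pool per process, defined at module level so every request reuses it.
REDIS_POOL = redis.ConnectionPool(host='localhost', port=6379, db=0)

# Clients built from the shared pool are cheap to create; they borrow and
# return connections instead of opening a new socket each time.
REDIS = redis.StrictRedis(connection_pool=REDIS_POOL)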
If you see the "too many open files" error on Redis with the ulimit set much higher than the number of users (e.g. a ulimit of 10k and 1k connections from Django), then you are probably doing something that leaks Redis connections (so they are not closed for some amount of time).
I suggest you start by adding a connection pool and setting a max connection limit on it (it is part of the __init__ signature); that way the pool raises an exception only when the actual number of connected users exceeds the limit.
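For example, something along these lines (the limit of 100 is an arbitrary placeholder; size it comfortably below your ulimit):

import redis

# Cap the pool so a connection leak fails fast instead of exhausting
# the process's file descriptors.
pool = redis.ConnectionPool(host='localhost', port=6379, db=0,
                            max_connections=100)
REDIS = redis.StrictRedis(connection_pool=pool)

# When all 100 connections are checked out, the next command raises
# redis.ConnectionError rather than opening yet another socket.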
If you can, increase the ulimit; Redis can easily take more than 1k connections.
If you really want to limit the number of connections between your Python scripts and Redis, you should consider using the BlockingConnectionPool, which lets clients wait when all connections are in use (rather than throwing an exception), or perhaps put something like twemproxy in between.
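A rough sketch of that option (the max_connections and timeout values are arbitrary):

import redis

# BlockingConnectionPool makes callers wait up to `timeout` seconds for a
# free connection instead of raising as soon as the pool is exhausted.
pool = redis.BlockingConnectionPool(host='localhost', port=6379, db=0,
                                    max_connections=50, timeout=20)
REDIS = redis.StrictRedis(connection_pool=pool)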