Would combining MongoDB and Redis be a good fit for this scenario?

Date: 2021-04-17 08:43:23

I'm developing a site with Node.js which has many realtime features. One feature is, that users can publish something in a certain channel and this will be immediately pushed to everyone watching the same channel.

I am trying to figure out the best way to approach this. The data structure of MongoDB fits my needs very well, but Redis Pub/Sub feature seems extremely fitting for this problem.

So I thought I might store the full dataset in MongoDB and just publish a reference through Redis to push it to the necessary channels. Then the subscriber clients of these channels could read the full data from the MongoDB database.

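The idea above could be sketched roughly as follows. This is a minimal sketch, not a full implementation: the channel, collection, and field names are hypothetical, and the MongoDB/Redis calls are shown in comments because they require live connections (the `mongodb` and `redis` npm clients are assumed).

```javascript
// Pattern: store the full document in MongoDB, publish only its id on a
// Redis channel, and let subscribers fetch the full document by id.

// Pure helpers for the reference message, so publisher and subscriber
// agree on its format. (The shape of this message is an assumption.)
function encodeRef(id) {
  return JSON.stringify({ id });
}
function decodeRef(message) {
  return JSON.parse(message).id;
}

// Publisher side (sketch, using the `mongodb` and `redis` npm clients):
//   const { insertedId } = await db.collection('posts').insertOne(post);
//   await redisPub.publish('channel:news', encodeRef(insertedId.toString()));
//
// Subscriber side (sketch, node-redis v4 subscribe API):
//   await redisSub.subscribe('channel:news', async (message) => {
//     const post = await db.collection('posts')
//       .findOne({ _id: new ObjectId(decodeRef(message)) });
//     // push `post` to connected browsers, e.g. via socket.io
//   });

// The helpers themselves are pure and can be exercised directly:
console.log(decodeRef(encodeRef('abc123'))); // → abc123
```

Keeping the Redis message down to an id keeps the pub/sub payload small; the trade-off is one extra MongoDB read per subscriber, which could matter with very many subscribers on the same channel.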
Does this make sense or am I missing something?

Thanks!

1 Answer

#1


It makes sense. You can use Redis natively or through the socket.io API, and use MongoDB for persistent data.

The only gotcha is to realize that Redis pub/sub does not provide any queuing facility. Redis does not buffer notifications: they go straight from the publisher socket to the subscriber sockets, in the same event loop iteration. This is fast, but if a subscriber closes its connection, it may lose some items before the connection is established again. Delivery is not guaranteed at all.

However, a subscriber reconnecting after a disconnection could, if required, use the MongoDB data to retrieve the items it missed.

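The catch-up step on reconnect could look like the sketch below. The collection and field names (`posts`, `channel`, `createdAt`) are assumptions; the actual MongoDB call is shown in comments since it needs a live connection.

```javascript
// On reconnect, the client remembers a cursor (here: the timestamp of
// the last item it processed) and first queries MongoDB for anything
// newer, then resubscribes to the Redis channel.

// Pure helper: build the MongoDB filter for the missed items.
function missedItemsFilter(channel, lastSeenAt) {
  return { channel, createdAt: { $gt: lastSeenAt } };
}

// Sketch of the reconnect sequence (using the `mongodb` npm client):
//   const missed = await db.collection('posts')
//     .find(missedItemsFilter('news', lastSeenAt))
//     .sort({ createdAt: 1 })
//     .toArray();
//   // deliver `missed` to the client, then resubscribe:
//   await redisSub.subscribe('channel:news', onMessage);

const filter = missedItemsFilter('news', 1618600000000);
console.log(filter.createdAt.$gt); // → 1618600000000
```

Note that resubscribing after the catch-up query leaves a small window in which an item can be published and missed again; subscribing first and then running the query closes that gap, at the cost of possibly delivering a few items twice (so delivery should be idempotent, e.g. deduplicated by `_id`).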