I have a C++ process running in the background that will be generating 'events' infrequently that a Python process running on the same box will need to pick up.
- The code on the C side needs to be as lightweight as possible.
- The Python side is read-only.
- The implementation must be cross-platform.
- The data being sent is very simple.
What are my options?
Thanks
7 Answers
#1
39
ZeroMQ -- and nothing else. Encode the messages as strings.
However, if you want serialization handled by a library, use protobuf; it will generate classes for Python and C++. You use the SerializeToString() and ParseFromString() functions on either end, and then pipe the strings via ZeroMQ.
Problem solved, as I doubt any other solution will be faster, and no other solution will be as easy to wire up or as simple to understand.
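A minimal sketch of what the read-only Python side of the ZeroMQ-with-strings approach could look like with pyzmq, assuming the C++ process pushes each event as a plain string over a local PUSH socket; the address, port, and socket types here are illustrative choices, not something the answer prescribes.

```python
import zmq

# Read-only Python side: pull string events pushed by the C++ process.
context = zmq.Context()
sock = context.socket(zmq.PULL)          # the C++ side would use a PUSH socket
sock.connect("tcp://127.0.0.1:5555")     # illustrative local endpoint

while True:
    event = sock.recv_string()           # blocks until the next event arrives
    print("got event:", event)
```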
If you want to use specific system primitives for RPC, such as named pipes on Windows and Unix domain sockets on Unix, then you should look at Boost::ASIO. However, unless you have (a) a networking background and (b) a very good understanding of C++, this will be very time-consuming.
#2
5
Google's protobuf is a great library for RPC between programs. It generates bindings for Python and C++.
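As a rough illustration of the generated bindings, assume a hypothetical event.proto defining a message Event with a string name and an int64 stamp, compiled with protoc --python_out=. to produce event_pb2.py; the round trip below shows the SerializeToString()/ParseFromString() calls each end would use (the message layout is invented for this example).

```python
import event_pb2  # generated by protoc from the hypothetical event.proto

# What the sending side would do (shown in Python for brevity):
evt = event_pb2.Event(name="disk_full", stamp=1700000000)
wire = evt.SerializeToString()           # bytes, ready to pipe over any transport

# What the Python reader would do with the received bytes:
decoded = event_pb2.Event()
decoded.ParseFromString(wire)
print(decoded.name, decoded.stamp)
```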
If you need a distributed messaging system, you could also use something like RabbitMQ, ZeroMQ, or ActiveMQ. See this question for a discussion of the message queue libraries.
#4
1
How complex is your data? If it is simple, I would serialize it as a string. If it is moderately complex, I would use JSON. TCP is a good cross-platform IPC transport. Since you say that this IPC is rare, performance isn't very important, and TCP+JSON will be fine.
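A small sketch of the TCP+JSON idea on the Python side, assuming the C++ process connects to a local listener on 127.0.0.1:5000 and writes one JSON object per line; the port and the newline framing are assumptions made for this example.

```python
import json
import socket

# Listen locally and read newline-delimited JSON events from the C++ process.
srv = socket.create_server(("127.0.0.1", 5000))
conn, _ = srv.accept()
with conn, conn.makefile("r", encoding="utf-8") as stream:
    for line in stream:                  # one JSON-encoded event per line
        event = json.loads(line)
        print("got event:", event)
```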
#5
1
Another option is to just call your C code from your Python code using the ctypes module rather than running the two programs separately.
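A hedged sketch of the ctypes route, assuming the C/C++ code has been built as a shared library (libevents.so here, events.dll on Windows) exposing an extern "C" function poll_event() that returns the next event as a C string or NULL; the library name and signature are invented for illustration.

```python
import ctypes

# Load the compiled C/C++ library and declare the return type of poll_event().
lib = ctypes.CDLL("./libevents.so")      # events.dll on Windows
lib.poll_event.restype = ctypes.c_char_p

msg = lib.poll_event()                   # call straight into the C/C++ code
if msg is not None:                      # NULL maps to None via c_char_p
    print("got event:", msg.decode())
```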
#7
-4
I would suggest creating a DLL that manages the communication between the two. Python would load the DLL and call a method such as getData(), and the DLL would in turn communicate with the C++ process and fetch the data. That should not be hard. You could also use an XML file, an SQLite database, or any other database to exchange the data: the daemon updates the database and Python keeps querying it. There might be a field indicating whether the data in the database has already been updated by the daemon, so Python knows when to query. Of course, it depends on your performance and accuracy requirements!
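A rough sketch of the database-polling variant with SQLite, assuming the C++ daemon appends rows to a table events(id INTEGER PRIMARY KEY, payload TEXT) in events.db; the schema and poll interval are invented for the example, and the Python side stays read-only by remembering the highest id it has already seen.

```python
import sqlite3
import time

db = sqlite3.connect("events.db")        # file written by the C++ daemon
last_seen = 0

while True:
    rows = db.execute(
        "SELECT id, payload FROM events WHERE id > ? ORDER BY id",
        (last_seen,),
    ).fetchall()
    for event_id, payload in rows:       # read-only: Python never writes
        print("got event:", payload)
        last_seen = event_id
    time.sleep(1.0)                      # events are infrequent, poll slowly
```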