I am currently writing a network data usage monitor in C (program A) on a 64-bit Linux platform. Since it is a packet sniffer, there isn't much time to write data out to a file or database without risking dropped packets. I could use another thread for that, but I was thinking a much cleaner solution (if one exists) would be to access the data stored in memory from another C program, or ideally a Python CGI script (program B). That way the data would be available on demand. Program B would only need read-only access to the data. Is this possible? If so, how? Thanks.
Updates:
It looks like this is possible with shared memory (e.g. mmap(), or shmget() and friends), but I've heard some say that shmget() is old. What other options are available?
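Here is roughly what I'm picturing for program B if I go the POSIX shm_open()/mmap() route: program A creates the segment in C with shm_open() and mmap(), and on Linux it shows up as a file under /dev/shm, so program B can map it read-only with nothing but the Python standard library. A rough sketch ("/netmon" is a placeholder name I made up):

    # Sketch of program B reading program A's segment, assuming program A
    # created it in C with shm_open("/netmon", ...) and ftruncate()d it to
    # its final size.  "/netmon" is a placeholder name.
    import mmap

    with open("/dev/shm/netmon", "rb") as f:
        buf = mmap.mmap(f.fileno(), 0, prot=mmap.PROT_READ)  # map whole segment read-only

    print(len(buf), "bytes visible from program B")
    buf.close()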
The 2D arrays to be passed could be as big as 5000x4 int or 5000x15 char.
posix_ipc sounds promising as a way of accessing shared memory from Python (program B). Does anyone know if it will work with shared memory created in C (program A)?
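In case it helps anyone judge the approach, here is a rough sketch of what I have in mind for the Python side with posix_ipc, assuming program A created the segment in C under the (made-up) name "/netmon" and stores the 5000x4 table as row-major native 32-bit ints:

    import mmap
    import struct

    import posix_ipc

    ROWS, COLS = 5000, 4
    ROW_FMT = "4i"                        # four native ints per row
    ROW_SIZE = struct.calcsize(ROW_FMT)   # 16 bytes on x86-64

    shm = posix_ipc.SharedMemory("/netmon")            # attach to the existing segment
    buf = mmap.mmap(shm.fd, shm.size, prot=mmap.PROT_READ)
    shm.close_fd()                                     # the mapping keeps the memory alive

    def read_row(row):
        """Unpack one row on demand instead of copying the whole table."""
        return struct.unpack_from(ROW_FMT, buf, row * ROW_SIZE)

    print(read_row(0))

The layout (row-major, native int) is just an assumption; whatever program A actually writes, program B only has to unpack it with a matching struct format.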
I downloaded posix_ipc and it has some really cool demos. The first one demonstrates two processes talking to each other through shared memory; the two processes can be any combination of C and Python, and the source for each of the four is provided. It looks like the most efficient way to handle what I am trying to do, but I haven't had time to play with it yet. I will report back when I do.
che's suggestion below sounds like it will also work, and I will keep that as my plan B.
Thanks to everyone for the help!
mmap is perhaps a more general-purpose facility. But, yes, bit-twiddling from Python would be difficult. If both processes are in a language with binary-level access like C, though, it's pretty common to use shared memory in this way. Something like swig might help bind to Python. – BRPocock
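Regarding the bit-twiddling BRPocock mentions: if program A lays the block out as fixed-size C structs, the standard library's ctypes module can describe that layout directly, so swig isn't strictly necessary. A hypothetical sketch for the 5000x15 char case mentioned above:

    import ctypes

    ROWS = 5000

    class Record(ctypes.Structure):
        # Hypothetical layout: one 15-byte char field per row (the 5000x15 char case).
        _fields_ = [("name", ctypes.c_char * 15)]

    Table = Record * ROWS

    def decode(buf):
        # from_buffer_copy accepts the read-only bytes taken from the mapped segment.
        table = Table.from_buffer_copy(buf[:ctypes.sizeof(Table)])
        return [rec.name for rec in table]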