Open and link multiple programs from a single Python interface

I’m working on a single “Bridge” application to allow sending live mesh and camera data back and forth between Houdini and Maya, and potentially other programs too.
I’d like the user to be able to open new sessions of these programs from the Bridge interface, and I’m having trouble doing it in Python.
Subprocess and multiprocessing each seem to create child processes of the main Python one, which is not what I want - from an OS level they should be entirely separate, as if the user had run the exes separately by hand.
From there I think I can work out the communication between them, as both Maya and Houdini have server ports, but if anyone has any experience or knows of any gotchas, I’d appreciate hearing about them.
Thanks

1 Like

On Windows, you can do this with subprocess.Popen using the creationflags parameter

from subprocess import Popen, PIPE

DETACHED_PROCESS = 0x00000008
CREATE_NEW_PROCESS_GROUP = 0x00000200

path_to_exe = 'C:/Example/OtherApp.exe'

p = Popen(path_to_exe, stdin=PIPE, stdout=PIPE, stderr=PIPE, 
          creationflags=DETACHED_PROCESS | CREATE_NEW_PROCESS_GROUP)

This tells Windows to run the executable as a fully separate process. Passing PIPE to stdin, stdout, and stderr tells it to create new streams for each of those; otherwise they default to None and the child process’s input / output handles are inherited from the parent process. In my experience it’s best to set them explicitly.

Those magic creationflags numbers come from Microsoft (see Microsoft’s process creation flags documentation). That also means this exact approach won’t work on Mac or Linux systems.

If you need this to work on Mac / Linux, you have to use a different parameter called start_new_session instead of creationflags, but it’s pretty straightforward. There’s a Stack Overflow post that demonstrates how to do this.
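For reference, a minimal POSIX sketch of the start_new_session approach. This uses the current Python interpreter as a stand-in for the DCC executable (substitute the real Houdini/Maya path in practice):

```python
import os
import subprocess
import sys

# Stand-in command; in practice this would be the DCC executable path.
cmd = [sys.executable, "-c", "import os; print(os.getsid(0))"]

# start_new_session=True makes the child call setsid() before exec, so it
# becomes its own session leader, detached from the parent (POSIX only).
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True,
                     start_new_session=True)
child_sid = int(p.communicate()[0])

print(child_sid != os.getsid(0))  # True: the child is in its own new session
```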

7 Likes

Excellent, thank you!
That solves the issue of getting a separate process, but now I get WinError 5: Access is denied. Since I’m only running the Houdini exe from Program Files, and it had no problem running before, is there some permission I need to give Python itself to spawn new separate processes?

1 Like

Hi all, an update - I can now create new detached processes for DCCs from the Python bridge UI, and by providing a script to the system call for each DCC (for example houdini.exe bridgesetup.py ) I can set up a basic network interface in the new instance, using sockets to communicate with the original bridge.

Using sockets in this way is fine, but it relies on constant ticking within all the processes to send and receive data - ideally I would like relevant updates to propagate instantly, so when a user moves a camera in Maya, it matches exactly in Houdini, etc. The ideal way would be to set up DCC callbacks which somehow send signals through the same socket interface, instead of just information.
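One way to avoid the constant ticking is to block on the socket with select (or just a blocking recv), so the listener only wakes when data actually arrives. A stdlib-only sketch, using a socketpair as a stand-in for the bridge/DCC connection:

```python
import select
import socket
import threading

# A connected pair standing in for the bridge <-> DCC connection.
server, client = socket.socketpair()

def listener(sock):
    # select blocks until the socket is readable; no busy-wait ticking.
    readable, _, _ = select.select([sock], [], [])
    if sock in readable:
        return sock.recv(1024)

result = {}
t = threading.Thread(target=lambda: result.setdefault("msg", listener(client)))
t.start()

server.sendall(b"camera_moved")  # e.g. fired from a Maya camera callback
t.join()
print(result["msg"])  # b'camera_moved'
```

A real DCC callback would do the sendall; the listener thread sleeps inside select until then, so nothing polls.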

Additionally, I see that Houdini specifically defines a binding of rpyc for direct remote operation: https://www.sidefx.com/docs/houdini/hom/rpc
Is there any “correct” way to make this module visible to the main Python bridge (which is running in the system Python interpreter), or does it just need to be added to the Python path for the main bridge process?

Thanks for your help

1 Like

I’ve spent some time recently trying to make gRPC, RabbitMQ, and threaded sockets work as producer/consumer. I’m trying to avoid third-party libraries where I can, and found threaded sockets to handle things pretty well.

I treat all the applications (DCCs, etc) as both producers and consumers.

It’s sort of a poor man’s version of this: Publish–subscribe pattern - Wikipedia

If I were to suggest a ‘correct’ way, I could only offer my preference for this pub/sub pattern.
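For illustration, the pattern boils down to something like this in-process sketch (a real version would put a socket or a message broker between publisher and subscribers; the topic names here are made up):

```python
from collections import defaultdict

class Broker:
    """Minimal in-process publish/subscribe hub."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Each DCC registers callbacks for the topics it cares about.
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Fan the payload out to every subscriber of this topic.
        for callback in self._subscribers[topic]:
            callback(payload)

broker = Broker()
received = []
broker.subscribe("camera.moved", received.append)
broker.publish("camera.moved", {"tx": 1.0, "ty": 2.0, "tz": 3.0})
print(received)  # [{'tx': 1.0, 'ty': 2.0, 'tz': 3.0}]
```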

I was hoping to use gRPC and Kafka for something like this, but compiling libs for all the flavors of interpreter isn’t a good time, and I am not getting paid to do it. :wink:

Can’t wait to hear what you settle on.

Hey, at the moment I’ve settled on pretty much the same structure you describe. When each program runs, a child “emitter” thread is created, which itself spins off a child “listener” thread. The emitter thread constructs the socket interface, and is connected to the DCC by whatever callbacks you want. The listener’s only purpose is to constantly check the socket in a while True loop, and if an event is received, pass it to a DCC-specific subclass method for handling.
Together the two threads form a sort of server, and that server is roughly mirrored within the bridge program - the bridge checks all its connected sockets within the same iteration in the same thread, then processes whatever it finds.
The publish/subscribe model would have been ideal if there were real events to hook into - the while True polling seems very primitive, but every implementation I’ve seen suggests it.
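The listener-thread-plus-subclass-handler structure described above can be sketched roughly like this (names and the socketpair stand-in are mine, not the actual bridge code; a blocking recv replaces the ticking):

```python
import socket
import threading

class Listener(threading.Thread):
    """Watches the socket in a loop and hands events to a handler method."""

    def __init__(self, sock):
        super().__init__(daemon=True)
        self.sock = sock
        self.events = []

    def handle(self, event):
        # A DCC-specific subclass would override this to apply the event.
        self.events.append(event)

    def run(self):
        while True:
            data = self.sock.recv(1024)  # blocks until an event arrives
            if not data:
                break  # peer closed the connection
            self.handle(data)

# socketpair stands in for the bridge <-> DCC connection.
bridge_end, dcc_end = socket.socketpair()
listener = Listener(dcc_end)
listener.start()

bridge_end.sendall(b"reimport_mesh")
bridge_end.close()  # recv then returns b'' and the loop exits
listener.join()
print(listener.events)  # [b'reimport_mesh']
```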

1 Like

Yeah. I think the key there would be a messaging backend, like Kafka or RabbitMQ, to subscribe to. At that point I dropped my investigation, as I’m running Windows. I promised myself to dive back in once we get public releases of Maya and Houdini with Python 3.x that I can “reliably” compile libraries against.

I’ve been tempted by the betas, but, …yuck.

1 Like

On Windows there is a named constant in the subprocess module for the creation flags:

import subprocess

p = subprocess.Popen(cmd, creationflags=subprocess.CREATE_NEW_CONSOLE)

for Unix/Mac you can use:

import os
import subprocess

p = subprocess.Popen(cmd, preexec_fn=os.setsid)

I think this has mostly been covered. I have not really tried to pipe data between these applications; I mostly just launch them with a configured environment.
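Launching with a configured environment is just a matter of passing env to Popen. A small sketch, using the current Python as a stand-in for the DCC and a made-up variable name:

```python
import os
import subprocess
import sys

# Copy the current environment and add/override what the DCC should see.
env = os.environ.copy()
env["BRIDGE_PORT"] = "45555"  # hypothetical variable our startup script reads

# Stand-in command; in practice this would be the DCC executable.
cmd = [sys.executable, "-c", "import os; print(os.environ['BRIDGE_PORT'])"]
out = subprocess.check_output(cmd, env=env, text=True)
print(out.strip())  # 45555
```

Copying os.environ first matters: passing a bare dict as env replaces the whole environment, which can break the child in surprising ways.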

1 Like

Uses RabbitMQ and/or Redis right out of the box.

1 Like

I did some tests some time ago.
I wanted to be able to control multiple Python interpreters (including Maya’s cmds, and different versions of Python) from a single Python instance.

I used pyzmq for communication.
The server exposes basic Python operations to the client (getGlobal, getattr, getitem, setitem, call, importModule), and only basic types are serialized through the connection (bool, int, float, str); all the other types are referenced by their id in what I called a “MappedRef”.
I didn’t try to transfer geometry from one Maya to another. If I have some time, I’ll do a test.

import subprocess
import time

p = subprocess.Popen(["subpython", "run", "-E", "tcp://127.0.0.1:45555"], cwd="/", stderr=subprocess.STDOUT)
time.sleep(2)  # Wait for the interpreter to be ready
client = Client("tcp://127.0.0.1:45555")
client.spImportModule("os")
cwd = client.os.getcwd()
print(cwd)
data = client.dict()
data["test"] = "Test"
print(data)
print(data["test"])
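The “only basic types cross the wire” idea described above can be sketched roughly like this (the names and the ref encoding are my own illustration, not the linked implementation’s):

```python
# Server-side registry of objects that can't be serialized directly.
_refs = {}

BASIC_TYPES = (bool, int, float, str, type(None))

def serialize(value):
    """Pass basic types through; replace anything else with a mapped ref."""
    if isinstance(value, BASIC_TYPES):
        return value
    _refs[id(value)] = value  # keep the real object alive server-side
    return {"__mapped_ref__": id(value)}

def resolve(payload):
    """Turn a mapped-ref payload back into the real server-side object."""
    if isinstance(payload, dict) and "__mapped_ref__" in payload:
        return _refs[payload["__mapped_ref__"]]
    return payload

print(serialize(3.5))       # 3.5 (basic type, sent as-is)
ref = serialize({"a": 1})   # dict -> sent as a reference, not serialized
print(resolve(ref))         # {'a': 1}
```

The client then wraps these refs in proxy objects, so attribute access and calls round-trip through the connection while the real object never leaves the server.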

You can look at the implementation here:

1 Like

That is very cool, and far beyond what I’m doing lol. I ended up stopping a level below the object facade; I just construct and send command events in the same way. It also helps that I don’t require crazy-precise functionality from the bridge itself, and all the geometry passing I’m doing is through files - basically if a bridge file changes, the relevant client scenes just reimport it.
It’s primitive but it’s fine for now. I think the next level up would be something like Protocol Buffers (although at that point I might as well just use Omniverse).