Hello all,

Has anyone used Docker in production?

We currently have quite a few web-based tools (QA, baking, asset mgmt, etc.) deployed at different studios. Even with automated testing, putting these services into production isn’t as easy as I would like. Docker promises to make it much easier, pretty much saying that if it runs on your (dev) machine, it will very likely run in production. Deploying Docker images that you just swap out also sounds a lot easier than manual deployment. So I’m curious whether anyone has thoughts or advice on using Docker to deploy pipeline components.
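For anyone who hasn’t seen it, the swap-out deploy is only a couple of commands; a minimal sketch, where the image name `studio/qa-tools`, the tag, and the port are hypothetical examples:

```shell
# Pull the new build from a registry (image name and tag are made up)
docker pull studio/qa-tools:2.3

# Stop and remove the currently running container
docker stop qa-tools && docker rm qa-tools

# Start the new version under the same name
docker run -d --name qa-tools -p 8080:8080 studio/qa-tools:2.3
```

Rolling back is the same sequence with the previous tag, which is a big part of the appeal.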


I haven’t, as our pipeline and distribution tools are well defined, and starting over now on a different platform would be a huge step back. But I think in the future you will see quite a lot of in-house pipelines being distributed via containers. The upcoming Anniversary Update (aka SP1) for Windows 10 includes Docker support in the desktop client, not just the server. With that in place I think Docker-based distribution will proliferate quickly.


I’ve looked into it for a while. It’s a very appealing idea, but the big problem for us seems to be the nature of the arrangement: living inside a VM, there’s a level of indirection between the Docker container and the host machine that is worrisome for things like tools that trawl lots of big binary files, though I haven’t gotten around to trying it out. The real use case seems to be server-style applications that are config-heavy and data-light, whereas most of our stuff is the reverse. I’m planning on revisiting it when the native Windows support comes out to see how it looks then.


Haha, yes, having this on the workstations would be awesome. From what I’ve seen it’s not possible yet to take advantage of containers for deploying tools to artists. Currently containers run Linux inside and live within a VM. Windows Server 2016 offers “Windows containers”, which run Windows inside and don’t need a VM either; i.e., the containers run right on top of the host system, and interaction with it should be very straightforward. But practically, right now, you have to work with Linux containers running in a VM if you use Windows. I think Windows 10 now allows Hyper-V instead of Docker’s default VirtualBox, but that doesn’t seem like much of an advantage.

However, I would definitely watch what happens with Server 2016. Containerizing Windows applications sounds very powerful, especially since Docker images can build on one another: e.g., you have one core Python image and all your tools are layered on top of it, yet each tool has its own container and will never “pollute” your core Python image. Within an image you can also choose which parts of the file system are persistent and which aren’t. For a studio like mine, where we have up to 20 projects at a time, having containers that don’t “pollute” the base Windows installation sounds great.
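That layering can be sketched with two Dockerfiles; the image names (`studio/python-core`, the tool itself) and the pip package are hypothetical:

```dockerfile
# Base image every tool shares: one pinned Python plus common libs.
# Built once and tagged as e.g. studio/python-core:1.0.
FROM python:3.10-slim
RUN pip install --no-cache-dir requests
```

```dockerfile
# A tool image layered on top of the core. Anything it installs or
# changes lives in its own layers and never touches
# studio/python-core, which other tools keep using unchanged.
FROM studio/python-core:1.0
COPY ./asset_tool /opt/asset_tool
CMD ["python", "/opt/asset_tool/main.py"]
```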

But I’m not thinking that far ahead yet. My immediate concern is really server-based software: server-based pipeline components for asset management, asset storage, databases, tracking, server-based bake/render jobs, etc. We try to distribute the servers because internet connectivity can be an issue for us, but updating even 4 studios with new builds, and testing them, can be quite time consuming.
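Assuming a private registry reachable from each studio (the registry host and image name below are made up), the multi-studio update could reduce to building and pushing a versioned image once, after which each studio just pulls that tag and swaps its running container:

```shell
# At HQ / in CI: build and push one tested, versioned image
docker build -t registry.example.com/pipeline/asset-server:1.4 .
docker push registry.example.com/pipeline/asset-server:1.4
```

Each studio then runs the exact same bits that were tested, rather than a locally assembled build.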


It’s really interesting to see a Docker discussion on a CG portal!

For configuring and running web applications, go for it; it’s a very reasonable platform for something that will be delivered via browser. I’ve never measured how it deals with ingesting large files, but don’t forget that a container is not a VM; it’s a process that “thinks” it’s a separate computer. Also, you should use mapped volumes for your application’s storage anyway, which again translate to actual files on the host machine and should behave like regular host I/O.
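A mapped volume is just a flag on `docker run`; the host path, container path, and image name here are hypothetical:

```shell
# Bind-mount a host directory into the container. Reads and writes
# to /data inside the container go straight to the host filesystem,
# bypassing the container's layered copy-on-write filesystem, and
# the data survives when the container is removed.
docker run -d --name asset-ingest \
    -v /mnt/projects/assets:/data \
    studio/ingest-tool:latest
```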

Definitely not a good choice for tool deployment on workstations, though. Setting up a Docker host is an involved task, and container-to-host interoperability is minimal by design. I’d go with native installers and each OS’s standard tools to administer upgrades.