What I like most about the Docker project is the new way it lets you deploy and distribute software. Many times I have been in a situation where I wanted to play with some software and got excited about it, but after reading the installation manual my excitement was totally gone. Non-trivial applications require quite a lot of dependencies: runtimes, libraries, databases.
With Docker, the installation instructions get reduced to something like:
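The original post showed a concrete command here; a typical one-liner of this kind (the image name is just an illustration, not the actual command from the post) looks like:

```shell
# Pull and run a published image in one step; Docker fetches the image
# and all of its baked-in dependencies automatically.
docker run -d -p 8080:8080 someuser/someapp
```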
Simple as that; forget about a missing Java runtime on your server. It suits TCP/HTTP applications perfectly.
While messing around with the Seismo project, I realized I wanted to go exactly the same way. Since it has only a few dependencies for now, MongoDB and NodeJS, this should make it easier for anyone to try it, even if they do not use that setup themselves. I was happy to see that GitHub currently has great Docker support: if you have a repo with a
Dockerfile inside, each time you push code the Docker image gets rebuilt and pushed to the public index.
Here is a Dockerfile that builds an image ready to have Seismo run inside.
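The original Dockerfile listing did not survive; based on the description below (latest Ubuntu, Git, MongoDB, NodeJS, and a clone of Seismo), a minimal sketch of such a Dockerfile might look like this (the repository URL and paths are assumptions, not taken from the post):

```dockerfile
# Sketch reconstructed from the post's description; details are assumptions.
FROM ubuntu:latest

# Install Git, MongoDB and the NodeJS runtime.
RUN apt-get update && \
    apt-get install -y git mongodb nodejs npm

# Clone Seismo into the image and install its dependencies.
RUN git clone https://github.com/seismo/seismo.git /opt/seismo
WORKDIR /opt/seismo
RUN npm install

# Expose the API port (assumed value).
EXPOSE 3000
```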
It’s based on the latest Ubuntu server image, installs Git, MongoDB and the NodeJS runtime, and clones Seismo itself into the image.
But I ran into a problem starting several processes inside the container. Since I need both MongoDB for storage and NodeJS for the API server, both have to be running inside one container. If the shell script just starts one of them,
mongod for example,
node app.js is never executed.
I was a little worried, thinking it was not possible to run more than one process inside a container.
But a solution was found. I created another shell script that starts
mongod as a background process and then starts
node app.js in the foreground.
That worked like a charm.
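A minimal sketch of such a startup script (the file name and flags are assumptions, not quoted from the post) could look like this:

```shell
#!/bin/sh
# start.sh - hypothetical container entrypoint.
# Start mongod in the background so the script can continue,
# then run the Node server in the foreground; as long as the
# foreground process lives, the container stays up.
mongod &
node app.js
```

Keeping exactly one process in the foreground is the key point: Docker considers the container alive only while its main process is running, so the long-lived server belongs in the foreground and auxiliary daemons in the background.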