Remember my success story of moving from a MongoDB full-text search index to ElasticSearch? After about one month of work, our search service unexpectedly stopped. I received a bunch of emails from Logentries and NewRelic saying the server was no longer responsive.
Once I logged in to my DigitalOcean account, I saw a message from the administrators: the server was sending a lot of traffic (a UDP flood), was very likely compromised, and had therefore been stopped. The link they gave as instructions to fix the problem was quite comprehensive, but the most important info was in the comments. A lot of the people hit by the problem had ElasticSearch deployed on their machines.
It turned out that if ES is left with its default configuration and its port open to the outside, it is an easy target for bad guys, due to both a Java and an ES vulnerability. Basically, I had to restore the server from scratch, but this time I was not going to be naive: the server had to be properly secured.
I’ll describe my setup, which involves Node.js, Dokku / Docker, and SSL.
I had to reinstall my machine from scratch, so instead of creating a plain Ubuntu 14.04 server, I decided to go the Dokku/Docker path, something I had tried before and was very happy with. DigitalOcean offers a pre-packed image with Dokku/Docker already on board. As usual, it takes just a few seconds to spin up a machine on DO.
The plan was the following: deploy an ElasticSearch instance inside a Docker container, disable Elastic's dynamic scripting feature, deploy a Node.js-based proxy server with custom authentication via Dokku, and link those containers so that only the Node.js proxy has access to Elastic. Finally, all traffic in between would be encrypted with SSL.
The benefits are obvious: port 9200 is no longer exposed outside the machine, one potential vulnerability (the dynamic scripting feature) is disabled, and only authenticated clients can access the Elastic server.
First we need a Docker image of ElasticSearch. There are a few ES plugins for Dokku, but I decided to set it up myself, since I thought it would be easier to configure. There are great instructions from Docker here.
Once the image is there, we need to prepare a volume where Elastic stores its data and configuration. The Elastic folder should contain an elasticsearch.yml file with all the required configuration. My case is very simple: I have a cluster of one machine, so the default configuration applies to me. The one thing, as I mentioned above, is that I need to disable the dynamic scripting feature. The content of the file is just one line:
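This is the setting for the ElasticSearch 1.x line (later versions renamed it):

```yaml
script.disable_dynamic: true
```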
Once that's done, we are ready to launch the server inside a Docker container. Since it may need to be done a few times (during configuration and debugging), I've created a small shell script:
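The original listing didn't survive, so here is a minimal sketch of such a script, assuming the dockerfile/elasticsearch image from the instructions linked above (the image name, container name, and -Des.config path are assumptions based on that image's README; adjust to your setup):

```shell
#!/bin/sh
# Remove a previous container with the same name, if any.
docker rm -f elastic 2>/dev/null

# Start ElasticSearch: port 9200 bound to localhost only,
# /elastic on the host mapped to /data inside the container.
docker run -d --name elastic \
    -p 127.0.0.1:9200:9200 \
    -v /elastic:/data \
    dockerfile/elasticsearch \
    /elasticsearch/bin/elasticsearch -Des.config=/data/elasticsearch.yml
```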
Please note the important thing: -p 127.0.0.1:9200:9200 binds port 9200 so that it is only accessible on localhost. I had spent some hours trying to close the port with iptables without any success, but this approach works as expected. Thanks a lot to @darkproger and @kkdoo for the great help.
-v /elastic:/data maps the container's /data volume to the local /elastic folder.
Now we need to deploy the front-end proxy server. It proxies all traffic from http://localhost:9200 to the outside world, securely. I've created a small project based on http-proxy, called elastic-proxy. It's very simple and can easily be re-used.
The server itself:
It proxies all requests, letting through only the ones that provide access_token as a query parameter. The access_token value is configured on the server via the application's environment variable.
Since Dokku is already prepared, all you need to do is push the application to your server.
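With Dokku, deployment is a plain git push (the app name elastic-proxy and the host are assumptions; substitute your own):

```shell
git remote add dokku dokku@yourserver.com:elastic-proxy
git push dokku master
```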
After the deployment, go to the server to configure the application's environment:
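Something along these lines, using Dokku's config plugin (the variable name ACCESS_TOKEN is an assumption mirroring the proxy's configuration):

```shell
dokku config:set elastic-proxy ACCESS_TOKEN=your_secret_value
```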
I wanted to have SSL, and it's very easy to configure with Dokku: just place your certificate and key where Dokku expects them.
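In the Dokku versions of that era, this meant copying the certificate and key into the app's tls directory on the server (the path and file names below are assumptions; check the docs for your Dokku version):

```shell
scp server.crt server.key root@yourserver.com:/home/dokku/elastic-proxy/tls/
```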
We need to link both containers, so that the Node.js application is able to access the ElasticSearch container. I really like the dokku-link plugin, which does exactly what's required in a very easy way. So, we'll install it:
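A sketch of the installation, following the usual Dokku plugin convention of the time (the repository URL matches the rlaneve/dokku-link plugin; verify against its README):

```shell
cd /var/lib/dokku/plugins
git clone https://github.com/rlaneve/dokku-link link
dokku plugins-install
```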
Now, we need to link the containers:
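Roughly like this, assuming the dokku-link syntax of `link:create <app> <container> <alias>` (the container name elastic and the alias es are assumptions from my setup; check the plugin's README for the exact syntax):

```shell
dokku link:create elastic-proxy elastic es
```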
Then the application has to be redeployed. If everything is good, you will be able to access the server at https://proxy.yourserver.com?access_token=your_secret_value and see the response from ElasticSearch:
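The response looks roughly like this (an illustrative ES 1.x root response; the exact name, version numbers, and fields depend on your installation):

```json
{
  "status" : 200,
  "name" : "Node-1",
  "version" : {
    "number" : "1.1.0",
    "build_snapshot" : false,
    "lucene_version" : "4.7"
  },
  "tagline" : "You Know, for Search"
}
```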
Depending on the client you use, you may need to apply a little tweak to its configuration. Basically, we need to send the access_token with all our requests. For Node.js applications:
ELASTIC_ACCESS_TOKEN is the environment variable that holds the token value.
Now restart the application, make sure everything is running, and exhale.
PS. I would like to say thanks to DigitalOcean for the support and the little credit I received as downtime compensation. That's awesome.