Mitigating DDoS Attacks Against Your Onion Service
Since October 30th we have been the target of intermittent DoS and DDoS attacks. They first hit our main onion service for a couple of days, then stopped and moved to our main clearnet site, and now they are back on our onion service.
Why would anybody attack us? ¯\_(ツ)_/¯ Your guess is as good as ours.
But it is happening to us and, from what you guys have let us know, it seems to be happening to you too.
First, let's understand how DoS and DDoS attacks work and why they are so extremely hard to mitigate on an onion service.

A Little Information First (For Those Who Are New)
DoS stands for Denial of Service. In a DoS attack, a single individual floods the victim's network or server with so much traffic that legitimate traffic is prevented or severely delayed. The victim's network or server can't process the overwhelming flood fast enough and becomes effectively unavailable.
DDoS stands for Distributed Denial of Service. As with DoS attacks, the goal is to make the victim's service unavailable, but DDoS attacks are more effective because they are launched from many connected devices distributed across the internet rather than a single connection. DDoS is the most prominent form of modern attack: most networks and servers are fast enough to absorb a plain DoS attack, but DDoS attacks are far more severe.
DoS and DDoS attacks are a simple but devastating way to disrupt a victim's service and prevent normal operation.
DoS and DDoS attacks against onion services are a lot harder to mitigate. On a traditional clearnet website you can block IPs, block geographic regions, and do all kinds of other things to counter DoS and DDoS attacks, because each connection carries individual connection information. If the server is receiving tons of requests from a single connection, it can simply drop that connection and prevent it from reconnecting.
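As a sketch of what that looks like on the clearnet side, blocking a single abusive address in Nginx is one line (the address below is a placeholder from the documentation range, not a real attacker):

```nginx
# Clearnet only: refuse all requests from one abusive address.
# 203.0.113.7 is a placeholder IP used purely for illustration.
deny 203.0.113.7;
```

Firewall-level tools like iptables can drop the same traffic even earlier, before it ever reaches the web server.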
But none of the above mitigation methods are possible on an onion service. This is due to a design decision that is simple to explain but fundamental to how onion services work: the server only ever sees a single connection, the Tor daemon running on localhost. This is why 127.0.0.1 shows up as the address of every onion service connection. If the server tries to block that one connection, everybody gets blocked. This is part of the protection onion services provide, because it's impossible to know the IP of who is connecting, but it is also what makes them vulnerable to DoS and DDoS attacks.

What Onion Service Operators Can Do About It
Oh no, your onion service is getting DoSed or DDoSed. Nobody can visit, your server is on fire, and you cried yourself to sleep last night. The time to act was yesterday. While you can't necessarily prevent DoS or DDoS attacks from happening, you can better equip your system to handle sudden traffic spikes and respond accordingly.

Attack Vectors
There are two main attack vectors that DoS and DDoS attacks exploit: overpowering the resources of the server and overpowering the resources of the network. By updating your server and making sure it's performing efficiently, you can reduce the attack surface presented by the server's limited resources. Limited network bandwidth is the main target of DDoS attacks: regardless of performance tweaks, enough traffic will overload the network. Some of the methods below aren't even specific to onion services; they are best practices regardless of your site or system.

1. Be aware of security fixes and update your system accordingly.
Recently Tor was updated with important fixes. The 0.2.8.9 release fixed a pretty bad security hole that allowed remote attackers to crash Tor clients. If you didn't know that, get on the tor-announce mailing list, and maybe the tor-onions mailing list as well. If you are running Tor from the default Debian or Ubuntu repositories, stop. Add the Tor Project's repository to your sources list to make sure you always have the latest version.
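At the time of writing, that means an apt source entry roughly like the following ("jessie" is an example codename; use your own release, and install the Tor Project's signing key as described in their instructions):

```
# /etc/apt/sources.list.d/tor.list -- Debian example; replace "jessie"
# with your distribution's codename.
deb https://deb.torproject.org/torproject.org jessie main
```

After that, `apt-get update && apt-get install tor deb.torproject.org-keyring` keeps you on the project's own builds.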
Last month there was also a local privilege escalation in the Linux kernel (appropriately called Dirty COW). It is of the utmost importance to keep your system continually updated.

2. Your software is most likely not performing as fast as it can.
You might think you simply don't have a powerful enough server to handle the extra spike of traffic. But there is almost always room to improve your service's performance.
One of the biggest upgrades you can make to improve the responsiveness of your service is ditching Apache and going with Nginx. All our servers run on Nginx because Apache would crash under our regular load.
If you are using PHP and it is not PHP 7, upgrade. If you use Nginx, make sure to also use PHP-FPM. Just by upgrading from PHP 5.6 to PHP 7, your server can respond to up to three times more requests while using 30% less memory. When we upgraded, we saw our response time drop by about a third (100-200 ms). That said, some frameworks or software systems are not yet compatible with PHP 7; do some research to see if upgrading is right for you.
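A minimal Nginx-to-PHP-FPM hookup looks roughly like this (the socket path is the Debian default for PHP 7.0 and is an assumption; yours may differ):

```nginx
# Inside a server {} block: hand .php requests to a local PHP-FPM socket.
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    # Socket path is an assumption (Debian default for PHP 7.0).
    fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}
```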
Caching is something everybody should consider. If your service is a forum or anything else that makes a lot of database requests, look into adding a caching engine.
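If you serve PHP through Nginx, its built-in FastCGI cache is one option. A rough sketch follows; the zone name, paths, and timings are arbitrary examples, and you should bypass the cache for logged-in users:

```nginx
# http {} block: define a cache zone on disk.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=sitecache:10m
                   inactive=60m max_size=100m;

# server/location block: serve repeated anonymous page loads from the cache.
fastcgi_cache sitecache;
fastcgi_cache_key "$scheme$request_method$host$request_uri";
fastcgi_cache_valid 200 5m;                       # keep good responses 5 minutes
fastcgi_cache_use_stale error timeout updating;   # ride out backend hiccups
```

Serving hot pages from the cache means a flood of identical requests never touches PHP or the database at all.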
Increasing the number of requests your server can handle is not only good practice but also makes your service able to absorb most DoS attacks outright. Servers are getting faster, and if you aren't using yours efficiently you are doing a disservice to yourself and your users.

3. Tor gradually increases traffic when your site is first introduced
Check out the Tor Project's blog post on why your network bandwidth is not fully used when you first introduce your system to the Tor network. From our experience, this doesn't just affect relays but your onion service itself. If your onion service does file sharing, your first three days up will be your slowest. Over time the allocation will increase, and you should see good network use after a month.
We found that there is a trick that onion service operators can do to increase the allocation quicker than normal. Unfortunately, it's at the expense of operator anonymity.
If you are on the clearnet and just looking to provide an onion service for your users, there is no reason to require three hops to the rendezvous point. Supercharge your Tor experience and reduce the load on the Tor network by running your service in single onion (one-hop) mode. By setting a few configuration options you trade the service's anonymity for blazing fast onion service speed.
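In recent Tor releases (0.2.9 and later) this is the "single onion service" mode; a sketch of the relevant torrc options, with the warning repeated: this exposes the service's location, though clients stay anonymous.

```
# torrc -- single onion service mode (Tor 0.2.9+).
# WARNING: the service's location is no longer hidden; clients stay anonymous.
HiddenServiceNonAnonymousMode 1
HiddenServiceSingleHopMode 1
# A non-anonymous tor instance cannot also be used as a client:
SOCKSPort 0
```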
4. Anti-Bot repellent
Bots are extremely prominent on the internet, and even more so on the darknet. While some operators like having bots crawl their site so they can be indexed by search engines and get organic user growth, the vast majority of DDoS attacks come from bot requests. Some of these bot requests will look exactly like regular users, but some of them won't.
One of the glories of having a single browser that makes everybody look the same (wait for it) is that everybody looks the same. The requests are all the same, the user agent string is the same, everything is the same. So if someone isn't using the Tor Browser to access your onion site, it's pretty easy to identify them and drop their requests.
You do need to think about the repercussions of dropping everyone who isn't using the Tor Browser. In certain cases it isn't appropriate, and you will need to decide for yourself if you want to do it. It also isn't a foolproof solution, since some bots change their requests to look like regular users. But (from our tests) it does help.
Hopefully, you ditched Apache and went with Nginx already. The following Nginx rule drops any request that doesn't come from the Tor Browser's exact user agent. To those clients, it will be as if your site doesn't exist.
# 444 is Nginx-specific: close the connection without sending any response.
if ($http_user_agent != "Mozilla/5.0 (Windows NT 6.1; rv:45.0) Gecko/20100101 Firefox/45.0") {
    return 444;
}
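If you ever need to whitelist more than one exact string (for example while Tor Browser users are mid-upgrade between releases), an Nginx map is tidier than chained ifs; the user agent values here are illustrative:

```nginx
# http {} block: 0 = allowed user agent, 1 = everything else.
map $http_user_agent $block_ua {
    default 1;
    "Mozilla/5.0 (Windows NT 6.1; rv:45.0) Gecko/20100101 Firefox/45.0" 0;
}

# server {} block: silently close anything not on the whitelist.
if ($block_ua) {
    return 444;
}
```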
If you want to go one step further you can also block tor2web gateways. Some bots use tor2web gateways to launch attacks when they don't have Tor on their system.
# tor2web gateways set an X-Tor2web header on the requests they relay.
if ($http_x_tor2web) {
    return 400;
}
Of course, these rules are only a form of whitelisting and will not stop all bot attacks. They will, however, stop most lower-end bots, which reduces the overall load. This isn't a clear-cut solution for all services either, so you will need to decide what is best for your case. With the two code blocks above, your service will effectively drop any connection that doesn't come from the Tor Browser (or anything that looks exactly like it).

5. Look at your Logs
Logs are very valuable. If you are getting continual errors from your website or background daemons (e.g. PHP, Ruby), error logs are a gold mine. Not only do they let you pinpoint deficiencies in your software, they also provide insight into things that could be configured better.

What not to do
When sites get attacked, operators start to get desperate, looking for makeshift ways to reduce the load. But most times these "solutions" only add to the load, helping the attacker. Here are a couple of things you should not do on an onion service.

1. Rate limiting
Remember, the purpose of a DoS or DDoS attack is to make the site unavailable. If you start rate limiting connections to the site, you are only helping the attacker accomplish his goal.
This goes back to making sure your server can serve as many requests as possible; the more of the attack the server can absorb, the better. The problem only gets worse once your site starts showing the effects of the DoS or DDoS. You don't want to purposely help the attacker accomplish his goal, so don't rate limit.

2. Code solutions to an overload problem
This one is a little tricky, because sometimes code solutions can help. Something like a captcha can weed out attacking bot requests without much effect on user experience. Code that protects heavy server-side processes (login, posting, etc.) can have positive effects and potentially stop attacks.
But most of these solutions don't work against DoS or DDoS attacks, because the attacker is just throwing traffic at the server to overload it, regardless of what it returns. When considering a code solution to an overload problem, make sure the solution's own overhead doesn't increase the overload it was meant to decrease.

Conclusion
DoS and DDoS attacks can happen at any time, so it is important to prepare your service to counteract or absorb as much of an attack as possible. In some cases nothing you do will be enough, but in others it might keep your service alive to fight another day.
No matter what anybody says, there isn't a clear-cut anti-DoS solution for every use case. DoS is hard to handle on its own, and even more so when you can't distinguish between individual clients.
The Tor Project is working on ways to reduce the effects that DoS or DDoS attacks have on onion services. We wish them luck on this endless struggle of betterment.
TL;DR: All connections coming down one pipe makes it hard to block or limit requests, always update your system, your software is a bottleneck and can always be improved to handle more requests, Tor is a shy girl and limits bandwidth when you first meet, Bots are bad and you can do something about it, logs are good information resources, it's bad to rate limit on an onion service, and more code is almost always a bad solution to an overload problem.
Author: CodeFate
You really don't want to know about me.