Tor to Combat Malicious Node Problem
The discovery of more than a hundred malicious nodes has prompted the Tor Network to develop a new design to fight this ongoing problem.
Developer Sebastian Hahn said that code to address the issue has already been written and that a release date is being determined. The Tor Network has said that the attacks do not unmask the operator behind a hidden service, something the law enforcement community has been trying to accomplish for some time now.
Amirali Sanatinia, a PhD student in computer science at Northeastern University, is responsible for the discovery. Together with his professor at the College of Computer and Information Science, Guevara Noubir, he is set to present their paper next week at DEF CON.
The paper, "HOnions: Towards Detection and Identification of Misbehaving Tor HSDirs", describes a framework called honey onions, or HOnions, that Sanatinia and Noubir developed to identify these malicious HSDirs.
The two launched the framework in daily, weekly, and monthly runs from Feb. 12 to April 24 and found exactly 110 malicious nodes, most of which were hosted in the US; others were found in Germany, France, the Netherlands, and the UK. They exposed Tor relays with HSDir capabilities that had been set up to snoop on hidden services. Tor estimates that there are currently around 3,000 HSDirs in its network.
"What the attack allows you to do is to learn about the existence of a hidden service. This does not mean that the identity of the operator is revealed or anything catastrophic like that," Hahn said.
"Noubir and Sanatinia's attack essentially snoops hidden service's metadata and tells the attacker that a service exists and when it's available," Hahn added.
In an interview, Noubir said that the hidden service directories they found to be malicious could be run by researchers studying the dark web, or by law enforcement or other state agencies as part of investigations.
"At this stage, hard to tell who is doing what. What we could see is there is some diversity in what they are doing. Some are attacking these hidden services, or in some way collecting information about them," Noubir said in his interview.
More than 70 percent of the malicious directories they discovered are hosted on cloud infrastructure, and a quarter of them are also exit nodes. Hosting the services on cloud infrastructure makes it difficult to find out who is behind them: the servers are paid for in bitcoin and have no contact information attached to them.
Hahn reported that the number of exit nodes found in this research is alarming, and is most likely an indicator that the operators didn't take the necessary care, as being an exit node is the default configuration.
"The way we're working on it for the future is by using a stronger cryptographic protocol that does not allow the Tor servers involved in the regular operation of the network to see a portion of the metadata about hidden services," Hahn said.
Noubir commented that the snoopers are trying to learn information about services, such as the .onion.market address and where it is being operated. Their paper explains how an attacker could use the information gained to build a list of targets for further attacks.
"They're trying to look inside the .onion.market and carry out user enumeration or run cross-site scripting attacks, typical attacks you'd see against regular websites which are more interesting in context. If you're running a hidden service, you don't want to be discovered," Noubir said.
The paper describes one snooping directory as sending hourly queries for status updates to an Apache server, of the kind provided by Apache's mod_status module. Other probes were executed using SQL injection, path traversal attacks, and XSS.
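To make those probe categories concrete, here is a small, hypothetical classifier for logged request paths. The signature patterns are illustrative examples of what mod_status queries, SQL injection, path traversal, and XSS probes commonly look like; they are not taken from the paper and real detection would need a much richer ruleset.

```python
# Illustrative signatures only -- not the authors' actual detection rules.
import re

PROBE_SIGNATURES = {
    # Apache's mod_status exposes a status page at /server-status
    "apache_mod_status": re.compile(r"/server-status"),
    # A quote followed by OR/UNION/comment is a classic SQLi fingerprint
    "sql_injection": re.compile(r"('|%27)\s*(or|union|--)", re.I),
    # Attempts to climb out of the web root
    "path_traversal": re.compile(r"\.\./"),
    # Raw or URL-encoded script tags injected into parameters
    "xss": re.compile(r"<script|%3Cscript", re.I),
}

def classify_probe(request_path: str) -> list:
    """Return the names of all signatures matching a logged request path."""
    return [name for name, pattern in PROBE_SIGNATURES.items()
            if pattern.search(request_path)]
```

For example, a logged request for `/server-status?auto` would be flagged as a mod_status query, while `/a/../../etc/passwd` would be flagged as path traversal.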
For their investigation, Noubir and Sanatinia used HOnions to detect these malicious nodes. They ran 1,500 at a time on the daily, weekly, or monthly schedules, each answering to a locally running process that logged visits by the HSDirs. In the paper, they report that most of the roughly 40,000 visits logged were automated and queried the root path of the server, but that they also detected manual probing in around 20 percent of the requests. Some snoopers would not visit a service right away after it was set up, in order to avoid detection.
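The core idea behind a honey onion is simple: serve a bare page whose address is never published anywhere except to the HSDirs under test, so any visit at all implicates a snooping directory. The sketch below (not the authors' actual code, and omitting the Tor-specific plumbing that would publish the address as a hidden service) shows such a local logging process.

```python
# Minimal honey-onion logging sketch. Assumption: the onion address
# pointing at this server is shared only with the HSDirs being tested,
# so every request that arrives here is evidence of snooping.
import http.server
import threading
import time

visit_log = []  # each entry: (timestamp, request path, user agent)

class HoneyOnionHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the visit before serving an innocuous placeholder page.
        visit_log.append((time.time(), self.path,
                          self.headers.get("User-Agent", "")))
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>nothing to see here</body></html>")

    def log_message(self, fmt, *args):
        pass  # silence the default console logging

def serve_in_background(port=0):
    """Start the honey-onion server on a background thread.

    port=0 lets the OS pick a free port; the bound port is available
    afterwards as server.server_address[1].
    """
    server = http.server.HTTPServer(("127.0.0.1", port), HoneyOnionHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    return server
```

In a real deployment, each of the 1,500 honey onions would map to a process like this one, and the logged timestamps and paths would later be analyzed to separate automated root-path queries from manual probing.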