Information Security News
by Sean Gallagher
WASHINGTON, DC—For years, the government and security experts have warned of the looming threat of "cyberwar" against critical infrastructure in the US and elsewhere. Predictions of cyber attacks wreaking havoc on power grids, financial systems, and other fundamental parts of nations' fabric have been repeated over the past two decades, each round more dire than the last. The US Department of Energy declared in its Quadrennial Energy Review, just released this month, that the electrical grid in the US "faces imminent danger from a cyber attack."
So far, however, the damage done by cyber attacks, both real (Stuxnet's destruction of Iranian uranium enrichment centrifuges and a few brief power outages alleged to have been caused by Russian hackers using BlackEnergy malware) and imagined or exaggerated (the Iranian "attack" on a broken flood control dam in Rye, New York), cannot begin to measure up to an even more significant cyber-threat—squirrels.
That was the message delivered at the Shmoocon security conference on Friday by Cris "SpaceRogue" Thomas, former member of the L0pht Heavy Industries hacking collective and now a security researcher at Tenable. In his presentation—entitled "35 Years of Cyberwar: The Squirrels Are Winning"—SpaceRogue revealed the scale of the squirrelly threat to worldwide critical infrastructure by presenting data gathered by CyberSquirrel1, a project that collects information on animal-induced infrastructure outages from sources on the Internet.
Last week, Xavier published a great diary about the dangers of leaving behind backup files on your web server. There are a few different ways to avoid this issue, and as usual, defense in depth applies: one should consider multiple controls to prevent these files from hurting you. Many approaches blacklist specific extensions, but as always with blacklists, this is dangerous because it may miss some files. For example, different editors use different extensions to mark backup files. Emacs (yes... I am an Emacs fan) not only leaves a backup file with a ~ appended at the end; if you abort the editor, it may also leave a second file with a # prefix and suffix.
For all these reasons, it is nice if you can actually whitelist the extensions that are required for your application.
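To illustrate why a blacklist falls short, here is a quick demonstration with made-up file names. Emacs leaves both a file~ backup and a #file# auto-save file, so a filter that only catches the trailing ~ misses half the problem:

```shell
# Demo only: the file names and temp directory are made up.
demo=$(mktemp -d)
touch "$demo/index.html" "$demo/index.html~" "$demo/#index.html#"

# A blacklist that only filters the trailing '~' ...
ls "$demo" | grep -v '~$'
# ... still lists '#index.html#', the Emacs auto-save leftover.
```

A whitelist sidesteps this: anything not explicitly allowed is rejected, whatever its naming convention.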
As a first step, enumerate what file extensions are in use on your site (I am assuming that /srv/www/html is the document root):
find /srv/www/html -type f | sed 's/.*\.//' | sort | uniq -c | sort -n
     19 html~
     20 css
     20 pdf
     23 js
     50 gif
     93 html
    737 png
   3012 jpg
As you see in the abbreviated output above, most of the extensions are what you would expect from a normal web server. We also find a few Emacs backup HTML files (html~).
We will set up a simple text file goodext.txt with a list of all allowed extensions. This file will then help us create the Apache configuration, and we can use it for other configuration files as well (does anybody know how to do this well in mod_security?). The output of the command above can be used to get us started, but of course, we have to remove extensions we don't want to see.
find . -type f | sed 's/.*\.//' | sort -u > ~/goodext.txt
Next, let's run a script to delete all the files that do not match these extensions. I posted a script that I have used in the past on GitHub.
The script does use the goodext.txt file we created above. The first couple of lines can be used to configure it. Of course, make a backup of your site, and run the script in debug mode first to see which files would be deleted!
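The idea behind the script can be sketched in a few lines of bash. This is only an illustration, not the actual script from GitHub: the variable names, the demo document root, and the DEBUG switch are assumptions. Point DOCROOT and GOODEXT at your real paths instead of the throwaway demo tree built here.

```shell
#!/bin/bash
# Sketch only -- not the actual script from GitHub.
# For illustration this builds a throwaway demo docroot;
# use your real paths for DOCROOT and GOODEXT.

DOCROOT=$(mktemp -d)              # stand-in for /srv/www/html
GOODEXT=$DOCROOT/goodext.txt      # one allowed extension per line
DEBUG=1                           # 1 = only print, do not delete

printf 'html\ncss\n' > "$GOODEXT"                                # demo whitelist
touch "$DOCROOT/index.html" "$DOCROOT/index.html~" "$DOCROOT/style.css"

# Turn the extension list into an alternation like "html|css"
pattern=$(paste -sd'|' "$GOODEXT")

# Flag every file whose name does not end in an allowed extension
find "$DOCROOT" -type f ! -name goodext.txt |
grep -Ev "\.($pattern)\$" |
while read -r f; do
    if [ "$DEBUG" -eq 1 ]; then
        echo "would delete: $f"
    else
        rm -- "$f"
    fi
done
```

In the demo above, only index.html~ is flagged for deletion; index.html and style.css match the whitelist and are kept.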
Next, we create an Apache configuration file. Currently, the script only works for Apache 2.2. Apache 2.4 changed the syntax somewhat, and I still need to test whether the order of the directives needs to change. Include it as part of the Directory section of the configuration file:
Order allow,deny
Allow from all
Include www.goodext
(I don't name the extension file .conf, so it is not included automatically, but only in this one specific spot.)
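To give an idea of what the generated include might contain, here is a sketch that builds www.goodext from the extension list. The FilesMatch regex with a negative lookahead is my assumption of one way to deny anything whose final extension is not whitelisted (Apache's PCRE-based FilesMatch supports it); the file the actual GitHub script generates may differ in detail, and the demo extension list here is made up.

```shell
# Sketch: generate the www.goodext include from the extension list.
# The regex is an assumption, not necessarily what the real script emits.

workdir=$(mktemp -d) && cd "$workdir"          # demo only
printf 'html\ncss\njpg\npng\n' > goodext.txt   # stand-in for ~/goodext.txt

pattern=$(paste -sd'|' goodext.txt)

# Deny any file whose final extension is not on the whitelist.
# "[^.]*$" anchors the check to the last extension, so names like
# jquery.min.js are judged by "js", not "min.js".
cat > www.goodext <<EOF
<FilesMatch "\.(?!($pattern)\$)[^.]*\$">
    Order allow,deny
    Deny from all
</FilesMatch>
EOF

cat www.goodext
```

Note that files with no extension at all (e.g. README) are not matched by this regex and stay accessible; depending on your site, you may want a separate rule for those.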
The two rather simple bash scripts, one to delete the bad files and one to create the Apache configuration file, can be found here: https://github.com/jullrich/fixbadwebfiles
Why use a script for this vs. just editing the files manually? Your site will change over time, and a script lets you regenerate the whitelist and the configuration file whenever files are added or removed.
Note that the scripts are strictly in a "works for me" state. Any bug reports and comments are welcome (please use GitHub for bug reports).