Last year, over the Thanksgiving break, Justin Henderson and I worked on a tool to provide a web API interface for another tool I released last year called freq.py. freq.py is used to identify randomized DNS names used by malware. Providing a web API allows a Security Information and Event Management (SIEM) system to automatically score character frequency data from a variety of log sources. So freq_server.py was born. Since then it has been great to see different organizations using it and finding malware in their domains. This Thanksgiving, Justin contacted me again with a new project, and I was immediately intrigued.

This time Justin wanted to automatically score phishing domains based upon the "born on" date of the domain registration. Justin told me about some really great techniques that organizations can use to identify potential threats if you can query the data at the speed of a SIEM. The difficulty is in processing the huge number of records that a SIEM collects without rudely overwhelming the whois servers. The system has to be quick, and we need to cache the data for frequently queried domains. Python makes building these types of interfaces quick and easy. Justin told me what he wanted, and a few hours later I sent him a working prototype. The tool, domain_stats.py, is available for download and integration into your SIEM now. Download a copy here. Check out SANS SEC573 to learn how to quickly develop programs like this on your own. There are two opportunities to take the Python course from me in the near future: come see me in London, or come see me in Orlando, Florida at SANS 2017.
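
To give a feel for the interface, here is what a query against a running instance looks like (the host, port, and URL layout here are illustrative assumptions, and the exact output format may differ):

curl http://127.0.0.1:8000/domain/creation_date/example.com
1995-08-14 04:00:00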

So what can you do with this new tool? Well, it was Justin's idea, so I'll let him tell you. Take it away, Justin!

Thanks, Mark. We live in a day where data is everywhere. The security community is constantly stepping up in new ways to defend and protect this data, constantly inventing new techniques, tools, and processes. Yet some good techniques have been lost due to lack of publication or an inability to easily apply the technique at scale.

One such technique is using WHOIS information, specifically creation dates. Many who came before me have mentioned that companies typically do not do business with newly registered domains. I dub these "baby domains."

Running a simple whois query kicks back lots of information, including a creation date. The problem: performance. Each run of whois varies greatly. In fact, in my testing it would vary between about 0.50 seconds and 10 seconds, which, if run against millions or billions of domains, would not be able to keep up. Then there is Mark Baggett's domain_stats.py, which solves the performance problem by caching results, so lookups for frequently seen domains return quickly enough to keep up.
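
For comparison, this is what the slow, manual version of the lookup looks like (the grep pattern assumes the ICANN-style "Creation Date:" field; some registries label it differently):

whois example.com | grep -i 'creation date'
   Creation Date: 1995-08-14T04:00:00Z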

The number returned by the /alexa check, if using an older version of the Alexa top 1 million, is the site's rank. Just remember that instead of using an old Alexa file you can also generate a custom list of most frequently accessed sites, known good sites, or even known bad sites. This can then be used to tag domains in order to skip certain checks that are more time consuming, such as WHOIS lookups, or to perform other tasks/logic.
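
Assuming the same local instance (the port and URL layout are, again, illustrative), a whitelist check looks like this: a listed domain returns its rank, and 0 means the domain is not on the list:

curl http://127.0.0.1:8000/alexa/google.com
1
curl http://127.0.0.1:8000/alexa/newly-registered-domain.com
0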

So how would you put this all together? I regularly use domain_stats.py by invoking it from SIEM products. For example, I take incoming DNS logs and use Logstash, a log aggregator and parser, to query domain_stats.py. If a DNS log matches certain query types, such as an A record or MX record, then Logstash applies the following logic:

Step 1 - Does the query match an internal domain name? If yes, no additional processing is required. If no, move to Step 2.

Step 2 - Pass the domain to domain_stats.py using /alexa. If a nonzero result is returned, the domain matches the whitelist and no additional processing is required. If 0 is returned, it is not a well-known domain. Move to Step 3.

Step 3 - Pass the domain to domain_stats.py using /domain/creation_date. Store the creation date for manual analysis. (I then use a dashboard/report to display baby domains that are less than 90 days old. A shell sketch of this triage appears below.)
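
Here is a minimal shell sketch of the three-step triage for a single domain. In production this logic lives in the SIEM (Logstash, in my setup); the domain_stats.py address, URL layout, and internal-domain pattern below are all assumptions for illustration:

#!/bin/bash
# triage.sh <domain> - sketch of the three-step domain triage
DS=http://127.0.0.1:8000   # assumed domain_stats.py address
DOMAIN="$1"

# Step 1: internal domains need no further processing
case "$DOMAIN" in
  *.example.internal) echo "internal, skipping"; exit 0 ;;
esac

# Step 2: whitelist check; a nonzero rank means well-known
RANK=$(curl -s "$DS/alexa/$DOMAIN")
if [ "$RANK" != "0" ]; then
  echo "whitelisted (rank $RANK), skipping"
  exit 0
fi

# Step 3: fetch and record the creation date for analysis
CREATED=$(curl -s "$DS/domain/creation_date/$DOMAIN")
echo "$DOMAIN created $CREATED"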

The ultimate deliverable is a list of baby domains being accessed, along with knowledge of which systems are possibly engaging with them. This could provide early detection of emails coming in from phishing domains or of end users accessing phishing websites. When combined with other techniques such as fuzzy phishing (https://www.hasecuritysolutions.com/blog/fuzzy-phishing), catching phishing domains becomes much more likely and, quite frankly, a lot more fun.

A big shout out to Mark Baggett for his awesome Python scripting skills that enabled this. While written in Python, it has proven through repeated testing to outperform other solutions. I encourage everyone to consider using domain_stats.py for WHOIS queries (such as for creation dates) or its whitelisting capabilities.

Follow Justin Henderson @securitymapper

Follow Mark Baggett @MarkBaggett

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.
 
 
 

Beware its furry cyber-wrath. (credit: Washington State)

WASHINGTON, DC—For years, the government and security experts have warned of the looming threat of "cyberwar" against critical infrastructure in the US and elsewhere. Cyber attacks wreaking havoc on power grids, financial systems, and other fundamental parts of nations' fabric have been foretold repeatedly over the past two decades, and each round of predictions has become more dire. The US Department of Energy declared in its Quadrennial Energy Review, just released this month, that the electrical grid in the US "faces imminent danger from a cyber attack."

So far, however, the damage done by cyber attacks, both real (Stuxnet's destruction of Iranian uranium enrichment centrifuges and a few brief power outages alleged to have been caused by Russian hackers using BlackEnergy malware) and imagined or exaggerated (the Iranian "attack" on a broken flood control dam in Rye, New York), cannot begin to measure up to an even more significant cyber-threat—squirrels.

That was the message delivered at the Shmoocon security conference on Friday by Cris "SpaceRogue" Thomas, former member of the L0pht Heavy Industries hacking collective and now a security researcher at Tenable. In his presentation—entitled, "35 Years of Cyberwar: The Squirrels Are Winning"—SpaceRogue revealed the scale of the squirrelly threat to worldwide critical infrastructure by presenting data gathered by CyberSquirrel 1, a project that gathers information on animal-induced infrastructure outages collected from sources on the Internet.


 
 
Oracle Java SE CVE-2014-6493 Remote Security Vulnerability
PHP 'bzread()' Function Out of Bounds Remote Code Execution Vulnerability
Oracle Java SE CVE-2014-6468 Local Security Vulnerability
[SECURITY] CVE-2016-8748: Apache NiFi XSS vulnerability in connection details dialogue
IBM Kenexa LMS on Cloud CVE-2016-8930 Unspecified SQL-Injection Vulnerability
Multiple AttacheCase Products CVE-2016-7843 Directory Traversal Vulnerability
GStreamer Good Plug-ins Incomplete Fix CVE-2016-9808 Buffer Overflow Vulnerability
AttacheCase CVE-2016-7842 Directory Traversal Vulnerability
IBM Kenexa LMS on Cloud CVE-2016-8928 Unspecified SQL-Injection Vulnerability
IBM Kenexa LMS on Cloud CVE-2016-5942 Unspecified Cross-Site Scripting Vulnerability
RETIRED: Symantec Norton Download Manager DLL Loading Remote Code Execution Vulnerability
LibTIFF CVE-2016-5317 Out Of Bounds Write Denial of Service Vulnerability
Matroska libEBML CVE-2016-1514 Information Disclosure Vulnerability
RETIRED: Matroska libEBML CVE-2015-8790 Information Disclosure Vulnerability
WordPress Prior to 4.7.1 Cross Site Scripting Vulnerability
HP Diagnostics Cross Site Scripting and Click Jacking Vulnerabilities
[security bulletin] HPSBST03671 rev.2 - HPE StoreEver MSL6480 Tape Library Management Interface, Multiple Remote Vulnerabilities
[security bulletin] HPSBGN03689 rev.1 - HPE Diagnostics, Remote Cross-Site Scripting and Click Jacking
[SECURITY] [DSA 3765-1] icoutils security update
[SECURITY] [DSA 3743-2] python-bottle regression update
 

Last week, Xavier published a great diary about the dangers of leaving behind backup files on your web server. There are a few different ways to avoid this issue, and as usual, defense in depth applies: one should consider multiple controls to prevent these files from hurting you. Many approaches blacklist specific extensions, but as always with blacklists, this is dangerous because it may miss some files. For example, different editors use different extensions to mark backup files, and Emacs (yes... I am an Emacs fan) may not only leave a backup file by appending a ~ at the end (index.html~), but it may also leave a second file with a # prefix and postfix (#index.html#) if you abort the editor.

For all these reasons, it is nice if you can actually whitelist the extensions that are required for your application.

As a first step, enumerate what file extensions are in use on your site (I am assuming that /srv/www/html is the document root):

find /srv/www/html -type f | sort | sed 's/.*\.//' | sort | uniq -c | sort -n
     19 html~
     20 css
     20 pdf
     23 js
     50 gif
     93 html
    737 png
   3012 jpg

As you see in the abbreviated output above, most of the extensions are what you would expect from a normal web server. We also got a few Emacs backup HTML files (html~).

We will set up a simple text file, goodext.txt, with a list of all allowed extensions. This file will then help us create the Apache configuration, and we can use it for other configuration files as well (anybody know how to do this well in mod_security?). The output of the command above can be used to get us started, but of course, we have to remove extensions we don't want to see.

find . -type f | sort | sed 's/.*\.//' | sort -u > ~/goodext.txt
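
After removing the unwanted html~ entry, the goodext.txt for the site above would contain just one extension per line:

css
gif
html
jpg
js
pdf
png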

Next, let's run a script to delete all the files that do not match these extensions. I posted a script that I have used in the past on GitHub.

The script uses the goodext.txt file we created above. The first couple of lines can be used to configure it. Of course, run it in debug mode first to see which files will be deleted, and make a backup of your site first!
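
This is not the script from GitHub, just a minimal sketch of the same idea, assuming the paths used above (note it treats files without any extension as bad, which is the conservative whitelist behavior):

#!/bin/bash
# Sketch: remove files whose extension is not whitelisted in goodext.txt
DEBUG=1                      # set to 0 to actually delete
DOCROOT=/srv/www/html
GOODEXT="$HOME/goodext.txt"

find "$DOCROOT" -type f | while read -r f; do
    ext="${f##*.}"           # everything after the last dot
    # -x: match whole line, -F: fixed string, -q: quiet
    if ! grep -qxF "$ext" "$GOODEXT"; then
        if [ "$DEBUG" -eq 1 ]; then
            echo "would delete: $f"
        else
            rm -- "$f"
        fi
    fi
done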

Next, we create an Apache configuration file. Currently, the script only works for Apache 2.2. Apache 2.4 changed the syntax somewhat, and I need to test whether the order of the directives needs to change. Include it as part of the Directory section of the configuration file:

Order allow,deny
Allow from all
Include www.goodext

(I don't name the extension file .conf, so it will not be included automatically but only in this one specific spot.)
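
As a rough illustration only, not necessarily what the script generates: a www.goodext file could enforce the whitelist with a single FilesMatch block using a negative lookahead, so that any file whose extension is not on the list is denied. Under Order allow,deny, a matching Deny from all overrides the Allow from all above:

<FilesMatch "\.(?!(html|css|js|png|jpg|gif|pdf)$)[^.]+$">
    Deny from all
</FilesMatch>

Files without any extension contain no dot and would not match this pattern; the cleanup script above is what takes care of those.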

The two rather simple bash scripts, one to delete the bad files and one to create the Apache configuration file, can be found here: https://github.com/jullrich/fixbadwebfiles

Why use a script for this vs. just editing the files manually?

  1. Typos
  2. Faster if you have multiple servers
  3. There are two kinds of sysadmins: those that script, and those that will be replaced by a script.

Note that the scripts are strictly in the "works for me" state. Any bug reports and comments are welcome (use GitHub for bugs).

---
Johannes B. Ullrich, Ph.D.
STI|Twitter|LinkedIn

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.
 