Tuesday, March 8, 2011

Poor Man's Malware RSS Feed

A customer of mine needed to be fed a list of well-known malware sites on a daily basis so they could load it into their web inspection proxy. They operated on a blacklist-style filtering strategy, as opposed to the preferred whitelist methodology.

I figured a quick and dirty way would be to create a daily mail reminder for them with a list of sites identified as dirty over the last 24 hours. I chose to pull from some well-known malware research feeds, such as the ones below:

https://spyeyetracker.abuse.ch/blocklist.php?download=domainblocklist
https://zeustracker.abuse.ch/blocklist.php?download=domainblocklist
http://isc.sans.edu/feeds/suspiciousdomains_High.txt
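The script below reads its download targets from /root/Malware/website-urls-to-download.txt via wget's -i option. That file isn't shown in the post, but it would presumably just contain the feed URLs above, one per line:

https://spyeyetracker.abuse.ch/blocklist.php?download=domainblocklist
https://zeustracker.abuse.ch/blocklist.php?download=domainblocklist
http://isc.sans.edu/feeds/suspiciousdomains_High.txt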


The script I came up with goes out, rips down the data from these lists, and stores it in a file. Every 24 hours (when combined with cron) it repeats this and then emails the recipient any sites that are new since the last run.

#!/bin/bash
DATE=$(date +%m-%d-%Y)

# Let's grab the daily malware lists and save them into a single sorted file...
wget -i /root/Malware/website-urls-to-download.txt --no-check-certificate -O /root/Malware/malwaresites-blacklist-urls.txt
wget http://www.malwaredomainlist.com/hostslist/hosts.txt -O /root/Malware/bad-format-urls.txt
# The hosts-file feed prefixes each domain with "127.0.0.1  ", so cut away
# the first 11 characters to keep only the domain itself.
cut -c12-100 /root/Malware/bad-format-urls.txt >> /root/Malware/malwaresites-blacklist-urls.txt
# Strip trailing whitespace, then sort and de-duplicate into today's list.
sed 's/[ \t]*$//' /root/Malware/malwaresites-blacklist-urls.txt | sort -u > "/root/Malware/evil-url-list-$DATE.txt"

# We take the sorted file and look for any new URLs; if any appear, we mail the changes out.
# comm needs the previous scan file to exist, so seed an empty one on the first run.
touch /root/Malware/previous-url-scan.txt
if [ -e "/root/Malware/evil-url-list-$DATE.txt" ]; then
    comm -13 /root/Malware/previous-url-scan.txt "/root/Malware/evil-url-list-$DATE.txt" > "/root/Malware/evil_urls_for_$DATE.txt"
    sendEmail -f malware_police@somedomain.com -t someone@somedomain.com -u "$DATE Daily Evil URL BlockList" -o message-file="/root/Malware/evil_urls_for_$DATE.txt" -s ex-smtp.somedomain.com:25
fi

# Once we are done with today's sorted file, we save it for comparison on the next run.
ln -sf "/root/Malware/evil-url-list-$DATE.txt" /root/Malware/previous-url-scan.txt
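The comm -13 invocation prints only the lines unique to the second file, i.e., URLs present in today's list but not in the previous scan; both inputs have to be sorted, which the sort step above guarantees. A quick illustration with made-up domains:

$ printf 'evil-a.example\nevil-b.example\n' > old.txt
$ printf 'evil-a.example\nevil-b.example\nevil-c.example\n' > new.txt
$ comm -13 old.txt new.txt
evil-c.example

To get the every-24-hours behavior, drop the script into cron. Assuming it is saved as /root/Malware/malware-feed.sh (the name and run time here are just examples), the crontab entry would look like:

0 6 * * * /root/Malware/malware-feed.sh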



This was done on the BackTrack 4 Linux distribution.
