chickencode: (Default)
The bulk of my day job is actually analyzing phish, phishkits, drop scripts, etc. Lately we have run into an issue where phishing campaigns only accept local IPs viewing the phishing content, blocking out everything else via .htaccess.
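For illustration, the blocking rules in such a kit's .htaccess typically look something like the sketch below. The address ranges here are documentation prefixes made up for the example; a real kit lists netblocks belonging to the targeted country.

```apache
# Apache 2.2-style access rules: deny the whole world,
# then allow only the campaign's target region back in.
Order Deny,Allow
Deny from all
# hypothetical example prefixes; real kits use the victim country's netblocks
Allow from 203.0.113.
Allow from 198.51.100.
```

Anyone outside the allowed ranges (like an analyst on a corporate network) just gets a 403, which is exactly the problem the utility below works around.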

For this reason I wrote a little utility that lets us check whether we get any kind of response from the phish depending on the geographic location of a proxy connection.


echo " "
echo "------------------------------"
echo " GeoBlocked? "
echo "------------------------------"
echo " "
echo "Enter proxy list file name, if not in same directory provide full path: "
read LIST
echo "Enter URL to see if it's being geoblocked"
read URL
echo " "
echo "Checking status of: $URL. This could take some time"
echo " "
echo " "

PROXY="$(< "$LIST")"
red="$(tput setaf 1)"
green="$(tput setaf 2)"
reset="$(tput sgr0)"

url_check() {
    export http_proxy="http://$1"

    status="$(curl --max-time 15 --connect-timeout 15 -s -o /dev/null -I -w '%{http_code}' "$URL")"
    # NOTE: the geolocation endpoint was missing from the original post;
    # $GEOIP_API stands in for whichever CSV-returning IP lookup service is in use.
    country="$(curl -s "$GEOIP_API" | sed -n 's|.*,\(.*\)|\1|p')"
    UP="${green}$1 - URL IS UP - $country${reset}"
    DOWN="${red}$1 - URL IS DOWN - $country${reset}"
    TIMEOUT="${red}$1 - Proxy connection took too long${reset}"

    case "$status" in
        200|201|202|203|204) echo "$UP";;
        400|401|402|403|404|500|501|503) echo "$DOWN";;
        *) echo "$TIMEOUT";;
    esac

    unset http_proxy
}

for i in $PROXY; do
    url_check "$i"
done
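The proxy list file the script reads is assumed to be one proxy per line in host:port form, something like this (these are documentation addresses, not real proxies):

```
203.0.113.10:8080
198.51.100.25:3128
192.0.2.77:1080
```

Loading up the list with proxies from different countries is what makes the geoblock visible: the same URL comes back UP through some exits and DOWN or timed out through others.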

chickencode: (Default)
For those that do not know, I make my living as a "cyber security analyst." Every day we recover tons of URLs from phishing attacks that have base64-encoded user data attached to them, typically login information and, more severely, full credit card information.

This is what they look like.

These base64 credentials really added up day to day and took quite a while to process, as our then-current way of recovering them was to parse the base64 out ourselves and run the strings individually through an online decoder that could only take one string at a time. Since I strictly run Linux on my workstation, I would avoid that tool altogether and just run them all in a for loop in bash.

for i in "base64_string1" "base64_string2" "base64_string3"; do
    echo "$i" | base64 -d
done

This method, while faster and able to take multiple strings, still had me manually parsing the base64 data out of the links myself. I knew I could devise a way to do everything programmatically. Inspired, I set off to build the simplest possible tool to use: fast, able to handle multiple credential links at one time, and needing no human intervention besides copy and paste.
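As a rough sketch of that programmatic approach in the same bash style, the base64 payload can be pulled straight out of the URLs and decoded in one pass. The `data=` parameter name and the sample URLs are made up for illustration; real campaigns use varying parameter names.

```shell
# Hypothetical credential URLs; the hostnames and payloads are fabricated.
urls='http://bad.example/login.php?data=dXNlcjpwYXNz
http://bad.example/verify.php?data=Y2FyZDoxMjM0'

# Grab each base64-looking parameter value and decode it.
echo "$urls" | grep -o 'data=[A-Za-z0-9+/=]*' | cut -d'=' -f2- | while read -r b64; do
    echo "$b64" | base64 -d
    echo
done
```

Running that over the two sample URLs prints the decoded `user:pass` and `card:1234` strings, which is essentially what the GUI tool automates.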

So I came up with an approach of copying credential URLs > pasting > clicking a button > success.

This last click step would require a GUI, of course, so I set off to find a framework I could build on, and since I would rather not work in Java, this led me to GTK in Python or Qt in C/C++. Since I am a far better C programmer, I went with the latter and found Qt to be an amazing framework for getting up to speed and building a product quickly.

After a few hours of playing around and trying different things, I had a working application that did everything I needed flawlessly. Just for the hell of it, and because it was super easy to implement, I also added a raw base64 multi-string decoder and a clear-fields button.

My Qt C++ app that parses and decodes base64 data from URLs

It works great in the field; I use it every day now. My boss loves it, though now he wants me to rewrite it in PHP since that's what all of our other in-house tools are written in. So it looks like I'll be becoming familiar with programming in that language soon, too.


Page generated Oct. 19th, 2017 07:14 am
Powered by Dreamwidth Studios