02 May 2010
Anti-computer forensics (sometimes counter forensics) is a general term for a set of techniques used as countermeasures to forensic analysis.
Anti-forensics has only recently been recognized as a legitimate field of study, and numerous definitions of anti-forensics exist within it. One of the more widely known and accepted comes from Dr. Marcus Rogers of Purdue University, who takes a traditional “crime scene” approach when defining anti-forensics: “Attempts to negatively affect the existence, amount and/or quality of evidence from a crime scene, or make the analysis and examination of evidence difficult or impossible to conduct.”
A more abbreviated definition is given by Scott Berinato in his article entitled, The Rise of Anti-Forensics. “Anti-forensics is more than technology. It is an approach to criminal hacking that can be summed up like this: Make it hard for them to find you and impossible for them to prove they found you.” Neither author takes into account using anti-forensics methods to ensure the privacy of one's personal data.
Anti-forensics methods are often broken down into several sub-categories to make classification of the various tools and techniques simpler. One of the more widely accepted subcategory breakdowns was developed by Dr. Marcus Rogers. He has proposed the following sub-categories: data hiding, artifact wiping, trail obfuscation and attacks against the CF (computer forensics) processes and tools. Attacking forensic tools directly has also been called counter-forensics.
Purpose and goals
Within the field of digital forensics there is much debate over the purpose and goals of anti-forensic methods. The common conception is that anti-forensic tools are purely malicious in intent and design. Others believe that these tools should be used to illustrate deficiencies in digital forensic procedures, digital forensic tools, and forensic examiner education. This sentiment was echoed at the 2005 Blackhat Conference by anti-forensic tool authors James Foster and Vinnie Liu. They stated that by exposing these issues, forensic investigators will have to work harder to prove that collected evidence is both accurate and dependable. They believe that this will result in better tools and education for the forensic examiner.
Data hiding
Data hiding is the process of making data difficult to find while also keeping it accessible for future use. “Obfuscation and encryption of data give an adversary the ability to limit identification and collection of evidence by investigators while allowing access and use to themselves.”
Some of the more common forms of data hiding include encryption, steganography and other various forms of hardware/software based data concealment. Each of the different data hiding methods makes digital forensic examinations difficult. When the different data hiding methods are combined, they can make a successful forensic investigation nearly impossible.
Encryption
One of the more commonly used techniques to defeat computer forensics is data encryption. In a presentation on encryption and anti-forensic methodologies, Paul Henry, Vice President of Secure Computing, referred to encryption as a “forensic analyst's nightmare”.
The majority of publicly available encryption programs allow the user to create virtual encrypted disks which can only be opened with a designated key. Through the use of modern encryption algorithms and various encryption techniques these programs make the data virtually impossible to read without the designated key.
File level encryption encrypts only the file contents. This leaves important information such as file name, size and timestamps unencrypted. Parts of the content of the file can be reconstructed from other locations, such as temporary files, swap file and deleted, unencrypted copies.
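The leakage described above is easy to demonstrate. The sketch below uses a deliberately toy XOR "cipher" (standing in for a real file-level encryption tool, not any product named here) to show that even after a file's contents become unreadable, the name and size, the metadata a forensic examiner would inventory first, survive untouched:

```python
import os
import tempfile

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- a stand-in for real file-level encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Write a plaintext file, then encrypt its contents in place.
path = os.path.join(tempfile.mkdtemp(), "notes.txt")
with open(path, "wb") as f:
    f.write(b"meet at the docks at midnight")

with open(path, "rb") as f:
    ciphertext = toy_encrypt(f.read(), key=b"secret")
with open(path, "wb") as f:
    f.write(ciphertext)

# The contents are now unreadable, but the forensic metadata is not:
# the file name, its size and (depending on the filesystem) timestamps
# all remain visible to an examiner.
print(os.path.basename(path))   # notes.txt -- name unchanged
print(os.path.getsize(path))    # 29 -- same size as the plaintext
```

Because XOR preserves length, the encrypted file is byte-for-byte the same size as the original; real file-level encryption tools leak the same kinds of metadata, which full-volume encryption avoids.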
Most encryption programs have the ability to perform a number of additional functions that make digital forensic efforts increasingly difficult. Some of these functions include the use of a keyfile, full-volume encryption, and plausible deniability. The widespread availability of software containing these functions has put the field of digital forensics at a great disadvantage.
Steganography
Steganography is a technique where information or files are hidden within another file in an attempt to hide data by leaving it in plain sight. “Steganography produces dark data that is typically buried within light data (e.g., a non-perceptible digital watermark buried within a digital photograph).” Some experts have argued that the use of steganography techniques is not very widespread and therefore should not be given much thought. Most experts agree, however, that steganography can disrupt the forensic process when used correctly.
According to Jeffrey Carr, a 2007 edition of Technical Mujahid (a bi-monthly terrorist publication) outlined the importance of using a steganography program called Secrets of the Mujahedeen. According to Carr, the program was touted as giving the user the capability to avoid detection by current steganalysis programs. It did this through the use of steganography in conjunction with file compression.
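As a minimal illustration of the idea (not the method of any particular tool), the sketch below hides a secret in the least-significant bits of a cover buffer, the classic LSB technique used against image pixel data. Each cover byte changes by at most one, which is imperceptible in a photograph:

```python
def embed(cover: bytes, secret: bytes) -> bytes:
    """Hide `secret` in the least-significant bits of `cover` bytes."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for secret")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return bytes(out)

def extract(stego: bytes, n: int) -> bytes:
    """Recover n hidden bytes from the least-significant bits."""
    return bytes(
        sum((stego[i * 8 + j] & 1) << j for j in range(8))
        for i in range(n)
    )

cover = bytes(range(64))        # stand-in for raw pixel data
stego = embed(cover, b"hi")
print(extract(stego, 2))        # b'hi'
```

A statistical steganalysis tool may still detect the altered LSB distribution, which is why the program Carr describes reportedly paired steganography with compression to resist detection.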
Other forms of data hiding
Other forms of data hiding involve tools and techniques that conceal data in various locations in a computer system, including “memory, slack space, hidden directories, bad blocks, alternate data streams, hidden partitions.”
One of the better-known tools often used for data hiding is called Slacker (part of the Metasploit framework). Slacker breaks up a file and places each piece into the slack space of other files, thereby hiding it from forensic examination software. Another data hiding technique involves the use of bad sectors: the user marks a particular sector as bad and then places data in the cluster containing it. The expectation is that forensic examination tools will see these clusters as bad and continue on without examining their contents.
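Slack space exists because file systems allocate storage in fixed-size clusters. A simplified model (assuming a 4096-byte cluster, a common NTFS default) shows how much hidden capacity a small file leaves behind, and how a Slacker-style tool could scatter a payload across the slack of several files; the `scatter` helper is a hypothetical illustration, not Slacker's actual code:

```python
CLUSTER = 4096  # bytes per allocation unit (typical NTFS cluster size)

def slack_bytes(file_size: int, cluster: int = CLUSTER) -> int:
    """Slack = allocated space past the logical end of the file."""
    if file_size == 0:
        return 0
    return (-file_size) % cluster

def scatter(payload: bytes, slacks: list) -> list:
    """Split a payload into chunks sized to fit each file's slack."""
    chunks, pos = [], 0
    for s in slacks:
        chunks.append(payload[pos:pos + s])
        pos += s
    if pos < len(payload):
        raise ValueError("not enough slack space for payload")
    return chunks

# A 100-byte file still consumes a whole 4096-byte cluster; the
# remaining 3996 bytes are invisible to normal file APIs, which is
# exactly where a tool like Slacker hides its pieces.
print(slack_bytes(100))    # 3996
print(slack_bytes(4096))   # 0
```

Nothing hidden this way shows up in a directory listing or file-size total, which is why examiners must carve the raw disk rather than trust the file system's view of it.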
Artifact wiping
See also: Data erasure
The methods used in artifact wiping are tasked with permanently eliminating particular files or entire file systems. This can be accomplished through the use of a variety of methods that include disk cleaning utilities, file wiping utilities and disk degaussing/destruction techniques.
Disk cleaning utilities
Disk cleaning utilities use a variety of methods to overwrite the existing data on disks (see data remanence). Their effectiveness as anti-forensic tools is often challenged, as some believe they are not completely effective. Experts who do not consider disk cleaning utilities acceptable for disk sanitization base their opinions on current DoD policy, which states that the only acceptable form of sanitization is degaussing. (See National Industrial Security Program.) Disk cleaning utilities are also criticized because they leave signatures showing that the file system was wiped, which in some cases is unacceptable. Some of the widely used disk cleaning utilities include DBAN, srm, BCWipe Total WipeOut, KillDisk, PC Inspector and CyberScrub's cyberCide. Another option, approved by NIST and the NSA, is CMRR Secure Erase, which uses the Secure Erase command built into the ATA specification.
File wiping utilities
File wiping utilities are used to delete individual files from an operating system. Their advantages are that they can accomplish their task in a relatively short amount of time, as opposed to disk cleaning utilities, which take much longer, and that they generally leave a much smaller signature. There are two primary disadvantages: file wiping utilities require user involvement in the process, and some experts believe they do not always correctly and completely wipe file information. Some of the widely used file wiping utilities include BCWipe, R-Wipe & Clean, Eraser, Aevita Wipe & Delete and CyberScrub's PrivacySuite.
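The core of a file wiping utility can be sketched in a few lines: overwrite the file's bytes in place, flush to disk, then delete it. This is only a model of the idea; real wipers must also contend with journaling file systems, SSD wear-levelling, and file-system metadata (the file name itself often survives in directory structures), none of which this sketch handles:

```python
import os
import tempfile

def wipe_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents in place, then delete it.

    A sketch only: it does not defeat filesystem journals, SSD
    wear-levelling, or leftover metadata such as the file name.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # random overwrite pass
            f.flush()
            os.fsync(f.fileno())        # force the pass onto disk
    os.remove(path)

path = os.path.join(tempfile.mkdtemp(), "evidence.txt")
with open(path, "wb") as f:
    f.write(b"incriminating data")
wipe_file(path)
print(os.path.exists(path))   # False
```

This "overwrite in place, then unlink" pattern is also why such tools leave a signature: an examiner who recovers the directory entry finds a file whose data region is uniformly random, a telltale sign of deliberate wiping.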
Disk degaussing / destruction techniques
Disk degaussing is a process by which a magnetic field is applied to a digital media device, leaving it entirely clean of any previously stored data. Despite being an effective means of ensuring data has been wiped, degaussing is rarely used as an anti-forensic method because degaussing machines are too expensive for the average consumer to afford.
A more commonly used technique to ensure data wiping is the physical destruction of the device. The NIST recommends that “physical destruction can be accomplished using a variety of methods, including disintegration, incineration, pulverizing, shredding and melting.”
Trail obfuscation
The purpose of trail obfuscation is to confuse, disorient and divert the forensic examination process. Trail obfuscation covers a variety of techniques and tools that include “log cleaners, spoofing, misinformation, backbone hopping, zombied accounts, trojan commands.”
One of the more widely known trail obfuscation tools is Timestomp (part of the Metasploit Framework). Timestomp gives the user the ability to modify file metadata pertaining to access, creation and modification times/dates. By using programs such as Timestomp, a user can render any number of files useless in a legal setting by directly calling into question the files' credibility.
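The underlying operation is ordinary timestamp manipulation, available to any user with write access. The sketch below uses Python's standard `os.utime` (not Timestomp itself) to back-date a file's access and modification times; note that `os.utime` cannot alter NTFS creation/entry times, which Timestomp also manipulates:

```python
import os
import tempfile
import time

path = os.path.join(tempfile.mkdtemp(), "report.doc")
with open(path, "wb") as f:
    f.write(b"contents")

# Back-date both the access and modification times to 1 Jan 2001
# (local time). Timestomp goes further, rewriting NTFS MACE values
# that os.utime cannot reach.
fake = time.mktime((2001, 1, 1, 0, 0, 0, 0, 0, -1))
os.utime(path, (fake, fake))

st = os.stat(path)
print(time.localtime(st.st_mtime).tm_year)   # 2001
```

A timeline built from these timestamps would now place the file years before it actually existed, which is precisely how such tools undermine the credibility of file metadata as evidence.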
Another well-known trail obfuscation program is Transmogrify (also part of the Metasploit Framework). In most file types the header of the file contains identifying information: a .jpg has header information that identifies it as a .jpg, a .doc has information that identifies it as a .doc, and so on. Transmogrify allows the user to change a file's header information, so a .jpg header could be changed to a .doc header. If a forensic examination program or operating system then searched for images on the machine, it would simply see a .doc file and skip over it.
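A sketch of why this works: file-type identification typically relies on "magic bytes" at the start of a file, so rewriting a few header bytes is enough to fool a naive signature scan. The magic values below are the real ones for these formats, but the `sniff` function is a hypothetical stand-in for a carving or triage tool, not any product's actual code:

```python
# Real magic-byte prefixes for three common formats.
MAGIC = {
    b"\xff\xd8\xff": "jpg",
    b"%PDF": "pdf",
    b"\x89PNG": "png",
}

def sniff(data: bytes) -> str:
    """Identify a file by its header, as a naive carving tool might."""
    for magic, kind in MAGIC.items():
        if data.startswith(magic):
            return kind
    return "unknown"

jpeg = b"\xff\xd8\xff\xe0" + b"\x00" * 16   # minimal JPEG-like header
print(sniff(jpeg))                           # jpg

# A Transmogrify-style tool overwrites those leading bytes, and the
# same scanner now misclassifies the image entirely.
disguised = b"%PDF" + jpeg[4:]
print(sniff(disguised))                      # pdf
```

More robust examiners counter this by validating the whole file structure, not just the first few bytes, or by comparing the signature against the file extension and flagging mismatches.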
Attacks against computer forensics
In the past, anti-forensic tools have focused on attacking the forensic process by destroying data, hiding data, or altering data usage information. Anti-forensics has recently moved into a new realm where tools and techniques are focused on attacking the forensic tools that perform the examinations. These new anti-forensic methods have benefited from a number of factors, including well-documented forensic examination procedures, widely known forensic tool vulnerabilities, and digital forensic examiners' heavy reliance on their tools.
During a typical forensic examination, the examiner would create an image of the computer's disks. This keeps the original computer (evidence) from being tainted by forensic tools. Hashes are created by the forensic examination software to verify the integrity of the image. One of the recent anti-tool techniques targets the integrity of the hash that is created to verify the image. By affecting the integrity of the hash, any evidence that is collected during the subsequent investigation can be challenged.
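Hash verification works because any change to the acquired image, even a single flipped bit, produces a completely different digest, as the sketch below shows. That is why attacking the recorded hash itself, rather than the image, is the anti-tool vector described above:

```python
import hashlib

# Stand-in for an acquired disk image.
image = bytearray(b"raw disk image contents" * 100)
baseline = hashlib.sha256(image).hexdigest()

# Flip a single bit anywhere in the image: the digest no longer
# matches, so ordinary tampering with the evidence is detectable.
image[0] ^= 0x01
tampered = hashlib.sha256(image).hexdigest()
print(baseline != tampered)    # True

# The anti-tool attack therefore targets the *recorded* baseline hash:
# if that value's integrity can be called into question, so can every
# finding derived from the image it was supposed to authenticate.
```

In practice examiners mitigate this by hashing with more than one algorithm and by recording the baseline values in contemporaneous, access-controlled documentation rather than alongside the image.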
A related physical countermeasure is to rig the chassis intrusion detection feature of a computer case, or a sensor such as a photodetector, with explosives so that the machine destroys its own data when opened.
Effectiveness of anti-forensics
Anti-forensic methods rely on several weaknesses in the forensic process including: the human element, dependency on tools, and the physical/logical limitations of computers. By reducing the forensic process's susceptibility to these weaknesses, an examiner can reduce the likelihood of anti-forensic methods successfully impacting an investigation. This may be accomplished by providing increased training for investigators, and corroborating results using multiple tools.