An interesting article that describes how botnets and the underground market work.
In today’s article (a Q&A, i.e., question-and-answer format), I hope to clear up the mystery behind these kits. I have been able to interview experts from the anti-malware world, and each will give their opinion on this particular subject.
This is a great analysis technique to use because, when you build a timeline from multiple data sources on, and from within, a system, you give yourself two things that you don’t normally have through more traditional analysis techniques: context, and a greater relative level of confidence in your data.
As to the overall relative level of confidence in our data, we have to understand that every data source has a relative level of confidence associated with it. For example, from Chris’s post, we know that the relative confidence level of the time stamps within the $STANDARD_INFORMATION attributes within the MFT (and file system) is (or should be) low. That’s because these values are fairly easily changed, often through “time stomping”, so that the MACB times (particularly the “B” time, or creation date of the file) do not fall within the initial timeframe of the incident. However, the time stamps within the $FILE_NAME attributes can provide us with a greater level of confidence in the data source (the MFT, in this case). By adding other data sources (Event Log, Registry, Prefetch file metadata, etc.), particularly data sources whose time stamps are not so easily modified (such as Registry key LastWrite times), we can elevate our relative confidence level in the data.
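One practical consequence of this is that comparing the $STANDARD_INFORMATION and $FILE_NAME creation times for the same file can surface likely time stomping. The sketch below illustrates the idea on hypothetical, pre-parsed MFT records (the record layout, file paths, and timestamps are all invented for illustration; a real tool would parse them from the $MFT):

```python
from datetime import datetime, timedelta

# Hypothetical pre-parsed MFT records: each holds the creation ("B") time
# from both the $STANDARD_INFORMATION and $FILE_NAME attributes.
mft_records = [
    {"path": r"C:\Windows\System32\evil.dll",
     "si_created": datetime(2009, 3, 1, 10, 0, 0),    # easily stomped
     "fn_created": datetime(2011, 6, 15, 2, 30, 0)},  # harder to modify
    {"path": r"C:\Windows\System32\kernel32.dll",
     "si_created": datetime(2009, 3, 1, 10, 0, 0),
     "fn_created": datetime(2009, 3, 1, 10, 0, 0)},
]

def flag_time_stomping(records, tolerance=timedelta(seconds=1)):
    """Flag files whose $STANDARD_INFORMATION creation time predates the
    $FILE_NAME creation time by more than the tolerance -- a classic
    indicator that the $SI times were backdated ("stomped")."""
    suspicious = []
    for rec in records:
        if rec["fn_created"] - rec["si_created"] > tolerance:
            suspicious.append(rec["path"])
    return suspicious

# the first record is flagged: its $SI creation time was backdated
print(flag_time_stomping(mft_records))
```

Note that this heuristic only catches backdating of $SI times; an attacker who copies the $FN times as well (or whose tooling rewrites both) would not be flagged, which is exactly why correlating additional sources matters.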
It makes sense: an analyst cannot trust a single source of information alone. We must correlate multiple data sources in order to get the full picture (context) and spot possible manipulation.
On noisy systems it does not make sense to build a full timeline, because the background noise may cause you more problems than it solves; targeting only the areas you are interested in is usually the better approach.
However, there is a method to my madness, which can be seen in part in Chris’s Sniper Forensics presentation. I tend to take a targeted approach, adding the information that is necessary to complete the picture. For example, when analyzing a system that had been compromised via SQL injection, I included the file system metadata and only the web server logs that contained the SQL injection attack information. There was no need to include user information (Registry, index.dat, etc.); in fact, doing so would have added considerable noise to the timeline, and the extra data would have required significantly more effort to analyze and parse through in order to find what I was looking for.
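The targeted approach described above boils down to merging only the relevant sources into one chronologically sorted stream. A minimal sketch, with invented event data standing in for real parser output (MFT, IIS logs, Registry, etc.):

```python
from datetime import datetime

# Illustrative event tuples: (timestamp, source, description).  In a real
# case these would come from parsers for the MFT, web server logs,
# Registry hives, Prefetch files, and so on.
events = [
    (datetime(2011, 6, 15, 2, 30, 0), "MFT ($FN)", "evil.dll created"),
    (datetime(2011, 6, 15, 2, 29, 45), "IIS log", "SQL injection request"),
    (datetime(2011, 6, 15, 2, 31, 10), "Registry", "Run key LastWrite updated"),
]

def build_timeline(events, keep_sources=None):
    """Merge events from several sources into one chronologically sorted
    timeline; optionally restrict to a subset of sources to keep noise
    down (the 'targeted' approach)."""
    if keep_sources is not None:
        events = [e for e in events if e[1] in keep_sources]
    return sorted(events, key=lambda e: e[0])

for ts, source, desc in build_timeline(events):
    print(f"{ts.isoformat()}  [{source:10}]  {desc}")
```

Passing `keep_sources={"MFT ($FN)", "IIS log"}` would reproduce the SQL-injection scenario above: file system metadata plus the relevant web server logs, and nothing else.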
Use a knowledge base updated with the findings from your previous investigations. Sharing information within the team is also vital to achieving your goals and working more efficiently.
There are also some comments about RegRipper:
RegRipper is a Windows Registry data extraction and correlation tool. RegRipper uses plugins (similar to Nessus) to access specific Registry hive files in order to access and extract specific keys, values, and data, and does so by bypassing the Win32 API.
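To make the plugin idea concrete, here is a minimal sketch of that architecture in Python (RegRipper itself is written in Perl, and parses real hive files; the nested dict standing in for a hive, the key path, and the values below are all mocked for illustration):

```python
# Mocked SOFTWARE hive: key path -> LastWrite time and values.  RegRipper
# parses the raw hive file instead of going through the Win32 API.
mock_software_hive = {
    "Microsoft\\Windows\\CurrentVersion\\Run": {
        "LastWrite": "2011-06-15 02:31:10",
        "values": {"updater": "C:\\Users\\Public\\evil.exe"},
    },
}

PLUGINS = {}

def plugin(key_path):
    """Register a function as the plugin for a given Registry key path."""
    def register(func):
        PLUGINS[key_path] = func
        return func
    return register

@plugin("Microsoft\\Windows\\CurrentVersion\\Run")
def run_key(key):
    """Report autostart entries -- a common persistence mechanism."""
    lines = [f"Run key (LastWrite: {key['LastWrite']})"]
    for name, cmd in key["values"].items():
        lines.append(f"  {name} -> {cmd}")
    return lines

def rip(hive):
    """Run every registered plugin against the keys present in the hive."""
    report = []
    for key_path, func in PLUGINS.items():
        if key_path in hive:
            report.extend(func(hive[key_path]))
    return report

print("\n".join(rip(mock_software_hive)))
```

The appeal of this design is the same as in Nessus: analysts encode what they learned from one case as a small, self-contained plugin, and every future case benefits automatically.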
With the help of the knowledge base, this tool can automate parts of the Registry analysis that may indicate compromise, the presence of malware, and so on.