[IN PROGRESS] Option to run only new .dmps?

writhziden

Administrator, .NET/UWP Developer
Hi, I am experimenting with an idea to have the apps only run the latest .dmps for users.

I am sure you have all experienced the following scenario:

A user provides additional .dmps but has not cleaned out the Minidump folder since the last run, so the apps run more .dmps than are necessary. This takes more time when all you really need are a few of the latest .dmps. As the analyst, you do not want to sort through the .dmps yourself to find the old ones because of the time that takes, so you just let the apps run.

I am proposing an option to run only the newest .dmps and let the apps determine which .dmps were run previously. That option would be off by default and would have to be enabled each time an analyst runs .dmps if desired; I know I sometimes like to run all .dmps to refresh my memory of what the user experienced last time, but a lot of the time I would like the option to only run the latest .dmps.


What does everyone think?
 
I'm trying not to have the apps grow beyond what users can tolerate. I feel that if they get much bigger, they won't be manageable for everyone. This was just a simple option I would like to add.

Also, for choosing a date instead, I think it is just as easy to sort the directory listing of the .dmps and delete based on date that way. I do not see much need for that in the apps, but if others do, I will be happy to add it somehow; I would prefer that the first screen not get much bigger, since it is supposed to be a simple interface where you click run, or choose one or two options and then click run. Most of why I wanted to add this feature is that sometimes users have .dmps from the same date that have already been run, but new .dmps were created that day as well. Having to sort and check in that scenario is a bit of a pain.

My proposal is for a simple, quick method to choose between running all .dmps through the processing apps or running only the .dmps that have not yet been run through them.
 
Hmm, nice idea... I'm thinking of having a dmp folder and a display of the *.dmp files in that folder (all of this has already been done on my end), but giving the user a time picker to choose how far back in time to go when determining which dumps to scan: 2 weeks ago from today, 1 week, etc.

Perhaps just something as simple as a checkbox beside each file in the display I've got?

Like you, though, I'm still experimenting, but trying to cover all the bases. I've already got a ListView display for the dump files in the DMP folder, and I can sort the columns by dump file name, last modified date, and creation date, in both ascending and descending order by implementing the IComparer interface with my sorter.
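A minimal sketch of that kind of IComparer-based column sorter, assuming a WinForms ListView (the class name and the DateTime fallback here are just illustrative, not the actual implementation):

Code:
using System;
using System.Collections;
using System.Windows.Forms;

// Sorts ListView rows on a chosen column, ascending or descending.
// Date columns compare as DateTime; everything else as plain text.
public class DumpColumnSorter : IComparer
{
    public int Column { get; set; }
    public SortOrder Order { get; set; } = SortOrder.Ascending;

    public int Compare(object x, object y)
    {
        string a = ((ListViewItem)x).SubItems[Column].Text;
        string b = ((ListViewItem)y).SubItems[Column].Text;

        int result = DateTime.TryParse(a, out var da) && DateTime.TryParse(b, out var db)
            ? DateTime.Compare(da, db)
            : string.Compare(a, b, StringComparison.OrdinalIgnoreCase);

        return Order == SortOrder.Descending ? -result : result;
    }
}

// Usage when a column header is clicked:
//   dumpListView.ListViewItemSorter = new DumpColumnSorter { Column = e.Column };
//   dumpListView.Sort();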

I've hooked the filesystem as well, so any dumps the user adds to the folder in Explorer will trigger an event that updates my ListView display for the dumps in that folder automatically. Any other files that don't have the .dmp file extension are NOT included.
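Presumably something along these lines, using a FileSystemWatcher with a *.dmp filter (the folder path and the RefreshDumpList helper are placeholders for this sketch):

Code:
using System;
using System.IO;

const string dumpFolder = @"C:\DumpFolder";   // placeholder path

void RefreshDumpList()
{
    // In the real app this would rebuild the ListView; here it just relists the files.
    foreach (var file in Directory.GetFiles(dumpFolder, "*.dmp"))
        Console.WriteLine(file);
}

// The *.dmp filter means files with any other extension never raise events.
var watcher = new FileSystemWatcher(dumpFolder, "*.dmp")
{
    NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite,
    EnableRaisingEvents = true
};

watcher.Created += (s, e) => RefreshDumpList();
watcher.Deleted += (s, e) => RefreshDumpList();
watcher.Renamed += (s, e) => RefreshDumpList();

Console.ReadLine();   // keep the watcher alive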

edit: The advantage you have over me though is experience with crash dump analysis. I have a bit, but perhaps not as much as yourself.
 
The problem with excluding all other files from the .dmp list is you get users who remove the extension and replace it with .txt or .dump; the current apps check each file to see if it is a .dmp by first running it through a superficial analysis in the kernel debugger.

I could probably do something similar to what you are proposing within the current apps to sort the .dmps list after determining which files are .dmps; I'll think about this more and see what I can come up with that would not be too restrictive to the user.
 
The problem with excluding all other files from the .dmp list is you get users who remove the extension and replace it with .txt or .dump; the current apps check each file to see if it is a .dmp by first running it through a superficial analysis in the kernel debugger.

If that's the problem you've seen (I've never encountered it), that's good to know. The only thing is, I wouldn't run it through the apps; that is a bit of wasted time. I'd check the file headers to validate whether it is supposed to be a DMP file or not. The metadata should tell you.

Let me take a look and see if I can find a way to validate each file manually.
 
I'm seeing these file signatures:
Code:
PAGEDU64
PAGEDUMP

The beauty of this is that they've kept each signature the length of a 64-bit integer (8 bytes), so you'd just need to read off the first 8 bytes of the .dmp file and check them against either of these signatures. In case I'm missing any, though, it'd be good to do a bit more research to see if there are others.
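A quick sketch of that check, assuming those two signatures are the only ones to test for (per the caveat above that there may be others):

Code:
using System.IO;
using System.Text;

// Reads the first 8 bytes and compares them to the known kernel dump
// signatures: "PAGEDU64" (64-bit) and "PAGEDUMP" (32-bit).
static bool LooksLikeKernelDump(string path)
{
    var header = new byte[8];
    using (var stream = File.OpenRead(path))
    {
        if (stream.Read(header, 0, 8) < 8)
            return false;   // too small to be a dump at all
    }

    string signature = Encoding.ASCII.GetString(header);
    return signature == "PAGEDU64" || signature == "PAGEDUMP";
}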

EDIT: From what I've read, the information in the header changes based on the thread that you're in when the system crashes (i.e., directory table base info and context record info).

Something I've found after some searching: http://support.microsoft.com/kb/315271

Gives me an idea.

edit:
I have an idea for you though, writhziden, for what you want to do: I would keep a history of the file hashes of the dump files that have already been scanned. That way you can easily evaluate whether any of the existing dump files in your to-be-scanned directory have already been scanned, even across the past 1000 dump files you've scanned with your app, if you wanted to keep a history of that many. MD5 should suffice, and it'll be faster than a larger hash like SHA1 or SHA256 as well. Prompt the user if an already-scanned dump file is found; if they want to re-scan it, don't add the file hash to the already-scanned hash list again.
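A rough illustration of that hash-history idea (the history file name and dump folder path are made up for the sketch, and it hashes each new file twice for brevity):

Code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

// History of MD5 hashes for dumps that have already been run,
// stored one hash per line in a plain text file.
const string historyFile = "scanned_dumps.txt";   // illustrative name
const string dumpFolder = @"C:\DumpFolder";       // placeholder path

string HashDump(string path)
{
    using (var md5 = MD5.Create())
    using (var stream = File.OpenRead(path))
        return BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
}

var alreadyScanned = File.Exists(historyFile)
    ? new HashSet<string>(File.ReadAllLines(historyFile))
    : new HashSet<string>();

// Only .dmp files whose hash is not in the history count as "new".
var newDumps = Directory.GetFiles(dumpFolder, "*.dmp")
                        .Where(f => !alreadyScanned.Contains(HashDump(f)))
                        .ToList();

// After analysis, record the hashes so the next run can skip these files.
File.AppendAllLines(historyFile, newDumps.Select(HashDump));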
 
Just my 2¢
I think it's a good idea.
If we presume that the user has made changes since the last post (not always true), then I only run the latest dumps.
If the user hasn't made any changes, then I usually tell them to wait until after they've made changes before submitting new dumps (so some of the dumps may not get analyzed).
This'll save me a bit of time when running analyses (and I rarely empty my backup folders).
 
In terms of new dumps, what about crash dumps that may simply be too old to be worth scanning? Example: a 1-year-old dump alongside a heap of day-old dumps.
 
I was thinking along the same lines. I am currently testing the latest version with these changes implemented. It has the option to run only new .dmps, and it also has a new window that opens to allow users to select the .dmps to be analyzed. If only new .dmps were originally checked for, the new window has only the new .dmps checked; I figured that way, the user has the option to change his/her mind when the new window opens if so desired. There is also an option to select .dmps based on their age; the age is given in number of days and defaults to 365 days.

By entering a different number of days, the apps will select the .dmps to be run based on whether they are newer than that number of days. For instance, a user can enter 7, and any .dmps more than a week old will be excluded from analysis.
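That age filter would boil down to something like this (the folder path is a placeholder; 365 is the default mentioned above):

Code:
using System;
using System.IO;
using System.Linq;

// Selects only the .dmp files modified within the last N days (default 365).
static string[] SelectDumpsByAge(string dumpFolder, int maxAgeDays = 365)
{
    DateTime cutoff = DateTime.Now.AddDays(-maxAgeDays);
    return Directory.GetFiles(dumpFolder, "*.dmp")
                    .Where(f => File.GetLastWriteTime(f) >= cutoff)
                    .ToArray();
}

// Example: entering 7 excludes anything more than a week old.
//   var recent = SelectDumpsByAge(@"C:\DumpFolder", 7);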
 
