How to back up the whole registry in a SMALL package =D

Tj Belfiore

I am trying to back up the entire registry in a zip file. How can I get the final file size down to next to nothing? Somewhere around 1-3 MB TOPS.

My registry export is 469 MB (and it's in .txt format, btw). Oh crap. Now what? I can't seem to figure out how to get it to the desired size. I have tried Altap Salamander's compressor, and here's the outcome: 33 MB using the built-in ZIP method, 11 MB using the built-in 7-Zip method on maximum compression settings.

If there's no way to get it smaller than 11 MB, are there keys in the registry that can be parsed out later? For instance, are there any useless keys? I don't mean deleting them, either; I am simply backing up the registry as a .txt file to be viewed later in case of a system problem.

Thanks in advance.
 
I am simply backing up the registry as a .txt file to be viewed later in case of system problem.

Can I please come back to you on this in full but...

A .reg file is not an appropriate method of registry backup, as ownership and permissions data on the keys is not retained. This can cause significant problems. You must back up the file as it is on the disk, as only this will retain permissions and ownership data. I will explain this much more fully later.
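If you want to script that, here's a rough Python sketch of one way to do it; the hive list and paths are just examples, and you'd need an elevated prompt. Because "reg save" writes the hive in its native binary format, it should keep the security information a .reg export throws away:

Code:
# Rough sketch (hypothetical paths): save hives in their native binary
# format with the built-in "reg save" command. Unlike a .reg text export,
# the saved hive keeps its security descriptors (ownership/permissions).
# Run from an elevated (Administrator) prompt.
import os
import subprocess

os.makedirs(r"C:\RegBackup", exist_ok=True)

hives = {
    r"HKLM\SOFTWARE": r"C:\RegBackup\software.hiv",  # example hives/paths only
    r"HKLM\SYSTEM":   r"C:\RegBackup\system.hiv",
}

for hive, dest in hives.items():
    # /y overwrites an existing file without prompting
    subprocess.run(["reg", "save", hive, dest, "/y"], check=True)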

Richard
 
Using 7-Zip on maximum compression, I get my 754 MB .txt file down to 30.0 MB. The best I can get using manual trials of a much more complicated method is 28.8 MB. However, this is much better than it sounds, as it also includes permissions and ownership for every single key, so all in all, a much better compression rate.

We are already compressing the data 25x. It really isn't going to be possible to push this much further. Your ideal target needs a compression factor of ~500x, which simply is never going to happen. The fact that we get 25x on this sort of data is extremely impressive, but it cannot work miracles.

If a compression miracle were available, we wouldn't be downloading all these huge service packs, enormous games, etc. What you are asking is pretty much impossible. Your only option is to select just the relevant registry data, rather than taking the whole lot, IMO.
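If you want to see the numbers for yourself, here's a quick Python sketch; the file path is just a placeholder for your own export:

Code:
# Quick sketch: measure the compression factor yourself. The path is a
# placeholder; point it at your own registry export.
import lzma
import zlib

with open(r"C:\RegBackup\registry_export.txt", "rb") as f:
    data = f.read()

for name, packed in [
    ("zlib (ZIP-style), level 9", zlib.compress(data, 9)),
    ("lzma (7z-style), preset 9", lzma.compress(data, preset=9)),
]:
    print(f"{name}: {len(packed):,} bytes ({len(data) / len(packed):.1f}x)")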
 
Why can't we? I understand it's virtually impossible, but is it physically impossible as well? I mean, does the HDD play a role in all this? What's stopping the file from being decreased in size even more? For instance, why can't we just zip it multiple times and have it keep compressing? Thanks... lots of questions, sorry.
 
Compression algorithms can only go so far in finding patterns in data that they can shrink. While the registry can be exported to a .txt file, it is by no means entirely made of strings. There are a variety of keys with different data types stored in the registry, which complicates compression by making the exported file inconsistent from beginning to end.

Understand that the compression algorithms used by 7z and the like try their best to compress what's given to them using only a superficial means of discovering patterns. They do not know what the material in the file actually represents; if they did, I'm sure the registry file could be compressed even further, since the actual structure of the registry, as well as the keys, could be shrunk using some kind of relational information. Unless an algorithm holds knowledge of how the registry is designed and what its internals look like, it can only compress it in a manner no different from compressing any other file.

The demoscene is especially well known for people making "intros", animations in which they've managed to shove as much as they could into a tiny file size, such as 8k or 64k. This is accomplished through heavy, and often customized, compression algorithms that anticipate what kinds of files will be thrown at them, such as texture files or specific code files. Using these in conjunction with manual "compression" (coding tricks to reduce file size), they're able to accomplish some pretty impressive feats when it comes to shoving stuff into a small space. However, again, this is careful, professional work, and it cannot be substituted by a "one algorithm to rule them all" like you expect.
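To make that point concrete, here's a toy Python sketch: the very same numbers compress dramatically better once a transform that understands their structure (here, that they climb steadily, so the differences are tiny) is applied first:

Code:
# Toy demo: domain knowledge beats generic compression. A sorted list of
# numbers compresses far better once rewritten as deltas, because the
# transform "knows" the values climb steadily.
import lzma
import struct

values = list(range(0, 1_000_000, 7))
raw = struct.pack(f"<{len(values)}I", *values)

deltas = [values[0]] + [b - a for a, b in zip(values, values[1:])]
transformed = struct.pack(f"<{len(deltas)}I", *deltas)

print("raw bytes compressed:        ", len(lzma.compress(raw)))
print("delta-transformed compressed:", len(lzma.compress(transformed)))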
 
cannot be substituted by a "one algorithm to rule them all" like you expect.

First rule of everything: There's a solution to it all, somehow, someway. I highly doubt it would take a lot of skill to compress a file down to a smaller size. Is there intense programming involved? I'm gonna have to Google it and see how it can be done, I'm sure it's simple.

Please lock thread, thanks.
 
There is no need to lock threads you're finished with, Undocked_Windy, as others may learn from them or have questions.
 

As VirGnarus has explained above, there comes a time when current algorithms simply cannot compress a file any further, and it is unlikely that you will be able to improve upon them unless you are either a master algorithm designer who deserves fame the world over, or someone who creates a custom algorithm for the exact type of data you are trying to compress.

Let's say you have the following data (this whole example is greatly oversimplified):

fhaywtgggggggggwuwrhhhhh

Now let's assume that numbers will never be used in this example data.

Instead of storing a long run of the same character, I can put the number of times the character appears BEFORE the letter.

This creates:

fhaywt9gwuwr5h

Do you see how we now have to transmit less data?

Do you also notice that there are no more patterns of repeating data left? Compression algorithms work by storing patterns only once and referring back to them later (or something similar). Once you have compressed the data, you cannot make it any smaller, as all the patterns have already been taken out; there is nothing more you can do.
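Here's that scheme as a small Python sketch (assuming, as above, that the data itself never contains digits):

Code:
# The scheme described above: a count goes BEFORE the letter, and we
# assume the data never contains digits.
import re
from itertools import groupby

def rle_encode(text: str) -> str:
    out = []
    for ch, run in groupby(text):
        n = len(list(run))
        out.append(f"{n}{ch}" if n > 1 else ch)
    return "".join(out)

def rle_decode(text: str) -> str:
    # a digit sequence is the repeat count for the single letter after it
    return re.sub(r"(\d+)(\D)", lambda m: m.group(2) * int(m.group(1)), text)

original = "fhaywt" + "g" * 9 + "wuwr" + "h" * 5
encoded = rle_encode(original)
print(encoded)                          # fhaywt9gwuwr5h
assert rle_decode(encoded) == original  # round-trips cleanly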

There is simply no way you will be able to get this data much smaller. My strong suggestion is to read up on compression algorithms and come to an understanding of why what you are asking for is impossible.
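And to answer your earlier question about zipping multiple times, here's a quick demonstration (the path is a placeholder): the second pass gains you nothing and usually adds a few bytes of overhead instead.

Code:
# Why zipping twice doesn't help: compressed output looks like random
# noise, so the second pass finds no patterns and just adds overhead.
import zlib

with open(r"C:\RegBackup\registry_export.txt", "rb") as f:  # placeholder path
    data = f.read()

once = zlib.compress(data, 9)
twice = zlib.compress(once, 9)
print(len(data), len(once), len(twice))  # twice is no smaller than once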


As for locking this thread, I am not overly keen. This discussion has not yet come to a conclusion, your other thread is already closed, ending any additional discussion opportunities there, and there has been no particular rule-breaking which warrants a thread closure yet, only a lively debate. If I see this thread getting out of hand, then I will reconsider.
 
Yes, my friend, but you are missing something. Say you have this situation:

6hhhhhhhhh9rtegregrgh0iir9grigrgig

You could make it into this:

6h9x9rtegregrgh0iir9grigrgig

You could simply replace the run of nine h's with "h9x"; that would be a way around it, wouldn't it? Then, programmatically, we could say "h9x" = nine h's. Assuming I were a world-famous genius programmer, right?
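For what it's worth, here's that idea as a literal Python sketch. It round-trips on this example, but it breaks the moment the token "h9x" can occur naturally in the data, which is exactly the kind of ambiguity real algorithms have to design around:

Code:
# The substitution idea made literal: "h9x" stands in for a run of nine
# h's. Works on this example; fails if "h9x" ever appears in real data.
RUN, TOKEN = "h" * 9, "h9x"

def pack(text: str) -> str:
    return text.replace(RUN, TOKEN)

def unpack(text: str) -> str:
    return text.replace(TOKEN, RUN)

original = "6hhhhhhhhh9rtegregrgh0iir9grigrgig"
print(pack(original))                    # 6h9x9rtegregrgh0iir9grigrgig
assert unpack(pack(original)) == original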

EDIT: I apologize in advance; I'm very busy and not sure whether the above text came out coherent or not. I hope you understand.
 
First rule of everything: There's a solution to it all, somehow, someway.
In some other reality, perhaps; not here, not now. I want the moon to stay in the sky all night to light my way. No solution. Anyway, to be serious: why do you find it necessary to compress your .txt file to 1-3 MB?
 
I'm trying to compress my .txt file so that the program I wrote, HotFixed (which outputs a .txt file as its log), can be easily uploaded by the user. If you try to upload something too big, it goes slowly, and the particular file I'm trying to compress is over 446 MB. That's a big file, my friend. A very large piece of pie, if you will.

How do we keep the pie intact and whole, yet eat lots of it? The same could be said for: how do we go about compressing a file that's 446 MB all the way down to less than ~3 MB? Something that I think would improve the way we back up data every day. Imagine if we took this concept further and expanded it into different formats of media. Movies, perhaps (legally backed-up ones :smile9:). That 700 MB movie is now 4 MB; instead of fitting only six on a DVD (as a data disc), you can now fit roughly 1,175 with no quality loss. That's like getting 822 GB out of a 4.7 GB DVD. Wow. I'm an idiot for trying to accomplish that, huh? :banghead:

Would you or anybody else be interested in helping me do that programmatically? Sounds fun, huh?
 
Lots of good info above, and I second JCGriff's suggestion of ERUNT, an awesome program and a true registry backup. What you're asking is equivalent to packing for a vacation: you want to put an entire closet into one suitcase, and it just isn't gonna happen. You can shrink stuff down, fold it, cram it, roll it into a ball, but you can't cut off a pant leg and expect it to reappear when you unpack. Compression schemes can't decompress nothing back into something.

The best example I can think of is WAV-to-MP3 compression, where repetitive data is discarded (as Britton was explaining) and one instance is multiplied over where needed when decoded. But there's still a limit to how small the file will get. I also wouldn't trust that type of lossy compression as reliable if you're actually trying to back up something as critical as the registry.
 
In future builds I plan on adding a registry feature, as I mentioned before. And if not in HotFixed, then in some other program. I plan on making it sometime.
 
You could fool around with individual bits per byte and see what kind of compression you could get there, retaining the data of multiple bytes in a smaller byte count. It all comes down to having an algorithm that knows what this reduced binary data represents when it's converted back into its full form. That takes creativity, though. :) I've tried to recreate versions of many compression algorithms, and even combined some. It's interesting...
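If you want to play with that idea, here's a toy Python sketch that packs values known to fit in 4 bits two to a byte; real bitstream coders generalize this to arbitrary widths:

Code:
# Toy bit-packing: if every value is known to fit in 4 bits (0-15), two
# values can share one byte, halving the size.
def pack_nibbles(values):
    if len(values) % 2:          # pad to an even count
        values = values + [0]
    return bytes((a << 4) | b for a, b in zip(values[::2], values[1::2]))

def unpack_nibbles(data):
    out = []
    for byte in data:
        out += [byte >> 4, byte & 0x0F]
    return out

vals = [3, 7, 1, 15, 0, 9]
packed = pack_nibbles(vals)
print(f"{len(vals)} values stored in {len(packed)} bytes")
assert unpack_nibbles(packed)[: len(vals)] == vals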
 
