Files and folders to exclude from an image-based backup.

Blue House Computer Help

To hopefully benefit all of us, I'd like to put together a list of all the files and folders that should be excluded from an image-based backup intended for a bare-metal restore, mainly because they can be recreated automatically or are backed up in a different way (e.g. databases). Then I can compile these into one long list that anyone could use.

For starters:
hiberfil.sys
pagefile.sys
swapfile.sys
*.tmp
Virtual disks?

I'd also like to exclude temp files/folders in application data... any main ones you know of? (There's a quick size-check sketch at the end of this post for the items above.)

Others? Or is there a comprehensive list somewhere that you know about?
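For anyone who wants to see what the exclusions above would actually save, here's a rough Python sketch (my own, not from any backup vendor) that totals the on-disk size of the usual suspects. The paths and patterns are assumptions based on a default Windows install, so adjust them for your own layout.

```python
# Rough sketch: tally how much space the candidate exclusions would actually
# save before deciding whether they're worth filtering out of the image.
# Paths/patterns below are assumptions based on a default Windows install.
import glob
import os

CANDIDATES = [
    r"C:\hiberfil.sys",
    r"C:\pagefile.sys",
    r"C:\swapfile.sys",
    r"C:\Windows\Temp\*.tmp",
    r"C:\Users\*\AppData\Local\Temp\*.tmp",
]

def size_of(pattern: str) -> int:
    """Sum the sizes of all files matching a glob pattern, skipping any we can't stat."""
    total = 0
    for path in glob.glob(pattern):
        try:
            total += os.path.getsize(path)
        except OSError:
            pass  # locked or permission-denied system files
    return total

if __name__ == "__main__":
    grand_total = 0
    for pattern in CANDIDATES:
        size = size_of(pattern)
        grand_total += size
        print(f"{pattern:45s} {size / 1024**3:6.2f} GB")
    print(f"{'Total potential saving':45s} {grand_total / 1024**3:6.2f} GB")
```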
 
It's not something I'd even bother pondering to see if it's worth doing, just to save a wee bit of space. Image-based backup is "the whole dang thing" in my head. Maybe if I were forced to trim the fat and squeeze an image onto some little USB thumb drive for some reason, instead of properly sized external media.
But I'd avoid excluding the hiberfile. Suppose the computer you were backing up via full image had a crash during sleep, and "stuff was open" at the time it fell asleep. That's kept in the hiberfil for when it wakes back up, and it's also used for "fast startup". If the hiberfil is missing, the end user doesn't get that back.
 
Manually messing about with any modern operating system is not recommended; I'd even say discouraged. This isn't W95 or MacOS 7. They've become so complex and so many things are tied together that you have no way of knowing what might happen. If there is a need, MS has Sysprep, which is intended to remove machine-specific stuff to decrease the size/increase portability.
 
They've become so complex and so many things are tied together that you have no way of knowing what might happen.

Yup. And that applies to trying to pick and choose updates, too, which seems to have (Thank God) largely fallen by the wayside.

There really isn't a point to checking for updates and not installing them. . . It's important to install all available updates. I've been doing this since the days of DOS, and I still don't have the confidence to pick and choose among updates. There are just too many variables involved - and most people can't evaluate the full consequences of installing/not installing updates.
~ John Carrona, AKA usasma on BleepingComputer.com, http://www.carrona.org/

And Mr. Carrona (he's since retired) was an incredibly well-respected BSOD expert. OS internals (any of them) are best left to those responsible for developing and maintaining said OSes. Thinking you know better than they do is a recipe for disaster.
 
Okay, point taken. You guys are right. Don't know what I was thinking.

Let me fill you in on why it was bothering me. I've been testing image backups on a spare workshop computer. It's taking incremental updates to the image and averaging 1.5 GB a day in changed files, even though I've been doing almost nothing on it.

So I guess I was worrying a bit because I hadn't included it in my projections, and for some customers with slower internet connections that's quite a daily hit to their bandwidth. So I'm not worried about the storage space; I'm worried about getting the data uploaded in the first place.
 
It's taking incremental updates to the image and averaging 1.5 GB a day in changed files, even though I've been doing almost nothing on it.

But the question becomes: On which files are changes being made?

A single-byte change in a 1.0 GB file means that the whole file, not just the changed part, is going to be part of an incremental backup. Or at least the finest granularity I've ever worked with is the file. So just a tiny change to one or two huge files is enough.

I know there are utilities to show most recently modified files for an entire system, so you might want to take a look at exactly what's been changed to determine why so much is being incrementally backed up.
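If you'd rather not hunt down a dedicated utility, something like the following Python sketch gets you most of the way: it walks a drive and lists the largest files modified in the last 24 hours, which are the likely culprits behind a big incremental. The root path and time window are just assumptions to adjust.

```python
# Walk a drive and list the largest files modified in the last 24 hours,
# i.e. the files most likely to be driving the size of an incremental backup.
import os
import time

ROOT = "C:\\"           # drive covered by the image (assumed; change as needed)
WINDOW = 24 * 3600      # "recently modified" = last 24 hours
cutoff = time.time() - WINDOW

recent = []
for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            st = os.stat(path)
        except OSError:
            continue  # locked system files, broken links, etc.
        if st.st_mtime >= cutoff:
            recent.append((st.st_size, path))

# Biggest recently-changed files first: these are the likely incremental cost.
for size, path in sorted(recent, reverse=True)[:25]:
    print(f"{size / 1024**2:8.1f} MB  {path}")
```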
 
@Markverhyden I'm using MSP360 for the backups. They say that they only back up changed parts of large files, so I suppose it must be working on a sector by sector level? I somehow doubt that includes cases where data is shuffled around within a file.

Yes, I'm more concerned about bandwidth impacts than anything else.

@britechguy Do you know the names of any of the utilities?
 
They say that they only back up changed parts of large files, so I suppose it must be working on a sector by sector level?
It's what most backup programs call a delta backup: they compare the file on the server with the local copy and only back up the changed parts. Yes, if someone makes really major changes to a file, a delta isn't going to help much, but most files aren't changed that drastically, and most online backup programs also use compression to save bandwidth.
 