Deleting the _backup folder or .RWS files



Rustle
2008-08-27, 01:28 AM
We are trying to keep our network disk space under control, and we are wondering if we can delete the _backup folder or the .RWS files in that folder. Revit seems to recreate them every time you save to central and doesn't complain if they are not there. We are looking at running a script that deletes them if they are more than 15 days old. It is only about 50GB, but it all adds up. We tell people to audit and recreate the central every week, so this should not be a problem, right?
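(A cleanup script along those lines is simple enough to sketch. Below is a minimal Python example, assuming the projects all live under one root share and that anything inside a "_backup" folder older than the age threshold is fair game; the root path and the 15-day cutoff are placeholders, and the dry run is on by default so it reports what it would remove before it removes anything.)

# Minimal sketch: delete files in Revit "_backup" folders older than N days.
# PROJECT_ROOT and MAX_AGE_DAYS are placeholders; run with dry_run=True first.
import os
import time

PROJECT_ROOT = r"\\server\projects"
MAX_AGE_DAYS = 15

def purge_old_backups(root, max_age_days, dry_run=True):
    cutoff = time.time() - max_age_days * 86400
    for dirpath, dirnames, filenames in os.walk(root):
        # Only touch files that sit inside a *_backup folder.
        if not dirpath.lower().endswith("_backup"):
            continue
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                print(("would delete " if dry_run else "deleting ") + path)
                if not dry_run:
                    os.remove(path)

if __name__ == "__main__":
    purge_old_backups(PROJECT_ROOT, MAX_AGE_DAYS, dry_run=True)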

twiceroadsfool
2008-08-27, 03:24 AM
That backup file lets you roll back the central file in the event that something bad happens, or someone makes a major error that doesn't get caught for a little while...

Storage space is cheap, buy more... Save the backups. :)

ron.sanpedro
2008-08-27, 04:22 AM
That backup file lets you roll back the central file in the event that something bad happens, or someone makes a major error that doesn't get caught for a little while...

Storage space is cheap, buy more... Save the backups. :)

Good argument for robust (and large) Shadow Copy implementation.
The RWS file never goes to tape and leaves the primary server disk weekly, but the whole folder is available via Previous Versions as needed, perhaps going back a month or two. Given that Shadow Copy is a bit-by-bit differential, the storage cost of the Shadow Copy stays small, and there is a fallback for 'oh $#!%' moments. All hail the really big FireWire/eSATA drive. ;)

Gordon

Rick Houle
2008-08-27, 11:36 AM
I only recommend keeping the backup folder for the most recent Central archive.
The way I see it (and we also "SaveAs" our Central files regularly) is that the backup folder is a CYA (cover your a**) feature that only serves a purpose for a limited number of days/weeks... The data times out as work in the new central continues.
On one occasion I have had to revert to an "archived" central AND its accompanying backup folder... and the only reason it served a purpose was because it was only two days old. Anything older than a week or so, let's be real, I'm going to restore my work from tape backup before I lose that much ground...

Oh yeah, we back up our server differentials every evening... so I always have a tape backup of yesterday's effort... heck of a safety net.
I guess that kind of nullifies my opinion here...

Rustle
2008-08-27, 03:43 PM
Thanks Gordon. I forgot to mention we have copies of all our files from noon and midnight each day, as well as tape backups if we need to go back more than a week. That's actually the point of all this. We are an 80-person, all-Revit firm, and our tape backup time/capacity is getting too big for our current technology. I usually just create a new central from a working local if I can.

chodosh
2008-08-27, 06:35 PM
On one occasion I have had to revert to an "archived" central AND its accompanying backup folder...
Well, well, sparky, aren't you a lucky chap? ;)

I really hate to sound snarky. However, in my experience (which I will readily admit has perhaps been unique), you can *never* delete the backup folders without seeing some complications. And I have been able to successfully restore several files from the backup when all the copies of the actual *.rvt's on the tape drives and other backup systems could not even be accessed. Let me qualify that specifically: I have found that you should *never* remove any backup folder or Revit_temp folders for any file that is *actively* related to a project involved in worksharing.

The backup folder and the Revit_temp folders, it would appear, cannot be treated as static backups, ***especially if you have more than one file linked together***.

I would strongly suggest removing them only after enough time has elapsed or they are irrelevant to a new file. As Rick says, keeping the backup folder for the active Central file is good practice, but I would keep the backup folder for the locals too, especially if you do not create local files every day. Acknowledging that there are divided camps on both sides of the local file creation debate: if you do create new locals, you can remove the backup folders from the previous local file without complication.


I forgot to mention we have copies of all our files from noon and midnight each day as well as tape backups if we need to go back more than a week.
Having the files at mid-day has, I imagine, saved you time and again. However, with Revit in a worksharing environment you may be able to rely instead on a mid-day restore from a local file, and therefore may not need to back up your *.rvt's as frequently, which would shave down the storage load from your Revit projects.

Just my $.02. :)

Nice point, Gordon about Shadow Copy.

-LC

Rick Houle
2008-08-27, 07:56 PM
Well, to round this out to the full nickel...

I have actually SOLVED a problem in the past by deleting the backup folder.

End user finds corruption, end user reverts to previous archive and end user rolls back to previous save... HOWEVER, end user fails to remove the old backup folder that is still associating itself with, da-da da, the original corruption.

We routinely remove all OLD backup folders, either local or central, once their time has passed. I personally consider them static and bunk. A month-old rollback is not an option here. If I cannot restore a more recent save, I'm submitting the issue to Autodesk.

Central file corruption at my office is pretty rare.
IF for any reason you are uncomfortable with my method, by all means, DON'T follow me. Keep to the safe side. I'm just speaking my mind.
But, as the I.T. guy responsible for data restoration as well as all things Revit, I must say my 80+ users have never lost a central file or more than two days' work inside Revit. Good communication I guess... or just good luck.

chodosh
2008-08-27, 08:34 PM
Unless I misunderstood your post, the problem you mentioned, where the "end user fails to remove the old backup folder that is still associating itself with... the original corruption" and the corruption returns, is because of network overwrite problems. I've found, and others have commented here about the same phenomenon, that you cannot overwrite a central file in place with the same name. Network overwrite is a recipe for disaster.

I have had success by saving-as from the backup, with the time/date/username/save-to-central stamp, to a new location, then opening that file and verifying the contents. Once I know I have a recovered database, I move the old Central file and its backup folder to an Archive location. I then save-as a new central from the restored file back to the original location and original name, but I avoid overwriting the file in place at all costs because, as you said, it will never fully restore properly and the issues, errors, or corruption will remain or will crash Revit completely.

So, essentially, yes, you're correct: you must always remove the backup folder and the file before "saving over" the file in place or else it causes more problems.

-LC

Rick Houle
2008-08-27, 08:50 PM
As I edited into my previous post, it is always good practice to err on the side of caution.
Save everything if you can... I don't.

I must say, I have never been locked out of a corrupt central file - the database always seems to heal itself in one form or another with user input. Sluggish central files are another story. I have had to revert to my backups for the sake of sluggishness (and that was beneficial to R+D).

But I have never been denied access to a current central file... (through use of a journal or other means)... am I alone on that?

ron.sanpedro
2008-08-28, 04:29 AM

So, essentially, yes, you're correct: you must always remove the backup folder and the file before "saving over" the file in place or else it causes more problems.

-LC

And if this has in fact occurred, would a properly done "New Central", including an Audit, likely alleviate the problem? Or once you have a new central with an old backup, are you kinda hosed?
And for those with nice fat (phat?) Shadow Copy drives or backup processes, do you set your Central files to 1 backup (or something less than 20) to keep the daily bloat down?

Like Rick, I have never been locked out of a Central file. A very few times I have seen Central files become corrupt to the point where they won't open, but I don't think the backups or journals could have helped had we tried anything different. And all the effort to make that happen, compared to just going back to the last shadow copy, seems pointless. I suspect that I would just grab the last SC and get people back working again. In which case, am I needlessly throwing away a lot of work by not trying to use the backups to get to the last good STC?

It all depends on the chance of success with the Revit backup. If I have a 90% chance of getting one STC back (and doing it quickly), so only one person loses their work, then that is a valid approach. But if I am likely to spend an hour with only a 50/50 chance of recovering all but one person's work, then I am probably better off just punting to Shadow Copy and moving on. But I don't have a sense of what the chances of success really are. Like using a Journal, it seems to be try and pray, and I would really like to have a better sense of my chances before I start down that road. For that I would look to the Factory for a Recovery white paper to go with the recent performance one. That would be very helpful, even just to keep in a back pocket while hoping to never actually need it. Much like a backup system. ;)

That said, I wonder how Shadow Copy timing could be affected. If a file is in use when the Shadow Copy runs, I believe it is just skipped. And users tend to STC right before lunch and right before the end of the day, while Shadow Copy is usually scheduled for noon, and 5 or 6 or so. The latter might not be a problem, but I suspect that those noon Shadow Copies often run into Central files in the act of being STC'd to. That makes me think perhaps 1 and 7 make better sense as Shadow Copy windows? Anyone have something more than theory to throw at the idea?

Best,
Gordon
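(On the timing question, one way to get past pure theory is to look at when saves to central actually land on your share. Below is a minimal Python sketch, assuming that the timestamps of files inside the "_backup" folders roughly track save-to-central times and that everything lives under one root path; both are assumptions on my part, not anything Revit or VSS reports directly.)

# Sketch: tally which hour of day files in "_backup" folders were last written,
# as a rough proxy for when saves to central happen. PROJECT_ROOT is a placeholder.
import os
from collections import Counter
from datetime import datetime

PROJECT_ROOT = r"\\server\projects"

hours = Counter()
for dirpath, dirnames, filenames in os.walk(PROJECT_ROOT):
    if not dirpath.lower().endswith("_backup"):
        continue
    for name in filenames:
        mtime = os.path.getmtime(os.path.join(dirpath, name))
        hours[datetime.fromtimestamp(mtime).hour] += 1

for hour in sorted(hours):
    print("%02d:00  %d files" % (hour, hours[hour]))

A histogram with big spikes at 12:00 and 5:00 would argue for shifting the snapshot windows to 1 and 7, as suggested above.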

Rick Houle
2008-08-28, 11:30 AM
Good summary on the file restoration thoughts Gordon.

We incrementally save our centrals, so there should be no "same name" overwrite. But when it did occur, or if it does again, we delete the troubled central with its backup and then resave from the archive to the next central increment.
The only limitation with our "throw away" backup approach is we have no version history beyond 20 saves.

I'm thinking I may change my backup history to more than 20 after reading this thread.
Perhaps use the version history as a time marker to define when the next incremental save will occur.

What kind of version history do you all rely on? Does anyone jack it up to 100+ saves or even more?