Worksharing Monitor Low Virtual Memory is a hoax
DoTheBIM
2010-07-20, 06:07 PM
I just set up two users to work on the same file and installed the worksharing monitor. They are getting a warning about low virtual memory through the monitor (not through XP). Specs are XP 32-bit with 3GB RAM. A message pops up saying available VM is 198MB and the desired VM is 200MB. The paging file is measured in gigabytes, so 200 megabytes shouldn't even be relevant. Should I be worried, or is it a false alarm?
wmullett
2010-07-20, 06:34 PM
The warning is probably correct. You need 4 gig with the 3GB switch set. Even with that, you will still get the VM warning if you have a lot of windows open, really intense windows, or a long Revit session. Revit has a memory leak, so most people recommend closing and restarting Revit after sessions of roughly 4 hours.
DoTheBIM
2010-07-20, 06:59 PM
We don't use 3GB switch. We have 3GB of RAM.
DoTheBIM
2010-07-20, 07:01 PM
and the file is only 54.5 megs
patricks
2010-07-20, 07:33 PM
File size x20 is the rule of thumb I've heard for the memory used by Revit alone for a single file. Factor in everything else going on over the course of the day and you can see why this can be a problem.
I use Revit 2011 on my home machine with 2 GB RAM, and it works fine with smaller projects. But my current project file is up over 50 MB and my home machine simply can't take it. I have to work through remote desktop if I need to work on it at home, whereas otherwise I just use my VPN connection alone.
So yeah, a 55+ MB file is most likely going to have problems with only 3GB of system memory (minus roughly 1GB for system stuff, minus 1+ GB for the Revit file, which leaves very little for anything else).
I used to always have issues toward the end of the day in 2010 when I had XP x86, even with 4GB and the 3GB switch. PDFs wouldn't open, I couldn't browse network locations, etc.
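The ×20 rule above is easy to turn into a quick sanity check. A minimal sketch (the multiplier is the thread's rule of thumb, not an Autodesk figure):

```python
def estimated_revit_memory_mb(file_size_mb, multiplier=20):
    """Rough in-session memory estimate for a Revit model, using the
    community 'file size x 20' rule of thumb (an estimate, not a spec)."""
    return file_size_mb * multiplier

# A 55 MB central file could expand to roughly 1.1 GB over a session:
print(estimated_revit_memory_mb(55))  # 1100
```

On a 3GB XP box, that estimate alone eats most of what's left after the OS.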
Scott Womack
2010-07-20, 07:37 PM
Yes, you have three gigs, but the Windows OS takes almost a gig. The Revit program takes another chunk of memory to load, then the Revit file expands off of disk to 4 or more times its file size, and the longer you go between saves to central (STC), the bigger the file gets in memory, due to the undo history it is storing in memory, etc.
All that said, a 54.5 gig file should work great in 2009. In 2010 it'll work so-so, and in 2011 it may not, since the Revit program gets larger with each release.
DoTheBIM
2010-07-20, 08:02 PM
All that said, a 54.5 meg file should work great in 2009. In 2010 it'll work so-so, and in 2011 it may not, since the Revit program gets larger with each release.This isn't encouraging. We've worked on 70+ meg files individually without a hitch in both 2009 and 2011. Our template is 50+ megs to start. I've had 3 projects of this size open in 2009 across 2 sessions of Revit. We haven't been on 2011 long enough to have explored those limits yet. I still don't follow the logic of 200M available vs. 3,096M allocated in XP virtual memory. Where's the other 2,896M? And why is it not using the other 1GB of RAM that Windows says is available?
patricks
2010-07-20, 08:20 PM
Might look at the journal files to see what they report for VM use. You should probably open an SR with Autodesk, tell them about the issue and your workstation specs, and also send them journal files from sessions where you see this warning so they can analyze it.
ron.sanpedro
2010-07-20, 08:48 PM
This isn't encouraging. We've worked on 70+ gig files individually without a hitch in both 2009 and 2011. Our template is 50+ gigs to start. I've had 3 projects of this size open in 2009 across 2 sessions of Revit. We haven't been on 2011 long enough to have explored those limits yet. I still don't follow the logic of 200M available vs. 3,096M allocated in XP virtual memory. Where's the other 2,896M? And why is it not using the other 1GB of RAM that Windows says is available?
Two things are going on here. First off, in 32-bit Windows, any given app only gets access to 2Gb of virtual address space. That virtual address space references both physical RAM and virtual memory at the OS level, but the application just thinks it has 2Gb of dedicated memory to play in. That is virtual memory in a nutshell. So any given app can crash if it tries to use more than 2Gb of address space, no matter how much memory is present as physical RAM or swap file.
The 3Gb switch makes 3Gb of address space available to applications, while only 1Gb is available to Windows itself. This is because 32-bit Windows can only handle 4Gb of total address space, and at any given moment it is dealing with the virtual address space for itself and a single app. Without the 3Gb switch it is a 2/2 split, with the 3Gb switch it is a 3/1 split, and using the USERVA switch you can tweak it further, say 2.8/1.2. Because of how Windows maps graphics RAM, a large graphics card can make only 1Gb of address space for Windows a problem, thus the tweaking with USERVA to maximize address space for apps while still allowing enough for Windows to function. Of course, Revit is one of the very few apps that still needs more than 2Gb of address space. Exchange and SQL Server also can, and probably some audio and video editing apps.
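Gordon's 2/2, 3/1, and USERVA splits can be sketched as a tiny calculation. This is a simplified illustration of the address-space split, not an OS API:

```python
TOTAL_32BIT_ADDRESS_SPACE_GB = 4.0  # total virtual address space on 32-bit Windows

def address_split(user_gb):
    """Return the (user, kernel) virtual address split for 32-bit Windows.
    2.0 is the default, 3.0 is the /3GB switch, and values in between
    model /USERVA tuning. Illustrative only."""
    if not 2.0 <= user_gb <= 3.0:
        raise ValueError("USERVA only allows user space between 2 and 3 GB")
    return user_gb, round(TOTAL_32BIT_ADDRESS_SPACE_GB - user_gb, 2)

print(address_split(2.0))  # default split: (2.0, 2.0)
print(address_split(3.0))  # /3GB switch:  (3.0, 1.0)
print(address_split(2.8))  # /USERVA tweak: (2.8, 1.2)
```

The key point the function makes visible: every gigabyte given to apps is a gigabyte taken from the kernel, which is why a big graphics card can make the 3/1 split unstable.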
Anyway, no matter what, if an app tries to access more ram than the available address space allows, you crash. Maybe only half a gig of Revit is in RAM, and the rest is swap file, but if Revit needs to address 2.5 GB of address space, and you don't have the 3Gb switch applied, then you crash even if you have 4Gb of RAM and a 16Gb swap file. The resources are there but Revit can't access it.
Also, the swap file is a shared resource, but as far as I can tell Worksharing Monitor only reports how much Revit is trying to use. So if Revit needs to use 200Mb of virtual memory, and you have 2Gb allocated but only 150Mb is available to Revit, again you crash. And being a shared resource, every open app, plus Windows, could be dipping into that 3Gb swap file you have allocated. I think a fragmented swap file can also impact how well it gets allocated to apps, and if Windows is dynamically managing the swap file (the default setting in XP) on a hard drive without much space and not recently defragged, you can have problems.
In any case, the general rule in 32-bit Windows XP was to set the swap file to twice the physical RAM, and make it a fixed size so it couldn't fragment. I have seen machines where this was set when the machine had 1Gb of RAM, and thus a fixed 2Gb swap file. A year later the machine had 3Gb of RAM and still that 2Gb swap file, and was crashing even with more RAM, because Windows was trying to page to a now-undersized swap file.
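That sizing rule, and the stale-config trap that comes with it, as a sketch (a rule of thumb from the 32-bit XP era, not a hard requirement):

```python
def recommended_page_file_mb(physical_ram_mb):
    """Classic 32-bit XP guidance: a fixed page file at twice physical
    RAM, so it never resizes or fragments mid-session."""
    return 2 * physical_ram_mb

# The trap: RAM gets upgraded, but the fixed page file does not.
print(recommended_page_file_mb(1024))  # 2048 - right for the old 1 GB box
print(recommended_page_file_mb(3072))  # 6144 - what the upgraded box needs
```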
The benefit of 64 bit Windows is that the address space goes to something like 2Tb, so there is no way you will crash from running out, even if Windows has half of that dedicated to itself, we are just FAR from Revit trying to access 1000 Gb of memory, as that would be a 50Gb file on disk! That alone means that 64 bit Windows can be more stable than 32 bit Windows on the same amount of RAM, even 2Gb. It may be slower because it has to page more, but it is more stable because it CAN page more. Add more RAM and it is stable and fast. Thus 64 bit Windows and 4Gb of RAM is to me a minimal machine these days, even for small residential work. Just no reason not to, and upgrades to 8Gb, 16Gb or more can be done as needed. But only if you have a 64 bit OS to start with.
Oh, one other issue I have noticed. File size on disk is only part of the picture. How the app uses RAM also impacts things. In Revit 2010 and earlier, an export to DWG would not release memory used to make each DWG until the entire DWG export was complete. Basically a memory leak due to code logic. So a 10 Mb file could crash in 64 bit Windows with 32Gb of RAM and 64 Gb of swap file. Just ask it to export 1000 DWGs and watch the available resources drop till you crash. And yet the "x20" rule suggests that you have no memory problems at all!
2011 releases the memory used to create each DWG as the DWG is written to the hard drive, so even a huge file that is using 90% of the available RAM could export 20,000 DWGs with no issues. But things like Audit and upgrade still don't release memory till the process completes, so you need some memory headroom to complete those tasks. Headroom being address space, physical ram and swap file in 32 bit windows, and only the latter two in 64 bit Windows. But still, run out of any of the three and you crash.
Gordon
patricks
2010-07-20, 09:24 PM
Wow Gordon, awesome post - even to me who knows more than the "average joe" about computers (and thus shoved into an "IT" position of sorts in my office :roll: )
DoTheBIM
2010-07-20, 10:40 PM
Wow Gordon, awesome post - even to me who knows more than the "average joe" about computers (and thus shoved into an "IT" position of sorts in my office :roll: ) Ditto on all of that!!! I understand the 3GB switch, what it does with the division of paging usage, and that 64-bit systems blow some of these limitations out of the water if you have enough money to throw at it. The switch doesn't seem a viable option for us if we only have 3GB to begin with. New machines? That should be an interesting one-word response beginning with "N" :roll:
I get the whole virtual memory thing and how its use is "more than meets the eye", but I still don't get why Revit only uses 200 meg :shock: and why the monitor is saying it's getting low and not XP. All this talk about 4 gig this and 3 gig that doesn't seem to mean jack with a .2 gig limit. I guess since the monitor is only looking at Revit's needs and current usage, my question is why does it limit itself to 200 when there's much more still available? Looking back on the posts, Scott got me mixed up on gig and meg. 55+ gig :shock: would definitely be not so good. I'm fixing my posts.
So using Scott's approximations...
File = 0.055 gigs
RAM = 2 gigs (3 gigs - 1 gig for XP)
0.055 gigs * 4 = 0.22 gigs
Revit = 0.5 gigs
0.22 gigs + 0.5 gigs = 0.72 gigs
So that leaves 1.28 gigs for undo operations, Worksharing Monitor, Outlook, IE, Messenger, VNC, and a few other programs. We haven't touched the VM yet. I know it doesn't work that way in real life, but....
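The back-of-envelope above checks out arithmetically; here it is as a quick sketch, using the thread's rough approximations (estimates, not measurements):

```python
# Rough figures from the thread, in GB (approximations, not measurements).
ram_total    = 3.0
xp_overhead  = 1.0    # Windows XP itself
revit_app    = 0.5    # the Revit program
file_on_disk = 0.055  # the ~55 MB central file
expansion    = 4      # Scott's "expands 4x or more off of disk" estimate

model_in_memory = file_on_disk * expansion             # 0.22 GB
revit_footprint = revit_app + model_in_memory          # 0.72 GB
left_over = ram_total - xp_overhead - revit_footprint  # 1.28 GB

print(round(model_in_memory, 2), round(revit_footprint, 2), round(left_over, 2))
```

As the post says, real memory use doesn't decompose this neatly, but the arithmetic itself is sound.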
Really... why am I seeing this warning on a workshared file using the monitor? Seems like I'd get no warning on a non-workshared file (or even one that was workshared with no WS monitor running) and it would just cause crashes and whatnot with no warnings. Heck, I don't even get problems upgrading files from 2009 to 2011.
Guess I'll take a closer look at the journals tomorrow and the XP performance monitor, and file an SR to see what's up with the 200 meg limit. Seems like messing with the paging file in XP settings is going to do nothing if there's a 200 MB (0.2 GB) limit within the 3GB of page file currently set aside. I'm thinking it's a legitimate warning, as one user got a blue screen with a video driver file as the offending file at one point, and the other had a major slowdown and then Revit crashed.
Boy, this is going to be fun trying to get memory upgrades in a "pay whoever whines the loudest" environment, when our machines are due to be replaced as they are probably almost 4 years old now. They seemed speedy enough all things considered, especially with all the multitasking I was doing lately. I was in Revit 2009 & 2011 with similar-size projects open while emailing, surfing the net, exploring network folders, and probably about 6 other things at the time too.
ron.sanpedro
2010-07-21, 12:28 AM
But there is no guarantee that there IS any available. When you look at the settings in Windows for swap file size, it is telling you the total shared resource. Windows may say you have a 2Gb fixed-size swap file, but it isn't telling you how much of that swap file is already in use by Windows and other processes. You can use Task Manager in XP to look at total swap file usage (but not total size; that is only available in Virtual Memory setup), but not how much is used by each application or process. Still, it might be useful to look and see how much is actually in use. If you have a 2Gb page file, and without Revit running Page File Usage is already at 1.9Gb, then Revit is going to have issues. And if you are seeing that kind of page file usage without Revit running, then something is horribly wrong! To check, look at the PF Usage number under the Performance tab.
Another thing I forgot to mention is that I think Windows XP had some swap file management issues, as I saw Revit (2008) crash 100% of the time as page file use approached 150% of physical RAM. If I had 2Gb of physical RAM installed, Revit would barf as Page File Usage approached 3Gb. And it didn't seem to matter if the page file was double physical RAM or triple: at utilization of 150% of physical RAM, Revit barfed every time. It barfed exporting DWGs, it barfed doing an audit, it barfed doing an upgrade. I don't remember for sure, but I think that 150% threshold happened no matter how much physical RAM was there, so I could crash at 2Gb of RAM and 3Gb of page file, or 4Gb of RAM and 6Gb of page file.
But the main thing to realize is that the 3Gb switch is NOT about using more RAM, it is about using more Memory, Memory being comprised of both RAM and page file. I have seen situations where a machine with 2Gb of RAM was stable with a big model as long as the 3Gb switch was enabled. Perhaps Revit was accessing 2.1Gb of Memory, and maybe only 1Gb was in RAM while another 1.1Gb was paged. Without the 3Gb switch that would cause Revit to crash, because it was trying to access memory beyond the 2Gb limit. Now with 4Gb and the 3Gb switch enabled it might be faster, with maybe 2Gb in RAM and only .1Gb paged. But with 4Gb of RAM and no 3Gb switch it will still crash, because of that 2Gb limit on addressable memory. Of course, with a 512Mb graphics card it would then crash WITH the 3Gb switch, because now the 1Gb of kernel address space isn't enough for Windows. Using USERVA to roll back to maybe 2.8Gb for user space and 1.2Gb for kernel space might then be the sweet spot. It all depends on the graphics driver, the network card, installed software; all sorts of things can impact how much kernel address space is needed.
But with regards to the OP, I think I would verify first that the Page File is indeed at least double physical RAM, and also verify Page File Usage after a fresh boot, with Revit launched and no file open, with the Revit file in question freshly opened, and then doing regular work. You may find that Worksharing Monitor is actually shining a light on an impending issue. Like if your Page File was only 1Gb but Page File Usage was 800Mb before Revit was even launched. Or you may find that Worksharing Monitor is pretty much useless other than seeing who forgot to Relinquish All Mine before they went on vacation. ;)
And while you are in Task Manager, look at the Memory Usage for Revit under the Process tab. If that is approaching 2Gb then you may have to go down either the 3Gb switch road, or the 64 bit upgrade road. And yes, every version of Revit uses a little more RAM than the last on any given file. I did some calcs and I was seeing maybe a 5-10% increase with every version. So yes, a file that worked in 2009 at 1.9Gb might be iffy in 2010 and a crasher in 2011.
Gordon
nancy.mcclure
2010-07-21, 05:32 AM
Yes, Gordon, that was a very clear and digestible summary - thanks for that. And I shall be quoting you! ;)
DoTheBIM
2010-07-21, 02:39 PM
Gordon, thanks again for that. I think I'm slowly getting it the more I look into this. Page file size plus RAM equals virtual memory... I think. I believe I was confusing the terms virtual memory and paging file as interchangeable, when one is really just a part of the other.
So, I found the system performance monitor in the worksharing monitor... and noticed a few things. It reports 4.7+ GB VM total with no Revit open... I'm guessing this is the remainder of what's not currently used by the whole system. Then when I start Revit it says 2GB total, 32% used. When I open my local file it climbs to 58%. The users having issues are in about the 80% range, and when they save to central it will force this meter up to the 2GB mark and force this VM warning. After STC all is well and the meter falls back to the 80% level. One of these machines has a 6GB page file and the other has 3GB. Both have 3GB RAM. Mine has 3GB of both page file and RAM, and I'm running way more programs than they are.
I'm going to try updating their video drivers to the latest version, as mine is; I've had better luck with it than the "certified" version. Then I'll look at these numbers again. 'Tis odd that VM usage by Revit is so much different on the same project on these two machines than it is on my machine. The graphics driver should be the only difference. Hope I can find the source of the difference before something goes bad.
So is Revit allocating 2GB of VM as a limit as reported by the worksharing monitor? Or is Revit dynamically adjusting the size it needs while the monitor can't see that change in VM?
DoTheBIM
2010-07-21, 03:48 PM
Geez I had a miserable time changing my page file. Had to restart like 6 times and set no page file at one time...it always kept setting itself to 3GB.:roll:
Ummm, one more question for anyone who understands this stuff like Gordon does... If 32-bit Windows can only handle 4GB of "address space" (<--- same as virtual memory?) at a time, then is the purpose of setting the page file to twice the RAM so Windows can work with any combination of 4GB it needs, even though you may have 8GB of stuff open, as an example?
Scott Womack
2010-07-21, 06:54 PM
one more question for anyone who understands this stuff like Gordon does... If 32-bit Windows can only handle 4GB of "address space" (<--- same as virtual memory?) at a time, then is the purpose of setting the page file to twice the RAM so Windows can work with any combination of 4GB it needs, even though you may have 8GB of stuff open, as an example?
The Revit Clinic blog just posted an item that directly addresses your questions....
http://revitclinic.typepad.com/
ron.sanpedro
2010-07-21, 09:19 PM
First off, please be aware that nothing I say would pass muster with anyone who holds a CS degree. Or even a really savvy hobbyist. And some of it likely won't pass muster with IRU69. ;) That said, I think I have come to understand this stuff well enough to make informed decisions, and if I can help anyone else do that also, while improving my ability to explain things (which is kinda my job) then Woot!
So... the key idea with protected virtual memory as a concept is that each application is aware of itself and nothing else. So each app believes it has access to 2Gb of memory at addresses 0 to 2Gb, and Windows makes sure that one app never writes data to a location that another app is using, as that would cause the kind of crash we had back in the Windows 3.1 days when device drivers and apps shared memory and could walk all over each other when poorly written or the machine poorly configured.
So at the moment maybe my Revit thinks it has stuff stored at 0-1.9Gb, while Chrome that is also running thinks it has stuff stored at 0-.25Gb. And Windows thinks it has stuff stored at 2.0-3.5Gb perhaps. Obviously Revit and Chrome can't actually both have stuff at .2Gb, so Windows uses a Lookup Table to map the location of each app's memory in use to an actual location, which could be in RAM if that data is in use, or could be in the Page File if not. So the lookup table may say that Revit address .2Gb is in physical RAM at location 2.8Gb, while Chrome address .2Gb is in my 16Gb Page File at location 1.8Gb.
And that Lookup Table takes up some RAM, which Windows doesn't want to waste so it puts currently unneeded bits of the lookup Table in the Page File as well. Thus, the more apps you have open but not currently doing anything, the more the Lookup Table is taking up Page File space.
And since you don't actually have dedicated RAM for every app, Windows wants to ensure that as much physical RAM is available as possible, so memory locations that are not currently needed by an app are put in the Page File and the Lookup Table updated to reflect this. When an app asks for a particular memory location, Windows checks the Lookup Table to see if that data is in the PF or RAM. If the latter, it points the app to the data; if in the PF, it moves the data out of the PF to some location in RAM, updates the Lookup Table, and points the app to the proper location in RAM. Actually, it translates the location, because as far as the app is concerned everything is at the virtual memory location and all the behind-the-scenes stuff is invisible.
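Gordon's "Lookup Table" is what OS texts call a page table. A toy model of the idea (the addresses, processes, and whole-gigabyte granularity are invented for clarity; real page tables work on 4KB pages):

```python
# Toy page table: each process has its own private virtual addresses,
# and the OS maps each one to a backing location in RAM or the page file.
page_table = {
    ("Revit",  "0.2GB"): ("RAM",      "2.8GB"),
    ("Chrome", "0.2GB"): ("PageFile", "1.8GB"),
}

def resolve(process, virtual_addr):
    """Translate a (process, virtual address) pair to its backing store.
    A real MMU would also move paged-out data into RAM on a hard fault."""
    return page_table[(process, virtual_addr)]

# Both apps believe they own address 0.2GB, yet land in different places:
print(resolve("Revit", "0.2GB"))   # ('RAM', '2.8GB')
print(resolve("Chrome", "0.2GB"))  # ('PageFile', '1.8GB')
```

The point of keying the table on the process as well as the address is exactly Gordon's: two apps can use "the same" address without ever colliding.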
So if at one moment each app and Windows can address 2Gb of Memory, I might have...
Windows .5Gb RAM, 1Gb PF
Revit 1Gb RAM, .9Gb PF
Chrome .125Gb RAM, .125Gb PF
At which point I have 1.625Gb of physical RAM and 2.025Gb of Page File in use, and with only 2Gb of physical RAM and a 4Gb Page File I am fine. But if my Page File were only 2Gb, it would be nearly maxed out, anything more would have to stay in physical RAM, and I would be getting close to running low there too. And since every open app uses some memory, and Windows will try to keep as much physical RAM free as it can, every open app, especially inactive ones, will eat up Page File. That's why the user with 15 copies of Internet Explorer open but minimized is actually causing themselves potential problems. All those minimized apps are in the Page File, potentially limiting the availability of PF for Revit.
But Windows will only Page out memory when it is not in use, because getting data from the Page File, aka a Hard Fault, is SLOW compared to getting it from RAM. That is also why you can be slow and yet stable (enough PF but not enough RAM) or fast but unstable (enough RAM up to a point, and not enough Page File to free up RAM beyond a certain usage).
And if I upped my RAM to 4Gb and my Page file to 8Gb, I could have three Revit sessions open with the same resource needs because my total RAM use would be 3.625Gb (3x 1 + .5 + .125) and PF would be 3.825Gb (3x .9 + 1 + .125).
However, with only one session of Revit open and needing 1.2Gb of RAM and .9Gb of PF, I crash. I am only using 1.825Gb of my 4.0Gb of physical RAM, and only 2.025Gb of my 16Gb swap file, but Revit needs 2.1Gb of total memory, and it only has access to 2.0Gb of addressable memory. So there is the condition where I can open three models fine, but one very slightly bigger model alone crashes, with plenty of RAM & PF still available.
And if you use the 3Gb switch you still crash, because now Windows only has 1Gb of addressable memory, and its need is still 1.5Gb (.5 as RAM and 1 as PF). Use the USERVA setting to give all apps 2.4Gb and Windows 1.6Gb and you would be stable. For the moment. Throw in a larger graphics card, which Windows must map directly out of its kernel address space, and Windows could need 1.9Gb of address space and I am crashing again. Or up the size of that one Revit model just a bit and again you go past the addressable memory limit.
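The crash conditions Gordon walks through reduce to two independent checks per process: does it fit its addressable space, and does it fit the free RAM plus free page file backing it? A simplified sketch (his example numbers; not a real memory manager):

```python
def session_ok(app_need_gb, user_address_gb, ram_free_gb, pf_free_gb):
    """A process survives only if it fits BOTH its per-process address
    space AND the free RAM + page file that back it. Simplified model."""
    fits_address_space = app_need_gb <= user_address_gb
    fits_backing_store = app_need_gb <= ram_free_gb + pf_free_gb
    return fits_address_space and fits_backing_store

# A 1.9 GB session is fine without the /3GB switch...
print(session_ok(1.9, user_address_gb=2.0, ram_free_gb=4.0, pf_free_gb=16.0))  # True
# ...but a 2.1 GB model crashes with RAM and page file to spare:
print(session_ok(2.1, user_address_gb=2.0, ram_free_gb=4.0, pf_free_gb=16.0))  # False
# The /3GB switch raises the per-process limit and saves that session:
print(session_ok(2.1, user_address_gb=3.0, ram_free_gb=4.0, pf_free_gb=16.0))  # True
```

The same check would have to be run for Windows itself against its kernel address space, which is how the /3GB switch can fix the app and break the OS at the same time.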
And of course all those numbers are actually changing thousands of times a second, so you can go for weeks without a hitch, or perhaps with a slow down due to thrashing (paging memory a lot because you don't have enough RAM) and then suddenly crash because some combination of things pushed you over a limit somewhere. And you can simultaneously be near the limit on addressable space for both Revit and Windows, as well as near the limit on physical RAM and/or Page File with a lot of apps open, and on any given day any one of those could cause your crash.
Oh, and that memory mapping of graphics RAM? Different drivers handle it differently. So a driver update or a different card could change the formula, require a little more kernel address space at certain times, and suddenly Windows is crashing, but only when Revit is running, and all because the graphics driver changed. But the trigger condition for the crash didn't occur till weeks after the driver update, and might actually be caused by the network card driver using more kernel address space when moving a big animation file, but only when the network isn't saturated and it can actually use a lot of bandwidth and it is configured to dynamically size its packets. What?!
Any doubts why IT folks would rather just get new 64 bit machines with lots of RAM and not have to troubleshoot that mess?
And for the purists, you wouldn't really use a term like 1.2Gb as a memory location, that just seems to make conceptual sense to people who just want to understand the gist. A real address would be 1024 rather than 1.0Gb, and would likely be expressed in hexadecimal not decimal.
Wow, that was fun. ;)
Gordon
eric.piotrowicz
2010-07-21, 09:37 PM
Nice one Gordon, You are the man!
Like several folks around here, I've got a pretty good handle on memory management and allocation, but the posts you have added over the past couple of days have taken that understanding to a whole new level, so I won't razz you about not calling 1.0GB by its proper address space of 400h.
Thanks :beer::beer::beer:
ron.sanpedro
2010-07-21, 09:59 PM
Nice one Gordon, You are the man!
Like several folks around here, I've got a pretty good handle on memory management and allocation, but the posts you have added over the past couple of days have taken that understanding to a whole new level, so I won't razz you about not calling 1.0GB by its proper address space of 400h.
Thanks :beer::beer::beer:
Thanks, I hope it is useful and no one comes along and points out some glaring misinformation.
I would have used Hex, but I don't have a base converter app on my iPhone! And to think, as a dork in 6th grade I could go between decimal, hex and octal with relative ease, and pull some binary out of my hat on occasion. Not any more, those brain cells didn't make it out of the Army. ;)
Gordon
patricks
2010-07-22, 01:29 PM
*busts out the Ye Olde TI-85 I've had since about 1996*
yeah 1 GB is actually 1,073,741,824 bytes or 40000000h in hex. Just checking. :mrgreen:
DoTheBIM
2010-07-22, 06:04 PM
Thanks so much for that Gordon. Much more than I thought I needed, but all of it very informative.
We are now on the 3GB switch as of 5 min ago (I've been running it about a half hour and it seems much better). So we shall see how it goes.
So it seems clear we were hitting our limit after working in Revit for about an hour or so. The warning from the worksharing monitor is now more understandable thanks to Gordon as well. Revit wanted to have 200MB of wiggle room when it had less, which also explains why we had little problem with STC: it must have had just enough wiggle room to get that task done.