Large Projects / Why is Revit so Slow?



jj mac
2009-08-21, 06:55 PM
First off I want to say that, regardless of this post's title, I love Revit. I have been using it for 4 years now and could never / would never move back to the CAD world - unless I decide to change careers...

With that out of the way, when working on large projects, why is Revit so slow???

What happens along the way in the course of a project, that makes things run so sluggish?

I have seen this happen in every office I have worked in, I have read countless posts about it, and I have tried purging, auditing, deleting elements manually, fixing errors... everything.

Nothing changes.

There need to be some guidelines that come from the top that are tried, tested, and true, based on results from a real project at the Factory. Autodesk marketing needs to stop spending its time on making money from the next release and fix the issues that we are all struggling through, so that we will continue to use this software in the future. It is the best investment Autodesk can make in Revit. We need you guys to work through the issues we are working through and experience this stuff first hand... If you have already done this - please show me the literature to prove it!!!

I will be the first to say that whatever is making our models run so slow is probably something that we did along the way, but we need to know what that thing is and how to avoid it!!! What are the best maintenance techniques besides the ones I have mentioned above!?

My main issue is that for all of our projects we need to utilize grouping functionality. Some files need to have 50 - 75+ groups in them, just to make editing and managing the file as easy as possible. This should be easy for the software and computer to manage and process, especially with technology being the way it is now!!! In some of our files I can see the groups going wayyy above 75+ groups. But we have to wait for so long for things to change.

My example here is from a building that has 46 storeys and has one group that contains the typical shell of the building, with 32 instances. When making changes to the group, while in the group editor, things work great. When it's time to finish the group or reload the group, we are waiting for upwards of an hour to see the change!!! And then it comes back with a ton of errors and wants to make new groups!!! This is completely unacceptable. I can't fight to keep this going if this is what I have to show - an hour to change a group. Everyone gets sold on Revit being so fast and so much easier to use. Time to prove it, Autodesk.

We all love this software, and we all want to grow with BIM as it develops, but we cannot do it with Revit unless something changes with performance. Every year I hope this will be fixed and it never gets fixed. It's time to fix it. FIX IT! Stop suppressing the technology! Let er go!!! ugggGG!

Captainkb
2009-08-21, 06:59 PM
What do you have for PC specs?

Hopefully you have the latest hardware. 64bit, quad core, 16GB RAM.

Revit is pushing the PC hardware limits. 3D is doing it.

46 story building? I would have the latest everything out there for pc hardware.

$7,000 - $10,000 per desktop computer?

cliff collins
2009-08-21, 07:36 PM
46 storey building:

1. Break it up into several linked Revit files:
-base/podium
-typical tower stories x-thru x
-upper stories x-thru-x
-tower cap
-Interiors Model(s)
-Structural model(s)
-MEP model(s)
-Site model
etc.

2. Hardware/software:
-64 bit OS
-64 bit Revit
-fast processor (multiple processors really do not have much impact, since only a few things like printing and wall joins are enabled for multi-threading in Revit's software code)
-lots of RAM--8 GB min.--the more the better. RAM is probably the biggest factor.

3. Enterprise quality hub/switch w/ gigabit speed, Cat 6 cabling, good ethernet cards.

4. Dedicated hard drive on server for Revit models

cheers.............

Scott D Davis
2009-08-21, 07:41 PM
Worksets to break the model up into chunks. Only check out what needs to be worked on for a particular task. Don't open the whole building file when you are placing toilet partitions on the first floor.

twiceroadsfool
2009-08-21, 07:44 PM
What they said. I'd be using Links for floor plates, not groups. That's just a wacky amount of stuff that can go wrong.

I'll admit, Revit gets a LITTLE sluggish when updating very large projects... But an hour to update 60 or fewer group instances sounds like you're doing something profoundly wrong with those groups...

The errors also sound like elements in those groups have ties to outside elements, which isn't going to play nicely.

jj mac
2009-08-21, 08:37 PM
Yeah, I think it's pretty obvious that something has gone wrong... I want to know what the typical reasons for that are. How can these things happen? Is it groups / families / links / errors?

I admit the approach on this file will have to be "updated", and things will need to be broken up, but now we are going to have to deal with the file management issues of linking, copy/monitor and tagging through links when it comes time to annotate the model.

It shouldn't be this way. We should be able to have the whole model in one file, if we want...

Thanks for all the responses and advice, everyone. It is appreciated.

truevis
2009-08-22, 03:56 PM
I'd avoid using 'based' families as much as possible.

Also the workset/linking tricks that others have mentioned. Once you've used worksets to maximize performance and things are still slow, then you have to start breaking the project into links.

How many people do you have working on one RVT, JJ James?

dbaldacchino
2009-08-22, 04:37 PM
Always make sure you're modeling as light-weight as possible (if you don't need 3D families, use symbolic, 2D ones with perhaps just some rough 3D shapes for collision detection etc.). Also, in my experience and tests, splitting things onto worksets does not seem to improve performance at all. All you do is manage your memory better, which, if you have plenty of it, becomes a moot point.

A little background on my claim... I took a 200MB job that was slow to place a wall in. I opened the central and chose to specify worksets. I closed EVERY user-defined workset so the model was blank. I created a new workset and clicked to place a wall. In theory it should have been faster, right? Well, it was just as slow as if I had opened every workset.

Brian Myers
2009-08-22, 11:52 PM
Have you seen the Model Performance Technical note?

http://images.autodesk.com/adsk/files/revit_tech_note.pdf

AP23
2009-08-23, 06:52 AM
Always make sure you're modeling as light-weight as possible (if you don't need 3D families, use symbolic, 2D ones with perhaps just some rough 3D shapes for collision detection etc.).

While this may be the only solution, it truly goes against the whole concept of Revit. Especially now, when Autodesk is trying to get on the design-to-fabrication wagon, where a higher level of detail in modeling is needed, it's questionable whether Revit can even play a role.

Somehow, I wonder how Digital Project manages its highly detailed models while maintaining performance.

dbaldacchino
2009-08-23, 02:41 PM
I'm mainly talking about furniture families, plumbing fixtures, equipment, etc. In most cases you don't need to model the intricacies of those forms; you're not going to need fabrication data! I've seen plenty of models with keyboards and keys, computer screens, mice with wires, water spouts with knobs, etc. modeled to death, literally. All that is unnecessary for a project in most cases and in a large project it's the first thing that needs to go. Most fail to see that simplified equipment & furniture families can still get you to the finish line without compromising the information you need to gather from them (which room they're in, assembly code, count, cost, etc.).

I would also make sure there are no large images in the job that have been imported. You don't need anything more than a 100dpi greyscale to use as a background sketch to build your model from. If you're using other large scans for presentation purposes, then you need to start a new project for that purpose only (link your building into it and create sheets there with the imported large raster images).

It goes without saying that warnings have to be taken care of. This is the hardest thing for users to understand. Another peculiar speed issue we ran into lately was on a laptop. Most of the team is on desktops and the designer is on an M6300 (everyone is on Vista 64 with 8GB RAM). We got the model down to around 150MB (almost at end of CDs) from a bloated 210MB file after getting rid of rasters, purging and cleaning about 400 errors (2800 still remain!!). This 2009 project moves fast on all machines except this laptop. On an M6400 it flies too. Then we decided to try on another M6300 and it flies too. There's definitely something wrong with that laptop, perhaps the video hardware. All drivers are current. So in case you see an isolated performance issue, don't rule out a hardware problem and discount the user as "needy" or crazy ;)

barrie.sharp
2009-08-24, 09:04 AM
I do look at computer games, with textures and dynamic lighting producing some pretty impressive real-time graphics, and I wonder: why does Revit struggle to orbit smoothly with basic shadows switched on? Could Revit be streamlined so that we concentrate on workflow more than workarounds for performance?

I wholeheartedly agree. Revit has changed my life at the office, but it seems to make a meal of things no matter how powerful the PC. Also, when things are slow, I can't really see what's being maxed out. Is it really using the resources' full potential? Maybe Revit should include a benchmark/system monitor so that you can see where the bottlenecks are.

dbaldacchino
2009-08-24, 01:39 PM
I don't believe games would have the same number of faces and geometry that Revit has. They utilize a lot of textures to convey 3D objects. I would really like to see RPC elements extended to other categories such as furniture. That way I can have a 2D plan representation for planning purposes, and maybe even one in elevation/section, and a highly realistic 3D representation in renderings without the excessive geometrical baggage.

djn
2009-08-24, 03:09 PM
Somehow, I wonder how Digital Poject manages it's highly detailed models while maintaining performance.

They don't. We have a couple of full DP models, and we can't even open the full model live on some of our newest computers.

I also agree that not using 3D content shouldn't be the answer. Especially since Simpson now has some 3D connection models.

One issue that we have run into is that it's very hard to break up a model after it's already started.

Also, Scott, breaking the model up into worksets is somewhat of a myth. Sure, it may not take as long to get into the file, but it has no effect on placing or modifying elements in the model. Such as placing a beam.

barrie.sharp
2009-08-24, 03:38 PM
I would really like to see RPC elements extended to other categories such as furniture.

I liked the RPC Idea and I thought "wouldn't it be cool if you could select a family and convert it to RPC. It could render the views and compile them into an RPC object"

Granted, games have less to deal with but that is my point, streamlining would help.

That reminded me of another app I use. Cubase is resource hungry and loads a ton of samples into the memory ready for any permutation. They provided a freeze option that leaves behind what's needed and temporarily dumps the rest till you want to make changes to that part.

It seems to me that Revit is constantly checking the model and refreshing when all I'm doing is rotating the view. I guess that is the nature of a DB; Excel gets just as bad when you link complex workbooks, and graphics aren't even a factor. Not knocking Revit, in fact I sympathise. All I'm saying is "I want it all and I want it now!"

truevis
2009-08-24, 03:42 PM
...Also Scott breaking the model up into worksets is somewhat of a myth. ...
It's more the non-opening of worksets upon open that helps performance (rather than just putting stuff into worksets).

djn
2009-08-24, 05:21 PM
It's more the non-opening of worksets upon open that helps performance (rather than just putting stuff into worksets).


The results we found were that it took the same amount of time for me to place a beam or modify a beam in a model with all the worksets loaded as it did with no worksets loaded.

ldrago
2009-08-24, 06:05 PM
I have experienced it where people have modeled every possible thing that can be modeled, down to the nuts and bolts holding the building together. Personally I do not see a reason for that. If the component is not going to be counted or put into a schedule, why does it need to be 3D modeled? Why do you need to model a 1/8" caulk joint in a window when it will never be seen at a 1/8" scale?
There needs to be a balance. I know of a firm that was on the cutting edge using Revit in their area, but they didn't control how people were using it and had models over 300MB. Now the computers they have can't handle the larger models, so they have made the decision to abandon Revit and go back to AutoCAD. How sad is that to hear, when it was possible to avoid it.

djn
2009-08-24, 06:29 PM
I think that Revit sends a mixed signal. They make it very easy to create parametric 3D content such as base plates with anchor rods, but we aren't supposed to put these in the model because it will bog down the project. Another example is the stair tool: why does it have options for nosing length, nosing profiles, riser to tread connections, etc.? If you went by the 1/8" rule you could never see these details, so why are they there? And I bet people would be upset if they removed these.

swalton240189
2009-08-24, 11:00 PM
...Another peculiar speed issue we ran into lately was on a laptop. Most of the team is on desktops and the designer is on an M6300 (everyone is on Vista 64 with 8GB RAM). We got the model down to around 150MB (almost at end of CDs) from a bloated 210MB file after getting rid of rasters, purging and cleaning about 400 errors (2800 still remain!!). This 2009 project moves fast on all machines except this laptop. On an M6400 it flies too. Then we decided to try on another M6300 and it flies too. There's definitely something wrong with that laptop, perhaps the video hardware. All drivers are current. So in case you see an isolated performance issue, don't rule out a hardware problem and discount the user as "needy" or crazy ;)

Projects are only as fast as the slowest computer, because of all the element borrowing that goes on, so avoid having people on slow computers.

Also if you are in an AE firm don't link directly to the other disciplines central files. Their STC's will slow you down and vice versa. We save detached copies twice a week or as needed.

Fresh local copies every morning for everyone!
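The "fresh local copy every morning" routine above can be automated with a small script. This is only a sketch: the paths and the naming convention are hypothetical examples, and (for 2009-era Revit at least) the copy would still be opened in Revit so it is recognized as a local of the central file.

```python
import shutil
import getpass
from pathlib import Path

def make_fresh_local(central_path, local_dir):
    """Copy the central model to a per-user local file, overwriting
    yesterday's copy. All paths here are hypothetical examples."""
    central = Path(central_path)
    local = Path(local_dir) / "{}_{}{}".format(
        central.stem, getpass.getuser(), central.suffix)
    shutil.copy2(central, local)  # copy2 also preserves timestamps
    return local

# e.g. make_fresh_local(r"\\server\projects\Tower_Central.rvt", r"C:\RevitLocal")
```

Run as a login script, this guarantees everyone starts the day on a fresh local rather than a stale one.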

jj mac
2009-08-25, 01:23 PM
Have you seen the Model Performance Technical note?

http://images.autodesk.com/adsk/files/revit_tech_note.pdf

Dilbert, thanks for posting this. Hopefully everyone knows about this document. I had not seen it yet for 2010, but was aware of it for 2009. We also have literature from AU which supports basically everything in this document.

My only issue with this is that, in my opinion, these are still workarounds at the end of the day. The advice from everyone here is overwhelmingly appreciated, but we need this software to perform without having to exercise these workarounds.

All of us here are really pushing the envelope of what we can, and want, to do with Revit, and I think it can perform and compute much faster than it does. My question is: why does it not do this already?

To me, Revit is like the lion that does not yet know its real power or strength.

For now though, we will keep pushing, working through and evolving as we go, and hopefully there won't be too many more days like the one when this was originally posted. As a colleague of mine says... "at the end of the day it's 100 times better than CAD or ADT"... but it's time to make it even better.

twiceroadsfool
2009-08-25, 02:23 PM
FWIW, a few things:

1. Most of what they recommend is not a workaround, nor are the practices dictated by the other users here. Anyone who's under the misconception that they should be able to open a 50 storey building and walk around in a Call Of Duty-esque fashion will be disappointed, but Revit models aren't surface models, and there's a ton of baggage with them.

2. Not to discredit the effort of the Factory, but think LONG AND HARD about the information in the Technical Performance note. It is written and predicated upon the notion of simply making computers perform better. That means that at the end of the day, some of those techniques are NOT the best for your practice. Avoiding groups altogether will make the computer move slightly faster, but it means you'll have to change 100 instances of storefronts instead. Never using "group and associate" when arraying will mean things move faster, but when a designer wants to respace windows you'll be sorry. Using worksets set to be "not visible by default in all views" may speed up regen time in views with worksets off, but God help you if you collaborate and send that file to consultants for linking (they'll never see those worksets). EDIT: I see in the new version of this document they have changed this section to reflect this shortcoming. Good job, guys. 2D plan and elevation representation families are great for model performance, as long as you have everyone on board that they can't ask for 3D views of the space, or 3D clash detection. Swapping and alternating families is a possibility, and the model moves faster... it's just more of your time.

3. Any large project needs a serious Plan in place. The first day of modeling should take place at a conference room, sans-computer. Worksets, where the model will be Linked Files, where and how the content will be managed, etc, ALL of this needs to be discussed before even starting a project that large.

Groups ALONE aren't a performance killer... I've got 50-100 group definitions in some very large projects. AND file links. AND hundreds of views. And on and on. But with all of these things together, if it's not managed and controlled appropriately, then no matter what Autodesk does, it'll run like a dog...

dbaldacchino
2009-08-25, 02:37 PM
You need to ask yourself the question: what can I control TODAY? We can all waste countless hours bitching and moaning about the software needing to catch up to the hardware, etc. etc. But that doesn't get us anywhere. We have what we have and need to do the best we can by following known best practices and be lean in our approach. Also ask yourself the question: what is my GOAL? What are you trying to achieve? Set those expectations and make the model do what you need it to do in order to meet those goals and not the other way round. Statements such as "but I can model everything, so why shouldn't I be able to do that?" are counterproductive. So if I can drive a car over a bunch of pedestrians, does it mean I should? Ok, bad example but you get the point :D

twiceroadsfool
2009-08-25, 02:38 PM
I'm never getting in the car with you, Dave. Ever. :)

DaveP
2009-08-25, 02:58 PM
I'm never getting in the car with you, Dave. Ever. :)
Heck, look at his avatar.
No wonder he runs people over! He's not even looking where he's going.
As opposed to his Revitting, I'm sure, where he knows where he's going all the time.

jj mac
2009-08-25, 03:15 PM
Aaron and Dave,

You both have great points here. The technical note defines standard management practices that (regardless of whether or not a large model runs as fluidly as "Call of Duty") should in fact be followed to maintain optimal performance. But it doesn't answer the original question.

I just want to know where the bottleneck comes from. I want to know technically, in computer terms, why it takes so long to process that amount of information, and I want to know whether Revit IS or IS NOT taking full advantage of the computer's resources (and I think we can all agree that it is not).

Considering how powerful our computers are (3.4 GHz CPUs, 8 GB RAM minimum, most have 12, high end GPUs, and SCSI hard drives), and considering the cost of the hardware and software combined just to be able to produce a large model, you expect a certain level of performance. It's like a $10k - $15k investment...

Planning from the get go is the absolute first thing that happens here before anything is modeled. Projects are broken up into links - structure model, architecture model, shell model - depending on the project. Worksets are utilized and managed to the fullest... Warnings probably need to be more regularly reviewed, which is something I am taking away from this thread, but we are following these guidelines as best we can.

The reason I say these are workarounds is because they still don't fully solve the problem. We're doing everything we can, but it can be a lot better. I think everyone here can admit that; and if not, then I guess it's just my opinion.

From my experience, what this comes down to at the end of the day, and this is also my opinion, is that some improvements are needed in the core programming of the software to make it better.

cliff collins
2009-08-25, 03:42 PM
Follow this rule of thumb:

400MB Revit file requires a minimum of 8GB of free RAM

So, a 200MB file requires 4GB of free RAM
Therefore, 32 bit systems will not have enough memory to open or save central files
or local copies. 64 bit systems, with a minimum of 8GB of RAM, are required when file size exceeds 200MB.
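That rule of thumb (roughly 20 MB of free RAM per 1 MB of Revit file, as also quoted later in this thread from an AU technical note) can be sketched as a quick back-of-the-envelope calculation. The 20x multiplier is the thread's estimate, not an official Autodesk figure:

```python
def required_ram_gb(file_size_mb, multiplier=20):
    """Estimate the free RAM (in GB) needed to work on a Revit file,
    using the rough 20x rule of thumb quoted in this thread."""
    return file_size_mb * multiplier / 1000  # e.g. 8000 MB -> 8 GB

# A 400MB central file needs about 8GB free; a 200MB file about 4GB
print(required_ram_gb(400))  # 8.0
print(required_ram_gb(200))  # 4.0
```

With a 32 bit OS capped at roughly 3-4GB of addressable memory per process, this is why anything much over 200MB pushes you onto 64 bit.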

Multi-core processors are just now coming into play with RAC 2010.
Only certain functions, such as printing and wall joins, take advantage of multi-processors.
(Rendering will take advantage of up to 4 cores.) The wall join feature has to be enabled,
and does not function out-of-the-box--as it is set by default not to be active.

The faster the clock speed of the CPU the better.

So--RAM is probably the most critical factor for performance with large projects.
Break up the project into smaller linked Revit models--i.e. 100MB or less if possible.

The LAN or WAN should also be scrutinized--gigabit speed? Cat 6 cabling?
high quality hub/switch? Fast hard drive on server? etc.

cheers........

twiceroadsfool
2009-08-25, 03:58 PM
Aaron and Dave,

You both have great points here. The technical note defines standard management practices to follow (that regardless of whether or not a large model runs as fluently as "Call of duty") should in fact be followed to maintain optimal performance. But it doesn't answer the original question.


As I said, I'd be careful applying everything in that technical note just because they say it's best for your machine. It'll kick you in the nuts in other areas, like managing your project team and your architectural process.


I just want to know where the bottle neck comes from. I want to know technically, in computer terms, why it takes so long to process that amount of information, and I want to know if in fact Revit IS or IS NOT taking full advantage of the computers resources (which I think we can agree we all know that it is not).


It obviously isn't, as you've declared. But it's worth mentioning that last year there was a thread where a bunch of us broadly clamored "we want multi-core capable Revit, dang it!" and someone from the Factory chimed in with an important revision to the statement: we don't want it to JUST use more hardware, we want it to perform better. Case in point: ArchiCAD. The crux of the discussion last year was that ArchiCAD was getting multi-core processing in their current release, and view regeneration time was improving from 3-5 minutes to about 20 seconds, or something. So stew on that. WITH "more efficient" computing processing, they're still waiting 20 seconds or something, every time they click on a Section. Now, I haven't personally used ArchiCAD since v8 or v9, I don't remember which. But it was sloooooooow. I know none of this addresses your question, but I'm just qualifying your statement: I don't want us to simply say to Autodesk "use more of our computers," I want us to say "make the program perform better, however you can." The former might get us a half baked multi-core RAM-sucking program that's marginally faster, but then cripples our new computers, LOL.



Considering how powerful our computers are ((3.4 GHz CPU's, 8 GB Ram (minimum, most have 12), high end GPU's, and SCSI hard drives )) and considering what the cost of what the hardware and software is combined, just to be able produce a large model, you expect a certain level of performance. It's like a $10k - $15k investment...


Okay, but WHAT is that "expected level of performance?" As we've all outlined in the last few pages of this thread, most of the problems you're having with the one model in question seem to be user-created, or things that need methodologies revised in house. (I'm not criticizing, I'm just saying.) So can we quantify what the expected level of performance is?



Planning from the get go is the absolute first thing that happens here before anything is modeled. Projects, are broken up in to links - Structure model, architecture model, shell model - depends on the project. Worksets are utilized and managed to full... Warnings probably need to be more regularly reviewed which is something I am taking in from this thread, but we are following these guide lines as best we can.


Me personally, I wouldn't have Arch in only one model on a project that size. I know some here DO do it that way, but they also have serious hardware above and beyond what I've got here. No way would I have anything close to that size in one model.

And clearing out warnings is PARAMOUNT regarding performance and sluggishness in Revit. I bit the bullet and cleaned out a ton of warnings from a model, and it started moving ridiculously faster, AND the file size dropped considerably. Plus, users who were experiencing failed SWC's (STC's back in the day) stopped experiencing it. This can't be stressed enough; it's way more important than a purge / cleanout. WAY more.

I'm not saying the program DOESN'T need advancement, but it's tough to ask more of it when we're not really doing everything we could.

Rick Moore
2009-08-25, 03:59 PM
Follow this rule of thumb:

400MB Revit file requires a minimum of 8GB of free RAM

So, a 200MB file requires 4GB of free RAM
Therefore, 32 bit systems will not have enough memory to open or save central files
or local copies. 64 bit systems, with a minimum of 8GB of RAM, are required when file size exceeds 200MB.


Out of curiosity, what did you do before last year or so?

twiceroadsfool
2009-08-25, 04:03 PM
Out of curiosity, what did you do before last year or so?

Ran WinXP32, with 4 gigs of RAM and the /3GB switch. Used linked files everywhere we needed to, so that editing and working on the projects was doable.

Then when it was time to print or export, we had an assache to deal with. Sometimes we had to restart Revit/the machine twice to get a set of drawings out, LOL. Or we would have to plot/print in small selection sets on different machines, as they would run out of memory trying to plot 70 sheets with all the linked models loaded.

jj mac
2009-08-25, 04:05 PM
Thanks Cliff,

Looks like we probably need to push for more RAM, eh? From a technical note I have seen from AU, if you have a 400MB file, you can multiply that number by 20 to get your required RAM. So 400MB x 20 = 8GB (8000MB)...

From what you are saying, should we further multiply that number by 2, so that that is the amount of free RAM we have available?

Please see attached file - This came from a seminar at AU2008...

cliff collins
2009-08-25, 04:08 PM
We ran headlong into this problem with 32 bit memory limitations,
and began upgrading all machines to 64 bit OS and software. No other choice.

In the interim, we would use Groups to "break off" pieces of the model and save them out to new Revit projects, work on them in 32 bit systems, and then "bind" them back into the model.

That actually works fairly well, if planned and executed properly!

cheers........

luigi
2009-08-25, 04:17 PM
just make the project smaller... ok...not a funny joke!

jj mac
2009-08-25, 04:23 PM
Okay, but WHAT is that "expected level of performance?" As weve all outlined in the last few pages of this thread, most of the problems youre having with the one model in question seem to be user-created, or things that need methodologies revised in house. (Im not criticizing, im just saying). So can we quantify what the level of expectation of performance is?


I guess by "expected level of performance" (and I realize I am opening up a whole new can of worms here) I mean that when the boss is working through some design change with a user, he doesn't say "why is it taking so long" anymore. That would be part of it.

Another thing would be, instead of a large amount of groups taking say... 20 min to update, maybe it only took 5 min... that would be good. WITHOUT having to break up the model into links and then deal with the management issues of the links when it comes time for CD's...

I think it's acceptable to wait 5-10 min for a large group to update, but when it's much longer, or gets to the point where you have a deadline, Revit is not responding, and you end the processes of everything possible except Revit and it still takes time to finish, that is unacceptable.

To clearly measure this, however, I agree is extremely difficult. I think Autodesk needs to set additional guidelines for how to manage large projects in a "managing large projects" technical note (if this also already exists, outside what is available from AU, please let me know), or maybe offer some courses on how to do so. This is a huge part of the process in transitioning to Revit. Now the investment in CPUs and software also requires massive training and ongoing desktop support to keep all the troops running.

twiceroadsfool
2009-08-25, 05:31 PM
Well, as mentioned, I think you're going to find that most people WOULDN'T use groups for what you're using them for. I've never had a group take longer than 4 or 5 minutes to update, but then again I would never try building an entire floor plate out of a group.

I considered doing it once, and someone here showed me I was better off using Links for floor plates, and once I looked at what they were describing, I've never looked back. FWIW, we've never had an issue in CD's with the links, once we went through and ironed out the procedure. With anywhere from 2-7 people in the model at any given time. The model was hellishly faster, too.

But to say "I ONLY want to use the tools I want to use (groups), and I want them to work faster (as fast as the appropriate tools)," isn't very practical. A group that large, and repeated that many times, is always going to take forever.

As for the designer complaining things take too long, how much have they been introduced to Revit? I don't expect designers and PM's to understand how to model every wall/railing/ceiling/whatever, but basic training and understanding of concepts is fundamental to their understanding of how long tasks will take the end users. If they're not trained, they're likely to think it's a simple push/pull, and that's not fair to anyone, them or you. If what they're waiting for is groups to update... well, see above. :)

If you search around here, the subject of obscenely large projects has come up a number of times. I think you'll get better advice and documents from the people here than you will from the white papers. But I know there are a few docs floating around from various places on large projects in Revit.

I'll rest my case, since you seem pretty adamant about continuing with the groups. Good luck finding your answers. :)

jj mac
2009-08-25, 06:02 PM
Well, as mentioned, I think you're going to find that most people wouldn't use groups for what you're using them for. I've never had a group take longer than 4 or 5 minutes to update, but then again I would never try building an entire floor plate out of a group.

I considered doing it once, and someone here showed me I was better off using links for floor plates, and once I looked at what they were describing, I've never looked back. FWIW, we've never had an issue in CDs with the links once we went through and ironed out the procedure, with anywhere from 2-7 people in the model at any given time. The model was hellishly faster, too.



Aaron,

If you have a post on this procedure, or any further information, I would love to see it. I am not opposed to using links for floor plates instead of groups, but I am sceptical.

We have been looking at developing some guidelines for general practice on using groups. They are such a powerful tool for creating relationships, but can get messy very quickly, e.g. groups within groups. I have been trying to totally shut that down at our office unless absolutely required. I think the groups we are using are relatively clean - basically just curtain wall, with mullions and panels/doors/windows - but grouped as a floor plate.

I always thought this was a relatively safe practice and a generally obvious way of defining repetition in a model/design.

If it's not asking too much, any advice you have would be really appreciated.

dgreen.49364
2009-08-25, 06:14 PM
I don't know if this helps or not, but I haven't seen anybody mention it. We recently did a hotel tower. As far as the rooms and the floor plans, I only modeled the rooms I needed to. There were 5 different floor plans. Those levels only, got modeled. I did model the elevator and stair cores throughout and I modeled the rooms I needed to where I had sections cutting thru the tower. I don't know if you did anything like that or not, but if not, that would have cut down a lot on file size. For groups, we grouped the typical rooms. We didn't group entire floors.

twiceroadsfool
2009-08-25, 06:31 PM
jj-

It's not all-encompassing, but here is the document I hand out to people so that they can start to think about how they want things organized. Keep in mind: the TERMS used in this largely refer to linked models for consultant models, but you'll also see in the screenshots that for the project in question, we had 7 architectural models linked together.

Now, this was a campus-style venture, so we parsed it into buildings. But I'll attach a hand sketch of what I've done on taller buildings. Mind you, on the project at hand I kept the core/skin in one model, but the exact premise of what was done with the interior floor plates can be done with the skin as well. It just wasn't necessary or value-adding for us on this project. I wrote "not for core/ext/skin" just for that specific project.

Also worth mentioning: I put the repetitive copies of the links on a separate workset. Some people wouldn't agree with why, but this is the gist of it: I don't let users go into the Manage Links dialog. If they do, then they load/unload something, and every time they and other users SWC, the links keep unloading/loading. By doing this with the worksets, they can unload the links by selectively loading worksets, alleviating the hardships on the systems.

Is this the only way to do it? Absolutely not. But links aren't bad, especially post-2009 release with the addition of linked view elevations and linked view sections, and view template selective control over a view. VERY nice tools. Look into it. It *IS* a LITTLE more management... but I've never waited an hour for ANYTHING to update in my model, besides a rendering. LOL

ededios
2009-08-25, 06:43 PM
...There were 5 different floor plans. Those levels only, got modeled...

Would you explain what you mean by that, I'm not sure I understand.

Did you have linked files or was all this in one file?

Thanks!

jj mac
2009-08-25, 06:53 PM
There were 5 different floor plans. Those levels only, got modeled.


How did you create your elevations, if you only modeled the floors you needed to?

jj mac
2009-08-25, 07:07 PM
jj-

It's not all-encompassing, but here is the document I hand out to people so that they can start to think about how they want things organized.


Thanks Aaron. I look forward to reading this. I'll post how we decided to go when we figure it out.

dbaldacchino
2009-08-25, 07:21 PM
DaveP, I was at a stop light. I never ran over anyone (yet) haha. Aaron, I'm not riding with you either; you have a history ;)

Good comments everyone. Luigi, that is an excellent suggestion! It can be taken as a joke or a serious point (meaning, make things leaner and light-weight...model only what's really needed for representation, coordination, take-offs, etc.)

twiceroadsfool
2009-08-25, 07:35 PM
DaveP, I was at a stop light. I never ran over anyone (yet) haha. Aaron, I'm not riding with you either; you have a history ;)



Ouuuuuuuuuuuuch. I doubt it's a good history, depending on where you heard it. Hahahahaha.

JJ- Keep us posted. :)

dgreen.49364
2009-08-25, 08:10 PM
To clarify...a 15-story hotel tower. Level 1 is unique, 2-12 are identical, and 13, 14, and 15 are all unique. Those plans got modeled. If levels 2 thru 12 are identical, why take up all that model space? The exterior walls were modeled, along with the elevator and stair cores. Elevations are correct because the exterior walls were modeled. Where sections were cut thru the tower, those rooms were modeled.

ededios
2009-08-25, 09:10 PM
...If levels 2 thru 12 are identical, why take up all that model space? The exterior walls were modeled, along with the elevator and stair cores. Elevations are correct because the exterior walls were modeled. Where sections were cut thru the tower, those rooms were modeled.

So for levels 2-12 (interior), only one level had the full floor modeled (partitions, casework, plumbing, etc.), and for levels 3-12 the only areas with any information are the elevator core, stair core, and any areas where a section cuts through?

So you were able to keep it as one file and not break it up into separate linked files?

Thanks (again)

dgreen.49364
2009-08-25, 09:40 PM
In this particular case, there was the high rise and the low rise, which were drawn as 2 separate bid packages, by request. The high rise was one file, with the low rise linked in for sections and elevations, and the structural steel was a linked file. Vice versa on the low-rise package: the high rise was linked into that model.

sbrown
2009-08-26, 03:22 PM
Aaron, in your graphic example, are the exterior walls part of the group or just the interior? I.e., do you have basically a core and shell model with the interior floors linked as you show?

twiceroadsfool
2009-08-26, 03:35 PM
In my sketch there are no groups, only Links and Worksets.

But for THAT SPECIFIC project, the core/shell was a separate model with the floors linked in. We did that for a few reasons.

1. The floor plates were virtually identical at the three distinct "areas," save for sizes of mechanical chases, etc. For those items, I use design options in the linked file to show the different chase sizes, etc., then use view templates to manage the design options of the linked files.

2. The exterior of the building had elements that were more easily built as one entity, rather than as something tied to each floor plate (the vertical pieces that stop at different heights are modeled that way). You COULD manage this as well with design options in the links, but it had very little ROI for doing so, as the geometry on this particular building is obviously pretty simple.

We used groups for repetitive assemblies of curtain wall on the shell, and for the individual units in the linked files (those groups were saved out to the project directory and edited and reloaded into the floor plates from there).

I'm not under any misconception that it's a perfect setup, but it's worked great for me, and I never had to wait more than a few minutes for anything to update. I'd rather deal with managing the links than deal with the inconsistencies and problematic nature of large groups (failing to let you finish, ungrouping inconsistencies, etc.).

sbrown
2009-08-26, 08:47 PM
I think it's a great solution. How are you treating your room schedules? Are you using the core model as room bounding, or adding room separation lines to your linked floor (interior) models?

twiceroadsfool
2009-08-26, 09:14 PM
The rooms that are IN the core are in the core model (service rooms, elevator mechanicals, stair towers, etc.). The suite rooms are all in the linked floors/interiors models, and linked into the schedule. We actually name the instance of the link after the floor level, and use that field in the schedule. So you end up with two fields, e.g.:

Level   Room
2       02
3       02

That bothered some project managers (because the room number is "02"), but once they saw the benefits of the schedules working in their favor, they never looked back.
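The two-field idea above can be sketched with a small, generic example (plain Python, not the Revit API; the floor and room values are made up) showing why repeating room numbers stay unambiguous once the link-instance name carries the level:

```python
# Toy illustration (not the Revit API): room numbers repeat on every floor,
# but pairing each one with the link-instance name (the floor level) keeps
# every schedule row unique.
rooms_per_floor = ["01", "02", "03"]   # same room numbers reused on each level
floors = ["2", "3", "4"]               # link instances named after their levels

schedule_rows = [(level, room) for level in floors for room in rooms_per_floor]

# Room "02" appears once per floor...
count_02 = sum(1 for _, room in schedule_rows if room == "02")
assert count_02 == len(floors)

# ...yet every (level, room) pair is distinct, so the schedule stays unambiguous.
assert len(schedule_rows) == len(set(schedule_rows))
```

The same principle is what makes the two-column schedule work: neither field is unique on its own, but the pair is.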

For THIS project, we went the route of (as you said) using room separation lines instead of having the linked models be room bounding. I would obviously prefer to have the linked models set to room bounding, but there were two things: 1. When I did this project we were all on 32-bit workstations, some people with less than 3 GB of RAM. I didn't want them to have to keep loading the workset with the link, just to not have the rooms be all screwed up when they were working. 2. It was a fairly simplistic building shell and core, so even though it was *unBIM and unRevit* to use the room separation lines, I didn't lose sleep over it because I could replace all the room separation lines in under an hour if need be. Now, I don't actually have any hard data to say linked files being room bounding would be an additional hardship on the hardware, but my butt-dyno said it would be, so I reacted to that. I was tired of trying to push hard with subpar P4s, LOL.

Ideally, I'd set the links to be room bounding, but would still be meticulous about putting the rooms in the appropriate file. For instance, I would still put the suite rooms in the linked interiors, the cores in the core, etc. I know that with linked files being room bounding, some people will put *all of the rooms* in the model with the documentation, but I don't like mixing and matching what is what, so the rooms go in the same file as the building area, IMHO. The corridor between the core and the rooms I put in the linked interiors files as well (in this case with room separators at the core, but I would do it again with linked-model bounding).

twiceroadsfool
2009-08-26, 09:29 PM
Oh, there is ONE BIG THING that is a bit of a pain in the behind with this setup, I won't lie:

View Templates.

I like using design options for the minor differences in the repetitive floors. E.g. floors 2-10 may be the same, with minor variations in chase sizes, clearances if structure gets smaller, etc. So I'll use the same linked file for them, with design options.

Then, when it gets brought into the main model, you take the time to set up ONE view with all of the correct design option selections for the linked files in VG: RVT Links. Then make a VT out of it (one that applies only to VG: RVT Links) and apply it to every view in the project.

Then you need to make a separate VT for schedules, as it won't let you apply the other one to schedules for some reason.

Here's the kicker, though. You just need to be careful that no one else on the team makes a VT affecting other linked files that DOESN'T have the above VT as a base. Actually, on that note: the entire project should be set up to use that VT as a default for all new views. That, or explain to the project team that they need to apply it to all new views right away. Because if someone cuts a new section mark (like a "working" section, or something) and doesn't apply that VT, they could be looking at 15 instances of the wrong design option. It was a small price to pay for me, but if your users aren't versed in view templates and file links, you'll want to post reminders. (I kept one on the STC drafting view...)