A while back, I conducted extensive research into Revit content management tools. I was commissioned by Unifi to do this, and I told the story of the process over here. You can also watch the related webinar here. Over the last couple of years, some of you have approached me to gain access to the master Excel comparison matrix document that I produced. Recently, Jay Merlan updated this document on behalf of Unifi and it has now been approved for public release!
The document is very detailed and consists of a number of key sections:
Matrix – where data is entered and initial scores are calculated. This includes a ‘feature weight’ where you can allocate how important a given feature is to you personally.
Screencasts – links to the actual tests undertaken
Test Results – summary sheet
Cost data – a series of sheets for comparing and calculating the overall cost of each content management system
Summary Pivot Tables and Charts
Overall Summary Chart
As it is an Excel document using Formulas and Pivot Tables, it could be a very powerful starting point for you to dig in and investigate the various features of Revit and BIM Content Management systems and Content Providers. I hope you find it useful!
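The 'feature weight' idea in the Matrix sheet is just a weighted average: each raw score is multiplied by how much you care about that feature. Here is a minimal Python sketch of that calculation; the feature names, weights and scores are invented for illustration, not taken from the actual matrix:

```python
# Hypothetical weighted scoring, mimicking the matrix's 'feature weight' column.
# Feature names, weights, and raw scores below are illustrative only.
def weighted_score(scores, weights):
    """Return the weight-adjusted total as a percentage of the maximum possible."""
    total = sum(scores[f] * weights[f] for f in scores)
    max_total = sum(5 * weights[f] for f in weights)  # raw scores assumed 0-5
    return round(100.0 * total / max_total, 1)

weights = {"batch upload": 3, "version control": 5, "search": 4}
scores = {"batch upload": 4, "version control": 3, "search": 5}

print(weighted_score(scores, weights))  # prints 78.3
```

Cranking the weight on a feature you rely on daily will swing the overall ranking, which is exactly the point of the weighting column.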
Feel free to comment here with any of your thoughts, or if you have any questions about the document and how it works.
I have been resisting the 2019.2 update because there were some issues with it originally, plus the forced upgrade of Dynamo. I decided to go ahead with the Revit 2019.2.1 Hotfix today. I think that the forced Dynamo update occurred with the 2019.2 major point version, but 2019.2.1 seems to install without forcing the Dynamo upgrade – on my machine, Dynamo was not updated during the 2019.2.1 hotfix installation. Here are the links:
If Revit 2019 and Revit LT 2019 are installed side-by-side and the 2019.2.1 Hotfix is only applied to one of these products, a “Could not load type ‘Autodesk.Revit.DB.ICloudExternalServer’ from assembly” error will be displayed when launching the non-updated product. To alleviate this error, make sure to apply the 2019.2.1 Hotfix to both Revit and Revit LT in side-by-side configurations.
Harlan Brumm recently tweeted about updates to Revit:
so you might have noticed, we released a number of fixes for Revit versions. These are important fixes to address security issues and to ensure your product keeps working as you expect. Check out the read-mes for more details. https://t.co/rLGfi9aihM – this will be updated too
I’m sure you are aware that intellectually Revit shared coordinates take minutes to explain, but emotionally they take years to master 🙂
I’ve been looking for a way to check and validate coordinates using the Revit API. One method I implemented in VirtualBuiltApp is to gather Grid Intersection coordinates and compare those, but obviously you need a federated model with links to achieve that comparison.
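The grid-intersection idea boils down to comparing two point sets within a survey tolerance. Here is a plain-Python sketch of that comparison, independent of the Revit API; the point data and tolerance value are made up for illustration:

```python
# Compare two sets of grid intersection points within a tolerance.
# Point data is illustrative; in practice it would be harvested via the Revit API.
TOLERANCE = 0.001  # project units

def points_match(points_a, points_b, tol=TOLERANCE):
    """True if every point in A has a counterpart in B within tolerance."""
    if len(points_a) != len(points_b):
        return False
    return all(
        any(all(abs(a - b) <= tol for a, b in zip(pa, pb)) for pb in points_b)
        for pa in points_a
    )

host = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (0.0, 50.0, 0.0)]
link = [(0.0, 0.0, 0.0), (100.0005, 0.0, 0.0), (0.0, 50.0, 0.0)]
print(points_match(host, link))  # prints True - tiny survey noise still matches
```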
One interesting fact to note is this:
two Revit models can report functionally identical shared coordinates (same translation and true north rotation), and you can still receive “The host model and the link do not share the same coordinate system. Default center-to-center positioning will be used”. #sadface #why-revit-why
If we put this another way:
if two models don’t have some related history (created from the same file), or
if Acquire or Publish Coordinates has not occurred between those models, then
the Shared Coordinate error will appear — even if they report identical Spot Coordinates and True North Rotation
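One way to think about this: the shared-coordinates relationship is an identity link between files, not a numeric comparison. Here is a hypothetical model of that distinction; the GUIDs and coordinate values are invented for illustration:

```python
# Two 'models' reporting identical shared coordinates, but with no
# Acquire/Publish history between them (different relationship GUIDs).
host = {"east_west": 1000.0, "north_south": 2000.0, "angle_to_true_north": 30.0,
        "coord_guid": "aaaa-1111"}
link = {"east_west": 1000.0, "north_south": 2000.0, "angle_to_true_north": 30.0,
        "coord_guid": "bbbb-2222"}

def reports_same_coords(a, b):
    """Value equality: what Spot Coordinates and True North rotation show you."""
    keys = ("east_west", "north_south", "angle_to_true_north")
    return all(abs(a[k] - b[k]) < 1e-9 for k in keys)

def shares_coordinate_system(a, b):
    """Identity relationship: (roughly) what Revit actually checks at link time."""
    return a["coord_guid"] == b["coord_guid"]

print(reports_same_coords(host, link))       # prints True - the values match
print(shares_coordinate_system(host, link))  # prints False - no shared history
```

So identical reported values are necessary but not sufficient; without the Acquire/Publish relationship, Revit still falls back to center-to-center positioning.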
If you are wondering what the Revit API actually does support in terms of Shared Coordinate setup and validation, here is the best bit of Revit API Shared Coordinates information I can share:
A GUID-based relationship is set up between the files. Setting up the same relationship has been possible through the API, via Document.AcquireCoordinates(), for a few releases.
With 2018’s SiteLocation.IsCompatibleWith() it is also possible to identify if two coordinate systems are the same.
If you have Desktop Connector installed, you probably realise you can ‘upload’ Revit models and other files to BIM 360 Docs by dragging and dropping to the folder in Windows Explorer (using Desktop Connector). However, when you try and link this using the BIM 360 shortcut in Revit, you might not be able to see the file…
Here is a workaround that may allow you to link a non-initiated Revit model into your Revit file:
Ensure you have Autodesk Desktop Connector installed
Start Link Revit command from the ribbon
Update: Click on the Address drop down
Click on This PC
Browse to BIM 360 from the window below:
Select the non-initiated file that you want to link
You should get the BIM 360 prefix in Manage Links:
Here is a video of this process (with audio, This PC – BIM 360 workflow):
In the new homepage for #Revit 2019.1 when you hover over a recent project, you get the path of the central, your local, and the file size! If it’s not workshared, then you get the filepath and size. Pretty sweet.
I previously posted about how to quickly repath links based on some control mechanisms. Enter BIM 360, and the wild world of Revit cloud worksharing… I expect that it will be commonplace now for existing projects and datasets to move across to BIM360 ‘mid project’. But that creates some interesting problems, like creating folders, dealing with the initiation process, and replacing local Revit Links with their cloud versions.
This post is focused on that process of changing all of the Revit link paths to link to the BIM 360 models. Unfortunately, the previous method I used (TransmissionData, like eTransmit) is not available for cloud hosted models. So how do we automate this process?
We went about it this way:
Initiate all Revit models on the BIM 360 Document Management cloud (manually, for now)
Create one federated model on the BIM 360 cloud that links in all the other cloud hosted Revit models. You might do this one manually, using Reload From in the Manage Links dialog box.
Once you have that one ‘super host model’, use a batch process to harvest all of the cloud model data
Using the harvested data, create a script that implements a Reload From method to batch reload local models from their cloud counterpart
On the journey to solving step 3, I experimented with a few different methods. I discovered that you need to use the ExternalResource class to get information about BIM 360 cloud models (not ExternalReference).
I also realised that I had to deal with Reference Information, which appears to be a .NET dictionary per link that stores some funky Forge IDs and so on. But I wanted to store all this data in our VirtualBuiltApp BIM Management system, so I had to serialise the Reference Information to a string that could be stored in a database VARCHAR field (or push to Excel if you are still doing things the old way). Dimitar Venkov gave me a few tips about using JSON with IronPython in Dynamo (thanks mate!), so after that all the harvesting pieces were in place!
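The serialisation step itself is straightforward; here is a minimal round-trip sketch in plain Python, where the key names are invented stand-ins for the real Forge identifiers:

```python
import json

# Hypothetical reference information, as harvested per link (keys are invented).
reference_info = {"projectGuid": "1234-abcd", "modelGuid": "5678-efgh",
                  "region": "US"}

# Serialise to a string that fits a database VARCHAR field (or an Excel cell)
stored = json.dumps(reference_info, sort_keys=True)

# ...later, rebuild the dictionary in order to construct a new reference path
restored = json.loads(stored)
assert restored == reference_info
print(stored)
```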
Here is some of the harvesting and JSON code. Notice that I played around with using a container class to pass data between Dynamo nodes. In the end, a JSON string was the answer:
data = []       # ExternalResourceReference objects, one per cloud link
for u in unwraps:
    # 'u' is an unwrapped RevitLinkType; collect its external resource references
    data.extend(u.GetExternalResourceReferences().Values)

container = dummy()  # simple container class for passing data between Dynamo nodes
sdicts = []
for y in data:
    # per-link dictionary of reference information (Forge GUIDs and friends)
    dictinfo = ExternalResourceReference.GetReferenceInformation(y)
    container.dictinfo = dictinfo
    info = dict((k, dictinfo[k]) for k in dictinfo.Keys)  # .NET dict -> Python dict
    sdicts.append(json.dumps(info))                       # serialise for storage
The next step was to create the ‘batch reload from’ tool. Now that we had the necessary data, we just had to use it to grab the matching cloud path information (from our database) and apply it to each Revit link.
I created a node that essentially built a new reference path from the JSON and other data that we had harvested. Here is some of that code:
des = []
for x in referencesInfo:
    des.append(json.loads(x))      # deserialise the stored JSON back to a dict

newdicts = []
for y in des:
    # rebuild the .NET dictionary that ExternalResourceReference expects
    d = Dictionary[str, str]()
    for k, v in y.items():
        d[k] = v
    newdicts.append(d)

serverGuids = []
for g in serverIdsIn:
    tempguid = Guid(g)             # the server id comes in as a string
    serverGuids.append(tempguid)

newrefs = []
for z in range(len(referencesInfo)):
    serverIdIn = serverGuids[z]
    referenceInfo = newdicts[z]
    versionInfo = versionsInfo[z]
    sessionPathIn = sessionsPathIn[z]
    tempRef = ExternalResourceReference(serverIdIn, referenceInfo, versionInfo, sessionPathIn)
    newrefs.append(tempRef)

OUT = newrefs
The final step was to get a RevitLinkType and a matching ReferenceInformation and apply them to each other. I stored the data in our cloud based BIM Management Application, VirtualBuiltApp. Then I could easily just pull the data into Dynamo with a suitable database connector, and match up the RevitLinkType in the current file with its associated cloud identity. For that genuine 90s feel, you could use Excel to store the data as it is just a JSON string and some other strings:
Here is the key bit of code that actually changes the link path (without all of my other error checking bits and pieces):
newCloudPath = newCloudPaths[l]
try:
    reloaded = fileToChange.LoadFrom(newCloudPath, defaultconfig)  # repath the link
    successlist.append(str(reloaded.LoadResult))
except:
    successlist.append("Failure, not top level link or workset closed")
To actually implement the script and get productive, I opened 4 instances of Revit, and then used this process in each instance:
Open the Revit file from BIM 360, with Specify… all worksets closed
Unload all links
Open all worksets
Run the Reloader Script
Confirm link status in Manage Links
Optional: Add ‘bim 360 links loaded’ text to Start View (just for tracking purposes)
Optional: Add comment to VirtualBuiltApp (for tracking purposes)
Close and Sync
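Splitting the work across four Revit instances is really just round-robin partitioning of the model list; a small sketch of that idea (the file names are placeholders):

```python
# Round-robin split of a model list across N concurrent Revit sessions.
def partition(files, sessions):
    """Return one list of files per session, dealt out round-robin."""
    return [files[i::sessions] for i in range(sessions)]

models = ["A.rvt", "B.rvt", "C.rvt", "D.rvt", "E.rvt", "F.rvt"]
batches = partition(models, 4)
print(batches)  # one batch per open Revit instance
```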
In this way I can have 4 or more sessions operating concurrently, fixing all the link paths automatically, and I just need to gently monitor the process.
One nice thing is that I set the script up to immediately Unload a link after it had obtained and applied the new Path information. This means that the Revit instance does not get bogged down with many gigs of link data in memory, and in fact this is way faster than trying to use Manage Links for a similar process.
Ideally I would like to fully automate this, to the point where it opens each file, runs the script, and syncs. Unfortunately, time didn’t allow me to get all the code together for that (for now).
Finally, because we are using our custom built schema and validation tools, we can easily create visuals like this:
Modified versions of the Dynamo graphs can be found on the Bakery Github here:
Federated Central file -> Everything else (no nested links)
Next challenge: how can we quickly promote those nested links into our federated model? Well, we currently have the module files populated, so how can we leverage those positions to promote the nested links?
Promoting Nested Links
It is a bit hacky, but here is how I went about it. For each module file:
Create an empty proxy file (New Revit project, no template)
Open the Module file and resolve all link paths (so they are loaded)
Set links to Attachment
Copy / paste the link Instances from the module file (Level 0 or Base Level) into the empty file (Level 1, default level). You can use the Dynamo graph above.
Save the new proxy file as ‘ModuleContainer’ or similar. We now have a file that only has link instances in it.
Open a detached copy of the Federated Central file (you can save as temporary copy if you like)
Select the current Module file and replace it with the ModuleContainer you created. Once you have done all the modules, you are ready for binding as described below.
After populating the detached Federated Central file, we just need to Bind and then Copy / Paste the free instances:
After loading, Bind the ModuleContainer files to the detached Federated Central model
To do this, right-click on the file in Revit Links in Project Browser
Select All Instances – In Entire Project
In the Ribbon, click Bind Link
Untick Attached Details, Levels, Grids
Click Remove Link when prompted
Your nested links are now promoted!
Optional: Ungroup all of those bound free instances (select all from Project Browser, Ungroup in Ribbon)
Open the real Federated Central file
Copy / paste the populated free instances from the detached Federated Model to the real Federated Model. Use the Dynamo script to collect them.
Tip: Paste into a closed workset for better performance.
Close the detached one, and sync the real one
The only thing we technically ‘broke’ or lost in this process is the module link->nested link relationship. So if someone decides to move one of the nested links in the module link, obviously that won’t replicate into the Federated Central file.
Hope this helps some of you out there building or dealing with large federated models.