The Revit API is actually something pretty special. People will go on and on about how Revit needs this feature or that feature, but the fact is that you can build almost any feature you like with the API. Recently, I have been running quite a few batch operations from the scope of a federated Revit model: so I will have one RVT file, with hundreds of Revit links, and I will process them from that main federated model.

On one recent project, we had to deliver a linked dataset to a client, with Revit link file paths resolving correctly. As you know, people work in many different IT environments, and the pathing of Revit links can vary widely.

I set up an ‘approved’ list of Revit file paths, that looked something like this:

I knew that in Dynamo with Python I could get a lot of information about linked files using the ExternalFileReference class. What I discovered during this process is that there is a TransmissionData API class that lets you do some pretty interesting things…

You see, I was thinking I would have to set up a batch method to open these files, change the file paths, and close them. But the TransmissionData class is basically what is implemented in eTransmit for Revit – it allows you to ‘lightly touch’ the Revit file and simply change the Revit link paths, and also set a switch saying ‘this file has been transmitted’. This puts the file in an appropriate state for re-opening in the new path environment. Pretty cool, huh?
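In essence, the pattern is just read → modify → write. Here is a minimal sketch (it only runs inside Revit’s IronPython environment, and the file path is just a placeholder):

```python
from Autodesk.Revit.DB import TransmissionData, ModelPathUtils

# Read the transmission data without fully opening the model
mpath = ModelPathUtils.ConvertUserVisiblePathToModelPath(r"C:\temp\host.rvt")
td = TransmissionData.ReadTransmissionData(mpath)

# ...re-path each link here via td.SetDesiredReferenceData(...)...

# Flag the file as transmitted and write the data back
td.IsTransmitted = True
TransmissionData.WriteTransmissionData(mpath, td)
```

The full script below fills in the re-pathing step.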

Once I figured out how to implement those TransmissionData actions in Python, I just had to build a node that, running from the federated model:

  • examines each link for the links inside of it
  • replaces erroneous paths with the correct file path
  • sets the new paths to the file

I did this in the hacky way of a “counter with List.Map” in Dynamo. In the future I’ll probably fix it up to be a ‘proper’ Python script but this works for now. In about an hour it fixed the linked file paths of 600 Revit links, all with the click of a single button 🙂

You can download the main definition here:

External References FINAL

You can get the supporting nodes from GitHub here:

https://github.com/LukeyJohnson/BakeryForDynamo/tree/master/nodes

As usual, please use with care. And it is probably worth backing up your files before running something like this.

It is kinda more Python than Dynamo but hey, you get the picture 🙂

In fact, here is the Python code:

import clr
clr.AddReference('ProtoGeometry')
from Autodesk.DesignScript.Geometry import *

# Import RevitAPI
clr.AddReference("RevitAPI")
import Autodesk
from Autodesk.Revit.DB import *

clr.AddReference("RevitServices")
import RevitServices
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

from System.Collections.Generic import *

clr.AddReference('RevitNodes')
import Revit
clr.ImportExtensions(Revit.Elements)
clr.ImportExtensions(Revit.GeometryConversion)

from System import Guid

import System

import sys
pyt_path = r'C:\Program Files (x86)\IronPython 2.7\Lib'
sys.path.append(pyt_path)

import os.path

doc = DocumentManager.Instance.CurrentDBDocument
uiapp = DocumentManager.Instance.CurrentUIApplication
app = uiapp.Application

tempvalue = IN[0]  # ModelPath of the Revit file whose links will be re-pathed
approvedFilenames = IN[1]  # list of approved link file names
approvedFilepaths = IN[2]  # list of approved full file paths (quoted strings)
targetfilepath = IN[5]  # ModelPath to write the updated transmission data to

def stripquotes(string):
	# Strip the surrounding quote characters from a path string
	string = string[1:-1]
	return string

# Read the transmission data from the file, without fully opening it
transData = TransmissionData.ReadTransmissionData(tempvalue)
erefids = transData.GetAllExternalFileReferenceIds()
refdata = []
for x in erefids:
	refdata.append(transData.GetDesiredReferenceData(x))

currentpaths, currenterefType, currenterefPath, pstr = [], [], [], []

for e in refdata:
	currentpaths.append(ExternalFileReference.GetAbsolutePath(e))
	currenterefType.append(e.ExternalFileReferenceType)
	currenterefPath.append(e.PathType)
for s in currentpaths:
	pstr.append(ModelPathUtils.ConvertModelPathToUserVisiblePath(s))
	
filenames = []
for p in pstr:
	templist = os.path.split(p)
	filenames.append(templist[1])
	
newpath = []
indices = []
failpath = []
origcounter = 0
matchrefs = []
newpathtypes, newbools = [], []
pathtypevar = IN[3]
# Match each linked file name against the approved list;
# unmatched links are recorded in failpath by their index
for f in filenames:
	tempindex = approvedFilenames.index(f) if f in approvedFilenames else -1
	indices.append(tempindex)
	if tempindex == -1:
		failpath.append(origcounter)
	else:
		newpath.append(ModelPathUtils.ConvertUserVisiblePathToModelPath(stripquotes(approvedFilepaths[tempindex])))
		matchrefs.append(erefids[origcounter])
		newpathtypes.append(pathtypevar)
		newbools.append(True)
	origcounter = origcounter + 1

currentfilepathstring = ModelPathUtils.ConvertModelPathToUserVisiblePath(targetfilepath)
elementcount = len(erefids)
hostfile = [currentfilepathstring] * elementcount  # one host path entry per reference
currentdata = []
#currentdata.append(transData)
currentdata.append(hostfile)
currentdata.append(erefids)
currentdata.append(refdata)
currentdata.append(currentpaths)
currentdata.append(currenterefType)
currentdata.append(currenterefPath)
currentdata.append(pstr)
currentdata.append(filenames)

newdata = []
newdata.append(indices)
newdata.append(newpath)
newdata.append(matchrefs)
newdata.append(newpathtypes)
newdata.append(newbools)

setlength = len(newpath)
setcounter = range(setlength)
successreport = []
setdata = IN[4]  # switch: must be True to actually write the new reference data
if setdata:
	for s in setcounter:
		try:
			transData.SetDesiredReferenceData(matchrefs[s], newpath[s], newpathtypes[s], newbools[s])
			successreport.append("Success setting data")
		except:
			successreport.append("Failure setting data")
else:
	successreport.append("You need to set the switch to True")

if setdata:
	try:
		transData.IsTransmitted = True
		transData.WriteTransmissionData(targetfilepath, transData)
		successreport.append("Success WRITING data")
	except:
		successreport.append("Failure WRITING data")
		
#Assign your output to the OUT variable.
OUT = successreport, currentdata, newdata
 

If you want to read more about the API methods used:

TransmissionData

ModelPathUtils

ExternalFileReference

Whenever you are looking to implement a new technology in your firm, you typically go through a few steps:

  1. Figure out what is out there in the marketplace – What products are available?
  2. Collect data about all of the technologies that may suit your use case
  3. Rigorously compare and analyse all the data
  4. Make a decision and go for it

There are some excellent content management tools out there for Revit now, so how can you choose? I went through a very comprehensive research analysis of a number of Revit CMS platforms, and I posted about the process here.

Then I caught up with Steve Germano over at Unifi to talk about the results. You can view (or just listen) to it here:

Feel free to comment here with your thoughts and we can keep the conversation going!

In Windows, you will often use either the Map Network Drive dialog or the net use command to map a network drive. You can use that method with a shared-folder trick to map a local folder as a drive too, as described here.

But there is an even easier way that is more flexible in some respects: the subst command. It basically tells your Windows system to refer to a folder as a drive letter. Its usage is very simple, for example:

 
subst J: "E:\some folder\J_DRIVE"

If you want that to show up as a ‘drive’ at each reboot, just put the above command into a CMD file and point to it from your Windows startup folder.
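That startup CMD file can be as small as this (the drive letter and folder are just the placeholders from the example above):

```
@echo off
rem Re-create the substituted drive at each logon
subst J: "E:\some folder\J_DRIVE"
```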

For your assistance, here is the path to your typical User Startup folder in Windows:

I kind of hate surveys. They seem to be a bit of a waste of time, right? Hypocritically, I have created a mini-survey on Twitter. It covers topics like:

  • primary BIM software in use
  • BIM hardware, including OS, number of monitors, RAM
  • BIM project types
  • BIM user experience
  • BIM standards in use

Please feel free to navigate through the questions below and vote on them.

You have 7 days 🙂

I look forward to seeing the results…

Going back in time, there was 123D Catch and related processing engines. Basically, it was a tool that took photos and turned them into something real in 3D. There was also something called Remake.

Now, we have Recap Photo, which basically does the same things. Over time, the processing engines have improved. Recap Photo is part of your Recap Pro licence, and now integrated into your Recap Pro install. It looks something like this when you install the latest version of Recap (I downloaded the web installer from manage.autodesk.com):

Once installed, you can start the standalone ReCap Photo app:

 

Features of ReCap Photo include:

  • A new photogrammetry engine that can process up to 1,000 photos, a 4x improvement from the previous maximum of 250 photos (note: using the cloud service consumes Autodesk Cloud Credits)
  • The ability to set GCPs (Ground Control Points, i.e. survey points) in any coordinate system.
  • New functionality to support vertical and nadir photos (photos taken by drones and UAVs pointing straight down at the site)
  • View your 3D photo-textured mesh
  • View the geolocated orthographic view, zoom in and out, and add measurements, tags, and annotations.
  • Share the project, including its additional metadata (measurements, tags, annotations), with anyone.
  • Merge laser scan point clouds with UAV-based point clouds.

From this post http://blogs.autodesk.com/recap/introducing-recap-photo/

In this webinar, LHB’s Dan Stine walks through a proven workflow for collaboration and client engagement using Revizto. This presentation simulates a client meeting, highlighting ways Revizto can be used to explore the model and capture client comments and requested changes. You can see how several Revizto features can be used together to demonstrate the design intent and react to client questions with minimal effort.

You can view it at:

In case you missed it, the AEC Collection now includes:

  • Revit Live
  • Robot Structural Analysis Professional
  • Structural Bridge Design
  • Dynamo Studio
  • Advance Steel
  • Fabrication CADmep

You should find these in your Autodesk account after logging in.

Quote from In The Fold:

Here are some highlights of what we’re adding to the AEC Collection.

In the new Collection, you’ll find Revit Live. Revit Live allows you to visualize your Revit models by turning them into immersive visual experiences. You can even take your designs into a virtual reality environment—in just two clicks of your mouse.

Also included is world-class analysis and computational design software. Robot Structural Analysis Professional and Structural Bridge Design work with Revit to extend your capabilities to perform structural analysis for any structure.  And, you can now access the Dynamo Studio standalone programming environment to help solve challenges faster by automating workflows that drive the geometry and behavior of your design models.

Finally, the AEC Collection now extends capabilities from design to fabrication. Interoperable with Revit and Navisworks, we’ve added Advance Steel and Fabrication CADmep to the Collection so you can conceive, model and fabricate better MEP and structural steel systems on an integrated platform.

Original post: http://blogs.autodesk.com/inthefold/aec-collection-additions/

Curtain Walls in Revit are strictly a Family of the Walls Category. Then you have Types for each type of Curtain Wall. What if you want to select all Curtain Walls at once? You can’t multi-select types in the Project Browser to do this, but…

You can make a suitable Schedule to do it. Here’s how:

  1. Make a new Wall Schedule
  2. Only add the Family data field
  3. On the Sorting/Grouping tab, set Sort by: Family and untick ‘Itemize every instance’
  4. Now, in the schedule, click inside the Curtain Wall cell, and
  5. Use Highlight in Model to select them all

You could then use Save Selection, or Temporarily Isolate Elements in View, depending on what you want to do next.
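If you would rather grab them via the API, the same idea can be sketched in a Dynamo Python node (a sketch only – it assumes the standard Dynamo RevitServices boilerplate, and it will only run inside Revit):

```python
import clr
clr.AddReference("RevitAPI")
from Autodesk.Revit.DB import FilteredElementCollector, Wall, WallKind

clr.AddReference("RevitServices")
from RevitServices.Persistence import DocumentManager

doc = DocumentManager.Instance.CurrentDBDocument

# Collect all placed wall instances, then keep only curtain walls
walls = FilteredElementCollector(doc).OfClass(Wall).ToElements()
OUT = [w for w in walls if w.WallType.Kind == WallKind.Curtain]
```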

Slightly over 6 months ago, I was approached by one of my associates over at Unifi with an idea. They wanted to engage in a detailed competitive research project focused on content management systems for Revit. I was pleased that they approached me, because I obviously love Revit and I also love helping people to improve the whole ecosystem of software tools that surround Revit and BIM. In fact, I often provide too many suggestions to software companies I think 🙂

In this case, I was particularly interested in the topic as well. Having used Revit now for around 10 years, I had started to observe a trend in how Revit gets implemented into firms. Typically, they:

  1. Buy some Revit licenses
  2. Teach their people to use Revit
  3. Look at ways to standardize their use of Revit, perhaps through standards and template files
  4. Try to maximize the impact and benefit of BIM through some vertical products, such as Enscape
  5. Start thinking about how to deal with the many gigabytes of ‘content’ they have now gathered and that is sitting on the file server in their office…

Having seen this over and over again, I knew that evaluating, choosing, and setting up a Revit content management system is no easy task, yet it is a hugely important one. It is something that often gets neglected for too long, and results in many wasted hours as people go blindly looking for ‘that family’. As you know, I willingly share time-saving knowledge, tips and workflows here and via Twitter, so this competitive research project really ticked a lot of boxes for me. I would be able to:

  • do a deep analysis of content management products for Revit
  • observe the strengths and weaknesses of each
  • be better informed and able to assist people who often ask me about Revit content management
  • provide some feedback to Unifi about how their product and offering could perhaps be improved (and as I said above, this is something I often do for free)

In this particular case, I knew that there would be a lot of time involved. I was going to have to obtain, install, test, benchmark, and document a whole lot of information about various Revit content management systems. As a father of three, a technology blogger, and someone who works almost daily for different companies delivering various projects, time is extremely hard to come by. So I felt it was quite appropriate in this instance to be commissioned by Unifi to perform this research task. I had never been part of a commissioned, competitive research project before, so I knew there may be some challenges. However, given the amount of time that would be involved, I would only be able to do a proper and thorough job if I was reimbursed for the time I would need to dedicate to it.

You might say that being commissioned for the task introduced some bias, but I’ll tell you why that cannot be true. Unifi wanted to know how to improve their product – essentially, what could be improved so that they could remain competitive. For me to somehow do a biased job would have been way off-base. I needed to be honest, and brutally so. I had to show the Unifi people if and how their competitors were stronger than they were. I admire the fact that Unifi undertook this whole project; evidently, they wanted to make sure their product was the best it could be. Personally, I would gain an avid listener, someone who would be happy to hear all of those software ideas that I come up with!

So I accepted this project as a commissioned, competitive research task. I would record my results and provide a number of comprehensive deliverables back to Unifi. How would I go about this job? There were a few logical steps:

  • establish the list of products that would be researched
  • obtain the products
  • install them onto a test workstation
  • evaluate the features available in the product and fill out a detailed comparison matrix
  • perform benchmarks to establish speed, performance, and capability of each product
  • use some large sample content datasets to really put the products through an intensive test to check for problems that may occur with huge content libraries

Looking at the above steps, you can see how much time would be involved. But I also thought it would be a good idea to involve the developers of the competing products in the research. I thought that if I could be better informed about the other products, my research output would be more complete and accurate. And I still think that this was the right approach. I spoke to some of the competing companies and described the fact that I was doing a detailed research project and would like to discuss their product with them. Most of them knew me as a blogger and technology professional, and so they were pleased to meet and discuss their content management product.

And then I made a mistake.

I should have started those meetings by saying that I had been commissioned by Unifi to do the research project. But I did not mention that fact. We had informative meetings each time, that helped me to get a better understanding of each product. But I can see now that I should have simply told them how the research project came about. I actually don’t think it would really have influenced the discussion a whole lot, because I still think it was in their interest to assist me in understanding their product. However, I do feel like I should have been transparent at the time. I would like to publicly apologize to those competing companies for not initially disclosing that I was commissioned to do the research.

This was a lesson learned for me, and a mistake I won’t make again if I undertake a similar research project in the future.

I do not apologize for taking on the research project, because it was simply the best way that a task like this could be handled, and I do believe I was the right guy for the job. I understand that it is quite common for companies to engage 3rd party professionals to perform market research, but for me it was my first time. As stated above, my thought process was basically that:

  • I would receive some reimbursement for my research time, that
  • I knew the research output had to be complete and accurate and honest, and that
  • It would be beneficial to speak to the individual developers about their product.

Following the consultation phase and data gathering phase of the project, I had quite a substantial amount of data to work with! How would I filter through all of this and truly make it comparative?

To begin with I put a lot of information and notes into a detailed OneNote notebook. I then started an Excel document where I would store most of the comparative results. I had a few key worksheets where most of the raw data was stored:

The Matrix worksheet contained a whole lot of data, over 150 rows and 15 columns. I broke the testing and comparison up into some major categories:

Note: the UX2 value above refers to things like bugs or user interface problems, and in that case a higher score would be worse (more bugs).

For feature comparison, I used a weighting value and a formula. Here is a sample of some of those weighting values:

So, in the case above, I viewed Parameter Searching as more important (5) than Uniformat Filtering (2). These weighting values are based on my experience and my association with other BIM professionals.

From this point, I reviewed the capability of each product and used a Yes / No value to determine if a given product would ‘score’ for that feature:

If a product achieved a Yes value here, it would also obtain the Score for that Feature.
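The scoring formula itself is simple enough to sketch in a few lines of Python (the feature names beyond the two mentioned above, the weights, and the Yes/No answers are made up for illustration – the real matrix had over 150 rows):

```python
# Hypothetical weighting values (higher = more important)
weights = {
    "Parameter Searching": 5,
    "Uniformat Filtering": 2,
    "Batch Upload": 4,
}

# Yes/No capability results for a single (hypothetical) product
capabilities = {
    "Parameter Searching": "Yes",
    "Uniformat Filtering": "No",
    "Batch Upload": "Yes",
}

# A product only obtains the score for a feature if it achieved a Yes
score = sum(weights[f] for f, answer in capabilities.items() if answer == "Yes")
print(score)  # 5 + 4 = 9
```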

I used a set of PivotTables and Charts to break down and review that feature data.

I also performed some performance and speed benchmarks, and stored these in another worksheet:

Ultimately, this data was all collected and provided in combined form along with some Powerpoint slides. The slides cover topics like:

  • What is BIM Content?
  • What is Revit Content?
  • Why Content Management?
  • How is Content Managed?
  • How do I choose?

Along the way, we had to do a few interesting things. We were finding that Australian internet speeds were not really a good baseline for cloud benchmarking. So we obtained a cloud-based virtual workstation that more closely reflected the internet speed you would experience in the USA, and I then had two sets of ‘cloud speed’ benchmark data: the Australian and the US versions.

An example of how some of my research was used may be seen in the recent new offering by Unifi. The research identified some differences in the pricing models of the various products, and this information assisted Unifi in the creation of an additional pricing model.

Where can you learn more about this research? A couple of weeks back I mentioned that the details and results of the competitive research project will be shared in a global webinar.

You can register to attend the webinar here.

In summary, I really enjoyed doing this research project and I think the results will be useful to Revit users and BIM Managers who are trying to evaluate different content management tools. It is true that they each do have certain strengths, so which will you choose?

 

 

PS. It is interesting to look back, to where almost a year ago I asked you all: