Yes, I’m going to do this. I’m going to write another post about a widely used term and try to untangle reality from fiction. Let’s talk about the term “AI”… Mr Gates wrote a compelling article recently about AI Agents specifically, and even Mr Clippy got a slightly dishonourable mention. But hold on to your chatbot hats, because we are going to go right back to basics…

First, let’s define some terms related to AI.

Intelligence
The faculty of understanding; intellect.

Artificial
made or constructed by human skill, especially in imitation of, or as a substitute for, something which is made or occurs naturally;

Now, we might assume that Artificial Intelligence means something man-made that has the faculty of understanding, right? Right?

As a verb, to understand means to know or realise the meaning of words, a language, what somebody says, etc.

Now, let’s compare this with a common definition of artificial intelligence
the study and development of computer systems that can copy intelligent human behaviour.

Or, on Wikipedia:
Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of humans or animals. 

And finally, AGI:
An artificial general intelligence (AGI) is a hypothetical type of intelligent agent. If realised, an AGI could learn to accomplish any intellectual task that human beings or animals can perform.

Now, you may ask, “what’s with all the definitions?”  My personal feeling is that the progress of commonly available AI tech is largely overestimated and misunderstood, and that the ultimate aspirations of AGI are still some distance away. So what’s with all the AI hype?

A few things definitely changed in the past couple of years. Tools claiming to be AI became widespread, and the technology started to be viewed as useful and cool. Let’s start with OpenAI and ChatGPT. It took the world by storm! It has a massive budget and huge backing. But what is it really?

ChatGPT, PaLM / Bard, and LLaMA are all Large Language Models (LLMs). They consume massive amounts of data, build a kind of neural network, and then you can converse with them. But guess what? They don’t have the human faculty of understanding. They are effectively guessing what the next word should be, based on context, a massive set of data, and a lot of processing power. Almost all Copilot-like technology works on this basis, except that you can introduce your own contextual data into the model. LLMs must be trained on data that, in its most raw form, was ultimately produced by humans.
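To make the “guess the next word” idea concrete, here is a deliberately tiny sketch. Real LLMs use huge neural networks, not simple word counts – this toy bigram counter just illustrates the principle of predicting the most likely next word from text the system has already seen.

```powershell
# Toy illustration only: real LLMs use huge neural networks, not simple word counts.
$corpus = 'the model guesses the next word the model guesses the answer'
$words  = $corpus -split '\s+'

# "Training": count which word tends to follow which in the text we have seen
$next = @{}
for ($i = 0; $i -lt $words.Count - 1; $i++) {
    $key = $words[$i]
    if (-not $next.ContainsKey($key)) { $next[$key] = @{} }
    $next[$key][$words[$i + 1]] = $next[$key][$words[$i + 1]] + 1
}

# "Inference": given a word, guess the follower we have seen most often
function Get-NextWord([string]$word) {
    if (-not $next.ContainsKey($word)) { return $null }
    ($next[$word].GetEnumerator() | Sort-Object Value -Descending | Select-Object -First 1).Key
}

Get-NextWord 'the'   # -> "model", because "model" follows "the" most often in this corpus
```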

Is it cool? Yes. Is it useful? Sure. Can it save you time? Yep. Is it anywhere near human intelligence? Of course not. You can’t measure a faculty of true understanding based on outputs and interactions. Understanding happens on the inside.  I don’t believe I’m saying anything too controversial, but I do believe we should occasionally ask the question “what level of AI tech do we really have right now?”

What about image generation, specifically text-to-image models like DALL-E 3, Imagen, and Midjourney? These use a related approach – the input text is encoded and fed to a generative image model. That model has been trained on massive amounts of text and image data scraped from all over the place. It seems creative, but isn’t it really a tool that adapts and morphs known images into some new variation? Again, these models must be trained on data that, in its most raw form, was ultimately produced by humans.

Perhaps I’m oversimplifying, but does the current global state of AI tech come anywhere near the creativity and original ingenuity of a human? Not even close. These are powerful tools that are transformative and disruptive. But they are really Super Guess Makers. They produce contextual, sequential outputs that ultimately amount to “the system knows billions of types of representational data, and it can try to build you something like what it already knows.”

I want to introduce you to a new term. At Deep Space, we have a proprietary data framework that we call Core Thread Technology. Part of that technology framework is Embodied Intelligence. Let’s define it:

Embodied Intelligence is found when a computerised system has inbuilt comprehension of data classification, relationships, workflows, and qualitative measures. This inbuilt comprehension is encoded into the system by humans based on a depth of real-world experience, probably accumulated over decades.
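To make that a little more concrete, here is a purely hypothetical sketch – not Deep Space’s actual Core Thread implementation – of what human-encoded domain comprehension could look like as data: a classification, its relationships, and a qualitative check written down explicitly by an experienced practitioner rather than inferred from statistics.

```powershell
# Hypothetical illustration only - not Deep Space's actual Core Thread implementation.
# Domain knowledge written down explicitly by an experienced human:
$doorRules = [pscustomobject]@{
    Category     = 'Doors'
    HostedBy     = @('Wall')                       # relationship: a door is hosted by a wall
    RequiredData = @('FireRating', 'AssetId')      # classification: data every door must carry
    Check        = { param($door) $door.FireRating -in @('-/-/-', '60/60/60', '90/90/90') }  # qualitative measure
}

# Applying the encoded knowledge to an element:
$someDoor = [pscustomobject]@{ FireRating = '60/60/60'; AssetId = 'DR-0012' }
& $doorRules.Check $someDoor   # True - this door satisfies the human-encoded rule
```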

Does Deep Space have AGI right now? No.
But Deep Space does have Embodied Intelligence.

We have already established that the majority of the cool and powerful AI tech currently available has been trained on raw data that was originally produced by, and is usually attributed to, human agents. I believe the AI term itself is really too generous for the majority of the currently available tech. But Embodied Intelligence is a practical term that embraces the fact that there are experienced industry leaders who can legitimately train a system in specific ways to maximise productivity.

That is what we are doing for Digital Design and Construction at Deep Space right now.

Is anyone still here?

Yes!! And it isn’t an AI bot either! I know it has been a bit quiet around What Revit Wants lately, and you might have guessed that it has something to do with a little startup called Deep Space…

I hope to one day share the whole story from Architectural drafting, to Revit, to Virtual Built Technology (BIM and VDC Consulting), to Revizto, and finally to where we are today – Deep Space!

For now, I just wanted to reconnect and get you thinking…

  • What will 2024 hold for our industry?
  • Where will the AI stuff finally land?
  • Will the global economy stabilise and accelerate?

What Revit Wants has always been about sharing. Sharing insight. Sharing best practice. Sharing the best tools. Sharing the best and craziest hacks to get your job done. And that’s not going to change.

But we are going to talk more about data. About automation. About AI – both real and imagined. About changing things for the better. About breaking down barriers. Barriers that divide us. Barriers in technology. Barriers in communication. Philosophical barriers. Our own self-enforced mental barriers!

When I started using Revit, I knew it would be transformative. It was a smarter way to work! When we discovered Revizto, I was glad someone had built the tool I had sketched and prototyped in 2014 – an integrated collaboration platform for 2D, 3D and more. As we built Deep Space, we knew we were entering a new generation – The Next Generation. Something that would be practical, data-focused, agnostic, embodied with years of domain expertise. We called our proprietary engine “Core Thread Technology”.

Anybody can build dashboards, but only Deep Space has a data-first, structured, relational, historical, self-aware, replicated, highly available, predictable, tenanted engine for capturing, storing and automating real digital design and construction workflows at both project and portfolio level…

I spoke to someone recently and we talked about “data, the new gold”. He laughed and said that phrase was already being used in some industries back in the 90s!! So ‘data’ isn’t really new. But for specific industries, at various times, it gets unearthed and polished and shaped into something valuable. That is happening rapidly now in AEC and Digital Construction. That is what Deep Space was built to do: mine this golden AEC data, clean it, polish it, connect it with other gems of information, and make something beautiful, useful and transformative. Let’s actually make the most of the data we have!

Let’s use data to build better projects faster, and let’s start right now.

Look out for more posts and updates coming soon…

I’m very excited to report that the Deep Space platform continues to evolve rapidly, bringing exciting updates that empower civil and infrastructure projects. In a recent release, Deep Space introduced new synchronization plugins and tools and enhanced existing ones, revolutionizing data acquisition, analysis, and compliance checking. This blog post delves into the key new features of the Deep Space Sync add-in, focusing on its IFC integration and LOI (Level of Information) Report capabilities. Jump straight to the video here.

Simplifying Data Acquisition and Storage

Deep Space now offers a user-friendly IFC Sync plugin, enabling users to effortlessly load data from IFC (Industry Foundation Classes) files into the platform. By selecting the desired IFC file, workspace, and project, users can seamlessly upload the information to Deep Space. This streamlined process ensures that different data and model formats, such as IFC and Revit files, are easily accessible and integrated within the platform.

IFC Sync

Exploring Data and Parameter Analysis

Once the data is uploaded, users can navigate to the Deep Space Explorer platform to explore various applications. The Data app provides a comprehensive overview of the acquired information from the IFC file. Users can visualize the data, including the element count, parameter summaries, and individual parameter values. By clicking on specific objects, users can access detailed parameter data, facilitating a deeper understanding of the model.

Efficient Compliance Checking with LOI Reports

Deep Space’s LOI Report app is a powerful tool for compliance checking, particularly for projects adhering to government requirements like the Transport for NSW Digital Engineering Standards. The app automates the verification process, comparing the required parameters against the actual data within the IFC and Revit-based models. By unifying both types of data, Deep Space provides a single platform for comprehensive checking, streamlining the compliance process.

Advanced Analysis and Customization

The LOI Report offers an array of functionalities for in-depth analysis. Users can check the existence of parameters and validate the presence of data within those fields. Deep Space also supports the verification of shared parameter GUIDs and duplicates, ensuring the uniqueness of asset IDs. Through drill-down capabilities, users can access specific files, view available information, and explore the requirements tied to each parameter set.
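As a rough illustration of the kind of checking involved (the Deep Space app does this inside the platform – this sketch just assumes a hypothetical elements.csv export of element parameters, with made-up parameter names), you can think of it as: for each required parameter, does it exist, is it actually populated, and are supposedly unique values actually unique?

```powershell
# Illustrative sketch only - assumes a hypothetical elements.csv export with one row per element.
$required = @('AssetId', 'Classification', 'FireRating')   # e.g. required parameters from a standard
$elements = Import-Csv 'elements.csv'

# Existence and population check: does each required parameter exist and actually hold data?
foreach ($param in $required) {
    $missing = $elements | Where-Object {
        -not $_.PSObject.Properties[$param] -or [string]::IsNullOrWhiteSpace($_.$param)
    }
    '{0}: {1} of {2} elements missing or empty' -f $param, @($missing).Count, @($elements).Count
}

# Uniqueness check: asset IDs are supposed to be unique per element
$elements | Group-Object AssetId | Where-Object { $_.Count -gt 1 } |
    ForEach-Object { "Duplicate AssetId '$($_.Name)' used by $($_.Count) elements" }
```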

Configuration and Automation

Deep Space provides a robust configuration engine (we call it DS Command), allowing users to modify templates, load parameter standards, and create a master parameter set for consistent use across projects. Additionally, the platform offers scheduling capabilities, enabling users to automate the export of models at predefined times. By simplifying configuration and scheduling tasks, Deep Space optimizes efficiency and reduces manual effort.

Conclusion

The latest Deep Space release showcases its commitment to overcoming industry challenges related to diverse data formats, data acquisition, and analysis. With the IFC Sync add-in and updated LOI Report app, Deep Space provides a comprehensive solution for integrating IFC files, performing compliance checks, and ensuring predictable outcomes. By leveraging Deep Space’s powerful features, Digital Engineers and Design and Construction teams can accelerate project delivery, reduce risks, and enhance overall quality.

Watch the release video here

As part of a very bad Cloud Model upgrade experience (a story I will tell some other time…), some models failed to upgrade.

I was trying to upgrade a file from Revit 2018 to Revit 2022. After upgrading, I was unable to save this file, even though my machine has 192 GB of RAM.

Revit reported this message:

“This computer does not have enough memory to save…”

“Increase the available memory or contact Autodesk Support for more information”

 

I contacted Autodesk Support, and eventually got this response:

Due to the “Multi-category Schedule”, the Revit file was failing to save in Revit 2021 or Revit 2022 version as the schedule view is large. 

After deleting the schedule view we are successfully able to save the file. Attached is the fixed model…

 

Sounds good! Let’s give it a go:

  1. Open the model in Revit 2018
  2. Identify the Multi-Category Schedule
  3. Let’s back it up while we can using “Save…”

  4. Now delete the Multi-Category Schedule

  5. And then Save the Revit 2018 File somewhere
  6. Open Revit 2022
  7. Manually upgrade using Revit 2022, with Detach, Audit and Specify worksets (All Closed) selected in the Revit 2022 open settings:

  8. Does it save now – yes!
  9. Initiate the Model into the correct folder on the (now upgraded) Revit 2022 BIM360 site.
  10. Finally, I tried using “Insert Views from file” to load the Revit 2018 Multi-Category Schedule back into the upgraded project – but it still would not save while that schedule existed:

  11. I guess we have to do without that Schedule for now – it saved fine again after deleting it. Happily the project uses Deep Space so we have access to all Revit data anyway!

One day, when I recover from the experience, I will write about the overall Revit Cloud Model Upgrade experience when dealing with hundreds of models and trying to jump from Revit 2018 to Revit 2022…

Back in 2016 I put this simple, messy sketch together to describe some basic terminology around Revit models and how they are structured:

model_hierarchy_lj

 

Revit Model:

  • Category
  • Family
  • Type
  • Instance (also known as “Element”)

 

May be hosted by:

  • Levels
  • Grids
  • Reference Planes
  • Faces
  • Special Categories (Floor, Ceiling etc)
  • Nothing

 

May belong to:

  • Phases
  • Worksets
  • Design Options
  • Groups
  • Other Families (nested)
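
If it helps to see the same terminology as data, here is a tiny hypothetical example (the values are made up; the structure is the point):

```powershell
# Hypothetical element, just to show where one instance sits in the hierarchy above
$element = [pscustomobject]@{
    Category  = 'Doors'
    Family    = 'Single-Flush'
    Type      = '0915 x 2134mm'
    ElementId = 123456            # the Instance, also known as the "Element"
    HostedBy  = 'Wall'            # could also be a level, grid, reference plane, face or nothing
    Workset   = 'Architecture'    # it may also belong to a phase, design option, group or nested family
}
$element
```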

Problem:

I was recently trying to link Revit models to a federated file on Revit 2021. The models were failing to link, but no error message or warning was provided. The federated host model and the links were all initiated and ‘live’ collaborated models on the Autodesk Construction Cloud.

 

Resolution:

I had previously used this method ( Moving the Revit BIM360 CollaborationCache Folder to a Secondary Hard Drive ) to move the cache for BIM 360 / ACC Docs to a secondary hard drive. Windows Explorer showed that this hard drive was essentially running out of space. So Revit was failing to link the models, but wasn’t providing an ‘insufficient hard drive space’ warning.

But why was the hard drive filling up?

I reviewed the files and folders and discovered that one particular program was generating massive log files:

The final resolution was quite simple – delete the massive log files, and then the models linked in Revit without issue. I used 7-Zip with the “Fastest” and “Delete files after compression” options to archive and delete those logs.

To future-proof this scenario, I’m considering writing a PowerShell script to detect massive log files and automatically archive and delete them.
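Something along these lines would probably do the job – a rough sketch only, where the log folder, size threshold and 7-Zip path are placeholders:

```powershell
# Rough sketch: find oversized log files and archive-then-delete them with 7-Zip.
# The log folder, size threshold and 7-Zip path are placeholders - adjust for the offending application.
$logFolder = 'D:\SomeApp\Logs'
$threshold = 500MB
$sevenZip  = 'C:\Program Files\7-Zip\7z.exe'

Get-ChildItem $logFolder -Filter *.log -Recurse |
    Where-Object { $_.Length -gt $threshold } |
    ForEach-Object {
        # -mx1 = fastest compression, -sdel = delete the source file after compression
        & $sevenZip a -mx1 -sdel "$($_.FullName).7z" $_.FullName
    }
```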

I have been receiving a few requests for access to files that were previously linked to What Revit Wants via Google Drive. Somewhere along the line, Google (in its wisdom) decided to change security requirements and now my inbox has been filling up with ‘Share requests’. My opinion on Google business practices in general is pretty well known after this saga.

Some of the more popular files requested in recent times have been:

  • CurvedMullionLJ.zip – from post here about Curved Mullions
  • SetMarksToElementID.zip – from post here
  • PointCloud.zip – from post here
  • Generic Label.rfa – from post here
  • URL Text Symbol Annotation D download.rfa – from post here

In any case, I have now decided to move the hosting of What Revit Wants files and resources to wrw.is and you can view and download them all below.

Note: you will need to login to this site FIRST to see and download the files.

If we head over to IFC 2020 | Revit | Autodesk App Store we can see the latest listed version of the IFC Addin for Revit 2020, which is 20.3.2.

2022-03-31-12-33-25

However, if we head over to revit-ifc Releases page on Github – Releases · Autodesk/revit-ifc · GitHub – then we see that there is a prerelease build for Revit 2020 that is version 20.3.3.0.

 

So this post is just a little reminder to check the releases on GitHub if you want the very latest IFC addins provided by Autodesk.

2022-03-31-12-34-42

First, let’s manage some expectations – Aconex is generally viewed as a closed platform. They seem to deliberately resist platform-level integration with other tools like BIM 360 and Revizto. However, you can achieve the promise in the title of this post – with a bit of work!

The basic steps are:

  1. Set up a sync between the Aconex Doc Register and Autodesk Docs (BIM 360 Document Management), using the Aconex PIF and Autodesk APIs
  2. Connect the BIM 360 project to the Revizto project and connect your BIM 360 ‘official Aconex’ PDFs to Revizto 2D Sheets.

Step 1 – Aconex Document Register sync with BIM 360 Document Management

Aconex provides a PIF (Project Integration Framework). This essentially acts as an API, and clients are able to build integrations between Aconex and other tools using their own budget.

aconexpif
The Aconex PIF is hard to find on the inter-webs…

As part of my strategic consultancy work as a Director at Virtual Built Technology, I have assisted clients in setting up the Aconex to BIM 360 sync. This will require considerable time, effort, $$ and planning. Although the scope is basically the same for every Aconex client that wants this, it seems individual Aconex clients will be charged significantly for this development to occur and operate on their ‘own’ PIF.

Once complete, you will begin to see PDFs arriving in Autodesk Docs from Aconex on a daily / nightly basis. Depending on how you have briefed the Aconex development consultants, you may end up with some interesting PDF file naming in the BIM 360 environment – so how can we deal with that?

Step 2a – BIM 360 Preparation Script for Merging Filenames to a single Folder

Let’s say you have your Aconex-to-BIM 360 sync running, and you have PDFs arriving in BIM 360 in “Discipline” folders like Architectural, Structural and so on. Not bad! But what if your PDFs are named like this:

  • SheetNumberX_Rev_1.pdf
  • SheetNumberX_Rev_2.pdf
  • SheetNumberY_Rev_1.pdf
  • etc

When thinking about downstream applications like Revizto, this is really sub-optimal. Why? Because a Revizto 2D sheet will want to connect directly to a SINGLE BIM 360 PDF document with versions – not to a mangled set of ‘revision suffixed’ PDFs like the above. So we need to clean this up! And it needs to be repeatable and automated.

For this, I have created a PowerShell script. Here is a description of the logic in the script:

  • it takes a list of BIM 360 Docs folder paths (the local folder path from Windows Explorer address bar), these are ‘monitored folders’
  • those folders need to be ‘synced’ prior to the script running (I’m still looking for a way to call the Autodesk Desktop Connector sync from a PowerShell script; for now you have to right-click in Windows Explorer first)
  • it will process the folder list and:
  • create a subdirectory called “Current”
  • create a logfile in that subdirectory called deepSpaceSync.log
  • split the filenames at the _REV_ part
  • process each available Revision number individually in a loop (to ensure Aconex revisions are stored sequentially into BIM 360 versions)
  • check if the file was already copied – and skip it (using existence of that filename in the log folder)
  • or, copy the file to the “Current” subfolder
  • Sleep 5 minutes between each loop to allow Autodesk Desktop Connector sync to catch up.

So this script essentially merges the messy “REV” filenames into a nice clean SheetNumber.pdf filename structure AND actually creates BIM 360 file versions in the cloud (due to the sleep functionality).
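
For a feel of that logic, here is a stripped-down sketch – not the released automation linked below, and the folder paths are placeholders:

```powershell
# Stripped-down sketch of the logic above - not the released Deep Space Automation.
# Placeholder list of locally synced BIM 360 Docs folder paths to monitor:
$monitoredFolders = @('C:\BIM360\ProjectX\Architectural', 'C:\BIM360\ProjectX\Structural')

foreach ($folder in $monitoredFolders) {
    $current = Join-Path $folder 'Current'
    New-Item $current -ItemType Directory -Force | Out-Null
    $log = Join-Path $current 'deepSpaceSync.log'
    if (-not (Test-Path $log)) { New-Item $log -ItemType File | Out-Null }

    # Split the revision-suffixed PDFs at the _REV_ part and work out which revisions exist
    $pdfs      = Get-ChildItem $folder -Filter '*_REV_*.pdf'
    $revisions = $pdfs | ForEach-Object { [int](($_.BaseName -split '_REV_')[1]) } | Sort-Object -Unique

    # Copy one revision per pass, oldest first, so BIM 360 builds sequential versions
    foreach ($rev in $revisions) {
        foreach ($pdf in $pdfs | Where-Object { [int](($_.BaseName -split '_REV_')[1]) -eq $rev }) {
            if (Select-String -Path $log -Pattern ([regex]::Escape($pdf.Name)) -Quiet) { continue }   # already copied
            $sheetName = ($pdf.BaseName -split '_REV_')[0] + '.pdf'
            Copy-Item $pdf.FullName (Join-Path $current $sheetName) -Force
            Add-Content $log $pdf.Name
        }
        Start-Sleep -Seconds 300   # give Autodesk Desktop Connector time to sync this pass as a new version
    }
}
```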

We have released this script as a publicly available Deep Space Automation at this link.

egoutput
example output from the PowerShell script

Step 2b – Revizto 2D Sheet to BIM 360 PDF Connection

Now that we have a nicely named, version-rich PDF on BIM 360 Docs, it’s time to connect that up to Revizto.

Firstly – you should always publish 2D sheets from Revit BEFORE setting up the Docs connection if possible, as this is the only way to get the automated viewport overlays (Note – you can export with the “Sheet coordinates only” option in the Revizto Sheet Exporter from Revit).

Secondly – you then:

  • open Revizto
  • ensure you have connected Docs to the right BIM 360 hub in Revizto
  • browse to the “Current” folder for a specific discipline’s PDFs
  • click “Send to 2D”
  • select all and Done

    syncrev1
    Connecting Docs to Revizto 2D Sheets
  • then review that the correct sheets are going to be replaced and connected to the BIM 360 PDFs
syncrev2
Confirm that sheets are correctly connected using the Sheet Number

 

Final steps

Once your PDFs are flowing well between platforms, you may want to automate further. For example, the PowerShell script provided above could be set to run as a Windows Task. You would want it to run as soon as possible after the Aconex scheduled sync to BIM 360. As always, you should monitor your inputs and outputs for any problems or gremlins.
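
A rough sketch of registering that as a daily scheduled task might look like this – the script path and start time are placeholders; pick a time shortly after the nightly Aconex sync:

```powershell
# Sketch only - register the merge script as a daily scheduled task.
# The script path and start time are placeholders; run it shortly after the nightly Aconex sync.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\MergeAconexPdfs.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Deep Space Aconex PDF Merge' -Action $action -Trigger $trigger
```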

Done! You now have official documents from Aconex flowing regularly into Revizto via BIM 360. This means your site teams using Revizto on the iPad can now access the official For Construction PDF drawings (synced from the Aconex Doc Register) while in the field!