Turn Revit Data into Useful Information with Visualization Techniques and Workflows


The adoption of BIM technology in the construction industry has been steadily increasing, reaching a point where we can be sure it is here to stay. Most BIM processes and workflows are clear and established for the geometric and visualization side of the technology, but not for data management. We all agree that the parametrization and data provided by BIM software are key, but it is not clear which practices are best for obtaining data and processing it to understand patterns or reach conclusions that lead to better designs on every project. The main objective of this article is to dig into better practices for turning raw Revit data into great visualizations that enable better decisions and workflows on our construction projects.

What Is Data Visualization?

A primary goal of data visualization is to communicate information clearly and efficiently via statistical graphics, plots, and information graphics. Effective visualization helps users analyze data and evidence. It makes complex data more accessible, understandable, and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic follows the task. Tables are generally used where users will look up a specific measurement, while charts of various types are used to show patterns or relationships in the data for one or more variables.

Data visualization is both an art and a science. Increased amounts of data created by Internet activity and an expanding number of sensors in the environment are referred to as “big data.” The same concept can be applied to the construction industry where BIM technology has increased the amount of data or at least made it more accessible. Processing, analyzing, and communicating this data presents analytical challenges. That is why data visualization is key for improving not only the communication of our projects but the practical and clear representation of multiple sources of data.


To visualize data efficiently, the key factor is to get the right data. This might seem a really simple task, but unless you are dealing with just a small amount of data, like a couple of columns and rows in a spreadsheet, you will need robust and clear processes that allow you to take a huge amount of information from multiple sources (Revit, AutoCAD, external databases, the Internet, etc.) and multiple formats and extract just what you need. After a few steps you can create your visualizations and then start drawing conclusions.


The Potential of Revit Data

The first thing we need to know is which data is a potential candidate for visualization. It is obvious that we can transform a common Revit schedule or any simple spreadsheet into graphics, but here we are looking for data that needs to be processed and extracted out of Revit to obtain new insights about the model or design. The potential of data is limitless, but we are going to name a few approaches that have been very useful in our company over the last couple of years.

Assisted Design
Every professional from the construction industry that has been involved in a design process is aware that most decisions are not made based on clear data, but a mix of data and the designer’s intuition. Imagine if we could instead create many design options and have a dashboard with the comparison of the exact information like cost, square feet, sun and shadow effects on the building, etc., of each design, that will leave aside all ambiguities and make the designer and the owner’s life so much easier.
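The comparison such a dashboard would show can be reduced to a tiny sketch. The option names and figures below are hypothetical, standing in for metrics extracted from each Revit design option:

```python
# Sketch: ranking design options by cost per square foot.
# The option names and figures are hypothetical placeholders for
# metrics extracted from each Revit design option.

def compare_options(options):
    """Return (name, cost per sqft) pairs, cheapest first."""
    ranked = [(name, round(data["cost"] / data["area_sqft"], 2))
              for name, data in options.items()]
    ranked.sort(key=lambda pair: pair[1])
    return ranked

options = {
    "Option A": {"cost": 1_200_000, "area_sqft": 10_000},
    "Option B": {"cost": 1_500_000, "area_sqft": 14_000},
    "Option C": {"cost": 900_000, "area_sqft": 6_000},
}

for name, metric in compare_options(options):
    print(f"{name}: ${metric}/sqft")
```

In a real workflow the dictionary would be filled from exported model data and extended with daylighting, energy, or code metrics, and the ranked list would feed a chart rather than a print statement.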

This is also something that comes along with the new technologies like Project Dreamcatcher that allow us to iterate between more complex designs. How can we start designing more complex buildings with organic shapes if we cannot measure and compare between the different designs?


That is why visualization and data mining have become key factors in our current design processes, and they will matter even more in future ones. They allow us to visualize in real time how the design is changing, so we can see the effect of every change we make to our building.

I believe that in the future our role as BIM managers will not only be to deliver fully coordinated drawings and sometimes a couple of detailed schedules, but to provide a precise analysis of the model data and audit of the design in ways we just dream of today.

Collaborative Database
This procedure should be a must in every company because it is so powerful. The idea is to have all project data stored in the same database. This may sound simple, but the more projects you have, the more you benefit: you can store the knowledge gained from one project and apply it to others, improving your learning curve.


A good way to accomplish this is to study how your people are working and modeling. By exporting your database continually you can gain good insight (e.g., the sequence of modeling people are using, the time each task is taking them, the number of warnings on each phase). Then compare all that at the end of the project with the actual time and get conclusions about delays and how to avoid them in the future. This is also a good way of measuring rework.
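As a minimal sketch of this idea, the snippet below stores weekly snapshots in SQLite and pulls out the warning trend for one project. The schema, project names, and counts are invented for illustration; a real setup would append one row per model per week:

```python
# Sketch: a collaborative database of weekly model-health snapshots.
# Schema, project names, and counts are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a shared database
conn.execute("CREATE TABLE snapshots (project TEXT, week INTEGER, "
             "phase TEXT, warnings INTEGER)")
conn.executemany(
    "INSERT INTO snapshots VALUES (?, ?, ?, ?)",
    [("Tower A", 1, "Schematic", 12),
     ("Tower A", 2, "Schematic", 30),
     ("Tower A", 3, "Design Dev", 55)],
)

# Query the warning trend for one project, week by week
trend = conn.execute(
    "SELECT week, warnings FROM snapshots WHERE project = ? ORDER BY week",
    ("Tower A",),
).fetchall()
print(trend)  # [(1, 12), (2, 30), (3, 55)]
```

Because every project writes into the same tables, the same query works across the whole portfolio, which is where the comparison against actual delivery times becomes possible.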

QC for Validations of Company Standards
We all know how difficult it can be to keep up with internal BIM standards within our company, particularly if you have hundreds of projects going at the same time, with different building types and locations all over the world. You always have the option of a quality controller, a person who checks the quality of the models at all times. However, in practice that review is often superficial, or you end up hiring an army of quality controllers.

Our approach has been to export model databases once a week and run an automatic control on all models to make sure all of them follow the company standards. Doing this for all your models at the same time saves you a lot of time. You can keep a weekly record of all your company BIM progress.
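A hedged sketch of what one such automatic check might look like, assuming a company convention where view names must carry a discipline prefix (the prefixes and view names below are hypothetical):

```python
# Sketch: one automated QC rule run over exported model data.
# The discipline prefixes and view names are hypothetical; a real pass
# would chain many rules like this over every weekly export.
APPROVED_PREFIXES = ("AR_", "ST_", "ME_")  # assumed company convention

def check_view_names(view_names):
    """Return the view names that violate the naming standard."""
    return [name for name in view_names
            if not name.startswith(APPROVED_PREFIXES)]

exported_views = ["AR_Level 1", "ST_Foundations", "Copy of View 3", "ME_HVAC"]
violations = check_view_names(exported_views)
print(violations)  # ['Copy of View 3']
```

Each rule produces a list of violations per model, and logging those lists week by week gives you the company-wide BIM progress record mentioned above.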

Comparison of BIM Model Database Versus External Database
Something really useful for the design process is to compare your Revit database with external databases to see how your design compares with the average cost, use or energy, construction code, etc. So you end up having a tool that allows you to know if your design is building code compliant or if you are spending more money than the market average.
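One sketch of that comparison, flagging categories whose modeled cost runs above an external market benchmark. Both tables are hypothetical, keyed by category name:

```python
# Sketch: comparing modeled costs against an external benchmark.
# Both tables are invented for illustration.
model_costs = {"Concrete": 520_000, "Steel": 310_000, "Glazing": 140_000}
market_benchmark = {"Concrete": 480_000, "Steel": 350_000, "Glazing": 150_000}

def over_budget(model, benchmark, tolerance=0.05):
    """Return categories more than `tolerance` above the benchmark,
    mapped to how far above they are (as a fraction)."""
    flagged = {}
    for category, cost in model.items():
        reference = benchmark.get(category)
        if reference and cost > reference * (1 + tolerance):
            flagged[category] = round(cost / reference - 1, 3)
    return flagged

print(over_budget(model_costs, market_benchmark))  # {'Concrete': 0.083}
```

The tolerance keeps small fluctuations from raising alarms; the same pattern works for energy use or code limits by swapping the benchmark table.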

At the beginning this is not easy, because there is no single database you can download from the Internet that includes all the information you need, but once you start using this workflow you end up building your own comprehensive source of information.

Machine Learning
Machine learning is a field of computer science that gives computers the ability to learn without being explicitly programmed, so through multiple iterations you can get automatic solutions based on all the previous projects you have been working on. In other words, the more projects you feed your database the more precise your tool will be.
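Stripped to its most basic form, "learning from past projects" can be as simple as fitting a least-squares line to historical (area, cost) pairs. All figures below are invented; real models would use many more features:

```python
# Sketch: "learning from past projects" in its simplest form, a
# least-squares line fitted to historical (area, cost) pairs.
# All figures are invented; real models would use many more features.
def fit_line(points):
    """Ordinary least squares for y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

history = [(1000, 150_000), (2000, 290_000), (3000, 460_000)]
a, b = fit_line(history)
estimate = a * 2500 + b  # rough cost estimate for a new 2,500 sqft project
```

The more (area, cost) pairs the database accumulates, the more stable the fitted line becomes, which is exactly the "more projects, more precision" effect described above.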

Best Ways to Extract Revit Data

The first step into data visualization is to get the data extracted. We need to consider which tools to use, format, content, and when to extract it.

Export Methods

We all know that Dynamo is a powerful tool for multiple purposes and extracting information from Revit is no exception. Dynamo allows us to export to different platforms (Excel, Access, Microsoft SQL Server, MySQL, or SQLite) using the standard nodes or packages like Slingshot.


The main advantage of Dynamo is that it is easy to use and that, apart from the multiplicity of platforms, it also allows us to filter the information and export to many different data serialization formats such as XML, JSON, HTML, etc.
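Once element data is out of the model, serializing it is only a few lines of Python. The element list below is a hypothetical stand-in for whatever Dynamo (or an add-in) pulls out:

```python
# Sketch: serializing extracted element data to JSON and CSV with the
# standard library. The element list is a hypothetical stand-in for
# whatever Dynamo (or an add-in) pulls out of the model.
import csv, io, json

elements = [
    {"id": 1001, "category": "Walls", "area_sqft": 240.5},
    {"id": 1002, "category": "Floors", "area_sqft": 812.0},
]

json_text = json.dumps(elements, indent=2)

csv_buffer = io.StringIO()
writer = csv.DictWriter(csv_buffer, fieldnames=["id", "category", "area_sqft"])
writer.writeheader()
writer.writerows(elements)
csv_text = csv_buffer.getvalue()
```

JSON preserves nesting and types for downstream tools, while CSV drops straight into Excel, so it is worth emitting both from the same extraction pass.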


Revit Database Link
This allows you to maintain a relationship between a Revit project and a Microsoft Access, Microsoft Excel, or ODBC database. You can use Revit DB Link to export Revit project data to the database, make changes to the data, and import it back into the project. The database displays Revit project information in a table view that you can edit before importing. This table view also allows you to create Revit Shared Parameters, which adds new fields for those parameters in the related tables. Any change made to these new fields within the database updates the Shared Parameters upon future imports.

BIM Link
With Ideate BIMLink, Autodesk Revit users can pull information from a Revit file into Microsoft Excel and push volumes of precise, consequential BIM data back into the Revit model with speed, ease, and accuracy. Data management tasks and workflows take a small fraction of the time they once took, freeing up more hours. You gain unprecedented access to the Revit modeling data you need, for an enhanced workflow.

Custom Macro or Add-in
Another good option is to write a custom add-in or macro adapted to your specific processes.

Which Database to Use

Most of us work every day with Excel, which is sometimes regarded as a simple tool for inserting information into cells and creating relations between them, but that is an oversimplification. Excel is simple for new users to understand and learn, yet it is powerful and allows us to store a lot of information, process data, create algorithms, build visualizations, etc. That is why it is a great starting point for users who are not familiar with programming or have never worked with relational databases.

Microsoft SQL Server
Microsoft SQL Server is a relational database management system. As a database server, it is a software product with the primary function of storing and retrieving data as requested by other software applications — which may run either on the same computer or on another computer across a network (including the Internet). I would say that this database is far more robust and powerful than Excel if your main goal is data processing, so I would suggest this as a good option if you are a more advanced user.

Other Databases
There are many more database options including MySQL, SQLite, or whatever database you’re using. The important thing is that you have a tool that allows you to get the information that you need.

What to Export

The first thought might be to export only the information that you need, but the best practice is to export the entire Revit database every time, so you don’t lose any information. The key here is to get as much information as you can to make more robust processes on every project.
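With the full database exported, selection becomes a query-time decision rather than an export-time one. The rows below are a hypothetical slice of a full export:

```python
# Sketch: filtering a full export down to what one study needs.
# The rows are a hypothetical slice of a full Revit database export.
full_export = [
    {"category": "Windows", "level": "L1", "area": 20.0},
    {"category": "Ducts", "level": "L1", "area": 5.5},
    {"category": "Windows", "level": "L2", "area": 18.0},
]

# A daylighting study, for instance, only needs the windows
windows = [row for row in full_export if row["category"] == "Windows"]
total_glazing = sum(row["area"] for row in windows)
print(total_glazing)  # 38.0
```

If only the windows had been exported, a later energy study would force a re-export; keeping everything makes each new question a one-line filter.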


When to Export Revit Data

When to export Revit data depends on how you are going to use the information that you export and how you’ve set up the processes in your company. For a quick rule of thumb reference the chart below:

[Chart: when to export Revit data]

How to Transform Revit Data into Useful Information

Process mining analytics made us realize that we use our IT systems to store information, but those systems don’t automatically facilitate efficient processes. Data mining is the computing process of discovering patterns in large data sets using methods at the intersection of machine learning, statistics, and database systems. It is an essential process in which intelligent methods are applied to extract data patterns, and an interdisciplinary subfield of computer science. The overall goal of the data mining process is to extract information from a data set and transform it into an understandable structure for further use.

[Image: DIKW Pyramid]

So what we get from the DIKW Pyramid (above) is that there is a hierarchy of data. Our main goal is to transform raw Revit data into wisdom so we can predict events, make better decisions, and improve our processes. Behind the hierarchy there are multiple processes that we have to follow in order to go from bottom to top. The steps below show everything involved in making this happen:


Steps to Start Digging

Selection — We start by selecting the correct data we are going to use. For example, let’s say we have a complete Revit database and we want to dig into natural illumination calculations. We take the data related to this and exclude things like MEP or structural calculations that have nothing to do with what we are studying.

Clean data — In this step we make sure the data we are using is correct and remove incorrect data. For example, through algorithms we can detect whether there are outliers in our data set that need to be discarded, or missing values that need to be filled in.
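One simple outlier detector is a z-score filter. The room areas below are hypothetical; the 250.0 entry plays the role of a modeling error:

```python
# Sketch: a z-score filter, one simple way to flag outliers in a
# numeric column before analysis. Values are hypothetical; the 250.0
# entry plays the role of a modeling error.
from statistics import mean, stdev

def drop_outliers(values, z_max=2.0):
    """Keep only values within z_max standard deviations of the mean."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) <= z_max * s]

room_areas = [21.0, 22.5, 19.8, 20.4, 23.1, 18.9, 21.7, 20.2, 22.0, 250.0]
clean = drop_outliers(room_areas)
print(clean)  # the 250.0 outlier is gone
```

Note that this simple filter needs a reasonable sample size; with only a handful of values, a single extreme point inflates the standard deviation enough to hide itself.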

Normalize data — Here we normalize all the data, because it is common for information to come in different formats and scales: textual, numeric, binary, etc. We need to bring everything to a comparable scale or, if that is not possible, establish the correlation between the variables.
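Min-max normalization is the most common way to do this, rescaling each numeric column to [0, 1] so variables on very different scales become comparable (figures invented):

```python
# Sketch: min-max normalization, rescaling each numeric column to [0, 1].
# The cost and area figures are invented for illustration.
def min_max(values):
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # a constant column carries no signal
    return [(v - lo) / (hi - lo) for v in values]

costs = [100_000, 250_000, 400_000]
areas = [800, 1_000, 1_200]
print(min_max(costs))  # [0.0, 0.5, 1.0]
print(min_max(areas))  # [0.0, 0.5, 1.0]
```

After rescaling, a distance computed over cost and area weighs both variables equally instead of letting the larger-magnitude one dominate.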

Data mining — In this step we find patterns, essentially by applying different classification algorithms to the data. Some examples are decision trees, k-nearest neighbors, support vector machines, k-means, etc. We do it this way because we have large sets of information, and the only way to find patterns relatively quickly is to use these algorithms.
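To make this concrete, here is k-nearest neighbors, one of the algorithms named above, reduced to two hypothetical features per space (area and a cost index):

```python
# Sketch: k-nearest neighbors on two hypothetical features per space.
# Samples and labels are invented for illustration.
from collections import Counter

def knn_predict(samples, query, k=3):
    """samples: list of ((x, y), label). Classify query by majority vote."""
    nearest = sorted(
        samples,
        key=lambda s: (s[0][0] - query[0]) ** 2 + (s[0][1] - query[1]) ** 2,
    )[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

samples = [
    ((10, 1.0), "office"), ((12, 1.1), "office"), ((11, 0.9), "office"),
    ((40, 3.0), "meeting"), ((45, 3.2), "meeting"), ((42, 2.8), "meeting"),
]
print(knn_predict(samples, (41, 3.1)))  # meeting
```

The normalization step matters here: k-NN relies on distances, so un-normalized features on large scales would dominate the vote.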

Interpretation — Interpretation is not an automatic process, but something a person has to do by looking at the information. That is why visualization is key here: it allows us to understand quickly and accurately the output of the data mining process.


Learning to Code as an Architect

Among architects, civil engineers, and project managers, it is rare to find a professional with programming skills, but this is a paradigm that is changing really fast.

There are multiple reasons why we, as construction professionals, should learn programming:


The first reason is that it allows us to truly master our tools: when we learn how to code, we start to understand more deeply how the software we use on a daily basis works.

Also, it is no minor thing that by coding we gain the power to maximize what we can do with our programs, by developing customized solutions adapted to our specific needs.

[Image: Steve Jobs quote]

As Steve Jobs said, coding is important because it gives us a background in a different way of thinking, so in the end it can benefit not only the development of programmatic solutions but also the way we work in general.



A clear example of this paradigm shift is the UK, which is the guinea pig for the most ambitious attempt yet to get kids coding, with changes to the national curriculum. ICT (Information and Communications Technology) is out, replaced by a new “computing” curriculum including coding lessons for children as young as five.




Thousands of times I’ve heard people complain that the software they use cannot do all the tasks they want because of the program’s constraints, so they end up frustrated and wasting a lot of time and money. Here is good news for all of them: when you code, you can not only create your own standalone applications but also start creating plugins for the software you use. For this purpose, software companies create an API (Application Programming Interface), a set of functions and procedures that allows the creation of applications which access the features or data of an operating system, application, or other service.


Programming allows us to automate processes, so we can do tasks much faster than we did in the past. This leads to cost savings and also to better quality: we all know that when we have to do a task over and over again we get bored, and bored people make mistakes, but when something is automated you can guarantee exactly the same outcome every time.



It is no secret that due to globalization, technology, and communications, it is more difficult every day to get decent clients in our industry. The usual advice is to make sure your services have an added value that gives you a competitive advantage. Based on this, we can say that coding skills fit perfectly into the “added value” description, so even if you don’t yet know exactly how they might help you, be sure that this is a great goal to add to your career.

Contrary to what most people think, coding is extremely fun, and when you start creating applications it gets really addictive. Believe me when I say that you will spend hours and hours of your spare time in front of the PC trying to create the perfect algorithm.

[Image: programming languages supported by the Revit API]

Based on my experience programming for Revit, AutoCAD, Dynamo, and Grasshopper, I think your choice should be one of the programming languages above (all four are supported by the Revit API).

All of them have different advantages and disadvantages, but if I had to choose a language for a beginner I would definitely go with Python. Apart from the compatibility this language has with many AEC design programs, I think it is the easiest one to learn, as it reads almost like plain English, and it is also used a lot for data science and data visualization, skills that I think will soon be needed in the AEC industry.

[Image: steps to learn to program your software]

There are four steps you should follow if you want to efficiently become a good programmer of the software that you use.

  1. First of all, you need to learn how to use the software. I’ve seen many incredible programmers who don’t know what to program, or who create tools that are not really useful for the actual users.
  2. Learn a programming language. This might seem obvious, but it is not: I’ve met many people who try to learn an API without the appropriate programming-language skills, and they end up in an infinite loop of trial and error, in most cases getting nowhere. So don’t waste your precious time jumping into the API without learning a language first; there are many good free tutorials online.
  3. Before starting to work with the API, you should have basic knowledge of how to properly develop software, for example the use of GitHub, databases, UML, etc.
  4. Finally, search for the SDK (Software Development Kit) of your program and start learning how to use the API. If you have followed the previous steps, at this point it will be a piece of cake to develop anything you want.


Don’t be afraid: at the beginning, programming might look really difficult and frustrating, but believe me, once you get past the initial hurdle you will quickly start to develop new solutions, the way you work and think will change a lot, and you will start seeing the benefits.


About Interoperability

Interoperability is the capacity for data exchanged by one information technology system to be received and used by another.

Following this definition, we could say that there is almost always interoperability between construction software; but if we analyze in detail the way we can share elements between different platforms, this turns out not to be entirely true.

We hear all the time that any exchange of information between platforms, be it data or 3D models, is called interoperability, so let’s analyze which actions are truly included in this definition and which aren’t.


We can clearly see how compatibility and standards differ from interoperability. An example of the first is the following:


Revit is 3D modeling software from Autodesk that allows us to model a wide spectrum of specialties, such as structural, architectural, HVAC, drainage, electrical, etc. At the other end we have Tekla, software from Trimble, which is oriented specifically to the modeling of structures.

In the case presented above, Revit and Tekla belong to different companies, but despite that, both programs allow us to create links from one model to the other. That is why we can say both programs are compatible for a task like clash coordination; but this doesn’t mean we can modify a Tekla model in Revit or the other way around, so we can only use the models as references, not as a collaborative effort at its maximum potential.


One example of a standard is buildingSMART, an organization that promotes a format called IFC (Industry Foundation Classes), which gives us a digital language that allows information technology systems to exchange data freely and openly throughout the building lifecycle, improving construction, operation, and management.

This way, even when using different programs, we can reach the same results. Yet despite the fact that IFC is officially presented as the solution to interoperability, it does not make a model fully compatible across multiple platforms.
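One practical upside of IFC is that it is plain STEP text, so its entities can be inspected with nothing but the standard library. The fragment below is fabricated and heavily simplified (real IFC lines carry full attribute lists):

```python
# Sketch: counting entity types in an IFC (STEP) fragment.
# The fragment is fabricated and heavily simplified; real IFC lines
# carry full attribute lists.
import re
from collections import Counter

ifc_fragment = """
#1=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',$,'Wall-001',$,$,$,$,$,$);
#2=IFCWALL('3P3Gs$u5Y8Ag9OPfx4GMPI',$,'Wall-002',$,$,$,$,$,$);
#3=IFCDOOR('1N1Eq$s3W6Ye7NMdv2EKNG',$,'Door-001',$,$,$,$,$,$,$);
"""

entity_counts = Counter(re.findall(r"#\d+=(IFC\w+)", ifc_fragment))
print(entity_counts)  # Counter({'IFCWALL': 2, 'IFCDOOR': 1})
```

An audit like this (counting walls, doors, spaces across every exported model) works regardless of which authoring tool produced the IFC file, which is exactly the point of an open standard.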



The major achievement of interoperability will be real when we are able not only to exchange information between programs, but also to modify that model in a different platform and finally return it to our initial software, continuing to work as if nothing had happened.

It is true that there has been great progress in collaborative ways of working between different platforms, and there have even been collaboration agreements between companies like Trimble and Autodesk; however, the possibility of working with models across platforms simultaneously is still really far from being a reality in the BIM industry.


In the sequence above we can see the different levels of compatibility before reaching the point of full interoperability.

So we think of this ideal way of working as a unicorn: something we desire but don’t expect to become real, at least not in the short term.

The truth is that the absence of this ideal situation results not only in excess time spent on projects, but also in a barrier to collaboration, which is one of the main pillars of BIM methodologies and a critical necessity in the construction industry nowadays.

The good news is that, even though we have not reached the “unicorn” yet, the current market offers more solutions every day that allow us to create dynamic links between different programs. Among others, we can name:


Developments like Dynamo, Generative Components, Google Flux, Grasshopper, or BIM 360 allow us not only to make dynamic exchanges between platforms, but also to add more value to the information we exchange through actions like analysis, monitoring, modification, data visualization, process improvement, etc. We can say that all these workflows are currently leading us into a paradigm shift.

Our industry is still trying to get used to working with BIM, but at the same time these tools are allowing us to generate workflows that would not have been possible with standard software, giving us the possibility of maximizing the potential of our projects to a point we would not have imagined just a couple of years ago.


It is important to highlight that this new world of possibilities, which is currently emerging, not only allows us to improve our workflows but also to innovate and generate solutions adapted to our specific requirements.

So, in the end, we can be the ones responsible for the innovation and development of the technology we use in our design and construction processes.



Hello Stranger!

I’m a computational designer and architect. I work every day developing new solutions/processes for the AEC industry.

Here you will find new trends and innovative solutions that are currently being developed in the construction industry.


So, if you are the type of passionate person who works in construction and is always looking for new solutions and improvements, this is the place for you.




We will go from the initial idea to the actual construction process and maintenance of the building.

I think we cannot innovate unless we start thinking of the building lifecycle as a whole, instead of improving small, separate parts of the industry while each of us thinks only inside our own box and ignores the other links in the process.


Technology has evolved a lot in the last 100 years, and even more in the last 10. It seems like only yesterday that everybody started using smartphones, so it is really strange that we have barely changed the way we build in the AEC industry.

Yes, some will say that the technology we use, like cranes, drones, 3D models, BIM, Lean Construction, etc., is changing everything. But that is the point: even with all these innovations we have not changed the way we build, which can clearly be seen in the way we manage our projects and even in the materials we still use in our buildings.




It is now banal in the extreme to say that we are living in a rapidly changing world, and it can be misleading too. The challenge is to understand that things are not only getting faster, but also more volatile, uncertain, complex and ambiguous.

So, given the way we are used to living, we think that progress is just a little bit faster than it was a few years ago, and we feel like this:



But as Patrick Hollingworth states in his book “The Light and Fast Organization,” there are three forces (people, places, and technology) that combined are creating a perfect storm, where change will not be what it has always been, and in the end we are about to face the following situation:



So, don’t fear; instead, embrace the change and start following me. I will give you many of the useful tools you will need to survive in our industry!