Certain built environments, such as semiconductor and biopharma plants, are in constant flux to keep pace with innovation. These facilities undergo continuous renovation to accommodate new tools, processes and teams, and they rely on digital models to re-engineer and manage these spaces, avoid costly downtime and maximize production.

While downtime takes a toll on any business, it is especially damaging in high-stakes environments like semiconductor and biopharma, where even a one-second lull in production can cost millions. As important as facility reconstruction is for these industries, it is even more mission critical that the construction does not disrupt operations.

Adding to the complexity, these high-purity environments typically comprise an intricate web of pipes carrying chemicals and compounds that must be delicately routed around the facility. To meet these unique needs, many organizations deploy building information modeling (BIM) processes to gain a highly accurate understanding of the existing physical space prior to construction.

But achieving this level of insight relies on accurate, precise data, and not all methods for collecting and applying that data — known as the scan-to-BIM process — are created equal.

BIM in high-purity environments

BIM refers to a highly accurate, 3D digital model of the physical characteristics of a building or structure. It is increasingly being used in high-purity environments — namely semiconductor, biopharma and other manufacturing facilities — to understand the dimensions of an existing structure, keep updated records of the facility, detect possible construction errors, evaluate changes over time and examine the design intent for something that has yet to be built.

When executed properly, BIM processes can lead to greater construction efficiencies, including reduced rework, lower materials and labor costs, advanced clash detection and off-site prefabrication. BIM also supports highly accurate 3D models that allow these facilities to be designed in a virtual world first, to better understand the relationships between spaces, materials and the various systems within a physical structure, and how real-life elements will impact it.

It might go without saying that BIM relies on quality data; if that data is compromised, the outcome will suffer. This is where a highly accurate scan-to-BIM process comes into play.

Scan-to-BIM unpacked

Scan-to-BIM is the process of using a laser scanner to capture a 3D scan of an existing structure. This scan data is then imported into a 3D modeling program to create an accurate digital, as-built representation of the space or site. This model is then used to inform design, evaluate progress and determine options.
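For illustration, below is a minimal sketch of the import step, assuming the scanner export has been converted to a PLY file (the file name and parameter values are hypothetical) and using the open-source Open3D library to inspect and thin the point cloud before it is handed to a modeling package:

```python
# A minimal sketch, assuming the registered scan has been exported to a
# PLY file named "facility_scan.ply" (hypothetical name). Open3D is one
# common open-source option for reviewing point clouds before modeling.
import open3d as o3d

# Load the raw scan export.
pcd = o3d.io.read_point_cloud("facility_scan.ply")

# Thin the cloud to a manageable density (1 cm voxels here) so it can be
# reviewed interactively before import into the modeling software.
pcd_down = pcd.voxel_down_sample(voxel_size=0.01)

# Surface normals help downstream tools fit planes and pipes to the data.
pcd_down.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30)
)

o3d.visualization.draw_geometries([pcd_down])
```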

The scan-to-BIM process can further be broken down into four general stages that make up the workflow. Unfortunately, there are no industry-wide standards or templates to follow — and there is not yet a tool that can automate the workflow. But there are best practices that can be implemented prior to each phase to control quality and accuracy.

The general phases of a scan-to-BIM workflow include:

  • Capture. The gathering of information from a site or space using a variety of tools and devices. This might include 3D laser scanners, cameras, gauges, measurement tools and recording devices. These devices capture the raw information that will provide the basis or foundation of the digital as-built.

  • Processing. The conversion of the raw data collected from the capture devices into consumable elements that the modeling software can assimilate.

  • Modeling. The creation of a 3D virtual representation of elements based on the registered data. It can represent selected elements of the structure or the entire environment.

  • Quality check. The detection, correction and validation of the project information to guarantee model accuracy. 

Prior to each of these phases, however, there are quality assurance (QA) opportunities to prevent downstream mistakes or defects. QA measures implemented throughout the scan-to-BIM workflow will have a direct effect on the time investment needed during the quality-check phase.

Scan-to-BIM best practices & common mistakes

Time spent on QA will be time well spent, especially in highly technical, fast-paced, high-purity environments. The slightest mistakes made in environments that rely on agility and speed to innovate can create extraordinary costs later. 

QA practices can, and should, be deployed throughout each of the four scan-to-BIM phases. For instance, in the data-capture phase, there are pre- and post-mobilization factors to consider. To start, a scope of work should be well defined and include the timeline for milestones and deadlines, as well as the expectations for the final deliverable. Particularly in environments like semiconductor and biopharma that require a high degree of specificity, defining the desired level of detail (LOD) is imperative to success. Other areas to evaluate:

  • Equipment selection — laser scanners each have specific strengths and limitations, and no single device fits every job. Selection should be driven by the requirements of the project and which scanner is best equipped to meet them.

  • Equipment security — this is often overlooked, but the slightest bump during transport can throw off a scanner's calibration. Conducting a pre-scan that the calibration can later be checked against can save hours of potential rework.

  • Preplan data-collection methodology — evaluating and planning the methodology for data capture prior to arriving on site helps prevent key information from being overlooked. The environment will ultimately dictate which methodology — whether cloud-to-cloud, traverse, survey control, etc. — is the best process. Once chosen, the methodology should be discussed with everyone on the crew before the project begins so there is alignment.

  • Level of accuracy (LOA) and LOD — these are not the same thing and must be clearly defined in advance of project kickoff. LOA refers to how dimensionally accurate the data collection needs to be, while LOD refers to the details and items that need to be modeled from the raw data.

  • Confirm site conditions — preplanned methodologies were assumptions up to this point. Validating the methodology once on site — based on the current conditions — is critical prior to starting.

  • Close loops and maintain direction — when working with laser scanners, and particularly when using the traverse methodology, error accumulates as the number of scans grows. Closing the loop essentially creates small circles of overlap that detect commonalities and balance errors, reducing the impact of accumulation (a minimal misclosure check is sketched after this list). Additionally, simply maintaining a consistent direction with each scan (always facing north, for example) helps reduce deviation.
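To make the loop-closure idea concrete, here is a minimal sketch of a misclosure check with a Bowditch-style (compass rule) adjustment. The setup coordinates are illustrative, and real registration software handles this internally; the sketch only shows the underlying arithmetic.

```python
# A minimal misclosure sketch, assuming each scan setup's XY position has
# been estimated in a common frame (values below are illustrative only).
import numpy as np

# Estimated positions of scan setups around a closed loop (meters).
setups = np.array([
    [0.000, 0.000],   # setup 1 (loop start)
    [10.01, 0.004],   # setup 2
    [10.02, 8.015],   # setup 3
    [0.012, 8.009],   # setup 4
    [0.018, 0.011],   # setup 1 revisited (loop close)
])

# In a closed traverse the last setup should coincide with the first;
# the residual vector is the accumulated error.
misclosure = setups[-1] - setups[0]
print(f"loop misclosure: {np.linalg.norm(misclosure) * 1000:.1f} mm")

# Bowditch-style adjustment: distribute the misclosure along the loop in
# proportion to distance traveled, so no single scan absorbs all the error.
legs = np.linalg.norm(np.diff(setups, axis=0), axis=1)
frac = np.concatenate([[0.0], np.cumsum(legs) / legs.sum()])
adjusted = setups - frac[:, None] * misclosure

print(f"residual after adjustment: "
      f"{np.linalg.norm(adjusted[-1] - adjusted[0]) * 1000:.1f} mm")
```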

Similarly, the processing phase can be distilled into two stages: pre- and post-registration (registration is also known as point cloud registration or scan matching). During this phase, the point cloud data collected from multiple scan positions is aligned to produce a globally consistent model; a minimal cloud-to-cloud alignment sketch follows the list below. The margin for error here can be significant, making it imperative to implement QA measures in both the pre- and post-registration stages. Best practices in both stages include:

  • Validating survey data — relying on the registration program to validate the accuracy can lead to errors, as most programs hold survey data as an absolute. Instead, a series of checks should be placed throughout to identify survey errors before committing to the registration process.

  • Setting alignments — teams should understand why and how data will be aligned.

  • Inspecting scans manually — leaving inspection entirely to the software can also lead to errors, as overlapping data can mask problems. Each scan should be manually inspected for interruptions in data fluidity, which can be caused by environmental vibrations.

  • Transferring data — how data will be sent and shared must be considered, especially when working with dispersed teams. Scan files are generally extremely large, and transfers can create significant delays. Prior to this phase in the workflow, teams should determine how data will be uploaded and shared, what tools will be used to facilitate this, who is in charge of processing the raw data, etc.
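As referenced above, the following is a minimal sketch of a cloud-to-cloud alignment step using the open-source Open3D library. The file names, 5 cm correspondence threshold and identity initial alignment are illustrative assumptions, not a prescription; commercial registration packages wrap the same idea in their own workflows.

```python
# A minimal cloud-to-cloud registration sketch using Open3D's ICP
# implementation (file names and parameters are hypothetical).
import numpy as np
import open3d as o3d

source = o3d.io.read_point_cloud("scan_02.ply")
target = o3d.io.read_point_cloud("scan_01.ply")

# Point-to-point ICP refinement. 'threshold' is the maximum correspondence
# distance (5 cm here); 'init' is a rough initial alignment, e.g. from
# targets or a coarse global registration step.
threshold = 0.05
init = np.eye(4)
result = o3d.pipelines.registration.registration_icp(
    source, target, threshold, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# Fitness (overlap ratio) and inlier RMSE are quick QA numbers to record
# before accepting the alignment.
print(f"fitness: {result.fitness:.3f}, "
      f"inlier RMSE: {result.inlier_rmse * 1000:.1f} mm")

source.transform(result.transformation)
```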

In the modeling phase, communication between all stakeholders is key — and it is one of the most effective ways to reduce error. The modeler should also have prior experience and know what to look for in terms of errors. Before getting too far into the project, it is also a good idea to create a test sample of the model to verify all stakeholders are on the same page.

Additionally, a number of conversion tools are available, and similar to the scanning equipment, each has its strengths and weaknesses. Compatibility with the modeling software being used and the primary use for the model should both be considerations. Finally, in almost all cases the deliverable is an as-is model, so teams should take care to always model in the existing phase rather than the new-construction phase.

The final phase of the scan-to-BIM workflow is the quality check. This will require going back to cross-check the scope of work — from the expectations to the milestones to the LOD and LOA. Establishing a set of internal checklists and templates is helpful at this stage to create a consistent process that can be replicated each time.

With scan-to-BIM projects, there will be a massive amount of raw data. This should be filtered based on the actual needs of the project and aligned to the expectations and goals of all stakeholders. Verification can be done by reviewing the model in three views, such as a plan, a section and a 3D section box. Virtual walkthroughs using software can also help detect errors or overlooked items.
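A complementary, more quantitative check is to measure how far the registered scan deviates from the finished model. Below is a minimal sketch, again using Open3D; the file names, sample count and 10 mm tolerance are hypothetical, with the real tolerance coming from the project's LOA specification.

```python
# A minimal deviation check between a registered scan and the as-built
# model (file names and the 10 mm tolerance are illustrative).
import numpy as np
import open3d as o3d

scan = o3d.io.read_point_cloud("registered_scan.ply")
model_mesh = o3d.io.read_triangle_mesh("as_built_model.obj")

# Sample the model surface so it can be compared point-to-point.
model = model_mesh.sample_points_uniformly(number_of_points=500_000)

# Distance from each scan point to the nearest model point.
dists = np.asarray(scan.compute_point_cloud_distance(model))

tolerance = 0.010  # meters; in practice, taken from the project's LOA spec
outlier_ratio = (dists > tolerance).mean()
print(f"points beyond {tolerance * 1000:.0f} mm tolerance: {outlier_ratio:.1%}")
```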

Advancing construction in high-stakes environments

BIM is enabling those in manufacturing, particularly in high-stakes sectors like semiconductor and biopharma, not only to better understand their physical environments, but also to mitigate downtime and rework and to offset the shortage of trade labor by way of off-site prefabrication.

In any scenario in which manufacturing cannot be halted or reduced, BIM is a tremendous added value, but it will only be as valuable as the data used to inform it. When executed with the right QA/QC methodologies, BIM will enable those in industries like semiconductor and biopharma to retool their facilities faster and bring new products and innovations to market sooner.