Analyze FAQ

This article covers the Verity Analyze algorithm in more depth.

Analysis Duration

The Analyze Progress Bar

Depending on the scale of your data, the Analyze process can take anywhere from less than a minute to several days. It is therefore important to understand what is happening at each stage of the analysis.

  • 0%-10% – Loading geometry from the host. If your progress has stalled here, you are most likely importing a large number of elements.
  • 10%-35% – Triangle check and repair. If your progress has stalled here, you have element(s) with very complex geometry.
  • 35%-100% – Analyze with Verity’s algorithms. If your progress has stalled here, you are either analyzing against a large number of elements, or your point cloud is very large or dense.
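Purely as an illustration of the breakdown above, the stages map to progress ranges like this (this is a restatement of the list, not Verity code):

```python
# Illustrative only -- restates the progress ranges listed above,
# handy when deciding why a run has stalled. Not part of Verity.
def analyze_phase(progress_pct: float) -> str:
    if progress_pct < 10:
        return "Loading geometry from the host (stall: many elements)"
    if progress_pct < 35:
        return "Triangle check and repair (stall: complex geometry)"
    return "Verity analysis (stall: many elements or a large/dense point cloud)"

print(analyze_phase(22))  # Triangle check and repair (stall: complex geometry)
```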

As a companion application to the Host, Verity inherits an issue where Autodesk products show as “Not Responding” while they are processing information. Please be patient with the analysis. To confirm that the Host or Verity is still working, open Task Manager and look at the Host process. If either the CPU or memory usage numbers are changing, the application is still working.
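If you prefer to check programmatically, here is a minimal sketch using the third-party psutil package. The process name is an assumption: Navisworks typically runs as Roamer.exe and Revit as Revit.exe, so adjust it for your Host.

```python
# A minimal sketch: sample the Host process's CPU and memory twice.
# If either number changes between samples, the application is still
# working even if the window says "Not Responding".
# Requires the third-party psutil package (pip install psutil).
import time
import psutil

HOST_PROCESS = "Roamer.exe"  # Navisworks; use "Revit.exe" for Revit

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == HOST_PROCESS:
        cpu_1 = proc.cpu_percent(interval=1.0)  # CPU % over a 1 s window
        mem_1 = proc.memory_info().rss          # resident memory, bytes
        time.sleep(5)
        cpu_2 = proc.cpu_percent(interval=1.0)
        mem_2 = proc.memory_info().rss
        busy = cpu_1 > 0 or cpu_2 > 0 or mem_1 != mem_2
        print("still working" if busy else "possibly hung")
        break
```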

Storage Speed and Latency

The Host and Verity exchange a large amount of data through your hard drive. Because of this, we highly recommend storing all of your data on an internal SSD for optimal read/write speeds. We do not support running Verity on data stored over a wireless network connection; we expect this to either take an incredibly long time or not work at all. Because the Analyze algorithm is multi-threaded, a capable CPU also matters: a higher count of cores/logical processors plus a fast internal SSD have the biggest impact on Verity’s analysis speeds. For more information, refer to the System Requirements article.
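If you want a rough ballpark of a drive’s sequential write speed before committing a project to it, a sketch follows. A dedicated benchmark tool will give more reliable numbers; the file path and sizes here are arbitrary assumptions.

```python
# A rough, illustrative sequential-write speed check for the drive
# where your Verity data lives. Point TEST_FILE at the drive to test.
import os
import time

TEST_FILE = r"D:\verity_data\throughput_test.bin"  # hypothetical path
CHUNK = b"\0" * (4 * 1024 * 1024)                  # 4 MiB per write
TOTAL_MB = 512

start = time.perf_counter()
with open(TEST_FILE, "wb") as f:
    for _ in range(TOTAL_MB // 4):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())  # force buffered data to actually hit the disk
elapsed = time.perf_counter() - start

print(f"~{TOTAL_MB / elapsed:.0f} MB/s sequential write")
os.remove(TEST_FILE)
```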

Geometry Correction

Checking and fixing the geometry usually takes a small percentage of the analysis time. However, on data sets with extremely complex geometry (tens or hundreds of thousands of triangles per object), this step can increase analysis times significantly.

Number of Points

Verity must load the scans through the ReCap SDK, and on most data sets this takes the vast majority of the time. While this usually happens during the Add to Verity step (we recommend adding scans first when adding to Verity), it can still run at the very beginning of Analyze if not all point data has been imported. To minimize point cloud processing and import time, consider the following:

  • Reduce the number of scans you are using for the analysis. For Navisworks, add only the scan data necessary for the geometry and area you’re analyzing, not the entire point cloud. If scanner spacing is tight and element point coverage is high, you can also consider adding only every other scan location. For Revit, we currently only support adding the entire point cloud, so consider breaking your point cloud into defined areas before importing it into Revit to reduce unnecessary data being added to Verity. Alternatively, or in conjunction, you can remove unnecessary scan locations from your Scans List in Verity before analyzing.
  • Reduce the number of points in each scan through downsampling. Verity only needs a few thousand points distributed across each item to work well, so we recommend applying a grid decimation to the point cloud in your preferred registration software (see the sketch after this list). Remember that the more you downsample, the faster scan data will import, but at a certain point accuracy starts to suffer. This loss of accuracy can sometimes affect results, changing an item from Passing to Out of Tolerance or vice versa. There is no single grid decimation value we can recommend, because the size of your geometry and your point cloud coverage are the main determining factors and vary from project to project. If you have mostly large geometry with good scan coverage, you can decimate more; small geometry or poor scan coverage might mean you should not decimate at all. We would not recommend decimating above 25mm for most projects.
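To illustrate what grid decimation does, here is a minimal numpy sketch. Your registration software’s implementation will be more sophisticated (for example, keeping a cell’s centroid rather than its first point), but the effect on density is the same; the cell size and the random cloud are made up for the example.

```python
# A minimal sketch of grid (voxel) decimation: the cloud is binned
# into cubes of `cell` metres and one point is kept per occupied cube.
import numpy as np

def grid_decimate(points: np.ndarray, cell: float) -> np.ndarray:
    """points: (N, 3) array of XYZ; returns one point per occupied cell."""
    cells = np.floor(points / cell).astype(np.int64)       # cell index per point
    _, keep = np.unique(cells, axis=0, return_index=True)  # first point per cell
    return points[keep]

cloud = np.random.rand(1_000_000, 3) * 10.0  # fake 10 m cube of points
thinned = grid_decimate(cloud, cell=0.010)   # 10 mm decimation grid
print(len(cloud), "->", len(thinned), "points")
```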

Number of Items in a Verity Analysis

The time it takes to run the algorithms (after loading points from the Host) scales linearly with the number of items analyzed. Large complex objects with many points will take longer than smaller objects with fewer points, but the impact is not dramatic.


Why the Verity Algorithm Can Have Incorrect Results

Sometimes the Verity algorithm might guess incorrectly, or you might need to make manual adjustments and overrides. The Analyze algorithm is designed to assess the geometry against the point cloud data and provide a status and metrics to the best of its ability. However, no algorithm is perfect, and you should expect Verity to make a small number of mistakes. This is why the Review process is important to the workflow, and why we’ve tried to make it as easy as possible.

Grouped Geometry Fit Challenges

Groups of similarly sized elements can also be challenging for the Verity algorithms, because multiple suitable matches exist. In this case, Verity relies on the assumption that the work was installed as close to the right place as possible. If three 6″ pipes are installed on a pipe rack and the model shows three pipes, each modeled pipe is individually fit to the matching points based on proximity. If the rack is installed so that pipe A is closest to where pipe A was supposed to be, all is well. If pipe A was installed closer to where pipe B was supposed to be, the algorithm would fit both model pipes A and B to pipe A’s scan data, model pipe C to pipe B’s scan data, and nothing to pipe C’s scan data.
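To see how this cascade happens, here is a toy one-dimensional sketch. It is not Verity’s actual matcher, which is far more sophisticated; the pipe positions, the 0.6 m radius, and the greedy nearest-match rule are all illustrative assumptions.

```python
# Toy 1D illustration (not Verity's actual matcher): each modeled pipe
# independently grabs the nearest installed pipe's scan data within a
# search radius. If the rack is shifted, two model elements can claim
# the same scan data and one installed pipe goes unclaimed.
designed = {"A": 0.0, "B": 0.5, "C": 1.0}   # designed centerlines (m)
installed = {"A": 0.3, "B": 0.8, "C": 1.3}  # whole rack installed 0.3 m off
RADIUS = 0.6                                 # ~600 mm search area

for model, pos in designed.items():
    # each element is fit independently; nothing prevents reuse
    nearest = min(installed, key=lambda s: abs(installed[s] - pos))
    dist = abs(installed[nearest] - pos)
    match = nearest if dist <= RADIUS else None
    print(f"model pipe {model} -> scan data of pipe {match}")

# model pipe A -> scan data of pipe A  (0.3 m away)
# model pipe B -> scan data of pipe A  (0.2 m: A's points are closer than B's)
# model pipe C -> scan data of pipe B  (0.2 m); pipe C's data is never claimed
```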


Verity Algorithm Assumptions

There are three main assumptions Verity makes:

1) That the scan data is of high enough quality, has sufficient coverage, and is dense enough to accurately determine the status and fit of the element to the point cloud. This means that if the scan data is too noisy or too sparse, Verity won’t be able to determine a result. If noise causes so much uncertainty that the algorithm cannot find a good match, it will flag the element as either Not Found or Uncertain. If the data is too sparse, the element will be flagged as Not Enough Data. For the algorithms to work optimally, you want the range noise to be less than half the size of the elements you are testing, and you need good coverage on more than one face of the geometry (see the sketch after this list).

2) That the modeled geometry is very similar in shape and size to the real-world object as it was scanned. If you scanned a square penetration block-out and want Verity to match it to a pipe penetrating a slab in the model, you will be disappointed in both the installation status and fit. If someone modeled the ductwork without insulation wrap and the wrap was already installed when scanned, Verity will probably get the installation status right, but the fit will not be correct. In these cases, you can get the model corrected to match reality, or you can manually adjust the status and fit in Verity.

3) That the installed work is reasonably close to where it should be. If the installer put the pipe several feet or meters away from where it was supposed to go, Verity will not find it. The algorithm searches within a limited area (approximately 2 feet or 600mm) of the as-designed location for a match in the point cloud. If there isn’t a good match, Verity will flag that item as Not Found. If there is only a poor match, Verity may flag it as Uncertain.
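To make assumptions 1 and 3 concrete, here is a minimal sketch using the figures stated above (noise under half the element size, coverage on more than one face, and the roughly 600 mm search area). The function names and example values are hypothetical, not part of Verity.

```python
# Illustrative check of the quantitative rules of thumb above.
import math

SEARCH_RADIUS_MM = 600.0  # approximate search area from assumption 3

def data_quality_ok(range_noise_mm: float,
                    element_size_mm: float,
                    faces_with_coverage: int) -> bool:
    """Assumption 1: noise and coverage rules of thumb."""
    return (range_noise_mm < 0.5 * element_size_mm
            and faces_with_coverage > 1)

def findable(designed_xyz, installed_xyz) -> bool:
    """Assumption 3: installed item within the search area."""
    return math.dist(designed_xyz, installed_xyz) <= SEARCH_RADIUS_MM

# A 150 mm pipe scanned with 3 mm range noise, two faces covered,
# installed 250 mm from its designed position: analyzable.
print(data_quality_ok(3.0, 150.0, 2))     # True
print(findable((0, 0, 0), (250, 0, 0)))   # True
# The same pipe installed 1.5 m away would be flagged Not Found.
print(findable((0, 0, 0), (1500, 0, 0)))  # False
```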