Ultrasonic Testing (UT) is a common method of Nondestructive Testing (NDT) that uses high-frequency sound waves to measure the thickness of parts under inspection. UT is applied to metals, alloys such as steel, and composite structures, and is used in the aerospace, automotive, and oil and gas industries, among others.
To perform measurements, an inspector uses a transducer (with or without couplant) and observes the output on a probe screen. Discrete measurements are recorded in a table, while the results of continuous measurements are presented as a two-dimensional graph.
While these thickness readings would be perfectly clear to our Quality Inspector Jason, who performs the procedure, they would be ambiguous to our Senior Engineering Lead Lauren when she re-examines the part an hour later.
The results of an inspection only exist in a digital form, with no apparent connection to the physical part.
Scan results are plotted on a drawing of a part, or in a table. However, the process of converting these results into accurate locations of defects is cumbersome and prone to errors. This applies particularly to C-shaped parts (e.g., fairings) when they are scanned by machines.
How it works in AR
Instead of guessing the location of a defect from the paper, Lauren could use AR to see the complete scan result projected holographically directly onto the surface of a part.
Displaying scan results onto parts in Augmented Reality provides instant and unambiguous defect locations.
In this video, we demonstrate how the UT scan of a pipe is mapped onto its surface, bringing digital information about a defect into the physical world. If such a part had been inspected multiple times in the past, its legacy scan results could be integrated into the same interface and accessed collectively through an "air swipe" for comparison and trend analysis. The results of a sequence of similar inspections could be aggregated by labeling issue locations with spatial markers via the same AR interface.
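To give a sense of the geometry behind such an overlay, here is a minimal sketch of how a flat, rectangular C-scan grid might be unwrapped onto a cylindrical pipe surface. The function name and coordinate conventions are hypothetical, not taken from any particular AR system:

```python
import math

def cscan_to_pipe_surface(u, v, radius, scan_width):
    """Map a 2-D C-scan coordinate to a 3-D point on a pipe surface.

    u: axial position along the pipe (same length units as radius)
    v: circumferential scan position in [0, scan_width)
    scan_width: the circumference covered by the scan
    """
    # Unwrap the scan column into an angle around the pipe axis.
    theta = 2.0 * math.pi * (v / scan_width)
    return (u, radius * math.cos(theta), radius * math.sin(theta))

# Example: a scan covering the full circumference of a 50 mm radius pipe.
circumference = 2.0 * math.pi * 50.0
x, y, z = cscan_to_pipe_surface(120.0, circumference / 4.0, 50.0, circumference)
```

A real system would also need the pipe's pose in the AR headset's world frame, but the core idea is the same: every scan pixel gets a repeatable 3-D location on the physical part.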
Let's go back to Jason, who is now performing an A-scan of an inboard wing-flap panel. Usually, he would use a UT probe and Mylar - a plastic stencil with round holes that indicates the "spots" where measurements are to be taken. He would perform a measurement at each spot and observe the values on the probe screen. But what would happen if he were to get distracted and lose count of the spots? Or forget which spot showed a value outside of tolerance? He would likely need to start over. In this situation, digital data is disconnected from the physical object.
Instead, Jason could perform the same procedure within the AR interface, which stitches together digital and physical data in real-time. The need for mylar would be eliminated as each spot is highlighted and the inspection is guided dynamically. The result of the measurement would be displayed contextually next to each spot, creating a thickness heatmap.
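The bookkeeping such an interface automates can be sketched in a few lines. The helper below is hypothetical (names and tolerance logic are assumptions, not from any real inspection software); it tags each spot's reading as in or out of tolerance, which is exactly the per-spot status an AR overlay could render as a heatmap:

```python
def flag_spots(readings_mm, nominal_mm, tolerance_mm):
    """Return per-spot status for a guided spot inspection.

    readings_mm: thickness reading at each stencil spot, in scan order.
    A spot is out of tolerance if its reading deviates from the
    nominal thickness by more than tolerance_mm.
    """
    return [
        {"spot": i, "thickness": t,
         "in_tolerance": abs(t - nominal_mm) <= tolerance_mm}
        for i, t in enumerate(readings_mm, start=1)
    ]

# Four spots on a panel with a 5.0 mm nominal thickness, +/-0.2 mm tolerance.
spots = flag_spots([4.95, 5.02, 4.60, 5.01], nominal_mm=5.0, tolerance_mm=0.2)
bad = [s["spot"] for s in spots if not s["in_tolerance"]]  # → [3]
```

Because each result is tied to a numbered spot, losing count or forgetting an out-of-tolerance location is no longer possible: the data structure, and the holographic overlay driven by it, remember for the inspector.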
With AR, ambiguity and inaccuracy are eliminated as measurement locations are marked holographically directly onto parts.