Vision-Controlled 3D Printing Breakthrough for Multi-Material Constructs

In an article published in the journal Nature, researchers presented a new 3D printing platform that integrates a high-speed machine vision system to enable the fabrication of complex multi-material constructs with high resolution and speed.

Study: Vision-Controlled 3D Printing Breakthrough for Multi-Material Constructs. Image credit: Generated using DALL·E 3

3D printing, or additive manufacturing, builds objects layer by layer from materials such as metals, polymers, and ceramics. A popular approach called material jetting deposits tiny droplets of photosensitive resin and cures them with ultraviolet (UV) light to build up the object.

However, an inherent challenge with this method is the non-uniform thickness of printed layers arising from variability in droplet volumes. This leads to defects as irregularities accumulate over layers. Typically, each layer is planarized using a mechanical blade before printing the next layer, but this restricts material choices.

Material jetting relies on the precise deposition of tiny resin droplets from hundreds of nozzles onto a build platform. The liquid photopolymer resin is jetted layer-by-layer and cured using UV light, allowing intricate 2D patterns to stack into a 3D object. 

High-resolution material jetting can produce features down to tens of microns, enabling the printing of complex geometries. The dimensional accuracy of the final part depends on depositing equal resin volumes across layers. However, the discrete jetting process suffers from inherent droplet-volume variability arising from factors such as nozzle clogging. The resulting uneven layers accumulate thickness deviations, and these surface irregularities propagate uncorrected through the print, causing poor quality or outright failures. Mitigating layer variation is therefore critical for material jetting.

New Vision-Controlled Jetting System

The researchers developed a material jetting 3D printer integrated with a high-speed machine vision system to address this limitation. Four cameras and two laser sources rapidly scan each deposited layer, generating topographical data used to adjust the ink volume when printing the subsequent layer. This closed-loop feedback eliminates the need for mechanical planarization, allowing the printing of polymers incompatible with traditional jetting.

The integrated vision system performs in-process metrology, mapping the precise surface topography of each deposited layer. This enables direct measurement of layer-thickness deviations. The high-speed scanner acquires surface data through four cameras observing the print from different angles, while laser illumination casts shadows and contours that highlight microscopic surface features. This metrology data feeds back into the printing process, allowing ink deposition to be adjusted to compensate for any detected variation in layer thickness.
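As a rough, hypothetical illustration of this kind of closed-loop compensation (a sketch, not the authors' published controller), each voxel's commanded ink volume for the next layer could be scaled by its measured deviation from the target height:

```python
import numpy as np

def plan_next_layer(height_map, target_height, layer_thickness, max_scale=2.0):
    """Illustrative closed-loop compensation (hypothetical sketch):
    thin spots receive proportionally more resin, high spots less."""
    deviation = target_height - height_map        # positive = under-deposited
    scale = 1.0 + deviation / layer_thickness     # proportional correction
    return np.clip(scale, 0.0, max_scale)         # keep jetting commands physical

# Toy 3x3 scanned height map (mm): one low spot, one high spot.
h = np.array([[0.10, 0.10, 0.10],
              [0.10, 0.08, 0.10],
              [0.10, 0.12, 0.10]])
scale = plan_next_layer(h, target_height=0.10, layer_thickness=0.02)
```

Here the under-deposited voxel gets double the nominal volume and the over-deposited voxel gets none, so the next layer flattens the surface instead of propagating the defect.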

By detecting and correcting layer irregularities in real time, the closed-loop control eliminates the accumulation of surface defects that could otherwise cause print failures. The adaptive process also obviates mechanical smoothing, such as scraping each layer with a blade, which expands the range of usable materials to include polymers unsuitable for contact planarization.

Integrated Machine Vision System

Remarkably, the vision-guided system scans layers 660 times faster than previous methods and analyzes the data in under a second. This enables printing complex multi-material structures with voxel sizes down to 32 μm and throughput reaching 24 billion voxels per hour, on par with commercial devices.
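For perspective, the reported voxel throughput can be converted to a volumetric build rate with a quick back-of-the-envelope calculation, assuming cubic 32 μm voxels:

```python
voxel_edge_mm = 0.032                  # 32 um voxel edge, as reported
voxels_per_hour = 24e9                 # 24 billion voxels/hour, as reported

voxel_volume_mm3 = voxel_edge_mm ** 3                          # mm^3 per voxel
build_rate_cm3_per_hour = voxels_per_hour * voxel_volume_mm3 / 1000.0
```

This works out to roughly 786 cm³ of deposited material per hour.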

The integrated machine vision performs surface metrology at the microscale, acquiring 3D data about each deposited layer. This is enabled by four cameras that scan from different angles and laser illumination, generating a high-resolution topographic map of the print. A Graphics Processing Unit (GPU) then processes this data in real time, providing feedback to adjust printer parameters for the next layer.

The high-speed scanner developed by the researchers has an in-plane resolution of 64 μm and can measure variations in layer thickness down to 8 μm. This microscale precision is necessary to detect defects arising from droplet variability. The scanner acquires the metrology data orders of magnitude faster than previous vision systems used in 3D printing. This high throughput is vital for implementing real-time print corrections.

The vision data is processed by a GPU running a custom neural network-based algorithm that rapidly analyzes the surface topography; the analysis completes in under a second, enabling real-time control of the printer nozzles. By scanning, analyzing, and adapting print parameters between layers, the vision-guided system deposits precisely tuned resin volumes to smooth each layer, preventing microscopic defects from accumulating through the 3D print.
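A toy simulation (hypothetical, not the published system) illustrates why this per-layer feedback matters: without correction, droplet-volume noise accumulates across layers like a random walk, while scan-based correction cancels it at every layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_print(n_layers, closed_loop, noise=0.005, layer_t=0.02):
    """Toy model: each layer adds layer_t mm plus Gaussian droplet noise.
    With closed_loop=True, the commanded deposit corrects the scanned
    surface toward the target height (illustrative only)."""
    surface = np.zeros((16, 16))          # simulated height map (mm)
    target = 0.0
    for _ in range(n_layers):
        target += layer_t
        commanded = (target - surface) if closed_loop else layer_t
        surface = surface + commanded + rng.normal(0.0, noise, surface.shape)
    return np.ptp(surface)                # peak-to-valley roughness

open_loop = simulate_print(100, closed_loop=False)
feedback = simulate_print(100, closed_loop=True)
```

In this model the open-loop roughness grows with the number of layers, while the feedback-controlled surface stays at the single-layer noise level, mirroring the defect-accumulation argument above.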

Vision-Controlled Multi-Material Jetting

The researchers leveraged these capabilities to 3D print intricate designs using specialized materials. A bio-inspired robotic hand with integrated pressure sensing was printed using rigid and flexible polymers. Pneumatic channels enabled grasping motion in response to applied pressure.

A heart-like pump containing integrated valves, chambers, and conduits was also printed and could pump up to 2.3 liters per minute. The vision feedback was crucial for printing these complex fluidic systems with disparate materials.  

Additionally, metamaterials with precisely tuned mechanical properties were printed by modulating the geometry of internal trusses. The platform's precision facilitates printing such intricate structures spanning multiple materials. The high-resolution multi-material printing enabled the co-fabrication of rigid and soft components within a single construct, allowing the printing of fully functional devices such as robots and pumps.

Vision control was critical for printing compliant components like sensors and membranes integrated into complex assemblies. The adaptive ink deposition prevented defect accumulation over these delicate features. Combining stiff and flexible materials facilitated designs like the integrated soft gripper on the robotic hand, which included tactile sensing. The vision feedback maintained print fidelity over these disparate polymers.

Another demonstration was an integrated fluidic pump that reproduced critical functions of a biological heart, including chambers, valves, and membranes. Again, vision control prevented failure while printing this challenging system. The platform also enabled tuning the mechanical properties of metamaterials, such as compressive stiffness, by adjusting their internal truss geometry, highlighting its capability for microscale geometric control.

Future Outlook

The approach could incorporate electronic chips or other components into printed devices by precisely placing commercial parts. Further expanding the material palette and developing specialized inks will enable increasingly complex multi-functional systems.

According to the authors, machine-vision-guided feedback could become vital for multi-material additive manufacturing. The innovative vision-controlled jetting system expands the design space for 3D printed constructs spanning soft robotics, bio-inspired devices, metamaterials, and more.

As the technology matures, vision-controlled multi-material jetting could become ubiquitous in manufacturing complex engineered systems. This could expand 3D printing capabilities closer to the vast complexity and multi-functionality observed in nature.

Journal reference:

Buchner, T. J. K., et al. (2023). Vision-controlled jetting for composite systems and robots. Nature.

Written by

Aryaman Pattnayak

Aryaman Pattnayak is a Tech writer based in Bhubaneswar, India. His academic background is in Computer Science and Engineering. Aryaman is passionate about leveraging technology for innovation and has a keen interest in Artificial Intelligence, Machine Learning, and Data Science.

Citations

To cite this article in your essay, paper, or report:

  • APA

    Pattnayak, Aryaman. (2023, November 19). Vision-Controlled 3D Printing Breakthrough for Multi-Material Constructs. AZoAi. Retrieved on September 18, 2024 from https://www.azoai.com/news/20231119/Vision-Controlled-3D-Printing-Breakthrough-for-Multi-Material-Constructs.aspx.


