AI produces shape-morphing materials in minutes

Sep 12, 2025

Animals, plants, and even viruses change their shape to adapt to their environments. This is not a trait shared by most engineered materials, which are fixed in form and function.

Recent work by Liwei Wang, with Wei Chen and Ryan Truby of Northwestern University, could help bridge that gap: the researchers developed materials that reshape themselves in response to external factors, acting as if they had the built-in intelligence of living organisms such as flowers.

The team, led by Chen and Truby, developed an AI-driven design and 3D-printing method that can autonomously create material systems capable of changing shape when exposed to stimuli such as heat or light. The new materials engineering framework doesn't just design a material's structure; it also determines the best distribution of stimuli-responsive materials and the printing process parameters needed to achieve a desired shape change in response to environmental cues, all without human input.
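To make the co-design idea concrete, here is a minimal, hypothetical sketch (the names and fields are illustrative, not taken from the paper) of how structure, responsive-material placement, and print settings can be bundled into a single design that one optimizer updates jointly:

```python
# Hypothetical illustration only: one "design" object that groups every decision
# the framework would have to make, so an optimizer can search over all of them
# at once instead of hand-tuning each separately.
from dataclasses import dataclass

import jax.numpy as jnp


@dataclass
class MorphingDesign:
    density: jnp.ndarray       # structure: where material is placed (topology)
    material_map: jnp.ndarray  # which stimuli-responsive ink each voxel uses
    print_params: jnp.ndarray  # process settings, e.g. print speed or path spacing

    def flatten(self) -> jnp.ndarray:
        """Concatenate all design variables into one vector for joint optimization."""
        return jnp.concatenate([
            self.density.ravel(),
            self.material_map.ravel(),
            self.print_params.ravel(),
        ])
```

The point of bundling them is simply that the search space includes fabrication choices, not just geometry.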

[Image: 3D-printed yellow 2D ovals]

This new approach to engineering shape-morphing, stimuli-responsive materials integrates generalized topology optimization with hybrid data-physics differentiable simulation, two computational techniques that enable efficient design of complex systems. Together they accelerate high-dimensional design exploration and take fuller advantage of what advanced manufacturing can build. The system also adapts quickly to changing design requirements, producing new designs in minutes.
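As a rough illustration of the differentiable-simulation idea (a toy sketch under assumed names, not the authors' code or model): if a surrogate simulator that maps a design and a stimulus to a predicted shape is written in an autodiff framework, gradients of a shape-matching loss can update topology, material assignment, and print parameters together, which is what makes minute-scale design turnaround plausible.

```python
# Toy sketch of gradient-based co-design through a differentiable simulator.
# Everything here (the names, the "simulator") is a hypothetical stand-in,
# not the published method.
import jax
import jax.numpy as jnp


def simulate_shape(density, stimulus_mix, print_params, stimulus):
    """Stand-in for a hybrid data-physics model: maps a design plus an applied
    stimulus (e.g., heat or light) to a predicted deformed shape."""
    response = jnp.tanh(stimulus_mix @ stimulus)          # per-cell actuation
    return density * (1.0 + print_params[0] * response)   # toy deformation field


def loss(design, targets, stimuli):
    """Mismatch between predicted and target shapes, one target per stimulus."""
    density, stimulus_mix, print_params = design
    per_stimulus = jax.vmap(
        lambda s, t: jnp.mean(
            (simulate_shape(density, stimulus_mix, print_params, s) - t) ** 2
        )
    )(stimuli, targets)
    return jnp.sum(per_stimulus)


@jax.jit
def step(design, targets, stimuli, lr=1e-2):
    """One gradient update of structure, material mix, and print parameters together."""
    grads = jax.grad(loss)(design, targets, stimuli)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, design, grads)
```

A real pipeline would use an actual physics model or learned surrogate plus constraints from the printing process; the sketch only shows where the gradients come from.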

For a desired shape-morphing task, the team’s method automatically designs the materials and structure in just one minute, complete with all instructions for how they should be 3D printed. Fittingly, the produced designs have structures and shape-changing behaviors that often resemble biological systems, patterns that emerge naturally from the optimization process. This suggests the method not only improves performance but also uncovers new design principles that mirror nature.

They presented their findings in the paper “Autonomous Co-Design and Fabrication of Multi-Stimuli Responsive Material Systems,” published September 12 in the journal Science Advances.

“By combining AI, physics, and digital manufacturing, we’ve created a powerful tool for developing adaptive materials that could be used in medical devices, robotics, and other technologies that need to respond to changing environments or functional needs,” Chen said. “It’s a step toward smarter, more versatile materials that can do things traditional systems simply can’t.”

Recent advances allow materials to respond to multiple stimuli, but current design methods are limited to single responses and rely heavily on trial-and-error or expert intuition, restricting creativity and performance. Designing and manufacturing multi-responsive materials remains a significant challenge.

The team overcame this challenge by creating a system that automatically designs and fabricates materials that change shape in programmed ways under multiple triggers. The approach is also generalizable, extending to other manufacturing methods and responsive materials.

“This breakthrough helps close the gap between what stimuli-responsive materials we can design and how we actually build and manufacture them for practical engineering applications,” Truby said.

Complementing Wang’s computational efforts, Alex Evenchik, a PhD candidate at Northwestern, developed new inks and 3D-printing processes to fabricate stimuli-responsive materials called liquid crystal elastomers, building on prior work from Truby’s lab on designing and 3D printing responsive and multifunctional materials.

“Next steps include linking shape changes to specific applications. Can we leverage this system to design new, unintuitive ways to create artificial muscles, mechanical computers, or drug delivery devices? Now that we’ve demonstrated our approach works, I hope we can begin to apply it to address key societal challenges that traditional engineering materials can’t solve,” Evenchik said.

“We also see a synergy between nature and AI, where AI optimizes materials like evolution, and materials act like computer programs. We’re excited to study this synergy from both directions—for example, comparing the network topologies that emerged in design with those found in nature,” said Wang, who is continuing this research as an assistant professor of mechanical engineering at Carnegie Mellon University, where he joined the faculty in August 2024. “These directions will help deepen our understanding of programmable materials and expand their practical impact.”

This research is part of the US National Science Foundation-sponsored Boosting Research Ideas for Transformative and Equitable Advances in Engineering (BRITE) Fellow project “AI-Enabled Discovery and Design of Programmable Material Systems,” led by Chen. The overarching goal of the project is to co-design architecture, stimuli, and materials for programmable material systems. The work was also partially supported by the Northwestern University Materials Research Science and Engineering Center (NU-MRSEC) and the Office of Naval Research.