Researchers Use AI-Based Approaches for Optimizing Atomic Layer Deposition Process

To make computer chips, technologists around the world rely on atomic layer deposition (ALD), a technique that can create films as thin as a single atom. Businesses commonly use ALD to make semiconductor devices, but it also has applications in solar cells, lithium batteries and other energy-related technologies.

Today, manufacturers increasingly rely on ALD to make new types of films, but figuring out how to tweak the process for each new material takes time.

Part of the problem is that researchers primarily use trial and error to identify optimal growth conditions. But a recently published study -- one of the first in this scientific field -- suggests that using artificial intelligence (AI) can be more efficient.

In the ACS Applied Materials & Interfaces study, researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory describe multiple AI-based approaches for autonomously optimizing ALD processes. Their work details the relative strengths and weaknesses of each approach, as well as insights that can be used to develop new processes more efficiently and economically.

"All of these algorithms provide a much faster way of converging to optimum combinations because you're not spending time putting a sample in the reactor, taking it out, doing measurements etc. as you typically would today. Instead you have a real-time loop that connects with the reactor," said Argonne principal materials scientist Angel Yanguas-Gil, a co-author of the study.

Cutting edge, but with challenges

In ALD, two different chemical vapors, known as precursors, adhere to a surface, adding a thin layer of film in the process. This all happens inside a chemical reactor and is sequential: one precursor is added and interacts with the surface, then any excess is removed. Afterward, the second precursor is introduced and then removed, and the process repeats. In microelectronics, the ALD thin film might be used to electrically insulate nearby components in nanoscale transistors.
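For readers who prefer a concrete picture, one ALD cycle can be written as four timed steps: dose precursor A, purge, dose precursor B, purge. The short Python sketch below is illustrative only; the class, step names and timing values are assumptions made for explanation, not details from the study.

```python
# Minimal sketch of one ALD cycle as four timed steps.
# All names and timings here are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class ALDCycle:
    dose_a: float   # seconds precursor A is admitted to the reactor
    purge_a: float  # seconds to remove excess precursor A and byproducts
    dose_b: float   # seconds precursor B is admitted
    purge_b: float  # seconds to remove excess precursor B and byproducts

    def steps(self):
        """Return the cycle as an ordered list of (step, duration) pairs."""
        return [
            ("dose precursor A", self.dose_a),
            ("purge", self.purge_a),
            ("dose precursor B", self.dose_b),
            ("purge", self.purge_b),
        ]

cycle = ALDCycle(dose_a=0.5, purge_a=5.0, dose_b=0.5, purge_b=5.0)
print(cycle.steps())
```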

ALD excels at growing precise, nanoscale films on complex, 3D surfaces such as the deep and narrow trenches patterned into silicon wafers to manufacture today's computer chips. This has motivated scientists worldwide to develop new thin film ALD materials for future generations of semiconductor devices.

However, developing and optimizing these new ALD processes is challenging and labor-intensive. Researchers have to consider many different factors that can alter the process, including:

  • The complex chemistries between the molecular precursors
  • Reactor design, temperature and pressure
  • The timing for each dose of the precursors

To find ways of overcoming these challenges, Argonne scientists evaluated three optimization strategies -- random, expert system and Bayesian optimization -- the latter two using different AI approaches.

Set it and forget it

Researchers evaluated their three strategies by comparing how they optimized the dosage and purge times of the two precursors used in ALD. Dosage time refers to the time period when a precursor is added to the reactor, while purge time refers to the time needed to remove excess precursor and gaseous chemical products.

The goal: Find the conditions that would achieve high and stable film growth in the shortest time. Scientists also judged the strategies on how quickly they converged on the ideal set of timings using simulations that represented the ALD process inside a reactor.
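Framed as an optimization problem, the search space is the four timings (two dose times and two purge times) and the score is a figure of merit that rewards high, stable growth per unit of cycle time. The Python sketch below is a hedged illustration of such a merit function; the weighting and the function itself are assumptions for explanation, not the study's actual formula.

```python
# Illustrative figure of merit for an ALD recipe: favor high, repeatable
# growth per cycle while penalizing long total cycle times.
# This is a sketch only; the study's actual merit function may differ.

def figure_of_merit(growth_per_cycle, growth_std, cycle_time, stability_weight=1.0):
    """Higher is better: stable growth per unit of cycle time."""
    if cycle_time <= 0:
        raise ValueError("cycle_time must be positive")
    stable_growth = growth_per_cycle - stability_weight * growth_std
    return stable_growth / cycle_time

# Hypothetical example: ~1.0 angstrom/cycle with little scatter over an 11-second cycle
print(figure_of_merit(growth_per_cycle=1.0, growth_std=0.05, cycle_time=11.0))
```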

Linking the optimization approaches to the simulated system let the team measure film growth in real time after each cycle, based on the processing conditions the algorithms generated.

"All of these algorithms provide a much faster way of converging to optimum combinations because you're not spending time putting a sample in the reactor, taking it out, doing measurements, etc., as you would, typically. Instead you have a real-time loop that connects with the reactor," said Argonne Principal Materials Scientist Angel Yanguas-Gil, a co-author of the study.

This setup also made the process automatic for the two AI approaches by forming a closed-loop system.

"In a closed-loop system, the simulation performs an experiment, gets the results, and feeds it to the AI tool. The AI tool then learns from it or interprets it in some way, and then suggests the next experiment. And this all happens without human input," said Noah Paulson, a computational scientist at Argonne and the lead author.

Despite some weaknesses, the AI approaches effectively determined the optimal dose and purge timings for different simulated ALD processes. This makes the study among the first to show that thin-film optimization in real time is possible using AI.

"This is exciting because it opens up the possibility of using these types of approaches to rapidly optimize real ALD processes, a step that could potentially save manufacturers precious time and money when developing new applications in the future," concluded Jeff Elam, a senior chemist at Argonne and co-author.
