15-Mar-2023 - Ecole Polytechnique Fédérale de Lausanne (EPFL)

New AI model transforms understanding of metal-organic frameworks

The “MOFTransformer” is designed to be the ChatGPT for researchers who study MOFs

How does an iPhone predict the next word you’re going to type in your messages? The technology behind this, which is also at the core of many AI applications, is called a transformer: a deep-learning architecture that detects patterns in datasets.

Now, researchers at EPFL and KAIST have created a transformer for Metal-Organic Frameworks (MOFs), a class of porous crystalline materials. By combining organic linkers with metal nodes, chemists can synthesize millions of different materials with potential applications in energy storage and gas separation.

The “MOFTransformer” is designed to be a kind of ChatGPT for researchers who study MOFs. Its architecture is based on the transformer, a design developed by Google Brain that can process natural language and forms the core of popular language models such as GPT-3, the predecessor to ChatGPT. The central idea behind these models is that they are pre-trained on a large amount of text, so when we start typing on an iPhone, for example, models like this “know” and autocomplete the most likely next word.

“We wanted to explore this idea for MOFs, but instead of giving a word suggestion, we wanted to have it suggest a property,” says Professor Berend Smit, who led the EPFL side of the project. “We pre-trained the MOFTransformer with a million hypothetical MOFs to learn their essential characteristics, which we represented as a sentence. The model was then trained to complete these sentences to give the MOF's correct characteristics.”
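The "sentence completion" idea above can be sketched in miniature. The following is an illustrative toy, not the actual MOFTransformer: each hypothetical MOF is written as a "sentence" of tokens (here an invented vocabulary of metal nodes, organic linkers, and a pore-size label), pre-training is reduced to counting token co-occurrences, and "completing the sentence" means proposing the most likely missing token.

```python
from collections import defaultdict

def pretrain(mof_sentences):
    """Toy pre-training: count which tokens co-occur across MOF 'sentences'."""
    cooc = defaultdict(lambda: defaultdict(int))
    for sentence in mof_sentences:
        for token in sentence:
            for other in sentence:
                if other != token:
                    cooc[token][other] += 1
    return cooc

def complete(cooc, context, candidates):
    """Complete the sentence: pick the candidate seen most with the context."""
    return max(candidates, key=lambda c: sum(cooc[t][c] for t in context))

# Hypothetical corpus: metal node + linker + pore-size label per MOF.
corpus = [
    ["Zn", "BDC", "small-pore"],
    ["Zn", "BDC", "small-pore"],
    ["Zr", "BPDC", "large-pore"],
    ["Zn", "BPDC", "large-pore"],
]
stats = pretrain(corpus)
print(complete(stats, ["Zn", "BDC"], ["small-pore", "large-pore"]))  # → small-pore
```

A real transformer replaces the co-occurrence counts with learned attention weights over millions of MOF "sentences", but the completion principle is the same.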

The researchers then fine-tuned the MOFTransformer for tasks related to hydrogen storage, such as the storage capacity of hydrogen, its diffusion coefficient, and the band gap of the MOF (an "energy barrier" that determines how easily electrons can move through a material).

The approach showed that the MOFTransformer can achieve accurate results with far less data than conventional machine-learning methods require. “Because of the pre-training, the MOFTransformer already knows many of the general properties of MOFs; and because of this knowledge, we need less data to train for another property,” says Smit. Moreover, the same model can be used for all properties, whereas in conventional machine learning a separate model must be developed for each application.
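The "one pre-trained model, many properties" advantage can be sketched as follows. This is an illustrative toy with made-up features and labels, not the actual MOFTransformer code: a frozen "backbone" stands in for the pre-trained model's features, and each new property needs only a tiny linear head fitted on a small labelled set, while the backbone is reused unchanged.

```python
def backbone(mof):
    """Stand-in for frozen pre-trained features (hand-crafted toy numbers)."""
    metal_feature = {"Zn": 1.0, "Zr": 2.0}
    linker_feature = {"BDC": 1.0, "BPDC": 2.0}
    metal, linker = mof
    return [metal_feature[metal], linker_feature[linker]]

def fit_head(features, labels, lr=0.05, epochs=500):
    """Fit a small linear head on frozen features with plain gradient descent."""
    w, b = [0.0] * len(features[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            err = sum(wi * xi for wi, xi in zip(w, x)) + b - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(head, mof):
    w, b = head
    return sum(wi * xi for wi, xi in zip(w, backbone(mof))) + b

# The same backbone serves two different (made-up) target properties:
mofs = [("Zn", "BDC"), ("Zn", "BPDC"), ("Zr", "BDC"), ("Zr", "BPDC")]
feats = [backbone(m) for m in mofs]
uptake_head = fit_head(feats, [3.0, 4.0, 5.0, 6.0])   # toy "storage" labels
bandgap_head = fit_head(feats, [2.0, 1.5, 1.8, 1.3])  # toy "band gap" labels
print(round(predict(uptake_head, ("Zr", "BPDC")), 2))
```

In a conventional workflow, each property would need its own feature engineering and its own full model; here only the small per-property head changes, which is why far fewer labelled examples suffice.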

The MOFTransformer is a game-changer for the study of MOFs, providing faster results with less data and a more comprehensive understanding of the material. The researchers hope that the MOFTransformer will pave the way for the development of new MOFs with improved properties for hydrogen storage and other applications.
