04 April 2013

Modeling biomolecules

Molecular modeling

Anton Chugunov, PostNauka

Molecular modeling is a way to simulate the structure and functions of molecules on a computer.

First of all, I am interested in biological molecules.

Experiments are very important for biologists studying nature, because they allow the laws of nature to be probed directly. Physicists study physical laws, and biologists study the regularities found in living things. Zoologists are interested in how animals are built, botanists in plants, ecologists in populations of animals and plants on Earth. Molecular biologists study how molecules interact with one another and what comes of it inside a living cell. There are different approaches here. Biochemistry is interested in chemical reactions and the transformation of some molecules into others: how metabolism works, how enzymes work, how proteins are produced from DNA, and so on.

(By biochemistry I mean the whole range of biochemical sciences: molecular biology, virology, and so on.) Biophysics, in turn, is interested in the physical laws: how can two molecules interact with each other, and according to what laws does this happen? And can we work out these laws for ourselves and use them for theoretical modeling?

How do you measure the interaction of two molecules with each other? This requires special equipment. In biochemistry it can be the detection of fluorescence, which lets us judge whether two molecules interact, or a radioactive signal, from which we can tell that one molecule has "landed" on another. But once we know the laws by which molecules interact, we can move on to simulating the process and conduct the experiment on a computer. This is a distinct paradigm of experimentation, denoted by the Latin term in silico. It continues the classical Latinisms used in experimental science: in vivo, that is, in a living system, and in vitro, that is, "in glass" (in the laboratory). In silico pushes the phenomenon under study even further toward a theoretical description of the process. Literally it means "in silicon", that is, on a computer chip, in computer memory.

Even the ancient Greeks suggested that matter consists of atoms; in principle, it was probably enough simply to wonder what everything is made of. But until the 20th century the existence of atoms and molecules had not been fully confirmed. That changed with the work of Bohr and other physicists, who showed beyond doubt that atoms and molecules exist. By the middle of the 20th century, progress was enormous: the structures of biological molecules had been determined, including the structure of DNA and the structures of many proteins (hemoglobin, for example). The foundations of molecular modeling were laid at that time. The first molecular modelers were chemists who drew the structure of a molecule on paper. Even on paper one could draw two molecules side by side and imagine how electron density shifts or how nucleophiles attack electrophiles.

In one of his recorded interviews, the famous chemist Linus Pauling told the story of the discovery of the alpha helix, one of the main elements of protein chemistry. He said that he had fallen ill, was lying at home with a terrible cold, and was reading lousy detective stories. At some point he got tired of it, took a sheet of paper and began to draw chemical bonds. He drew a polypeptide backbone, that is, what proteins are made of, and began to fold the sheet. He folded and folded, and then it became obvious to him: there it was, the alpha helix. The way the main chain runs allows hydrogen bonds to close on one another and form stable elements of secondary structure. After he recovered, he went straight to work and a few months later published some eight papers almost simultaneously in the Proceedings of the National Academy of Sciences (PNAS).

We should all learn from Pauling, but since you have to be born a genius, for greater productivity molecules are nowadays modeled on a computer. You take a molecular editor, build the molecule in it, and run a calculation that lets you track how the molecule behaves. For instance, we can look at molecular dynamics: how a protein exists in solution, or how two molecules recognize each other; how, say, a small molecule first floats around in the solution, then interacts with a receptor, and something happens. Of course, all of this is very complicated, and so is the physics underlying these phenomena.

The most correct, most detailed framework underlying the behavior of molecules is quantum mechanics. But the Schrödinger equation, the basic equation of quantum mechanics, is too complicated to be solved for large biological molecules. Therefore molecular modeling mostly relies on classical laws, namely Newton's second law, which everyone knows from school. This makes it possible to represent a molecule as a set of "balls" held together by "springs". Say we take two balls on a spring and stretch them apart: the elastic force pulls them back together. Roughly the same thing happens with molecules in molecular modeling. Between each pair of atoms in a molecule we can prescribe an explicit force, and then solve Newton's equation. We then see how the molecule moves in time and can trace its dynamics.
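To make this "balls and springs" picture concrete, here is a minimal sketch in Python of two atoms joined by a harmonic bond, with Newton's second law integrated step by step. The force constant, mass, bond length and time step are illustrative placeholders, not values from any real force field, which would also include angle, torsion, electrostatic and van der Waals terms.

# Minimal "balls on a spring" molecular-dynamics sketch (1D, two atoms).
# All parameters are illustrative, not taken from a real force field.

k = 100.0   # spring (bond) force constant
r0 = 1.0    # equilibrium bond length
m = 1.0     # atomic mass
dt = 0.001  # integration time step

x = [0.0, 1.2]   # initial positions: the bond starts slightly stretched
v = [0.0, 0.0]   # initial velocities

def forces(x):
    """Harmonic bond: F = -k * (r - r0), acting along the bond."""
    r = x[1] - x[0]
    f = -k * (r - r0)
    return [-f, f]

f = forces(x)
for step in range(5000):
    # velocity Verlet integration of Newton's second law, a = F / m
    for i in range(2):
        x[i] += v[i] * dt + 0.5 * (f[i] / m) * dt * dt
    f_new = forces(x)
    for i in range(2):
        v[i] += 0.5 * (f[i] + f_new[i]) / m * dt
    f = f_new
    if step % 500 == 0:
        print(f"step {step:5d}  bond length = {x[1] - x[0]:.4f}")

Running this prints a bond length that oscillates around the equilibrium value, which is exactly the "two balls converging back under the elastic force" described above, only tracked numerically in time.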

There are other areas of molecular modeling as well. Bioinformatics, for example, lets you find patterns in genetic texts or in the amino acid sequences of proteins. There is also an approach that allows the structure of an unknown protein to be predicted: if we know how one receptor is built, we can construct a spatial model of another receptor that, we assume, is arranged roughly like the first one, but with its own peculiarities. This method is called homology-based modeling. G-protein-coupled receptors, in particular, can be modeled with this approach.
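As a toy illustration of the premise behind homology-based modeling, one might first check how similar the target sequence is to a template of known structure. The sequences below are invented, the sequences are assumed to be pre-aligned, and the rough identity threshold in the comment is a commonly cited rule of thumb rather than part of this talk.

# Toy sketch of the first step of homology-based modeling:
# compare the target sequence to a template whose structure is known.

def sequence_identity(target, template):
    """Percent identity over aligned positions (assumes equal-length,
    pre-aligned sequences; real pipelines use proper alignment tools)."""
    matches = sum(a == b for a, b in zip(target, template))
    return 100.0 * matches / len(target)

template = "MKTAYIAKQRQISFVKSHFSRQ"   # hypothetical protein of known structure
target   = "MKTAYLAKQRQLSFVKAHFSRQ"   # hypothetical protein to be modeled

print(f"Sequence identity: {sequence_identity(target, template):.1f}%")
# As a rough rule of thumb, identities above ~30% are often considered
# enough to model the overall fold by homology.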

Molecular modeling is constantly being improved and, frankly, still has a lot of room to grow. Its purpose, as I have said, is essentially to be a "molecular microscope": to examine in detail how molecules interact with each other and draw conclusions from that. It is worth mentioning what this might be useful for. One application is the rational design of medicines. If we know the structure of a receptor and the structure of the molecule acting on it, we can simulate their interaction and find out what happens next (for example, "see" the activation of the receptor). Or we may not know the chemical structure of the molecule acting on this receptor and can instead find it on the computer. That would be an example of rational drug design. But this process is quite complicated, and people have not yet learned to do it reliably. So one of the global challenges facing molecular modeling is to learn to design drugs from the structures of receptors and other molecular targets. Then the existence of molecular modeling will be fully justified, and everyone will understand why it exists.
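A purely conceptual sketch of the "find the molecule on the computer" step: virtual screening amounts to ranking candidate compounds by an estimated binding score for the receptor. The candidate compounds, descriptors and scoring function below are invented placeholders; real docking software instead scores explicit 3D poses of each ligand inside the receptor's binding pocket.

# Toy picture of virtual screening in rational drug design.
# Everything here is a made-up stand-in for a real docking calculation.

def toy_binding_score(ligand):
    """Hypothetical score: lower pretends to mean tighter binding."""
    return ligand["polarity_mismatch"] + 0.5 * ligand["steric_clash"]

candidates = [
    {"name": "compound_A", "polarity_mismatch": 1.2, "steric_clash": 0.4},
    {"name": "compound_B", "polarity_mismatch": 0.3, "steric_clash": 0.9},
    {"name": "compound_C", "polarity_mismatch": 0.8, "steric_clash": 0.1},
]

for c in sorted(candidates, key=toy_binding_score):
    print(f"{c['name']}: score = {toy_binding_score(c):.2f}")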

The author is a Candidate of Physical and Mathematical Sciences, a researcher at the Laboratory for Modeling Biomolecular Systems of the Institute of Bioorganic Chemistry named after Academicians M.M. Shemyakin and Yu.A. Ovchinnikov of the Russian Academy of Sciences.

Portal "Eternal youth" http://vechnayamolodost.ru04.04.2013
