Russian scientists improve deep learning method for neural networks




Researchers at the Institute of Cybernetic Intelligent Systems of Russia's National Research Nuclear University MEPhI have presented a new method of training the Boltzmann machine, a type of neural network, to optimize the processes of semantic coding, visualization and data recognition.

The results of the study are published in the journal Optical Memory and Neural Networks.
The study of deep neural networks of various architectures (convolutional, recurrent, autoencoder) is increasingly popular. Several high-tech companies, including Microsoft and Google, use deep neural networks to design intelligent systems. The spread of deep neural networks also popularized the concept of "deep" learning.

In deep learning systems, the process of selecting and tuning features is automated: the network itself determines and uses the most efficient algorithms for hierarchical feature extraction. Deep learning is characterized by training on large samples with a single optimization algorithm. Typical optimization algorithms adjust the parameters of all operations simultaneously and effectively estimate the influence of each parameter of the neural network using the so-called backpropagation method.
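
As a rough illustration of that idea (not code from the study), the following NumPy sketch trains a tiny two-layer network by backpropagation, adjusting all weights simultaneously from the output error; the toy data, network shape and learning rate are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 3 features, binary targets (illustrative only).
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Two-layer network: 3 -> 4 -> 1, sigmoid activations.
W1 = rng.normal(scale=0.5, size=(3, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

for step in range(1000):
    # Forward pass.
    h = sigmoid(X @ W1)       # hidden activations
    out = sigmoid(h @ W2)     # network output

    # Backward pass: propagate the output error toward the input,
    # adjusting all weights simultaneously, as the text describes.
    d_out = (out - y) * out * (1 - out)    # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)     # error signal at the hidden layer
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h
```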

"The ability of artificial neural networks to learn is their most intriguing property," says Vladímir Golovkó, professor at MEPhI's Institute of Cybernetic Intelligent Systems. "Like biological systems, neural networks model themselves, seeking to achieve the best possible pattern of behavior."

The revolution in neural network training came in 2006, after the publication of a paper by Geoffrey Hinton describing a technique for pretraining neural networks. It showed that a multilayer neural network can be trained effectively if each layer is first trained as a Boltzmann machine and the whole network is then fine-tuned with error backpropagation. Such networks came to be called deep belief networks (DBN).
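
A hedged sketch of the recipe described here, under common assumptions about Hinton's scheme: each layer is pretrained as a (restricted) Boltzmann machine with one step of contrastive divergence (CD-1), and the resulting weights would then initialize a feed-forward network fine-tuned by backpropagation. Layer sizes, learning rate and the placeholder data below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pretrain_rbm(data, n_hidden, epochs=10, lr=0.1):
    """Pretrain one RBM layer with CD-1 (biases omitted for brevity)."""
    n_visible = data.shape[1]
    W = rng.normal(scale=0.01, size=(n_visible, n_hidden))
    for _ in range(epochs):
        # Positive phase: hidden probabilities and samples from the data.
        h_prob = sigmoid(data @ W)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: one reconstruction step (CD-1).
        v_recon = sigmoid(h_sample @ W.T)
        h_recon = sigmoid(v_recon @ W)
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
    return W

# Stack two RBM layers greedily: each layer is trained on the
# hidden representation produced by the layer below it.
X = rng.random((100, 32))          # placeholder training sample
W1 = pretrain_rbm(X, 16)
H1 = sigmoid(X @ W1)
W2 = pretrain_rbm(H1, 8)
# W1, W2 would then initialize a feed-forward net fine-tuned by backpropagation.
```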

Golovkó analyzed the main problems and paradigms of deep machine learning, proposed a new training method for the Boltzmann machine, and showed that the classical training rule of this neural network is a special case of the method he proposed.

"The American scientists Minsky and Papert demonstrated at the time that the perceptron of a layer with threshold-threshold activation function creates the linear dividing surface from the point of view of the clbadification of images and therefore can not solve the problem of disjunction It was a pessimism as to the development of neural networks, but this last statement is only valid for the perceptron of a layer with the function of the neurons. monotonic threshold or continuous threshold activation, for example, sigmoidal activation of signals, the perceptron of a layer is able to solve the problem of exclusive disjunction because it separates the incoming space images in clbades using two straight lines, "explains the professor.


The work also analyzes the prospects of applying deep neural networks to data compression, visualization and recognition. Golovkó discussed the possibility of performing semantic coding with deep auto-associative neural networks.
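
As a minimal sketch of what such semantic coding might look like (the architecture and sizes are assumptions, not the paper's): a deep auto-associative network reproduces its input at the output, and the narrow bottleneck layer yields a compact code usable for compression and visualization. Training is omitted here; only the structure is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Auto-associative net: the target output is the input itself.
# Encoder 64 -> 16 -> 2, decoder mirrors it back to 64; the 2-unit
# bottleneck is the "semantic code" usable for visualization.
sizes = [64, 16, 2, 16, 64]
weights = [rng.normal(scale=0.1, size=(a, b)) for a, b in zip(sizes, sizes[1:])]

def forward(x, weights):
    acts = [x]
    for W in weights:
        acts.append(sigmoid(acts[-1] @ W))
    return acts

x = rng.random((1, 64))        # placeholder input image (untrained network)
acts = forward(x, weights)
code = acts[2]                 # 2-D semantic code at the bottleneck
reconstruction = acts[-1]      # decompressed approximation of x
```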

According to the author, this deep learning method can be very useful in neural network search engines, giving them a high speed of searching for relevant images.
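
One plausible reading of that speed claim (my assumption, not detailed in the article): if each image's semantic code is binary, relevant images can be ranked by Hamming distance using cheap elementwise operations, as in this toy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assume each image has been encoded into a 32-bit binary semantic code.
codes = rng.integers(0, 2, size=(10_000, 32))    # database of image codes
query = rng.integers(0, 2, size=32)              # code of the query image

# Hamming distance = number of differing bits; the nearest codes are
# the most semantically similar images under this scheme.
dist = (codes != query).sum(axis=1)
top5 = np.argsort(dist)[:5]
print("closest database images:", top5, "distances:", dist[top5])
```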


These scientific studies are successfully applied in the fields of computer vision, speech recognition and bioinformatics.