The world's first automated system for designing multimodal neural networks was created, and on its basis a multimodal neural system for recognizing moving objects in video images was developed.

A series of domestic general-purpose neural computers was developed. A prototype of the base model of the “NeuroLand” neural computer was created; it matches the best international counterparts and exceeds them in some parameters.

A technology for applying multimodal neural networks was developed: for recognizing ultrasonic location images in car passenger safety systems, and for recognizing chemical images in intelligent sensors for rapid detection of air pollution, among other applications.

An intelligent neural system was created that includes three subsystems: a general-purpose multimodal neural system, the “Neuro-Conveyer” subsystem of real-time recurrent neural networks, and the “TrendCaster” subsystem for forecasting time series.

A multimodal architecture of the research prototype of the intelligent neural system was developed that allows new modules to connect to the system and register automatically.
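The registration mechanism itself is not described in this summary; as a purely illustrative sketch, one common way to let new modules register automatically is a plug-in registry (all names below, such as ModuleRegistry, are hypothetical and not taken from the system):

```python
# Hypothetical illustration of automatic module registration; names such as
# ModuleRegistry and VideoRecognitionModule are NOT taken from the described system.
from typing import Callable, Dict, Type


class ModuleRegistry:
    """Keeps track of processing modules that plug into the host system."""

    def __init__(self) -> None:
        self._modules: Dict[str, Type] = {}

    def register(self, name: str) -> Callable[[Type], Type]:
        """Class decorator: a module registers itself simply by being imported."""
        def decorator(cls: Type) -> Type:
            self._modules[name] = cls
            return cls
        return decorator

    def create(self, name: str, **kwargs):
        """Instantiate a registered module by name."""
        return self._modules[name](**kwargs)


REGISTRY = ModuleRegistry()


@REGISTRY.register("video_recognition")
class VideoRecognitionModule:
    """Example modality-specific module (hypothetical)."""

    def process(self, frame):
        return {"objects": []}  # placeholder result


if __name__ == "__main__":
    module = REGISTRY.create("video_recognition")
    print(module.process(frame=None))
```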

A new method for inverse dynamics modeling in neural control systems using controlled disturbances has been developed and experimentally tested. The theory of dynamic associative memory with multiple feedback delays has been developed and experimentally confirmed.
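The controlled-disturbance method itself is not detailed in this summary; the sketch below only illustrates the general idea of direct inverse-dynamics modeling, in which a network is trained on excitation data to map a state transition back to the control that produced it (the toy plant and all parameters are assumptions):

```python
# Generic direct inverse-dynamics modeling sketch (NOT the specific
# "controlled disturbance" method of the report, which is not detailed here).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def plant(x, u):
    """Toy first-order nonlinear plant: x[t+1] = 0.8*x[t] + 0.3*tanh(u[t])."""
    return 0.8 * x + 0.3 * np.tanh(u)

# 1. Excite the plant with exploratory (disturbance-like) control signals.
u = rng.uniform(-2.0, 2.0, size=5000)
x = np.zeros(len(u) + 1)
for t in range(len(u)):
    x[t + 1] = plant(x[t], u[t])

# 2. Train a network to invert the dynamics: (x[t], x[t+1]) -> u[t].
X_train = np.column_stack([x[:-1], x[1:]])
inverse_model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                             random_state=0).fit(X_train, u)

# 3. Use the inverse model as a controller: ask it for the control that
#    should move the plant from its current state toward a desired one.
x_current, x_desired = 0.1, 0.5
u_cmd = inverse_model.predict([[x_current, x_desired]])[0]
print("suggested control:", u_cmd, "-> next state:", plant(x_current, u_cmd))
```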

An experimental software suite has been created for simulating neural control of dynamic objects using recurrent neural networks.

An improved system for neural control of dynamic objects was developed; it took first place at the international competition of magnetic levitation systems at the University of Siena, Italy.

A theory and a software model of a new type of dynamic associative memory were created; the memory can reconstruct itself after the loss of neurons, and it exhibits the world's first reproducible effect of treating amnesia by reminders of the past.

The pseudoregulation method and a new technology for training recurrent neural networks were developed and experimentally tested, increasing neural forecasting accuracy by a factor of 2-10 and multi-step neural forecasting accuracy by 10-30%.

A new method for training feedforward neural networks with delay-line signal inputs for multi-step forecasting of numerical sequences was developed, together with a neurotechnology and software for multi-step forecasting that provide a two-fold improvement in forecast quality.
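As an illustration of the general delay-line (sliding-window) approach to multi-step forecasting with a feedforward network, not of the authors' specific training method, a minimal sketch on a synthetic series:

```python
# Sketch of delay-line (sliding window) multi-step forecasting with a
# feedforward network; illustrates the general technique only, not the
# specific training method developed by the authors.
import numpy as np
from sklearn.neural_network import MLPRegressor

P, H = 12, 4                       # delay-line length and forecast horizon
t = np.arange(600)
series = np.sin(0.15 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)

# Turn the series into (delay-line window -> next H values) training pairs.
X = np.array([series[i:i + P] for i in range(len(series) - P - H)])
Y = np.array([series[i + P:i + P + H] for i in range(len(series) - P - H)])

model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=3000,
                     random_state=0).fit(X[:-50], Y[:-50])

# Direct multi-step forecast from the last available window.
forecast = model.predict(series[-P:].reshape(1, -1))[0]
print("next", H, "steps:", np.round(forecast, 3))
```

Here all H future values are predicted at once from a single window; an iterated variant that feeds each prediction back into the delay line is equally common.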

A new concept of open dynamic systems, based on the notion of observation, was developed; it creates entirely new opportunities for interpreting and predicting the behavior of objects of any nature, for studying complex processes in socio-economic and ecological systems, and for developing effective models of information processes in nature and society.

A sampling method was developed and experimentally tested that improves the accuracy of neural network forecasting of time series with long-term dependence. A new type of activation function was proposed that overcomes the fundamental vanishing-gradient effect, which is detrimental to all types of deep neural network training.
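The proposed activation function is not specified in this summary; the sketch below only illustrates the vanishing-gradient effect itself, contrasting a saturating activation with a non-saturating one whose derivative stays bounded away from zero:

```python
# Illustration of the vanishing-gradient effect (NOT the authors' proposed
# activation function, which is not specified here): a saturating activation
# versus a non-saturating one with a derivative bounded away from zero.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)                  # <= 0.25, -> 0 for large |z|

def leaky_relu_grad(z, alpha=0.1):
    return np.where(z > 0, 1.0, alpha)    # never smaller than alpha

z = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print("sigmoid grad:   ", np.round(sigmoid_grad(z), 4))
print("leaky-ReLU grad:", leaky_relu_grad(z))

# Through L layers the backpropagated signal scales roughly like grad**L:
L = 20
print("sigmoid grad**L:   ", np.round(sigmoid_grad(z) ** L, 8))
print("leaky-ReLU grad**L:", leaky_relu_grad(z) ** L)
```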

Deep recurrent neural networks were applied to the problem of detecting arguments in natural language text (“Argument Mining”), yielding a 12% quality improvement over the best results reported worldwide. The theory of orthogonal neural networks was further developed; it overcomes the negative vanishing-gradient effect in perceptron-like feedforward neural networks by means of the developed method of orthogonal projection of the weight matrices.
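The exact projection procedure is not given in this summary; a standard way to realize an orthogonal projection of a weight matrix is the nearest-orthogonal-matrix projection via the SVD, sketched below under that assumption:

```python
# Standard nearest-orthogonal-matrix projection via SVD, given only as an
# illustration of the general idea; the authors' exact projection method
# is not detailed in this summary.
import numpy as np

def project_to_orthogonal(W):
    """Return the orthogonal matrix closest to W in the Frobenius norm."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(0)
W = rng.normal(size=(6, 6))
W_orth = project_to_orthogonal(W)

# All singular values become 1, so repeated multiplication during
# backpropagation neither explodes nor shrinks the gradient norm.
print("original singular values: ", np.round(np.linalg.svd(W)[1], 3))
print("projected singular values:", np.round(np.linalg.svd(W_orth)[1], 3))
```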

For the first time, deep recurrent neural networks were applied to the problem of categorizing the complexity of medical terms, and the software package “BiLSTM-CNN-CRF tagger” for natural language processing with deep recurrent neural networks was developed. The obtained scientific results make it possible to use the developed methods and software tools for sequence labeling tasks, in particular in natural language processing. The results were introduced into the educational process of the Ukrainian Catholic University (Lviv).
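As a rough illustration of the kind of architecture behind such a tagger, a minimal BiLSTM sequence-labeling skeleton in PyTorch is sketched below; the character-level CNN encoder and the CRF output layer of the actual package are omitted, and all dimensions are assumptions:

```python
# Minimal BiLSTM sequence-labeling skeleton, shown only to illustrate the
# general architecture; the char-CNN encoder and CRF layer of the actual
# "BiLSTM-CNN-CRF tagger" package are omitted, and all sizes are assumed.
import torch
import torch.nn as nn


class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):              # (batch, seq_len)
        embedded = self.embedding(token_ids)   # (batch, seq_len, emb_dim)
        hidden, _ = self.bilstm(embedded)      # (batch, seq_len, 2*hidden)
        return self.classifier(hidden)         # per-token tag scores


if __name__ == "__main__":
    model = BiLSTMTagger(vocab_size=5000, num_tags=7)
    tokens = torch.randint(0, 5000, (2, 15))   # a toy batch of 2 sentences
    scores = model(tokens)
    print(scores.shape)                        # torch.Size([2, 15, 7])
```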

The concept of open dynamic systems was proposed; it gives a new interpretation of the control of adaptive systems and the training of deep neural networks, sheds light on the well-known arrow-of-time paradox, and identifies possible causes of gross errors of deep learning methods. The practical significance of the obtained scientific results lies in the possibility of using the concept of open dynamic systems for a deeper study and improvement of deep learning processes and for the theoretical justification of new developments in artificial intelligence.

The interaction of internal information flows in deep learning was studied, and a fundamentally new model of the neural network as an open dynamic system was proposed. The concept of an open dynamical system was developed as a general model of the processes of development, control, and training of dynamic systems of any nature. This concept extends the subject area of systems theory to open dynamic systems.

A new modification of the symmetric dynamic Hopfield network was proposed, suitable for modeling the recovery of associative memory content and for simulating the behavior of the DNA molecule; it can be applied in artificial intelligence systems to realize self-organization effects. In addition, the structure and new properties of the nonlinear operator responsible for convergence and for updating the content of the associative memory of the dynamic Hopfield network were determined.
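The modification itself is not described in this summary; for reference, a minimal sketch of the classical symmetric Hopfield associative memory (Hebbian storage, asynchronous updates) that such modifications build on:

```python
# Classical symmetric Hopfield associative memory (Hebbian storage and
# asynchronous updates); shown as the baseline model that the described
# modification builds on, not the modification itself.
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=5, seed=0):
    """Asynchronous updates drive the state toward a stored pattern."""
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train_hopfield(patterns)

noisy = patterns[0].copy()
noisy[:2] *= -1                       # corrupt two neurons
print("recovered:", recall(W, noisy))
print("original: ", patterns[0])
```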

The behavior of multistable neural networks in the convergence mode was studied. It was found that when the minimum of the energy function is reached, an excess of energy appears, formed during the nonlinear transformation of the current state vector into the external response of the neural network. This excess represents a hidden source of energy that ensures the existence and development of a living cell.
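The energy function whose minimum is reached during convergence is presumably the classical Hopfield energy; for reference:

```latex
% Classical Hopfield energy, presumably the energy function referred to above;
% s_i are neuron states, w_{ij} the symmetric weights, \theta_i the thresholds.
E(\mathbf{s}) = -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i
```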
 

