Discussion
The research described here is motivated by the temporal dynamics of living neural systems, and especially by the temporal abilities of humans and higher animals. Our brains can respond to time-varying signals, can generate time-varying patterns, can process information (think) over time, can represent concepts and images mentally for arbitrary intervals of time, and have differing states of ongoing, self-sustained activity (awake, aroused, sleeping). Furthermore, we seem to possess innate time-related skills and dynamics, such as recognition of spatiotemporal patterns as they occur; coordination of internal processing in the brain, despite the absence of any apparent controlling clock; and self-sustained dynamic activity in many areas of the brain, whether through oscillation (e.g., respiratory neurons) or other, more complex continuing activity ("spontaneous activity"). This extensive array of temporal capabilities and time-varying activity points to a temporally dynamic neural network underlying these processes. To date, many neural models show pattern-mapping abilities but lack the dynamics and temporal behavior of the systems they are intended to model.
We have explored a series of paradigms that concern dynamic activity in neural networks. We have illustrated how a simple neural network model can develop dynamic attractors, self-sustained activity, and chaos. Control over the parameter g, a weight multiplier, allows modulation of the dynamics, with a progression from a simple fixed-point attractor to chaos. Once chaotic activity patterns are generated in a network, applying a stimulus pattern can lock the network into a limit-cycle attractor. This scenario suggests a potential way to perform pattern recognition and signal classification. Because dynamic systems can have complicated basin boundaries for their attractors, there is reason to expect increased performance and generalization capabilities from this type of approach.
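The role of g can be seen in a minimal numerical sketch. The update rule x ← tanh(g·Wx), the network size, and the gain values below are illustrative assumptions rather than the specific model used in this work; they show the characteristic progression from a decaying fixed point at small g to persistent, irregular activity at larger g.

```python
# Minimal sketch (assumed model): a random recurrent network whose gain g
# moves the dynamics from a fixed point toward chaos.
import numpy as np

rng = np.random.default_rng(0)
n = 100
W = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))  # random weights, variance 1/n

def run(g, steps=500):
    """Iterate x <- tanh(g * W x) and record one unit's activity."""
    x = rng.normal(0.0, 0.1, size=n)  # small random initial state
    trace = []
    for _ in range(steps):
        x = np.tanh(g * W @ x)
        trace.append(x[0])
    return np.array(trace)

for g in (0.5, 1.5, 3.0):
    tail = run(g)[-100:]                     # discard the transient
    print(f"g = {g}: late-time std of unit 0 = {tail.std():.4f}")
# For g below 1 the activity decays to the zero fixed point (std near 0);
# for larger g the network sustains irregular, chaotic-looking activity.
```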
Developing multiple attractors in a neural network can be accomplished via an accretional method with weight perturbations. In the resulting network, each member of a set of initial states evokes its own attractor. Computational tasks in pattern classification and associative memory could then be accomplished through differing initial states evoking differing dynamic attractors.
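As a rough illustration of such an accretional procedure, the sketch below keeps small random weight perturbations only when they increase the separation between the attractors evoked by a fixed set of initial states. The gain, the attractor signature, the separation score, and the acceptance rule are all assumptions made for illustration; the method referenced above is not specified at this level of detail.

```python
# Hedged sketch of an accretional scheme: accept a weight perturbation dW
# only if it improves the worst-case separation between evoked attractors.
import numpy as np

rng = np.random.default_rng(1)
n, g = 20, 1.8
W = rng.normal(0, 1 / np.sqrt(n), (n, n))
inits = rng.normal(0, 1, (3, n))             # states meant to evoke distinct attractors

def late_mean(W, x0, transient=300, window=100):
    """Time-averaged late state, used as a crude attractor signature."""
    x = x0.copy()
    for _ in range(transient):
        x = np.tanh(g * W @ x)
    acc = np.zeros(n)
    for _ in range(window):
        x = np.tanh(g * W @ x)
        acc += x
    return acc / window

def separation(W):
    """Smallest pairwise distance between the attractor signatures."""
    sigs = [late_mean(W, x0) for x0 in inits]
    return min(np.linalg.norm(a - b)
               for i, a in enumerate(sigs) for b in sigs[i + 1:])

score = separation(W)
for _ in range(200):                          # accretional loop
    dW = rng.normal(0, 0.02, (n, n))          # small random weight perturbation
    new = separation(W + dW)
    if new > score:                           # keep perturbations that help
        W, score = W + dW, new
print(f"final minimum attractor separation: {score:.3f}")
```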
In dynamic binary networks, exploration of attractor basins and of the flexibility of those basins showed the capacity for attractors to be considerably higher than the memory capacity of the static Hopfield network (approximately 0.15n patterns). With as few as five neurons, a dynamic binary network can realize thousands of basin classes, that is, distinct divisions of the pattern space into basins.
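Basin classes can be enumerated directly for networks this small. The following sketch assumes synchronous sign-threshold updates and random Gaussian weights (both illustrative choices); for each weight setting it partitions all 32 states of a 5-neuron network by the attractor each state reaches, then counts how many distinct partitions appear across random weight settings.

```python
# Sketch: count distinct basin partitions of a 5-neuron binary network
# under synchronous updates x <- sign(W x) (an assumed update rule).
import numpy as np
from itertools import product

n = 5
rng = np.random.default_rng(2)

def step(W, x):
    return np.where(W @ x >= 0, 1, -1)        # binary threshold update

def basin_partition(W):
    """Group all 2^n states by the attractor (state cycle) they reach."""
    label = {}
    for bits in product((-1, 1), repeat=n):
        x = np.array(bits)
        seen, path = {}, []
        while tuple(x) not in seen:
            seen[tuple(x)] = len(path)
            path.append(tuple(x))
            x = step(W, x)
        label[bits] = frozenset(path[seen[tuple(x)]:])  # the cycle reached
    return frozenset(frozenset(s for s, c in label.items() if c == cyc)
                     for cyc in set(label.values()))

partitions = {basin_partition(rng.normal(size=(n, n))) for _ in range(2000)}
print("distinct basin classes among 2000 random weight settings:", len(partitions))
```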
To train a specific attractor into a network, a neural network with time delays was trained to generate a closed-loop trajectory. The trained network generates this trajectory in spite of noisy starting conditions and with differing initial segments. The result is a robust signal and path generator for communications and control applications.
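The flavor of this result can be reproduced with a much simpler stand-in: below, a linear readout on a tapped delay line is fit by least squares to a circular trajectory and then run closed-loop from a noisy initial segment. The delay-line generator, the circle target, and the linear readout are assumptions for illustration; the trained time-delay network described above is nonlinear and more general.

```python
# Sketch: fit a tapped-delay-line generator to a closed trajectory (a
# circle) and run it closed-loop from a noisy start.
import numpy as np

T, d = 400, 4                                 # training length, delay taps
t = np.arange(T) * 0.1
target = np.stack([np.cos(t), np.sin(t)], axis=1)   # target trajectory

# Build (history of d points -> next point) training pairs.
X = np.stack([target[i:i + d].ravel() for i in range(T - d)])
Y = target[d:]
Wout, *_ = np.linalg.lstsq(X, Y, rcond=None)  # least-squares readout

# Generate from a noisy initial segment.
hist = target[:d] + np.random.default_rng(3).normal(0, 0.05, (d, 2))
traj = list(hist)
for _ in range(300):
    nxt = np.array(traj[-d:]).ravel() @ Wout
    traj.append(nxt)
traj = np.array(traj)
radii = np.linalg.norm(traj[-100:], axis=1)
print(f"late radius mean = {radii.mean():.3f}, std = {radii.std():.3f}")
# Values near 1.0 indicate the generator has locked onto the circle
# despite the perturbed starting segment.
```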
Impulse trains add a further dimension of spatiotemporal processing in biological neural systems. Temporal patterns of nerve impulses, and synchronies among ensembles of neurons, are putative codes for information processing and representation. The firing activity of neurons and neural ensembles could reflect transients and dynamic attractors superimposed on the impulse-train structure of biological neural processing.
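One way to picture this superposition, shown in the sketch below, is to treat the unit activations of a recurrent network as firing rates driving Poisson spike generation. The network model, the rate mapping, and all parameters are assumptions for illustration, not a model proposed in this work.

```python
# Sketch (assumed model): attractor dynamics rate-coded into impulse trains
# via inhomogeneous Poisson spiking.
import numpy as np

rng = np.random.default_rng(4)
n, g, steps, dt = 50, 1.8, 300, 0.001         # dt: seconds per update
W = rng.normal(0, 1 / np.sqrt(n), (n, n))
x = rng.normal(0, 0.1, n)

spikes = np.zeros((steps, n), dtype=bool)
for ti in range(steps):
    x = np.tanh(g * W @ x)                    # underlying attractor dynamics
    rate = 100.0 * (x + 1) / 2                # map activation to 0-100 Hz
    spikes[ti] = rng.random(n) < rate * dt    # Poisson spiking per time bin
print(f"mean firing rate: {spikes.mean() / dt:.1f} Hz")
```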
The general problem of recognition and generation of spatiotemporal signals appears solvable with dynamic neural networks, although much research remains to be done. The ability to generate and train self-sustained activity, based on dynamic oscillating attractors, is shown in the preliminary results described here.