#### Symbolic Dynamics of Neurophysiological Data

ERP data are given as large ensembles of short, nonstationary (transient), and noisy time series. The symbolic dynamics of ERPs describes the intertrial coherence of polarity deflections by means of running cylinder entropies and related measures. We have proposed heuristics for the symbolic encoding of ERP data, such as the median encoding (REF), the half-wave encoding (REF; REF; REF), and stochastic resonance analysis (SRA), which is based on findings on threshold stochastic resonance (REF). SRA in particular allows one to discriminate ERPs under conditions where voltage averaging fails (REF, REF, REF, REF).
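As a minimal sketch of the median encoding and a running cylinder entropy (function names and parameters are ours, not the original implementation): each sample is mapped to a binary symbol by thresholding at the ensemble median, and at each latency the Shannon entropy of length-n symbol words (n-cylinders) is computed across trials, so that low entropy indicates high intertrial coherence.

```python
import numpy as np
from collections import Counter

def median_encode(trials):
    """Binary encoding: 1 where a sample exceeds the ensemble median, else 0."""
    return (trials > np.median(trials)).astype(int)

def running_cylinder_entropy(symbols, n):
    """At each latency t, the Shannon entropy (bits) of the distribution of
    length-n symbol words (n-cylinders) taken across trials."""
    n_trials, T = symbols.shape
    h = np.empty(T - n + 1)
    for t in range(T - n + 1):
        words = Counter(map(tuple, symbols[:, t:t + n]))
        p = np.array(list(words.values()), dtype=float) / n_trials
        h[t] = float(-(p * np.log2(p)).sum())
    return h

rng = np.random.default_rng(0)
trials = rng.normal(size=(50, 200))      # toy ensemble: 50 trials, 200 samples
s = median_encode(trials)
h = running_cylinder_entropy(s, n=3)     # running entropy of 3-cylinders
```

For pure noise, as here, the running entropy stays close to its maximum of n bits; an ERP deflection locked across trials would show up as a transient entropy drop.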

The Figure shows a 3-symbol encoding of a noisy signal exhibiting stochastic resonance at the extrema of the signal.
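A hedged toy version of such a 3-symbol encoding (thresholds and noise levels are illustrative choices of ours): a subthreshold sinusoid is mapped to symbols {-1, 0, +1} via symmetric thresholds, and moderate noise makes threshold crossings appear preferentially near the signal's extrema, which is the threshold stochastic resonance effect.

```python
import numpy as np

def three_symbol_encode(x, theta):
    """Map samples to {-1, 0, +1} via symmetric thresholds at +/- theta."""
    s = np.zeros(x.shape, dtype=int)
    s[x > theta] = 1
    s[x < -theta] = -1
    return s

rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0 * np.pi, 2000)
signal = 0.8 * np.sin(t)                 # subthreshold: |signal| < theta
theta = 1.0

s_clean = three_symbol_encode(signal, theta)            # no crossings at all
noisy = signal + rng.normal(scale=0.3, size=t.size)
s_noisy = three_symbol_encode(noisy, theta)             # crossings appear

# With moderate noise, crossings cluster near the extrema, where the gap
# between signal and threshold is smallest.
near_extrema = np.abs(signal) > 0.6
rate_extrema = np.count_nonzero(s_noisy[near_extrema]) / near_extrema.sum()
rate_elsewhere = np.count_nonzero(s_noisy[~near_extrema]) / (~near_extrema).sum()
```

Without noise the encoding is identically zero; with moderate noise the nonzero symbols trace the otherwise invisible subthreshold signal, which is what makes the encoding useful where voltage averaging fails.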

#### Emergence in Complex Neural Networks

A leaky integrator (LI) unit is the simplest model neuron described by an ordinary differential equation (REF). We have shown that as few as two recurrently connected LI units can form nonlinear neural oscillators possessing limit cycles, which are known, e.g., from thalamo-cortical pathways (REF). We have studied networks of coupled LI units in order to model global properties of the EEG (REF, REF). Moreover, we are investigating the issue of contextual emergence in neural networks (REF).
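A sketch of such a two-unit LI oscillator (all parameter values are illustrative choices of ours, not taken from the cited work): an excitatory unit E with self-excitation coupled to an inhibitory unit I, integrated with forward Euler. The parameters are chosen so that the unique fixed point (E, I) = (0.5, 0.5) is an unstable spiral inside the invariant box [0,1]^2, which forces the trajectory onto a limit cycle.

```python
import numpy as np

def f(u):
    """Logistic activation."""
    return 1.0 / (1.0 + np.exp(-u))

# Two recurrently coupled leaky integrator units (illustrative parameters):
# excitatory unit E with self-excitation, inhibitory unit I fed by E.
a, b, c, P, Q, tau = 16.0, 16.0, 16.0, 0.0, -8.0, 1.0
dt, T = 0.01, 60.0
n = int(T / dt)
E = np.empty(n); I = np.empty(n)
E[0], I[0] = 0.51, 0.50                  # small kick off the fixed point

for k in range(n - 1):                   # forward Euler integration
    E[k+1] = E[k] + dt * (-E[k] + f(a * E[k] - b * I[k] + P)) / tau
    I[k+1] = I[k] + dt * (-I[k] + f(c * E[k] + Q)) / tau
```

After a transient, E and I settle into sustained relaxation-type oscillations; a single LI unit, by contrast, can only relax monotonically toward a fixed point.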

The Figure shows simulated EEG power spectra obtained from recurrent networks of 20, 100, 200, 500, and 1000 LI units whose synaptic connections were drawn at random such that 80% of the synapses are excitatory and 20% inhibitory. The spectra are computed at the oscillatory phase transition where super-cycles emerge in the network's topology.
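An illustrative stand-in for such a simulation (network size, coupling strength, noise drive, and the use of the mean activity as an EEG proxy are all assumptions of ours): a random recurrent LI network with roughly 80% excitatory and 20% inhibitory synapses is integrated with forward Euler, and the power spectrum of its mean activity is estimated with a raw periodogram.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100                                   # network size (one of 20..1000)
tau, dt, steps = 0.010, 0.001, 8192       # 10 ms time constant, 1 ms steps

# Random synaptic matrix: each column (presynaptic unit) is excitatory with
# probability 0.8 and inhibitory with probability 0.2, so roughly 80% of the
# synapses are excitatory.
sign = np.where(rng.random(N) < 0.8, 1.0, -1.0)
W = np.abs(rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))) * sign

x = np.zeros(N)
eeg = np.empty(steps)                     # EEG proxy: mean network activity
for k in range(steps):
    drive = rng.normal(scale=0.5, size=N)             # unspecific noise input
    x = x + dt / tau * (-x + W @ np.tanh(x) + drive)  # forward Euler LI step
    eeg[k] = x.mean()

freqs = np.fft.rfftfreq(steps, d=dt)                  # frequency axis in Hz
psd = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2      # raw periodogram
```

Repeating this for increasing N lets one compare how spectral peaks sharpen or shift with network size, in the spirit of the figure.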

#### Neural Field Theories

Very large networks of LI units can be described by a continuum approximation (REF). Starting from the LI equation, the sum over the nodes connected to one unit is replaced by an integral transformation of a neural field quantity, where the continuous parameter now indicates the position of a unit in the network. Correspondingly, the synaptic weights turn into a kernel function. In addition, for large networks the finite propagation velocity of neural activation has to be taken into account. We discuss the solvability and invertibility of neural field equations for general synaptic kernels (REF, REF, REF) and their applicability to computational psycholinguistics (REF, REF) and to cognitive science in general (REF).
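As a sketch of this continuum limit (notation ours, following common Amari-type conventions): the discrete LI network

$$
\tau\,\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{N} w_{ij}\, f\big(x_j(t)\big)
$$

turns, as $N \to \infty$, into the neural field equation

$$
\tau\,\frac{\partial u(x,t)}{\partial t} = -u(x,t) + \int_{\Gamma} w(x,y)\, f\!\left(u\Big(y,\; t - \frac{\|x-y\|}{v}\Big)\right) \mathrm{d}y ,
$$

where $u(x,t)$ is the neural field over positions $x \in \Gamma$, the kernel $w(x,y)$ replaces the synaptic weights $w_{ij}$, and the delay $\|x-y\|/v$ accounts for the finite propagation velocity $v$ of neural activation.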

The Figure illustrates the continuum limit: starting from a discrete neural network and approaching a continuous neural tissue.
