Search results

Number of results: 2

Abstract

The way brain networks maintain high transmission efficiency is believed to be fundamental to understanding brain activity. Brains consisting of more cells render information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and functional diversity, rather than just size and the number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels, and we measure information as the mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which comprises the essential qualitative mechanisms of information transfer. We review and discuss our previous quantitative results regarding brain-inspired networks, addressing their qualitative consequences in the context of the broader literature. It is shown that mutual information is often maximised in a very noisy environment, e.g. one in which only one-third of all input spikes are allowed to pass through noisy synapses and further into the network. Moreover, we show that inhibitory connections, as well as properly placed long-range connections, often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematical tools plays an important role in explaining the nature of brain efficiency. Our results confirm that the basic brain components that appear in the course of evolution arise to optimise transmission performance.
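As a concrete illustration of the channel view taken above, the sketch below estimates the mutual information between a binary stimulus and the response of a simple probabilistic neuron whose synapse drops spikes at random, loosely in the spirit of the Levy-Baxter model. This is not the authors' code: the parameter names and values (including the one-third synaptic success probability) and the plug-in entropy estimator are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): estimating mutual information
# between a Bernoulli stimulus and the response of a simple probabilistic
# neuron with a noisy synapse, loosely in the spirit of the Levy-Baxter model.
# All parameter names and values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=200_000, p_spike=0.5, p_synapse=1/3, threshold=1):
    """Binary stimulus X passes a noisy synapse (success prob. p_synapse);
    the neuron fires (Y=1) if the transmitted input reaches the threshold."""
    x = rng.random(n) < p_spike                     # input spike train
    transmitted = x & (rng.random(n) < p_synapse)   # synaptic noise drops spikes
    y = transmitted.astype(int) >= threshold        # simple deterministic activation
    return x.astype(int), y.astype(int)

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from the empirical joint distribution."""
    joint = np.histogram2d(x, y, bins=[2, 2])[0]
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

x, y = simulate()
print(f"I(X;Y) ~ {mutual_information(x, y):.3f} bits")
```

Sweeping p_synapse and recording the resulting estimate would reproduce, in miniature, the kind of noise-versus-information trade-off discussed above.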


Authors and Affiliations

B. Paprocki
A. Pregowska
J. Szczepanski

Abstract

The Lempel-Ziv complexity (LZ) is one of the mathematical tools for measuring the rate at which new patterns are generated along a sequence of symbols. Under additional assumptions, LZ is an estimator of entropy in the Shannon sense. Since entropy is considered a measure of randomness, LZ can therefore also be treated as a randomness indicator. In this paper, we applied the LZ concept to the analysis of different flow regimes in cold-flow combustor models. Experimental data for two combustor configurations, motivated by the need for efficient mixing, were considered. Extensive computer analysis was applied to develop a complexity approach to the analysis of velocity fluctuations recorded with hot-wire anemometry and the PIV technique. A natural method of encoding these velocity fluctuations was proposed. It turned out that, with this encoding, the complexity values of the sequences are well correlated with the values obtained by means of the RMS method (larger/smaller complexity corresponds to larger/smaller RMS). However, our calculations pointed out the interesting result that the most complex, i.e. the most random, behavior does not coincide with the “most turbulent” point determined by the RMS method, but is located at the point of maximal average velocity. It seems that the complexity method can be particularly useful for analyzing turbulent and unsteady flow regimes. Moreover, complexity can also be used to establish other flow characteristics, such as ergodicity or mixing.
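To make the complexity measure concrete, the following sketch counts Lempel-Ziv (LZ76) phrases in a binarized signal and normalizes the count into an entropy-rate-like quantity. The binarization around the mean is an assumed stand-in for the encoding proposed in the paper, and the synthetic “flow” signals are purely illustrative.

```python
# Minimal sketch (illustrative only): normalized Lempel-Ziv (LZ76) complexity
# of a binarized signal. Binarizing around the mean is an assumed stand-in for
# the encoding proposed in the paper; the test signals are synthetic.
import numpy as np

def lz76_phrase_count(s: str) -> int:
    """Count phrases in the LZ76 parsing: each phrase is the shortest
    substring starting at the current position that has not yet occurred."""
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

def normalized_lz(signal: np.ndarray) -> float:
    """Binarize the signal around its mean and normalize the phrase count;
    for long stationary sequences this approximates an entropy rate in bits."""
    s = ''.join('1' if v > signal.mean() else '0' for v in signal)
    n = len(s)
    return lz76_phrase_count(s) * np.log2(n) / n

rng = np.random.default_rng(1)
periodic = np.sin(np.linspace(0, 40 * np.pi, 4000))       # ordered flow surrogate
noisy = periodic + rng.standard_normal(4000)               # fluctuating flow surrogate
print(f"normalized LZ, periodic: {normalized_lz(periodic):.3f}")
print(f"normalized LZ, noisy:    {normalized_lz(noisy):.3f}")
```

Evaluating such a normalized measure at each measurement point, alongside the RMS of the velocity fluctuations, is the kind of comparison the abstract describes.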


Authors and Affiliations

S. Blonski
A. Pregowska
T. Michalek
J. Szczepanski
