TY  - JOUR
T1  - Optimizing information processing in brain-inspired neural networks
A1  - Paprocki, B.
A1  - Pregowska, A.
A1  - Szczepanski, J.
T2  - Bulletin of the Polish Academy of Sciences Technical Sciences
PY  - 2020
DA  - 2020/04/30
VL  - 68
IS  - No. 2 (i.a. Special Section on Computational Intelligence in Communications)
SP  - 225
EP  - 233
DO  - 10.24425/bpasts.2020.131844
KW  - neural network
KW  - entropy
KW  - mutual information
KW  - noise
KW  - inhibitory neuron
N2  - The way brain networks maintain high transmission efficiency is believed to be fundamental to understanding brain activity. Brains consisting of more cells render information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and functional diversity, rather than just size and the number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels, and then measure information as the mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which captures the essential qualitative mechanisms of information transfer. Here, we review and discuss our previous quantitative results on brain-inspired networks, addressing their qualitative consequences in the context of the broader literature. It is shown that mutual information is often maximised in a very noisy environment, e.g., one in which only one-third of all input spikes are allowed to pass through noisy synapses and further into the network. Moreover, we show that inhibitory connections, as well as properly placed long-range connections, often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematics plays an important role in explaining the nature of brain efficiency. Our results confirm that the basic brain components that appear during evolution arise to optimise transmission performance.
L1  - http://journals.pan.pl/Content/115171/PDF/07D_225-233_01323_Bpast.No.68-2_27.04.20_K2A_SS_TeX.pdf
L2  - http://journals.pan.pl/Content/115171
UR  - http://journals.pan.pl/dlibra/publication/edition/115171
ER  - 