Journal
JOURNAL OF NEUROSCIENCE METHODS
Volume 196, Issue 1, Pages 81-87
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.jneumeth.2011.01.002
Keywords
Information; Mechanoreceptor; Entropy; Compression; Sorting; Algorithm
Funding
- Canadian Institutes of Health Research
- Nova Scotia Health Research Foundation
Abstract
Neurons receive, process and transmit information using two distinct signal formats: analog signals, such as graded changes in membrane potential, and binary digital signals, i.e. action potentials. Quantitative estimates of information in neural signals have been based either on information capacity, the theoretical maximum information flow through a communication channel, or on entropy, the amount of information required to describe or reproduce a signal. Entropy measurement is straightforward for digital signals, including action potentials, but more difficult for analog signals. This difficulty compromises attempts to estimate information in many neural signals, particularly when there is conversion between the two formats. We extended an established method for action potential entropy estimation to provide entropy estimation of analog signals. Our approach is based on context-independent data compression of analog signals, which we call analog compression. Although compression of analog signals is computationally intensive, we describe an algorithm that provides practical, efficient and reliable entropy estimation via analog compression. Implementation of the algorithm is demonstrated at two stages of sensory processing by a mechanoreceptor. © 2011 Elsevier B.V. All rights reserved.
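The general idea of compression-based entropy estimation can be illustrated with a minimal sketch. This is not the authors' analog-compression algorithm; it is an assumed, simplified analogue: quantize the analog waveform into discrete amplitude levels, losslessly compress the resulting byte stream with a generic compressor (`zlib` here), and take the compressed size in bits per sample as an upper bound on the signal's entropy rate. The function name and the choice of 256 quantization levels are illustrative assumptions.

```python
import zlib
import numpy as np

def compression_entropy_bits_per_sample(signal, n_levels=256):
    """Rough entropy-rate estimate via quantization plus lossless compression.

    NOTE: illustrative sketch only, not the paper's analog-compression
    algorithm. Compressed size per sample upper-bounds the entropy rate
    of the quantized signal.
    """
    s = np.asarray(signal, dtype=float)
    lo, hi = s.min(), s.max()
    if hi == lo:
        # Constant signal: every sample maps to the same level.
        quantized = np.zeros(len(s), dtype=np.uint8)
    else:
        # Map amplitudes onto n_levels uniform bins (one byte per sample).
        quantized = np.round((s - lo) / (hi - lo) * (n_levels - 1)).astype(np.uint8)
    compressed = zlib.compress(quantized.tobytes(), level=9)
    return 8.0 * len(compressed) / len(s)

# A predictable signal compresses well (low estimate); noise does not.
rng = np.random.default_rng(0)
flat = compression_entropy_bits_per_sample(np.ones(10000))
noise = compression_entropy_bits_per_sample(rng.normal(size=10000))
```

A constant signal yields an estimate near zero bits per sample, while Gaussian noise stays close to the raw quantization depth; a real implementation would also need to correct for compressor overhead and choose the quantization depth to match measurement noise.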