
Spikes: Exploring the Neural Code

ISBN-10: 0262681080

ISBN-13: 9780262681087

Edition: 1999 (Reprint)

Authors: Fred Rieke, David Warland, William Bialek, Rob De Ruyter van Steveninck

List price: $48.00

Description:

Intended for neurobiologists with an interest in mathematical analysis of neural data as well as physicists and mathematicians interested in real nervous systems, Spikes provides a review of concepts in information theory and statistical decision theory.

Book details

Copyright year: 1999
Publisher: MIT Press
Publication date: 7/26/1999
Binding: Paperback
Pages: 408
Size: 6.75" wide x 9.25" long x 0.75" tall
Weight: 1.496 lbs
Language: English

Series Foreword
Preface
Acknowledgments
Introduction
The classical results
Defining the problem
Central claims of this book
Foundations
Characterizing the neural response
Probabilistic responses and Bayes' rule
Rates, intervals, and correlations
Input/output analysis
Models for firing statistics
Taking the organism's point of view
Intervals in the signal and intervals between spikes
What can small numbers of spikes tell the brain?
Response-conditional ensembles
Reading the code
Why it might work
An experimental strategy
Qualitative features of a first test
Summary
Quantifying information transmission
Why information theory?
Entropy and available information
Entropy of spike trains
Mutual information and the Gaussian channel
Time dependent signals
Information transmission with spike trains
Can we really "measure" information transmission?
Information transmission with discrete stimuli
Stimulus reconstruction and information rates
Entropy and information with continuous stimuli
Mechanical sensors in the cricket cercal system
Amphibian eyes and ears
Frogs and frog calls
Summary
Reliability of computation
Reliability of neurons and reliability of perception
Historical background
Photon counting
Auditory discrimination
Motion discrimination in monkey vision
Hyperacuity
Where is the limit?
Experiments with single neurons
Temporal hyperacuity
Motion processing in the fly visual system
Limits to discrimination
Discrimination experiments with H1
Continuous estimation
Summary
Directions
Arrays of neurons
Natural signals
Optimal coding and computation
Epilogue: Homage to the single spike
Mathematical asides
Rates as expectation values
Two-point functions
Wiener kernels
Poisson model I
Poisson model II
Estimation from independent responses
Conditional mean as optimal estimator
Practical calculations of reconstruction filters
The "acausal-shifted" calculation
Power series expansions of the K_n
Entropy of Gaussian distributions
Approximating the entropy of spike trains
Maximum entropy and spike counts
The Gaussian channel
Gaussians and maximum entropy
Wiener-Khinchine theorem
Maximizing information transmission
Maximum likelihood
Poisson averages
Signal to noise ratios with white noise
Optimal filters
References
Index