Using computational theory to constrain statistical models of neural data
Published Jan 31, 2017 · Scott W. Linderman, S. Gershman
Current Opinion in Neurobiology
27 Citations · 1 Influential Citation
Abstract
Computational neuroscience is, to first order, dominated by two approaches: the “bottom-up” approach, which searches for statistical patterns in large-scale neural recordings, and the “top-down” approach, which begins with a theory of computation and considers plausible neural implementations. While this division is not clear-cut, we argue that these approaches should be much more intimately linked. From a Bayesian perspective, computational theories provide constrained prior distributions on neural data—albeit highly sophisticated ones. By connecting theory to observation via a probabilistic model, we provide the link necessary to test, evaluate, and revise our theories in a data-driven and statistically rigorous fashion. This review highlights examples of this theory-driven pipeline for neural data analysis in recent literature and illustrates it with a worked example based on the temporal difference learning model of dopamine.
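The abstract's worked example builds on the temporal difference (TD) learning model of dopamine, in which phasic dopamine firing is interpreted as the reward prediction error δ_t = r_t + γV(s_{t+1}) − V(s_t). The following is a minimal, illustrative TD(0) sketch on a toy chain of states, not the authors' model; all states, rewards, and parameter values here are hypothetical.

```python
import numpy as np

def td_learn(episodes, n_states=5, alpha=0.1, gamma=0.9):
    """TD(0) on a deterministic chain 0 -> 1 -> ... -> n-1,
    with reward 1 on entering the terminal state.

    Returns the learned value function V and the sequence of
    prediction errors delta_t = r + gamma * V(s') - V(s),
    the quantity the TD model links to dopamine responses.
    """
    V = np.zeros(n_states)
    deltas = []
    for _ in range(episodes):
        for s in range(n_states - 1):
            s_next = s + 1
            r = 1.0 if s_next == n_states - 1 else 0.0
            # terminal state has value 0 by convention
            v_next = 0.0 if s_next == n_states - 1 else V[s_next]
            delta = r + gamma * v_next - V[s]  # reward prediction error
            V[s] += alpha * delta              # TD(0) value update
            deltas.append(delta)
    return V, deltas

V, deltas = td_learn(episodes=200)
```

As training proceeds, the prediction error at the rewarded transition shrinks toward zero while value propagates backward along the chain (V[s] → γ^(distance to reward)), mirroring the classic observation that dopamine responses transfer from the reward itself to earlier predictive cues.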