Statistical inference of cognitive algorithms and neural dynamics
Perhaps surprisingly, simple mathematical expressions can precisely describe animal behaviors measured in the laboratory. For example, optimal decision theory explains seemingly random categorical choices in a decision-making task as a probabilistic function of integrated evidence. Despite this apparent power of mathematics in describing cognitive tasks and behavior, when one peeks into the underlying machine, the neural circuit that implements such elegant operations, one suddenly faces the (perhaps superficial) ineffectiveness of mathematics.

My approach is bottom-up: learn the hidden structures in large-scale neural signals recorded from neural systems, and interpret their population-wide time evolution as neural computation, without prior assumptions about how that computation should be carried out. This data-driven approach to cognitive systems neuroscience has only recently become a viable option, owing to advances in both large-scale recording techniques and machine learning. The nature of this bottom-up approach is statistical: neural signals are highly variable (noisy) and only partially observed (subsampled), since the number of neurons involved in any cognitive behavior is orders of magnitude larger than what can be recorded.

We develop statistical methods and machine learning tools to study the underlying neural dynamics and how they implement computations such as decision-making, sensory coding, attention, interval timing, working memory, and motor planning. We study both ongoing and task-driven neural activity, within and across neural systems, to understand neural and behavioral variability, the neural code, and cognitive strategies at the systems level.
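The decision-theoretic account of categorical choice mentioned above can be illustrated with a minimal bounded drift-diffusion simulation: noisy momentary evidence is integrated until it hits a decision bound, and choice probability becomes a sigmoidal function of evidence strength. This is a generic sketch; all parameter values (drift, bound, noise level, trial counts) are illustrative choices, not quantities from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def ddm_choice_prob(drift, n_trials=2000, dt=0.01, noise=1.0, bound=1.0):
    """Simulate a bounded drift-diffusion process and return the
    probability of reaching the upper bound (one of two choices).
    Parameters are illustrative, not fitted to any dataset."""
    n_upper = 0
    for _ in range(n_trials):
        x = 0.0  # integrated evidence
        while abs(x) < bound:
            # Euler-Maruyama step: deterministic drift plus diffusion noise
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        n_upper += x > 0
    return n_upper / n_trials

# Choice probability rises sigmoidally with evidence strength (drift):
probs = [ddm_choice_prob(d) for d in (-1.0, 0.0, 1.0)]
```

With zero drift the model chooses each option about half the time, while stronger evidence in either direction pushes the choice probability toward 0 or 1, matching the qualitative claim that choices are a probabilistic function of integrated evidence.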