INSTRUCTOR:

Prof. Anatoli Gorchetchnikov

Office: Rm. 213, 677 Beacon Street

Office hours: Wednesday 10:00am-1:00pm, or by appointment

Email: anatoli (at) bu (dot) edu

TEACHING ASSISTANT:

None

LAB TIME:

Friday 10:00am-1:00pm

REQUIRED TEXT:

Dayan, P. and Abbott, L.F. (2001). *Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems*. Cambridge, MA: MIT Press. Referred to as **D&A** in syllabus and class notes.

RECOMMENDED TEXTS:

Brauer, F. and Nohel, J.A. (1989). *The Qualitative Theory of Ordinary
Differential Equations: An Introduction*.
New York: Dover.

Kandel, E., Schwartz,
J.H., and Jessell, T.M. (2000). *Principles of Neural Science*, 4^{th} Edition. New York: McGraw-Hill.
Referred to as **PNS**.

Izhikevich, E.M. (2007). *Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting*. Cambridge, MA: MIT Press.

Levine, D.S. (2000). *Introduction to Neural and Cognitive Modeling*, 2^{nd} Edition. Hillsdale, NJ: Erlbaum. Referred to as **NCM**.

Shepherd, G.M. (2004). *Synaptic
Organization of the Brain*. New York, NY: Oxford
University Press.


COURSE OVERVIEW:

CN510 is designed to introduce students to important themes and approaches in the computational modeling of biological neural systems. The class combines systems-level neuroscience, mathematical modeling techniques, and computer simulation techniques. Particular simple neural models will be covered in detail, thus exposing students to mathematical modeling techniques that will serve as the basic building blocks for a number of the sensory, motor, and memory models discussed in the other CN5xx courses. A major theme of CN510 is the use of these modeling techniques to tie together anatomical, physiological, and psychological data, as exemplified by the neural modeling studies covered in the course.

GRADING CRITERIA:

Grades are determined by performance on homework assignments and examinations. The in-class final exam will be worth 30%, and the homework assignments will equally share the remaining 70% of the grade. Homework reports should preferably be typeset in LaTeX using this template and this style file. **Participation in class discussions will play a role in determining the final letter grade in borderline cases.**

Late homework policy: 10% penalty if turned in less than one week late, 20% penalty for 1-2 weeks late, and 30% penalty for more than 2 weeks late. **No late homework will be accepted after the final exam.**

OTHER USEFUL TEXTS:

Anderson, J.A. and Rosenfeld, E. (Eds.) (1988). *Neurocomputing: Foundations of Research*. Cambridge, MA: MIT Press. Referred to as **NFR**. [Out of print, available in CompNet Library.]

Arbib MA (ed.) (1995). *The
Handbook of Brain Theory and Neural Networks*. Cambridge, MA: MIT Press.
Referred to as **HBT**.

Churchland, P.S. (1986). *Neurophilosophy: Toward a Unified Science of the Mind-Brain*. Cambridge, MA: MIT Press.

Schwartz, E.L. (Ed.) (1990). *Computational Neuroscience*. Cambridge, MA: MIT Press.

Carpenter, G.A. and Grossberg,
S. (1991). *Pattern Recognition by Self-organizing Neural Networks.* Cambridge, MA:
MIT Press.

Grossberg, S. (1982). *Studies of Mind and Brain*. Kluwer/Reidel
Press. Referred to as **SMB**. [Out of print, available
in CompNet library.]

Koch, C. and Segev, I. (1989).
*Methods in Neuronal Modeling*.
Cambridge, MA: MIT Press.

McClelland, J.L. and Rumelhart, D.E. (1988).
*Explorations in Parallel Distributed
Processing: A Handbook of Models, Programs, and Exercises*. MIT Press.

Martin, J.H. (1996). *Neuroanatomy: Text and Atlas*. Stamford, CT:
Appleton & Lange.

**Lecture** 1
(September 2)

**Introduction to Brain Structure and
Function, Experimental Techniques, and Brain Theories**

I will provide an overview of the anatomy and functions of the central nervous system, including a brief treatment of theories of brain function from the late 18^{th} century to the present. We will discuss experimental techniques for measuring brain anatomy and physiology from the perspective of a theoretical modeler.

Required readings:

*Functions of the Human Nervous System*, Encyclopedia Britannica.

Supplemental readings:

PNS Chapter 1 [Kandel, E.R. The brain and behavior.]

PNS Chapter 17 [Amaral, D.G. The anatomical organization of the central nervous system.]

NCM Chapter 1 [Levine, D.S. Brain and machine: The same principles?]

NCM Chapter 2 [Levine, D.S. Historical outline.]

Lecture Notes:

Homework Assignment (Due September 9):

A short (1-2 page) essay describing your background, your expectations for the class, and what you want to learn in this class that you did not find in the syllabus.

**Lecture** 2
(September 4)

**Introduction to Computational Neuroscience
and Various Modeling Scales**

We will discuss the goals of computational neuroscience, general approaches to modeling the nervous system, and the various scales at which models can be defined. Several example neural models, each defined at a different grain of analysis, will be briefly presented. We will also discuss the idea of multiscale modeling, its advantages, and related issues.

Required readings:

Churchland, P.S., Koch, C., and Sejnowski, T.J.
(1990).
What is computational neuroscience? In E. Schwartz (ed.): *Computational Neuroscience*. Cambridge, MA:
MIT Press.

Supplemental readings:

Guenther, F.H. (2002). Neural networks: Biological models and applications. *International Encyclopedia of the Social & Behavioral Sciences* (vol. 15, pp. 10534-10537). Oxford: Elsevier Science.

Lecture Notes:

**Lecture** 3
(September 9)

**Mathematical and Computational Concepts
Used in Neural Modeling I**

Using the leaky integrator and leaky integrate-and-fire (LIF) neurons as examples, I will review mathematical concepts used in neural modeling, including ordinary differential and difference equations, matrices, and eigenvalues. I will point out some issues introduced by discrete digital computers when we simulate these concepts.
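
As a concrete illustration of the discretization issues mentioned above, here is a minimal forward-Euler simulation of a LIF neuron (a Python sketch; all parameter values are illustrative, not taken from the course):

```python
def simulate_lif(i_ext=1.5, tau=10.0, v_rest=0.0, v_thresh=1.0,
                 v_reset=0.0, dt=0.1, t_max=100.0):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron:
    tau * dV/dt = -(V - v_rest) + i_ext, with spike-and-reset at v_thresh."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt / tau * (-(v - v_rest) + i_ext)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif()
```

With dt = 0.1 and tau = 10 the Euler interspike interval comes out near the analytic value of about 10.99 ms; coarser steps drift further from it, which is one face of the discretization issues the lecture addresses.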

Required readings:

D&A Appendix A.1, pp. 399-403; Appendix A.3

Supplemental readings:

NCM Appendix 2 [Levine, D.S. Difference and differential equations in neural networks.]

Rotter, S. and Diesmann, M. (1999). Exact digital simulation of time-invariant linear systems with applications to neuronal modeling. *Biol. Cybern.* **81**: 381-402. (From the beginning up to section 3.1.2, inclusive.)

Homework due:

Essay on your background and goals in CN510 is due before the class starts.

Lecture Notes:

Homework Assignment (Due September 16):

**Lecture** 4
(September 11)

**Mathematical and Computational Concepts
Used in Neural Modeling II**

We will discuss critical points, stability, and basic phase plane analysis as applied to neural models.

Required readings:

Izhikevich, E.M. (2007). *Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting*. Cambridge, MA: MIT Press. Chapter 1.

Supplemental readings:

http://en.wikipedia.org/wiki/Phase_plane

Lecture Notes:

**Lecture** 5
(September 16)

**Biophysics of the Cell Membrane
as a Foundation of the Hodgkin-Huxley Equation**

I will briefly review how electric current flows in ionic solutions, present the concepts behind mapping cell elements onto equivalent electrical circuits, and analyze the resulting circuit to produce the Hodgkin-Huxley equation.
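
For reference, the circuit analysis leads to the standard Hodgkin-Huxley membrane equation (in the usual notation, as in D&A Chapter 5):

```latex
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h\,(V - E_{\mathrm{Na}})
                    - \bar{g}_{\mathrm{K}}\, n^4 (V - E_{\mathrm{K}})
                    - g_L (V - E_L) + I_{\mathrm{ext}},
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x,
\quad x \in \{m, h, n\}
```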

Required readings:

D&A Chapters 5 (sections 1-4, 6) and 6 (sections 3-4)

Supplemental readings:

Shepherd, G.M. (2004). *Synaptic Organization of the Brain*. New York, NY: Oxford University Press. Chapter 1.

PNS Chapter 7 [Koester, J. and Siegelbaum, S.A. Membrane potential.]

PNS Chapter 9 [Koester, J. and Siegelbaum, S.A. Propagated signaling: The action potential.]

PNS Appendix A [Koester, J. Current flow in neurons.]

PNS Chapter 6
[Siegelbaum, S.A. and Koester, J. Ion
channels.]

PNS Chapter 8 [Koester, J. and Siegelbaum, S.A. Local signaling: Passive electrical properties of the neuron.]

Hodgkin, A.L. and Huxley, A.F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. *J. Physiology*, **117**, 500-544.

Homework due:

Leaky integrator assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due September 23):

Numerical Integration of Leaky Integrator and Leaky Integrate and Fire Neurons.

**Lecture** 6
(September 18)

**Extensions and Simplifications of
the Hodgkin-Huxley Model**

We will discuss how to combine neuronal equations into a compartmental model and take a brief look at cable theory for simulating cellular structure. We will also discuss how to simplify the Hodgkin-Huxley model of spiking to derive the Izhikevich neuron.
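
A minimal Python sketch of the Izhikevich (2003) two-variable model (the published equations; the integration step and input current are illustrative):

```python
def simulate_izhikevich(i_ext=10.0, a=0.02, b=0.2, c=-65.0, d=8.0,
                        dt=0.25, t_max=500.0):
    """Euler integration of the Izhikevich (2003) model:
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u),
    with reset v <- c, u <- u + d whenever v reaches 30 mV."""
    v, u = c, b * c
    spike_times = []
    for step in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:
            spike_times.append(step * dt)
            v, u = c, u + d
    return spike_times

spike_times = simulate_izhikevich()
```

With these (regular-spiking-style) parameters the equations have no fixed point for constant i_ext = 10, so the model spikes repeatedly; varying a, b, c, d reproduces the different firing patterns the homework asks you to replicate.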

Required readings:

Izhikevich, E.M. (2007). *Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting*. Cambridge, MA: MIT Press. Chapter 8.

Supplemental readings:

Ermentrout, G.B. and Kopell, N. (1986). Parabolic bursting in an excitable system coupled with a slow oscillation. *SIAM J. Appl. Math.*, **46**, 233-253.

Brette R. and Gerstner W. (2005), Adaptive Exponential Integrate-and-Fire Model
as an Effective Description of Neuronal Activity, *J. Neurophysiol.*, **94**,
3637-3642.

Lecture Notes:

**Lecture** 7
(September 23)

**Feedforward Shunting Competitive
Network**

We will discuss how the need to factorize an input pattern from the total input energy leads to the derivation of a shunting network. The feedforward shunting network's properties of automatic gain control and activity normalization will be discussed.
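
The normalization property can be checked directly from the equilibrium of the feedforward shunting equation; a small Python sketch (the equation follows Grossberg's formulation, parameter values are illustrative):

```python
def shunting_steady_state(inputs, A=1.0, B=1.0):
    """Equilibrium of the feedforward shunting equation
    dx_i/dt = -A x_i + (B - x_i) I_i - x_i * sum_{k != i} I_k,
    obtained by setting dx_i/dt = 0:  x_i = B I_i / (A + sum_k I_k)."""
    total = sum(inputs)
    return [B * i / (A + total) for i in inputs]

weak = shunting_steady_state([0.1, 0.2, 0.3])
strong = shunting_steady_state([10.0, 20.0, 30.0])  # same pattern, 100x the energy
```

Total activity stays bounded by B no matter how large the input energy grows (automatic gain control), while the relative pattern x_i / sum(x) equals I_i / sum(I) at both input scales (normalization).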

Required readings:

Appendix C of Grossberg, S. (1980). How does the brain
build a cognitive code? *Psychological
Review*, **87**, pp. 1-51. [SMB Chapter 1; NFR Chapter
24.]

Supplemental readings:

SMB Chapter 11 [Grossberg, S. Behavioral contrast in short term memory.]

NCM Chapter 4 [Levine, D.S. Competition, lateral inhibition, and short-term memory.]

Furman, G.G. (1965). Comparison of models for
subtraction and shunting lateral-inhibition in receptor-neuron fields.
*Kybernetika*, **2**, 257-274.

Lecture Notes:

Homework Assignment (Due September 30):

Replication of 20 cases of Izhikevich neuron.

**Lecture** 8
(September 25)

**Shunting Network and Experimental
Data**

Further properties of feedforward shunting networks are treated, including the shift property and Weber's law. The similarities and differences between the shunting equations and membrane equations from earlier lectures are discussed.

Required readings:

Werblin, F.S. (1971). Adaptation in a vertebrate retina: Intracellular recording in *Necturus*. *Journal of Neurophysiology*, **34**, pp. 228-241.

Supplemental readings:

Cornsweet, T. (1970). *Visual
Perception*, Chapter 11. New York, NY: Academic Press.

Lecture Notes:

**Lecture** 9
(September 30)

**Recurrent Competitive Shunting Networks
(or Fields)**

We will discuss pattern preservation and transformations during short-term memory storage in recurrent competitive fields (RCFs). A major issue is the nature and role of the feedback (recurrent) signal. Properties associated with linear, faster-than-linear, slower-than-linear, and sigmoid feedback signal functions will be analyzed and discussed.
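
A minimal Python sketch of an RCF with faster-than-linear feedback, illustrating the winner-take-all transformation (parameters are illustrative, chosen so that the winner's initial activity exceeds the quenching threshold):

```python
def simulate_rcf(x0, A=0.1, B=1.0, dt=0.01, steps=20000):
    """Recurrent competitive field with faster-than-linear feedback f(x) = x^2:
    dx_i/dt = -A x_i + (B - x_i) f(x_i) - x_i * sum_{k != i} f(x_k).
    Faster-than-linear feedback contrast-enhances the stored pattern toward
    winner-take-all (Grossberg, 1973)."""
    x = list(x0)
    for _ in range(steps):
        f = [xi * xi for xi in x]
        total_f = sum(f)
        x = [xi + dt * (-A * xi + (B - xi) * f_i - xi * (total_f - f_i))
             for xi, f_i in zip(x, f)]
    return x

final = simulate_rcf([0.2, 0.3, 0.5])
```

Swapping f for a linear, slower-than-linear, or sigmoid signal function in the same loop reproduces the other pattern transformations analyzed in lecture.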

Required readings:

Appendix D of Grossberg, S. (1980). How does the brain
build a cognitive code? *Psychological
Review*, **87**, pp. 1-51.

Grossberg, S. (1973).
Contour enhancement, short term memory, and
constancies in reverberating neural networks. *Studies in Applied Mathematics*, **LII**,
pp. 213-257. [SMB Chapter
8.]

Supplemental readings:

NCM Chapter 4 [Levine, D.S. Competition, lateral inhibition, and short-term memory. ]

Grossberg, S. and Levine, D. (1975).
Some developmental and attentional biases in the contrast
enhancement and short term memory of recurrent
neural networks. *Journal of
Theoretical Biology*, **53**, pp. 341-380.

Homework due:

Replication of 20 cases of Izhikevich neuron assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due October 7):

Additive and Shunting Networks.

**Lecture** 10
(October 2)

**Synaptic Transmission in the Brain**

We will discuss synaptic dynamics from action potential to postsynaptic response and identify the possibilities for plasticity during this process.

Required readings:

Shepherd, G.M. (2004). *Synaptic Organization of the Brain*. New York, NY: Oxford University Press. Chapter 2.

Supplemental readings:

PNS Chapter 10 [Kandel, E.R. and Siegelbaum, S.A. Overview of synaptic transmission.]

PNS Chapters 12-15 [Kandel, E.R., Siegelbaum, S.A., and Schwartz J.H. Synaptic integration; Modulation of synaptic transmission; Transmitter release; Neurotransmitters.]

Lecture Notes:

**Lecture** 11
(October 7)

**Modeling the Synaptic Response**

We will discuss current-based and conductance-based models of synaptic responses in spiking networks, and possible applications of the Rotter-Diesmann approach to these models.
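
For the simplest linear case, an exponentially decaying synaptic conductance, the Rotter-Diesmann exact update reduces to multiplication by a scalar propagator; a Python sketch comparing it with forward Euler (parameter values are illustrative):

```python
import math

def decay_exact(g, dt, tau):
    """Exact update for dg/dt = -g / tau over one time step: the matrix
    exponential of a one-dimensional linear system is a scalar factor."""
    return g * math.exp(-dt / tau)

def decay_euler(g, dt, tau):
    """Forward-Euler update for the same equation, accurate only to O(dt)."""
    return g * (1.0 - dt / tau)

tau, dt, g0 = 5.0, 1.0, 1.0
g_exact, g_euler = g0, g0
for _ in range(10):
    g_exact = decay_exact(g_exact, dt, tau)
    g_euler = decay_euler(g_euler, dt, tau)
analytic = g0 * math.exp(-10 * dt / tau)
```

The exact update matches the analytic solution to machine precision at any step size, while the Euler result carries an O(dt) error per step; this is the core of the Rotter-Diesmann argument for linear subsystems.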

Required readings:

D&A Chapter 5 (sections 8-9).

Supplemental readings:

Rotter, S. and Diesmann, M. (1999). Exact digital simulation of time-invariant linear systems with applications to neuronal modeling. *Biol. Cybern.* **81**: 381-402. (From the beginning up to section 3.1.2, inclusive.)

Homework Due:

Additive and Shunting Networks assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due October 16):

**Lecture** 12
(October 9)

**Learning Rules for Continuous Models**

We will discuss several learning laws described at the network level and based on neurophysiological findings. The concepts of local and global learning rules are introduced. A set of Hebbian rules is analyzed for stability and for the properties of the resulting weights.
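
As one concrete example of a Hebbian rule whose decay term yields stable, bounded weights, here is Oja's rule in Python (an illustrative example, not necessarily one of the specific rules covered in lecture; data parameters are made up):

```python
import random

def oja_update(w, x, lr=0.01):
    """Oja's rule: dw = lr * y * (x - y * w).  The multiplicative decay
    term -lr * y^2 * w keeps the weight vector bounded (||w|| -> 1),
    unlike plain Hebbian growth dw = lr * y * x, which diverges."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]

random.seed(0)
w = [0.5, 0.5]
for _ in range(5000):
    # inputs correlated along the [1, 1] direction, with small noise
    s = random.gauss(0.0, 1.0)
    n = random.gauss(0.0, 0.2)
    x = [s + n, s - n]
    w = oja_update(w, x)
norm = sum(wi * wi for wi in w) ** 0.5
```

The weight vector settles near unit length and aligns with the dominant correlation direction of the inputs, illustrating the kind of stability analysis the lecture applies to Hebbian rules.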

Required readings:

D&A Chapter 8 (sections 1-3).

Vasilkoski, Z. et al. (2011).
Review of stability properties of neural plasticity rules for implementation on
memristive neuromorphic hardware. In:
*Proceedings of International Joint Conference on Neural Networks*. San Jose,
CA.

Supplemental readings:

Levy, W.B. and Desmond, N.L. (1985). The rules of elemental synaptic plasticity. In W.B. Levy, J.A. Anderson, and S. Lehmkuhle (Eds.), *Synaptic modification, neuron selectivity, and nervous system organization*. Hillsdale, NJ: Erlbaum.

NCM Chapter 3 [Levine, D.S. Associative learning and synaptic plasticity.]

PNS Chapter 63 [Kandel, E.R. Cellular mechanisms of learning and the biological basis of individuality.]

Lecture Notes:

**No class - Monday schedule**
(October 14)

**Lecture** 13
(October 16)

**Spike-Timing-Dependent Plasticity I**

We will begin discussing the experimental findings on spike-timing-dependent plasticity and models that use temporally asymmetric learning rules.
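
The standard pair-based STDP window can be sketched in a few lines of Python (amplitudes and time constants are illustrative, not fitted to any particular dataset):

```python
import math

def stdp_weight_change(dt_ms, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: potentiation when the presynaptic spike
    precedes the postsynaptic one (dt = t_post - t_pre > 0), depression
    otherwise, both decaying exponentially with the spike-time difference."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    return -a_minus * math.exp(dt_ms / tau_minus)
```

Making a_minus slightly larger than a_plus, as above, gives the net-depressive bias that Song, Miller, and Abbott (2000) use to keep runaway potentiation in check.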

Required readings:

Morrison, A, Diesmann, M and Gerstner, W. (2008).
Phenomenological models of synaptic
plasticity based on spike timing.
*Biological Cybernetics* 98: 459-478.

Izhikevich E.M. and Desai N.S.
(2003).
Relating STDP to BCM. *Neural Computation* 15: 1511-1523

Supplemental readings:

Abbott, LF and Nelson SB (2000) Synaptic plasticity:
taming the beast. *Nat Neurosci*. Suppl:1178-83. Review.

Song, S., Miller, K.D. and Abbott, L.F. (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. *Nat Neurosci*. 3(9): 919-26.

G-Q Bi, M-M Poo (2001) Synaptic Modifications by Correlated Activity: Hebb's
Postulate Revisited. *Ann Rev Neurosci*, 24: 139-166

Mehta, MR, Quirk, MC and Wilson, MA. (2000). Experience dependent asymmetric shape of hippocampal receptive fields. Neuron 25: 707-715.

Homework Due:

Recurrent Competitive Field assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due October 23):

Communication in spiking network.

**Lecture** 14
(October 21)

**Spike-Timing-Dependent Plasticity II**

We will continue discussing models of temporally asymmetric learning rules. A spatially and temporally local STDP rule is introduced and analyzed.

Required readings:

Gorchetchnikov, A., Versace, M., and Hasselmo, M.E. (2005). A model of STDP based on spatially and temporally local information: Derivation and combination with gated decay. *Neural Networks* **18**, 458-466.

Supplemental readings:

Gorchetchnikov, A. and Grossberg, S. (2007). Space, time and learning in the hippocampus: How fine spatial and temporal scales are expanded into population codes for behavioral control. *Neural Netw.* **20**, 182-193.

Lecture Notes:

**Lecture** 15
(October 23)

**Outstar Network and Classical
Conditioning**

We will look at a classical conditioning model example in terms of interactions between neuronal activation and weight changes, the resulting outstar network, and the outstar learning theorem.
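
A schematic Python sketch of outstar learning, in which the weights fanning out from an active source node converge to the sampled activity pattern (the learning rate and pattern are illustrative):

```python
def train_outstar(pattern, lr=0.1, epochs=200, src_activity=1.0):
    """Schematic outstar learning: while the source (sampling) node is
    active, each outgoing weight tracks the border-node activity it
    projects to, dw_i/dt = src * (x_i - w_i), so the weight vector
    converges to the sampled pattern."""
    w = [0.0] * len(pattern)
    for _ in range(epochs):
        w = [wi + lr * src_activity * (xi - wi) for wi, xi in zip(w, pattern)]
    return w

w = train_outstar([0.2, 0.5, 0.3])
```

Because learning is gated by the source activity, nothing changes when the source is silent (src_activity = 0), which is the property the outstar learning theorem builds on.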

Required readings:

Appendix B of Grossberg, S. (1980). How does the brain build a cognitive code? *Psychological Review*, **87**, pp. 1-51. [SMB Chapter 1.]

Sections I-IV of Grossberg, S. (1974)
Classical and instrumental learning by neural networks. *Progress in Theoretical Biology*, **3**,
pp. 51-141. [SMB Chapter 3.]

Supplemental readings:

None

Homework due:

Communication in spiking network assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due October 30):

**Lecture** 16
(October 28)

**Outstar Network and Spatio-Temporal
Sampling**

We will discuss the use of a chain of outstars in a model of temporal sequence learning, the associative avalanche.

Required readings:

Section V of Grossberg, S. (1974)
Classical and instrumental learning by neural networks. *Progress in Theoretical Biology*, **3**,
pp. 51-141. [SMB Chapter 3.]

Supplemental readings:

Lashley, K.S. (1951). The problem of serial order in behavior. Reprinted in F.A. Beach et al. (Eds.), *The Neuropsychology of Lashley*. New York: McGraw-Hill, 1960.

Lecture Notes:

**Lecture** 17
(October 30)

**Introduction to Neural Pathways and Cortical
Organization**

We will discuss the main neural pathways of the sensory and motor systems, and the organization of unimodal and association areas of cortex.

Required readings:

None

Supplemental readings:

PNS Chapter 18 [Amaral, D.G. The functional organization of perception and movement.]

PNS Chapter 19 [Saper, C.B., Iversen, S., and Frackowiak, R. Integration of sensory and motor function: The association areas of the cerebral cortex and the cognitive capabilities of the brain.]

Homework due:

Outstar Network assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due November 6):

Spike-Timing Dependent Plasticity.

**Lecture** 18
(November 4)

**Introduction to Models of Cortical
Maps**

We will discuss the von der Malsburg (1973) and Grossberg (1976) models of the development of neural feature detectors through competitive learning.
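
A schematic Python sketch of competitive learning in the spirit of these models: the winning unit alone moves its weight vector toward the input, so each unit becomes a detector for one input cluster (the data and initialization are illustrative):

```python
def train_competitive(samples, weights, lr=0.1, epochs=50):
    """Schematic winner-take-all competitive learning: for each input the
    unit with the largest dot product wins, and only the winner's weight
    vector moves toward that input."""
    w = [list(row) for row in weights]
    for _ in range(epochs):
        for x in samples:
            scores = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
            win = scores.index(max(scores))
            w[win] = [wi + lr * (xi - wi) for wi, xi in zip(w[win], x)]
    return w

# two well-separated input clusters; units initialized inside the data range
samples = [[1.0, 0.1], [0.9, 0.0], [0.1, 1.0], [0.0, 0.9]]
w = train_competitive(samples, [[0.8, 0.2], [0.2, 0.8]])
```

Each weight vector ends up near the mean of the cluster its unit wins, i.e., it becomes a feature detector for that cluster; Kohonen's map (Lecture 19) adds neighborhood cooperation on top of this competition.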

Required readings:

D&A Chapter 8 (section 3 reread).

von der Malsburg, C. (1973).
Self-organization of
orientation sensitive cells in the striate cortex.
*Kybernetik*, **14**,
pp. 85-100. [NFR Chapter 17.]

Grossberg, S. (1976).
Adaptive pattern classification and universal
recoding I: Parallel development and coding of neural
feature detectors. *Biological
Cybernetics*, **23**, pp. 121-134. [SMB Chapter 12; NFR Chapter 19.]

Supplemental readings:

Sutton, G.G. III, Reggia, J.A., Armentrout, S.L.,
and D'Autrechy, C.L. (1994).
Cortical map reorganization as a competitive process. *Neural
Computation*, **6**, 1-13.

Grajski, K.A., and Merzenich, M.M. (1990).
Hebb-type dynamics is sufficient to account for the
inverse magnification rule in cortical somatotopy. *Neural Computation*, **2**,
71-84.

Lecture Notes:

**Lecture** 19
(November 6)

**Introduction to Models of Cortical
Maps II**

We will continue to discuss competitive learning models of cortical maps by looking at the Kohonen (1982) model, comparing all three models, and briefly reviewing the Cohen-Grossberg theorem.

Required readings:

Kohonen, T. (1982). Self-organized formation of topologically correct feature maps. *Biological Cybernetics*, **43**(1): 59-69.

Supplemental readings:

Cohen, M. and Grossberg, S. (1983).
Absolute stability of global pattern formation and parallel memory storage by
competitive neural networks. *IEEE Transactions on Systems, Man, and
Cybernetics*, **13**, pp. 815-826.

Homework due:

Spike-Timing Dependent Plasticity assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due November 13):

Setting up correlated inputs and network architecture for self-organizing STDP.

**Lecture** 20
(November 11)

**Neuromodulatory Systems and Drug
Effects**

We will discuss diffusely projecting transmitter systems of the CNS and the implications of these systems in terms of a variety of drug effects.

Required readings:

None.

Supplemental readings:

Pages 84, 86-88 of Martin, J.H. (1996) *Neuroanatomy: Text and Atlas*. Stamford,
CT: Appleton & Lange.

Snyder, S.H. (1996). *Drugs and the Brain*. New York: Scientific American Library.

Lecture Notes:

**Lecture** 21
(November 13)

**Dopamine and Reinforcement
Learning**

We will look at the role of dopamine in reinforcement learning from both data and modeling points of view.
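
The reward-prediction-error account can be sketched with tabular TD(0) learning on a chain of states, where the prediction error delta plays the role of the dopamine signal in the Schultz, Dayan, and Montague (1997) interpretation (a minimal Python illustration; all parameters are made up):

```python
def td_learn(n_states=5, reward=1.0, alpha=0.1, gamma=0.9, episodes=500):
    """Tabular TD(0) on a deterministic chain: states 0..n-1 are visited
    in order and a reward arrives on leaving the last state.  The
    prediction error delta = r + gamma * V(s') - V(s) drives learning."""
    V = [0.0] * n_states
    for _ in range(episodes):
        for s in range(n_states):
            r = reward if s == n_states - 1 else 0.0
            v_next = V[s + 1] if s + 1 < n_states else 0.0
            delta = r + gamma * v_next - V[s]
            V[s] += alpha * delta
    return V

V = td_learn()
```

After training, V(s) approaches gamma^(n-1-s) * reward, and delta at the rewarded state shrinks toward zero: the "surprise" has migrated back to the earliest predictor, mirroring the shift of dopamine responses from reward to cue.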

Required readings:

D&A Chapter 9 (sections 1-2).

Supplemental readings:

Schultz, W., Dayan, P. and
Montague, PR. (1997). A neural substrate of prediction
and reward. *Science*, **275**, pp. 1593-1599.

Schultz, W. (2006). Behavioral theories and the neurophysiology of reward. *Annual Review of Psychology*, **57**, pp. 87-115.

Homework due:

Setting up correlated inputs and network architecture for self-organizing STDP assignment is due before the class starts.

Lecture Notes:

**Lecture** 22
(November 18)

**Acetylcholine, Memory, and
Learning**

We will discuss the role of acetylcholine as a diffuse teaching signal and as a modulator of the encoding/retrieval switch, including neurophysiological results and modeling studies.

Required readings:

Hasselmo, M.E. (1999). Neuromodulation: Acetylcholine and memory consolidation. *Trends Cogn. Sci.* **3**: 351-359.

Supplemental readings:

Kilgard, M.P. and Merzenich, M.M. (1998).
Cortical map reorganization enabled by nucleus basalis activity.
*Science*, **279**, pp. 1714-1718.

Lecture Notes:

**Lecture** 23
(November 20)

**Adaptive Resonance
Theory**

We will discuss the basic foundations of Adaptive Resonance Theory.

Required readings:

None.

Supplemental readings:

Grossberg, S. (1980).
How does the brain build a cognitive code? *Psychological
Review*, **87**, pp. 1-51. [SMB Chapter 1; NFR Chapter
24.]

Baraldi, A. and Alpaydin, E. (1998). Simplified ART: A new class of ART algorithms. *Technical Report TR-98-004*, International Computer Science Institute.

Lecture Notes:

Homework Assignment (Due December 4):

**Lecture** 24
(November 25)

**Spiking Modulated Adaptive Resonance Theory (SMART). Oscillations and Synchrony I**

In the first part of the lecture, Jesse Palma will present the SMART model. In the second part, we will discuss the reasons for oscillations in the brain, look at intracellular and extracellular oscillations, and discuss microcircuit models of neuronal oscillations.

Required readings:

Ermentrout, GB, and Kopell, N. (1998). Fine structure of neural spiking and synchronization in the presence of conduction delays. Proc. Nat. Acad. Sci. USA 95: 1259-1264.

Supplemental readings:

C. Chow, J. White, J. Ritt, N. Kopell (1998). Frequency control in synchronous networks of inhibitory neurons, J. Comput. Neurosci. 5:407-420.

D. Pinto, S. Jones, T. Kaper and N. Kopell (2003). Analysis of state-dependent transitions in frequency and long-distance coordination in a model oscillatory cortical circuit, J. Comp. Neurosci. 15(2):283-298.

Kopell, N., Borgers, C., Pervouchine, D., Malerba, P., and Tort, A.B.L. (2010). Gamma and theta rhythms in biophysical models of hippocampal circuits. In V. Cutsuridis, B.P. Graham, S. Cobb, and I. Vida (Eds.), *Hippocampal Microcircuits: A Computational Modeller's Resource Book*. Springer. Ch. 15.

Lecture Notes:

**No Lecture - Thanksgiving break**
(November 27)

**Lecture** 25
(December 2)

**Oscillations and Synchrony II**

We will further discuss the idea of synchronization as a binding element between different cortical areas and as a basis for computation with neural assemblies.

Required readings:

Engel, A.K., Fries, P. and Singer, W. (2001). Dynamic predictions: Oscillations and synchrony in top-down processing. *Nature Reviews Neuroscience* 2: 704-716.

Supplemental readings:

Lecture Notes:

**Lecture** 26
(December 4)

**Modeling Tools**

We will discuss various tools for neural modeling with an emphasis on reproducibility of results and model interchange between research groups.

Required readings:

Brette, R. et al (2007). Simulation of networks of spiking neurons: a review. J Comput Neurosci 23(3):349-98.

Supplemental readings:

Versace M., Ames H.M., Leveille J., Fortenberry B.,
Mhatre H., and Gorchetchnikov A. (2008).
KInNeSS: A modular framework for computational neuroscience. *Neuroinformatics*.
6(4), 291-309.

Carnevale, N.T. and Hines, M.L. *The NEURON Book.*
Cambridge, UK: Cambridge University Press, 2006.

Cornelis, H., Rodrigues, A.L., Coop, A.D., Bower, J.M. (2011). Federated Scripting in GENESIS 3.0 Neural Simulation Platform. University of Texas Health Science Center at San Antonio (submitted to PLOS).

Diesmann, M. & Gewaltig, M.-O. (2002). NEST: An environment for neural systems simulations. In: Plesser, T. & Macho, V. (Eds.), *Forschung und wissenschaftliches Rechnen, Beiträge zum Heinz-Billing-Preis 2001*, 58: 43-70, Ges. für Wiss. Datenverarbeitung.

Homework due:

Self-organizing STDP assignment is due before the class starts.

Lecture Notes:

Homework Assignment (Due at the Final):

Synchronization in a strongly coupled circuit

**Lecture** 27
(December 9)

**KInNeSS, Descriptive Languages, GPUs for neurons**

We will discuss the design of the KInNeSS simulator with respect to: 1) descriptive languages for neural models and the concepts underlying NineML and NeuroML 2; and 2) CUDA programming of neural models.

Required readings:

Supplemental readings:

Churchland, P.S. (1986).
Theories of brain function.
Chapter 10 in *Neurophilosophy: Toward a Unified Science of the
Mind-Brain.*

Anderson, J.R. (1987).
Methodologies for studying human knowledge.
*Behavioral and Brain Sciences*, **10**, 467-505.

Nordlie E, Gewaltig M-O, Plesser HE (2009) Towards Reproducible Descriptions of Neuronal Network Models. PLoS Comput Biol 5(8): e1000456. doi:10.1371/journal.pcbi.1000456.

Lecture Notes:

**FINAL EXAM**
Tuesday December 16, 12:30pm-2:30pm