During one summer in college, I worked with Dr. Patrick Langley, Director of the Institute for the Study of Learning and Expertise (ISLE), on the application of machine learning to scientific discovery. The project was the evaluation, use, and extension of a new dynamic-systems modeling tool called Lagramge. Lagramge is an equation discovery system that lets the user specify what they know about the system under consideration and provide a dataset of observed behavior; it then attempts to find a system of differential equations that describes the data, using the user's prior knowledge to constrain the form of the resulting equations. In order to adapt the system to a real-world data set, I made a number of modifications to the program. My extensions allowed the equation discovery system to fit the parameters of large equation models more robustly.
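To give a concrete flavor of the parameter-fitting step my extensions targeted, here is a minimal illustrative sketch in Python; it is not Lagramge's own code, and the candidate equation, data, and parameter values are placeholders:
\begin{verbatim}
# Illustrative only: fit the parameters of a fixed-form ODE to observed data.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def model(y, t, r, K):
    # hypothetical candidate equation: logistic growth dy/dt = r*y*(1 - y/K)
    return r * y * (1.0 - y / K)

t_obs = np.linspace(0.0, 10.0, 50)
y_obs = odeint(model, 0.1, t_obs, args=(0.8, 5.0)).ravel()  # synthetic "data"

def residuals(params):
    r, K = params
    y_fit = odeint(model, y_obs[0], t_obs, args=(r, K)).ravel()
    return y_fit - y_obs

fit = least_squares(residuals, x0=[0.5, 1.0],
                    bounds=([0.0, 0.1], [10.0, 100.0]))
print(fit.x)  # recovers values close to (0.8, 5.0)
\end{verbatim}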
For five and a half months during my junior year, I spent an average of 21 hours a week working with a mechanical engineering design group. The group was applying an artificial intelligence method to build software for fault diagnosis in mechanical systems; specifically, they were using Bayesian networks, a class of probabilistic graphical models. My contribution was to extend the Bayesian inference engine to use intervals of probability instead of single point values.
Before my extension, an engineer using the software first had to specify a precise failure probability for each part in the mechanical system. The goal of using probability intervals was to allow engineers to use the software even when precise knowledge of some of these probabilities was unavailable.
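As a toy illustration of the underlying idea (this is not the group's inference engine, and the part names and numbers are invented): for independent parts in a series system, the probability of system failure is monotone in each part's failure probability, so interval bounds on the inputs translate directly into interval bounds on the output.
\begin{verbatim}
# Toy sketch: bounds on system failure probability from interval inputs.
def series_failure(ps):
    # P(system fails) = 1 - prod(1 - p_i) for independent parts
    prob_ok = 1.0
    for p in ps:
        prob_ok *= (1.0 - p)
    return 1.0 - prob_ok

intervals = {"pump": (0.01, 0.05), "valve": (0.02, 0.04)}  # hypothetical parts
lower = series_failure([lo for lo, hi in intervals.values()])
upper = series_failure([hi for lo, hi in intervals.values()])
print(lower, upper)  # the system failure probability lies in [lower, upper]
\end{verbatim}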
Since no paper detailing the proposed extension was available, I learned the theory on my own. I then pieced together two algorithms from different papers in the area and implemented both the extension to the inference engine and the corresponding extension to the GUI.
Although my research and implementation of the extension were done independently, this project also gave me experience working with a larger team: five people, including me, worked on the same piece of software. There were regular team meetings and frequent design discussions in which architectural decisions were made.
At the end of college, I worked independently on an honors thesis researching software that assists in collaboration\footnote{sometimes called ``social software''}, both scientific collaboration and collaboration in the context of programming projects. The work required extensive research on existing tools, as well as daily discussions and actual collaborations with an international group of people interested in social software. The results were the thesis itself and several open-source software tools.
At Stanford I also earned a Master's degree in mathematics. Since then, I have studied the statistical theory of learning and pattern recognition. My background in these areas makes me particularly well qualified to understand the advanced analytical methods used in theoretical neurobiology.
During my study for the Master's degree, I spent a summer working with Dr. Adam Taylor and Prof. Gary Cottrell on a model of the leech swim central pattern generator (CPG). Dr. Taylor had previously published a model of the CPG that reproduced real electrophysiological data; however, the model was overly sensitive to small variations in parameters such as synaptic weights. We attempted to make the model more stable, and we also modified it to approximate the actual synaptic connectivity of the swim CPG more realistically. In the end, we decided that stability could be better achieved by replacing the simple model with a conductance-based model.
Since my arrival at UCSD, I have completed several lab rotations. With Prof. Henry Abarbanel, I reviewed the literature on birdsong, and in particular on the RA nucleus of zebra finches. I independently created conductance-based models of two types of neurons found in the RA nucleus. I also wrote a semi-automated software workbench for modeling electrophysiological data, and in particular for finding appropriate parameters for conductance-based models given voltage traces recorded from the living system.
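For illustration, the sketch below shows the general class of model I mean: a single-compartment conductance-based neuron integrated with forward Euler. The channel kinetics and parameters are the textbook Hodgkin-Huxley squid-axon values, not the RA-specific ones used in my models:
\begin{verbatim}
# Illustrative single-compartment conductance-based (Hodgkin-Huxley) model.
import numpy as np

def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3   # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4         # reversal potentials, mV

dt, T = 0.01, 50.0                       # time step and duration, ms
V, m, h, n = -65.0, 0.05, 0.6, 0.32      # initial state
trace = []
for step in range(int(T / dt)):
    I_ext = 10.0 if step * dt > 5.0 else 0.0   # step current, uA/cm^2
    # update channel gating variables
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    # membrane equation: ionic currents plus injected current
    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V += dt * (I_ext - I_ion) / C
    trace.append(V)
print(f"peak membrane potential: {max(trace):.1f} mV")  # spikes during the step
\end{verbatim}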
With Dr. Maxim Bazhenov and Prof. Terrence Sejnowski, I studied the insect antennal lobe and, in particular, neural codes based on transient synchrony. I learned and implemented statistical techniques for analyzing experimental data and detecting transient synchrony, and I assisted Dr. Bazhenov in his modeling work on this system.
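As a rough illustration of one simple approach (far simpler than the analyses I actually implemented), two spike trains can be binned and correlated in a sliding window; the spike times below are synthetic, with a shared burst inserted around one second:
\begin{verbatim}
# Toy sketch: sliding-window correlation between two binned spike trains.
import numpy as np

rng = np.random.default_rng(0)
t_max, bin_ms, win_bins = 2000.0, 5.0, 40   # 2 s, 5 ms bins, 200 ms windows
edges = np.arange(0.0, t_max + bin_ms, bin_ms)

# two cells firing independently, except for a shared burst near 1000 ms
shared = rng.uniform(950.0, 1050.0, 30)
cell_a = np.concatenate([rng.uniform(0.0, t_max, 60), shared])
cell_b = np.concatenate([rng.uniform(0.0, t_max, 60),
                         shared + rng.normal(0.0, 2.0, 30)])

a, _ = np.histogram(cell_a, edges)
b, _ = np.histogram(cell_b, edges)

for start in range(0, len(a) - win_bins, win_bins):
    wa, wb = a[start:start + win_bins], b[start:start + win_bins]
    if wa.std() > 0 and wb.std() > 0:
        r = np.corrcoef(wa, wb)[0, 1]
        print(f"{start * bin_ms:6.0f} ms  r = {r:+.2f}")
# the windows covering the burst near 1000 ms should show elevated correlation
\end{verbatim}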
With Prof. Charles Stevens, I analyzed the synaptic connectivity of the lobster stomatogastric ganglion. I programmed an analysis tool to detect the subnetworks corresponding to the pyloric and gastric CPGs. Further work is in progress to determine whether other useful features of this neural system can be extracted from similar analyses of its connectivity.
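The sketch below is a toy version of this kind of analysis, not the actual ganglion data: given a list of synaptic connections, a standard community-detection routine groups neurons into densely interconnected subnetworks. The neuron names and connections are placeholders:
\begin{verbatim}
# Toy sketch: find densely interconnected subnetworks in a connectivity graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

synapses = [
    # a densely connected "pyloric-like" cluster
    ("AB", "PD"), ("PD", "LP"), ("LP", "PY"), ("PY", "AB"), ("AB", "LP"),
    # a densely connected "gastric-like" cluster
    ("LG", "DG"), ("DG", "GM"), ("GM", "MG"), ("MG", "LG"), ("LG", "GM"),
    # a sparse connection between the two
    ("LP", "LG"),
]
g = nx.Graph()
g.add_edges_from(synapses)

for i, community in enumerate(greedy_modularity_communities(g)):
    print(f"subnetwork {i}: {sorted(community)}")
\end{verbatim}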
For the past few months I have been rotating in Prof. Massimo Scanziani's lab, which focuses on hippocampal slice physiology. There, I have learned to prepare rat hippocampal slices and to make patch-clamp recordings.