NEW POSITION
In 2011 I moved to a new position with the Visual Computing Group at Microsoft Research Asia. My new webpage can be found here.



OLD WEBSITE

[Lab photo (labpic.eps)]

David Wipf
Postdoctoral Fellow
Biomagnetic Imaging Lab
University of California, San Francisco
513 Parnassus Avenue, S362
San Francisco, CA 94143
Email: {first name}.{last name} at gmail.com


Background and Research Interests

I recently completed my Ph.D. at the University of California, San Diego, where I was an NSF Fellow in Vision and Learning in Humans and Machines. I am now an NIH Postdoctoral Fellow at the University of California, San Francisco, working on Bayesian estimation applied to the problem of finding sparse representations of signals using overcomplete (redundant) dictionaries of candidate features. In contrast to the Moore-Penrose pseudoinverse, which produces a representation with minimal energy but high diversity (many nonzero coefficients), I'm concerned with finding inverse solutions that use a minimal number of nonzero expansion coefficients (maximal sparsity). A particularly useful application of this methodology is the source localization problem that arises in neuroelectromagnetic imaging and brain-computer interfacing (BCI). Here the goal is to convert an array of scalp sensor measurements into an estimate of synchronous current activity within the brain, which can then be used for classifying brain states or for other clinical tasks. I'm also looking at sparse coding problems associated with the visual cortex.
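Concretely, suppose $y \in \mathbb{R}^n$ holds the measurements and $\Phi \in \mathbb{R}^{n \times m}$, $m > n$, is the overcomplete dictionary (assumed to have full row rank). The two inverse solutions above can then be contrasted as follows (this display is my own illustration of the standard formulations):

\begin{displaymath}
x_{\mathrm{MN}} = \Phi^T (\Phi \Phi^T)^{-1} y = \arg\min_x \|x\|_2 \;\; \mathrm{s.t.} \;\; y = \Phi x,
\qquad
x_{\mathrm{sparse}} = \arg\min_x \|x\|_0 \;\; \mathrm{s.t.} \;\; y = \Phi x,
\end{displaymath}

where $\|x\|_0$ counts the nonzero entries of $x$. The latter combinatorial problem is NP-hard in general, which is why tractable surrogates such as SBL and $\ell_p$-norm minimization (see the Matlab Code section below) are used in practice.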


Publications by Year

2010

2009

2008

2007

2006

2005

2004


Matlab Code

Some of the algorithms we use can be run with very simple Matlab code. Several generalized versions of SBL (popularly known as the relevance vector machine) are included, as well as $\ell_p$-norm minimization methods. All are (somewhat) optimized to work well with very overcomplete dictionaries (unlike some RVM code available elsewhere). In general, I have found that the slower EM updates (sketched below) sometimes give better solutions but can be impractical to run on large problems. Regardless, the code here should work fine with many thousands of dictionary columns as long as the number of rows is small enough (e.g. $275 \times 40,000$ works fine with the fast updates but is too slow for standard EM). Please send me an email if there are any problems. Note also that this code can be sped up dramatically by removing the SVD computation, which is not required but can be more numerically stable in some situations.
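For reference, here is a minimal sketch of one form of the standard SBL EM iteration for $y = \Phi x + \epsilon$ with noise variance $\lambda$; the function name sbl_em and its interface are my own illustration and do not match the interface of the downloadable code:

   function [mu, gamma] = sbl_em(Phi, y, lam, iters)
   % Minimal sketch of standard SBL (EM) updates for y = Phi*x + noise.
   % Phi: n-by-m overcomplete dictionary (m >> n), y: n-by-1 measurements,
   % lam: noise variance (assumed known here), iters: number of EM passes.
   [n, m] = size(Phi);
   gamma = ones(m, 1);                          % hyperparameters (prior variances)
   for k = 1:iters
       G     = spdiags(gamma, 0, m, m);         % Gamma = diag(gamma)
       PG    = Phi * G;                         % Phi*Gamma, n-by-m
       S     = lam * eye(n) + PG * Phi';        % Sigma_y = lam*I + Phi*Gamma*Phi'
       mu    = PG' * (S \ y);                   % posterior mean: Gamma*Phi'*inv(S)*y
       dSig  = gamma - sum((PG' / S) .* PG', 2);  % diag of posterior covariance
       gamma = mu.^2 + dSig;                    % EM update: gamma_i = mu_i^2 + Sigma_ii
   end

As the iterations proceed, most gamma_i are driven toward zero, and the corresponding coefficients of mu vanish, which is what produces the sparse representation. A toy usage example:

   n = 50; m = 1000;
   Phi = randn(n, m);
   x0 = zeros(m, 1); idx = randperm(m); x0(idx(1:5)) = randn(5, 1);
   [mu, gamma] = sbl_em(Phi, Phi*x0, 1e-6, 200);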

Download

Also, for a very fast implementation of SBL that uses the method of Tipping and Faul (2003), see here.



David Wipf 2010-09-10