#Cosyne2016, by the numbers

[Figure: most prolific Cosyne 2016 authors]

Cosyne is the systems and computational neuroscience conference held every year in Salt Lake City and Snowbird. It is a pretty good representation of the direction the community is heading… though given the falling acceptance rate, you have to wonder how true that will stay, especially for those on the ‘fringe’. But 2016 is in the air, so it is time to update the Cosyne statistics.

I’m always curious about who is most active in any given year, and this year it is Xiao-Jing Wang, whom I dub this year’s Hierarch of Cosyne. I always think of his work on decision-making and the speed-accuracy tradeoff; he has used some very nice modeling of small circuits to show how these tasks could be implemented in nervous systems. Glancing over his posters, though, his work this year looks a bit more varied.

Still, it is nice to see such a large clump of people at the top: the distribution of posters is much flatter this year than previously, which suggests the activity is spread a bit more evenly across labs than it used to be.
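(For the curious, here is a minimal sketch of the kind of counting behind these rankings. The `abstracts` list is a hypothetical stand-in for the parsed author lists; the real data comes from the Cosyne abstract booklets.)

```python
from collections import Counter

# Hypothetical stand-in: one author list per accepted abstract.
abstracts = [
    ["X. Wang", "J. Murray"],
    ["L. Paninski", "J. Pillow"],
    ["X. Wang", "T. Yang"],
]

# One count per abstract an author appears on.
poster_counts = Counter(author for authors in abstracts for author in authors)

# Whoever tops this list is the year's 'Hierarch of Cosyne'.
for author, n in poster_counts.most_common(5):
    print(author, n)
```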

Here are the previous ‘leaders’:

  • 2004: L. Abbott/M. Meister
  • 2005: A. Zador
  • 2006: P. Dayan
  • 2007: L. Paninski
  • 2008: L. Paninski
  • 2009: J. Victor
  • 2010: A. Zador
  • 2011: L. Paninski
  • 2012: E. Simoncelli
  • 2013: J. Pillow/L. Abbott/L. Paninski
  • 2014: W. Gerstner
  • 2015: C. Brody
  • 2016: X. Wang

[Figure: most prolific Cosyne authors, all years combined]

If you look at the total number across all years, well, Liam Paninski is still massacring everyone else. At this rate, even if Pope Paninski doesn’t submit any abstracts over the next few years and someone else submits six per year, it will be a good half-decade before he could possibly be dethroned.

The network diagram of co-authors is interesting, as usual. Here is the diagram for 2016 (click for PDF):

[Figure: Cosyne 2016 co-authorship network]

And the mess that is all-time Cosyne:

[Figure: all-time Cosyne co-authorship network]
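If you want to play with these networks yourself, here is a minimal sketch of how such a co-authorship graph can be built with networkx; the `abstracts` list is again a hypothetical stand-in for the parsed author lists.

```python
import itertools
import networkx as nx

# Hypothetical stand-in: one author list per abstract.
abstracts = [
    ["X. Wang", "J. Murray"],
    ["L. Paninski", "J. Pillow"],
    ["J. Pillow", "A. Pouget"],
]

G = nx.Graph()
for authors in abstracts:
    # Every pair of co-authors on an abstract gets an edge;
    # repeated collaborations accumulate in the edge weight.
    for a, b in itertools.combinations(authors, 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

print(G.number_of_nodes(), "authors,", G.number_of_edges(), "collaborations")
```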


I was curious about this network. How connected is it? What is its dimensionality? If you look at the eigenvalues of the adjacency matrix, you get:

[Figure: eigenvalue spectrum of the co-authorship adjacency matrix]
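Here is a minimal sketch of that computation, using a tiny stand-in graph; the real spectrum above comes from the full co-authorship network.

```python
import numpy as np
import networkx as nx

# A tiny stand-in co-authorship graph (the real one comes from the abstracts).
G = nx.Graph()
G.add_weighted_edges_from([
    ("A. Pouget", "P. Latham", 3),
    ("J. Pillow", "L. Paninski", 2),
    ("J. Pillow", "A. Pouget", 1),
])

nodes = list(G.nodes())
A = nx.to_numpy_array(G, nodelist=nodes, weight="weight")

# A is symmetric, so eigh applies: real eigenvalues, ascending order.
eigvals, eigvecs = np.linalg.eigh(A)
print(eigvals[::-1])  # the spectrum, largest first
```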

I put the first two eigenvectors at the bottom of this post, but suffice it to say that the first eigenvector is basically Pouget vs. Latham! And the second is Pillow vs. Paninski! So of course, I had to plot a few people in Pouget-Pillowspace:

[Figure: authors plotted in Pouget-Pillowspace, the plane of the first two eigenvectors]
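Continuing from the eigendecomposition sketch above, this is roughly how you would place authors in that plane; in the real data the two axes correspond to the Pouget-vs.-Latham and Pillow-vs.-Paninski directions.

```python
import matplotlib.pyplot as plt

# Columns of eigvecs are eigenvectors; eigh returns them in ascending
# eigenvalue order, so the last two belong to the largest eigenvalues.
v1 = eigvecs[:, -1]  # in the real data, roughly the Pouget-vs-Latham axis
v2 = eigvecs[:, -2]  # in the real data, roughly the Pillow-vs-Paninski axis

plt.scatter(v1, v2)
for name, x, y in zip(nodes, v1, v2):
    plt.annotate(name, (x, y))
plt.xlabel("eigenvector 1")
plt.ylabel("eigenvector 2")
plt.show()
```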

(What does this tell us? God knows, but I find it kind of funny. Pillowspace.)

Finally, I took a bunch of abstracts and fed them through a Markov model to generate some prototypical Cosyne sentences (a minimal sketch of the generator follows the list). Here are abstracts that you can submit for next year:

  • Based on gap in the solution with tighter synchrony manifested both a dark noise [and] much more abstract grammatical rules.
  • Tuning curves should not be crucial for an approximate Bayesian inference which would shift in sensory information about alternatives
  • However that information about 1 dimensional latent state would voluntarily switch to odor input pathways.
  • We used in the inter vibrissa evoked responses to obtain time frequency use of power law in sensory perception such manageable pieces have been argued to simultaneously [shift] acoustic patterns to food reward to significantly shifted responses
  • We obtained a computational capacity that is purely visual that the visual information may allow ganglion cells [to] use inhibitory coupling as NMDA receptors, pg iii, Dynamical State University
  • Here we find that the drifting gratings represent the performance of the movement.
  • For example, competing perceptions thereby preserve the interactions between network modalities.
  • This modeling framework of goal changes uses [the] gamma distribution.
  • Computation and spontaneous activity at the other stimulus saliency is innocuous and their target location in cortex encodes the initiation.
  • It is known as the presentation of the forepaw target reaction times is described with low dimensional systems theory Laura Busse Andrea Benucci Matteo Carandini Smith-Kettlewell Eye Research.
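As promised, here is a minimal sketch of a word-level Markov generator of the kind that produces sentences like these. The `corpus` line is a hypothetical placeholder for the pooled abstract text, and the second-order chain is an assumption, not necessarily what was used here.

```python
import random
from collections import defaultdict

def train_markov(texts, order=2):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    chain = defaultdict(list)
    for text in texts:
        words = text.split()
        for i in range(len(words) - order):
            key = tuple(words[i:i + order])
            chain[key].append(words[i + order])
    return chain

def generate(chain, length=20):
    """Random-walk the chain to produce one pseudo-sentence (assumes order 2)."""
    key = random.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-2:]))
        if not followers:
            break  # dead end: no observed continuation
        out.append(random.choice(followers))
    return " ".join(out)

# Hypothetical usage: `corpus` would be the raw Cosyne abstract texts.
corpus = ["we find that drifting gratings represent the performance of the movement"]
print(generate(train_markov(corpus)))
```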

Note: sorry about the small font size. This is normally a pet peeve of mine. I need to get access to Illustrator to fix it and will do so later…

The first two eigenvectors:

[Figures: the first and second eigenvectors of the adjacency matrix]