Trends in Computational Neuroscience: Cosyne 2022

March 2022


  In Cosyne 2022's keynote talk, Robert Datta said that since he got tenure ๐ŸŽ‰ he could engage in some wild speculation. This is a great rule to have! That being said, I most certainly do not have tenure, but I'm still going to take a swing ๐Ÿ˜ƒ

  Before I dive in, let me explain what this blog post aims to do. This blog post does NOT explain what happened at Cosyne 2022. If you want to see what happened go to: Cosyne 2022 Resources. This post IS about clarifying the trends that transcend model animal, experiment vs. computation, and so many more of our field's dichotomies. These thoughts are my attempt at synthesizing not only where we currently stand as a field and community, but also where we hope to go.

  When I started this endeavor of identifying trends I wanted to see all the talk titles together. No authors. No institutions. Just titles. I wanted to put myself into Laura's and Tim's (the organizers') minds and envision what they wanted us to get out of Cosyne 2022. So, I did just that. Before reading this post, I challenge you to look at my list and see what trends you generate! It'll help you at the end of this post, I promise ๐Ÿ˜

  With that backdrop, let's get started!! I identified eight ๐Ÿ˜ฑ trends; they are:

    โœฆ Behavior
    โœฆ Comparing Artificial Neural Networks to (dare I say) Real Neural Networks ๐Ÿ˜‰
    โœฆ Opportunities in Dimensionality Reduction
    โœฆ Data Scaling
    โœฆ Single Units โžก๏ธ Population Level
    โœฆ Machine Learning for Computational Neuroscience Lags
    โœฆ Model Interpretability
    โœฆ Attractors
  So buckle up, it's going to be a ๐ŸŒถ ride!



1 - Behavior


[Image: Mugatu meme]


  One of THE 🔥hottest🔥 trends we saw at Cosyne 2022 was behavior. And honestly, nearly everything about it: from judging the performance of our models on both neural and behavioral metrics, to building analysis pipelines specifically for behavior. A professor even said, "I know I reported a lot of behavioral results, so let me pause here before I get to the neural results."

  To hit this point home, I'm going to show you two conclusion slides that caught my eye. I added red arrows and underlines to really underscore the behavior-with-neural point! These slides were presented on different days in different sessions, but the striking similarity between their layouts speaks for itself. At Cosyne 2022, behavior was 👑!

[Slide: behavior brain score, from Dan Yamins]

[Slide: behavior and Rastermap, from Carsen Stringer]


  Behavior was heavily discussed at Cosyne 2022 for several reasons: (1) we now have the ability to collect huge amounts of data (see the Data Scaling section), and (2) it's important for us to analyze as many datastreams (neural, behavioral, etc.) as possible to fully describe the scientific phenomena we observe.

  Many subfields of behavior were addressed, but one I want to dig into is pose tracking. There were many, many, many examples discussed throughout the talks, posters, and hallways of Cosyne 2022.

[Slide: 3D pose tracking, from Kiah Hardcastle]

Some of the ones we all know, love, and use are:
  When so many repositories exist to perform very similar tasks, and suffer from very similar hurdles (multi-animal tracking, the data annotation bottleneck, pose-tracking stability over time, etc.), I can't help but wonder whether we as a community should stop and ask ourselves: (1) can we go further by building on top of one another's models and, more importantly, repositories, and (2) can we do a better job of leveraging advances from the deep learning community?
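
  On the shared-hurdles point: as a tiny illustration, here is the kind of confidence-filtering and gap-interpolation step that nearly every pose-tracking pipeline reimplements on its own. This is a minimal NumPy sketch with made-up shapes and a hypothetical `clean_keypoints` helper, not code from any particular package:

```python
import numpy as np

def clean_keypoints(xy, confidence, threshold=0.6):
    """Mask low-confidence keypoint detections and linearly interpolate
    over the gaps. xy: (n_frames, n_keypoints, 2) tracked coordinates;
    confidence: (n_frames, n_keypoints) per-detection confidence."""
    xy = xy.copy()
    frames = np.arange(xy.shape[0])
    for k in range(xy.shape[1]):
        good = confidence[:, k] >= threshold
        if good.sum() < 2:
            continue  # too few reliable detections on this keypoint
        for d in range(2):  # interpolate x and y independently
            xy[~good, k, d] = np.interp(frames[~good], frames[good], xy[good, k, d])
    return xy

# Toy example: 100 frames of a 4-keypoint random walk with noisy confidences.
rng = np.random.default_rng(0)
trajectory = rng.normal(size=(100, 4, 2)).cumsum(axis=0)
conf = rng.uniform(size=(100, 4))
cleaned = clean_keypoints(trajectory, conf)
```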



2 - Comparing Artificial Neural Networks to (dare I say) Real Neural Networks ๐Ÿ˜‰


  I know, I know, I'm being cheeky! But, I have to motivate you through this post somehow ๐Ÿค“ This area of research was incredibly well represented at Cosyne 2022, but it took numerous forms; I'm going to highlight a few that encapsulate the main directions.

  The first is 1:1 mapped artificial and biological neural networks. Let me expand: the artificial neural network's units have a direct 1:1 mapping with biological neurons (in this case, the lobula columnar (LC) neurons in Drosophila). A knockout training paradigm is the key to getting a direct correspondence between artificial LC units and biological LC neurons.

[Slide: 1:1 mapped LC neurons, from Mala Murthy]
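
  To make the knockout-training idea concrete, here is a toy sketch of what such a setup could look like. To be clear, this is my reading of the paradigm, not the authors' code; the architecture, names, and shapes are all hypothetical. The key detail is that silencing happens inside the recurrence, so a knockout actually reshapes the network's dynamics, just as it would in the fly:

```python
import torch
import torch.nn as nn

class KnockoutRNN(nn.Module):
    """Minimal RNN whose hidden units are intended to map 1:1 onto
    biological neurons (hypothetically, the fly's LC cells)."""
    def __init__(self, n_in, n_units, n_out):
        super().__init__()
        self.w_in = nn.Linear(n_in, n_units)
        self.w_rec = nn.Linear(n_units, n_units, bias=False)
        self.readout = nn.Linear(n_units, n_out)
        self.n_units = n_units

    def forward(self, x, knockout_mask=None):
        # knockout_mask: (n_units,) of 0/1; zeroed units are silenced
        # inside the recurrence, so the knockout shapes the dynamics.
        batch, T, _ = x.shape
        h = x.new_zeros(batch, self.n_units)
        outs = []
        for t in range(T):
            h = torch.tanh(self.w_in(x[:, t]) + self.w_rec(h))
            if knockout_mask is not None:
                h = h * knockout_mask
            outs.append(self.readout(h))
        return torch.stack(outs, dim=1)

model = KnockoutRNN(n_in=10, n_units=30, n_out=2)
mask = torch.ones(30)
mask[5] = 0.0  # silence artificial "neuron 5", mirroring a biological knockout
# Training would pair each mask with behavior measured in the matching
# biological knockout, forcing the unit-to-neuron correspondence.
y = model(torch.randn(8, 50, 10), knockout_mask=mask)
```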


  The second is generative modeling! If you take in real neural data and learn its underlying distribution, then you can generate your own neural data 😮 The generated, and hopefully realistic, neural data can then be compared to real neural data. Why, you might ask? For incredibly difficult neuroengineering problems like training data-hungry models in data-limited regimes (I'm only a little biased since this is what I work on 🙃). If underlying distributions and data characteristics get you going, I recommend sticking around for the Opportunities in Dimensionality Reduction section!

[Slide: generative modeling of neural data, from Eva Dyer]
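
  For the flavor of the approach, here is a minimal sketch of a VAE over binned spike counts. It's a toy stand-in, not any particular lab's model; the Poisson readout and every shape in it are my assumptions:

```python
import torch
import torch.nn as nn

class SpikeVAE(nn.Module):
    """Toy VAE over binned spike counts: learn the data distribution,
    then sample synthetic trials from the prior."""
    def __init__(self, n_neurons, n_latent=8):
        super().__init__()
        self.enc = nn.Linear(n_neurons, 2 * n_latent)  # -> mu, logvar
        self.dec = nn.Linear(n_latent, n_neurons)      # -> Poisson log-rates
        self.n_latent = n_latent

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        log_rate = self.dec(z)
        # negative Poisson log-likelihood (up to a constant) + KL to the prior
        nll = (log_rate.exp() - x * log_rate).sum(-1).mean()
        kl = -0.5 * (1 + logvar - mu**2 - logvar.exp()).sum(-1).mean()
        return nll + kl

    @torch.no_grad()
    def sample(self, n_trials):
        z = torch.randn(n_trials, self.n_latent)
        return torch.poisson(self.dec(z).exp())  # synthetic spike counts

model = SpikeVAE(n_neurons=50)
loss = model(torch.poisson(torch.full((32, 50), 4.0)))  # fake training batch
synthetic = model.sample(100)                           # "generated neural data"
```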


  The third is mixing biological mechanisms with artificial tasks. The MNIST digit dataset and classification task are canonical in machine learning. When we use biological mechanisms (like saccading) to solve classic artificial tasks (like digit classification), we move to an exciting hybrid space where we can analyze both artificial and biological systems with the same metrics.

[Slide: biological mechanisms on artificial tasks, from Valerio Mante]
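
  As a toy illustration (not the presented model; the glimpse size, scanpath, and GRU readout are all my assumptions), a "saccading" digit classifier might look like this:

```python
import torch
import torch.nn as nn

def glimpse(img, cy, cx, size=10):
    """Crop a small 'foveal' patch around (cy, cx), a stand-in for a saccade."""
    return img[..., cy:cy + size, cx:cx + size]

class SaccadeClassifier(nn.Module):
    """Classify a digit from a sequence of glimpses rather than the full image."""
    def __init__(self, size=10, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(size * size, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 10)

    def forward(self, img, fixations):
        patches = [glimpse(img, cy, cx).flatten(1) for cy, cx in fixations]
        _, h = self.rnn(torch.stack(patches, dim=1))  # integrate across saccades
        return self.out(h[-1])

# A batch of 28x28 "digits" and a hand-picked scanpath of three fixations.
img = torch.rand(4, 28, 28)
logits = SaccadeClassifier()(img, fixations=[(0, 0), (9, 9), (18, 18)])
```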


  Speaking of metrics... the last category is similarity metrics between neural networks. Wow, it's almost like somebody planned that ๐Ÿ˜ƒ Defining metrics that allow us to quantify similarity or dissimilarity is crucial for sound comparisons between artificial and biological networks (or even biological to biological, and artificial to artificial). Notice the feedforward neural network and mouse brain beside one another in the slide below!! If this sings to you ๐ŸŽถ, check out the next section: Opportunities in Dimensionality Reduction.

[Slide: similarity metrics between neural networks, from Alex Williams]
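
  One popular metric in this space is linear centered kernel alignment (CKA). The sketch below is a generic implementation on synthetic data, not the specific metric from the talk; CKA is invariant to rotations and isotropic scalings of either representation, which is exactly the kind of property these comparisons need:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two response matrices to the same stimuli.
    X: (n_stimuli, n_units_a), Y: (n_stimuli, n_units_b).
    Returns a similarity in [0, 1]."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    hsic = np.linalg.norm(X.T @ Y, "fro") ** 2
    return hsic / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(1)
model_layer = rng.normal(size=(1000, 50))       # fake model responses
Q, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # a random rotation
print(linear_cka(model_layer, model_layer @ Q))              # = 1.0: same geometry
print(linear_cka(model_layer, rng.normal(size=(1000, 50))))  # near 0: unrelated
```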


  We discussed comparing artificial neural networks to real neural networks through four distinct lenses: (1) 1:1 mapped artificial and biological neural networks, (2) generative modeling, (3) mixing biological mechanisms with artificial tasks, and (4) similarity metrics between neural networks. They are all incredibly interesting directions and I can't wait to see how they evolve at Cosyne 2023!



3 - Opportunities in Dimensionality Reduction


[Image: Sir Mix-a-Lot meme]


  Tough to go wrong with Sir Mix-a-Lot, but at this point you might be asking yourself: "Sabera, you said dimensionality reduction, not big data! What gives?" What gives is that big data is often high dimensional!

[Slide: big data is high dimensional, from Carsen Stringer]


  The slide above handily demonstrates that almost all of us at Cosyne deal with high-dimensional data, whether it be neural or behavioral. We all really want, and need, to do dimensionality reduction, but honestly we can do better. I'll let an anonymous professor from Cosyne 2022 speak for me: "It is a crime against data to just use 3D". In a different workshop, PCA on neural data showed that principal components (PCs) explaining relatively little variance were still helpful in describing the genetic relationships between neural populations; strikingly, these relationships were not captured by the leading, high-variance PCs! Don't get me wrong, sometimes these techniques work, as can be seen in the examples below, but why not do better?

[Slide: PCA on speech, from Mike Long]

[Slide: IsoMap on neural data, from Andre Fenton]

[Slide: Rastermap vs. tSNE, UMAP, and PCA, from Carsen Stringer]
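
  To make the low-variance-PC point concrete, here is a toy simulation (entirely synthetic and my own construction, not data from any talk) where a planted "behavioral" signal lives along the quietest axis of the data, so the leading PCs miss it entirely:

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_neurons = 1000, 50
labels = rng.integers(0, 2, n_trials)  # a fake binary behavior label

# Anisotropic "background" activity: loud first axes, quiet last axes.
scales = np.linspace(5.0, 0.3, n_neurons)
X = rng.normal(size=(n_trials, n_neurons)) * scales
X[:, -1] += (labels - 0.5) * 1.0  # plant the signal on the quietest axis

# PCA via SVD of the centered data; rows of Vt are the PCs.
Xc = X - X.mean(0)
_, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt.T

# Class separation along each PC (difference of class means / overall std).
for i in (0, 1, 2, n_neurons - 1):
    d = (proj[labels == 1, i].mean() - proj[labels == 0, i].mean()) / proj[:, i].std()
    print(f"PC{i + 1}: var={S[i]**2:.0f}, separation={abs(d):.2f}")
# Top PCs: large variance, ~zero separation. Last PC: tiny variance, clear signal.
```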

  We are still using PCA, IsoMap, tSNE, UMAP, etc. for biological data because, in all honesty, preserving local AND global structure in dimensionality-reduced neural and behavioral data is not a solved problem. If you're interested in diving into the complicated world of embeddings, I recommend these papers and spicy 🌶️ tweet threads:

  There is a huge opportunity here to transform the way we do neuroscience if we: (1) create better low-dimensional embeddings (see Rastermap in the image above), and (2) develop better metrics for benchmarking nonlinear embeddings.
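
  On point (2): one existing, admittedly local-structure-only, benchmark is scikit-learn's trustworthiness score. Here is a minimal sketch on synthetic clustered data, just to show the shape of such a benchmark:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE, trustworthiness

# Fake "neural" data: 500 trials x 100 neurons with planted cluster structure.
rng = np.random.default_rng(2)
centers = rng.normal(scale=5.0, size=(5, 100))
X = np.vstack([c + rng.normal(size=(100, 100)) for c in centers])

embeddings = {
    "PCA-2D": PCA(n_components=2).fit_transform(X),
    "tSNE-2D": TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X),
}

# Trustworthiness asks whether points that are neighbors in the embedding
# were also neighbors in the original space (1.0 = perfectly preserved).
# It says nothing about global structure, which is exactly the gap.
for name, Z in embeddings.items():
    print(name, trustworthiness(X, Z, n_neighbors=10))
```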



4 - Data Scaling


[Slide: automated mouse training boxes, from Shannon Schiereck]

[Slide: Drosophila wiring diagram, from Flywire]

[Slide: mouse cortex, from Carsen Stringer]

[Slide: human MTurk experiments, from Ethan Bromberg-Martin]

  As discussed above in the Opportunities in Dimensionality Reduction section, we are entering the era of big neural data. Because of our desire as a field to collect larger and larger datasets, we are scaling our data collection and processing pipelines with automation and outsourcing.

  Here are a few of the examples I found at Cosyne 2022 which span many animal models:
  • High-throughput behavioral training facilities for rats. These allow you to collect over 5.2 million 😱 behavioral trials per year!!
  • Flywire.ai, which is crowdsourcing the first centralized brain wiring diagram in Drosophila.
  • Deep learning models to automate everything from cell segmentation, to behavior tracking, to neural data analysis, etc.
  • MTurk, or Amazon Mechanical Turk, which was used to generate human experimental data at Cosyne 2022. It has also been used in the machine learning community to get human annotations at scale.
  Through automation and outsourcing we are pushing the boundaries of neural and behavioral data scaling. As we progress, I hope we continue to take inspiration from other research communities that have explored solutions to these important problems.



  In true academic form, I will leave you with Exercises for the Reader. These exercises are trends I identified at Cosyne 2022 and collected examples for. I will share what I gathered with you, but there is a catch 😁 I would love for you to join the conversation, so that we can build out examples for these topics together. Interested in how you can join the conversations? Click Below!

  Before I leave you (with our trends synthesized, I hope 😅), I want to thank George Barnum for helping me more eloquently verbalize my thoughts during earlier versions of this post! I also want to thank you for sticking around to the end! I hope you learned something 🤗

Exercises for the Reader

5 - Single Units โžก๏ธ Population Level

    Join the Conversation for Topic 5!

6 - Machine Learning for Computational Neuroscience Lags

    Join the Conversation for Topic 6!

7 - Model Interpretability

    Join the Conversation for Topic 7!

8 - Attractors

    Join the Conversation for Topic 8!



If you enjoyed this article, please let me know! ๐Ÿ‘‰