    BOUQUET: Learning Large Vocabularies for Time Series and Images with Bernoulli Quantized Tokenization

    To be released soon!
    BOUQUET learns over 1 billion unique tokens that capture 15x more information, while enabling foundation models that train 6x faster, respond 12x quicker, and use 100x less memory.
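
    BOUQUET itself is unreleased, so the sketch below is not its method; it is a generic illustration of binary (Bernoulli-style) quantization that shows only the vocabulary arithmetic: d independent binary latent dimensions index a 2^d vocabulary with no explicit codebook, and d = 30 already exceeds 1 billion tokens (2^30 ≈ 1.07e9). All names and shapes here are invented.

        import numpy as np

        def binary_tokenize(logits):
            """Map encoder logits of shape (..., d) to integer token IDs in [0, 2**d)."""
            bits = (logits > 0).astype(np.int64)                      # hard 0/1 decision per latent dim
            powers = 2 ** np.arange(bits.shape[-1], dtype=np.int64)   # bit-position weights
            return bits @ powers                                      # read the bit-string as an integer

        d = 30                              # 2**30 ~ 1.07 billion possible tokens
        logits = np.random.randn(4, d)      # stand-in for encoder outputs
        print(binary_tokenize(logits))      # 4 IDs drawn from a >1-billion-token vocabulary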


    Population Transformer: Learning Population-Level Representations of Neural Activity

    Oral Paper (Top 1.8% of Accepted Papers): [Paper] [Code]
    Geeling Chau*, Christopher Wang*, Sabera Talukder, Vighnesh Subramaniam, Saraswati Soedarmadji, Yisong Yue, Boris Katz, Andrei Barbu
    ICLR 2025
    Our next-generation neuroscience foundation model overcomes sparse electrode coverage, human-to-human variability, and dataset discrepancies.


    Generalizability Under Sensor Failure: Tokenization + Transformers Enable More Robust Latent Spaces

    [Paper] [Code]
    Geeling Chau*, Yujin An*, Ahamed Raffey Iqbal*, Soon-Jo Chung, Yisong Yue, Sabera Talukder
    COSYNE 2024
    We run human neural experiments on 4 subjects and show that tokenization + transformers yield the latent spaces most robust to sensor failure.


    TOTEM: Tokenized Time Series Embeddings for General Time Series Analysis

    [Paper] [Code] [Video]
    Sabera Talukder, Yisong Yue, Georgia Gkioxari
    TMLR 2024
    TOTEM is one of the first time series foundation models, drawing on 30+ datasets across 20+ scientific domains and solving 5 tasks.
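
    TOTEM tokenizes time series with a self-supervised VQ-VAE; the snippet below is a minimal sketch of the nearest-codeword assignment at the heart of any VQ tokenizer. The codebook size, latent dimension, and data are illustrative stand-ins, not values from the paper.

        import numpy as np

        def vq_tokenize(latents, codebook):
            """Assign each latent vector in (T, d) to its nearest codeword in (K, d)."""
            dists = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)  # (T, K) sq. distances
            return dists.argmin(axis=1)                  # one discrete token ID per position

        codebook = np.random.randn(256, 64)              # K=256 codewords of dim 64 (made up)
        latents = np.random.randn(96, 64)                # encoder output for one series
        print(vq_tokenize(latents, codebook))            # the series as a token sequence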


    Deep Neural Imputation: A Framework for Recovering Incomplete Brain Recordings

    [Paper]
    Sabera Talukder*, Jennifer Sun*, Matthew Leonard, Bing Brunton, Yisong Yue
    COSYNE 2023 | Learning from Time Series for Health Workshop, NeurIPS 2022
    This is among the first neuroscience foundation models: trained on invasive neural recordings from 12 patients, it recovers lost brain signals.


    On the Benefits of Early Fusion in Multimodal Representation Learning

    [Paper]
    George Barnum*, Sabera Talukder*, Yisong Yue
    Shared Visual Representations in Human & Machine Intelligence Workshop, NeurIPS 2020
    We show that early fusion of audio and visual signals yields noise-robust neural networks that mirror the brain’s own multi-sensory convergence.
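
    To make the design choice concrete (this is a schematic contrast with invented shapes, not the paper's architecture): early fusion concatenates raw audio and visual features so a single network processes both from the first layer, while late fusion merges per-modality representations only at the end.

        import numpy as np

        rng = np.random.default_rng(0)
        audio = rng.standard_normal((32, 128))      # (batch, audio features), made up
        visual = rng.standard_normal((32, 512))     # (batch, visual features), made up

        # Early fusion: one joint input, so a shared network sees both modalities from layer 1.
        early_input = np.concatenate([audio, visual], axis=1)           # (32, 640)

        # Late fusion (the usual baseline): process each modality alone, merge at the end.
        audio_repr = np.tanh(audio @ rng.standard_normal((128, 64)))
        visual_repr = np.tanh(visual @ rng.standard_normal((512, 64)))
        late_input = np.concatenate([audio_repr, visual_repr], axis=1)  # (32, 128)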


    Architecture Agnostic Neural Networks

    Oral Paper: [Paper] [Video]
    Sabera Talukder*, Guruprasad Raghavan*, Yisong Yue
    BabyMind Workshop, NeurIPS 2020
    Inspired by the brain's synaptic plasticity, we built an architecture manifold search algorithm that uncovers whole families of performant neural networks.


    A Smoke-Removing DC-DC Converter Composed of a Seven-Stage Class DE Rectifier and a Class ϕ2 Inverter

    [Honors Thesis]
    Sabera Talukder
    Electrical Engineering Honors Thesis, Stanford University, 2018
    I introduce a miniaturized, portable, low-power electrostatic precipitator that lowers indoor air pollution from rural cookstoves.


    Exploring Visual Memory Formation in Drosophila melanogaster

    [Honors Thesis]
    Sabera Talukder
    Biochemistry Honors Thesis, Stanford University, 2018
    We created a 6-stimulus training suite across 4 rigs to test whether fruit flies can form visual memories.


    A Portable Electrostatic Precipitator to Reduce Respiratory Death in Rural Environments

    Oral Paper: [Paper]
    Sabera Talukder, Sanghyeon Park, Juan Rivas-Davila
    IEEE COMPEL 2017
    With 4.3 million lives lost each year to cooking smoke, we built a circuit that boosts voltage ~100x to remove harmful airborne particles.