
Subsets and references

This page summarizes the Enigma-Mouse neurophysiological dataset used to train and evaluate OmniMouse, including its sources, experimental modalities, and key recording characteristics. For details on individual experiment sessions, see Dataset Details.

All experiments were conducted in awake, head-fixed mice and approved by the Institutional Animal Care and Use Committee of Baylor College of Medicine.

Enigma-Mouse


Enigma-Mouse combines newly recorded data with prior recordings from other projects (sub-datasets listed below). It was introduced in the OmniMouse paper by Willeke et al. in 2026 and is published here as the Enigma-Mouse dataset with its own DOI.
Source
Willeke et al., 2026 (OmniMouse)
DOI Enigma-Mouse dataset
TBA
DOI OmniMouse paper
TBA
Animals / recordings
78 unique mice overall (some appear in both training and evaluation), with 4,210–11,284 neurons per session and >3 million single-unit recordings in total.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Stimuli
Naturalistic images (ImageNet) and videos (cinematic movies + Sports-1M), plus parametric and synthetic stimuli (e.g., static/drifting Gabors, directional pink noise, flashing Gaussian dots, random dot kinematograms, model-generated stimuli). Presented at 30–60 Hz; images shown for 500 ms with 300–500 ms blank intervals.
Behavior
Five aligned variables (locomotion speed, pupil size, pupil-size change, horizontal eye position, vertical eye position).
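
Of these five variables, pupil-size change is the temporal derivative of the pupil-size trace. As a minimal sketch of how it could be derived from a uniformly sampled pupil trace (the sampling rate, units, and function name here are illustrative assumptions, not taken from the dataset):

```python
import numpy as np

def pupil_size_change(pupil_size, rate_hz):
    """Temporal derivative of a uniformly sampled pupil-size trace (units per second)."""
    return np.gradient(pupil_size, 1.0 / rate_hz)

# Linearly dilating pupil: the rate of change is constant
t = np.arange(0, 1, 1 / 30)      # 1 s sampled at an assumed 30 Hz
size = 2.0 + 0.5 * t             # pupil size in mm, growing at 0.5 mm/s
change = pupil_size_change(size, 30)
```

`np.gradient` uses central differences in the interior and one-sided differences at the edges, so for this linear trace every sample of `change` is 0.5 mm/s (up to float precision).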

Sensorium-2022


Source
Willeke et al., 2022 (Sensorium 2022 competition)
DOI
10.48550/arXiv.2206.08666
Animals / recordings
7 behaving mice (7 recording sessions). Five “pretraining” recordings are released for training/generalization; two “competition” recordings are reserved for held-out evaluation (withheld responses for live and final test scoring).
Stimuli
Natural images sampled from ImageNet, converted to grayscale; presented for 500 ms with 300–500 ms blank intervals.
Neural data
Two-photon calcium imaging of excitatory neurons in layer 2/3 of right V1; responses aggregated 50–550 ms after stimulus onset (boxcar window).
Behavior
Five aligned variables (locomotion speed, pupil size, pupil-size change, horizontal eye position, vertical eye position).
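
The boxcar aggregation described above can be sketched as averaging each neuron's trace over a fixed window after stimulus onset. This is a minimal illustration under assumed sampling parameters; the function and variable names are hypothetical, not the dataset's API:

```python
import numpy as np

def boxcar_response(trace, onset_s, rate_hz, start_s=0.05, end_s=0.55):
    """Average a calcium trace over a boxcar window after stimulus onset.

    trace   : 1-D array of fluorescence samples for one neuron
    onset_s : stimulus onset time in seconds
    rate_hz : sampling rate of the trace (assumed known from session metadata)
    """
    i0 = int(round((onset_s + start_s) * rate_hz))
    i1 = int(round((onset_s + end_s) * rate_hz))
    return float(np.mean(trace[i0:i1]))

# Example: 10 s trace at 30 Hz, stimulus at t = 2 s,
# with a transient confined to the 50-550 ms response window
rate = 30.0
trace = np.zeros(300)
trace[int(2.05 * rate):int(2.55 * rate)] = 1.0
response = boxcar_response(trace, onset_s=2.0, rate_hz=rate)  # → 1.0
```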

Sensorium-2023


Source
Turishcheva et al., 2024 (Sensorium 2023 competition)
DOI
10.48550/arXiv.2305.19654
Animals / recordings
10 behaving mice (10 recording sessions). Five recordings are released fully as pretraining data; five additional recordings are used for held-out competition evaluation (live/final test responses withheld).
Stimuli
Grayscale dynamic natural movies (cinematic clips + Sports-1M) presented at 30 Hz (~8–11 s clips), plus out-of-domain stimuli (ImageNet images, flashing Gaussian dots, random dot kinematograms, directional pink noise, drifting Gabors). For precise presentation details, see the whitepaper’s Materials and Methods section.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).

Functional connectomics spanning multiple areas of mouse visual cortex


Source
Bae et al., 2025 (Functional connectomics spanning multiple areas of mouse visual cortex)
MICrONS website
DOI
10.1038/s41586-025-08790-w
Animals / recordings
1 behaving mouse recorded over 12 sessions.
Stimuli
Natural videos for training; held-out natural videos and out-of-domain gratings for evaluation.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).

Functional connectomics reveals general wiring rule in mouse visual cortex


Source
Ding et al., 2025 (Functional connectomics reveals general wiring rule in mouse visual cortex)
DOI
10.1038/s41586-025-08840-3
Animals / recordings
6 behaving mice recorded over 7 sessions.
Stimuli
Natural videos for training; held-out natural videos and out-of-domain gratings for evaluation.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).

A global map of orientation tuning in mouse visual cortex


Source
Fahey et al., 2019 (A global map of orientation tuning in mouse visual cortex)
DOI
10.1101/745323
Animals / recordings
12 behaving mice recorded over 75 sessions.
Stimuli
Natural videos for training; held-out natural videos and out-of-domain gratings for evaluation.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).

Functional bipartite invariance in mouse primary visual cortex receptive fields


Source
Ding et al., 2026 (Functional bipartite invariance in mouse primary visual cortex receptive fields)
DOI
10.1038/s41593-026-02213-3
Animals / recordings
14 behaving mice recorded over 75 sessions.
Stimuli
Natural videos for training; held-out natural videos and out-of-domain gratings for evaluation.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).

Foundation model of neural activity predicts response to new stimulus types


Source
Wang et al., 2025 (Foundation model of neural activity predicts response to new stimulus types)
DOI
10.1038/s41586-025-08829-y
Animals / recordings
16 behaving mice (16 recording sessions). Eleven “pretraining” recordings are released for training/generalization; five recordings are reserved for held-out evaluation.
Stimuli
Natural videos for training; held-out natural videos and out-of-domain gratings for evaluation.
Neural data
Wide-field two-photon calcium imaging of excitatory neurons in layers 2–5 of right V1.
Behavior
Four aligned behavioral variables (locomotion speed, pupil size, horizontal pupil position, vertical pupil position).


OmniMouse

Main dataset citation.

@inproceedings{willeke2026omnimouse,
  title={OmniMouse: Scaling properties of multi-modal, multi-task Brain Models on 150B Neural Tokens},
  author={Konstantin Friedrich Willeke and Polina Turishcheva and Alex Gilbert and Goirik Chakrabarty and Hasan Atakan Bedel and Paul G. Fahey and Yongrong Qiu and Marissa A. Weis and Michaela Vystr{\v{c}}ilov{\'a} and Taliah Muhammad and Lydia Ntanavara and Rachel E Froebe and Kayla Ponder and Zheng Huan Tan and Emin Orhan and Erick Cobos and Sophia Sanborn and Katrin Franke and Fabian H. Sinz and Alexander S. Ecker and Andreas S. Tolias},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=mEw4lhAn0F}
}


Sensorium 2022

Original source citation for the Sensorium 2022 subset.

@misc{willeke2022sensoriumcompetitionpredictinglargescale,
  title={The Sensorium competition on predicting large-scale mouse primary visual cortex activity},
  author={Konstantin F. Willeke and Paul G. Fahey and Mohammad Bashiri and Laura Pede and Max F. Burg and Christoph Blessing and Santiago A. Cadena and Zhiwei Ding and Konstantin-Klemens Lurz and Kayla Ponder and Taliah Muhammad and Saumil S. Patel and Alexander S. Ecker and Andreas S. Tolias and Fabian H. Sinz},
  year={2022},
  eprint={2206.08666},
  archivePrefix={arXiv},
  primaryClass={q-bio.NC},
  url={https://arxiv.org/abs/2206.08666}
}


Sensorium 2023

Original source citation for the Sensorium 2023 subset.

@misc{turishcheva2024dynamicsensoriumcompetitionpredicting,
  title={The Dynamic Sensorium competition for predicting large-scale mouse visual cortex activity from videos},
  author={Polina Turishcheva and Paul G. Fahey and Laura Hansel and Rachel Froebe and Kayla Ponder and Michaela Vystrčilová and Konstantin F. Willeke and Mohammad Bashiri and Eric Wang and Zhiwei Ding and Andreas S. Tolias and Fabian H. Sinz and Alexander S. Ecker},
  year={2024},
  eprint={2305.19654},
  archivePrefix={arXiv},
  primaryClass={q-bio.NC},
  url={https://arxiv.org/abs/2305.19654}
}


Functional connectomics reveals general wiring rule in mouse visual cortex

Original source citation for this subset.

@article{ding2025functional,
  title={Functional connectomics reveals general wiring rule in mouse visual cortex},
  author={Ding, Zhuokun and Fahey, Paul G and Papadopoulos, Stelios and Wang, Eric Y and Celii, Brendan and Papadopoulos, Christos and Chang, Andersen and Kunin, Alexander B and Tran, Dat and Fu, Jiakun and others},
  journal={Nature},
  volume={640},
  number={8058},
  pages={459--469},
  year={2025},
  publisher={Nature Publishing Group UK London}
}


Functional connectomics spanning multiple areas of mouse visual cortex

Original source citation for this subset.

@article{microns2025functional,
  title={Functional connectomics spanning multiple areas of mouse visual cortex},
  author={{The MICrONS Consortium}},
  journal={Nature},
  volume={640},
  number={8058},
  pages={435--447},
  year={2025},
  publisher={Nature Publishing Group UK London}
}


A global map of orientation tuning in mouse visual cortex

Original source citation for this subset.

@article{fahey2019global,
  title={A global map of orientation tuning in mouse visual cortex},
  author={Fahey, Paul G and Muhammad, Taliah and Smith, Cameron and Froudarakis, Emmanouil and Cobos, Erick and Fu, Jiakun and Walker, Edgar Y and Yatsenko, Dimitri and Sinz, Fabian H and Reimer, Jacob and others},
  journal={bioRxiv},
  pages={745323},
  year={2019},
  publisher={Cold Spring Harbor Laboratory}
}


Functional bipartite invariance in mouse primary visual cortex receptive fields

Original source citation for this subset.

@article{ding2026functional,
  title={Functional bipartite invariance in mouse primary visual cortex receptive fields},
  author={Ding, Zhiwei and Tran, Dat and Ponder, Kayla and Ding, Zhuokun and Froebe, Rachel and Ntanavara, Lydia and Fahey, Paul G and Cobos, Erick and Baroni, Luca and Diamantaki, Maria and others},
  journal={Nature Neuroscience},
  pages={1--13},
  year={2026},
  publisher={Nature Publishing Group US New York}
}


Foundation model of neural activity predicts response to new stimulus types

Original source citation for this subset.

@article{wang2023foundation,
  title={Foundation model of neural activity predicts response to new stimulus types},
  author={Wang, Eric Y and Fahey, Paul G and Ding, Zhiwei and others},
  journal={Nature},
  year={2025},
  doi={10.1038/s41586-025-08829-y}
}