
VR Data Neuropixel supporting "Distance-tuned neurons drive specialized path integration calculations in medial entorhinal cortex"

dataset
posted on 2021-07-29, 17:36 authored by Lisa Giocomo, Malcolm G. Campbell, Alexander Attinger

During navigation, animals estimate their position using path integration and landmarks, engaging many brain areas. Whether these areas follow specialized or universal cue integration principles remains incompletely understood. We combine electrophysiology with virtual reality to quantify cue integration across thousands of neurons in three navigation-relevant areas: primary visual (V1), retrosplenial (RSC) and medial entorhinal cortex (MEC). This dataset consists of recordings from MEC, RSC or V1, made with Neuropixels probes in mice. Recordings were made while mice performed a simple virtual reality task in which they ran down a virtual hallway to receive a water reward at the end. A detailed description of the data is given below. For more details, please see the paper Distance-tuned neurons drive specialized path integration calculations in medial entorhinal cortex (https://doi.org/10.1016/j.celrep.2021.109669).

Each .mat file corresponds to data from one VR session.

File names are structured as follows:

[mouse ID]_[date]_[session description]_[session number].mat

For example:

AA50_191004_gaincontrast10_1 or npF2_1015_contrasttrack_gainchanges_2
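As a rough illustration, a file name can be split on underscores to recover these parts. This is a sketch only: the field names and the parse_session_name helper below are assumptions inferred from the examples above, not part of the dataset, and the session description itself may contain underscores (as in the second example).

# Illustrative sketch: split a session file name into its parts, assuming the
# pattern [mouse ID]_[date]_[session description]_[session number].mat.
# parse_session_name is a hypothetical helper, not provided with the dataset.
from pathlib import Path

def parse_session_name(path):
    parts = Path(path).stem.split("_")
    return {
        "mouse": parts[0],                     # e.g. "AA50" or "npF2"
        "date": parts[1],                      # e.g. "191004"
        "description": "_".join(parts[2:-1]),  # may itself contain underscores
        "session_number": parts[-1],           # e.g. "1"
        "is_mec": parts[0].startswith("np"),   # MEC recordings start with "np"
    }

print(parse_session_name("npF2_1015_contrasttrack_gainchanges_2.mat"))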

The following data are contained in the .mat files (a minimal Python loading sketch follows the field descriptions below):

sp: structure with information about spike times and cluster identity, with these subfields

-dat_path: spikeGLX file name

-n_channels_dat: number of recorded channels

-dtype: datatype (‘int16’)

-sample_rate: spikeGLX sampling rate

-st: 1 x n Spikes. Spike time of each spike in ms, synchronized to the onset of the behavioral session

-clu: 1 x n Spikes. Cluster number of each spike

-cgs: 1 x n Clusters. Assigned cluster group for each spike cluster: 0: noise, 2: good

-cids: 1 x n Clusters. Unique cluster IDs (the cluster labels appearing in clu)

-xcoords: x coordinates of recording sites

-ycoords: y coordinates of recording sites

post: VR time in seconds, sampled at 50 Hz

posx: for each VR time stamp, current position in VR (sampled at 50 Hz), typically from 0 to 400

trial: for each vr time stamp, current trial number

trial_gain: for each VR trial, applied gain value during that trial (1, 0.8, 0.7, 0.6 or 0.5)

trial_contrast: for each VR trial, contrast of landmarks during that trial

lickt: VR time stamp of individual detected licks

lickx: x position in VR for each detected lick.

Histology: only for files starting with np (MEC recordings); manual reconstruction of the probe trajectory

-MEC_entry: x/y/z coordinates of the reconstructed entry point of the probe into MEC, in um, in the Allen coordinate framework

-probe_term: x/y/z coordinates of the reconstructed terminal point of the probe, in um, in the Allen coordinate framework

anatomy: In files starting with np (MEC recordings), contains information regarding the anatomical location of each cluster from manual reconstruction. Subfields differ between MEC sessions (file names starting with np) and other sessions. For MEC recordings:

-tip_distance: for each cluster, mean distance from tip of electrode

-cluster_parent: anatomical label for each cluster

-z2: ignore

-FinalDepth: ignore

anatomy: In other files, contains information regarding anatomical location, reconstructed with the AllenCCF Matlab tool.

-cluster_parent: brain region (e.g. VISp for V1) assigned to each cluster

-cluster_region: where applicable, a more fine-grained assignment (e.g. RSPd2/3 for layer 2/3 of granular retrosplenial cortex, dorsal part)

-tip_distance: distance from probe tip of each cluster

-depth: for each cluster, distance from brain surface
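
A minimal loading sketch in Python, assuming the .mat files can be read with scipy.io.loadmat (sessions saved in MATLAB v7.3/HDF5 format would instead need h5py or mat73). The field names follow the description above; the example file name, the 5-unit spatial bin size and all variable names introduced here are illustrative only.

import numpy as np
from scipy.io import loadmat

# Load one session; struct_as_record=False exposes struct fields as attributes.
mat = loadmat("AA50_191004_gaincontrast10_1.mat",
              squeeze_me=True, struct_as_record=False)

sp = mat["sp"]        # spike times and cluster identity (see subfields above)
post = mat["post"]    # VR time in seconds, sampled at 50 Hz
posx = mat["posx"]    # VR position for each time stamp

# Keep only clusters labeled "good" (cgs == 2); cids holds the cluster labels.
good_ids = sp.cids[sp.cgs == 2]

# Per the description above, spike times (st) are in ms, synchronized to the
# onset of the behavioral session; convert to seconds to compare with post.
st_s = sp.st / 1000.0

# Firing rate as a function of VR position for one example good cluster.
cid = good_ids[0]
spike_t = st_s[sp.clu == cid]
spike_pos = np.interp(spike_t, post, posx)       # VR position at each spike
bin_edges = np.arange(0, 401, 5)                 # 5-unit bins over 0-400
occupancy_s, _ = np.histogram(posx, bins=bin_edges)
occupancy_s = occupancy_s * (1.0 / 50.0)         # samples -> seconds at 50 Hz
spike_counts, _ = np.histogram(spike_pos, bins=bin_edges)
rate_map = spike_counts / np.maximum(occupancy_s, 1e-9)

# Anatomical label per cluster (e.g. MEC, VISp, RSPd), as described above.
if "anatomy" in mat:
    region_per_cluster = mat["anatomy"].cluster_parent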

Funding

All-optical interrogation of neuronal sequences in retrosplenial cortex

Swiss National Science Foundation



Research Institution(s)

Stanford University

