Abstract
Various patterns of neural activity are observed in dynamic cortical imaging data. Such patterns may reflect how neurons communicate using the underlying circuitry to perform appropriate functions; thus it is crucial to investigate the spatiotemporal characteristics of the observed neural activity patterns. In general, however, neural activities are highly nonlinear and complex, so it is demanding to analyze them quantitatively or to classify the patterns observed in various types of imaging data. Here, we present our implementation of a novel method that successfully addresses the above issues for precise comparison and classification of neural activity patterns. Based on two-dimensional representations of the geometric structure and temporal evolution of activity patterns, our method successfully classified a number of computer-generated sample patterns created from combinations of various spatial and temporal patterns. In addition, we validated our method with voltage-sensitive dye imaging data of Alzheimer’s disease (AD) model mice. Our analysis algorithm successfully distinguished the activity data of AD mice from that of wild-type mice with significantly higher performance than previously suggested methods. Our result provides a pragmatic solution for precise analysis of spatiotemporal patterns of neural imaging data.
Introduction
Neural activities observed in optical imaging data often show a variety of spatiotemporal patterns of global synchronization1 or local propagation2,3,4,5,6,7. A number of studies have suggested that such activities exhibit spatiotemporally organized patterns8,9,10,11,12 and may reveal information about the underlying functional circuits13,14,15,16,17,18 or functional connectivity between different brain regions19,20,21,22,23. This notion might also be supported by reports that neural activity patterns can vary depending on the context of sensory stimuli, or on motor activity such as eye movement24,25,26,27,28,29,30.
However, it is technically demanding to distinguish or classify these spatiotemporal activity patterns quantitatively, due to the intrinsic complexity and nonlinearity of neural activities. In most cases, activity patterns in local cortical areas show indefinite or complicated geometric structure that also varies with time dynamically14,15,31. For example, propagating waves—one of the most frequently observed patterns—often contain highly nonlinear motion such as compression, reflection24, and interactions with other propagating waves32,33,34,35,36.
Various approaches have been suggested to address this issue15,25,37,38,39,40,41,42, but an ultimate solution to the problem has not yet been achieved. For example, one idea was that tracing the center of activity mass15,25 could successfully provide the trajectory of mean activity, but this approach could not well distinguish spatial features such as convergent or divergent patterns. Another method could analyze the spatial structure of activities by considering the maximal amplitude values at each location40,43, but could not handle the temporal change of the amplitude or the dynamic movement during activity. Other studies exploited instantaneous change of activity at each location, using an activity phase latency map, especially for detection of propagating wave-type patterns38,39. Even so, this method turned out to be useful only for simple propagating waves without any reflection or spatial oscillation. Last, in a recent study, a mathematical approach based on phase dynamics41 was introduced that could classify neural patterns into several stereotypical groups such as sinks or spirals. However, this method showed weak performance for patterns that did not fit into any of the typical groups. Therefore, a more complete and robust method of quantitative classification and labeling of complex neural activity patterns was needed.
In this work, we developed a novel method for precise classification and discrimination of spatiotemporal neural activity patterns. Our approach was to categorize neural activity patterns using two independent profiles—the geometric and dynamic profiles—that extract spatial and temporal features, respectively, of given patterns. We first tested our method for classification of computer-simulated neural activities of various types of pattern. We confirmed that our method could successfully distinguish complex forms of spatiotemporal activity patterns composed of complicated spatial and temporal changes, such as convergence and divergence. Then we tested the performance of our method for classification of voltage-sensitive dye imaging (VSDI)44 data for Alzheimer’s disease (AD) model and wild type (WT) mice (Fig. 1a,b). In this data set, simple features of the activity patterns, such as stationary or linear motion, were rarely observed (Fig. 1c), and in most cases, the patterns observed were highly complex, exhibiting divergence, separation, or merging of patterns (Fig. 1c). However, even under this condition, where differences between the patterns could hardly be detected visually, our method could successfully extract important features of the activity patterns in AD and WT samples, and was able to clearly distinguish all the patterns into two groups.
Our approach provides a solution for precise classification of complex spatiotemporal neural activity patterns. Furthermore, this method is applicable to the analysis of data from various types of optical imaging techniques for population-level neural activity, regardless of the data collection conditions such as image resolution.
Results
Spatiotemporal representation of activity patterns: Geometric and dynamic profiles
For a precise classification of various spatiotemporal patterns, we considered two major characteristics of neural activities: the topography of an activity amplitude map and its temporal evolution (Fig. 2a). From these geometric and dynamic profiles, we defined our spatiotemporal activity profile index (GeoDyn) to describe quantitatively each type of neural activity pattern.
The geometric profile represents the topographic distribution of activity amplitude at each time. In the previous approaches, the size of neural activity was defined as the area exhibiting more than a certain fixed threshold value of amplitude45,46. However, the supra-threshold area is highly dependent on the choice of threshold value. For example, for a sample activity in Fig. 2b, the area can appear consistent over time with a low threshold (Fig. 2c). However, the area noticeably varies over time with a high amplitude threshold. Thus, to describe fully the entire profile of an amplitude map, we defined the geometric profile as supra-threshold area with varying threshold (y-axis) at each time point (x-axis) (Fig. 2d). The measured area was normalized using the total area in a recorded frame for scale-free and resolution-independent representation. Such a design enables our geometric profile to contain fully the spatial information of the given patterns.
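The threshold sweep described above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code; the function name and the toy 10 × 10 frames are our own:

```python
import numpy as np

def geometric_profile(frames, thresholds):
    """Supra-threshold area (fraction of the frame) per threshold and time.

    frames: (T, H, W) array of activity amplitudes (z-score units).
    thresholds: 1-D array of amplitude thresholds (the profile's y-axis).
    Returns a (len(thresholds), T) array; rows = thresholds, columns = time.
    """
    T, H, W = frames.shape
    total_area = H * W  # normalization makes the profile resolution-independent
    profile = np.empty((len(thresholds), T))
    for i, th in enumerate(thresholds):
        # fraction of pixels whose amplitude exceeds this threshold, per frame
        profile[i] = (frames > th).sum(axis=(1, 2)) / total_area
    return profile

# Toy example: a static 2x2 blob of amplitude 2 on a 10x10 frame
frames = np.zeros((3, 10, 10))
frames[:, 4:6, 4:6] = 2.0
prof = geometric_profile(frames, thresholds=np.arange(0, 5, 0.1))
```

At low thresholds the profile reports the full blob area (0.04 of the frame); above the blob's amplitude it drops to zero, which is exactly the threshold dependence the profile is designed to capture.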
Next, to classify the dynamic features of activity patterns such as linear motion, radial convergence and divergence over time, we designed a multi-directional profile index that takes into account not only average motion but also the entire dispersion of activity (Fig. 2e–h). For example, when a sample activity pattern propagates outward and inward in the radial direction over time (Fig. 2e), our dynamic profile first calculates a velocity field between two consecutive recording frames (t, t + 1) by the optical flow method47 (Fig. 2f). When the activity propagates outward in the radial direction, all the vectors in a velocity field are pointing outward (positive), while they point inward (negative) during inward propagation. Then, the average velocity in each direction, vθ, was estimated from the weighted vector sum of the velocity field in an angular window where underlying vectors were weighted by the Gaussian function of angle difference from the direction θ (Fig. 2g). Then we varied θ from −π to π radians around the center of mass of the activity to construct a two-dimensional (2-D) profile of vθ computed at each time point (Fig. 2h). In the current example, the index values are positive during outward propagation, while they are negative during inward propagation. For scale-free and resolution-independent analysis of arbitrary images, we normalized the directional velocity as a ratio of spatial distance to the size of the image (i.e., the length of the longer side).
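A minimal sketch of the directional-velocity computation, assuming the velocity field has already been obtained (e.g., from an optical-flow estimate). We interpret the weighted vector sum as a Gaussian-weighted mean of the velocity component along each direction; the function name, the 48-direction grid, and the toy divergent field are our own illustrations:

```python
import numpy as np

def dynamic_profile(vx, vy, cx, cy, n_dirs=48, sigma=np.deg2rad(7.5)):
    """Directional velocity v_theta for one velocity field (one frame pair).

    vx, vy: (H, W) velocity components (e.g., from an optical-flow estimate).
    cx, cy: center of mass of the activity, in pixel coordinates.
    For each direction theta, pixels are weighted by a Gaussian of the
    angular distance between theta and their angle from the center of
    mass, and the weighted mean of the velocity component along theta is
    returned (positive = outward, negative = inward).
    """
    H, W = vx.shape
    ys, xs = np.mgrid[0:H, 0:W]
    pix_angle = np.arctan2(ys - cy, xs - cx)  # angle of each pixel from center
    thetas = np.linspace(-np.pi, np.pi, n_dirs, endpoint=False)
    v = np.empty(n_dirs)
    for i, th in enumerate(thetas):
        d = np.angle(np.exp(1j * (pix_angle - th)))  # wrap into [-pi, pi]
        w = np.exp(-d ** 2 / (2 * sigma ** 2))       # Gaussian angular window
        along = vx * np.cos(th) + vy * np.sin(th)    # component along theta
        v[i] = (w * along).sum() / w.sum()
    return thetas, v

# Toy example: a purely divergent (outward-propagating) field
ys, xs = np.mgrid[0:11, 0:11]
thetas, v = dynamic_profile((xs - 5).astype(float), (ys - 5).astype(float), 5, 5)
```

For the divergent toy field, vθ is positive in every direction, matching the description of outward propagation above; an inward field would flip every sign.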
GeoDyn distinguishes spatiotemporal patterns of noisy artificial neural activities
We tested whether our activity index (GeoDyn), was able to differentiate distinct geometric or dynamic features of computer-generated sample activity patterns, and was able to classify the patterns without any supervised algorithm or pre-set analysis parameters. To mimic various types of observed neural activity in experiments, we generated nine activity patterns (Fig. 3a, see Methods for details) from the combinations of various geometric features: amplitude, size, and amplitude contour change (Fig. 3a, #1–4), and dynamic features: ring-shaped propagation, linear, zigzag, dividing, and dispersing motions (Fig. 3a, #5–9). Then we examined how GeoDyn described these patterns differently.
We observed that geometric and dynamic profiles of each sample pattern were clearly distinguished by their spatiotemporal features (Fig. 3b). Our geometric profile could capture slight differences in activity contour between samples, and the dynamic profile could describe various types of difference between sample activity patterns. For example, the dynamic profile of linear motion activity (Fig. 3a, #6) was clearly distinguished from that of a zigzag motion (Fig. 3a, #7). In our 2-D dynamic profile index, the maxima on the y-axis (angle) at each time step represent the direction of the dominant motion or maximum directional velocity. The trajectory of such a maximum point represents the dominant characteristics of a given motion. For example, a straight line describes linear motion and a zigzag shape indicates that the direction of motion switches periodically. Furthermore, bifurcation of the line in sample #8 shows that the activity pattern is divided into two parts, and that the width of the maxima points on the y-axis represents the degree of dispersion of the activity (as shown in #6 and #9). Therefore, these geometric and dynamic profiles contain complete information about the given spatiotemporal patterns in each sample.
Considering these two profiles together allows us to discriminate better the activity patterns of a similar structure. For example, the sample patterns #1, #6, and #7 have exactly the same geometric profiles (Fig. 3a, #1, #6, and #7) because their supra-threshold activities are the same. However, their dynamic profiles are noticeably different in the profile pattern; thus readily separable. Even when the shapes of both geometric and dynamic profiles are similar, GeoDyn can distinguish two patterns from the difference between the amplitudes of the profiles.
Next, to confirm that the GeoDyn was applicable to experimental imaging in which the data on the observed activities was noisy, we tested to see if the method also worked for samples with significant amounts of noise added. For this, we added a Gaussian noise to the nine patterns in Fig. 3a (see Methods for details). Nine hundred sample activities were generated by adding independently generated background noises to the nine template patterns. Then, to determine if these samples of various profiles could be classified into nine groups of source patterns without any supervised algorithm or pre-set parameters, we tried a clustering of activity patterns of similar geometric and dynamic profiles by estimating similarity between the GeoDyn profiles of each sample (Fig. 3c,d, see Methods for details). The optimal hierarchical clustering method48 (see Methods and Supplementary Fig. S1) was applied for the classification of each sample. As a result, by simply comparing the geometric and dynamic profiles, we were able to classify successfully 900 noisy activity patterns into nine groups of distinct spatiotemporal characteristics (Fig. 3e). It is noteworthy that the number of clusters (N = 9) in the final result was achieved from simple clustering analysis, not given as an analysis parameter. This shows that our GeoDyn method did well at extracting the underlying principal components of the spatiotemporal patterns in a given activity data set.
In addition, to test the performance of our method for the classification of slightly different activities under noisy conditions, we generated very noisy activity patterns that were hardly distinguishable by visual inspection (Supplementary Fig. S2, Supplementary Video 1–6). In this case, the differences of spatial/temporal parameters such as size, amplitude, and speed between the patterns were set smaller than the level of background noise (i.e., the sigma of the Gaussian). From the classification test of six patterns, we found that our GeoDyn method could successfully distinguish these slight differences in both geometric and dynamic patterns under very noisy conditions (Supplementary Fig. S2). Moreover, each profile clearly showed in which parameter the two patterns were different. For example, differences between the profiles of the Geo index in the bottom area suggest that the two patterns differed in the size of activity. This result shows that our GeoDyn method could successfully distinguish the activities of various samples by precisely comparing the features of their spatiotemporal patterns (Supplementary Fig. S2).
Discrimination of activity patterns from VSDI recordings of Alzheimer’s disease and wild type mice
Next, we tested to see if our GeoDyn method could distinguish Alzheimer’s disease (AD) and wild type (WT) mice, using only the difference in the activity patterns from real imaging data. For each type of mouse, spontaneous activity of the right hemisphere was recorded using voltage-sensitive dye imaging (VSDI). For this, 769 samples of activity data were collected from AD mice and 622 samples from WT mice (see Methods for details). To compare the same number of samples between the two types, we randomly selected 600 activity samples for each mouse type. First, we found that the most frequently observed patterns in AD and WT mice appear to have different spatiotemporal characteristics (Fig. 4a). Moreover, the geometric and dynamic profiles of AD and WT activity showed noticeable differences. The geometric profile of WT type activities appeared to have higher amplitude and wider shape than did that of AD type activities. In addition, the WT dynamic profile exhibited higher average velocities than did that of AD mice. Applying the same clustering method for estimated GeoDyn profiles as in Fig. 3c–e, these 1200 activity samples were first classified into 67 pattern groups of distinctive spatiotemporal characteristics (Fig. 4b and Supplementary Fig. S3).
Then, to classify these 67 patterns as belonging to either AD or WT group (Fig. 4b), we used a support vector machine (SVM) linear classifier with the “Leave-One-Out” cross validation algorithm. Specifically, data from a single mouse were chosen as a test set and data from all the other mice were used as training sets, which was repeated for all possible combinations of test and training sets. In this way, the 1200 sample activities were divided into training and test sets. To test the discrimination power of our method, we varied the number of training sets from 80 to 720 samples, while the number of test sets was fixed (n = 240). Then, the SVM linear classifier was trained with the training set for optimal classification of AD and WT type samples. Then, using the trained SVM classifier parameters, the test set samples were classified as AD or WT from their geometric and dynamic profiles (Fig. 4c).
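The train/predict step above can be illustrated with a linear SVM on flattened profile vectors. This is a toy sketch using scikit-learn, not the authors' pipeline; the synthetic "profiles", group sizes, and seed are our own, and the real method additionally uses the Leave-One-Out scheme over mice described above:

```python
import numpy as np
from sklearn.svm import SVC

def classify_profiles(train_X, train_y, test_X):
    """Train a linear SVM on flattened profiles and label the test samples.

    Each row of train_X / test_X is a flattened concatenation of one
    sample's geometric and dynamic profiles; train_y holds group labels
    (here 0 = WT, 1 = AD).
    """
    clf = SVC(kernel="linear")
    clf.fit(train_X, train_y)
    return clf.predict(test_X)

# Toy example with two well-separated synthetic "profile" distributions
rng = np.random.default_rng(0)
train_X = np.vstack([rng.normal(0.0, 0.1, (20, 8)), rng.normal(1.0, 0.1, (20, 8))])
train_y = np.array([0] * 20 + [1] * 20)
test_X = np.vstack([rng.normal(0.0, 0.1, (5, 8)), rng.normal(1.0, 0.1, (5, 8))])
pred = classify_profiles(train_X, train_y, test_X)
```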
We estimated the average performance of the SVM classifier by repeating this classification test with different training and test sets (N = 100). In addition, we compared performance of our method with two other methods previously suggested: the maximum amplitude map (MAM)40 and phase latency map (PLM)39 methods. The MAM captures the geometric features of activity patterns by plotting the maximal amplitude of each pixel over time, and the PLM represents the dynamic features by plotting the phase latency of each pixel (Supplementary Fig. S6). As a result, the average correct ratio of our GeoDyn using 720 training sets was estimated to be 88.04 ± 1.4% (mean ± standard error). This is significantly higher than that of MAM and PLM regardless of the number of training sets used, and even higher than that of the combined MAM + PLM method designed to capture both geometric and dynamic features (Fig. 4d, *p < 4.883 × 10⁻⁴, Wilcoxon signed-rank test). In addition, we also examined the classification performance of each method when we used only a single profile of the activities: the geometric profile, dynamic profile, MAM, or PLM. Even when the geometric and dynamic profiles were used separately, they showed higher performance than MAM and PLM, respectively, in the classification of data by geometric and temporal features. This result explains how our GeoDyn method could perform better than the MAM, the PLM, and the combined MAM + PLM in the classification of WT and AD activities (Fig. 4e, Geo vs. MAM: *p < 0.004, Dyn vs. PLM: *p < 0.011, Wilcoxon signed-rank test).
In the current analysis, we used data from WT and AD mice of comparable ages (N = 7, 4 WT and 3 AD, from 22 to 26 months); however, the activity pattern might vary across individual animals as well as by mouse types. To determine if there is a significant contribution from individual variations, we also performed analysis across individual animals within the same group (Supplementary Fig. S4). From the discrimination test between two mice within each group, our results showed that two mice within the same group were also distinguishable using the GeoDyn indices (both WT vs. WT and AD vs. AD), but its discrimination performance was significantly lower than that between the WT and AD mice (Supplementary Fig. S4a, *p < 8.882 × 10⁻¹⁶, Mann-Whitney U-test). Thus, these results suggest that our method can distinguish activity patterns at the level of differentiation of individual animals. In the current data set, however, activity patterns were more readily distinguishable by mouse types than by individual variation.
Discussion
For this report, we developed a novel method for the classification of various neural activity patterns observed in dynamic imaging data. Our main idea is that both the geometric structure of activity at each time and the dynamic changes of it should be considered together to achieve better classification results. For this, our GeoDyn method defines the geometric and dynamic profiles of each sample, so that different imaging samples can be compared and segregated based on the similarity of their spatiotemporal patterns. In the first test of the method using computer-generated spatiotemporal patterns (Fig. 3), we confirmed that activity patterns of different spatial and/or temporal dynamics appeared noticeably different in our GeoDyn profiles, and thus were readily separable. This result shows that our GeoDyn method could successfully represent distinctive components of spatiotemporal activity patterns.
Next, in the second test using VSDI data from AD and WT mice, we showed how our GeoDyn method could be applied to the classification of real data. When the GeoDyn was estimated from 1200 imaging samples from the AD and WT data, the observed profiles were classified into 67 groups of distinctive spatiotemporal patterns. This means that neural activity in the resting state can vary greatly, even within the same type of AD or WT mice, which makes it hard to distinguish AD and WT samples from their activity pattern analysis. However, using our GeoDyn method, similar spatial (temporal) patterns have similar geometric (dynamic) profiles, so that the 67 profile patterns could be re-grouped as AD or WT according to their similarity (Fig. 4b).
From the inspection of profile indices from AD and WT samples (Fig. 4a,b), we found that the geometric profiles contrasted more between AD and WT samples than their dynamic profiles did. This suggests that, in this set of data, geometric features of the activity patterns might be more informative for the classification of samples. This was also supported by the classification results from other methods (MAM and PLM) that used different components of geometric and dynamic information (Fig. 4d,e and Supplementary Fig. S5). The MAM method mostly measures geometric distribution of patterns, and showed higher classification performance than did the PLM method, which measures the dynamic propagation of patterns. On the other hand, our GeoDyn method considers both geometric and dynamic profile information and showed a significantly higher correct ratio than did the other two methods.
To see whether applying both the MAM and PLM could enhance performance, we performed the classification using the combined MAM + PLM method. The performance of the combined method was still lower than that of the GeoDyn and similar to that of the MAM only. This means that the PLM could not contribute much to the classification of WT and AD (Supplementary Fig. S9). This is also consistent with the result that the geometric profiles differed more between AD and WT samples than did their dynamic profiles. This result suggests that, first, the GeoDyn method can make better use of spatiotemporal activity information than the previously suggested methods. Second, although AD and WT mice seem to generate various types of spatiotemporal profiles, the activity patterns within each group share some geometric and dynamic features, which can be used to distinguish the activity patterns across the two groups.
We suggest that our GeoDyn can contribute significantly to the study of the relationship between the variation of neural activity and the changes in underlying neural circuitry. Previously, there have been a number of studies observing various patterns of neural activities in the brain, and emphasizing their importance in studying neural networks and the connectivity of neural circuitries13,14,22,49. In most cases, the observed activity patterns were considered to reflect some important biological meanings, such as information about local neural circuit or long-range network interaction between different brain regions. For example, one study suggested that the activity propagation measured from VSDI was highly correlated with axonal projection in the mouse brain15. From the current study, we propose that such biological meaning of neural activity patterns might be effectively examined using GeoDyn profile analysis. As an example, we might ask a sample question such as “Is there any sign of structural difference in the circuitry of WT and AD type mouse brain?” From the profiles of the average sample-activity of the WT and AD mice (Supplementary Fig. S5a,b), it was clear that the differences between the two dynamic profiles of WT and AD were not significant in their shape. However, their amplitudes were noticeably different (Supplementary Fig. S5c,d). Our analysis also indicated that the activities from the two types of mice differed in their dispersion, but that their propagation speed appeared to be similar. If the propagation patterns were correlated with neural connectivity in the network, we could speculate that the functional connectivity might be modulated in AD mice by some factor. In addition, the difference between the geometric profiles could imply that their neural firing rate or excitability might be different locally.
Although the GeoDyn method could help to examine candidates for the biological basis of activity variation, how to relate arbitrary patterns of spatiotemporal features to a direct biological basis remains elusive. Further study of well-designed activity observations with anatomical circuitry analysis is necessary to validate the assumption of a biological basis for the different patterns in WT and AD data. The current study was focused on the development of an analysis algorithm, so we leave this issue for future study.
Our result also shows that the GeoDyn method can successfully classify neural activity patterns even in the resting state, without any control of sensory input stimuli. In previous studies of VSDI data39, neural activities were measured under highly specific conditions of visual stimuli. In such conditions, neural activities show very limited variation in their spatiotemporal profiles. This may be helpful for simplicity in neural activity pattern analysis, but cannot fully explain the various features of neural activity observed under conditions that are more realistic. Here, we showed that our GeoDyn method could readily classify various activity patterns achieved with no input control. Thus, our analysis method appears to be applicable to the classification of any imaging data from realistic conditions. In general, imaging data from different conditions or methods have different spatiotemporal resolution or regions of interest (ROI). This issue can readily be addressed by our GeoDyn method because it extracts a common feature from spatiotemporal patterns regardless of the size or resolution of the imaging data. This enables us to apply our method to imaging data at different spatial and temporal scales. Therefore, our method might be a strong tool for the comparison of neural imaging data from different conditions, different brain regions, or even from different species.
Last, classification of neural activity patterns might also be performed using various types of machine learning techniques. Our GeoDyn method can be used not only as an independent pattern classifier, but also as a pre-processor of raw data to be applied in machine-learning analysis. Raw image data from different experimental environments and subject conditions have various scale and dimension parameters (e.g., brain shape and size), but our GeoDyn gains an advantage by converting raw data at various scales into a scale-free normalized profile. Such pre-processing makes it easier to apply the machine learning techniques used to enhance the classification performance. In addition, GeoDyn was designed to provide information about spatiotemporal patterns by parametrizing geometric and dynamic features of activity (such as speed or size). This means it could help with post-analysis of the classification results from machine learning, such as finding pattern motifs or a biological basis. While machine-learning algorithms can also classify various activity patterns well, they require additional analytical processes to extract physical or biological meanings from the results achieved. Our GeoDyn method can make processes involving machine-learning techniques much simpler.
Methods
Animals
Four WT (C57BL/6) mice (all 22 months old) and three transgenic AD model (APPSWE/PS1ΔE9) mice (22, 24 and 26 months old, respectively) were used for the experiments. They were co-housed in air-conditioned cages under a 12:12 hour light:dark cycle. Free access to UV sterilized water and food was given. All animal experiments were performed in accordance with the guidelines and policies for rodent experimentation provided by the KAIST Institutional Animal Care and Use Committee (IACUC). The protocol used was approved by the IACUC of KAIST (IACUC-14-134).
Surgical Procedure
For anesthesia induction, 3% isoflurane was used and 1–1.5% isoflurane with 100% oxygen was used during surgery. During data collection, 0.5–0.75% isoflurane was used. Head plates were custom designed and fixed to the skull with dental cement (Bosworth Trim II, Keystone Industries, Gibbstown, NJ, U.S.A.) and cyanoacrylate glue (Loctite 401, Henkel, Düsseldorf, Germany). The head plate was then fastened to a metal frame. A large portion of the right hemisphere of the cortex was exposed with a 7 × 6 mm unilateral craniotomy following previous literature. The dura mater was removed with a Vannas micro-scissor (FST, Vancouver, Canada) and fine forceps (FST, Vancouver, Canada). Extreme care was taken to prevent damage to the cortex. A heating pad and a feedback rectal probe (TCAT-2LV, Physitemp instruments, Clifton, NJ, USA) were used to maintain the body temperature at 37 °C throughout the experiment.
Voltage-Sensitive Dye Imaging (VSDI)
Artificial cerebrospinal fluid (aCSF) of pH 7.4 and temperature of 37 °C was used to prepare RH1692 dye (Optical Imaging, Rehovot, Israel) solution (1 mg/ml). Bath application of the dye to the exposed cortex was performed for 60–90 min. The unbound dye was washed away by loading the cortex with aCSF for 30 min after the bath application. Then, 1.5–2% agarose was used to cover the cortex to minimize pulsation and movement artifacts. A 12 mm coverslip was put onto the agarose layer before agarose cooling. The coverslip was then fixed to the head plate with cyanoacrylate glue. A 100 W halogen lamp focused at 400 µm from the cortical surface was used for excitation. The excitation light was filtered with a filter centered at 632 nm (FF02-632/22-25, Semrock, NY, USA) and was reflected onto the cortex using a light guide. The signals coming back from the cortex were collected using a tandem lens macroscope with a long-pass emission filter at 675 nm (84–753, Edmund Optics, Barrington, NJ, USA). The macroscope was connected to a CCD (MV1-D1312-160-CL-12, PhotonFocus, Lachen, Switzerland) that recorded the signals at 150 Hz using a CELOX imaging system and VDAQ software (Optical Imaging, Rehovot, Israel). The data had a spatial resolution of 62.5 μm per pixel.
Pre-processing VSDI images
We collected 370 imaging epochs of 20 s each from the two mice groups (n = 267 from WT and 103 from AD mice, respectively), which were aligned in a 2-D reference space to match the anterior-posterior axis of the brain using a custom built MATLAB function (Fig. 1b). To select a net brain region of interest (ROI) observed in all mice, a mask pattern was applied to achieve the ROI for analysis of all samples. Edge pixels of low signal-to-noise ratio were removed. The amplitudes of activity were z-scored by the mean and standard deviation of the whole activity. To eliminate high-frequency noise signals, each epoch was filtered using a zero-phase band-pass filter at 0.1–6 Hz. To specify significant activity patterns only, the average activity value over time was calculated in each epoch and the intervals in which the amplitude exceeded the threshold (a z-score of 0.5) were extracted. In addition, activities that did not show any spatial patterns were excluded.
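The z-scoring, zero-phase band-pass filtering, and supra-threshold interval extraction above can be sketched as follows; this is an illustrative SciPy version (the original used MATLAB), and the function name, filter order, and toy noise epoch are our own assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_epoch(epoch, fs=150.0, band=(0.1, 6.0), z_thresh=0.5):
    """Z-score, zero-phase band-pass, and flag supra-threshold frames.

    epoch: (T, H, W) raw VSDI frames sampled at fs Hz. Returns the
    filtered epoch and a boolean mask over time marking frames whose
    spatial-mean amplitude exceeds z_thresh (candidate activity intervals).
    """
    z = (epoch - epoch.mean()) / epoch.std()   # z-score over the whole epoch
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, z, axis=0)       # zero-phase filtering in time
    mean_trace = filtered.mean(axis=(1, 2))    # average activity over space
    return filtered, mean_trace > z_thresh

# Toy example: 4 s of 4x4 noise frames at the 150 Hz recording rate
filtered, active = preprocess_epoch(np.random.default_rng(2).normal(size=(600, 4, 4)))
```

`filtfilt` applies the Butterworth filter forward and backward, which gives the zero-phase behavior the text specifies (no temporal shift of activity onsets).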
Generation of simulated activity pattern
Nine sample activity-patterns were generated from computer simulations using the 2-D Gaussian kernel (Eq. 1) or a truncated cone function (Eq. 2) as a cone with an apex cut off, to mimic the observed activity patterns in the imaging data:
where u(x, y, t) is the activity amplitude at spatial position x and y at time point t, A(t) is the peak amplitude at (x0(t), y0(t)), σA(t) represents the standard deviation of the Gaussian kernel, and r1(t) and r2(t) are the top and bottom radii of the truncated cone function. All samples were designed with a size of 101 × 101 pixels and 50 time frames. The equations of each activity pattern (with detailed parameters) are listed in Table 1.
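The bodies of Eqs. (1) and (2) did not survive into this text. A plausible reconstruction from the symbol definitions above (a standard 2-D Gaussian kernel, and a cone interpolated linearly between its top and bottom radii) is the following; the authors' exact parametrization may differ:

```latex
% Eq. (1), 2-D Gaussian kernel (reconstructed):
u(x, y, t) = A(t)\,\exp\!\left(-\frac{(x - x_0(t))^2 + (y - y_0(t))^2}{2\,\sigma_A(t)^2}\right)

% Eq. (2), truncated cone with top radius r_1(t) and bottom radius r_2(t)
% (reconstructed), where \rho = \sqrt{(x - x_0(t))^2 + (y - y_0(t))^2}:
u(x, y, t) =
  \begin{cases}
    A(t), & \rho \le r_1(t) \\[4pt]
    A(t)\,\dfrac{r_2(t) - \rho}{r_2(t) - r_1(t)}, & r_1(t) < \rho \le r_2(t) \\[4pt]
    0, & \rho > r_2(t)
  \end{cases}
```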
To design a noisy version of the activity patterns, we added background Gaussian noise to the activity. The mean and standard deviation were set to zero and one, respectively. The amount of noise added was set from visual inspection so that the similar-but-different activity patterns (e.g., Supplementary Fig. S2, #2 and #3; #5 and #6) were visually indistinguishable (Supplementary Videos 1–6).
Analysis of Geometric and Dynamic Features of Neural Activity
Design of the Geometric and Dynamic profiles (GeoDyn)
For the geometric profile, the threshold for calculating the supra-threshold area of activity was varied from 0 to 5 in z-score units, in steps of 0.1. For the dynamic profile, the velocity field was calculated from each pair of consecutive frames using the combined local-global optical flow algorithm47. The MATLAB (Mathworks, USA) toolbox for the optical flow method was adapted from an open-source implementation50. The directional velocity, vθ, was calculated as the weighted sum of the velocity field within a given angular window (Fig. 2h). The angular window, f(ϕ), was designed as a Gaussian kernel over the polar angle ϕ (Eq. 3):

$$f(\phi )=\exp \left(-\frac{{(\phi -\theta )}^{2}}{2{\sigma }_{\theta }^{2}}\right)$$
(3)

where θ is the center of the window, which rotates from 0° to 360° in 7.5° intervals, and the standard deviation of the Gaussian kernel, \({\sigma }_{\theta }\), was set to 7.5°.
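The angular-window weighting can be sketched as follows, assuming the velocity field (vx, vy) has already been estimated by the optical flow step47. This Python stand-in weights each pixel's speed by the Gaussian angular proximity of its direction to the window center θ; the exact weighting in the original MATLAB code may differ, and the function names are ours.

```python
import numpy as np

def directional_velocity(vx, vy, theta_deg, sigma_deg=7.5):
    """Weighted sum of the velocity field within a Gaussian angular
    window (Eq. 3) centered at theta_deg."""
    speed = np.hypot(vx, vy)
    phi = np.degrees(np.arctan2(vy, vx))           # direction of each vector
    # circular angular distance from each direction to the window center
    dphi = (phi - theta_deg + 180.0) % 360.0 - 180.0
    w = np.exp(-dphi ** 2 / (2 * sigma_deg ** 2))  # Gaussian window f(phi)
    return float(np.sum(w * speed))

def dynamic_profile(vx, vy, step_deg=7.5):
    """v_theta for window centers rotating from 0 to 360 deg in 7.5-deg steps."""
    thetas = np.arange(0.0, 360.0, step_deg)
    return thetas, np.array([directional_velocity(vx, vy, t) for t in thetas])
```

For a purely rightward flow field, the profile peaks at θ = 0° and falls off with the 7.5° Gaussian width, as expected.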
Similarity measurement by geometric or dynamic profiles between different samples
To estimate the similarity between geometric (or dynamic) profiles of different samples, we first calculated the squared error between the profiles. The maximum error value over all pairs of sample profiles was normalized to 1, and similarity was defined as 1 minus the normalized squared error. To account for possible temporal phase differences between two activity patterns, we measured the similarity while shifting one profile along the time axis, with the shift limited to less than a quarter of the duration of the shorter activity pattern; the similarity between the two profiles was then taken as the maximum over all shifts.
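The shift-and-compare similarity described above can be sketched as follows (a Python stand-in for the original MATLAB implementation; `max_error`, the normalizing constant, would be precomputed over all sample pairs):

```python
import numpy as np

def shifted_sq_error(p1, p2, s):
    """Squared error between profiles p1 and p2 with p2 shifted by s frames,
    computed over their overlapping region."""
    if s >= 0:
        a, b = p1[s:], p2
    else:
        a, b = p1, p2[-s:]
    n = min(len(a), len(b))
    return float(np.sum((a[:n] - b[:n]) ** 2))

def profile_similarity(p1, p2, max_error):
    """Similarity = 1 - normalized squared error, maximized over time shifts
    up to a quarter of the shorter profile's duration."""
    max_shift = min(len(p1), len(p2)) // 4
    errs = [shifted_sq_error(p1, p2, s) for s in range(-max_shift, max_shift + 1)]
    # normalize by the maximum error over all sample pairs, then invert
    return 1.0 - min(errs) / max_error
```

For example, a profile compared with a one-frame-delayed copy of itself still reaches the maximal similarity of 1, because the shift search absorbs the phase offset.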
Hierarchical clustering of pattern groups
A schematic of hierarchical clustering is shown in Supplementary Fig. S1. The dendrogram (Supplementary Fig. S1, right) shows the procedure of hierarchical clustering used to group the samples based on similarity between samples (Supplementary Fig. S1, left). The vertical axis represents the sample objects and the horizontal axis indicates the similarity between the cluster nodes (black circles). The horizontal location of each cluster node indicates the averaged similarity between samples of two leaf cluster nodes. The cluster nodes were merged hierarchically by their similarity: the left ends of cluster nodes represent each single sample, while the right ends represent all the samples in one cluster group. Finally, by varying the similarity cutoff, we could optimize the number of clusters that maximized similarity within each group, but minimized it across the groups. This process was performed using the “linkage” function in MATLAB, where the cutoff value was set to minimize the cluster validation index, i.e. the cluster balance48 (Eq. 4).
Equation 4 defines the cluster balance when all the samples are grouped into n clusters, \([{C}_{1},{C}_{2},\ldots ,{C}_{n}]\). Here, \(|{C}_{i}|\) is the number of samples in \({C}_{i}\), and \({d}_{ij}\) denotes one minus the similarity between samples i and j. After optimal clustering, the clusters containing a single sample were considered as one group for simplicity.
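The clustering procedure can be sketched with SciPy in place of MATLAB's "linkage". Since Eq. 4 of the paper is not reproduced here, we substitute a simple within-minus-between distance index as a stand-in validation index; the linkage method ("average") and the cutoff grid are also our assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def within_between_balance(dist, labels):
    """Stand-in validation index: mean within-cluster distance minus mean
    between-cluster distance (lower is better)."""
    same = labels[:, None] == labels[None, :]
    off = ~np.eye(len(labels), dtype=bool)
    within = dist[same & off]
    between = dist[~same]
    w = within.mean() if within.size else 0.0
    b = between.mean() if between.size else 0.0
    return w - b

def optimal_clusters(similarity, index_fn):
    """Build the dendrogram from d_ij = 1 - similarity, sweep the cutoff,
    and keep the labeling that minimizes the validation index."""
    dist = 1.0 - similarity
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    best = None
    for cutoff in np.linspace(0.05, 1.0, 20):
        labels = fcluster(Z, t=cutoff, criterion="distance")
        score = index_fn(dist, labels)
        if best is None or score < best[0]:
            best = (score, labels)
    return best[1]
```

On a similarity matrix with two clear blocks, the sweep recovers exactly two clusters, since any finer or coarser cutoff worsens the within/between trade-off.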
Support Vector Machine (SVM) linear classifier
To train the support vector machine (SVM), the clustering result of the training set, including the geometric and dynamic cluster indices of each sample, was provided as input. We used the "fitcsvm" function in MATLAB to train the SVM classifier, and the type of each test-set sample was then predicted with the "predict" function. To validate the classification, we applied standard leave-one-out cross-validation: an individual mouse was chosen as the test set and all the other mice were used as the training set. We repeated this sampling 100 times for each of the 12 cases (the 12 possible WT–AD pairs from 4 WT and 3 AD mice). Before training the SVM classifier on real data, the geometric and dynamic cluster indices were sorted in ascending order of \((\frac{{\rm{number}}\,{\rm{of}}\,{\rm{AD}}\,{\rm{samples}}\,{\rm{in}}\,{\rm{cluster}}}{{\rm{number}}\,{\rm{of}}\,{\rm{total}}\,{\rm{samples}}\,{\rm{in}}\,{\rm{cluster}}})\), to perform the classification correctly.
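The leave-one-mouse-out scheme can be sketched with scikit-learn's linear SVM standing in for MATLAB's "fitcsvm"/"predict" (function names and the toy feature layout below are ours; the paper's features are the sorted geometric and dynamic cluster indices):

```python
import numpy as np
from sklearn.svm import SVC

def leave_one_mouse_out(features, labels, mouse_ids):
    """Hold out every sample from one mouse, train a linear SVM on the
    remaining mice, predict the held-out samples, and return accuracy."""
    correct, total = 0, 0
    for m in np.unique(mouse_ids):
        test = mouse_ids == m
        clf = SVC(kernel="linear")
        clf.fit(features[~test], labels[~test])  # fitcsvm equivalent
        pred = clf.predict(features[test])       # predict equivalent
        correct += int(np.sum(pred == labels[test]))
        total += int(test.sum())
    return correct / total
```

Holding out whole mice, rather than individual epochs, prevents samples from the same animal from leaking between training and test sets.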
References
Leopold, D. A., Murayama, Y. & Logothetis, N. K. Very slow activity fluctuations in monkey visual cortex: Implications for functional brain imaging. Cereb. Cortex 13, 422–433 (2003).
London, J. A., Cohen, L. B. & Wu, J. Y. Optical recordings of the cortical response to whisker stimulation before and after the addition of an epileptogenic agent. J. Neurosci. 9, 2182–2190 (1989).
Prechtl, J. C., Cohen, L. B., Pesaran, B., Mitra, P. P. & Kleinfeld, D. Visual stimuli induce waves of electrical activity in turtle cortex. Proc. Natl. Acad. Sci. USA 94, 7621–7626 (1997).
Prechtl, J. C., Bullock, T. H. & Kleinfeld, D. Direct evidence for local oscillatory current sources and intracortical phase gradients in turtle visual cortex. Proc. Natl. Acad. Sci. USA 97, 877–882 (2000).
Ahissar, E. et al. Dependence of cortical plasticity on correlated activity of single neurons and on behavioral context. Science 257, 1412–5 (1992).
Huang, X. et al. Spiral Wave Dynamics in Neocortex. Neuron 68, 978–990 (2010).
Song, W. J. et al. Cortical intrinsic circuits can support activity propagation through an isofrequency strip of the guinea pig primary auditory cortex. Cereb. Cortex 16, 718–729 (2006).
Mitra, A., Snyder, A. Z., Blazey, T. & Raichle, M. E. Correction for Mitra et al., Lag threads organize the brain’s intrinsic activity. Proc. Natl. Acad. Sci. USA 112, E7307 (2015).
Fiser, J., Chiu, C. & Weliky, M. Small modulation of ongoing cortical dynamics by sensory input during natural vision. Nature 431, 573–578 (2004).
Kenet, T., Bibitchkov, D., Tsodyks, M., Grinvald, A. & Arieli, A. Spontaneously emerging cortical representations of visual attributes. Nature 425, 954–956 (2003).
Shmiel, T. et al. Neurons of the cerebral cortex exhibit precise interspike timing in correspondence to behavior. Proc Natl Acad Sci 102, 18655–18657 (2005).
Ikegaya, Y., Aaron, G. & Cossart, R. Synfire Chains and Cortical Songs: Temporal Modules of Cortical Activity. Science 304, 559–564 (2004).
Benucci, A., Frazor, R. A. & Carandini, M. Standing Waves and Traveling Waves Distinguish Two Circuits in Visual Cortex. Neuron 55, 103–117 (2007).
Mohajerani, M. H., Mcvea, D. A., Fingas, M. & Murphy, T. H. Mirrored Bilateral Slow-Wave Cortical Activity within Local Circuits Revealed by Fast Bihemispheric Voltage-Sensitive Dye Imaging in Anesthetized and Awake Mice. J. Neurosci. 30, 3745–3751 (2010).
Mohajerani, M. H. et al. Spontaneous cortical activity alternates between motifs defined by regional axonal projections. Nat Neurosci 16, 1426–1435 (2013).
Katz, L. C. & Shatz, C. J. Synaptic activity and the construction of cortical circuits. Science 274, 1133–1138 (1996).
Weliky, M. Correlated Neuronal Activity and Visual Cortical Development. Neuron 27, 427–430 (2000).
An, S., Choi, W. & Paik, S.-B. Development of a computational model on the neural activity patterns of a visual working memory in a hierarchical feedforward Network. J. Korean Phys. Soc. 67, 1713–1718 (2015).
Tsodyks, M. Linking Spontaneous Activity of Single Cortical Neurons and the Underlying Functional Architecture. Science 286, 1943–1946 (1999).
Vanni, M. P. & Murphy, T. H. Mesoscale Transcranial Spontaneous Activity Mapping in GCaMP3 Transgenic Mice Reveals Extensive Reciprocal Connections between Areas of Somatomotor Cortex. J. Neurosci. 34, 15931–15946 (2014).
McVea, D. A., Mohajerani, M. H. & Murphy, T. H. Voltage-Sensitive Dye Imaging Reveals Dynamic Spatiotemporal Properties of Cortical Activity after Spontaneous Muscle Twitches in the Newborn Rat. J. Neurosci. 32, 10982–10994 (2012).
White, B. R. et al. Imaging of functional connectivity in the mouse brain. PLoS One 6 (2011).
Kang, M. et al. Momentary level of slow default mode network activity is associated with distinct propagation and connectivity patterns in the anesthetized mouse cortex. J. Neurophysiol, https://doi.org/10.1152/jn.00163.2017 (2017).
Xu, W., Huang, X., Takagaki, K. & Wu, J. Y. Compression and Reflection of Visually Evoked Cortical Waves. Neuron 55, 119–129 (2007).
Han, F., Caporale, N. & Dan, Y. Reverberation of Recent Visual Experience in Spontaneous Cortical Waves. Neuron 60, 321–327 (2008).
Zanos, T. P., Mineault, P. J., Nasiotis, K. T., Guitton, D. & Pack, C. C. A Sensorimotor Role for Traveling Waves in Primate Visual Cortex. Neuron 85, 615–627 (2015).
Ferezou, I. et al. Spatiotemporal Dynamics of Cortical Sensorimotor Integration in Behaving Mice. Neuron 56, 907–923 (2007).
Destexhe, A. & Contreras, D. Neuronal Computations with Stochastic Network States. Science 314, 85–90 (2006).
Sit, Y. F., Chen, Y., Geisler, W. S., Miikkulainen, R. & Seidemann, E. Complex Dynamics of V1 Population Responses Explained by a Simple Gain-Control Model. Neuron 64, 943–956 (2009).
Ayzenshtat, I. et al. Precise spatiotemporal patterns among visual cortical areas and their relation to visual stimulus processing. J Neurosci 30, 11232–11245 (2010).
Spors, H. & Grinvald, A. Spatio-temporal dynamics of odor representations in the mammalian olfactory bulb. Neuron 34, 301–315 (2002).
Gao, X. et al. Interactions between two propagating waves in rat visual cortex. Neuroscience 216, 57–69 (2012).
Petersen, C. C. H., Hahn, T. T. G., Mehta, M., Grinvald, A. & Sakmann, B. Interaction of sensory responses with spontaneous depolarization in layer 2/3 barrel cortex. Proc. Natl. Acad. Sci. USA 100, 13638–43 (2003).
Civillico, E. F. Integration of Evoked Responses in Supragranular Cortex Studied With Optical Recordings In Vivo. J. Neurophysiol. 96, 336–351 (2006).
Huang, X. Spiral Waves in Disinhibited Mammalian Neocortex. J. Neurosci. 24, 9897–9902 (2004).
Schiff, S. J., Huang, X. & Wu, J. Y. Dynamical evolution of spatiotemporal patterns in mammalian middle cortex. Phys. Rev. Lett. 98, 178102 (2007).
Gabriel, A. & Eckhorn, R. A multi-channel correlation method detects traveling γ-waves in monkey visual cortex. J. Neurosci. Methods 131, 171–184 (2003).
Fehérvári, T. D., Okazaki, Y., Sawai, H. & Yagi, T. In vivo voltage-sensitive dye study of lateral spreading of cortical activity in mouse primary visual cortex induced by a current impulse. PLoS One 10 (2015).
Muller, L., Reynaud, A., Chavane, F. & Destexhe, A. The stimulus-evoked population response in visual cortex of awake monkey is a propagating wave. Nat. Commun. 5, 3675 (2014).
Weigel, S. & Luksch, H. Spatiotemporal analysis of electrically evoked activity in the chicken optic tectum: a VSDI study. J. Neurophysiol. 107, 640–8 (2012).
Townsend, R. G. et al. Emergence of complex wave patterns in primate cerebral cortex. J. Neurosci. 35, 4657–4662 (2015).
Contreras, D. & Llinas, R. Voltage-sensitive dye imaging of neocortical spatiotemporal dynamics to afferent activation frequency. J. Neurosci. 21, 9403–9413 (2001).
Llinas, R. R., Leznik, E. & Urbano, F. J. Temporal binding via cortical coincidence detection of specific and nonspecific thalamocortical inputs: a voltage-dependent dye-imaging study in mouse brain slices. Proc. Natl. Acad. Sci. USA 99, 449–54 (2002).
Grinvald, A. & Hildesheim, R. VSDI: A new era in functional imaging of cortical dynamics. Nat. Rev. Neurosci. 5, 874–885 (2004).
Tang, Q. et al. In Vivo Voltage-Sensitive Dye Imaging of Subcortical Brain Function. Sci. Rep. 5, 17325 (2015).
Devonshire, I. M., Grandy, T. H., Dommett, E. J. & Greenfield, S. A. Effects of urethane anaesthesia on sensory processing in the rat barrel cortex revealed by combined optical imaging and electrophysiology. Eur. J. Neurosci. 32, 786–797 (2010).
Bruhn, A., Weickert, J. & Schnörr, C. Lucas/Kanade meets Horn/Schunck: Combining local and global optic flow methods. Int. J. Comput. Vis. 61, 1–21 (2005).
Jung, Y., Park, H., Du, D. Z. & Drake, B. L. A decision criterion for the optimal number of clusters in hierarchical clustering. J. Glob. Optim. 25, 91–111 (2003).
Tsodyks, M., Kenet, T. & Arieli, A. Linking Spontaneous Activity of Single Cortical Neurons and the Underlying Functional Architecture. Science 286, 1943 (1999).
Liu, C. Beyond pixels: exploring new representations and applications for motion analysis. PhD thesis, Massachusetts Institute of Technology (2009).
Acknowledgements
This research was supported by the Basic Science Research Program (NRF-2016R1C1B2016039, NRF-2016R1E1A2A01939949) (to S.P.) and by the Brain Research Program (NRF-2016M3C7A1913844) (to Y.J.) through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT.
Author information
Authors and Affiliations
Contributions
M.S. designed and performed the simulations, developed software for analysis, analyzed data, and wrote the manuscript. M.K. performed the imaging experiments and wrote the manuscript. H.L. performed the simulations, analyzed data, and edited the manuscript. Y.J. designed the project, directed the experimental research, and edited the manuscript. S.P. designed the project, directed the simulation and analysis research, and wrote the manuscript. All authors discussed and commented on the manuscript.
Corresponding authors
Ethics declarations
Competing Interests
The authors declare no competing interests.
Additional information
Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Song, M., Kang, M., Lee, H. et al. Classification of Spatiotemporal Neural Activity Patterns in Brain Imaging Data. Sci Rep 8, 8231 (2018). https://doi.org/10.1038/s41598-018-26605-z
This article is cited by
- Improved spatio-temporal measurements of visually evoked fields using optically-pumped magnetometers. Scientific Reports (2021).