Symposium
25 Brain representations of multisensory input and their decoding
25 Sensory representation of visual, auditory, olfactory, and gustatory input in the brain
Chairs: Kazushige Touhara (The University of Tokyo), Shinji Nishimoto (Osaka University)
July 2, 2022, 14:00-14:24, Okinawa Convention Center, Conference Hall B1, Room 3
3S03a-01
Brain representations of vision and cognition: quantification and decoding
Modeling and decoding visual, cognitive, and cross-modal representations

*Shinji Nishimoto(1,2)
1. Osaka University, 2. NICT

Keyword: encoding model, fMRI, representation

Our daily experiences are realized by sensory signal processing followed by diverse cognitive functions in the brain. Mapping these perceptual and cognitive functions in the brain has been one of the important topics in functional neuroimaging studies. However, relatively little is known about how these diverse functions might be combined or co-represented. Given that many of our natural experiences (e.g., watching, talking, commuting, and studying) involve complex orchestrations of multiple sensory and cognitive processes (e.g., vision, language, memory, and decisions), it is essential to comprehensively understand how such functions are mapped and how some of them might be jointly represented across cortical and subcortical areas. Toward this goal, we have recorded human brain activity using functional Magnetic Resonance Imaging (fMRI) and have built quantitative voxel-wise encoding and decoding models of the brain activity evoked by diverse perceptual and cognitive experiences. Such studies have revealed latent representational spaces of perceptual and cognitive functions and their fine-scale mapping. In this talk, I will introduce some of our recent modeling studies, including the convergence of cross-modal functions, multidimensional mappings of diverse emotions, and a comprehensive representation of more than 100 cognitive functions across the cortex, cerebellum, and subcortex. I will also discuss how such studies can be further applied and generalized toward understanding more naturalistic experiences.
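The voxel-wise encoding models mentioned above are, at their core, regularized linear regressions from stimulus features to each voxel's response. The following is a minimal sketch on simulated data; all sizes, the feature space, and the ridge penalty are hypothetical illustrations, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: T fMRI time points, F stimulus features, V voxels.
T, F, V = 200, 10, 50

# Simulated stimulus features (standing in for, e.g., motion-energy or
# semantic annotations of natural movies) and voxel responses generated
# from hidden linear weights plus noise.
X = rng.standard_normal((T, F))
W_true = rng.standard_normal((F, V))
Y = X @ W_true + 0.1 * rng.standard_normal((T, V))

def fit_ridge(X, Y, alpha=1.0):
    """Voxel-wise encoding model: a ridge-regularized linear map from
    stimulus features to every voxel's response, fit for all voxels at once."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ Y)

W = fit_ridge(X[:150], Y[:150])   # train on the first 150 time points
Y_pred = X[150:] @ W              # predict held-out responses

def voxel_corr(a, b):
    """Per-voxel Pearson correlation between predicted and measured activity."""
    a = a - a.mean(0)
    b = b - b.mean(0)
    return (a * b).sum(0) / np.sqrt((a ** 2).sum(0) * (b ** 2).sum(0))

r = voxel_corr(Y_pred, Y[150:])
print(f"mean held-out prediction correlation: {r.mean():.2f}")
```

In real studies the feature spaces, hemodynamic modeling, regularization, and cross-validation are far more elaborate; the sketch only shows the basic voxel-wise linear structure shared by encoding (feature-to-voxel) and, when inverted, decoding analyses.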
July 2, 2022, 14:24-14:48, Okinawa Convention Center, Conference Hall B1, Room 3
3S03a-02
Parallel and distributed encoding of speech across human auditory cortex
*Liberty Hamilton(1,2)
1. Dept of Speech, Language, and Hearing Sciences, Univ of Texas at Austin, Austin, USA, 2. Dept of Neurology, Dell Med School, Univ of Texas at Austin, Austin, USA

Keyword: electrocorticography, intracranial, auditory cortex, speech

A prevailing view of information flow through the auditory pathway posits that speech sounds are processed through a hierarchy from the brainstem to the thalamus to the primary auditory cortex and out to speech cortex on the superior temporal gyrus. Using a combination of high-resolution electrocorticography (ECoG) grid recordings across the lateral superior temporal gyrus (STG) and core auditory areas on the temporal plane, cortical stimulation, and ablation, we show that this may not be the case. Adult participants with electrode grids listened to English sentence stimuli while we recorded local field potentials from the brain. We then computed the latencies of auditory responses across the entire cortical auditory hierarchy, spectrotemporal and phoneme receptive fields, and tonotopic representations. Using these models, we replicated our previous findings of an onset-selective region of the posterior superior temporal gyrus. Interestingly, this posterior region was activated in parallel with the primary auditory cortex at indistinguishable latencies. We also uncovered absolute-pitch representations that were mostly localized to the primary auditory cortex and temporal plane, while relative-pitch representations were localized more anteriorly. Phonological feature and syllabic (peak rate) representations were located in the lateral superior temporal gyrus rather than the temporal plane. While these tuning differences and similar response latencies indicate potential parallel processing in the posterior superior temporal gyrus and primary auditory cortex, they are only correlational. Thus, we also performed causal stimulation of the primary auditory cortex and superior temporal gyrus to determine how these regions interact. Stimulation of the primary auditory cortex does not appear to interfere with speech perception, which would be expected if it provided the sole input to the superior temporal gyrus. I will discuss these results and their implications for brain processing of speech.
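The spectrotemporal receptive fields mentioned above are commonly estimated as a linear map from a time-lagged spectrogram to neural activity. Below is a minimal numpy sketch on simulated data; the dimensions, number of lags, and ridge penalty are illustrative assumptions, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: an 8-band spectrogram over T time bins, and one
# electrode's response generated from a hidden spectrotemporal filter.
T, n_freq, n_lags = 500, 8, 5
spec = rng.standard_normal((T, n_freq))

def lagged_design(spec, n_lags):
    """Stack time-lagged copies of the spectrogram so that a linear fit
    recovers a spectrotemporal receptive field (STRF)."""
    T, n_freq = spec.shape
    X = np.zeros((T, n_lags * n_freq))
    for lag in range(n_lags):
        X[lag:, lag * n_freq:(lag + 1) * n_freq] = spec[:T - lag]
    return X

X = lagged_design(spec, n_lags)
strf_true = rng.standard_normal(n_lags * n_freq)
y = X @ strf_true + 0.1 * rng.standard_normal(T)

# Ridge solution; reshaping the weights gives the (lag x frequency) STRF,
# whose peak lag is one way to read out a response latency.
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
strf = w.reshape(n_lags, n_freq)
```

The same design-matrix idea extends to phoneme or phonological-feature receptive fields by swapping the spectrogram columns for binary feature time series.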
July 2, 2022, 14:48-15:12, Okinawa Convention Center, Conference Hall B1, Room 3
3S03a-03
Spatiotemporal characteristics of the neural representation of odors revealed by multivariate analysis of scalp EEG
Neural representation of odors revealed by scalp-recorded EEG decoding

*Masako Okamoto(1)
1. The University of Tokyo

Keyword: olfaction, EEG, mvpa

How the human brain translates olfactory input into perception is an important question in understanding the neural basis of sensory processing. Olfaction is unique in multiple ways. Neurally, peripheral inputs, which are coded through a pattern of differential binding at approximately 400 types of olfactory receptors, are transmitted directly to the olfactory bulb (OB) without a thalamic relay. Perceptually, odors alone rarely form discernible objects as vision does, and the most salient aspect of olfactory perception has been considered to be pleasantness. Yet humans can sense thousands of different odorous molecules, and odors evoke diverse perceptions other than pleasantness, such as fruity and floral qualities. Previous human neuroimaging studies have identified several brain regions as the neural substrates of odor perception. In addition, recent intracranial electroencephalography (EEG) studies have revealed oscillatory neural coding of odor information in olfactory regions. However, the spatiotemporal dynamics of the whole-brain activity underlying the emergence of different aspects of olfactory perception are still not well understood. To examine how representations of odorants evolve into perception in the human brain, we applied time-resolved multivariate pattern analysis (tMVPA) to scalp-recorded EEG responses to 10 perceptually diverse odorants, and related the resulting decoding accuracies to perception and source electrical activities. Significant decoding performance was observed in both subject-wise and cross-subject tMVPA models 100-200 ms after odor onset, suggesting that there are patterns of EEG responses that represent odor information and are consistent across subjects. Representations unique to unpleasantness started 300 ms after odor onset, while those of pleasantness and perceptual quality (e.g., fruity) started at 500 ms, with all of these perceptual representations reaching a maximum after 600 ms.
Source estimation indicated that odor representations began in the primary olfactory cortex 100 ms after odor onset and gradually expanded to secondary and downstream brain regions. These results suggest that the initial odor information coded in the olfactory areas (<350 ms) evolves into its perceptual realizations (300 to >600 ms) through computations in widely distributed cortical regions, with different perceptual aspects having different spatiotemporal dynamics.
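Time-resolved MVPA of the kind described trains and tests a classifier independently at each time point on the multichannel pattern. A simplified sketch with a nearest-centroid decoder on simulated EEG follows; all sizes, the classifier choice, and the informative time window are hypothetical, not the study's settings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical EEG data: trials x channels x time points, 10 odorant labels.
n_trials, n_chan, n_time, n_odors = 100, 32, 50, 10
labels = np.repeat(np.arange(n_odors), n_trials // n_odors)

# Simulate an odorant-specific channel pattern that appears only from time
# point 20 onward (standing in for the post-onset window in which decoding
# becomes significant).
patterns = rng.standard_normal((n_odors, n_chan))
eeg = rng.standard_normal((n_trials, n_chan, n_time))
eeg[:, :, 20:] += patterns[labels][:, :, None]

def decode_at_time(eeg, labels, t, n_classes, n_folds=5):
    """Time-resolved MVPA: cross-validated nearest-centroid decoding of
    odorant identity from the channel pattern at one time point."""
    X = eeg[:, :, t]
    folds = np.arange(len(labels)) % n_folds
    correct = 0
    for k in range(n_folds):
        tr, te = folds != k, folds == k
        centroids = np.stack(
            [X[tr & (labels == c)].mean(0) for c in range(n_classes)])
        dists = ((X[te][:, None, :] - centroids[None]) ** 2).sum(-1)
        correct += (dists.argmin(1) == labels[te]).sum()
    return correct / len(labels)

acc_early = decode_at_time(eeg, labels, t=5, n_classes=n_odors)   # chance level
acc_late = decode_at_time(eeg, labels, t=30, n_classes=n_odors)   # above chance
```

Running the decoder at every time point yields the accuracy time course; significance against chance is then typically assessed with permutation tests, which are omitted here.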
July 2, 2022, 15:12-15:36, Okinawa Convention Center, Conference Hall B1, Room 3
3S03a-04
Electrophysiological representation of odor in the human brain
*Christina Zelano(1), Qiaohan Yang(1), Gregory Lane(1), Guangyu Zhou(1)
1. Northwestern University

Keyword: Olfaction, piriform cortex, human

Studies of neuronal oscillations have contributed substantial insight into the mechanisms of visual, auditory, and somatosensory perception. However, progress in such research in the human olfactory system has lagged behind. As a result, the electrophysiological properties of the human olfactory system are poorly understood, and, in particular, whether stimulus-driven high-frequency oscillations play a role in odor processing is unknown. Here, we used direct intracranial recordings from human piriform cortex during an odor identification task to show that 3 key oscillatory rhythms are an integral part of the human olfactory cortical response to smell: Odor induces theta, beta, and gamma rhythms in human piriform cortex. We further show that these rhythms have distinct relationships with perceptual behavior. Odor-elicited gamma oscillations occur only during trials in which the odor is accurately perceived, and features of gamma oscillations predict odor identification accuracy, suggesting that they are critical for odor identity perception in humans. We also found that the amplitude of high-frequency oscillations is organized by the phase of low-frequency signals shortly following sniff onset, only when odor is present. Our findings reinforce previous work on theta oscillations, suggest that gamma oscillations in human piriform cortex are important for perception of odor identity, and constitute a robust identification of the characteristic electrophysiological response to smell in the human brain. Future work will determine whether the distinct oscillations we identified reflect distinct perceptual features of odor stimuli.
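The phase-amplitude relationship described, in which high-frequency amplitude is organized by the phase of low-frequency signals, is often quantified with a modulation index over phase-binned amplitudes. Below is a numpy-only sketch on a synthetic coupled signal; the frequencies, crude FFT filter, and bin count are illustrative choices, not the study's analysis:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical LFP: gamma bursts whose amplitude is locked to the phase of a
# slow theta-range rhythm, mimicking phase-amplitude coupling after sniff onset.
fs, dur = 500, 10.0
t = np.arange(0, dur, 1 / fs)
theta_phase = 2 * np.pi * 6 * t                                   # 6 Hz rhythm
gamma = (1 + np.cos(theta_phase)) * np.sin(2 * np.pi * 60 * t)    # modulated 60 Hz
lfp = np.cos(theta_phase) + 0.5 * gamma + 0.1 * rng.standard_normal(len(t))

def analytic(x):
    """Analytic signal via FFT (numpy-only stand-in for scipy.signal.hilbert)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1
    h[1:len(x) // 2] = 2
    h[len(x) // 2] = 1
    return np.fft.ifft(X * h)

def bandpass(x, lo, hi, fs):
    """Crude FFT-domain band-pass filter (for illustration only)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1 / fs)
    X[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(X, len(x))

phase = np.angle(analytic(bandpass(lfp, 4, 8, fs)))     # low-frequency phase
amp = np.abs(analytic(bandpass(lfp, 50, 70, fs)))       # high-frequency amplitude

# Tort-style modulation index: deviation of the phase-binned amplitude
# distribution from uniformity (0 = no coupling, larger = stronger coupling).
n_bins = 18
bins = np.digitize(phase, np.linspace(-np.pi, np.pi, n_bins + 1)) - 1
mean_amp = np.array([amp[bins == b].mean() for b in range(n_bins)])
p = mean_amp / mean_amp.sum()
mi = (np.log(n_bins) + (p * np.log(p)).sum()) / np.log(n_bins)
```

In practice, coupling analyses use proper band-pass filters and surrogate statistics (e.g., phase-shuffled controls) to test significance; this sketch only shows the core phase-binning computation.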
July 2, 2022, 15:36-16:00, Okinawa Convention Center, Conference Hall B1, Room 3
3S03a-05
Cortical representation of taste: there is no Cartesian restaurant
*Jason Avery(1)
1. Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA

Keyword: taste, insula, sensory representation, taste coding

A central goal of neuroscience is identifying how various sources of sensory information are represented in the brain. Compared to senses such as vision and touch, relatively little is understood about the cortical representation of taste within gustatory areas of the cerebral cortex. Based on its success in other sensory domains, such as somatotopy for the sense of touch, retinotopy for vision, and tonotopy for hearing, the topographic model has become a popular default hypothesis for how sensory data are represented at the cortical level. Contrasting lines of evidence within neuroscientific studies of taste seem to support either this model or one in which taste is represented via a distributed population code within the neurons of gustatory cortex. Recent investigations into this question have employed high-resolution functional neuroimaging methods and multivariate analytic approaches to examine taste quality coding in both rodent and human brains. Collectively, these recent studies do not support the topographic model of taste quality representation, but rather one in which taste quality is represented by distributed patterns of activation across mixed populations of broadly and specifically tuned neurons within gustatory regions of the insular cortex. This talk will discuss the methods and results of these recent studies, the underlying assumptions behind the competing theoretical accounts, and potential models for the logic behind taste representation at the cortical level.
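The distributed population code contrasted with the topographic model above can be illustrated with a toy simulation: even when individual neurons are broadly tuned and no single neuron reliably identifies a taste, the joint population pattern still does. A minimal sketch; all numbers and the nearest-centroid readout are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical population: 60 broadly tuned neurons, 4 taste qualities.
# Each neuron responds to several tastes (mixed tuning), so there is no
# clean "labeled line", yet the population pattern separates tastes.
n_neurons, n_tastes, n_reps = 60, 4, 40
tuning = rng.uniform(0.2, 1.0, (n_tastes, n_neurons))  # broad, overlapping tuning

labels = np.repeat(np.arange(n_tastes), n_reps)
resp = tuning[labels] + 0.3 * rng.standard_normal((len(labels), n_neurons))

# Split-half validation: classify each held-out trial by the nearest
# population-mean pattern (a minimal distributed-code decoder).
train, test = resp[::2], resp[1::2]
y_train, y_test = labels[::2], labels[1::2]
centroids = np.stack([train[y_train == c].mean(0) for c in range(n_tastes)])
pred = ((test[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(1)
accuracy = (pred == y_test).mean()
```

This mirrors the multivariate logic of the studies discussed: decoding taste quality from distributed activation patterns rather than from the location of a response peak.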