General Oral Presentations (Wakate Dojo)
Wakate Dojo: Sensory and Movement Themes
Chairs: Takahiro Kato (Department of Psychiatry and Neurology, Kyushu University Hospital), Sakiko Honjoh (University of Tsukuba)
July 1, 2022, 14:00–14:15, Okinawa Convention Center, Conference Room A2 (Room 7)
2WD07a1-01
Fluctuation of auditory steady-state response with ongoing EEG oscillation

*Daigo Okada(1), Yasuo Kabe(1), Toi Takahashi(1), Hirokazu Takahashi(1)
1. Graduate School of Information Science and Technology, The University of Tokyo

Keyword: auditory steady-state response, EEG, alpha oscillations, theta oscillations

Background
Detection of the auditory steady-state response (ASSR) in electroencephalography (EEG) is used for objective assessment of hearing thresholds. However, because of the low signal-to-noise ratio, time-consuming averaging of EEG signals is required to detect the ASSR. We hypothesized that the ASSR amplitude is not constant during measurement but fluctuates, and that this fluctuation deteriorates the signal-to-noise ratio. In this study, we investigated whether and how the ASSR fluctuates, and attempted to predict this fluctuation from the ongoing EEG.

Methods
Participants received an auditory stimulus in the right ear for 5 min while EEG was recorded at Cz. The test stimulus was a sinusoidally amplitude-modulated tone with a modulation frequency of 40 Hz, a carrier frequency of 1000 Hz, and an intensity of 80 dB SPL. The ASSR amplitude was quantified as the correlation coefficient between the recorded EEG and the ASSR estimated from the stimulus.
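The amplitude metric described above can be sketched as follows. The sampling rate, excerpt length, response amplitude, and noise level are illustrative assumptions; the abstract does not specify the authors' estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 500                       # EEG sampling rate in Hz (assumed)
f_mod = 40                     # modulation frequency from the Methods
t = np.arange(0, 5, 1 / fs)   # 5 s analysis excerpt (assumed)

# Estimated ASSR: a 40 Hz sinusoid following the stimulus envelope
# (response phase is assumed to be zero for simplicity).
reference = np.sin(2 * np.pi * f_mod * t)

# Toy EEG: a weak 40 Hz response buried in broadband noise.
eeg = 0.2 * reference + rng.standard_normal(t.size)

# ASSR amplitude as the correlation coefficient between the recorded
# EEG and the estimated response, as described in the Methods.
assr_amplitude = np.corrcoef(eeg, reference)[0, 1]
print(f"ASSR amplitude (correlation): {assr_amplitude:.3f}")
```

Because the 40 Hz response is weak relative to the background EEG, the single-excerpt correlation is small, which is why averaging (or, as the authors propose, state monitoring) is needed.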

Results
The ASSR amplitude fluctuated with a cycle of typically 20 to 60 s. The ASSR amplitude decreased significantly as alpha and theta oscillations increased. This tendency was more pronounced in the eyes-closed condition than in the eyes-open condition.
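A minimal simulation of the reported anticorrelation could look like the following. The ~40 s fluctuation cycle, window length, band edges, and signal amplitudes are assumptions for illustration, not the authors' analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                                  # sampling rate (assumed)
t = np.arange(0, 300, 1 / fs)             # 5 min recording, as in Methods

# A slow (~40 s cycle) arousal-like fluctuation drives alpha power up
# and, in this toy model, the 40 Hz ASSR down.
slow = 0.5 * (1 + np.sin(2 * np.pi * t / 40))
alpha = slow * np.sin(2 * np.pi * 10 * t)
assr = (1 - slow) * np.sin(2 * np.pi * 40 * t)
eeg = alpha + 0.3 * assr + 0.5 * rng.standard_normal(t.size)

reference = np.sin(2 * np.pi * 40 * t)
win = 5 * fs                              # 5 s analysis windows (assumed)
assr_amp, alpha_power = [], []
for start in range(0, t.size - win + 1, win):
    seg = eeg[start:start + win]
    # Windowed ASSR amplitude: correlation with the estimated response.
    assr_amp.append(np.corrcoef(seg, reference[start:start + win])[0, 1])
    # Windowed alpha (8-13 Hz) power from the FFT power spectrum.
    spec = np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(win, 1 / fs)
    alpha_power.append(spec[(freqs >= 8) & (freqs <= 13)].sum())

r = np.corrcoef(assr_amp, alpha_power)[0, 1]
print(f"correlation between ASSR amplitude and alpha power: {r:.2f}")
```

In this construction the window-by-window correlation is strongly negative, mirroring the reported decrease of ASSR amplitude with alpha power.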

Discussion
Theta and alpha oscillations in the EEG likely reflect arousal and attention, which might cause the ASSR fluctuation. The relationship between these EEG oscillations and the ASSR might be diminished in the eyes-open condition because of visual influences. Taken together, ASSR detection still leaves room for improvement by monitoring arousal and attention from the ongoing EEG.
July 1, 2022, 14:15–14:30, Okinawa Convention Center, Conference Room A2 (Room 7)
2WD07a1-02
Feature tracking and geometrical priors counteract illusory non-rigidities from outputs of motion-energy cells
*Akihito Maruya(1), Qasim Zaidi(1)
1. Graduate Center for Vision Research, State University of New York

Keyword: Physiology of complex motion perception, Object rigidity versus nonrigidity, Convolutional Neural Network Model, Geometrical priors in motion perception

Why do objects appear rigid when their projected retinal images are deformed non-rigidly by object or observer motion? We used rotating rigid objects that can appear rigid or non-rigid to test whether tracking of salient features counteracts illusory non-rigidity.
When two circular rings are rigidly linked at an angle and rotated, they appear to wobble rather than to be rigidly linked. Movies and window displays have exploited this illusion. These percepts contradict the conventional rigidity assumption. Forced choices by 10 observers between the link being rigidly connected or not revealed non-rigid percepts at moderate speeds (6.0 deg/sec). Responses of arrays of MT component cells show that, although the object is physically rigid, the predominant motion-energy vectors are perpendicular to the contours of the rings rather than aligned with the rotation direction. We trained a convolutional neural network (CNN) on 9000 motion flows to distinguish wobbling from rotation, simulating areas MST and STS, which respond to motion-flow patterns. Feeding the flows from MT component cells to the trained CNN gave a high probability of wobbling.
At slow speeds (0.6 deg/sec) observers reported rigid rotation, so we tested whether feature tracking can promote rigidity by adding salient features. When the link was painted or replaced by a gap, or when the rings were polygons with vertices, the rings appeared to rotate rigidly at 6.0 deg/sec. Phenomenologically, the motion of painted segments, gaps, or vertices provides cues for rotation and against wobbling. These salient features can be tracked by arrays of MT pattern-motion cells or by explicit feature tracking. The CNN gave high probabilities of rotation for motion flows from feature tracking. However, at high speeds (60 deg/sec), all configurations appeared non-rigid. Salient-feature tracking thus contributes to rigidity at slow and moderate speeds, but not at high speeds.
Combinations of CNN outputs from motion energy and feature tracking did not fully explain the differences in percepts between circular and polygonal rings, because the two shapes generate similar motion energy. To test whether priors for wobbling based on feature geometry (e.g., a gap moves up and down during wobble but not during rotation, and a square is less likely to wobble than a circle) influence the percept, we simulated such priors with a physics engine. A CNN model that weighted motion energy more heavily at fast speeds and feature tracking more heavily at slow speeds, combined with the geometrical prior, explained both rigid and non-rigid percepts.
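The final speed-weighted combination of the two cues with a geometrical prior can be sketched as a toy function. The soft speed weighting, the half-saturation speed, and the probability values below are hypothetical, standing in for the trained CNN outputs and the physics-engine prior.

```python
def p_rotation(p_energy, p_feature, prior, speed, speed_half=6.0):
    """Toy combination of motion-energy and feature-tracking cues.

    p_energy / p_feature: hypothetical CNN outputs P(rotation) for the
    motion-energy and feature-tracking flow fields.
    prior: geometrical prior P(rotation) for the shape (e.g. higher
    for a square than for a circle).
    speed: rotation speed in deg/sec; faster speeds weight motion
    energy more, as in the abstract. speed_half is an assumed soft
    threshold, not a fitted parameter.
    """
    w = speed / (speed + speed_half)              # weight on motion energy
    like = w * p_energy + (1 - w) * p_feature     # combined P(rotation)
    # Combine with the prior and renormalise over {rotation, wobble}.
    post = like * prior
    return post / (post + (1 - like) * (1 - prior))

# Rings with a salient gap: feature tracking strongly signals rotation,
# motion energy signals wobble.
print(p_rotation(0.2, 0.9, 0.5, speed=0.6))   # slow: rotation dominates
print(p_rotation(0.2, 0.9, 0.5, speed=60.0))  # fast: wobble dominates
```

With a neutral prior of 0.5 the same cue values flip from a rotation percept at slow speed to a wobble percept at high speed, reproducing the qualitative pattern in the abstract.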
July 1, 2022, 14:30–14:45, Okinawa Convention Center, Conference Room A2 (Room 7)
2WD07a1-03
Hierarchical prediction errors in crossmodal sequence processing: an EEG functional connectivity study
*Yi-Yuan Huang(1,2), Chien-Te Wu(1,5), Shinsuke Koike(1,2,3,4), Zenas C. Chao(1)
1. IRCN, Univ of Tokyo, Tokyo, Japan, 2. Grad Sch of Art and Sci, Univ of Tokyo, Tokyo, Japan, 3. UTIDAHM, Univ of Tokyo, Tokyo, Japan, 4. CiSHuB, Univ of Tokyo, Tokyo, Japan, 5. Colg of Med, NTU, Taiwan

Keyword: MULTISENSORY, HIERARCHY, SEQUENCE PROCESSING, FUNCTIONAL CONNECTIVITY

Predictive coding theory proposes that prediction and prediction-error signals interact to learn statistical regularities in sensory experiences and predict incoming inputs. This interaction occurs not only within processing hierarchies to handle different levels of prediction updates and error minimizations, but also between multiple sensory modalities to combine information for a coherent understanding of the environment. However, most studies have focused on the hierarchical interactions, and the crossmodal interactions remain unclear. Here, we address this by investigating how sequential events with multiple hierarchies and modalities are learned and represented in the human brain. We aim to identify hierarchical prediction-error signals when regularities of stimulus transitions from one modality to another modality are violated. We recorded 64-channel EEG in 30 healthy participants during a crossmodal local-global oddball paradigm. To examine crossmodal interactions, we used 3-stimulus audiovisual sequences (AAV or VVA) to establish crossmodal predictions, and unimodal sequences (AAA or VVV) to probe crossmodal prediction errors from the last stimulus item. To examine hierarchical interactions, we controlled the temporal regularities at both the local (the probability of stimulus-to-stimulus transitions) and global level (the probability of 3-stimulus sequences). To evaluate the crossmodal and hierarchical interactions, we analyzed the network functional connectivity of prediction-error responses which were measured by comparing different sequence types. Significant crossmodal prediction-error responses were found at the connections between the central and parieto-occipital areas in the theta and alpha frequency bands. We further applied a model-fitting approach to extract prediction errors from the local and global levels and showed that crossmodal interactions occurred not only at the global level, but also early at the local level. 
Furthermore, local-level prediction-error responses were represented as positive theta-gamma and negative alpha-beta oscillations in frontoparietal and central-occipital connections, whereas global-level prediction-error responses were represented as late alpha oscillations around the centroparietal area. Our findings suggest that the human brain computes crossmodal transitions at the stimulus-to-stimulus level, and that the centroparietal area is a key hub for crossmodal audiovisual predictions.
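The sequence structure of the paradigm can be illustrated with a small generator; trial counts and the deviant probability below are invented for illustration. It makes explicit the dissociation the design exploits: in an AAV block the final crossmodal transition is locally deviant but globally standard, while AAA is locally standard but globally deviant.

```python
import random

def make_block(standard="AAV", deviant="AAA", n_trials=100,
               p_deviant=0.2, seed=0):
    """Generate one block of a crossmodal local-global oddball task.

    'A' = auditory stimulus, 'V' = visual stimulus. The frequent
    sequence sets the global regularity; stimulus-to-stimulus
    (modality) transitions set the local regularity.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        seq = deviant if rng.random() < p_deviant else standard
        trials.append({
            "sequence": seq,
            # last stimulus changes modality -> local deviant
            "local_deviant": seq[-1] != seq[-2],
            # rare sequence type -> global deviant
            "global_deviant": seq == deviant,
        })
    return trials

block = make_block()
n_global = sum(t["global_deviant"] for t in block)
print(f"{n_global} global deviants out of {len(block)} trials")
```

Swapping `standard="VVA"` and `deviant="VVV"` gives the complementary visual-to-auditory block.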
July 1, 2022, 14:45–15:00, Okinawa Convention Center, Conference Room A2 (Room 7)
2WD07a1-04
Alpha-dependent oscillatory coupling accounts for inter-individual differences in spontaneous motor tempo and context-dependent adaptive changes
*Sai Sun(1,2), Satoshi Shioiri(2), Shinsuke Shimojo(3)
1. Frontier Research Institute for Interdisciplinary Sciences, Tohoku University, Sendai, Japan, 2. Research Institute of Electrical Communication, Tohoku University, Sendai, Japan, 3. Division of Biology and Biological Engineering/Computation and Neural Systems, California Institute of Technology, Pasadena, USA

Keyword: Individual difference, Spontaneous motor tempo, Tempo variations, Intrinsic rhythm

People tend to have a natural tempo preference in daily activities (such as walking and speaking) and show spontaneous tempo variations in response to a dynamically changing environment. For example, your preferred walking speed might be 1.5 meters per second, slightly faster when crossing at a traffic light and a bit slower when strolling by a river. Studies of time perception have assumed a pacemaker (or counter/accumulator) in the brain that scales beats or temporal intervals and continuously compares the current time in working memory with the remembered time in reference memory. More recently, it has become clear that each brain has a unique intrinsic oscillator that is critically relevant to internally driven behavioral dynamics. Specifically, electrophysiological studies have identified a key role of the intrinsic alpha rhythm (8-13 Hz) in predicting speed-related performance (i.e., reaction time) and temporal variations in perception and attention, leading to the hypothesis that alpha and alpha-dependent networks may be largely (but not exclusively) responsible for natural tempo preference and adaptive tempo variations. Under this working hypothesis, we measured self-initiated natural tapping tempo and adaptive tempo variations across three free-tapping sessions following fastest/natural/slowest tapping instructions for each individual, followed by a correlational analysis of behavioral rhythms and intrinsic brain rhythms using EEG. Our preliminary data from 22 subjects showed that theta (4-7 Hz)-alpha phase-amplitude coupling correlated negatively with inter-individual differences in natural tempo preference but positively with adaptive tempo variations, measured as relative changes from the natural tempo.
Specifically, enhanced theta-alpha synchrony may reflect a slower self-initiated tempo with larger adaptive variations, supporting a functional role of alpha-dependent oscillatory coupling in representing natural tempo preference and context-dependent tempo variations. These results suggest the possibility of brain/mind monitoring via behavioral tempo and tempo variations, and of state tuning via frequency-selective, externally driven perturbation or electrical stimulation targeting intrinsic alpha to modulate intrinsic motor tempo and tempo stability/variability. Future studies are needed to examine whether such effects are motor-specific or cognition-general.
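Theta-alpha phase-amplitude coupling of the kind invoked above is commonly quantified with a Tort-style modulation index. The sketch below uses crude FFT-based filtering and synthetic data; note that it widens the amplitude band to 8-20 Hz, because the amplitude filter must be broad enough to pass the theta-rate sidebands (a narrow 8-13 Hz band would remove the modulation itself).

```python
import numpy as np

def analytic(x):
    """Analytic signal via the FFT (a minimal Hilbert transform)."""
    n = x.size
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    return np.fft.ifft(np.fft.fft(x) * h)

def bandpass(x, fs, lo, hi):
    """Crude FFT-domain band-pass filter (illustration only)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1 / fs)
    X[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(X, x.size)

def pac_mi(x, fs, f_phase=(4, 7), f_amp=(8, 20), n_bins=18):
    """Tort-style modulation index: theta phase vs alpha-band amplitude."""
    phase = np.angle(analytic(bandpass(x, fs, *f_phase)))
    amp = np.abs(analytic(bandpass(x, fs, *f_amp)))
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    bins = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
    mean_amp = np.array([amp[bins == b].mean() for b in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    # KL divergence from a uniform phase-amplitude distribution,
    # normalised to [0, 1].
    return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)

# Synthetic check: alpha-band amplitude locked to theta phase vs not.
fs = 250
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)
theta = np.sin(2 * np.pi * 6 * t)
coupled = theta + (1 + 0.9 * np.cos(2 * np.pi * 6 * t)) \
    * np.sin(2 * np.pi * 12 * t) + 0.1 * rng.standard_normal(t.size)
flat = theta + np.sin(2 * np.pi * 12 * t) \
    + 0.1 * rng.standard_normal(t.size)
mi_coupled, mi_flat = pac_mi(coupled, fs), pac_mi(flat, fs)
print(f"MI coupled: {mi_coupled:.4f}, MI uncoupled: {mi_flat:.4f}")
```

The coupled signal, whose alpha-band envelope waxes and wanes with theta phase, yields a clearly larger index than the uncoupled control.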