Oral
New Technologies
July 27 (Sat) 15:20-15:35, Room 10 (Bandaijima Building, 6F Conference Room)
3O-10a2-1
Remote control of neuronal activity using scintillation
Takanori Matsubara(松原 崇紀)1,2,Takayuki Yanagida(柳田 健之)3,Noriaki Kawaguchi(河口 範明)3,Takashi Nakano(中野 高志)4,Jun-ichiro Yoshimoto(吉本 潤一郎)4,Shin-ichiro Horigane(堀金 慎一郎)5,Shuhei Ueda(上田 修平)5,Sayaka Takemoto-Kimura(竹本-木村 さやか)5,Akihiro Yamanaka(山中 章弘)1,2,6,Takayuki Yamashita(山下 貴之)1,2,6,7
1Dept. of Neuroscience II, Research Institute of Environmental Medicine, Nagoya University
2Nagoya University Graduate School of Medicine
3Division of Materials Science, Nara Institute of Science and Technology
4Division of Information Science, Nara Institute of Science and Technology
5Dept. of Neuroscience I, Research Institute of Environmental Medicine, Nagoya University
6JST, CREST
7JST, PRESTO

Recent advances in materials science, in close conjunction with neuroscience, have made it possible to achieve remote/wireless optogenetic control of neuronal activities in living animals. The light-sensitive proteins called opsins used for such studies are activated only by visible light, which has a low tissue penetration depth. Therefore, for stimulating a large population of neurons in the deep brain, it would be advantageous to employ up-converting phosphors, which emit visible light in response to more tissue-penetrating near-infrared (NIR) light (Chen et al., Science 359: 679-684, 2018; Miyazaki et al., Cell Reports 26: 1033-1043, 2019). However, the tissue penetration depth of NIR light is still limited to several millimetres, and NIR illumination can cause abrupt heat generation at the body surface. To overcome these issues, we utilize inorganic scintillators that exhibit visible luminescence, called scintillation, when excited by X-rays. We show that a certain yellow-emitting scintillator can efficiently activate red-shifted opsins upon X-irradiation. With these scintillator/opsin combinations, we successfully activated and inhibited dopamine neurons in the ventral tegmental area of freely moving mice to wirelessly drive related behaviours under mild X-irradiation. A scintillator crystal incubated with dissociated hippocampal neurons had no negative effect on cell viability, suggesting the biocompatibility of the material. These results demonstrate the potential of scintillation for distant control of cellular functions in the brain and other organ systems of living animals, including humans.
July 27 (Sat) 15:35-15:50, Room 10 (Bandaijima Building, 6F Conference Room)
3O-10a2-2
Oscillatory responses in cortico-cortical evoked potentials (CCEP)
Takuro Nakae(中江 卓郎)1,2,Riki Matsumoto(松本 理器)3,Masaya Togo(十河 正弥)4,Hirofumi Takeyama(武山 博文)7,Katsuya Kobayashi(小林 勝哉)4,Akihiro Shimotake(下竹 昭寛)5,Masao Matsuhashi(松橋 眞生)5,Nobutaka Mukae(迎 伸孝)8,Yukihiro Yamao(山尾 幸広)1,Takayuki Kikuchi(菊池 隆幸)1,Kazumichi Yoshida(吉田 和道)1,6,Takeharu Kunieda(國枝 武治)6,Akio Ikeda(池田 昭夫)5,Susumu Miyamoto(宮本 享)1
1Dept. of Neurosurgery (Brain Pathophysiology), Kyoto University Graduate School of Medicine
2Dept. of Neurosurgery, Shiga General Hospital
3Division of Neurology, Dept. of Internal Medicine, Kobe University Graduate School of Medicine
4Dept. of Neurology (Brain Pathophysiology), Kyoto University Graduate School of Medicine
5Dept. of Epilepsy, Movement Disorders and Physiology, Kyoto University Graduate School of Medicine
6Dept. of Neurosurgery, Ehime University Graduate School of Medicine
7Dept. of Respiratory Care and Sleep Control Medicine, Kyoto University Graduate School of Medicine
8Dept. of Neurosurgery, Kyushu University Graduate School of Medical Sciences

Introduction:
The human brain operates with oscillatory activity in different frequency bands, such as the mu rhythm and frontal midline theta. Cortico-cortical evoked potential (CCEP) recording is originally a method to trace connectivity by single-pulse electrical stimulation; in the averaged CCEP waveforms, we observed rhythmic patterns (here called "oscillatory responses"). We report the distribution of these oscillatory responses in CCEP.

Methods:
The subjects were 15 patients with intractable epilepsy who underwent chronic implantation of subdural electrodes for presurgical evaluation. Single-pulse stimuli (0.3 ms, 8-10 mA, 1 Hz, 30 × 2 trials) were applied through two adjacent electrodes (47.1 ± 8.8 stimulus sites per patient). CCEPs were recorded from all implanted subdural electrodes (89.6 ± 16.0 electrodes per patient). Oscillatory responses were detected by visual inspection of all averaged waveforms using constant criteria. If rhythms of different frequencies were observed at the same electrode, only the largest was counted.
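
A minimal Python sketch of the trial-averaging step is given below for orientation; it is not the authors' pipeline, and the epoch window, baseline correction and variable names (ecog, stim_idx, average_ccep) are illustrative assumptions.

# Minimal sketch (illustrative, not the authors' pipeline): building the averaged
# CCEP waveform for one recording electrode from stimulus-locked epochs.
# Assumes `ecog` is a (n_samples,) array for one electrode, `fs` the sampling
# rate in Hz, and `stim_idx` the sample indices of the 1 Hz single pulses.
import numpy as np

def average_ccep(ecog, stim_idx, fs, pre=0.1, post=0.6):
    """Cut epochs around each pulse, baseline-correct, and average (30 x 2 pulses in the study)."""
    pre_n, post_n = int(pre * fs), int(post * fs)
    epochs = []
    for i in stim_idx:
        if i - pre_n < 0 or i + post_n > ecog.size:
            continue  # skip pulses too close to the recording edges
        ep = ecog[i - pre_n:i + post_n].astype(float)
        ep -= ep[:pre_n].mean()           # subtract the pre-stimulus baseline
        epochs.append(ep)
    return np.mean(epochs, axis=0)        # averaged CCEP waveform

# Example with simulated data: 70 s of noise, one pulse per second
fs = 1000
ecog = np.random.default_rng(1).normal(0, 1, 70 * fs)
stim_idx = np.arange(5, 65) * fs          # 60 pulses (30 x 2 trials)
ccep = average_ccep(ecog, stim_idx, fs)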

Results:
In total, 2370 oscillatory responses (5.3%) were detected among 45129 observations in the 15 subjects, including 1087 alpha-band, 908 beta-band and 1235 theta-band responses. Theta rhythms were observed predominantly in the frontal lobe. Alpha rhythms were observed frequently in the postcentral gyrus, the inferior temporal gyrus and the precuneal cortex. Beta rhythms were often observed around the precentral gyrus.
When we compared the frequency of the oscillatory responses with that of the resting rhythm at each electrode, the alpha responses around the postcentral gyrus and the beta responses around the precentral gyrus showed concordance between the two. Their frequency and localization were consistent with those of the mu rhythm. Time-frequency representation analysis revealed that single-pulse stimulation appeared to evoke the mu rhythm with its phase locked to the stimulus.
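
To make the phase-locking observation concrete, the following Python sketch estimates an inter-trial phase-locking value with a complex Morlet wavelet; it is a hypothetical illustration rather than the authors' analysis, and the 10 Hz target frequency, wavelet width and variable names are assumptions.

# Minimal sketch (not the authors' code): estimating stimulus phase-locking of an
# oscillatory CCEP response from single-trial data. Assumes `trials` is a
# (n_trials, n_samples) array of one electrode's responses and `fs` is the
# sampling rate in Hz; the 10 Hz target frequency is illustrative.
import numpy as np

def phase_locking_value(trials, fs, freq=10.0, n_cycles=5):
    """Inter-trial phase consistency at one frequency via a complex Morlet wavelet."""
    sigma_t = n_cycles / (2 * np.pi * freq)              # wavelet temporal width (s)
    t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))
    # Convolve each trial with the wavelet to get the analytic signal at `freq`
    analytic = np.array([np.convolve(tr, wavelet, mode="same") for tr in trials])
    phases = np.angle(analytic)
    # PLV: magnitude of the mean unit phase vector across trials (1 = perfect locking)
    return np.abs(np.exp(1j * phases).mean(axis=0))

# Example: 60 simulated trials, 1 s at 1000 Hz, with a phase-locked 10 Hz burst
fs = 1000
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / fs)
trials = rng.normal(0, 1, (60, t.size)) + np.sin(2 * np.pi * 10 * t) * (t > 0.3)
plv = phase_locking_value(trials, fs)
print(plv[int(0.5 * fs)])   # a high value after the "stimulus" indicates phase resetting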

Conclusion:
Oscillatory CCEP responses were evoked predominantly in the perirolandic area, the precuneal cortex and the frontal lobe. In the perirolandic area in particular, we observed phase resetting of a physiological oscillation (the mu rhythm) by exogenous input. This unique oscillatory CCEP may reveal functional aspects that traditional CCEP evaluation cannot assess. Although its sensitivity and specificity were not high, combining this finding with other intrinsic brain activities would be of clinical use for functional brain mapping.
July 27 (Sat) 15:50-16:05, Room 10 (Bandaijima Building, 6F Conference Room)
3O-10a2-3
State estimation and feature extraction from behavioral trajectories using a highly versatile hybrid method
Kotaro Kimura(木村 幸太郎)1,2,Shuhei J. Yamazaki(山崎 修平)1,2,Takuma Kitanishi(北西 卓磨)3,Ken Yoda(依田 憲)4,Akinori Takahashi(高橋 晃周)5,Azusa Kamikouchi(上川内 あずさ)6,Shizuko Hiryu(飛龍 志津子)7,Takuya Maekawa(前川 卓也)8
1Graduate School of Natural Sciences, Nagoya City University
2Dept. of Biological Sciences, Graduate School of Science, Osaka University
3Graduate School of Medicine, Osaka City University
4Graduate School of Environmental Studies, Nagoya University
5National Institute of Polar Research
6Graduate School of Science, Nagoya University
7Faculty of Life and Medical Sciences, Doshisha University
8Graduate School of Information Science and Technology, Osaka University

Animal behavior is the final, integrated output of brain activity. Thus, recording and analyzing behavior is critical to understanding the underlying brain function. While recording animal behavior has become easier than ever with the development of compact and inexpensive devices, detailed analysis of behavioral data requires substantial prior knowledge and/or high-content data such as video images of animal postures, which makes it difficult to analyze most animal behavioral data efficiently enough to understand brain function. Here, we report a versatile method using a hybrid supervised/unsupervised machine learning approach to efficiently estimate behavioral states and to extract important behavioral features from low-content animal trajectory data alone. As proof-of-principle experiments, we analyzed trajectory data of worms, fruit flies, rats, and bats in the laboratory, and of penguins and flying seabirds in the wild, recorded with various methods and spanning a wide range of spatiotemporal scales, from millimeters to 1,000 km in space and from sub-seconds to days in time. We estimated several states during behavior and comprehensively extracted features characteristic of a behavioral state and/or a specific experimental condition. Physiological and genetic experiments in worms revealed that the extracted behavioral features reflected specific neural or gene activities. Thus, our method provides a versatile and unbiased way to extract behavioral features from simple trajectory data to understand brain function.
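
As a rough sketch of what the unsupervised half of such a hybrid approach can look like, the Python example below segments behavioral states from a bare x-y trajectory using speed and turning rate and a Gaussian mixture model; the feature set, the number of states and the function estimate_states are illustrative assumptions and do not reproduce the authors' method.

# Minimal sketch (an assumption-laden stand-in, not the published pipeline):
# unsupervised state estimation from a low-content x-y trajectory, using speed and
# turning rate as features and a Gaussian mixture to segment behavioral states.
import numpy as np
from sklearn.mixture import GaussianMixture

def estimate_states(xy, dt, n_states=3):
    """xy: (n_frames, 2) positions; dt: frame interval in s; returns one state label per step."""
    v = np.diff(xy, axis=0) / dt                       # velocity vectors
    speed = np.linalg.norm(v, axis=1)
    heading = np.arctan2(v[:, 1], v[:, 0])
    turn = np.abs(np.angle(np.exp(1j * np.diff(heading, prepend=heading[0]))))  # |heading change|
    feats = np.column_stack([np.log1p(speed), turn])
    feats = (feats - feats.mean(0)) / feats.std(0)     # z-score each feature
    gmm = GaussianMixture(n_components=n_states, random_state=0).fit(feats)
    return gmm.predict(feats)

# Example: a random-walk trajectory sampled at 10 Hz
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(0, 1, (1000, 2)), axis=0)
states = estimate_states(xy, dt=0.1)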
July 27 (Sat) 16:05-16:20, Room 10 (Bandaijima Building, 6F Conference Room)
3O-10a2-4
Development of a virtual reality system for mice to analyze behavioral state-dependent cortical network dynamics
Nobuhiro Nakai(中井 信裕),Yukiko Sekine(関根 友紀子),Masaaki Sato(佐藤 正晃),Toru Takumi(内匠 透)
Laboratory for Mental Biology, RIKEN Center for Brain Science (CBS)

The cortex consists of functional network modules that display dynamic activity patterns elicited intrinsically or by external stimuli. In behaving animals, multimodal sensory cues are thought to be integrated in the cortex and to contribute to the incentive for action, or volition. However, the underlying flow of information processed in the cortex is still unclear at the network level. Currently, the cortical functional network is studied mainly in sedated, resting animals, for example by functional MRI, but this technique is difficult to apply to actively behaving animals. To overcome this limitation, we monitored cortical network dynamics using transcranial cortex-wide calcium imaging in mice behaving in virtual reality (VR). In our VR system, a mouse was head-fixed on a spherical treadmill, and interactive VR scenes coupled to the rotation of the treadmill were projected onto a semi-translucent spherical screen. The subject mouse moved in an open-field-like virtual arena. Behavior in VR was recorded simultaneously with cortical activity monitored by calcium imaging. Pair-wise correlations of activity computed between functional cortical modules were used to represent the state of cortical network dynamics. We found that highly patterned cortical network dynamics co-occurred with transitions between two distinct behavioral states (i.e., from resting to walking and vice versa) at a temporal scale of seconds. The network patterns could be classified into the two corresponding behavioral states with 90% accuracy using a support vector machine classifier, indicating that the patterns of network dynamics provide cortical fingerprints that can predict an animal's behavioral state. Our study demonstrates that highly patterned cortical network dynamics in mice have high predictive power for their behavioral states, as revealed by our machine learning-based analytical approach.
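
For a concrete picture of the correlation-plus-classifier analysis, the Python sketch below builds pair-wise correlation features from simulated module traces and decodes a binary behavioral state with a linear support vector machine; the data shapes, window length, module count and helper correlation_features are assumptions, not the authors' implementation.

# Minimal sketch (assumptions, not the authors' code): classifying behavioral state
# (resting vs. walking) from pair-wise correlations between cortical modules.
# Assumes `activity` is (n_windows, n_samples, n_modules) windowed dF/F traces and
# `labels` holds one behavioral state per window.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def correlation_features(activity):
    """Upper-triangle pairwise correlations of module activity, one vector per window."""
    n_windows, _, n_modules = activity.shape
    iu = np.triu_indices(n_modules, k=1)
    return np.array([np.corrcoef(w.T)[iu] for w in activity])

# Example with simulated data: 200 windows, 100 samples each, 8 cortical modules
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 200)                        # 0 = resting, 1 = walking
activity = rng.normal(0, 1, (200, 100, 8))
# A shared drive couples modules 0-3 in the 'walking' windows
activity[labels == 1, :, :4] += rng.normal(0, 1, (int((labels == 1).sum()), 100, 1))
X = correlation_features(activity)
clf = SVC(kernel="linear")
print(cross_val_score(clf, X, labels, cv=5).mean())     # cross-validated decoding accuracy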