Mechanism of Brain and Mind
Abstracts for winter workshop 2026

Special Session

Srinivas C Turaga (HHMI Janelia Research Campus)
[Website]
TBD

TBD


Xiaoyin Chen (Allen Institute)
[Website]
TBD

TBD


Gisella Vetere (ESPCI Paris)
[Website]
TBD

TBD


Topic Session

Michael Breakspear (The University of Newcastle)
[Website]
TBD

TBD


Ai Koizumi (Sony Computer Science Laboratories, Inc.)
[Website]
TBD

TBD


Gerald Pao (Okinawa Institute of Science and Technology)
[Website]
TBD

TBD


Shogo Ohmae (Chinese Institute for Brain Research, Beijing (CIBR, Beijing))
[Website]
Brain–AI Convergence: Predictive World Models as a Basis for Multifunctionality

The neocortex and cerebellum are involved in diverse cognitive functions including language, despite exhibiting remarkably homogeneous circuit architectures across functional domains. This suggests that the brain’s multifunctionality may be realized through learning-driven differentiation of functions and internal representations. Interestingly, recent general-purpose AI has also shown that a single architecture can learn to perform a wide range of tasks. From the perspective of brain–AI parallels and their convergent evolution, we investigated the computational principles underlying the brain’s multifunctionality.

First, at the functional level, we constructed an artificial neural circuit reflecting the biological features of the cerebellum and found that when trained on next-word prediction (a known cerebellar function), the circuit spontaneously acquired syntactic processing, a distinct cerebellar function. This parallels how language AI develops advanced language understanding from next-word prediction. Second, at the internal representation level, we investigated whether representations analogous to AI’s seq2vec (i.e., compressing sequence information into a single vector) exist in the brain. We found that cerebellar granule-cell population activity carried sufficient information to decode motor event sequences with high accuracy, suggesting the presence of seq2vec-like sequence representations. Furthermore, simulations with the cerebellar artificial neural circuit demonstrated that such sequence representations can be formed by next-event prediction learning alone. Third, at the computational theory level, our cross-domain brain–AI comparison points to a shared scheme of predictive-world-model-based multifunctionality (prediction, abstraction, and generation) in the neocortex, the cerebellum, and AI.
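The seq2vec idea above can be illustrated with a small toy model. The sketch below is an assumption-laden stand-in, not the authors' cerebellar circuit: it uses a fixed random recurrent network (a reservoir) rather than one trained by next-event prediction, and all sizes and parameters are invented for illustration. It shows only the core point that a single recurrent state vector can retain decodable information about earlier events in a sequence.

```python
# Toy "seq2vec" sketch (illustrative only, not the model from the talk):
# a fixed random recurrent network compresses an event sequence into its
# final hidden state, and a ridge-regression readout decodes a past event
# from that single vector.
import numpy as np

rng = np.random.default_rng(0)

K = 4            # number of distinct events (alphabet size) -- assumed
T = 8            # sequence length -- assumed
H = 200          # hidden units in the recurrent network -- assumed
N = 1000         # number of sequences
DECODE_POS = 3   # decode the event that occurred at this past time step

# Fixed random recurrent weights, rescaled to spectral radius 0.9 so the
# network has fading memory of past inputs.
W = rng.standard_normal((H, H))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = 0.5 * rng.standard_normal((H, K))

# Random event sequences, one-hot encoded: shape (N, T, K).
seqs = rng.integers(0, K, size=(N, T))
onehot = np.eye(K)[seqs]

# Run each sequence through the network; keep only the final state.
h = np.zeros((N, H))
for t in range(T):
    h = np.tanh(onehot[:, t] @ W_in.T + h @ W.T)
final_states = h                      # (N, H): one vector per sequence

# Linear readout from the final state to the one-hot event that occurred
# at DECODE_POS, i.e. several steps before the sequence ended.
X_tr, X_te = final_states[:800], final_states[800:]
Y_tr = np.eye(K)[seqs[:800, DECODE_POS]]
beta = np.linalg.solve(X_tr.T @ X_tr + 1e-2 * np.eye(H), X_tr.T @ Y_tr)

pred = np.argmax(X_te @ beta, axis=1)
test_acc = np.mean(pred == seqs[800:, DECODE_POS])
print(f"decoding accuracy: {test_acc:.2f} (chance = {1/K:.2f})")
```

Decoding accuracy well above chance (0.25) indicates that the final state alone carries information about an event several steps in the past, which is the sense in which a recurrent state acts as a seq2vec-like summary vector.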

Together, these results suggest that biological evolution of the brain and engineering optimization of AI have converged on similar predictive-world-model-based computational principles, providing insights into the essence of brain intelligence.


Chie Hieida (Nara Institute of Science and Technology)
[Website]
TBD

TBD


Hideaki Shimazaki (Kyoto University)
[Website]
TBD

TBD