Mateo Vélez-Fort,1,3 Lee Cossell,1,3 Laura Porta,1 Claudia Clopath,1,2 and Troy W. Margrie1,4,*
1 Sainsbury Wellcome Centre, University College London, London, UK
2 Bioengineering Department, Imperial College London, London, UK
3 These authors contributed equally
4 Lead contact
*Correspondence: email address protected from spambots; enable JavaScript to view it.
https://doi.org/10.1016/j.cell.2025.01.032
SUMMARY
Knowing whether we are moving or whether something in the world is moving around us is possibly the most critical sensory discrimination we perform. How the brain, and in particular the visual system, solves this motion-source separation problem is not known. Here, we find that motor, vestibular, and visual motion signals are used by the mouse primary visual cortex (VISp) to differentially represent the same visual flow information according to whether the head is stationary or experiencing passive versus active translation. During locomotion, we find that running suppresses running-congruent translation input and that translation signals dominate VISp activity when running and translation speed become incongruent. This cross-modal interaction between the motor and vestibular systems was found throughout the cortex, indicating that running and translation signals provide a brain-wide egocentric reference frame for computing the internally generated and actual speed of self when moving through and sensing the external world.
Ugne Klibaite,1,4,* Tianqing Li,2,4 Diego Aldarondo,1,3 Jumana F. Akoad,1 Bence P. Ölveczky,1,* and Timothy W. Dunn2,5,*
1 Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA
2 Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
3 Present address: Fauna Robotics, New York, NY 10003, USA
4 These authors contributed equally
5 Lead contact
*Correspondence: email addresses protected from spambots (U.K., B.P.Ö., T.W.D.); enable JavaScript to view them.
https://doi.org/10.1016/j.cell.2025.01.044
Social interaction is integral to animal behavior. However, a lack of tools to describe it in quantitative and rigorous ways has limited our understanding of its structure, underlying principles, and the neuropsychiatric disorders, like autism, that perturb it. Here, we present a technique for high-resolution 3D tracking of postural dynamics and social touch in freely interacting animals, solving the challenging subject occlusion and part-assignment problems using 3D geometric reasoning, graph neural networks, and semi-supervised learning. We collected over 110 million 3D pose samples in interacting rats and mice, including seven monogenic autism rat lines. Using a multi-scale embedding approach, we identified a rich landscape of stereotyped actions, interactions, synchrony, and body contacts. This high-resolution phenotyping revealed a spectrum of changes in autism models and in response to amphetamine not resolved by conventional measurements. Our framework and large library of interactions will facilitate studies of social behaviors and their neurobiological underpinnings.
Since its founding, education has been one of Wound World's main areas of focus. Improving wound management and wound-care education is a key factor in achieving the association's core goal of advancing and developing wound management. Educational activities are coordinated by the Education Committee.
Examples of the association's education and training activities...
Health, compassion, integrity, pragmatism, pioneering spirit, innovation
Let there be no incurable wound in the world ("Chinomise, no incurable wound in the world").
The Wound World platform ecosystem, guided by the vision of "caring for all wound patients," connects, integrates, and expands online and offline resources for managing chronic wounds. It promotes remote, local, and at-home chronic-wound management, addressing three problems: how wound specialists can create value from fragmented time, how their clinical expertise can be rapidly replicated and shared, and how patients can manage chronic wounds nearby, at home, and at low cost.
2019 Annual Meeting of the Wound Management Branch of the Guangdong Medical Industry Association