EAGER: Volition Based Anticipatory Control for Time-Critical Brain-Prosthetic Interaction
Basic Information
- Award Number: 1550397
- Principal Investigator:
- Amount: $178,800
- Institution:
- Institution Country: United States
- Program Type: Standard Grant
- Fiscal Year: 2015
- Funding Country: United States
- Period: 2015-08-15 to 2017-07-31
- Status: Completed
- Source:
- Keywords:
Project Abstract
This exploratory project focuses on developing algorithms that will allow the PI's previously implemented prototype drumming prosthesis, which was developed in an effort to help an injured teen, to anticipate human physical actions based on an analysis of EEG signals so that it can respond mechanically in a timely manner. The goal is to enable the enhanced prosthesis to detect volition, the cognitive process by which an individual decides on and commits to a particular course of action hundreds of milliseconds before the action actually takes place, in order to foresee the drummer's actions and achieve sub-second synchronization between artificial and biological limbs, thereby leading to improved performance in a time-sensitive domain where asynchronies of more than a few milliseconds are noticeable to listeners. Project outcomes will include cognitive models and technical approaches of great value for improving efficiency and fluency in a wide range of human-robot and human-prosthesis interaction scenarios, from construction tasks where humans and robots collaborate to achieve common goals, to time-critical tasks such as those in hospital operating rooms or space stations where humans operate artificial robotic limbs. The work will also lead to the creation of a volition trials database that will be documented and shared with the broad community of brain scholars and brain-machine interface researchers. The project will have additional broad impact by supporting students in the Robotic Musicianship group at Georgia Tech as it transitions from its previous focus on robotic musicianship into the fields of prosthetics and human augmentation.

Prior studies of volition have shown that across multiple repetitions of (real or imagined) motor activity one can derive the Event-Related Potential (ERP) associated with the intent to move the hand, up to a few seconds prior to the generation of the movement.
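The idea of deriving an ERP across repeated trials can be sketched as follows. This is an illustrative example, not the project's actual pipeline: the sampling rate, trial count, and the shape of the synthetic readiness-potential ramp are all assumptions. Averaging EEG epochs time-locked to movement onset cancels trial-to-trial noise and leaves the slow pre-movement drift.

```python
import numpy as np

# Illustrative sketch: derive an ERP by averaging EEG epochs
# time-locked to movement onset. All parameters are assumptions.
rng = np.random.default_rng(0)
fs = 250                       # sampling rate, Hz (assumed)
n_trials, n_samples = 200, fs  # 200 trials, 1 s window ending at onset
t = np.linspace(-1.0, 0.0, n_samples)  # seconds relative to movement onset

# Synthetic data: a readiness-potential-like negative ramp buried in noise.
ramp = -5e-6 * np.clip(t + 0.8, 0.0, None) / 0.8   # reaches -5 uV at onset
epochs = ramp + 1e-5 * rng.standard_normal((n_trials, n_samples))

# Averaging across trials suppresses noise by ~1/sqrt(n_trials),
# revealing the pre-movement drift that single trials hide.
erp = epochs.mean(axis=0)
print(f"drift toward onset: {(erp[-1] - erp[0]) * 1e6:.1f} uV")
```

The same averaging logic underlies the volition patterns described above; the project's challenge is detecting such activity on a single trial, where averaging is not available.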
Additionally, studies of mirror neurons have shown that observing a motor activity can trigger the same sets of cells in the brain that fire when the subject performs the action itself. In this project the PI will build on such findings to develop new pattern-recognition algorithms for EEG signal analysis in an effort to identify volition, and to design new anticipatory algorithms for brain-machine interfaces that reduce latency and allow for synchronization at the millisecond level.

The work will be carried out in stages. The PI will first collect EEG data from a large number of experimental trials in which participants engage in a voluntary motor action. The data will be studied to detect patterns indicative of volitional activity from electrodes monitoring both the motor and pre-motor cortices (SMA and pre-SMA), and also to isolate the neural correlates of imagined vs. real movement. A variety of general-purpose machine learning classifiers, as well as music-focused feature extraction techniques, will be used to distinguish between anticipatory patterns of activity preceding an action (volition) and patterns generated when the action is actually performed. As part of the analysis the PI will attempt to characterize the delta times between volition and action under different conditions, and he will develop a repeatability/reliability matrix to be utilized for synchronization in the next stage of the work, in which the PI will develop a "latency compensation engine" that generates robotic drum hits at the exact anticipated action time, compensating for mechanical latencies while taking into account the projected delta time between volition and action. Multi-modal integration with data from other sensors (EMG, microphones, proximity, etc.) will be exploited to correct errors in detection and classification.
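The timing arithmetic behind such a latency compensation engine can be sketched in a few lines. The function name and the numeric values below are assumptions for illustration, not the project's actual engine: once volition is detected, the anticipated strike time is the detection time plus the expected volition-to-action delta, and the robot must be commanded early enough to absorb its own mechanical latency.

```python
# Illustrative timing sketch for a latency compensation engine.
# Names and numbers are assumptions, not the project's actual design.

def actuation_time(volition_t: float,
                   volition_to_action_s: float,
                   mechanical_latency_s: float) -> float:
    """Return when to command the robotic drum stick so the robotic
    hit lands at the anticipated human action time."""
    anticipated_hit = volition_t + volition_to_action_s
    return anticipated_hit - mechanical_latency_s

# Example: volition detected at t = 10.000 s; the action is expected
# ~300 ms later; the arm needs ~40 ms to move. Commanding the arm at
# t = 10.260 s makes both strikes land together at t = 10.300 s.
cmd_t = actuation_time(10.000, 0.300, 0.040)
print(f"command at {cmd_t:.3f} s, hit at {cmd_t + 0.040:.3f} s")
```

In practice the volition-to-action delta would come from the repeatability/reliability matrix described above, with its uncertainty determining how aggressively the engine can anticipate.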
Finally, success of the new algorithms will be evaluated using both objective and subjective measures by having the amputee drummer perform a series of musical tasks with the robotic arm.
Project Outcomes
- Journal articles: 0
- Monographs: 0
- Research awards: 0
- Conference papers: 0
- Patents: 0
Other Publications by Gil Weinberg

Synchronization in human-robot Musicianship
- DOI: 10.1109/roman.2010.5598690
- Year: 2010
- Authors: Guy Hoffman; Gil Weinberg
- Corresponding author: Gil Weinberg

Robotic Musicianship - Musical Interactions Between Humans and Machines
- DOI: 10.5772/5206
- Year: 2007
- Authors: Gil Weinberg
- Corresponding author: Gil Weinberg

Visual cues-based anticipation for percussionist-robot interaction
- DOI: 10.1145/2157689.2157713
- Year: 2012
- Authors: Marcelo Cicconet; Mason Bretan; Gil Weinberg
- Corresponding author: Gil Weinberg

The embroidered musical ball: a squeezable instrument for expressive performance
- DOI:
- Year: 2000
- Authors: Gil Weinberg; Maggie Orth; Peter Russo
- Corresponding author: Peter Russo

Emotional musical prosody for the enhancement of trust: Audio design for robotic arm communication
- DOI: 10.1515/pjbr-2021-0033
- Year: 2021
- Authors: Richard J. Savery; Lisa Zahray; Gil Weinberg
- Corresponding author: Gil Weinberg
Other Grants by Gil Weinberg

Data Driven Predictive Auditory Cues for Safety and Fluency in Human-Robot Interaction
- Award Number: 2240525
- Fiscal Year: 2023
- Amount: $178,800
- Program Type: Standard Grant

NRI: FND: Creating Trust Between Groups of Humans and Robots Using a Novel Music Driven Robotic Emotion Generator
- Award Number: 1925178
- Fiscal Year: 2019
- Amount: $178,800
- Program Type: Standard Grant

I-Corps: Dexterous Robotic Prosthetic Control Using Deep Learning Pattern Prediction from Ultrasound Signal
- Award Number: 1744192
- Fiscal Year: 2017
- Amount: $178,800
- Program Type: Standard Grant

EAGER: Sub-second human-robot synchronization
- Award Number: 1345006
- Fiscal Year: 2013
- Amount: $178,800
- Program Type: Standard Grant

HCC: Small: Multi Modal Music Intelligence for Robotic Musicianship
- Award Number: 1017169
- Fiscal Year: 2010
- Amount: $178,800
- Program Type: Standard Grant

HRI: The Robotic Musician - Facilitating Novel Musical Experiences and Outcomes through Human Robot Interaction
- Award Number: 0713269
- Fiscal Year: 2007
- Amount: $178,800
- Program Type: Standard Grant
Similar NSFC Grants

Parsing Human Volition: Dissociating the Facilitating Effects of Voluntary Action and Control Beliefs on Cognitive Performance
- Award Number: 32300883
- Year Approved: 2023
- Amount: ¥300,000
- Program Type: Young Scientists Fund

A Historical Study of the Relationship between Municipal Administrative Will and Urban Planning: The Case of Modern Shenyang
- Award Number: 51908483
- Year Approved: 2019
- Amount: ¥210,000
- Program Type: Young Scientists Fund
Similar Overseas Grants

Examining the electroencephalographic fingerprint of default mode network hyperconnectivity for scalable and personalized neurofeedback in schizophrenia
- Award Number: 10509002
- Fiscal Year: 2022
- Amount: $178,800
- Program Type:

Design and Model-Based Safety Verification of a Volitional Sit-Stand Controller for a Powered Knee-Ankle Prosthesis
- Award Number: 10388466
- Fiscal Year: 2022
- Amount: $178,800
- Program Type:

Influence of task complexity and sensory feedback on cortical control of grasp force
- Award Number: 10289762
- Fiscal Year: 2021
- Amount: $178,800
- Program Type:

Elucidation of the Process of Arendt's Thought Formation Based on a Close Examination of the Complete Works
- Award Number: 21K00100
- Fiscal Year: 2021
- Amount: $178,800
- Program Type: Grant-in-Aid for Scientific Research (C)

Influence of task complexity and sensory feedback on cortical control of grasp force
- Award Number: 10480085
- Fiscal Year: 2021
- Amount: $178,800
- Program Type: