Task-aware and Autonomous Robotic C-arm Servoing for Fluoroscopy-guided Interventions
Basic Information
- Approval number: 10375489
- Principal investigator:
- Amount: $237,900
- Host institution:
- Institution country: United States
- Project category:
- Fiscal year: 2020
- Funding country: United States
- Period: 2020-04-15 to 2023-12-31
- Project status: Concluded
- Source:
- Keywords: 3-Dimensional, Age-Years, Algorithms, Anatomy, Anterior, Area, Artificial Intelligence, Assessment tool, Automobile Driving, Award, Awareness, Back, Bladder, Cadaver, Caring, Clinical, Clinical Data, Compression Fracture, Computer software, Computers, Custom, Data, Data Set, Decision Making, Development, Discipline, Environment, Exhibits, Expert Systems, Fluoroscopy, Fracture, Fracture Fixation, Funding, Generations, Goals, Image, Incidence, Injury, Intervention, Label, Learning, Length, Machine Learning, Manuals, Medical, Medical Imaging, Modality, Modernization, Morbidity - disease rate, Operative Surgical Procedures, Optics, Orthopedics, Patients, Pelvis, Physics, Population, Positioning Attribute, Probability, Procedures, Process, Psychological reinforcement, Radiation Dose Unit, Risk, Robotics, Roentgen Rays, Scientist, Specimen, Structure, Surgeon, System, Testing, Time, Training, Trauma, United States, United States National Institutes of Health, Variant, Vertebral column, Width, Work, X-Ray Medical Imaging, active vision, adverse outcome, algorithm training, arm, base, bone, convolutional neural network, deep learning algorithm, deep reinforcement learning, femoral artery, imaging modality, imaging system, improved outcome, in silico, innovation, learning algorithm, mortality, multitask, novel strategies, pre-clinical, sample fixation, simulation, spine bone structure, structured data, success, tool, trauma surgery
Project Summary
Fluoroscopy guidance using C-arm X-ray systems is used in more than 17 million procedures across the US
and constitutes the standard of care for various percutaneous procedures, including internal fixation of pelvic ring
injuries. To infer procedural progress from 2D radiographs, well-defined views onto anatomy must be achieved
and restored multiple times during surgery. This process, known as "fluoro hunting", is associated with 4.7 s
of excessive fluoroscopy time per C-arm position (cf. 120 s total per fixation), yielding radiographs that are never
interpreted clinically, but drastically increasing procedure time and radiation dose to patient and surgical staff.
Our long-term project goal is to use concepts from machine learning and active vision to develop task-aware
algorithms for autonomous robotic C-arm servoing that interpret intra-operative radiographs and autonomously
adjust the C-arm pose to acquire fluoroscopic images that are optimal for inference. We have three specific aims:
1) Detecting unfavorable K-wire trajectories from monoplane fluoroscopy images: We will extend a physics-based
simulation framework for fluoroscopy from CT that enables fast generation of structured and realistic radiographs
documenting procedural progress. Based on this data, we will train a state-of-the-art convolutional neural
network that interprets fluoroscopic images to infer procedural progress. 2) Developing and validating a task-aware
imaging system in silico: Using the autonomous interpretation tools and simulation pipeline available through
Aim 1, we will train an artificial agent based on reinforcement learning and active vision. This agent will be
capable of analyzing intra-operative fluoroscopic images to autonomously adjust the C-arm pose to yield
task-optimal views onto anatomy. 3) Demonstrating feasibility of our task-aware imaging concept ex vivo: Our third aim
will establish task-aware C-arm imaging in controlled clinical environments. We will attempt internal fixation
of anterior pelvic ring fractures, and our task-aware artificial agent will interpret intra-operatively acquired
radiographs to infer procedural progress and suggest optimal C-arm poses that will be realized manually with an
optically-tracked mobile C-arm system.
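At its core, Aim 1's physics-based fluoroscopy simulation from CT amounts to computing line integrals of X-ray attenuation through the volume and applying the Beer-Lambert law. A minimal sketch in Python; the function name, the HU-to-attenuation conversion constant, and the parallel-ray, 90-degree-only geometry are simplifying assumptions for illustration, not the project's actual simulator, which resamples the volume at arbitrary C-arm poses:

```python
import numpy as np

def simulate_drr(ct_volume, angle_deg, mu_water=0.2):
    """Render a crude digitally reconstructed radiograph (DRR):
    rotate the CT volume about its first axis, then integrate
    attenuation along parallel rays (here: along axis 1)."""
    # Convert Hounsfield units to linear attenuation coefficients.
    mu = mu_water * (1.0 + ct_volume / 1000.0)
    mu = np.clip(mu, 0.0, None)
    # Rotate by multiples of 90 degrees only, to avoid interpolation
    # in this sketch; a real simulator handles arbitrary poses.
    k = int(round(angle_deg / 90.0)) % 4
    mu = np.rot90(mu, k=k, axes=(1, 2))
    # Beer-Lambert law: I = I0 * exp(-integral of mu along the ray).
    line_integrals = mu.sum(axis=1)
    return np.exp(-line_integrals)

ct = np.zeros((4, 8, 8))      # tiny synthetic "CT" in HU
ct[:, 2:6, 3:5] = 1000.0      # a dense block standing in for bone
drr = simulate_drr(ct, angle_deg=0)
print(drr.shape)              # (4, 8)
```

Pixels whose rays cross the dense block come out darker (more attenuated) than background pixels, which is the signal a downstream CNN would learn to interpret.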
This work combines the expertise of a computer scientist, a surgical robotics expert, and an orthopedic
trauma surgeon to explore the untapped, understudied area of autonomous imaging enabled by advances in
machine learning in fluoroscopy-guided procedures. This development has only recently been made feasible
by innovations in fast fluoroscopy simulation from CT to provide structured data for training that is sufficiently
realistic to warrant generalization to clinical data. With support from the NIH Trailblazer Award, our team
will be the first to investigate autonomous and task-aware C-arm imaging systems, paving the way for a new
paradigm in medical image acquisition, which will directly benefit millions of patients by task-oriented image
acquisition on a patient-specific basis. Subsequent R01 funding will customize this concept to other high-volume
procedures, such as vertebroplasty.
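The task-aware servoing agent of Aims 2 and 3 can be caricatured as a reinforcement-learning loop over C-arm poses: observe the current view, estimate how far it is from the task-optimal one, and nudge the pose accordingly. The following toy tabular Q-learning stand-in illustrates only that loop; the one-dimensional angle state, the hand-coded reward, and all names are illustrative assumptions, whereas the project itself learns a deep policy directly from fluoroscopic images:

```python
import random

ACTIONS = [-5, 0, 5]   # orbital rotation per step, in degrees
TARGET = 30            # assumed task-optimal view angle
LO, HI = 0, 60         # reachable orbital range

def reward(angle):
    # Proxy for view quality; a real agent would score the image itself.
    return -abs(angle - TARGET)

def train_q_table(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = {}             # (angle, action) -> estimated return
    for _ in range(episodes):
        angle = rng.choice(range(LO, HI + 1, 5))
        for _ in range(20):
            if rng.random() < eps:                  # explore
                a = rng.choice(ACTIONS)
            else:                                   # exploit
                a = max(ACTIONS, key=lambda x: q.get((angle, x), 0.0))
            nxt = min(max(angle + a, LO), HI)
            best_next = max(q.get((nxt, x), 0.0) for x in ACTIONS)
            old = q.get((angle, a), 0.0)
            # Standard Q-learning update on this deterministic toy MDP.
            q[(angle, a)] = old + alpha * (reward(nxt) + gamma * best_next - old)
            angle = nxt
    return q

def servo(angle, q, steps=10):
    """Greedy roll-out: repeatedly apply the best learned pose adjustment."""
    for _ in range(steps):
        a = max(ACTIONS, key=lambda x: q.get((angle, x), 0.0))
        angle = min(max(angle + a, LO), HI)
    return angle

q = train_q_table()
print(servo(0, q))     # greedy roll-out drives the pose toward TARGET
```

Optimistically initializing unseen state-action values to 0 (all true returns are negative) forces the greedy policy to try every action at least once, which is why this tiny table converges reliably.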
Project Outcomes
- Journal articles: 0
- Monographs: 0
- Research awards: 0
- Conference papers: 0
- Patents: 0
Other publications by Mathias Unberath
Similar Overseas Grants

Brain metabolism across the lifespan using multi-parametric MRS
- Approval number: 10738647
- Fiscal year: 2023
- Amount: $237,900
- Project category:

Development of a commercially viable machine learning product to automatically detect rotator cuff muscle pathology
- Approval number: 10268004
- Fiscal year: 2021
- Amount: $237,900
- Project category:

Development of a commercially viable machine learning product to automatically detect rotator cuff muscle pathology
- Approval number: 10495191
- Fiscal year: 2021
- Amount: $237,900
- Project category:

Early Detection of Vascular Dysfunction Using Biomarkers from Lagrangian Carotid Strain Imaging
- Approval number: 10442390
- Fiscal year: 2020
- Amount: $237,900
- Project category:

The Morphology and Characteristics of Hallux Rigidus
- Approval number: 9892891
- Fiscal year: 2020
- Amount: $237,900
- Project category: