Extending the explainability of machine learning models in policy decision making
Basic Information
- Approval number: 2887425
- Principal investigator:
- Amount: --
- Host institution:
- Host institution country: United Kingdom
- Project type: Studentship
- Fiscal year: 2023
- Funding country: United Kingdom
- Duration: 2023 to (no data)
- Project status: ongoing
- Source:
- Keywords:
Project Summary
Governments and policy makers are increasingly using machine learning (ML) to support decision-making. The performance of ML algorithms generally improves with increasing model complexity, which makes it harder for end-users to interrogate the model output. A lack of transparency and accountability in how a model is structured, and in the key variables underpinning its predictions, can lead to mistrust. Concerns have been voiced about the rapidly expanding use of ML in critical decisions, especially for policies affecting marginalised sectors and communities. While the field of explainable ML has expanded in recent years, existing methods are often "general-purpose" and fail to capture the specific needs of real-world end-users. As such, the effectiveness of existing explainable ML approaches remains unclear without an understanding of the relevant domain knowledge and of users' specific requirements and goals. Understanding these specific needs is particularly pressing in policy decision-making, since policy emerges as a compromise between people pursuing different goals. This PhD project aims to bridge that gap by developing a novel process and framework to ensure that ML models can be better understood, and therefore more readily adopted, by policy makers. We focus on two aspects. First, we approach the problem by treating ML as a decision-support tool rather than a predictive tool. To do this, when developing the ML models we explicitly elicit the views of the decision-maker and ensure those views are formally captured in the models in a way decision-makers can understand, modify and interrogate. Second, we use visual tools commonly deployed for wicked problems, such as causal maps, to capture the overarching ML process for a decision-maker, so that the overall process is better understood and explained. By combining these two approaches, one quantitative and one qualitative, meaningful progress will be made towards a framework for explainable ML.
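As a concrete illustration of the "general-purpose" explainability methods the summary contrasts itself with, the sketch below shows permutation importance: a decision-maker shuffles one variable at a time across cases and observes how much the model's predictions move. Everything here is hypothetical — the toy linear scoring model, the variable names and the weights are illustrative assumptions, not part of the project described above.

```python
import random

# Hypothetical toy model: a linear scoring rule standing in for any trained
# ML model used in policy decision support (higher score = higher predicted
# need for intervention). Weights and variable names are illustrative only.
WEIGHTS = {"unemployment_rate": 0.6, "median_income": -0.3, "service_access": -0.1}

def predict(case):
    """Score a single policy case with the toy linear model."""
    return sum(WEIGHTS[k] * case[k] for k in WEIGHTS)

def permutation_importance(cases, n_shuffles=100, seed=0):
    """Estimate each variable's influence on the model output by shuffling
    its values across cases and averaging the absolute change in predictions."""
    rng = random.Random(seed)
    base = [predict(c) for c in cases]
    importance = {}
    for feat in WEIGHTS:
        total = 0.0
        for _ in range(n_shuffles):
            values = [c[feat] for c in cases]
            rng.shuffle(values)  # break the link between this variable and the cases
            for c, v, b in zip(cases, values, base):
                shuffled = dict(c, **{feat: v})
                total += abs(predict(shuffled) - b)
        importance[feat] = total / (n_shuffles * len(cases))
    return importance

# Three illustrative cases (all values invented for the example).
cases = [
    {"unemployment_rate": 8.0, "median_income": 26.0, "service_access": 3.0},
    {"unemployment_rate": 4.5, "median_income": 29.0, "service_access": 7.0},
    {"unemployment_rate": 6.2, "median_income": 27.5, "service_access": 5.0},
]
scores = permutation_importance(cases)
```

A ranking like this tells a decision-maker *which* variables drive the output, but not *why* in their own terms; the project summary argues that closing that second gap requires eliciting the decision-maker's views directly and mapping the overall process, e.g. with causal maps, rather than relying on generic attributions alone.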
Project Outcomes
- Journal articles: 0
- Monographs: 0
- Research awards: 0
- Conference papers: 0
- Patents: 0
Other Publications

Products Review
- DOI: 10.1177/216507996201000701
- Published: 1962-07
- Impact factor: 2.6

Farmers' adoption of digital technology and agricultural entrepreneurial willingness: Evidence from China
- DOI: 10.1016/j.techsoc.2023.102253
- Published: 2023-04
- Impact factor: 9.2

Digitization
- DOI: 10.1017/9781316987506.024
- Published: 2019-07
- Impact factor: 0

References
- DOI: 10.1002/9781119681069.refs
- Published: 2019-12
- Impact factor: 0

Putrescine Dihydrochloride
- DOI: 10.15227/orgsyn.036.0069
- Published: 1956-01-01
- Impact factor: 0
Other Grants
An implantable biosensor microsystem for real-time measurement of circulating biomarkers
- Approval number: 2901954
- Fiscal year: 2028
- Amount: --
- Project type: Studentship

Exploiting the polysaccharide breakdown capacity of the human gut microbiome to develop environmentally sustainable dishwashing solutions
- Approval number: 2896097
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

A Robot that Swims Through Granular Materials
- Approval number: 2780268
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Likelihood and impact of severe space weather events on the resilience of nuclear power and safeguards monitoring.
- Approval number: 2908918
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Proton, alpha and gamma irradiation assisted stress corrosion cracking: understanding the fuel-stainless steel interface
- Approval number: 2908693
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Field Assisted Sintering of Nuclear Fuel Simulants
- Approval number: 2908917
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Assessment of new fatigue capable titanium alloys for aerospace applications
- Approval number: 2879438
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Developing a 3D printed skin model using a Dextran - Collagen hydrogel to analyse the cellular and epigenetic effects of interleukin-17 inhibitors in
- Approval number: 2890513
- Fiscal year: 2027
- Amount: --
- Project type: Studentship

Understanding the interplay between the gut microbiome, behavior and urbanisation in wild birds
- Approval number: 2876993
- Fiscal year: 2027
- Amount: --
- Project type: Studentship
Similar NSFC Grants

Classification of herbaceous wetland plant communities based on sample transfer and explainable machine learning
- Approval number: 42301429
- Year approved: 2023
- Amount: CNY 300,000
- Project type: Young Scientists Fund

Explainable machine reading comprehension based on scene cognition
- Approval number: 62376144
- Year approved: 2023
- Amount: CNY 500,000
- Project type: General Program

Mechanisms of dynamic discharge variation in northern karst springs based on information flow theory and explainable machine learning
- Approval number: 42307088
- Year approved: 2023
- Amount: CNY 300,000
- Project type: Young Scientists Fund

Explainable machine learning models incorporating physical laws for predicting the performance of concrete members
- Approval number:
- Year approved: 2022
- Amount: CNY 340,000
- Project type: Regional Science Fund Program

Identification, classification and explainable statistical simulation of three-dimensional meteorological drought events based on machine learning
- Approval number: 42205191
- Year approved: 2022
- Amount: CNY 200,000
- Project type: Young Scientists Fund
Similar Overseas Grants

An explainability oriented approach to manage dependent supply chain risks
- Approval number: LP230100379
- Fiscal year: 2024
- Amount: --
- Project type: Linkage Projects

CAREER: Information-Theoretic Measures for Fairness and Explainability in High-Stakes Applications
- Approval number: 2340006
- Fiscal year: 2024
- Amount: --
- Project type: Continuing Grant

Hierarchical Sentiment Polarity Judgement and Explainability for Paragraphs and Sentences in Securities Reports
- Approval number: 23H03459
- Fiscal year: 2023
- Amount: --
- Project type: Grant-in-Aid for Scientific Research (B)

Extraction and Use of Highly Explainable and Transferable Indicators for AI in Education
- Approval number: 23H01001
- Fiscal year: 2023
- Amount: --
- Project type: Grant-in-Aid for Scientific Research (B)

Explaining automated test agents and their test results
- Approval number: 23K11062
- Fiscal year: 2023
- Amount: --
- Project type: Grant-in-Aid for Scientific Research (C)