{"id":5621,"date":"2024-12-30T18:29:18","date_gmt":"2024-12-30T18:29:18","guid":{"rendered":"https:\/\/xaiworldconference.com\/2025\/?page_id=5621"},"modified":"2024-12-30T18:32:15","modified_gmt":"2024-12-30T18:32:15","slug":"actionable-explainable-ai","status":"publish","type":"page","link":"https:\/\/xaiworldconference.com\/2025\/actionable-explainable-ai\/","title":{"rendered":"Actionable Explainable AI"},"content":{"rendered":"\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-1024x1024.png\" alt=\"\" class=\"wp-image-5628\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-1024x1024.png 1024w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-300x300.png 300w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-150x150.png 150w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-768x768.png 768w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-1536x1536.png 1536w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-2048x2048.png 2048w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/XAI-2025-actionable-explainable-ai-470x470.png 470w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<p 
style=\"font-size:14px\">Following the success of Explainable AI in generating faithful and understandable explanations of complex ML models, there has been increasing attention on how the outcomes of Explainable AI can be systematically used to enable meaningful actions. These considerations are studied within the subfield of Actionable XAI. In particular, research questions relevant to this subfield include (1) what types of explanations are most helpful in enabling human experts to achieve more efficient and accurate decision-making, (2) how one can systematically improve the robustness and generalization ability of ML models or align them with human decision making and norms based on human feedback on explanations, (3) how to enable meaningful actioning of real-world systems via interpretable ML-based digital twins, and (4) how to evaluate and improve the quality of actions derived from XAI in an objective and reproducible manner. This special track will address both the technical and practical aspects of Actionable XAI. This includes the question of how to build highly informative explanations that form the basis for actionability, aiming for solutions that are interoperable with existing explanation techniques such as Shapley values, LRP or counterfactuals, and existing ML models. This special track will also cover the exploration of real-world use cases where these actions lead to improved outcomes.<\/p>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<ul class=\"wp-block-list\">\n<li style=\"font-size:14px\">Structured explanation techniques (e.g. higher-order, hierarchical) designed for actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Multifaceted explanation techniques (e.g. 
disentangled or concept-based) designed for actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Explanation techniques based on optimization in input space (e.g. counterfactuals or prototypes) designed for actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Hybrid methods combining multiple explanation paradigms to improve actionability further<\/li>\n\n\n\n<li style=\"font-size:14px\">Attribution or attention-based techniques for helping users take meaningful actions in data-rich environments<\/li>\n\n\n\n<li style=\"font-size:14px\">Shapley-, LRP-, or attention-based XAI techniques for retrieving relevant features from gigapixel images<\/li>\n\n\n\n<li style=\"font-size:14px\">Explanation-guided dimensionality reduction to facilitate taking action under high-throughput data or real-time constraints<\/li>\n\n\n\n<li style=\"font-size:14px\">XAI-based techniques for aligning model decision-making with ground truth provided by human annotators<\/li>\n\n\n\n<li style=\"font-size:14px\">XAI methods, such as CAM and LRP, to support semantic segmentation from limited annotations<\/li>\n\n\n\n<li style=\"font-size:14px\">Techniques that leverage user explanatory feedback to produce an improved ML model<\/li>\n<\/ul>\n\n\n\n<p><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<ul class=\"wp-block-list\">\n<li style=\"font-size:14px\">Explanation-driven pruning or retraining to robustify an ML model against spurious correlations and dataset shifts<\/li>\n\n\n\n<li style=\"font-size:14px\">Counterfactual and attribution methods combined with digital twins to identify effective actions in real-world systems<\/li>\n\n\n\n<li style=\"font-size:14px\">Counterfactual and attribution methods combined with reinforcement learning to produce effective real-world control policies<\/li>\n\n\n\n<li style=\"font-size:14px\">Design of environments (e.g. 
simulated environments) for end-to-end evaluation of XAI actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Utility-based metrics (e.g. added-value in a deployed setting) for end-to-end evaluation of XAI actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Indirect metrics (e.g. explanation informativeness, action-response prediction accuracy) for component-wise evaluation of XAI actionability<\/li>\n\n\n\n<li style=\"font-size:14px\">Datasets (with simulated environments) for evaluating actions derived from XAI explanations in a reproducible manner<\/li>\n\n\n\n<li style=\"font-size:14px\">Application of actionable XAI in biomedicine, e.g. for acting on molecular pathways<\/li>\n\n\n\n<li style=\"font-size:14px\">XAI in clinical practice, e.g. for proposing targeted therapies<\/li>\n\n\n\n<li style=\"font-size:14px\">Application of actionable XAI in industry, e.g. for calibration in manufacturing processes<\/li>\n<\/ul>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"206\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-1024x206.png\" alt=\"\" class=\"wp-image-5622\" style=\"width:244px;height:auto\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-1024x206.png 1024w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-300x60.png 300w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-768x154.png 768w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-1536x308.png 1536w, 
https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-bifold-1-2048x411.png 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"400\" height=\"134\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-freie_universitat_berlin.png\" alt=\"\" class=\"wp-image-5623\" style=\"width:242px;height:auto\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-freie_universitat_berlin.png 400w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-freie_universitat_berlin-300x101.png 300w\" sizes=\"auto, (max-width: 400px) 100vw, 400px\" \/><\/figure>\n<\/div><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"342\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-technische_universitat_berlin-1.png\" alt=\"\" class=\"wp-image-5624\" style=\"width:213px;height:auto\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-technische_universitat_berlin-1.png 1000w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-technische_universitat_berlin-1-300x103.png 300w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-technische_universitat_berlin-1-768x263.png 768w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 
wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-1024x1024.png\" alt=\"\" class=\"wp-image-5625\" style=\"width:159px;height:auto\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-1024x1024.png 1024w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-300x300.png 300w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-150x150.png 150w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-768x768.png 768w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova-470x470.png 470w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universita_padova.png 1200w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\"><div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universite_de_rouen_normandie.jpg\" alt=\"\" class=\"wp-image-5626\" style=\"width:249px;height:auto\" srcset=\"https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universite_de_rouen_normandie.jpg 800w, https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universite_de_rouen_normandie-300x169.jpg 300w, 
https:\/\/xaiworldconference.com\/2025\/wp-content\/uploads\/2024\/12\/logo-universite_de_rouen_normandie-768x432.jpg 768w\" sizes=\"auto, (max-width: 800px) 100vw, 800px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Following the success of Explainable AI in generating faithful and understandable explanations of complex ML models, there has been increasing attention on how the outcomes of Explainable AI can be systematically used to enable meaningful actions. These considerations are studied within the subfield of Actionable XAI. In particular, research questions relevant to this subfield include &hellip; <\/p>\n","protected":false},"author":2,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_eb_attr":"","footnotes":""},"class_list":["post-5621","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/pages\/5621","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/comments?post=5621"}],"version-history":[{"count":3,"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/pages\/5621\/revisions"}],"predecessor-version":[{"id":5630,"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/pages\/5621\/revisions\/5630"}],"wp:attachment":[{"href":"https:\/\/xaiworldconference.com\/2025\/wp-json\/wp\/v2\/media?parent=5621"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}