Explainable AI Needs Formalization to Address Stakeholder Needs
Recent discussions in the field of Explainable AI (XAI) highlight the need for formalization to better address stakeholder needs. Current XAI methods have been criticized for their lack of robustness and consistency, often producing arbitrary explanations that can be manipulated. The fundamental limitation is the absence of formal specifications for XAI problems, which hinders the development of methods that are fit for their intended purposes. A proposed requirement-driven development process includes assessing stakeholder information needs, defining formal requirements, designing methods that meet those requirements, analyzing the methods theoretically, and validating them empirically. This approach aims to ensure that XAI methods systematically address common information needs, thereby enhancing their value for specific explanation goals and for machine learning quality control.