AI Competencies Self-Assessment Checklist

Accountability for AI-Caused Damage

An AI-based diabetic management system that adjusts insulin dosages in real time may
rely on insufficient or biased patient data, potentially failing to account for all influencing factors.

The affected patient might suffer complications from conditions unknown to the AI, and staff may fail to detect the issue until the patient’s condition worsens.


Accountability Measures:

1. Hospital/Doctor Accountability: The healthcare organization and individual clinicians must recognize that AI systems can misdiagnose and harm patients. They are responsible for continuously monitoring and updating the system to minimize such problems, being transparent about failures to patients, developers, and other stakeholders, and providing compensation under terms agreed before the AI system is adopted and used.

2. Developer Accountability: Developers must work continuously with the hospital and doctors to minimize errors, be transparent about the AI system’s limitations and potential risks, build in appropriate fail-safes, and mandate a human in the loop where necessary.

3. Legal/Regulatory Accountability: Healthcare regulations must govern the proper and safe use of AI medical systems, and must account for AI-caused errors and harms so that fines and compensation can be enforced.
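The fail-safe and human-in-the-loop ideas above can be sketched in code. This is a minimal illustrative sketch, not a real clinical system: the function name, thresholds, and dose limits are all hypothetical assumptions chosen for the example.

```python
# Hypothetical sketch of a dose fail-safe with human-in-the-loop escalation.
# All names and thresholds (review_dose, max_units, max_change) are
# illustrative assumptions, not part of any real insulin-dosing product.

from dataclasses import dataclass


@dataclass
class DoseDecision:
    units: float        # insulin units proposed by the AI model
    approved: bool      # True only if the dose passed all safety checks
    needs_human: bool   # True when a clinician must review before dosing


def review_dose(proposed_units: float,
                last_units: float,
                max_units: float = 10.0,
                max_change: float = 2.0) -> DoseDecision:
    """Apply hard safety bounds; escalate out-of-range doses to a clinician."""
    out_of_range = proposed_units < 0 or proposed_units > max_units
    large_jump = abs(proposed_units - last_units) > max_change
    if out_of_range or large_jump:
        # Fail safe: never auto-administer a suspicious dose;
        # require explicit human sign-off instead.
        return DoseDecision(units=proposed_units, approved=False, needs_human=True)
    return DoseDecision(units=proposed_units, approved=True, needs_human=False)
```

The design choice here is that the system fails closed: any dose outside absolute bounds, or any abrupt change from the previous dose, is blocked and routed to a human rather than administered automatically.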


AI-generated image created using Midjourney.com