Student Responsibilities in AI Development
1. Human Responsibility in Student AI Creations
Responsibility Declaration in Project Reports
- Whenever you create an AI project, clearly list any ethical concerns you see.
- Explain who is responsible if something goes wrong and how you plan to address it.
Compliance with Anti-Discrimination and Privacy Protection
- Make sure the data you use is fair and does not include biases (for example, excluding or favoring a certain group without good reason).
- Protect personal data by not using private information in your project unless you have clear and legal permission (a short sketch covering both of these points follows below).
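To make the two points above concrete, here is a minimal sketch in Python, assuming a pandas DataFrame; the column names "gender", "name", and "email" are only illustrative placeholders for whatever your own data contains, not a prescribed schema.

```python
# Minimal sketch, not a complete fairness or privacy solution.
# Assumes a pandas DataFrame; the column names "gender", "name", and
# "email" are illustrative placeholders for your own data.
import pandas as pd


def check_group_balance(df: pd.DataFrame, group_column: str) -> pd.Series:
    """Print and return the share of each group, so obvious over- or
    under-representation is easy to spot."""
    shares = df[group_column].value_counts(normalize=True)
    for group, share in shares.items():
        print(f"{group}: {share:.0%} of the dataset")
    return shares


def drop_personal_identifiers(df: pd.DataFrame, identifier_columns: list) -> pd.DataFrame:
    """Remove columns that directly identify a person before training or sharing."""
    present = [column for column in identifier_columns if column in df.columns]
    return df.drop(columns=present)


if __name__ == "__main__":
    # Tiny made-up dataset, only to show the flow.
    data = pd.DataFrame({
        "name": ["Ada", "Ben", "Cho", "Dee"],
        "email": ["a@example.org", "b@example.org", "c@example.org", "d@example.org"],
        "gender": ["F", "M", "F", "F"],
        "score": [0.9, 0.4, 0.7, 0.8],
    })
    check_group_balance(data, "gender")           # F is 75%, M is 25% in this toy data
    clean = drop_personal_identifiers(data, ["name", "email"])
    print(clean.columns.tolist())                 # ['gender', 'score']
```

A large imbalance does not automatically make a dataset unusable, but it is exactly the kind of concern to list and explain in your project report.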
2. Key Considerations in Student AI Design and Creation
Discuss Potential AI Misuse
- Spend time discussing how AI can be misused.
- Ask questions like: “How can developers, deployers, and users share responsibility for preventing AI misuse?”
- Encourage everyone to think about ethical responsibility and to ground the discussion in real-world examples.
Restricted Use of Sensitive Data
- Do not use sensitive data, such as health information or personal identifiers, unless it is legally and ethically allowed (a short sketch for spotting such fields follows this list).
- Seek advice from teachers or trusted adults if you are unsure about using certain types of information.
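For the point above about sensitive data, here is a minimal sketch that flags columns whose names suggest they might hold sensitive information, so you can review them with a teacher before using them. The keyword list is only an illustrative assumption, not a legal or complete definition of sensitive data.

```python
# Minimal sketch: flag column names that *might* hold sensitive data so they
# can be reviewed with a teacher before use. The keyword list below is
# illustrative only; it is not a legal or complete definition.
SENSITIVE_KEYWORDS = ["name", "email", "phone", "address", "health", "diagnosis", "birth"]


def flag_possibly_sensitive(columns):
    """Return the columns whose names contain a sensitive-looking keyword."""
    flagged = []
    for column in columns:
        lowered = column.lower()
        if any(keyword in lowered for keyword in SENSITIVE_KEYWORDS):
            flagged.append(column)
    return flagged


if __name__ == "__main__":
    columns = ["student_id", "home_address", "quiz_score", "health_notes"]
    for column in flag_possibly_sensitive(columns):
        print(f"Review before use: {column}")
    # Expected output: home_address and health_notes are flagged.
```

If a flagged column turns out to be something your project genuinely needs, that is exactly the situation where you should ask a teacher or trusted adult first, as noted above.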
Example Guideline on Sensitive Data
When using sensitive data, individuals and organizations must:
- Follow Legal Requirements: Obey data protection laws and only collect the information you truly need.
- Protect Privacy: Keep personal information safe and secure, minimizing any chance of it being leaked or misused.
- Promote Fairness and Accountability: Regularly check for biases in your data or your AI model, and correct them to avoid harming or discriminating against people (a simple check is sketched at the end of this section).
- Ensure Transparency: Be clear about why you are collecting data, how it will be used, and who can see it.
- Uphold Human Dignity: Treat all individuals with respect and equality, making sure AI tools do not harm or exclude anyone.
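As one concrete way to act on the fairness and accountability point above, here is a minimal sketch that compares how often a model predicts a positive outcome for each group. The group labels and predictions are made up for illustration, and the comparison is only a starting point, not a full fairness audit.

```python
# Minimal sketch of a fairness spot-check: compare the positive-prediction
# rate for each group. Group labels and predictions below are made up.
from collections import defaultdict


def positive_rate_by_group(groups, predictions):
    """Return the fraction of positive (1) predictions for each group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, prediction in zip(groups, predictions):
        totals[group] += 1
        positives[group] += int(prediction == 1)
    return {group: positives[group] / totals[group] for group in totals}


if __name__ == "__main__":
    groups      = ["A", "A", "A", "B", "B", "B", "B", "A"]
    predictions = [ 1,   1,   0,   0,   0,   1,   0,   1 ]
    rates = positive_rate_by_group(groups, predictions)
    for group, rate in sorted(rates.items()):
        print(f"Group {group}: positive rate {rate:.0%}")
    # A large gap between groups is a signal to look closer at your data and
    # model; it does not by itself prove discrimination.
    gap = max(rates.values()) - min(rates.values())
    print(f"Gap between highest and lowest rate: {gap:.0%}")
```

This check only looks at outcomes, not causes, so treat a large gap as a prompt for the deeper review the guideline asks for, and describe what you found and how you addressed it in your project report.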