Plan
The Plan section of the AI Toolkit underscores the importance of responsible AI practices, provides insight into Microsoft's approach to responsible AI, and suggests ways to get your institution and policies AI-ready.
Understand principles of responsible AI
It's important to consider responsible AI principles when implementing AI in education to ensure these technologies are used responsibly, safely, and in ways that enhance educational opportunities for students while preparing them for the future. Responsible AI principles can be used to help inform policies and usage guidelines adopted by districts, states, and ministries of education. They help ensure that AI is used in ways that are fair, transparent, and respect the privacy of students and staff. At the core of Microsoft's AI work are six key principles that define and guide responsible AI use.
Keep these principles in mind as you develop, introduce, and refine your AI program. Doing so helps you create a program that meets your instruction, technology, policy, and community needs and expectations.
Engage your community
Explore the Plan section for suggestions on connecting and communicating with your community. AI technologies are transforming education, presenting exciting opportunities for teaching and learning. However, implementing AI-driven tools requires thoughtful planning, clear communication, and collaboration with supportive communities. Stakeholders, including administrators, educators, students, parents, and community members, have diverse responsibilities and want to understand how AI tools will affect their daily experiences and future plans. This section addresses significant challenges and opportunities in involving your community in the adoption of AI-powered education tools.
Additionally, it offers practical advice and strategies for:
- Building trust and support for AI-powered tools in education
- Understanding and addressing your community's concerns
- Matching tools to your goals and needs
- Establishing a shared vision with your community
Understand and address your community's concerns
As you meet with different community members, you'll encounter a multitude of concerns, interests, and needs. Take advantage of this opportunity to cultivate empathy by actively listening to their worries, demonstrating your AI knowledge and leadership expertise, and inspiring their support for your AI initiative. The AI Toolkit provides your team with sample concerns that community members may have, along with suggested responses and supporting resources where available. Here are two examples of what you'll find in this section.
Role | Concern | Sample response |
---|---|---|
Education leaders | School leaders may have concerns about equity and accessibility when it comes to integrating AI. Schools want to ensure that AI tools are accessible for all students, including those with disabilities, and that the tools don't exacerbate existing inequalities. | We'll evaluate all AI tools to make sure they can be equitably accessed and used by students from various socio-economic backgrounds, different levels of technology access, and diverse learning needs. We aim to understand how AI helps us build a fairer educational landscape and remediate any issues, as exemplified by institutions like the University of Texas. |
Educators | Based on past experiences, educators may feel that new programs and initiatives are introduced, supported for a short time, and then forgotten. Some teachers are hesitant to adopt technology unless they're comfortable with their own skills and can support any questions or issues their students may encounter. | We're committed to making sure that you and your students know how to use AI tools responsibly. Our plan includes age-appropriate instructional materials, suggested conversation starters, guidance on modeling appropriate use, and taking an iterative approach to adapting policies. You can also refer to resources like Microsoft Learn's Equip your students with AI and tech skills for today—and tomorrow and Empower educators to explore the potential of artificial intelligence modules for self-paced learning. |
Practical steps for education leaders
One of the central challenges in adopting generative AI in education is helping leaders translate guidelines and frameworks into actionable steps. The AI Toolkit addresses this by providing practical assistance. Here's one example from the Toolkit that highlights what you can expect.
Need | Suggested actions | Resources |
---|---|---|
Revise policies to address generative AI | Leaders can use Copilot to review policy documentation like Acceptable Use Policies to incorporate language about the use of generative AI. | Rethinking Acceptable Use Policies in the Age of AI, District Administration |
Policy considerations
Establishing policies creates structure and guidelines for your faculty, staff, students, and community. The AI Toolkit helps you get started. Review the entire policy section and begin with these practical suggestions:
- Start now. Your students and staff are likely using AI already and need guidance. Read this entire section of the AI Toolkit and then create initial policies.
- Identify key areas of need and critical questions to guide your process.
- Establish what needs a policy and what doesn't. Focus on the largest areas of impact.
- Learn from peers and familiarize yourself with resources like the TeachAI toolkit, developed with support from Microsoft.
- Plan to iterate as you go.
Organizational policy considerations
The policy process includes critical stages like creation, revision, and communication. The AI Toolkit provides several resources that help you through this cycle and includes multiple policy examples like the following to get started.
- Academic integrity
- Data protection and privacy
- Staff and faculty use
- Classroom syllabi
- Accessibility and Universal Design for Learning (UDL)
Each of these examples includes guiding questions and sample policies from schools and higher education institutions that can serve as a model.
Use Copilot to update a policy
Generative AI tools like Copilot can serve as helpful partners when updating existing policies. The AI Toolkit guides you through one way your team could use Copilot to update an academic integrity policy in just a few steps. Each step includes a sample policy followed by a brief analysis of its effectiveness.
Method | Policy | Analysis |
---|---|---|
Initial policy written by humans | Presenting another person's work as your own is an act of dishonesty. This behavior undermines your integrity and contradicts the principles upheld by [our institution]. We maintain the belief that academic success is contingent upon the dedication you invest in your studies. | This policy addresses human-authored texts. Given the many ways that students can use generative AI tools, clear guidance on responsible AI use is essential to maintain academic integrity and prevent plagiarism. |
Revised by humans | Presenting another person's work or content created by a generative AI tool as your own is an act of dishonesty. This behavior undermines your integrity and contradicts the principles upheld by [our school]. We maintain the belief that academic success is contingent upon the dedication you invest in your studies. We expect you'll approach your assignments honestly, as your work reflects your capabilities. | This policy covers generative AI, extending the definition of dishonesty beyond copying another person's work, but it doesn't identify appropriate uses for AI. We recommend setting guidelines for additional uses of generative AI, like revising work, seeking formative feedback, and using AI as a brainstorming partner. |
Your leadership teams can create prompts to assess existing policies for improvement and explore various wording options. For instance, Copilot can analyze a revised policy, review it for potential biases, and generate a simplified version in plain language accessible to all student and community groups. The following steps show how an education leader might use Copilot to generate a policy.
Navigate to Microsoft Copilot in your browser. Enter the included prompt and then attach a copy of your current student privacy policy.
As CTO of a school district, analyze your student privacy policy, focusing on AI app data use, parental consent, and data sharing. Suggest actionable improvements to enhance privacy. Review the policy for clarity and ensure it can be understood by all members of the school community. Include best practices from other policies and a revised draft. Refer to suggestions from the US Department of Education's Policies for Users of Student Data Checklist.
Copilot creates a student privacy policy based on your prompt.
Student Privacy Policy
Your privacy is important to us. We developed this policy to help you understand how we collect, use, communicate, and disclose personal information. This policy specifically addresses the use of AI app data, parental consent, and data sharing.
- Purpose of data collection: Before or at the time of collecting personal information, we'll clearly identify the purposes for which information is being collected. This includes the use of AI applications in our educational programs.
- Data use: We'll collect and use personal information solely with the objective of fulfilling those purposes specified by us and for other compatible purposes, unless we obtain the consent of the individual concerned or as required by law. This includes data generated by AI applications.
- Parental consent: For students under the age of 18, we'll obtain parental consent before collecting personal data. Parents have the right to review and request changes to their child's data.
- Data sharing: We won't share personal data with third parties without explicit consent from the student or parent, except as required by law. This includes data generated by AI applications.
- Data retention: We'll only retain personal information as long as necessary for the fulfillment of those purposes.
- Data accuracy: Personal data should be relevant to the purposes for which it's used, and, to the extent necessary for those purposes, should be accurate, complete, and up-to-date.
- Data security: We'll protect personal information by reasonable security safeguards against loss or theft, as well as unauthorized access, disclosure, copying, use, or modification.
- Transparency: We'll make readily available to customers information about our policies and practices relating to the management of personal information.
We're committed to conducting our business in accordance with these principles to ensure the confidentiality of personal information is protected and maintained.
Take a moment to try these steps with the policy of your choice at copilot.microsoft.com.
Increase cybersecurity and data governance
Education and technology leaders prioritize data protection and cyberattack prevention to ensure learning environments are safe, secure, and effective. Bad actors and cybercriminals target data-rich organizations like schools, universities, and ministries of education, as shown by increased attacks and new social engineering threats.
Enhance cybersecurity
Schools and universities are heeding governments' calls for increased cybersecurity protection and closer examination of security and privacy in AI systems. Governments are adopting policies for safely using AI systems in school districts with help from companies like Microsoft and other guiding organizations. Microsoft is also working closely with higher education institutions like the University of Michigan to deploy copilots that are as secure as existing infrastructure.
The Plan section also helps you safely and securely implement generative AI tools. You'll learn how Microsoft's AI systems and A3/A5 Microsoft 365 Education plans provide security tools that give you control and protection in managing AI in your school's infrastructure. Here are some considerations from the AI Toolkit.
- Understand the importance of a responsible AI framework
- Identify outcomes and data sources for AI systems
- Establish data governance, roles, and responsibilities
- Determine data privacy procedures and safeguards
- Develop an incident response plan to address issues that arise
Each consideration contains an overview, guiding questions, resources to follow, and suggested steps that your team can take to ensure your infrastructure and IT team can support a successful AI program.
Data governance
Creating a strong security posture includes a well-defined data governance framework. Data governance and security are fundamentally intertwined, each reinforcing the other to safeguard the confidentiality, availability, and integrity of data. By combining effective data governance with robust security measures, your organization can defend against a wide range of cyber threats, ensuring that your data is both well-managed and highly secure. Here's a sample of cybersecurity needs to consider:
- Cloud data consolidation
- Data governance and privacy needs
- Data governance in AI
- Data privacy considerations in AI-driven education
- Student, educator, and faculty data privacy
- Compliance
As you explore the data governance tenets, consider who your stakeholders are, identify the decision makers, and delegate responsibilities to specific technology team members to help ensure that your school data is safe and follows established policies. Share these practical tips with your leadership team.
- Ensure the data you collect and use is the minimum needed for the task. The less data collected, the lower the risk of a harmful data breach.
- Where possible, anonymize student data to protect student identities. This could involve removing personally identifiable information (PII) or replacing it with pseudonyms.
- Have a plan in place for responding to data breaches. This should include steps for identifying and containing the breach, notifying affected individuals, and preventing future breaches.
- Consider how your team could use Copilot to help establish and maintain your data governance. Use the following sample prompt as a starting point.
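To make the data minimization and anonymization tips above concrete, here's a minimal sketch of pseudonymizing student records in Python. The record fields, the salted-hash pseudonym scheme, and the retained columns are all illustrative assumptions, not a prescribed standard; your data team should align any real implementation with your own privacy policies and applicable regulations.

```python
import hashlib
import secrets

# Hypothetical student records; field names are illustrative assumptions.
students = [
    {"name": "Avery Johnson", "email": "avery.j@school.example",
     "grade": 7, "reading_score": 88},
    {"name": "Sam Rivera", "email": "sam.r@school.example",
     "grade": 7, "reading_score": 92},
]

# A random salt, kept secret by the data team, prevents re-identification
# by hashing guessed names or emails.
SALT = secrets.token_hex(16)

def pseudonymize(record, salt=SALT):
    """Replace direct identifiers with a stable pseudonym and drop PII.

    Data minimization: keep only the fields needed for the analysis task.
    """
    pid = hashlib.sha256((salt + record["email"]).encode()).hexdigest()[:12]
    return {
        "student_id": pid,           # stable pseudonym, not reversible without the salt
        "grade": record["grade"],
        "reading_score": record["reading_score"],
    }

anonymized = [pseudonymize(s) for s in students]
```

Because the same salt always maps a student to the same pseudonym, longitudinal analysis still works, while names and emails never leave the secure system.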