This story on data privacy in special education originally appeared on CoSN’s blog and is reposted here with permission.
Key points:
- Ensuring data protection helps prevent discrimination and stigmatization
- How to maintain secure access and data privacy
- It’s not business, it’s personal: Building a culture of trust to protect data
Adam Garry is the former Senior Director of Education Strategy for Dell Technologies and current President of StrategicEDU Consulting. As a professional development strategist, he has supported districts in implementing Generative Artificial Intelligence in their schools. CoSN approached him to discuss the importance of data privacy and the different approaches to creating IEPs with GenAI while ensuring student data privacy.
Protecting the data of students with disabilities is crucial for several reasons. First, all students have a right to privacy, and their personal and sensitive information must be kept confidential to protect them from unwanted exposure of their Personally Identifiable Information (PII) and its potential misuse. Protecting this information helps prevent discrimination and stigmatization, and in more serious cases, identity theft. To ensure data privacy, legal frameworks such as FERPA and IDEA require schools to limit access to students' PII. When it comes to the use of Generative AI tools, educators must be aware of the data privacy risks that their implementation entails.
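In practice, limiting access to PII under FERPA's "school officials with legitimate educational interest" standard is often operationalized as role-based, need-to-know access control. The sketch below is purely illustrative: the roles, the `can_view_iep` helper, and the caseload rule are assumptions for this example, not any specific district policy or a compliance tool.

```python
# Minimal role-based access sketch for IEP records.
# Roles and rules here are illustrative assumptions, not a FERPA compliance tool.

# Under this hypothetical policy, only staff directly serving a student
# have a legitimate educational interest in that student's full IEP.
IEP_VIEW_ROLES = {"case_manager", "special_ed_teacher", "school_psychologist"}

def can_view_iep(role: str, staff_caseload: set[str], student_id: str) -> bool:
    """Return True only if the role is authorized AND the student is on the
    staff member's own caseload (need-to-know, not building-wide access)."""
    return role in IEP_VIEW_ROLES and student_id in staff_caseload

# A case manager may view students on their own caseload only.
print(can_view_iep("case_manager", {"S-101", "S-102"}, "S-101"))   # True
print(can_view_iep("case_manager", {"S-101", "S-102"}, "S-999"))   # False
print(can_view_iep("cafeteria_staff", {"S-101"}, "S-101"))         # False
```

The point of the design is that role alone is not enough: access is scoped to the specific students a staff member serves, which mirrors how districts typically interpret "legitimate educational interest."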
Special education professionals have started to notice the potential of Generative AI to create Individualized Education Programs (IEPs): by analyzing vast amounts of data, it could recommend personalized learning experiences and tailor educational paths to each student's unique needs. However, there is a critical concern: IEPs require detailed information about students' disabilities, learning needs, medical history, and academic performance. Because many AI tools and platforms used in education are developed by third-party vendors, sharing student data through these tools requires trusting that vendors will handle the data responsibly and securely. Any lapse in their data protection practices can result in unauthorized access or exposure.
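One common safeguard when IEP-related prompts must pass through a third-party tool is to strip direct identifiers before any text leaves the district. The following is a minimal sketch under stated assumptions: the ID and date patterns and the placeholder tokens are invented for illustration, and real de-identification requires a vetted process, not a pair of regular expressions.

```python
import re

# Illustrative redaction of direct identifiers before a prompt is sent to a
# third-party GenAI tool. The patterns below are assumptions for this sketch;
# real de-identification workflows need far more than regexes.

STUDENT_ID = re.compile(r"\bS-\d{3,}\b")           # hypothetical district ID format
DATE_OF_BIRTH = re.compile(r"\b\d{2}/\d{2}/\d{4}\b")

def redact_prompt(text: str, student_names: list[str]) -> str:
    """Replace known names, student IDs, and birth dates with placeholders."""
    for name in student_names:
        text = text.replace(name, "[STUDENT]")
    text = STUDENT_ID.sub("[ID]", text)
    return DATE_OF_BIRTH.sub("[DOB]", text)

prompt = "Draft IEP goals for Jamie Rivera (S-10442, DOB 03/14/2012)."
print(redact_prompt(prompt, ["Jamie Rivera"]))
# Draft IEP goals for [STUDENT] ([ID], DOB [DOB]).
```

Even with redaction in place, indirect identifiers (a rare disability plus a small school) can still make a student identifiable, which is why vendor agreements and the model choices discussed below still matter.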
Adam suggests a three-level solution for the safe implementation of Generative AI in school districts. The levels are organized by how much personalization of the tool is possible, and for each level he notes that it is necessary to weigh its risks and rewards.
General level: Utilizing a Large Language Model (LLM) like Google’s Gemini or Microsoft’s Copilot
Google and Microsoft have created GenAI tools specifically for educators. At this general level, these tools can be valuable for creating personalized content for students.
- Reward: Microsoft and Google ensure their tools comply with student data protection regulations. These tools protect user and organizational data, and chat prompts and responses are not saved. Additionally, these companies ensure that students’ information is not retained or used to train the AI models (Microsoft Education Team, 2024; Google for Education, n.d.).
- Risk: The security risk is very low, yet it exists. There may also be some loss of functionality compared with other tools: because prompts and responses are not saved by the model, it cannot build on or “learn” from previous exchanges.
Small Language Models
Educators could utilize technology from Microsoft or Google to build a Small Language Model. Small Language Models are simpler, more resource-efficient text processors that handle basic language tasks and can be easily deployed on everyday devices like smartphones. Districts can strip out the LLM functions they do not need and focus the tool on specific tasks, such as creating IEPs.
- Reward: An SLM maintains the privacy protections established by Google or Microsoft while personalizing the tool for a specific need. By targeting a specific task, it is also easier to set specific guardrails and train teachers.
- Risk: In addition to the risks mentioned for LLMs, an SLM may have a more limited knowledge base than a full LLM.
The Open-Source Model
A district could create its own GenAI application using an open-source model: a type of artificial intelligence (AI) whose underlying code and data are made publicly available for anyone to use, modify, and distribute.
- Reward: The models are highly customizable, allowing districts to tailor them to their specific needs and integrate them with existing systems. This allows them to maintain control over their data, ensuring it is used in compliance with privacy regulations and local policies.
- Risk: Setting up and maintaining an open-source model requires significant technical expertise and substantial computational resources, which may necessitate additional investments in infrastructure and staff training. There are security risks involved in handling sensitive student data, and ensuring robust protection is essential. Unlike proprietary software, open-source projects may lack formal customer support, and ensuring legal and regulatory compliance can be complex and challenging.
Whatever option is selected, Adam highlights the importance of integrating GenAI implementation with the framework the district already has in place to protect data privacy and carry out specific tasks (such as the creation of IEPs), while detailing the tools, guidelines, and resources that implementation requires.
Integrating Generative AI tools in school districts offers significant benefits, particularly in creating personalized learning experiences and Individualized Education Programs (IEPs). However, it’s crucial to balance these innovations with strong data privacy measures. By choosing the right AI model—whether a general Large Language Model, a tailored Small Language Model, or a customizable open-source model—districts can enhance education while protecting sensitive student information. With careful planning, school districts can use AI to support diverse student needs in a secure, inclusive environment.
References:
Microsoft Education Team. (2024, January 23). Meet your AI assistant for education: Microsoft Copilot. https://www.microsoft.com/en-us/education/blog/2024/01/meet-your-ai-assistant-for-education-microsoft-copilot/
Google for Education. (n.d.). Guardian’s Guide to AI. https://services.google.com/fh/files/misc/guardians_guide_to_ai_in_education.pdf