
Data Privacy and Cybersecurity

In an era of rapidly evolving AI technologies, ensuring robust data privacy and cybersecurity is paramount for protecting students, educators, and school systems. AI tools often rely on extensive data collection to function effectively, but this creates potential vulnerabilities if data is not managed securely and ethically. Schools must be vigilant in safeguarding sensitive information and adapting existing policies to meet the unique challenges posed by AI technologies.

This section provides a high-level overview of key considerations, including understanding critical data definitions (e.g., anonymized versus de-identified data), addressing emerging threats like deep fakes, and practical strategies for updating privacy policies to account for AI. For a more in-depth exploration of these topics and actionable resources, we encourage readers to review the materials provided by our subgroup in the Appendix.

Access 4 Learning Community

Several districts in Florida are part of the A4L National Student Data Privacy Consortium and use a standard Data Privacy Agreement. This could be a great resource for your district as well! Contact the Task Force to learn more.

Key Terminology

  • Data Privacy: The practice of ensuring that personal information is collected, stored, and shared in a manner that protects individuals’ identities and prevents misuse. Example: Ensuring student data is only accessible to authorized personnel and not used for unauthorized purposes.
  • Data Security: Measures and protocols designed to protect digital data from unauthorized access, corruption, or theft. Example: Using encryption and secure networks to safeguard student records from cyberattacks.
  • Personally Identifiable Information (PII): Information that can directly or indirectly identify a specific individual, such as names, addresses, Social Security numbers, or student ID numbers. Example: Student attendance records containing names and school IDs.
  • Anonymized Data: Data that has been stripped of all PII, making it impossible to trace back to an individual. Example: Aggregating student test scores without linking them to names or IDs.
  • De-Identified Data: Data from which PII has been removed or masked, but which could still be linked back to an individual with additional information. Example: Student records with names replaced by codes, where a separately held key could re-link records to individuals. De-identified data is less secure than anonymized data because re-identification is possible under certain conditions.
  • Data Governance: The framework of policies, procedures, and standards that ensure effective and ethical management of data. Example: District-wide rules on who can access student performance data and for what purpose.
  • Data Modernization: Upgrading legacy systems and processes to handle, store, and analyze data more efficiently and securely using modern technologies. Example: Transitioning from paper-based student records to a secure cloud-based platform.
  • Algorithmic Bias: Systematic errors in AI outputs that result from biases in the data or design of the algorithm. Example: AI recommending fewer advanced math opportunities for certain student demographics due to historical inequities in data.
  • Data Breach: Unauthorized access or exposure of sensitive information. Example: A cyberattack leading to the leak of students’ PII from a school database.
  • Data Minimization: The principle of collecting only the data necessary for a specific purpose and nothing more. Example: Collecting only the attendance data needed for funding reports, without storing additional unrelated information.
  • Consent Management: Processes that ensure individuals or their guardians have given informed consent for data collection and usage. Example: Gaining parent approval before using a new AI-based classroom app that collects student data.
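The distinction between anonymized and de-identified data can be made concrete with a small sketch. The records, field names, and coding scheme below are hypothetical, chosen only to illustrate the difference: de-identified data keeps a record-level link that a key could re-establish, while anonymized data keeps only an aggregate that cannot be traced back to any student.

```python
# Hypothetical student records for illustration only.
students = [
    {"name": "Ava Smith",  "student_id": "FL-1001", "score": 88},
    {"name": "Liam Jones", "student_id": "FL-1002", "score": 74},
    {"name": "Mia Davis",  "student_id": "FL-1003", "score": 91},
]

# De-identified: names and IDs are replaced with codes, but a key table
# still exists, so records could be re-linked to individuals.
key_table = {}       # retaining this key is what keeps the data re-identifiable
deidentified = []
for i, record in enumerate(students, start=1):
    code = f"S{i:03d}"
    key_table[code] = record["student_id"]
    deidentified.append({"code": code, "score": record["score"]})

# Anonymized: only an aggregate survives; nothing traces back to a student.
anonymized = {"average_score": sum(r["score"] for r in students) / len(students)}

print(deidentified)
print(anonymized)
```

The design point is that the key table itself is the risk: de-identification only masks PII while that linkage exists, whereas true anonymization destroys the linkage entirely, which is why aggregated reporting is the safer default when individual-level data is not required.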

Deep Fakes

Deep fakes—synthetic media generated using AI to manipulate images, videos, or audio—have emerged as a significant concern. Deep fakes can be used to create content that appears authentic but is entirely fabricated, posing risks for misinformation, cyberbullying, and even security threats. For Florida’s schools, addressing deep fakes is essential to fostering a safe, informed, and ethical learning environment.

Why Deep Fakes Matter

  • Impact on Students and Staff: Deep fakes can be weaponized to target students or staff, potentially leading to cyberbullying, reputational harm, or psychological distress.
  • Spread of Misinformation: Schools play a vital role in promoting media literacy. Without proper education on deep fakes, students may struggle to discern credible sources from manipulated content, impacting their ability to make informed decisions.
  • Security Risks: Deep fakes could be used to impersonate school administrators or staff in phishing scams, jeopardizing data security and trust.

Approaches for Mitigation

  • Promote Media Literacy: Update lessons and standards to cover recognizing and evaluating deep fakes; teach students to analyze inconsistencies in audio or video and discuss the role of algorithms in creating and detecting manipulated media.
  • Strengthen Digital Citizenship Programs: Enhance Florida’s digital citizenship curriculum with modules on the ethical use of AI and the dangers of manipulated media.
  • Leverage AI Detection Tools: Explore partnerships with technology companies to access AI tools that detect deep fakes, and train IT staff to use these tools and recognize red flags.
  • Align Policies: Adapt existing Acceptable Use Policies (AUPs) to explicitly prohibit the creation, sharing, or distribution of deep fakes.
  • Plan for Crisis Response: Include deep fakes in cybersecurity incident response plans so schools can respond swiftly and effectively when such content surfaces.

1. Raise Awareness: Host workshops or assemblies for students, teachers, and parents to understand the risks of deep fakes and learn how to identify them.

2. Develop Detection Skills: Encourage hands-on activities where students compare real and manipulated media, promoting critical thinking.

3. Establish Clear Policies: Define consequences for creating or sharing deep fakes within the school community to deter misuse.

4. Partner with Experts: Work with universities, media organizations, and technology companies to access cutting-edge research and tools for mitigating deep fake risks.

Relevant Policies for K-12 Florida Schools

FERPA

Visit FERPA Website

Family Educational Rights and Privacy Act: Protects the privacy of student education records and gives parents certain rights regarding those records.

  • AI systems must protect the privacy of student education records and comply with parental consent requirements. Data must remain within the direct control of the educational institution.
COPPA

Visit COPPA Website

Children’s Online Privacy Protection Act: Imposes requirements on websites and online services directed to children under 13 years of age, or that collect personal information from a child under 13.

  • AI chatbots, personalized learning platforms, and other technologies collecting personal information and user data on children under 13 must require parental consent.
IDEA

Visit IDEA Website

Individuals with Disabilities Education Act: Ensures students with disabilities are provided with free appropriate education that is tailored to their individual needs.

  • The use of AI as an instructional tool must be reasonable, appropriate, and individualized based on unique needs for students with disabilities.
  • AI may provide options for expanding learning experiences through unique accommodations or as a supplement to assistive technology.
  • Remember: The “I” in AI is not the same as the “I” in IDEA or IEP
CIPA

Visit CIPA Website

Children’s Internet Protection Act: Requires schools and libraries that receive federal funds for Internet access or internal connections to adopt and enforce policies to protect minors from harmful content online.

  • Schools must ensure AI content filters align with CIPA protections against harmful content.
Section 504

Visit Section 504 Website

A federal law designed to protect the rights of individuals with disabilities in programs that receive federal financial assistance from the US Department of Education.

  • This section of the Rehabilitation Act applies to both physical and digital environments.
  • Schools must ensure that their digital content and technologies, like AI, are accessible to students with disabilities.