As school districts adjust to Google’s updated terms for Additional Services, which significantly impact YouTube, Maps, Translate, and Earth, some educators may wonder aloud, ‘Why not just obtain parental permission and continue using it?’ I could curse Google, but I see an opportunity in what they have presented to school districts nationwide. On the surface, gathering consent for Google’s Additional Services (e.g., YouTube) to market to and collect data on students under 18 may seem like a reasonable workaround. Still, in practice, it is anything but ethical or straightforward. School leaders should focus on meeting strict privacy standards instead of asking parents to forgo safeguards via a parental consent form.

Point to Consider: Parental Consent Does Not Equal Ethical Compliance

There is a common misconception that schools can bypass a vendor’s refusal to sign a Data Privacy Agreement (DPA) by asking parents to opt in or out. While tempting, this approach shifts the risk away from the district and the vendor and onto families. It sends a troubling message: we, the district, are willing to use software (or websites) that do not fully comply with student privacy laws, as long as families assume the liability. Read that again…

First steps can be simple. Educators and districts can commit to adopting only approved software that meets stringent data privacy standards. One way to approach this is for district leadership to join the Student Data Privacy Consortium (SDPC) or a local equivalent, which simplifies and strengthens how school districts manage DPAs. SDPC membership provides access to over 1,700 pre-negotiated agreements for educational software and offers legal support to secure new ones if needed. The SDPC helps districts comply with privacy laws, such as the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA), while reducing administrative burdens.

In the past, without the resources or understanding to evaluate student data privacy, a school district had little choice but to deny requests for software that did not meet privacy standards or lacked signed DPAs. The transition to the SDPC is especially timely: Google’s Additional Services consent requirements have demonstrated the challenges of balancing instructional needs with legal requirements and parental consent. Through the SDPC, school districts can make more consistent and informed decisions about which software to adopt, helping them protect student data while promoting equity and innovation in the classroom.

Point to Consider: It Undermines Community Trust

When schools ask parents and guardians to waive privacy protections on a consent form, it can erode confidence in the district’s data stewardship. This concern has been echoed at recent local and national conferences on student data privacy and artificial intelligence. Families expect that the academic software used in their child’s classroom meets rigorous safety and privacy standards, not that they will need to make complex legal decisions for each new app.

This expectation becomes even more pressing given the scale and fluidity of edtech use in schools. According to Lightspeed Systems’ 2022 Edtech App Report, individual students accessed roughly 72 apps during the school year, with sixth graders using the most at 82. And although most school districts had more than 2,000 apps in use in 2021-22, just 300 accounted for 99% of actual usage. The top 20 apps in the surveyed schools include familiar platforms such as Google Workspace, YouTube, Clever, and Kahoot (Zalaznick, 2022).

Adding to the complexity, the digital tools students rely on are far from static: approximately 91% of edtech apps changed their privacy policies over the past year (Zalaznick, 2022). This constant evolution makes it nearly impossible for parents (and teachers) to keep up with potential data risks, and it reinforces the need for districts, not families, to take the lead in ensuring privacy and compliance across the board. Just imagine the never-ending wave of emails to and from parents for yearly terms-of-service updates and consent gathering. As both a parent and a central office administrator, I’m exhausted by the thought alone.

Point to Consider: It Creates Equity Gaps

Not all families are equally comfortable putting their child’s data at risk…and I say that as someone who also wears the hat of Director of Technology & Digital Learning. When a software vendor has not signed a DPA and the district requires specific parental consent instead, we are asking families to waive key protections. Some parents understandably decline, and when that happens, their children can be excluded from essential instructional tools or learning opportunities. This also causes agita for the educators scrambling to offer alternative methods…and, no, data privacy workarounds do not represent the UDL core principle of multiple means of engagement. No student should be penalized because their family is privacy-conscious.

Simply put…when access to learning tools depends on consent-based privacy exceptions, we risk creating new equity gaps in the classroom.

Point to Consider: The Logistics Do Not Work…Especially for the Team Gathering Consent!

Even if most families opt in, the share that does not, 15% or more in the case of Google’s Additional Services, creates a huge operational burden for the technology team. Teachers would need to track who has access and create separate pathways for those who do not, and the technology team must monitor user permissions across multiple systems, adding risk and complexity.
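
To make that tracking burden concrete, here is a minimal sketch of the kind of reconciliation script a technology team ends up maintaining. It assumes a hypothetical consent_roster.csv exported from the consent form (with student_id and consent columns) and a hypothetical student_accounts.csv from the student information system (with student_id and email columns); it illustrates the recurring chore, not any particular district’s actual tooling.

```python
# Minimal sketch (hypothetical file and column names): reconcile parental
# consent records with student accounts to see whose Additional Services
# access must be restricted.
import csv


def load_consented_ids(consent_csv="consent_roster.csv"):
    """Return the set of student IDs whose parents opted in."""
    consented = set()
    with open(consent_csv, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("consent", "").strip().lower() == "yes":
                consented.add(row["student_id"].strip())
    return consented


def split_accounts(accounts_csv="student_accounts.csv",
                   consent_csv="consent_roster.csv"):
    """Split student accounts into opted-in and restricted email lists."""
    consented = load_consented_ids(consent_csv)
    opted_in, restricted = [], []
    with open(accounts_csv, newline="") as f:
        for row in csv.DictReader(f):
            bucket = opted_in if row["student_id"].strip() in consented else restricted
            bucket.append(row["email"].strip())
    return opted_in, restricted


if __name__ == "__main__":
    opted_in, restricted = split_accounts()
    print(f"{len(opted_in)} accounts may keep Additional Services enabled")
    print(f"{len(restricted)} accounts need Additional Services turned off")
    # The restricted list still has to be moved into a separate organizational
    # unit or group in the directory, and this whole check has to be repeated
    # whenever a consent form arrives or enrollment changes.
```

Even this toy version has to be re-run after every new enrollment, withdrawal, or late consent form, and its output still has to be applied to the actual systems by hand or through yet another integration.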

Despite multiple, thorough communications to families, including emails, phone calls, website updates, live feed posts, and newsletter mentions, you may still have a substantial number of students, in our case nearly 20%, whose parents have not consented to Google’s Additional Services. Understandably, parents might glaze over at data privacy legalese, which is why we shouldn’t burden them with it in the first place. This opt-in process must be repeated annually (or more frequently if terms of service change), creating a recurring challenge that consumes administrative time and directly impacts instructional equity. Given these barriers and the persistent consent gap, we are considering turning off Google’s Additional Services district-wide and relying on teacher-curated videos, embedded in approved software, to ensure a more equitable and manageable instructional experience. If a tool lacks a signed DPA or an SDPC-vetted agreement, as is the case here with Google’s Additional Services, it should be eliminated or deployed only in a way that keeps student data privacy at the forefront.

Point to Consider: Better Options Exist

Tools like YouTube can still be embedded through Google Classroom, even if student accounts are blocked from accessing YouTube.com directly. Teachers can use these Additional Services as instructional tools if they so choose, and privacy-compliant alternatives to Google Maps, Earth, and Translate can serve similar instructional purposes. For software and apps outside of Google’s embattled Additional Services, strict adherence to data privacy agreements should be the foundation for either adopting or declining the software. This is another reason to have technology at the table when making curriculum decisions.

Final Thoughts

Student data privacy should never be optional. All students deserve software and edtech tools that support learning and protect their personal information. While turning off popular services may feel disruptive at first, it is a necessary step toward building a safer and more equitable digital learning environment.

References

Lightspeed Systems. (2022). Edtech App Report. https://www.lightspeedsystems.com/ebook/edtech-app-report

Zalaznick, M. (2022). The top 20 apps schools are using most contains a few surprises. District Administration. https://districtadministration.com/briefing/the-top-20-apps-schools-are-using-most-contains-a-few-surprises