Legal Center
Safety & Reporting

Child Safety Policy

This policy describes how MakerHub supports child safety in supervised school use, including reporting expectations, platform safeguards, escalation paths, and response practices.

Effective date: March 10, 2026
Last updated: March 10, 2026

Contract Priority

This public page complements your signed agreement, collection notices, and local legal review. If there is a conflict, the signed agreement controls.

Audience

Written for deployments by schools, boards, districts, and education authorities in a school-managed environment.

01

Safety Commitment

MakerHub is designed to support school-managed environments that prioritize child safety, dignity, privacy, and adult supervision.

The service is intended for supervised educational use rather than open public social networking.

02

Safety by Design Controls

MakerHub applies product and policy controls intended to reduce risk and support early intervention.

  • Role-based access controls for students, teachers, administrators, and support roles.
  • School-governed class, group, and assignment scoping.
  • Restricted access to records and uploads based on institution-managed permissions.
  • Support for safety training, completion tracking, audit logging, and classroom supervision workflows.

03

Prohibited Harm, Exploitation, and Unsafe Content

MakerHub prohibits any content or behavior involving child exploitation, grooming, coercion, harassment, abuse, threats, or hate-based targeting.

Any suspected child sexual abuse material, exploitation attempt, credible threat, or severe safety concern is treated as a critical incident.

04

Reporting and Escalation

Users must report child safety concerns immediately through school or school authority reporting channels and to MakerHub when platform action is needed.

For immediate danger, contact emergency services and follow local school emergency procedures first.

  • Students should report concerns to a teacher, principal, counselor, or other trusted staff member right away.
  • Staff should follow their institution's mandatory reporting, safeguarding, and escalation procedures.
  • Platform reports may be sent to support@makerhub.app with relevant account identifiers, class context, links, and timestamps.
  • Do not continue engagement with a suspicious account or content while awaiting review.

05

Incident Response

MakerHub may take immediate protective action, including content restriction, account lockout, session invalidation, evidence preservation, and access review, when a child safety risk is suspected.

Where legally required or institutionally directed, MakerHub may cooperate with schools, regulators, child protection authorities, and law enforcement in incident investigation and evidence preservation.

06

School and Family Responsibilities

Institutions are responsible for supervision standards, account governance, family notices, and mandatory reporting obligations in their jurisdiction.

Schools should maintain internal escalation playbooks for digital and physical safety incidents linked to class activity, media sharing, and project work.

Families and guardians should continue to use their school's usual reporting channels for safety concerns involving a student account or assignment.

07

Review and Updates

MakerHub reviews child safety controls and response procedures based on platform risk, legal developments, and school feedback.

Schools are encouraged to provide recurring digital safety, safeguarding, and reporting training for both staff and students.

This policy may be updated to address legal requirements, safety standards, or platform changes. Updated versions include a revised effective date.