1. Introduction, Scope, and Architectural Philosophy
Monarkh LLC (hereinafter referred to as "the Company," "we," or "our") has engineered the Monarkh Suite™ (hereinafter referred to as "the Service") to function as an advanced, privacy-first Communication Enhancement Layer, deliberately distinguishing itself from traditional Student Information Systems (SIS). The Service is designed to bridge the gap between institutional educational systems and familial communication channels without unnecessarily centralizing sensitive student data. Recognizing the inherent vulnerabilities in legacy platforms that aggregate extensive dossiers of minors' data, the Company has architected the Service upon a foundational "Zero-Knowledge" cryptographic framework.
This Privacy Policy constitutes a legally binding disclosure detailing the exact mechanisms by which data is collected, processed, mathematically segregated, encrypted, and retained across both enterprise and consumer deployments. This document has been drafted to align with the stringent regulatory requirements of the Family Educational Rights and Privacy Act (FERPA), the Children's Online Privacy Protection Act (COPPA), the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), the United Arab Emirates Federal Decree-Law No. 45 of 2021 on Personal Data Protection, and the UAE Child Digital Safety Law No. 26 of 2025. This policy applies universally to all participating educational institutions, school districts, educators, parents, legal guardians, and students interacting with the Service across all supported global jurisdictions.
2. Lexicon of Defined Terms
To ensure absolute clarity regarding the technical and legal parameters of the Company's data processing activities, the following terminology is utilized throughout this Privacy Policy and the subsequent Terms of Service.
3. Data Collection Mechanisms and the Zero-Knowledge Paradigm
The Company's data collection practices are strictly bifurcated between data stored locally on the User's end-device and data transmitted to the Company's cloud infrastructure. This bifurcation is the cornerstone of the Zero-Knowledge privacy model, ensuring that the Company never possesses the cryptographic material necessary to reconstruct a student's identity.
3.1. Local Device Storage and the On-Device Privacy Filter
Upon the initiation of the Service, a parent or legal guardian connects their external communication account (e.g., Google Gmail, Microsoft Outlook) via standard OAuth 2.0 protocols. The Service requests the minimum necessary scopes to access incoming messages. Crucially, the parsing of these messages occurs on the client side, directly on the User's mobile device or local machine.
The native application deploys an "On-Device Privacy Filter," an advanced local algorithm that scans incoming text to identify direct identifiers, including real student names, physical addresses, and unencrypted contact details. Once identified, these raw data points are instantly encrypted using AES-256 encryption. The resulting ciphertexts and their corresponding cryptographic decryption keys are deposited strictly within the local device's secure hardware enclave—specifically, the Apple Keychain for iOS environments and the Android Keystore system for Android environments. The Company fundamentally lacks the technical capacity to access, mirror, or extract the contents of these local keystores, thereby ensuring that the most sensitive PII remains mathematically inaccessible to the Company's remote servers and any potential threat actors targeting the cloud infrastructure.
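For illustration only, the redaction step of such a filter can be sketched in Python. The roster-based name matching, placeholder format, and phone-number pattern below are assumptions made for this sketch; the actual client would encrypt the placeholder map with AES-256 and deposit it in the Keychain/Keystore rather than return it in plaintext.

```python
import re
from typing import Dict, List, Tuple

# Naive phone-number pattern; the production filter would use far richer rules.
PHONE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def filter_message(text: str, roster: List[str]) -> Tuple[str, Dict[str, str]]:
    """Replace direct identifiers with opaque placeholders.

    Returns the sanitized text (safe to transmit) and the placeholder->plaintext
    map, which in the real application would be AES-256-encrypted and stored
    only in the device's secure enclave (Apple Keychain / Android Keystore).
    """
    vault: Dict[str, str] = {}

    def redact(value: str) -> str:
        token = f"[PII_{len(vault) + 1:03d}]"
        vault[token] = value
        return token

    for name in roster:  # known student names held locally on the device
        if name in text:
            text = text.replace(name, redact(name))
    text = PHONE.sub(lambda m: redact(m.group()), text)
    return text, vault
```

Because only the sanitized text leaves the device, a server-side compromise yields placeholders rather than identities.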
3.2. Cloud Backend Ingestion and Pseudonymization at Source
Following the localized privacy filtering, the native application transmits the residual, sanitized structural metadata and communication vectors to the Company's cloud servers. The cloud backend, which operates on enterprise-grade relational databases and vector databases, ingests only pseudonymized data.
The transmission utilizes Transport Layer Security (TLS 1.3) to protect the data in transit. The cloud servers receive mathematically generated, non-reversible identifiers (e.g., "USR_A7X_9KQ_3F2D") rather than readable names. This structural design fulfills the highest standards of "Pseudonymization at Source" as articulated in Article 4(5) of the GDPR and equivalent global statutes, ensuring that the resulting cloud-stored data cannot be reverse-engineered to reveal the identity of the underlying natural person without the localized keys held exclusively by the User.
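One way such a non-reversible identifier could be derived is a keyed HMAC over a local student reference, formatted to mirror the example above. The HMAC-SHA256 construction and the device-held secret are assumptions made for this sketch, not a disclosure of the Service's actual key-derivation scheme.

```python
import base64
import hashlib
import hmac

def pseudonymous_id(device_secret: bytes, student_ref: str) -> str:
    """Derive a stable, non-reversible cloud identifier in the format the
    policy illustrates (e.g., "USR_A7X_9KQ_3F2D").

    Because device_secret never leaves the client, a party holding only the
    output cannot invert the mapping back to the original student reference.
    """
    digest = hmac.new(device_secret, student_ref.encode("utf-8"), hashlib.sha256).digest()
    b32 = base64.b32encode(digest).decode("ascii")
    return f"USR_{b32[0:3]}_{b32[3:6]}_{b32[6:10]}"
```

The derivation is deterministic, so the same device always produces the same cloud identifier for the same profile, while different devices (different secrets) produce unlinkable identifiers.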
3.3. The Document Vault and Tier 2 Data Processing
In addition to automated communication ingestion, the Service features a "Document Vault," an encrypted repository allowing parents, legal guardians, and educators to manually upload Tier 2 educational materials. These materials may encompass individualized education programs (IEPs), pedagogical assessments, permission slips, and multimedia assignments.
Because the Company must host these files to facilitate collaborative sharing, Tier 2 Data is stored within secure cloud buckets (e.g., Amazon S3 or Google Cloud Storage) and is encrypted at rest using AES-256 server-side encryption. However, the visibility and distribution of Tier 2 Data are governed by strict, user-controlled toggle protocols. Users may designate files as "Private" (accessible only by the uploading account), "Family Shared" (accessible only by accounts cryptographically linked via family authorization tokens), or "School Wide" (accessible to verified educators within the institutional network). The Company asserts no ownership over this User-Generated Content and processes it strictly as a neutral hosting provider.
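The three visibility toggles can be sketched as a single default-deny access check. The dictionary field names below are assumptions for this illustration, not the Service's actual schema.

```python
def may_access(doc: dict, requester: dict) -> bool:
    """Evaluate the "Private" / "Family Shared" / "School Wide" toggles.

    Uploads are always visible to their owner; anything unrecognized is
    denied by default.
    """
    if requester.get("account_id") == doc.get("owner_id"):
        return True  # the uploading account always retains access
    visibility = doc.get("visibility")
    if visibility == "family_shared":
        # requires a family authorization token cryptographically linking the accounts
        return doc.get("family_token") in requester.get("family_tokens", set())
    if visibility == "school_wide":
        return requester.get("verified_educator", False)
    return False  # "private" and unknown values: default-deny
```

The default-deny fallthrough reflects the policy's stance that visibility is opt-in and user-controlled.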
4. Artificial Intelligence Processing and Large Language Model Integrations
The Service heavily integrates a proprietary Artificial Intelligence assistant, designated as "Elle," to extract temporal calendar events, summarize extensive pedagogical communications, and provide an interactive chatbot interface for organizational queries. The deployment of AI within an educational technology framework requires profound privacy safeguards, particularly concerning the handling of sensitive communications.
4.1. Asynchronous Ingestion and Vectorization
AI processing within the Service occurs during an asynchronous ingestion phase. To prevent the exposure of PII to AI models, the pseudonymized text data—already stripped of direct identifiers by the on-device privacy filter—is converted into numerical embedding vectors. These vectors are indexed within secure vector databases, allowing the "Elle" AI to perform semantic similarity searches and generate highly contextualized summaries regarding school schedules or curriculum updates without ever mapping the query back to a legally identifiable natural person on the server side.
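A semantic similarity search over such vectors reduces, at its core, to comparing embeddings by cosine similarity. The toy three-dimensional vectors and pseudonymized record labels below are fabricated for illustration; the actual Service would use learned, high-dimensional embeddings in a dedicated vector database.

```python
import math
from typing import List, Tuple

def cosine(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def nearest(query: List[float], index: List[Tuple[str, List[float]]]) -> str:
    """Return the pseudonymized record most similar to the query.

    Note that neither the index entries nor the query contain a readable
    identity—only pseudonymized references and numbers.
    """
    return max(index, key=lambda item: cosine(query, item[1]))[0]
```

This is why the server can answer "what's on the schedule this week?" without ever resolving who the question is about.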
4.2. Third-Party LLM APIs and Absolute No-Training Guarantees
To power the advanced natural language processing capabilities of the "Elle" assistant, the Company transmits pseudonymized, privacy-filtered prompts to enterprise-tier third-party Large Language Model (LLM) Application Programming Interfaces (APIs), strictly utilizing commercial providers such as OpenAI, Google (Gemini), and Anthropic.
The Company explicitly rejects the use of standard consumer-grade APIs. Instead, the Company maintains rigorous enterprise Data Processing Agreements (DPAs) with these third-party subprocessors. These agreements contractually guarantee that User data, prompts, inputs, and generated outputs are strictly prohibited from being utilized to train, fine-tune, or otherwise improve the third-party providers' foundation models. Furthermore, all transmitted data is subject to automated zero-retention policies on the third-party provider's infrastructure, ensuring that the pseudonymized prompts are instantly purged from volatile memory following the generation of the response.
4.3. Pedagogical Guardrails and Ethical AI Deployment
The Company recognizes the risk of AI platforms inadvertently facilitating academic dishonesty. Consequently, the student-facing iteration of the "Elle" AI assistant is hard-coded with a strict "Guide, Don't Answer" pedagogical protocol. Utilizing advanced system prompts and output filtering mechanisms, the AI is engineered to provide Socratic questioning, study frameworks, and conceptual explanations, but is explicitly restricted from generating completed essays, solving mathematical equations directly, or providing direct answers to academic assessments. This approach ensures that the Service functions as a supplemental tutoring aid rather than a mechanism for academic bypass.
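Very loosely, the protocol described above combines a system prompt with post-generation output filtering. The prompt wording and the deliberately naive regex check below are illustrative assumptions only; they do not describe the Service's actual guardrail stack.

```python
import re

# Illustrative system prompt; the production prompt is proprietary.
GUIDE_DONT_ANSWER_PROMPT = (
    "You are a study guide. Use Socratic questions, frameworks, and concept "
    "explanations. Never produce completed essays or final answers to graded work."
)

# Naive output filter: block responses that hand over a final answer outright.
_DIRECT_ANSWER = re.compile(r"\bthe (final )?answer is\b", re.IGNORECASE)

def passes_output_filter(response: str) -> bool:
    """Reject responses that state a direct answer. A real guardrail would
    combine classifiers and policy models, not a single regex."""
    return not _DIRECT_ANSWER.search(response)
```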
5. Mathematical Data Segregation: The "One ID Per Child" Rule
A critical vulnerability in legacy educational platforms is the risk of cross-contamination, wherein a parent inadvertently gains access to the records of another student, or siblings access each other's distinct educational communications. To entirely mitigate this risk, the Company enforces the "One ID Per Child" rule, powered by a proprietary "Multi-Layer Identity Verification System."
5.1. The Multi-Layer Identity Verification Architecture
The Service utilizes differential privacy principles and cryptographic compartmentalization to ensure that data access is mathematically restricted based on verified relationships. The architecture is structured upon three distinct cryptographic tokens:
- Source Tokens: Unique cryptographic hashes assigned to the incoming communication source, such as a specific school district's broadcast email address or a designated teacher's communication portal.
- Family Authorization Tokens: Cryptographic linkages securely mapping a verified parent, legal guardian, or authorized caregiver to a specific student profile. The family authorization token acts as the authorization key for family-level data sharing.
- Anonymized Profile References: The overarching, pseudonymized reference identifier (e.g., "USR_A7X_9KQ_3F2D") residing in the cloud backend, ensuring that the database only processes abstract relationships rather than named individuals.
By requiring the simultaneous validation of the source token, the family authorization token, and the anonymized profile reference prior to decrypting any payload, the Company mathematically guarantees that parents and students only possess the capacity to view data explicitly authorized for their specific profile. This strict segregation prevents any unauthorized horizontal data leakage between siblings or unrelated users within the same educational institution.
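The simultaneous-validation requirement is a logical conjunction: all three credentials must check out, and any single failure denies decryption. The flat set of provisioned grant triples below is a stand-in assumption for the real provisioning store, whose structure this sketch does not know.

```python
from typing import Set, Tuple

# (source_token, family_token, profile_ref) — hypothetical grant shape
Grant = Tuple[str, str, str]

def authorize_decrypt(source_token: str, family_token: str,
                      profile_ref: str, grants: Set[Grant]) -> bool:
    """Permit payload decryption only when the source token, family
    authorization token, and anonymized profile reference all validate
    together; a mismatch in any one layer denies access."""
    return (source_token, family_token, profile_ref) in grants
```

Because a sibling holds a different family authorization token, the sibling's credentials never form a provisioned triple for another child's profile, which is the horizontal-leakage property the section describes.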
6. Children's Privacy, COPPA, and FERPA Compliance
The Company considers the privacy of minors to be of paramount importance and subjects the Service to the most stringent international regulatory frameworks governing children's data. The compliance posture adapts dynamically based on whether the Service is deployed via enterprise institutional channels or individual consumer acquisition.
6.1. B2B Enterprise School Integration (FERPA Compliance)
When the Service is deployed via direct contract with an educational institution or school district (the B2B track), the Company operates exclusively in the capacity of a "School Official" possessing legitimate educational interests under the Family Educational Rights and Privacy Act (FERPA).
In this enterprise context, the educational institution maintains ultimate ownership, control, and governance over the educational records. The Company merely acts as a data processor executing instructions on behalf of the institution. The Company explicitly warrants that it shall not utilize any student data, whether raw or pseudonymized, for behavioral targeting, profiling, or advertising purposes under any circumstances. The institution remains responsible for obtaining any necessary parental consents required by local jurisdictions prior to provisioning student accounts.
6.2. B2C Parent Acquisition and "Parent-Managed Child Profiles" (COPPA Compliance)
When the Service is adopted directly by a parent or legal guardian independent of a school contract (the B2C track), the Company assumes the role of the data controller. In strict adherence to the Children's Online Privacy Protection Act (COPPA), the Company requires explicit, verifiable parental consent before establishing an interactive environment for a child under the age of thirteen (13).
To facilitate continuous COPPA compliance without excluding underage users from the Service's benefits, the Company utilizes a "Parent-Managed Child Profile" infrastructure. A parent-managed child profile is a privacy-enhanced, decoupled profile managed entirely by the verified parent. The child interacts with the Service purely through this parent-controlled proxy. The parent maintains absolute authority to review, modify, or delete the child's data inputs and AI interactions. Once the child reaches the legal age of digital consent (which varies by jurisdiction, typically 13 in the US and 16 in the EU), the parent-managed child profile may, upon parental authorization, seamlessly transition into an autonomous user account, preserving the historical organizational utility without violating statutory age restrictions.
7. Global Regulatory Compliance: UAE, GDPR, and CCPA
The Company operates a globally distributed infrastructure and is committed to upholding the specific statutory rights granted to users across diverse legislative frameworks.
7.1. United Arab Emirates Personal Data Protection Law (Law 45 of 2021)
For operations and users situated within the United Arab Emirates, the Company strictly adheres to the UAE Federal Decree-Law No. 45 of 2021 regarding the Protection of Personal Data (PDPL). The Company upholds the fundamental rights of UAE Data Subjects, including the right to request the correction of inaccurate data, the right to restrict processing, and the right to demand the cessation of processing, particularly concerning direct marketing or profiling.
In compliance with Article 5 of the PDPL, all data processing is conducted fairly, transparently, and lawfully, supported by the explicit consent of the Data Subject or established legitimate interests. Furthermore, recognizing the requirement for robust technical security under Article 20 of the PDPL, the Company's Zero-Knowledge architecture and localized AES-256 encryption satisfy the highest standards for data anonymization and security, ensuring protection against unauthorized breaches. Should a verified data subject request the cessation of processing, the Company's architecture ensures the immediate deletion of the pseudonymized cloud data, effectively rendering any localized residual data mathematically isolated and inert.
7.2. UAE Child Digital Safety Law (Law 26 of 2025)
The Company is fully compliant with the UAE Federal Decree-Law No. 26 of 2025 regarding Child Digital Safety, which establishes a comprehensive framework to protect minors in the digital environment. Pursuant to this law:
- Commercial Prohibitions: The Company strictly prohibits the use of children's data for commercial purposes, targeted electronic advertising, or any form of behavioral profiling.
- Commercial Gaming Ban: The Service inherently blocks all access to online commercial gaming, betting, or wagering functionalities, ensuring the environment remains strictly pedagogical.
- Age Verification and Consent: The Service integrates robust age-verification mechanisms and mandatory, verifiable parental consent protocols for all accounts associated with minors.
- Parental Controls: Custodians and legal guardians are provided with comprehensive, user-friendly tools to actively monitor, supervise, and restrict the digital content accessible by their children.
- CSAM and Harmful Content: The Company implements proactive, privacy-preserving hash-matching to detect Child Sexual Abuse Material (CSAM) or other severely harmful content. Upon detection, the Company mandates immediate reporting to the competent UAE authorities and the execution of rapid takedown procedures.
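Hash-matching against a curated block-list can be sketched as follows. Production deployments typically rely on perceptual hashing (e.g., PhotoDNA), which survives re-encoding and resizing; the exact SHA-256 match below is a deliberate simplification for illustration.

```python
import hashlib
from typing import Set

def matches_known_hashes(file_bytes: bytes, known_hashes: Set[str]) -> bool:
    """Compare a file's SHA-256 digest against a curated set of known-bad
    hashes. Exact matching only flags byte-identical files; perceptual
    hashing, used in practice, also catches transformed copies."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes
```

The privacy-preserving property is that only digests, never file contents, need to be compared against the block-list.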
7.3. GDPR (European Union) and CCPA (California)
For users within the European Economic Area (EEA) and the United Kingdom, the Company complies with the General Data Protection Regulation (GDPR), relying on user consent and the performance of a contract as the primary legal bases for processing. The Zero-Knowledge architecture explicitly fulfills the GDPR Article 25 mandate for "Data Protection by Design and by Default." For users in California, the Company complies with the CCPA, explicitly affirming that the Company does not "sell" or "share" personal information for cross-context behavioral advertising.
8. Sub-processors and Third-Party Data Sharing
The Company operates under a strict paradigm of data minimization and operational necessity. We do not rent, trade, sell, or otherwise distribute User information to any third parties for marketing or advertising purposes.
8.1. Authorized Infrastructure Providers
To deliver the Service, the Company engages rigorously vetted third-party service providers (Sub-processors) to facilitate critical infrastructure components. These include enterprise cloud hosting platforms (e.g., Amazon Web Services, Google Cloud Platform, Supabase, Render), encrypted communication gateways (e.g., Twilio, SendGrid), and payment processing facilities (e.g., Stripe).
All engaged Sub-processors are bound by stringent Data Processing Agreements (DPAs) that enforce privacy, confidentiality, and security obligations consistent with, and at least as protective as, the standards outlined within this Privacy Policy. The Company regularly audits these Sub-processors to ensure adherence to SOC 2 Type II and ISO 27001 compliance standards.
8.2. Compelled Legal Disclosures
The Company reserves the right to disclose pseudonymized backend data or localized metadata if legally compelled to do so by a court of competent jurisdiction, a valid subpoena, or a binding administrative order. In such events, the Company will, to the extent legally permissible, provide the User with prior written notice of the request to allow the User an opportunity to seek a protective order or quash the subpoena.
However, the Company explicitly notes to both Users and regulatory authorities that, due to the proprietary Zero-Knowledge architecture and the on-device privacy filter, the Company possesses absolutely no technical capacity to surrender decrypted PII, raw email communications, or unhashed student names to any governmental, judicial, or law enforcement entity, as the Company does not possess the localized cryptographic decryption keys.
9. Data Retention and Deletion: The Automated Data Lifecycle Protocol
The Company rejects the industry practice of indefinite data retention. Instead, the Service implements a structured, automated data lifecycle management system, the Automated Data Lifecycle Protocol, ensuring strict adherence to the principles of data minimization and storage limitation.
9.1. Automated Archiving and Cascading Sweeps
The Automated Data Lifecycle Protocol continuously monitors the temporal relevance of ingested communication vectors. When an academic term concludes, or when a specific data payload (such as a summarized calendar event) exceeds its operational utility timeframe, the protocol initiates an automated archiving sequence. The system executes automated database sweeps across the relational and vector databases, permanently and irretrievably purging deprecated, pseudonymized records. This ensures that the cloud infrastructure remains lean and that historical data exposure is systematically mitigated over time.
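The retention check at the heart of such a sweep can be sketched as a filter over ingested records. The 180-day TTL and the record field names are illustrative assumptions, not the Service's actual retention schedule.

```python
from datetime import datetime, timedelta
from typing import List

def sweep(records: List[dict], now: datetime, ttl_days: int = 180) -> List[dict]:
    """Return only the records still within their retention window.

    Anything filtered out—records past the TTL or tied to a concluded
    academic term—would be irreversibly purged from the relational and
    vector stores in the real protocol.
    """
    cutoff = now - timedelta(days=ttl_days)
    return [
        r for r in records
        if r["ingested_at"] >= cutoff and not r.get("term_concluded", False)
    ]
```

Running such a filter on a schedule, rather than on user request, is what makes the minimization "automated" in the sense the section describes.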
9.2. Account Deletion and the Right to Erasure
Users retain the absolute right to terminate their accounts and mandate the immediate deletion of associated data at any time. Upon receiving a verified deletion request, the Company executes a permanent, cryptographic erasure of the User's anonymized profile references, family authorization tokens, and all associated vector embeddings within a maximum of thirty (30) days.
For users situated in the European Union and California, this mechanism comprehensively fulfills the statutory "Right to be Forgotten" and "Right to Deletion." The Company emphasizes, however, that while deleting the cloud-based account severs the connection to the localized application and purges the pseudonymized backend data, the User must manually delete the native application from their physical device to purge the localized Apple Keychain or Android Keystore encryptions. The Company cannot remotely wipe a User's local hardware enclave.