There are more AI-powered tools pitched to school districts right now than at any point in the history of education technology. Attendance predictors. Writing assessors. Scheduling optimizers. Chatbots for parents, chatbots for teachers, chatbots for chatbots. Every one of them wants access to student data. And every one of them has a privacy page on their website.
Most of those pages say the same thing: "We take privacy seriously." What follows is either a wall of legal text that no director of technology has time to parse, or a handful of badge icons that may or may not mean what you think they mean.
District leaders are right to be skeptical. When you're responsible for the records of thousands of children — their names, their grades, their disciplinary history, their disability status, their home addresses — vague reassurances aren't enough. You need specifics.
So here are ours.
Our Operating Principle
Arcline operates as a "school official" under FERPA, with a legitimate educational interest in the data we access. In practice, that means: we access student data solely at the direction of the district. We never use it for advertising. We never sell it. We never share it with third parties outside the scope of the district's agreement. And we never train general-purpose AI models on it.
That last point matters more than it might seem. Many AI vendors aggregate data across customers to improve their models — your district's data helps make the product better for every other district. We don't do that. Your data trains nothing. It informs your queries and your reports. That's it.
The Specific Decisions We Made
Privacy principles are only as real as the engineering choices behind them. Here's what we actually built:
Data residency
All district data is stored in the United States, on AWS infrastructure in us-east and us-west regions. No data is processed or stored internationally. For districts in states with data residency requirements — Texas, California, New York, and a growing list of others — this isn't optional. We treat it as a baseline.
Encryption
AES-256 encryption at rest. TLS 1.3 for all data in transit. These are industry standards, not differentiators — but it's worth stating explicitly because not every vendor in the K-12 space meets them.
Role-based access control
A principal at Jefferson Elementary sees Jefferson Elementary's data. Not the district's. Not the school across town. A district curriculum coordinator sees aggregate trends across schools but doesn't see individual student disciplinary records unless their role requires it. Permissions are configured during onboarding with the district's IT team, and they mirror the access structures the district already has in place.
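The scoping described above can be sketched as a simple permission check. Everything in this sketch — the role names, the `Role` model, the `can_view` helper — is a hypothetical illustration of the pattern, not Arcline's actual schema:

```python
from dataclasses import dataclass

# Hypothetical role model, for illustration only.
@dataclass(frozen=True)
class Role:
    name: str
    school_ids: frozenset   # schools this role is scoped to
    student_level: bool     # may this role view individual student records?
    categories: frozenset   # data categories: "attendance", "discipline", ...

def can_view(role: Role, school_id: str, category: str, student_level: bool) -> bool:
    """Deny unless the role's scope covers the school, the data category,
    and (if requested) student-level detail."""
    return (
        school_id in role.school_ids
        and category in role.categories
        and (role.student_level or not student_level)
    )

principal = Role("principal", frozenset({"jefferson-elem"}), True,
                 frozenset({"attendance", "discipline", "grades"}))
coordinator = Role("curriculum-coordinator",
                   frozenset({"jefferson-elem", "madison-ms", "lincoln-hs"}),
                   False, frozenset({"attendance", "grades"}))

# The principal sees student-level data, but only at their own school:
assert can_view(principal, "jefferson-elem", "attendance", student_level=True)
assert not can_view(principal, "madison-ms", "attendance", student_level=True)

# The coordinator sees trends across schools, but not individual
# disciplinary records:
assert can_view(coordinator, "madison-ms", "attendance", student_level=False)
assert not can_view(coordinator, "madison-ms", "discipline", student_level=True)
```

The important property is that denial is the default: a request passes only when every dimension of the role's scope explicitly covers it.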
Query processing
When a user asks Arcline a question in natural language — "Which 9th graders have missed more than 15 days this semester?" — that query is processed against the district's own data, in the district's own encrypted environment. The query itself is not stored for product improvement. It is not used to train models for other customers. It is logged for the district's own audit trail and then it's done.
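The lifecycle described above — execute against district data, record an audit entry, retain nothing for training — can be sketched roughly as follows. The field names, the `run_query` function, and the stub executor are all hypothetical; this is the shape of the pattern, not Arcline's internals:

```python
from datetime import datetime, timezone

audit_log: list[dict] = []   # the district's own audit trail

def run_query(user: str, role: str, question: str, executor) -> list:
    """Execute a natural-language question against district data and append
    an audit record. The question is kept for the district's trail only;
    nothing is retained for model training or for other customers."""
    rows = executor(question)   # runs inside the district's environment
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "question": question,
        "rows_returned": len(rows),
    })
    return rows

# A stub standing in for the district-side query engine:
def stub_executor(question: str) -> list:
    return [{"student_id": "S-1042", "absences": 17}]

rows = run_query("j.smith", "attendance-coordinator",
                 "Which 9th graders have missed more than 15 days this semester?",
                 stub_executor)
```

Note that the log records who asked, what they asked, and how much data came back — exactly the trail an administrator would review.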
Compliance and agreements
We have a SOC 2 Type II audit in progress, with completion expected in Q3 2026. A signed Data Processing Agreement is available for every district — not behind a sales call, not after negotiation. If your procurement process requires a DPA before you evaluate the product, we'll have one in your inbox the same day.
The AI Question: Does It See Student Names?
This is the question we get most often from CTOs and directors of technology, and it deserves a straight answer: yes, when a query requires it.
If a superintendent in Clarke County asks "Which students are chronically absent at the middle school level?", the system returns a list of students. It has to — that's the question. A response that only returned aggregate counts wouldn't be useful to the administrator who needs to intervene.
But that access is governed by three constraints:
- RBAC permissions apply to AI queries identically to every other data access method. If your role doesn't grant you access to student-level attendance data, the AI won't return it. The natural language interface doesn't bypass permissions — it's bound by them.
- Every query that touches PII is audit-logged. The district can see who asked what, when, and what data was returned. These logs are available to district administrators at any time.
- PII never leaves the district's encrypted environment. Student names, IDs, and other personally identifiable information are not transmitted to external model providers. Arcline processes queries within the district's data boundary.
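A common industry pattern for enforcing a boundary like the third constraint is tokenization: swap identifiers for opaque placeholders before any text crosses the boundary, and map them back only inside it. The sketch below is a generic illustration of that pattern — the ID format, placeholder syntax, and function names are assumptions, not a description of Arcline's implementation:

```python
import re

# Hypothetical student-ID format, for illustration only.
STUDENT_ID = re.compile(r"\bS-\d{4,}\b")

def tokenize(text: str) -> tuple[str, dict]:
    """Replace student IDs with opaque placeholders before the text crosses
    the data boundary; return the mapping for local rehydration."""
    mapping = {}
    def repl(match):
        token = f"[[ID_{len(mapping)}]]"
        mapping[token] = match.group(0)
        return token
    return STUDENT_ID.sub(repl, text), mapping

def rehydrate(text: str, mapping: dict) -> str:
    """Restore the original identifiers inside the district boundary."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

masked, mapping = tokenize("Flag S-1042 and S-2077 for follow-up.")
assert "S-1042" not in masked          # nothing sensitive in the masked text
assert "S-1042" in rehydrate(masked, mapping)
```

The mapping never leaves the boundary, so even if the masked text did, the placeholders would be meaningless on their own.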
We could have designed the system to anonymize everything by default. We considered it. But after talking to dozens of directors of student services, attendance coordinators, and school counselors, the feedback was consistent: anonymized data is useful for board presentations, but when you're trying to figure out which kids need help, you need to know which kids need help.
Five Questions to Ask Any AI Vendor About Student Data
Whether you're evaluating Arcline or any other product, these are the questions that separate vendors who've thought about privacy from those who've only written about it:
1. Where is my data physically stored?
You want a specific answer — region and provider, not "the cloud." If the vendor can't tell you, that's a problem.
2. Is my data used to train or improve models for other customers?
This is the big one. If the answer is yes, or "only in aggregate," or "only de-identified," push harder. De-identification of student records is more fragile than most vendors acknowledge, especially in small districts where a grade level might have 40 kids.
3. Who can access student-level data, and how is that controlled?
Look for role-based access that maps to your existing permission structures. If the vendor's access model is "admin or not admin," it's not built for K-12.
4. What happens to my data when I cancel?
You want deletion, not archival. Within a defined timeframe. In writing.
5. Can you provide a signed DPA before we start?
Any vendor serious about K-12 privacy has a DPA ready to go. If they need to "check with legal," they haven't prioritized this.
Why This Matters More Than the Product
Privacy isn't a feature we ship alongside dashboards and query tools. It's a design constraint that shapes every engineering decision, every product choice, every conversation with a district. Where the data lives. How long we keep it. Who can see what. What happens when the contract ends. These aren't afterthoughts — they're the architecture.
In K-12, this isn't negotiable, because the data doesn't belong to us. It doesn't belong to the district, either. It belongs to children and their families. Every vendor in this space is a temporary custodian of something that matters far more than any software product.
We think about that every day. We think the vendors you work with should too.