The recent legal development involving The New York Times and OpenAI has raised serious questions about user data privacy. In a controversial move, the plaintiffs have asked a federal court to force OpenAI to retain all user content indefinitely, including prompts and outputs from ChatGPT and its API.
TRULEO Users Are Protected from This Court Order
At TRULEO, we understand how sensitive data in policing can be, especially when it involves personally identifiable information (PII), victim and witness statements, body-worn camera (BWC) footage or audio transcripts, and any officer-entered prompts or notes. That’s why we want to be clear:
We are the only vendor to proactively redact all PII from body camera audio before it is processed. This provides another layer of protection for victims, officers, and community members.
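To make the idea of pre-processing redaction concrete, here is a minimal, purely illustrative sketch of masking common PII patterns in a transcript before it is sent to any model. TRULEO's actual redaction pipeline is not public; the `redact` helper and the patterns below are hypothetical examples, not our production logic.

```python
import re

# Illustrative PII patterns only; a real redaction system covers far
# more categories (names, addresses, dates of birth, plate numbers, etc.)
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each matched PII pattern with a [LABEL] placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# The masked text, not the raw transcript, is what would be processed.
print(redact("Victim SSN 123-45-6789, callback 555-867-5309."))
```

The key design point is ordering: redaction happens before the text ever leaves the agency's pipeline, so the model provider never sees the raw PII at all.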
We also maintain a Zero Data Retention (ZDR) enterprise agreement with OpenAI, which means none of the other data our law enforcement partners send or receive is ever stored. No prompts or outputs are ever logged, stored, or retrievable, even by OpenAI. The recent court order applies only to data that is retained; because ours is never stored, it is entirely out of scope. It cannot be subpoenaed, leaked, or held by OpenAI, because it does not exist after processing.
Other Vendors Are Putting Departments at Risk
For police departments experimenting with AI report-writing tools, this new retention requirement raises significant privacy risks: prompts and outputs that were assumed to be transient may now be preserved indefinitely, where they can be subpoenaed or exposed.
A Final Word to Agencies Exploring AI
If your department is testing AI tools, make sure you ask the right questions: Is our data retained after processing? Is PII redacted before it reaches the model? Is there a Zero Data Retention agreement with the model provider?
If the answer to any of these is unclear or “no,” your agency could be at serious risk.
With TRULEO, you’re covered: no data retention, no storage, no risk.