Recent Example of Accidental Data Exposure

The startup Flock accidentally exposed 60+ of its AI-powered security camera feeds for the entire world to watch. It's just another reminder that security mistakes happen and data gets compromised, despite the best of intentions.

In case you missed it, a bunch of AI-powered security camera feeds were accidentally made available for the general public to access and watch. These cameras are run by Flock, and oops, somebody forgot to configure something important, leading to this data exposure!

A screenshot of an article from 404 Media about the data exposure from Flock.

(Read the full article from 404 Media here: https://www.404media.co/flock-exposed-its-ai-powered-cameras-to-the-internet-we-tracked-ourselves/)

Flock says on their website that their security framework is "built on transparency and verifiable implementation of industry expected safeguards. We have deployed rigorous controls across our platform that have been validated through independent third-party attestations including SOC 2 principles and ISO certifications to provide you with confidence that your data is consistently protected against a rapidly evolving threat landscape."

Of course they didn't WANT this leak to happen... to have live video feeds publicly accessible for all to watch. I fully believe they thought they had taken appropriate measures to prevent it.

But mistakes happen.

Human error happens.

SOC 2 and ISO certifications (or, for healthcare-related companies, HIPAA compliance) do not guarantee protection against mistakes, human error, or bugs in software. They don't make you immune, and there are plenty of examples (like this latest one from Flock) to confirm that.

What we should be asking ourselves is... Do we need these security cameras? Does this data need to exist, considering how easily it can fall into the wrong hands? Flock will close this specific loophole now, but incidents like this will continue to happen across all sorts of industries and data feeds. It's not a matter of "if" they'll happen; it's a matter of "when".

How does this relate to Quill?

More and more companies are encouraging therapists to record their entire therapy sessions, for the "convenience" of converting those audio conversations into transcripts, which can then be transformed into progress notes. And these companies are quick to reference their HIPAA compliance, offering that as a reassurance that all will be secure and protected.

But examples like this one from Flock make it clear: mistakes do happen. They will always happen.

Today, it's 60 live video feeds from security cameras. But could tomorrow bring... therapy session recordings? Therapy session transcripts?

Companies can have the best of intentions to keep this data secure! But data breaches will continue to happen, despite those intentions, because software and humans are not flawless.

Published on Jan. 2, 2026.

Data Breaches, HIPAA, Privacy, Security

Quill Therapy Solutions

What is Quill?

Quill streamlines progress notes for therapists, saving time by generating notes from a verbal or typed session summary. With privacy at its core, Quill never records client sessions, protecting the therapist-client relationship and avoiding ethical and confidentiality risks. Just record a summary, click a button, and Quill generates your notes for you.

Try Quill for free today, no credit card required. And for unlimited notes (and other types of therapy documentation), it's only $20/month. (Even less for teams.)

Try Quill and save time on notes.