Overview of NC Child Pornography Case

Raleigh Child Pornography Lawyer

A typical child pornography case begins with a CyberTip submitted to the National Center for Missing & Exploited Children (NCMEC) by an Electronic Service Provider (ESP), such as Google, Apple iCloud, Kik, or Flingster, reporting that a user has uploaded or shared suspected Child Sexual Abuse Material (CSAM) on the service.

Detection of Suspected Child Sexual Abuse Material

While Internet Service Providers (ISPs) and Electronic Service Providers are not obligated by law to search for CSAM or child pornography on their systems, many do so because of the damage to their business reputation of being seen as a haven for such material.  Moreover, ESPs can be shut down if they are seen as allowing such material.  In 2023, Leif K-Brooks, founder of the video chat service Omegle, shut the service down after 14 years of operation, even though it scanned its service and reported suspected CSAM.  K-Brooks cited the abuse of Omegle by traffickers in CSAM as a factor contributing to his decision to shut the service down.

Nearly all ESPs run software in the background of their services to automatically monitor for known CSAM.  Some services employ people or engage volunteers who monitor the chat or distribution services they operate.

But, given the volume of uploaded content and the risk that CSAM could proliferate, automated background software is the primary detection tool: it compares the hash values of known CSAM against content that has been uploaded, shared, or distributed on the ESP's service.
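The hash-matching idea can be sketched in a few lines of Python. This is only an illustration of the general technique: real ESP systems rely on industry hash databases and perceptual hashes such as Microsoft's PhotoDNA (which can match visually similar images), not plain cryptographic hashes, and the hash list and file contents below are hypothetical examples.

```python
import hashlib

# Hypothetical set of hash values of known flagged files.
# (This single entry is the SHA-256 digest of the bytes b"test",
# used purely for demonstration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Compare an upload's hash against the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES

# An exact copy of a known file matches; changing even one byte does not.
print(is_flagged(b"test"))   # True  - exact copy of a listed file
print(is_flagged(b"test!"))  # False - one byte different, hash changes
```

The last two lines illustrate why exact cryptographic hashing alone is brittle for this purpose, and why deployed systems favor perceptual hashing: a single altered byte produces a completely different SHA-256 digest, while a perceptual hash tolerates minor edits to an image.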

Call & Text (919) 352-9411

The Chetson Firm - Board Certified Raleigh Criminal Defense Lawyer


