Child Exploitation and Obscenity Section (CEOS)                               3/12/2021
Criminal Division
United States Department of Justice
Washington, D.C.

             Information Paper for Prosecutors and Law Enforcement Officers

CyberTips and Suppression:
Avoiding and Defending Against Fourth Amendment Claims

        CyberTipline Reports (“CyberTips”) are invaluable tools in the collaborative
effort to make the internet a safer place. The CyberTipline, operated by the
National Center for Missing and Exploited Children (NCMEC), serves as an
efficient, responsive, and organized method by which Electronic Service Providers
and Internet Service Providers (“Providers”) can report child exploitation-related
conduct occurring on their platforms to law enforcement. In this way, the
CyberTipline also assists Providers in their efforts to keep their platforms safe and
their brand perception positive.

      Because CyberTips are the starting point of many investigations, it behooves
prosecutors to understand exactly what information a CyberTip can provide and
what legal issues might arise in investigations involving CyberTips. A better
understanding of how the CyberTipline reporting process works can both prevent
the need for and inform a response to suppression claims in later stages of
prosecution.

Statutory Background

       Federal law requires NCMEC to operate the CyberTipline, and Providers to
report apparent instances of child pornography offenses. Specifically, 18 U.S.C. §
2258A(a) imposes a duty on Providers to submit a CyberTip report containing the
facts or circumstances from which there is an “apparent violation of section 2251,
2251A, 2252, 2252A, 2252B, or 2260 that involves child pornography.” 18 U.S.C. §
2258A(a)(1)(A)(i). Providers also have the discretion to submit reports concerning
planned or imminent child pornography offenses. 18 U.S.C. §§ 2258A(a)(1)(A)(ii)
and (a)(2)(B). Reports of apparent child pornography violations shall be submitted
“as soon as reasonably possible after obtaining actual knowledge of any facts or
circumstances” of the violation.

      The report may, in the sole discretion of the provider, include the following
information:
(1)      INFORMATION ABOUT THE INVOLVED INDIVIDUAL.—Information relating
            to the identity of any individual who appears to have violated or plans to
            violate a Federal law described in subsection (a)(2), which may, to the
            extent reasonably practicable, include the electronic mail
            address, Internet Protocol address, uniform resource locator, payment
            information (excluding personally identifiable information), or any other
            identifying information, including self-reported identifying information.

   (2)      HISTORICAL REFERENCE.—Information relating to when and how a
            customer or subscriber of a provider uploaded, transmitted, or received
            content relating to the report or when and how content relating to the
            report was reported to, or discovered by the provider, including a date and
            time stamp and time zone.

   (3)      GEOGRAPHIC LOCATION INFORMATION.—Information relating to the
            geographic location of the involved individual or website, which may
            include the Internet Protocol address or verified address, or, if not
            reasonably available, at least one form of geographic identifying
            information, including area code or zip code, provided by the customer or
            subscriber, or stored or obtained by the provider.

   (4)      VISUAL DEPICTIONS OF APPARENT CHILD PORNOGRAPHY.—Any visual
            depiction of apparent child pornography or other content relating to the
            incident such report is regarding.

   (5)      COMPLETE COMMUNICATION.—The complete communication containing
            any visual depiction of apparent child pornography or other content,
            including—
             (A) any data or information regarding the transmission of the
         communication; and

               (B) any visual depictions, data, or other digital files contained in, or
         attached to, the communication.

18 U.S.C. § 2258A(b). Upon submission of the report, Providers are required to
preserve the contents of the report for 90 days. 18 U.S.C. § 2258A(h). Critically,
Section 2258A(f) specifically says that Providers have no affirmative duty to search
for child pornography. Providers face criminal fines for willful failure to comply
with the reporting requirement. 18 U.S.C. § 2258A(e).

“Pursuant to its clearinghouse role as a private, nonprofit organization, and
at the conclusion of its review in furtherance of its nonprofit mission,” NCMEC shall
“make available” each CyberTip to federal, state, local, and/or foreign law
enforcement agencies involved in the investigation of child sexual exploitation,
kidnapping, or enticement crimes. 18 U.S.C. § 2258A(c). See also 34 U.S.C. §
11293(b)(1)(K)(i).

      Both Providers and NCMEC have criminal and civil immunity for any actions
taken to comply with the CyberTip reporting requirements, except for intentional,
reckless, or other misconduct. See 18 U.S.C. §§ 2258B, 2258D.

CyberTip Contents

       Providers first register an account with NCMEC. Once registered, Provider
personnel can submit reports and upload content via a secure connection to the
CyberTipline. Aside from required information as to incident type, date, and time,
reporters can also fill in voluntary reporting fields such as user or account
information, IP addresses, or information regarding the uploaded content itself.
NCMEC staff cannot change the information submitted by the Provider, but they
can review some submitted material and add supplemental information.

      Although each Provider has its own procedures and practices concerning the
information it includes in CyberTips, all CyberTips share a general organizational
format: Section A consists of information provided by the Provider; Section B
contains data automatically generated by NCMEC; and Section C summarizes the
information contained in the previous two sections, as well as any information that
was manually collected and added by NCMEC staff.

        Section A, “Reported Information,” shows which Provider submitted the tip;
the incident type and time; any associated URLs for the file (usually including
where the file was found on the platform prior to its removal); and user information
such as the user’s email address or IP address. The last part of Section A contains
“Uploaded File Information,” including the file name of each file and whether that
file was reviewed by the reporting Provider.

“Was the file reviewed by Company” indicates whether the reported image or
file has been subjected to review, as defined by that company. For instance,
according to a recent declaration by Google, when Google states in a CyberTip that
an image was viewed or reviewed by Google, it means that a human reviewer
viewed the image at or immediately before the time the report was made. However,
other Providers may define
review differently. Therefore, it is always useful to inquire of your Provider exactly
what form such review took.

        As of February 2014, NCMEC analysts will review an uploaded file (which
might contain an image, video, screenshot, email, or any other kind of file depicting
child pornography) only if the report indicates either that the file was viewed by the
reporting company or that it was publicly available. If neither is indicated, a filter
prevents NCMEC staff from viewing the uploaded file. Section C of the CyberTip
indicates whether NCMEC has opened or viewed any uploaded files submitted with
the report, and whether NCMEC has any information concerning the content of the
uploaded files other than information provided by the Provider.

Reviewing the Contents of CyberTips

       If a file was reviewed by the Provider or if the file was publicly available,
your agent may review that image without a search warrant pursuant to the
private search doctrine, discussed infra. The agent may then use that information
to develop probable cause for a search warrant for the premises of the target and/or
for the Provider account which is the subject of the CyberTip.

        If the file was not reviewed by the Provider, then law enforcement should
obtain a search warrant for the reported material before reviewing the contents of
files contained in the CyberTip.

Probable cause for the reported material itself can be articulated in various
ways: by reference to the file name, if it is explicit or suggestive; by reference to the
reported hash value, if it matches a known child pornography image; or if there
have been multiple individual CyberTips referencing the target’s specific username
or account information in conjunction with apparent child pornography. Including
information about the CyberTip reporting mandates may also help establish
probable cause.

       Certain CyberTips will also offer additional information about the reported
image and its contents to support probable cause. For instance, a CyberTip will
sometimes indicate whether an uploaded image is categorized or uncategorized.
Some Providers, including Google, categorize known images according to a shared
industry classification system. If an image has been categorized, the category
designation will speak to the contents of the image. For example, an image
designated as category “A1” depicts a prepubescent minor engaged in sexually
explicit conduct. “B1” indicates an image of a pubescent minor engaged in sexually
explicit conduct. Categories “A2” and “B2” denote lascivious exhibition images of
prepubescent and pubescent minors, respectively.
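For quick reference, the category scheme described above can be captured in a
simple lookup. The sketch below is purely illustrative: the two-character codes
combine an age band (A = prepubescent, B = pubescent) with a content type (1 =
sexually explicit conduct, 2 = lascivious exhibition), as described in this paper. It is
not an official industry vocabulary or tool.

```python
# Hypothetical lookup table paraphrasing the shared industry
# classification scheme described above (illustrative only).
CATEGORY_DESCRIPTIONS = {
    "A1": "prepubescent minor; sexually explicit conduct",
    "A2": "prepubescent minor; lascivious exhibition",
    "B1": "pubescent minor; sexually explicit conduct",
    "B2": "pubescent minor; lascivious exhibition",
}

def describe_category(code: str) -> str:
    """Return a plain-language description for a category code,
    or note that the image is uncategorized."""
    return CATEGORY_DESCRIPTIONS.get(code.upper(), "uncategorized")
```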

        Additionally, investigators can search various law enforcement databases to
see if any of the reported information (such as usernames or hash values) has been
previously logged by another agency. For example, if a CyberTip includes the hash
value of the reported image, a search of law enforcement databases might reveal a
matching image. An investigator may then review the image in the law enforcement
database to determine the content of the CyberTip image, without having to open
and review the CyberTip image itself. Through such resources, investigators may
gather enough evidence to establish a fair probability that the reported files contain
child pornography.

      Once an investigator has reviewed the images pursuant to a warrant (or
reviewed matching images in a law enforcement database) and confirmed that they
are sexually explicit images, investigators may seek a warrant for the target’s
residence as well as for the account used to receive or distribute sexually explicit
images. If investigators still have not gathered enough evidence to give rise to
probable cause, another option is to have agents obtain subscriber information via
subpoena for purposes of engaging in a “knock and talk.”

Responding to Motions to Suppress

       Defendants may file motions to suppress targeting the Provider (arguing that
the Provider is a state actor that violated the Fourth Amendment when it searched
its networks for child pornography and submitted the CyberTip), NCMEC (arguing
that it is a state actor that violated the Fourth Amendment when it processed the
CyberTip), or law enforcement (for example, challenging law enforcement’s
warrantless review of the contents of a CyberTip). These are discussed in turn.

      A.     Providers are Not Government Agents

       Numerous courts have held that Providers do not act as government agents
when they monitor their users’ activities on their servers, or when they implement
their own internal security measures against users engaging in illegal activity
through their services. See United States v. Stevenson, 727 F.3d 826, 831 (8th Cir.
2013) (noting that “AOL's decision on its own initiative to ferret out child
pornography does not convert the company into an agent or instrument of the
government for Fourth Amendment purposes .... AOL's voluntary efforts to achieve
a goal that it shares with law enforcement do not, by themselves, transform the
company into a government agent.”); United States v. Cameron, 699 F.3d 621, 637-
38 (1st Cir. 2012) (holding Yahoo!, Inc., did not act as agent in searching emails and
sending reports to NCMEC); United States v. Richardson, 607 F.3d 357, 366 (4th
Cir. 2010) (holding that AOL's scanning of email communications for child
pornography did not trigger Fourth Amendment's warrant requirement because no
law enforcement officer or agency asked provider to search or scan defendant's
emails); United States v. Stratton, 229 F. Supp. 3d 1230, 1236-39 (D. Kan. 2017)
(holding that Sony was not government agent when it searched images stored on
defendant's PS3); United States v. DiTomasso, 81 F. Supp. 3d 304, 309-11 (S.D.N.Y.
2015) (chat service provider Omegle held not to be Government agent and its search
of defendant's chat messages held to be pure private search beyond the reach of the
Fourth Amendment); United States v. Miller, No. 8:15CR172, 2015 WL 5824024, at
*4 (D. Neb. Oct. 6, 2015) (holding that Google is “private, for profit entity” that
“complied with its statutory duty to report violations of child pornography laws” and
did not become a state actor by doing so); United States v. Ackerman, No. 13-10176-
01-EFM, 2014 WL 2968164, at *5-6 (D. Kan. July 1, 2014) (holding that AOL is not
state actor), rev'd on other grounds, 831 F.3d 1292 (10th Cir. 2016); United States v.
Drivdahl, No. CR-13-18-H-DLC, 2014 WL 896734, at *3-4 (D. Mont. Mar. 6, 2014)
(holding that Google is not government agent); United States v. Keith, 980 F. Supp.
2d 33, 40-42 (D. Mass. 2013) (holding that AOL is not government agent).

Section 2258A(f) specifically says that Providers are not required by law to
search their platforms for child pornography. Rather, they are only required by 18
U.S.C. § 2258A(a) to report any child pornography imagery they become aware of on
their platforms. Most Providers will provide prosecutors with a declaration stating
that they have a strong business interest in enforcing their terms of service and
ensuring that their services are free of illegal content, particularly child sexual
abuse material, and that they independently and voluntarily take steps to monitor
and safeguard their platforms. A Provider’s independent and voluntary steps to
monitor and safeguard its platform in furtherance of its own business interests do
not make it a government actor. Thus, the Fourth Amendment is not implicated,
and no warrant is required for a Provider to monitor, search, and safeguard its
platform.

      B.     NCMEC is Not a State Actor

      In United States v. Keith, 980 F. Supp. 2d 33, 41-42 (D. Mass. 2013), the court
found that NCMEC acted as a state actor when it opened and examined a file that
an ISP had submitted in a CyberTipline Report but had not itself reviewed.
Following Keith, NCMEC changed its procedures for the review of files uploaded to
the CyberTipline.

        Subsequent to Keith, the Tenth Circuit held that NCMEC’s warrantless
search of a defendant’s email violated the Fourth Amendment in United States v.
Ackerman, 831 F.3d 1292, 1306-1307 (10th Cir. 2016). In Ackerman, AOL’s hash
filtration system identified one attachment to an email as child pornography. Id.
The NCMEC analyst received and reviewed not only that image, but also the email
which contained it and three additional attachments. The content of the email and
the additional attachments had not been previously examined by AOL. Id. The
court found that NCMEC was a government entity, or in the alternative was a state
actor, and that its search exposed private, non-contraband information outside the
scope of AOL’s initial search, in violation of the Fourth Amendment. Id. 1

      1  When the Ackerman court reversed the district court’s denial of a motion to
suppress, it remanded the case for further consideration of whether the third-party
doctrine precludes the defendant’s Fourth Amendment claim (that is, whether the
defendant had a reasonable expectation of privacy in material he shared with a
third party, in this case, AOL). Id. at 1304-1308. On remand, the district court
again denied the motion to suppress, and the defendant appealed. In Ackerman II,
the Tenth Circuit affirmed the denial, holding that the good-faith exception to the
exclusionary rule applied. See United States v. Ackerman, 804 F. App’x 900, 2020
WL 916073, at *3 (10th Cir. Feb. 26, 2020). The opinion does not, however, address
whether the defendant had a reasonable expectation of privacy in his email and the
four attachments. Prosecutors should not argue that a defendant lacks a reasonable
expectation of privacy in an email or other digital account unless the argument is
supported by specific facts in the case. In Ackerman, for example, the government
relied on the fact that the defendant’s account had been terminated by the provider
pursuant to its terms of service.

       If you receive a suppression motion directly implicating NCMEC, you should
contact NCMEC’s John Shehan or Yiota Souras immediately. They can be reached
at JShehan@NCMEC.ORG and YSouras@NCMEC.ORG.

             i.    Policy Changes Following Keith

       Critically, NCMEC changed its procedures following the Keith decision:
NCMEC analysts no longer review a file attached to a CyberTip unless the Provider
indicates that it reviewed the file or that the file was found in a publicly accessible
place. Of note, the CyberTip at issue in Ackerman was sent before Keith was
decided, and so was processed before NCMEC stopped reviewing the contents of
reports as a matter of course. Therefore, the first line of attack in any case
involving a CyberTip sent after February 2014 is to demonstrate that NCMEC took
no action that exceeded the scope of the private search, if any, conducted by the
Provider. To do so, prosecutors should submit declarations from the Provider and
NCMEC as discussed above. All of the case law concerning the private search
doctrine, discussed infra, applies with equal force to NCMEC’s procedures in the
post-Keith era.

             ii.   Legislative Developments in Response to Ackerman

       Further, prosecutors should note that the CyberTipline statutes were
amended in response to Ackerman. See The CyberTipline Modernization Act of
2018, Pub. L. No. 115-395, 132 Stat. 5287 (Dec. 21, 2018). This legislation revises
18 U.S.C. §§ 2258A – 2258E to make clear that NCMEC serves a “clearinghouse
role as a private, nonprofit organization,” and that any review it conducts of
CyberTips is “in furtherance of its nonprofit mission.” 18 U.S.C. § 2258A(c). See
also 18 U.S.C. § 2258D(a) (referencing NCMEC’s “clearinghouse role as a private,
nonprofit organization and its mission to help find missing children, reduce online
sexual exploitation of children and prevent future victimization.”). The
CyberTipline is operated to “reduce the proliferation of online child sexual
exploitation and to prevent the online sexual exploitation of children.” Id. at
Section 2258A(a)(1)(A). See also 18 U.S.C. § 2258(b) (information is reported in a
CyberTip in “an effort to prevent the future sexual victimization of children”).

       Similarly, 34 U.S.C. §§ 11291 and 11293(b), the statutes that set forth
NCMEC’s duties and responsibilities, were also amended following Ackerman. See
The Missing Children’s Assistance Act, Pub. L. No. 115-267, § 2, 132 Stat. 3757-3760
(Oct. 11, 2018). 2 For example, 34 U.S.C. § 11291(10) (2017) had said that NCMEC:

      (A) serves as a national resource center and clearinghouse;

      (B) works in partnership with the Department of Justice, the Federal
      Bureau of Investigation, the United States Marshals Service, the
      Department of the Treasury, the Department of State, the Bureau of
      Immigration and Customs Enforcement, the United States Secret
      Service, the United States Postal Inspection Service, and many other
      agencies in the effort to find missing children and prevent child
      victimization; and

      (C) operates a national network, linking the Center online with each of
      the missing children clearinghouses operated by the 50 States, the
      District of Columbia, and Puerto Rico, as well as with international
      organizations, including Scotland Yard in the United Kingdom, the
      Royal Canadian Mounted Police, INTERPOL headquarters in Lyon,
      France, and others, which enable the Center to transmit images and
      information regarding missing and exploited children to law
      enforcement across the United States and around the world instantly.

      This statute now says that NCMEC:

      (A) serves as a nonprofit, national resource center and clearinghouse to
      provide assistance to victims, families, child-serving professionals, and
      the general public;

      (B) works with the Department of Justice, the Federal Bureau of
      Investigation, the United States Marshals Service, the Department of
      the Treasury, the Department of State, U.S. Immigration and Customs
      Enforcement, the United States Secret Service, the United States
      Postal Inspection Service, other agencies, and nongovernmental
      organizations in the effort to find missing children and to prevent child
      victimization; and

      (C) coordinates with each of the missing children clearinghouses
      operated by the 50 States, the District of Columbia, Puerto Rico, and
      international organizations to transmit images and information
      regarding missing and exploited children to law enforcement agencies,
      nongovernmental organizations, and corporate partners across the
      United States and around the world instantly.

34 U.S.C. § 11291(7) (2018) (emphasis added). These changes emphasize NCMEC’s
mission as a nonprofit organization and highlight its work with those outside law
enforcement.

      2  These statutory changes were effectively made twice, as nearly identical
legislative language was included in The Trafficking Victims Protection Act of 2017,
Pub. L. No. 115-393, § 202, 132 Stat. 5267-5270 (Dec. 21, 2018).

      In a similar vein, Section 11293(b), which lists NCMEC’s various
responsibilities, was amended to capture NCMEC’s extensive work with child-
serving professionals, the general public, and the private sector in furtherance of its
mission, and its efforts to support victims and their families. The extensive changes
made to Section 11293(b) are not discussed in detail here, but a redline which shows
how the statute was amended by The Missing Children’s Assistance Act is included
as an Appendix to this article.

       The Ackerman court looked almost exclusively at 18 U.S.C. § 2258A and 34
U.S.C. § 11293 when analyzing whether NCMEC was a government entity.
Ackerman, 831 F.3d at 1296-1297. 3 Prosecutors outside the Tenth Circuit have
been, and still are, encouraged to argue that Ackerman was incorrectly decided, but
this intervening legislation provides an opportunity to argue in the alternative that
Ackerman’s conclusion that NCMEC is a government entity is no longer valid and
should not be followed. Even prosecutors in the Tenth Circuit should consider using
this legislation to challenge this aspect of Ackerman, as the statutory framework
which underpins Ackerman’s analysis has changed.

      3   At the time Ackerman was decided, Section 11293 was codified at 42 U.S.C.
§ 5773.

       The Supreme Court has held that when “… the Government creates a corporation
by special law, for the furtherance of governmental objectives, and retains for itself
permanent authority to appoint a majority of the directors of that corporation, the
corporation is part of the Government...” Lebron v. Nat’l R.R. Passenger Corp., 513
U.S. 374, 399 (1995). Looking only at the operative statutes (which are just one data
point in the overall analysis), they now paint a very different picture. 4 For
example, the amendments to Section 11291 and 11293 highlight NCMEC’s work
with the general public, victims, families, educators, child-serving professionals,
and the private sector. And the legislation undercuts Ackerman’s conclusion that
NCMEC operates the CyberTipline solely as a law enforcement function.
Ackerman, 831 F.3d at 1296. Rather, Section 2258A(c) now makes clear that
NCMEC receives and forwards CyberTips “[p]ursuant to its clearinghouse role as a
private, nonprofit organization” and “in furtherance of its nonprofit mission”. 5
While much of NCMEC’s work still supports law enforcement, and while law
enforcement is clearly a “governmental objective” within the meaning of Lebron, the
statutes now show that NCMEC has a much broader constituency and mandate. As
such, it cannot be said that as a statutory matter NCMEC, in its entirety, is a
governmental entity.

      4  In its Petition for Panel Rehearing, one of the United States’ primary
criticisms of Ackerman was that the Tenth Circuit reached its conclusion that
NCMEC was a government entity without the benefit of any fact-finding on that
point by the district court, particularly with respect to NCMEC’s governing
structure. Prosecutors litigating the government entity issue should develop a
robust record, and in particular should note that NCMEC no longer has law
enforcement representatives on its Board of Directors and, with respect to its office
space, has physically separated law enforcement from NCMEC staff.

      5  The Ackerman court relied in part on the fact that the statute at the time
said that “when NCMEC confirms it has received a report, the ISP must treat that
confirmation as a request to preserve evidence issued by the government itself,”
citing 18 U.S.C. § 2258A(h)(1) (2016) (“the notification to an [ISP] ... by the
CyberTipline of receipt of a report ... shall be treated as a request to preserve, as if
such request was made pursuant to section 2703(f).”). Ackerman, 831 F.3d at 1297.
This statutory provision was changed by The CyberTipline Modernization Act, such
that the preservation requirement is now triggered by the Provider’s submission of
the report, as opposed to NCMEC’s confirmation of receipt: “a completed submission
by a provider of a report to the CyberTipline under subsection (a)(1) shall be treated
as a request to preserve the contents provided in the report for 90 days after the
submission to the CyberTipline.” Pub. L. No. 115-395, § 2, 132 Stat. 5291; 18 U.S.C.
§ 2258A(h)(1) (2019).

      C.     Law Enforcement’s Review of Images Viewed and Reported by
             a Provider

             i.    Law Enforcement Replicates the Provider’s Review of the
                   Image

      In response to defense suppression arguments that law enforcement’s review
of images reported by a Provider constitutes an unlawful search, prosecutors will
need to make a threshold determination as to whether the Provider reviewed the
reported image. If personnel at the Provider previously viewed the reported image,
prosecutors should be able to argue persuasively that law enforcement’s review of
the images was within the ambit of the private search doctrine and did not exceed
the scope of the private search conducted by the Provider, and that therefore no
Fourth Amendment violation occurred.

       The protection of the Fourth Amendment extends to governmental action
only; “it is wholly inapplicable ‘to a search or seizure, even an unreasonable one,
effected by a private individual not acting as an agent of the Government or with
the participation or knowledge of any governmental official.’ ” United States v.
Jacobsen, 466 U.S. 109, 113 (1984) (quoting Walter v. United States, 447 U.S. 649
(1980) (Blackmun, J., dissenting)). So once an individual’s expectation of privacy in
particular information has been frustrated by a private individual, the Fourth
Amendment does not prohibit law enforcement’s subsequent use of that
information, even if obtained without a warrant. Id. at 116. As a result, a
warrantless law-enforcement search conducted after a private search violates the
Fourth Amendment only to the extent to which it is broader than the scope of the
previously occurring private search. Id. at 115; see also United States v. Garcia–
Bercovich, 582 F.3d 1234, 1238 (11th Cir. 2009). As the Sixth Circuit has explained,
a government search will be deemed to stay within the scope of the private search
when “the officers in question had near-certainty regarding what they would find
and little chance to see much other than contraband.” United States v.
Lichtenberger, 786 F.3d 478, 486 (6th Cir. 2015). 6

      6 There is a split in the Circuits applying the private search reconstruction
doctrine to the scope of a computer search. In United States v. Runyan, 275 F.3d 449
(5th Cir. 2001), and Rann v. Atchison, 689 F.3d 832 (7th Cir. 2012), the Fifth and
Seventh Circuits take the view that when a single computer file has been searched
by a private party, the entire physical device has been searched and the government
can search the entire computer without a warrant. In United States v. Lichtenberger,
786 F.3d 478 (6th Cir. 2015), and United States v. Sparks, 806 F.3d 1323 (11th Cir.
2015), the Sixth and Eleventh Circuits adopted a rule that the proper scope of review
is the data or file instead of the physical device, so that anything else the
government views on the device that was not actually viewed by the private party
exceeds the scope of the private search. This split has little implication for CyberTips,
as law enforcement will not be reviewing an entire device or the kind of “complex
electronic devices” at issue in Lichtenberger (computer), Sparks (cellular telephone),
Rann (memory card), and Runyan (computer disks). The only items law enforcement
will be viewing in CyberTip Reports are the images that the Provider previously
reviewed and included in its report.

       The reasonableness of a particular intrusion by the government is “appraised
on the basis of the facts as they existed at the time that invasion occurred.”
Jacobsen, 466 U.S. at 115. Under the private search doctrine, the critical measures
of whether a governmental search exceeds the scope of the private search that
preceded it are how much information the government stands to gain when it re-
examines the evidence and, relatedly, how certain it is regarding what it will find.
Id. at 119–20. For example, in United States v. Bowers, 594 F.3d 522, 526 (6th Cir.
2010), the defendant’s roommate’s boyfriend discovered a photo album containing
what he believed to be child pornography in the defendant’s bedroom dresser. The
Sixth Circuit upheld the agents’ search of the photo album under the private search
doctrine because the roommate had already described the contents of the album to
agents; the agents therefore knew the album contained child pornography,
“learn[ed] nothing that had not previously been learned during the private search,”
and “infringed no legitimate expectation of privacy.” See also United States v.
Richards, 301 F. App’x 480, 483 (6th Cir. 2008) (police entry into a storage unit
containing images of child pornography was sufficiently limited under the private
search doctrine because “[t]he officers merely confirmed the prior knowledge that
[the private party] learned earlier in the day—that unit 234 contained child
pornography.”).

       In United States v. Drivdahl, No. CR 13-18-H-DLC, 2014 WL 896734, at *4
(D. Mont. Mar. 6, 2014), Google provided CyberTips and a supplemental report,
which NCMEC sent to Pennsylvania law enforcement for further investigation.
Because a Google employee had opened and viewed the reported material prior to
the submission of the CyberTip, the court held that a warrant was not required, as
the government search had not expanded upon the private search. Id.

       Cases such as Drivdahl illustrate that, in relying on the private search
doctrine, the key is to educate the court on the Provider’s review and reporting
process. It is also important to show the court both how the CyberTipline is
operated and for what reasons, as well as how a reported image was treated both by
the Provider and NCMEC in your particular case. This is where declarations or
affidavits can prove useful. It may be most helpful to the court to get two
declarations: one from the Provider and one from NCMEC. As mentioned above, a
CyberTip will indicate whether the Provider previously viewed the image or not, but
will not provide any further details as to when or under what circumstances. Thus,
submitting declarations as an attachment to your response in opposition will
provide further facts to support your argument under the private search doctrine.

       Provider Declaration. A declaration from the reporting Provider should
focus on whether, when, and how it viewed the reported material, and should
also offer facts showing that the Provider was not acting as an agent of the
Government when it initially reviewed the files. A declaration articulating
that the Provider already reviewed the reported material provides the strongest
foundation on which to argue that any subsequent, co-extensive search by NCMEC
or an investigator has not violated the Fourth Amendment.

       A declaration from the Provider should also inform the court about the
platform's particular terms of service (TOS). The TOS bear on the user's
reasonable expectation of privacy, if any, in the content of his or her account
by making explicit whether the Provider has informed the user that it may
review or remove content. See, e.g., United States v. Stratton, 229 F. Supp. 3d
1230, 1242 (D. Kan. 2017) (Provider policy "explicitly nullified its users'
reasonable expectation of privacy" where users had to agree to the Terms of
Service Agreement, which reserved the right to monitor online activity and turn
over information to law enforcement, before signing up for an account). TOS
also often contain language supporting the idea that a Provider has a strong
business interest in keeping its platform free from contraband and other
illegal activity. For these reasons, it may
also be helpful to attach the actual TOS agreement existing at the time of the
CyberTipline Report, as it appears to a user, as an additional exhibit to your
Provider declaration.

       Knowing the facts about the CyberTip reporting Provider is essential, as
each Provider's process and type of review may differ. For instance, Google
uses its own proprietary hashing technology to identify
apparent child sexual abuse images. According to Google’s procedures, “before a
hash is added to Google’s repository of confirmed child sexual abuse material, an
offending image will be reviewed manually by a Google employee." United States v.
Lien, No. 16-CR-00393-RS-1, 2017 U.S. Dist. LEXIS 188903, at *9 (N.D. Cal. May
10, 2017). See also United States v. Miller, No. 16-47-DLB-CJS, 2017 WL 2705963,
at *1 (E.D. Ky. June 23, 2017) (finding evidence indicated that Google itself had
already viewed the images and identified them as apparent child pornography
before agent’s search as “no hash is added to [Google’s] repository without the
corresponding image first having been visually confirmed by a Google employee to
be apparent child pornography”) (internal citation omitted). In some cases, after an
image is flagged, Google does not view the image again, but instead automatically
reports the user to the CyberTipline. Id. In other cases, Google personnel will
conduct a manual, human review before reporting it to NCMEC. Id.
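To make the review-then-hash sequence these cases describe concrete, the sketch below models a provider-side repository in which no hash is added without prior human confirmation, and later uploads are compared automatically. This is a simplified illustration only: Google's actual system is proprietary, and the class, method names, and use of SHA-256 here are hypothetical assumptions, not Google's implementation.

```python
import hashlib

class HashRepository:
    """Hypothetical repository of hashes of confirmed images.

    Mirrors the process described in Lien and Miller: a hash enters the
    repository only after a human reviewer has visually confirmed the image.
    """

    def __init__(self):
        self._confirmed_hashes = set()

    def add_after_human_review(self, file_bytes, reviewer_confirmed):
        # No hash is added without prior visual confirmation by an employee.
        if not reviewer_confirmed:
            raise ValueError("image must first be visually confirmed")
        self._confirmed_hashes.add(hashlib.sha256(file_bytes).hexdigest())

    def matches(self, file_bytes):
        # Automated comparison of uploaded content against the repository;
        # no human views the file at this stage.
        return hashlib.sha256(file_bytes).hexdigest() in self._confirmed_hashes

repo = HashRepository()
repo.add_after_human_review(b"<reviewed image bytes>", reviewer_confirmed=True)

# A later upload of a duplicate is flagged automatically, without any
# employee viewing the image again.
print(repo.matches(b"<reviewed image bytes>"))   # a duplicate: flagged
print(repo.matches(b"<different image bytes>"))  # no match: not flagged
```

Note that a cryptographic hash such as SHA-256 flags only exact, byte-for-byte duplicates of a confirmed image; this is why the facts matter as to whether a human viewed the original image before its hash entered the repository.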

       NCMEC Declaration. A declaration from NCMEC can offer further support
to your response in opposition. First, the declaration can confirm what information
was provided to NCMEC at the time of the CyberTip and clarify whether NCMEC
personnel reviewed anything beyond what the Provider had reviewed at that time.
Second, an NCMEC declaration can provide useful context to the court on how the
CyberTipline works and to what extent NCMEC and the CyberTipline intersect
with governmental law enforcement actors. An NCMEC declaration can also give the
court background information on the CyberTip process and the relevant policies and
procedures for review, as well as flesh out the argument that NCMEC is a private
entity rather than a government actor. The declaration will usually include
information such as the fact that NCMEC is a private, non-profit organization
which receives substantial support from private funding and in-kind donations. It
should also make clear that NCMEC created and operates the CyberTipline without
any government direction. In a circuit that has not held that NCMEC is a
government agent, this argument is worth making, although the court should also
be made aware that, if the searches are co-extensive, there is no need for the
court to reach a decision on the issue.

      ***

      When seeking a declaration, it is useful to have a conversation with the
declarant about the purpose of the declaration. Many Providers, as well as
NCMEC, receive multiple requests regarding diverse factual scenarios, and it
can be helpful
for prosecutors to identify specific issues where clarification is needed. It may also
be useful to follow up with the Provider after the pertinent issues are decided,
which can help to inform the Provider regarding the impact of their policies. A
sample declaration (from Google) is appended at the end of this article.

       Armed with declarations from NCMEC and the Provider, the government can
persuasively argue that even if the Provider’s search constituted a “private search”
for Fourth Amendment purposes, an investigator’s review of the same image did not
effect an intrusion on a defendant's privacy interest that he did not already
experience as a result of the private search. Thus, under the private search
doctrine, law enforcement’s review of the same imagery previously viewed and
reported by the Provider does not constitute a search in violation of the Fourth
Amendment.

             ii.    Warrantless Review of Images Flagged Via Hash Matching
                    Technology Without Human Review – Some Caution
                    Recommended

      Some Providers use hashing systems to identify apparent child pornography,
but do not conduct their own manual or human review of images or videos flagged
by hash value as a matter of course. In that event, it is advisable to obtain a
warrant to review the content of the reported image or to attempt to match the hash
to images in law enforcement databases, because there is litigation risk in
reviewing the content of a hash-matched image in a CyberTipline report that has
not been human-reviewed by the Provider.

       The court in Keith opined that “matching the hash value of a file to a stored
hash value is not the virtual equivalent of viewing the contents of the file.” 980 F.
Supp. 2d at 43. Post-Keith, the majority of cases that have denied a suppression
claim because the government did not exceed the scope of a Provider's private
search have highlighted the fact that the reporting Provider at least conducted a
manual review of an image at some point in time, even if that was before hash
review and the generation of a report. For instance, in United States v. Wilson,
No. 3:15-CR-02838-GPC, 2017 WL 2733879 (S.D. Cal. June 26, 2017) (unpublished),
a district court denied a motion to suppress where a law enforcement officer
viewed four images that Google had flagged using its proprietary hashing
technology, all of which had been reviewed by Google personnel and determined to be
apparent child pornography before they were hashed and included on Google's hash
list. Id. at *10. The court found that the searches were co-extensive and there was
no Fourth Amendment violation. Id. at *10-11.

       In Drivdahl, the court drew a distinction between the facts before it (a
Google declaration confirmed that an employee had put eyes on the image) and
those in Keith, in which "AOL's internal process for discovering child
pornography relied entirely on algorithmic 'hash value' information." Id. at *4.
The court in United States v. (William) Miller drew a similar distinction,
emphasizing the fact that all images in Google's repository have been visually
confirmed by a Google employee. 2017 WL 2705963, at *6 (citing Keith at 37 n.2
and 42-43). Thus,
the greater body of case law suggests that the safest way to rely on the private
search doctrine is to highlight facts indicating that someone within the reporting
Provider viewed the images, whether prior to, subsequent to, or concurrent with
any hash review.

       One recent case appears to support an argument that hash matching
technology, without any human review by the Provider, constitutes a private search
that allows law enforcement to view the reported files without a search warrant.
However, for the reasons set forth below, we caution against reliance on this case
law.

       In United States v. Reddick, 900 F.3d 636 (5th Cir. 2018) (cert. filed),
the Fifth Circuit deemed an automated review by Microsoft, using PhotoDNA, to
be a private search and held that law enforcement did not exceed the scope of
that search because the agent opened only the files that had been identified as
child pornography via a PhotoDNA match. After the defendant in Reddick had
uploaded files to SkyDrive, Microsoft's PhotoDNA program automatically reviewed
the hash values of the uploaded SkyDrive files and compared them against an
existing database of known child pornography hash values. Law enforcement was
then alerted that the hash values of the files corresponded to hash values of
confirmed child pornography images. Id. at 638.

       PhotoDNA automatically scans files and compares their hash values
against hash values of known images of child pornography. Reddick, 900 F.3d at
639. See also https://www.microsoft.com/en-us/photodna ("PhotoDNA creates a
unique digital signature (known as a 'hash') of an image which is then compared
against signatures (hashes) of other photos to find copies of the same image.
When matched with a database containing hashes of previously identified illegal
images, PhotoDNA is an incredible tool to help detect, disrupt, and report the
distribution of child exploitation material.") (last accessed Apr. 4, 2019).

       Thus, in Reddick, no one from the private hosting service visually examined
the images before they were reported to law enforcement. Noting that the “exact
issues presented by this case may be novel,” the Reddick Court, relying on the
private search doctrine, held that whatever expectation of privacy the defendant
might have had in the hash values of his files was frustrated by the service's
private search, and the law enforcement search did not violate the Fourth Amendment
because the hash comparison had already indicated with “almost absolute
certainty” that the files were child pornography. Id. at 640. Since law enforcement
reviewed only those files whose hash values corresponded to the hash values of
known child pornography images, as ascertained by the PhotoDNA program, the
Court found no Fourth Amendment violation. Id. at 640.
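PhotoDNA's algorithm is proprietary, but the core idea the Reddick court credited, that near-duplicate images yield near-identical signatures, can be shown with a deliberately toy "average hash." Everything here (the function names, the nine-pixel grayscale image, the match threshold) is a hypothetical stand-in for illustration, not Microsoft's implementation.

```python
def average_hash(pixels):
    """Toy perceptual hash of a grayscale image (list of 0-255 values):
    one bit per pixel, set when the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def is_match(h1, h2, max_differing_bits=2):
    # Unlike a cryptographic hash, a perceptual hash tolerates small
    # alterations: near-identical images yield near-identical bit strings.
    return sum(a != b for a, b in zip(h1, h2)) <= max_differing_bits

original = [10, 200, 30, 220, 15, 210, 25, 230, 12]
brightened = [p + 5 for p in original]  # a slightly edited copy

# A minor brightness change leaves every bit unchanged, so the edited
# copy still matches the signature of the original.
assert is_match(average_hash(original), average_hash(brightened))
```

The design point is that a small edit leaves the signature essentially unchanged, which is what lets this style of matching identify "copies of the same image" even after minor alterations, rather than only exact byte-for-byte duplicates.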

        Reddick equates a private party’s digital hash scan of a file to the opening
and reviewing of the contents of a digital image by the private party. Id. at 639.
While that novel interpretation may prove to be a useful and defensible view of the
significance of a hash value scan, to date, no other circuit has had occasion to
address that specific issue. Moreover, Reddick clearly limited its interpretation to
files that were actually scanned and matched via hash value, as opposed to other
associated files or communications that may be included along with a report from a
private entity. Id. at 639-640. Based on Reddick, prosecutors in the Fifth Circuit
may choose to authorize law enforcement to review CyberTips without a search
warrant if the material was found through equivalent means, even though that
imagery has not previously been viewed by the Provider. In a situation where a
law enforcement agent has already conducted a warrantless review of a
hash-flagged image, Reddick should prove useful in defending against a
suppression challenge.

       However, all prosecutors, including those in the Fifth Circuit, may want
to avoid relying on this opinion and instead take a more conservative approach.
Where prosecutors have an opportunity at the outset of an investigation to
chart a course of action that may significantly mitigate litigation risk in
still largely uncharted territory, caution is recommended: agents should obtain
a search warrant to review files that have been detected by hash match and not
previously reviewed manually.

             iii.   Warrantless Review By Law Enforcement Should Be Avoided.

       Though Ackerman is not binding outside of the Tenth Circuit, it raised
awareness in all jurisdictions regarding the potential need for a search warrant to
review submitted CyberTipline files. Post-Ackerman, most courts have framed the
key questions as whether there has been a private search by the reporting Provider
and whether a subsequent review by a government actor or agent would exceed the
scope of that private search. Often, the answers to these questions are evident from
the CyberTip itself. Some CyberTips will contain both viewed and unviewed files
from a Provider; for instance, a CyberTipline Report may also contain email
attachments that include images that have not been viewed by the reporting entity.
Therefore, an investigator should pay close attention to the uploaded file
information for each individual file and any notes included by the NCMEC report
writer. An investigator should only view, without a search warrant, files that have
been previously viewed by the Provider or that were found in a publicly accessible
place. Beyond that, there is considerable litigation risk in reviewing the content of a
CyberTip report without a warrant.

       Diligent prosecutors will seek to avoid agents’ warrantless review of reported
child exploitation images whenever possible, to understand the exact circumstances
of a Provider’s review and subsequent report, and to educate courts on the relevant
processes and technologies regarding CyberTips. In doing so, prosecutors will
not only safeguard important evidence, but also help ensure that future court
decisions recognize the significant, independent role that the CyberTipline plays in
the battle against child exploitation.

CONCLUSION

       Although the variety of potential suppression issues implicated by CyberTips
may seem daunting, careful understanding of the processes used by the Provider, by
NCMEC, and by law enforcement will allow prosecutors to gather the facts needed
to defeat any manner of motion to suppress brought by a defendant.

DECLARATION OF CATHY A. MCGOFF

I, Cathy A. McGoff, declare as follows:
    1. I am a Senior Manager, Law Enforcement and Information Security at Google LLC
        (“Google”), where I have been employed for 13 years. As part of my duties at Google, I
        am a custodian of records. In that capacity, I review and respond to legal process and I
        authenticate Google’s business records. In my role I also handle issues that may arise
        from the removal of content from Google’s platform. I am familiar with Google’s
        procedures for gathering information responsive to legal process and the procedures for
        making CyberTips to NCMEC. I am over the age of eighteen and competent to make
        this declaration. I make each of the following statements based on my personal
        knowledge, and I could, if necessary, testify to the truth of each of them.
    2. Google provides Internet-based services. Google’s terms of service, which a user must
        accept as part of registering a Google Account, prohibit our services from being used in
        violation of law. The terms of service also provide that Google “may review content to
        determine whether it is illegal or violates our policies, and we may remove or refuse to
        display content that we reasonably believe violates our policies or the law.” A true and
        correct copy of Google’s relevant terms of service is attached hereto as Exhibit A.
    3. Google has a strong business interest in enforcing our terms of service and ensuring that
        our products are free of illegal content, and in particular, child sexual abuse material.
        We independently and voluntarily take steps to monitor and safeguard our platform. If
        our product is associated with being a haven for abusive content and conduct, users will
        stop using our services. Ridding our products and services of child abuse images is
        critically important to protecting our users, our product, our brand, and our business
        interests.
    4. Based on these private, non-government interests, since 2008, Google has been using a
        proprietary hashing technology to tag apparent child sexual abuse images. Each
        offending image, after it is viewed by at least one Google employee, is given a digital
        fingerprint that our computers can automatically recognize and is added to our repository
        of hashes of apparent child pornography as defined in 18 USC § 2256. Comparing
        these hashes to hashes of content uploaded to our services allows us to identify
        duplicate images of apparent child pornography to prevent them from continuing to
        circulate on our products.
    5. Separate from Google’s use of its own proprietary-technology hashes, Google
        contributes to the NCMEC-hosted Industry list of hashes. The format for this list is
        PhotoDNA, which is a technology for hash matching licensed by Microsoft. Other
        members of industry who participate in the NCMEC-hosted Industry hash sharing
        program, would have access to this hash list for the purpose of cleaning their platforms
        of abusive content. The hashes in this list are sourced from Industry, not government.
        Google does not scan the hashes from this list against user accounts.
    6. We also rely on users who flag suspicious content they encounter so we can review it
        and help expand our database of illegal images. No hash is added to our repository
        without the corresponding image first having been visually confirmed by a Google
employee to be apparent child pornography.
7. When Google’s product abuse detection system encounters a hash that matches a hash
    of a known child sexual abuse image, in some cases Google automatically reports the
    user without re-reviewing the image. In other cases, Google undertakes a manual,
    human review, to confirm that the image contains apparent child pornography.
8. When Google discovers apparent child pornography, in accordance with 18 USC 2258A,
    Google files a report with the National Center for Missing and Exploited Children
    (“NCMEC”) in the form of a CyberTip. In this matter, Google provided a CyberTip to
    NCMEC, but at no time provided the same or similar information directly to law
    enforcement.
9. Google trains a team of employees on the legal obligation to report apparent child
    pornography. The team is trained by counsel on the statutory definition of child
    pornography and how to recognize it on our products and services. Google makes
    reports in accordance with that training.
10. Google’s records reflect that the 1 image reported in CyberTip # 10794821 (submitted on
    or around May 10, 2016) received a manual human review by Google personnel
    concurrent with the report being sent to NCMEC.
11. Google did not have any discussions or interactions with NCMEC or any law
    enforcement agency pertaining to the reported account(s) prior to generating or
    submitting any of the above mentioned CyberTip.
12. When Google includes a statement or indication in a CyberTip that an image was viewed
    or reviewed by Google, it is referring to a viewing of that image by a human reviewer
    concurrent to or immediately preceding making the report. Google makes an effort to
    complete the portion of the form seeking clarity regarding whether a file was viewed
    because failure to do so incurs an operational burden of separately responding to
    subsequent inquiries regarding whether content was viewed or not.
13. Pursuant to 28 U.S.C. § 1746, I declare under penalty of perjury that the foregoing is
    true and correct to the best of my knowledge.

   ___________________________                        Date: 08/17/2018
   Cathy A. McGoff