Honorable Edmond E. Chang, United States District Judge.
The Illinois Biometric Information Privacy Act forbids the unauthorized collection and storing of some types of biometric data. 740 ILCS 14/1 et seq. A private entity cannot gather and use someone's "biometric identifier" — defined as retinal or iris scans, fingerprints, voiceprints, or hand or face geometry scans — unless that person has consented. Id. § 14/10. The Act also bans the non-consensual collection and storage of information (the Act labels it "biometric information") that is "based on" those biometric identifiers. Id.
In the months leading up to March 2016, photographs of Lindabeth Rivera were allegedly taken by a "Google Droid device," automatically uploaded to the cloud-based Google Photos service, and scanned by Google to create a template of her unique facial measurements, all without her consent. Joseph Weiss alleges that much the same happened to photographs that he took of himself on his own Droid device. Rivera and Weiss contend that Google's conduct violates the Privacy Act, and Google moves to dismiss their claims. For the reasons discussed below, the motion is denied.
For purposes of evaluating the dismissal motion, the Court must accept as true the allegations in the First Amended Complaints. Erickson v. Pardus, 551 U.S. 89, 94, 127 S.Ct. 2197, 167 L.Ed.2d 1081 (2007). Between around March 2015 and March 2016, "approximately eleven" photographs of Lindabeth Rivera were taken in Illinois by a Google Photos user on a Google Droid device. Rivera First Am. Compl. ¶ 27. The person who took the picture was an Illinois resident who had purchased the Droid device in Illinois. Id. As soon as the photographs of Rivera were taken, the Droid automatically uploaded them to the cloud-based Google Photos service. Id. According to the Complaint, Google immediately scanned each uploaded photograph of Rivera. Id. ¶ 28. The scans located her face and zeroed in on its unique contours to create a "template" that maps and records her distinct facial measurements. Id. At the time of the automatic upload and face-scan, the photographer's Droid device was still in Illinois and would have had an Illinois-based Internet Protocol (IP) address. Id. ¶ 27.
Weiss's experience was similar, except that Weiss himself was a user of Google Droid and Google Photos (Rivera, on the other hand, neither had a Droid nor a Google Photos account). Weiss First Am. Compl. ¶¶ 26-27; Rivera First Am. Compl. ¶ 26. Between 2013 and 2016, Weiss took "approximately twenty-one" photos of himself while in Illinois on his Droid device. Weiss First Am. Compl. ¶¶ 26-27. These photos were automatically uploaded when they were taken, and then immediately scanned to create a custom face-template based on Weiss's features. Id. ¶¶ 28-29. At the time of uploading and scanning, Weiss's Droid was in Illinois and it would have had an Illinois-based Internet Protocol (IP) address. Id. ¶ 28.
Both Rivera and Weiss contend that their face-templates were then used by Google to find and group together other photos of them. Rivera First Am. Compl. ¶ 29; Weiss First Am. Compl. ¶ 30. Google also used the templates to recognize their gender, age, race, and location. Rivera First Am. Compl. ¶ 30; Weiss First Am. Compl. ¶ 31. At no time did Google seek Rivera's or Weiss's consent to create or use the face-templates. Rivera First Am. Compl. ¶¶ 32-33; Weiss First Am. Compl. ¶¶ 33-34. Nor did Rivera or Weiss give Google permission to collect or store the data derived from their faces. Rivera First Am. Compl. ¶ 31; Weiss First Am. Compl. ¶ 32.
Based on these allegations, Rivera and Weiss, individually and on behalf of a proposed class, bring suit against Google for a violation of the Illinois Biometric Information Privacy Act. They argue that the face geometry templates created by Google are "biometric identifiers" within the definition of the Privacy Act, and accordingly cannot be collected without consent. Rivera First Am. Compl. ¶¶ 1, 21, 43-48; Weiss First Am. Compl. ¶¶ 1, 21, 44-49. Rivera and Weiss also contend that the face templates qualify as "biometric information" based on those identifiers, which likewise cannot be collected or stored without consent. Google now moves to dismiss both First Amended Complaints.
Google brings its motion under Federal Rule of Civil Procedure 12(b)(6). A Rule 12(b)(6) motion tests the sufficiency of the complaint. Hallinan v. Fraternal Order of Police of Chi. Lodge No. 7, 570 F.3d 811, 820 (7th Cir. 2009); Gibson v. City of Chi., 910 F.2d 1510, 1520 (7th Cir. 1990). When deciding a motion to dismiss, the Court accepts as true all factual allegations in the complaint and draws all reasonable inferences in the plaintiff's favor. Killingsworth v. HSBC Bank Nev., N.A., 507 F.3d 614, 618 (7th Cir. 2007).
Under Rule 8(a)(2), a complaint generally need only include "a short and plain statement of the claim showing that the pleader is entitled to relief." Fed. R. Civ. P. 8(a)(2). The complaint must "give the defendant fair notice of what the ... claim is and the grounds upon which it rests." Bell Atl. Corp. v. Twombly, 550 U.S. 544, 555, 127 S.Ct. 1955, 167 L.Ed.2d 929 (2007) (alteration in original) (internal quotation marks and citation omitted). These allegations "must be enough to raise a right to relief above the speculative level," id., and must "contain sufficient factual matter, accepted as true, to `state a claim to relief that is plausible on its face,'" Ashcroft v. Iqbal, 556 U.S. 662, 678, 129 S.Ct. 1937, 173 L.Ed.2d 868 (2009) (quoting Twombly, 550 U.S. at 570, 127 S.Ct. 1955). Only factual allegations are entitled to the assumption of truth, not mere legal conclusions. Iqbal, 556 U.S. at 678-79, 129 S.Ct. 1937.
Google's primary argument is that Rivera and Weiss really are complaining about Google's use of their photographs, and the Illinois Biometric Information Privacy Act does not cover photographs or information derived from photographs. Def.'s Br. at 6-13. Google also offers a backup argument: even if what Google is doing would run afoul of the Privacy Act if done in Illinois, Google supposedly did not do anything in Illinois, so there is no violation of that Illinois law. Id. at 13-15. And Google offers a backup to the backup argument: if the Privacy Act does purport to cover what Google did outside of Illinois, then the state statute actually conflicts with the federal Constitution's Dormant Commerce Clause. Id. at 15-19. Each issue is addressed in turn below.
Google submits that Rivera's and Weiss's claims should be dismissed because the Privacy Act does not apply to photographs or information derived from photographs. Rivera and Weiss, however, argue that face geometry scans created from photographs are covered by the Act, and qualify as both "biometric identifiers" and "biometric information" within the Act. So the first question is whether the face geometry scan as described by Rivera and Weiss (a description that must be accepted as accurate at the dismissal-motion stage) fits the statutory definition of either "biometric identifier" or "biometric information." To answer the question, the usual principles of statutory interpretation apply.
Start with the text. The Privacy Act forbids private entities from gathering and keeping a person's "biometric identifier" and "biometric information" without first giving notice and getting consent:

No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (1) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (2) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.
740 ILCS 14/15(b). Beyond the ban on non-consensual collection, private entities that do obtain biometric identifiers and information must publish a "retention schedule" that details how long the data will be kept and when it will be destroyed. Id. § 14/15(a).
But what is a "biometric identifier" and what is "biometric information"? The latter is defined by reference to the former, so it makes sense to start with "biometric identifier." The Act defines "biometric identifier" in a very specific way:

"Biometric identifier" means a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.
After affirmatively defining "biometric identifier," the Act goes on, in the same long paragraph as the affirmative definition, to list a catalogue of things that are not biometric identifiers. The list runs on for five sentences, and the first sentence is especially important for this case because that is where the word "photographs" appears (the bracketed numbers do not appear in the statute, and instead are inserted for convenience):

[1] Biometric identifiers do not include writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color. [2] Biometric identifiers do not include donated organs, tissues, or parts as defined in the Illinois Anatomical Gift Act or blood or serum stored on behalf of recipients of human body parts. [3] Biometric identifiers do not include biological materials regulated under the Genetic Information Privacy Act. [4] Biometric identifiers do not include information captured from a patient in a health care setting or information collected, used, or stored for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996. [5] Biometric identifiers do not include an X-ray, roentgen process, computed tomography, MRI, PET scan, mammography, or other image or film of the human anatomy used to diagnose, prognose, or treat an illness or other medical condition or to further validate scientific testing or screening.
740 ILCS 14/10. Because the parties base some of their arguments on these "do not include" sentences, it is worth noting that some of these sentences do not simply set out exceptions, in the usual sense of the word "exceptions," to the specified biometric identifiers. That is, an "exception" to a definition is usually something that otherwise probably would be covered by the affirmative definition. To be sure, some of the five disqualifying sentences are true exceptions. For example, in the fourth disqualifying sentence, if a patient will be undergoing facial reconstruction surgery, then the information collected from a face geometry scan — which otherwise would be covered by the affirmative definition of biometric identifier — in that "health care setting" would be exempted from the definition. In contrast, the first sentence says (in part) that "written signatures" are not biometric identifiers. This reads like a "just to be totally sure" disqualifier, rather than an ordinary exception, because it seems unlikely that written signatures could ever fit into any of the affirmatively specified biometric identifiers (to repeat the specified list: "a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry").
Moving on to "biometric information," the Privacy Act affirmatively defines that term by referring back to "biometric identifier," and then also provides a "does not include" disqualifier:

"Biometric information" means any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual. Biometric information does not include information derived from items or procedures excluded under the definition of biometric identifiers.
740 ILCS 14/10. The affirmative definition of "biometric information" does important work for the Privacy Act; without it, private entities could evade (or at least arguably could evade) the Act's restrictions by converting a person's biometric identifier into some other piece of information, like a mathematical representation of the underlying biometric measurements, and then arguing that the converted data falls outside the Act.
Turning to the case at hand, as alleged in the First Amended Complaints (the truth of which must be assumed for now), the "face templates" (as Rivera and Weiss call them) generated by Google do qualify as a "biometric identifier" under the Privacy Act. For each face template, Google is creating a set of biology-based measurements ("biometric") that is used to identify a person ("identifier"). More importantly, as alleged, a face template is one of the specified biometric identifiers in the Privacy Act, namely, a "scan of ... face geometry." 740 ILCS 14/10.
Against this straightforward reading of the definition of "biometric identifier," Google argues that face-scan measurements derived from a photograph do not qualify as biometric identifiers. Def.'s Br. at 1. In Google's view, only face scans that are done in person can qualify as biometric identifiers. Id. at 7; R. 52, Def.'s Reply Br. at 5. But nothing in the text of the Privacy Act directly supports this interpretation. Nothing in the statute says, one way or the other, how the biometric measurements must be obtained (or stored, for that matter) in order to meet the definition of "biometric identifier." The definition simply lists the specific identifiers that are covered. And the particular biometric identifiers can, in fact, be collected in various ways without altering the fact that the measurements still are biometric identifiers. Consider, for example, fingerprints: the definition is indifferent as between inked fingerprints and digital images of fingerprints. Nor does the definition say whether a scan of face geometry has to be in person or may be generated from a photograph or a video.
Without direct textual support in the affirmative definition of "biometric identifier," Google's argument hinges on the first "do not include" sentence in the definitional paragraph of "biometric identifier." Remember that the affirmative definition is followed by five sentences that say what biometric identifiers "do not include." The first of those sentences includes "photographs" in the list of what biometric identifiers do not include:

Biometric identifiers do not include writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.
740 ILCS 14/10 (emphasis added). With that premise in hand — photographs are not biometric identifiers — Google then points to the "do not include" sentence from the definition of biometric information. Remember that sentence says, "Biometric information does not include information derived from items or procedures excluded under the definition of biometric identifiers." Id. So, Google's argument continues, biometric information does not include information derived from photographs, because photographs are excluded from the definition of biometric identifiers. But Google's argument is not yet complete, because Rivera and Weiss are not alleging that the photographs themselves are the biometric identifiers, and because Google still needs to grapple with the face templates — not the photographs — qualifying as biometric identifiers. To do this, Google purports to find a "careful structure," Def.'s Br. at 8, in the separate definitions of "biometric identifier" and "biometric information." Google argues that the two definitions distinguish the "source of the content."
Id. In essence, Google is arguing that if biometric information cannot be "based on" something from the biometric-identifier paragraph's "do not include" list (for example, "photographs"), then an identifier may also not be "based on" something from that same list.
The problem with this argument is that there is no textual or structural clue to support it. The definition of "biometric identifier" contains no "derived from" or "based on" limitation; it simply lists the covered identifiers without regard to how they are generated. The exclusion of information "derived from" the disqualified items appears only in the definition of "biometric information," and there is no basis for importing that limitation into the separate definition of "biometric identifier."
The other fatal problem with Google's "careful structure" argument is that it depends on drawing some structural meaning from the "do not include" sentences. But that cannot be done. As noted earlier, the five sentences in the run-on paragraph (block quoted earlier in the Opinion) comprise a mix of things that are true exceptions (that is, they otherwise would qualify as a biometric identifier) and others that read more like just-to-be-sure exclusions. Yes, structure and context can provide interpretive help, but the "do not include" listings defy a common thread that sheds any additional light on the straightforward affirmative definition of biometric identifier. As a result, it is not sensible to use the photograph exclusion to back-fill an interpretation of biometric identifier — unless, of course, the proposed identifier in question is simply a photograph. But a photograph is just that — a photograph, not a scan of face geometry, which is a set of biology-based measurements. Rivera and Weiss nowhere argue that the photograph itself is the biometric identifier. Indeed, if Google simply captured and stored the photographs and did not measure and generate scans of face geometry, then there would be no violation of the Act. (The same is true of someone, say a journalist, who records a person's voice without generating a voiceprint.) All in all, the reference to "photographs" in the first "do not include" sentence is no help to Google's argument.
Google's final attempt to argue from text and structure is premised on the Privacy Act's written-consent requirement. 740 ILCS 14/15(b)(3). That requirement dictates that, in order to collect a person's biometric identifier or biometric information, a private entity must (among other things) receive a "written release" executed by the person. Id. The need for a written release is telling, Google says, because consent may be most easily and clearly given in person. Def.'s Br. at 9.
This is unconvincing for two reasons. First and foremost, even assuming, for the moment, that in-person consent is logistically more convenient to get than remote consent, the absence of any other textual or structural clues that the scan must be in person outweighs the weak inference arising from any purported logistical convenience. Second, there is substantial reason to doubt just how much easier it is to obtain a written release in person. Many courts, for instance, have routinely upheld one-click ("clickwrap") consent features on websites and internet services. See, e.g., Van Tassell v. United Mktg. Grp., LLC, 795 F.Supp.2d 770, 790 (N.D. Ill. 2011) (courts "regularly uphold" clickwrap agreements when structured properly); Ronald J. Mann & Travis Siebeneicher, Just One Click: The Reality of Internet Retail Contracting, 108 Colum. L. Rev. 984 (2008).
Because the text of the Privacy Act provides the answer to whether the alleged face templates come within the definition of biometric identifier, there is no need to resort to legislative history. Ultsch v. Ill. Mun. Ret. Fund, 226 Ill.2d 169, 314 Ill.Dec. 91, 874 N.E.2d 1, 10 (2007) ("Where the language of a statute is plain and unambiguous, a court need not consider other interpretive aids."). For the sake of completeness, however, the Court explains why Google's legislative-history-based arguments are also wanting.
Google first points to the statement of the Act's cosponsor on the floor of the state House of Representatives to argue for the importance of the in-person aspect of scans. Def.'s Br. at 10. The state Representative declared that the bill's urgency was exemplified by the bankruptcy of a company called Pay By Touch, which was "the largest fingerprint scan system in Illinois." Id. (quoting IL H.R. Tran. 2008 Reg. Sess. No. 276 at 249 (May 30, 2008)). The bankruptcy, according to the Act's cosponsor, left "thousands of customers ... wondering what will become of their biometric ... data." Id. Even if legislative history could be relied on here, this is but one House floor statement. Indeed, here there is no mystery on the subject of legislative intent, because the Act itself clearly sets forth legislative findings and legislative intent. Section 5 is entitled, "Legislative findings; intent," and elaborates on the legislature's concerns. To be sure, the only example given in Section 5 relates to in-person transactions ("finger-scan technologies at grocery stores, gas stations, and school cafeterias"), 740 ILCS 14/5(b), but the stated concern is considerably broader than this one application. That same Section declares that the General Assembly finds that the "use of biometrics is growing in the business and security screening sectors" and that major corporations are testing "new applications of biometric-facilitated financial transactions." Id. § 14/5(a)-(b). Section 5 also outlines the unique threats posed by biometric information capture (for example, the uniqueness and unalterable nature of such information, the risks of identity theft, and the chance that concerned citizens will avoid biometric-facilitated transactions). Id. § 14/5(c), (e). And the Section goes on to point out that "[t]he full ramifications of biometric technology are not fully known." Id. § 14/5(f). All of this explicit statutory text dwarfs Google's single floor statement.
Google next argues that the legislative history would have recorded some reaction to this proposed Act — "someone would have remarked upon it" — if the legislation were truly so allegedly "sweeping" as Rivera and Weiss suggest (that is, that it covers face templates scanned from photographs). Def.'s Br. at 10. This line of thinking is not persuasive. Google calls the reading "sweeping" because it argues that the Act would, under the Plaintiffs' reading, apply to "any individual ... running common photo-organizing software on a home computer." Id. For two reasons, however, this is an unconvincing legislative-history argument. First, silence in a legislative record is weak evidence of anything; legislators need not remark on every application of a bill's text. Second, the Act's own findings anticipate broad and growing applications of biometric technology, 740 ILCS 14/5(a)-(b), (f), so coverage of face scans generated by software would not have been so startling as to demand floor commentary.
Google next argues that the General Assembly's ultimate rejection of other proposed words in the statute's definitions also speaks to the legislature's intent to exclude facial geometry data-gathering not done in person. More specifically, Google points out that the Assembly's "opt[ing] for the word `scan' over `records' suggests that it cared about how the content was obtained." Def.'s Br. at 10-11. Google contends that a scan "suggests something done to the person herself" and "records" does not. Id. at 11; Def.'s Reply Br. at 7. This is not necessarily right. "Scan" may indeed be more suggestive of a direct procedure than the more inert-sounding "record." But within the various meanings of "scan" in everyday speech, "scan" is not obviously more suggestive of something done directly to a person than to a thing (like a photograph). And, most importantly (as discussed earlier), the definition of biometric identifier is simply a list of specified things that does not distinguish the manner in which the identifier is generated.
Google's final point on legislative history is that the General Assembly dropped, at some point, the term "facial recognition" from an earlier-proposed definition of "biometric identifier." But the definition as enacted expressly covers a "scan of ... face geometry," 740 ILCS 14/10, so the drafting change cannot bear the weight Google places on it: whatever the reason for dropping "facial recognition," the legislature plainly kept face-geometry scans within the Act's reach.
All of this said, it remains possible that Google could prevail on its face-template arguments (that is, that what Google collects from the photos is not covered by the Act) once further factual development has occurred in discovery. It is conceivable that discovery will reveal that what Google is actually doing does not fit within the definition of biometric identifier as interpreted by the Court. Until then, however, the Plaintiffs' allegations must be taken as true, and they adequately state a claim under the Privacy Act.
Google's second argument in its motion to dismiss is that the Plaintiffs' claims cannot proceed because applying the Privacy Act to Google would result in an extraterritorial application of the statute. Def.'s Br. at 13. In other words, this Illinois law applies only in Illinois, and Google is not doing anything in Illinois. As explained in this section, at this early stage in the case, the Court cannot decide this issue in Google's factual favor, and can only hold — as a matter of law — that the Illinois Biometric Information Privacy Act does not apply extraterritorially. Discovery is needed to determine whether there are legitimate extraterritoriality concerns.
Under Illinois law, an Illinois statute does not have extraterritorial effect unless the General Assembly clearly expressed that intent. Avery v. State Farm Mut. Auto. Ins. Co., 216 Ill.2d 100, 296 Ill.Dec. 448, 835 N.E.2d 801, 852-53 (2005) (quoting Dur-Ite Co. v. Indus. Comm'n, 394 Ill. 338, 68 N.E.2d 717, 722 (Ill. 1946)). Google is correct that, with regard to the Privacy Act, there is no sign of that sort of intent from the Illinois legislature. Nor does Rivera or Weiss argue that the Act is intended to operate extraterritorially. So the Privacy Act does not apply extraterritorially.
If the Act cannot apply extraterritorially, then Rivera's and Weiss's asserted violations of the Act must have taken place in Illinois in order for them to win. Returning to Avery (on which both parties rely for their arguments on the location of the ostensible Privacy Act violations), the Illinois Supreme Court held there that a claim arises under an Illinois statute only if the circumstances relating to the disputed transaction occur "primarily and substantially" in Illinois. Avery, 296 Ill.Dec. 448, 835 N.E.2d at 853-54. No single factor is dispositive; instead, the court must weigh the totality of the circumstances bearing on where the transaction occurred. Id.
Much of this case revolves around conduct occurring online or in the "cloud." Avery's totality-of-the-circumstances standard has not yet produced much guidance in the context of online conduct. In one case applying Avery to an Illinois Deceptive Trade Practices Act claim, the court found that an infringement that "took place on the Internet and was international in scope" was thereby "presumably occurring in Illinois." Specht v. Google, Inc., 660 F.Supp.2d 858, 866 (N.D. Ill. 2009). That assessment, in combination with other Illinois connections, was sufficient to state a claim under that particular statute. Id. In another case, a London-based plaintiff used a hotel company's website to book a room in Moscow, and later sued the company in Illinois under the Illinois Consumer Fraud Act over allegedly misleading prices quoted on the website, arguing that Avery was satisfied because the company's principal place of business was in Illinois and because the website contained an Illinois choice-of-law clause. Shaw v. Hyatt Int'l Corp., 2005 WL 3088438, at *2-3 (N.D. Ill. Nov. 15, 2005), aff'd, 461 F.3d 899 (7th Cir. 2006). The district court found those connections too tenuous to conclude that the relevant transaction occurred "primarily and substantially" in Illinois. Id. Neither of these cases directly controls, but they show that Internet-related factors (such as where a site is accessed and who operates it) can be part of the Avery calculus.
The question, then, is whether Google's activities — making face templates of Rivera and Weiss from photographs uploaded automatically from Google Droid devices in Illinois — are an extraterritorial (and therefore not-actionable) application of the Privacy Act. According to Rivera and Weiss, all of the following are Illinois connections: Rivera and Weiss are Illinois residents, R. 51, Pls.' Resp. Br. at 22; Rivera's and Weiss's photographs were taken in Illinois, id.; and Rivera's and Weiss's photographs were allegedly "automatically uploaded in Illinois to the cloud-based Google Photos service ... from an Illinois-based Internet Protocol (`IP') address," id. Rivera and Weiss also allege that it was in Illinois where Google failed to provide them with the required disclosures and failed to get their consent, id. at 22-23. At this stage of the proceedings, the Court takes the Plaintiffs' allegations as true, and these alleged facts, taken together, plausibly place the asserted violations primarily and substantially in Illinois.
Google contends that the face scans did not occur "primarily and substantially" in Illinois. Def.'s Br. at 14. It cites to Rivera's and Weiss's failure to allege a location for the actual scanning of face geometry. Id. In Google's reckoning, the location of the scan ("the place where a biometric identifier is `collected, captur[ed], ... or otherwise obtained'") is to be seen as the determinative "situs" of the Privacy Act violation. Id.; Def.'s Reply Br. at 11. Google also cites other circumstances not mentioned by Rivera or Weiss that could be relevant under the Avery test, such as "where the photographer resides" (in the case of Rivera) and "where the Google Photos user signed up for Google Photos" (in the case of Weiss). Def.'s Br. at 14.
Assessing these arguments at this initial stage, the Court concludes that the Plaintiffs sufficiently allege facts that, taken as true, place the asserted violations in Illinois. But there is no bright-line rule for determining this, so the parties will have the chance to develop more facts during discovery. For example, where did the alleged scans actually take place? Even if it is definitively determined that the scanning took place outside of Illinois, that would not necessarily be dispositive. Avery, 296 Ill.Dec. 448, 835 N.E.2d at 853 ("The place of injury or deception is only one of the circumstances that make up a fraudulent transaction and focusing solely on that fact can create questionable results. If, for example, the bulk of the circumstances that make up a fraudulent transaction occur within Illinois, and the only thing that occurs out-of-state is the injury or deception, it seems to make little sense to say that the fraudulent transaction has occurred outside Illinois."); Gros, 525 F.Supp.2d at 1024 ("Avery instructs courts to consider the totality of the circumstances in determining whether the disputed transaction occurred `primarily and substantially' in Illinois.... It is therefore incorrect to focus on only one aspect of the disputed transaction.").
Another issue needing further factual development is where precisely the absence of consent took place. This is a complex issue, and neither side has yet addressed it thoroughly. Part of the answer may hinge on whether uploads are indeed scanned "immediately," as Rivera and Weiss allege, or whether some time intervenes before a scan occurs.
Google's last argument is that the practical effect of the Illinois Biometric Information Privacy Act, if read the way Rivera and Weiss read it, would violate the Dormant Commerce Clause of the United States Constitution. Def.'s Br. at 15. Google reaches that conclusion by arguing under an extraterritorial Dormant Commerce Clause theory that the Privacy Act (if read the way Plaintiffs read it) has the practical effect of controlling conduct beyond Illinois's boundaries. Id.
The Constitution grants Congress the power to "regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes." U.S. Const. art. I, § 8, cl. 3. Although phrased as an affirmative grant of power to Congress, the Clause has long been understood to have a "dormant" aspect that limits the States' authority over interstate commerce. Under the conventional tests, a state law that discriminates against interstate commerce is virtually per se invalid, and even a nondiscriminatory law is invalid if the burden it imposes on interstate commerce is clearly excessive in relation to the putative local benefits. Pike v. Bruce Church, Inc., 397 U.S. 137, 142, 90 S.Ct. 844, 25 L.Ed.2d 174 (1970).
A third test — based on extraterritorial effects — has been used increasingly by courts to assess compliance with the Commerce Clause. In Healy v. Beer Institute, Inc., the Supreme Court laid out the Commerce Clause's implications for extraterritoriality in state regulation:

[1] [T]he Commerce Clause ... precludes the application of a state statute to commerce that takes place wholly outside of the State's borders, whether or not the commerce has effects within the State ....

[2] [A] statute that directly controls commerce occurring wholly outside the boundaries of a State exceeds the inherent limits of the enacting State's authority and is invalid regardless of whether the statute's extraterritorial reach was intended by the legislature. The critical inquiry is whether the practical effect of the regulation is to control conduct beyond the boundaries of the State.

[3] [T]he practical effect of the statute must be evaluated not only by considering the consequences of the statute itself, but also by considering how the challenged statute may interact with the legitimate regulatory regimes of other States and what effect would arise if not one, but many or every, State adopted similar legislation.
491 U.S. 324, 336, 109 S.Ct. 2491, 105 L.Ed.2d 275 (1989) (bracketed numbers and paragraphing added). See also Midwest Title Loans, Inc. v. Mills, 593 F.3d 660, 665 (7th Cir. 2010) ("But another class of nondiscriminatory local regulations is invalidated without a balancing of local benefit against out-of-state burden, and that is where states actually attempt to regulate activities in other states."); Int'l Dairy Foods Assoc. v. Boggs, 622 F.3d 628, 646 (6th Cir. 2010) ("a state regulation is `virtually per se invalid' if it is ... extraterritorial... in effect."); Pac. Merch. Shipping Ass'n v. Goldstene, 639 F.3d 1154, 1178 (9th Cir. 2011) ("[T]he Commerce Clause prohibits state legislation regulating commerce that takes place wholly outside of the state's borders, regardless of whether the commerce has effects within the state.").
Google argues that under this extraterritoriality test, the Privacy Act as construed by Rivera and Weiss violates the Commerce Clause. Google gives an elaborate hypothetical to show that a very tenuous connection to Illinois (merely driving through the state and uploading a photograph that had nothing to do with Illinois) could still lead to a violation of the Act. Def.'s Br. at 15. It also argues that, if there is not a bright-line rule for determining for purposes of the Act whether conduct is taking place in Illinois, Google cannot know in advance which uploads trigger the Act's requirements and would be left subject to uncertain, potentially inconsistent obligations. Id. at 15-19.
Again, however, this is not the stage at which to assess these arguments in detail. The Commerce Clause argument is directly related to the extraterritoriality argument addressed above. To repeat, the Illinois Biometric Information Privacy Act was not intended to and does not have extraterritorial application. But whether the Privacy Act is nevertheless being invoked here to control commercial conduct wholly outside Illinois cannot be determined without a better factual understanding of what happens in the Google Photos face-scan process. What is learned in discovery will inform both the more general extraterritoriality analysis above and this Dormant Commerce Clause analysis.
In addition to the factors needing discovery mentioned earlier, one argument unique to the Commerce Clause section also warrants further factual development. Specifically, the parties need to develop what Google's compliance burdens would actually be under the Privacy Act, if it were read in the way the Plaintiffs suggest. As noted earlier, the Privacy Act subjects violators to statutory damages only for negligent, intentional, or reckless violations. 740 ILCS 14/20. So even if the Plaintiffs' construction of the Act were to depend on a totality-of-the-circumstances test for assessing location, Google could conceivably avoid liability by taking reasonable steps toward compliance. But perhaps the technological details of Google Photos' operation and the ubiquity of the Internet and cloud computing would make even attempts at reasonable compliance unworkable. Again, further discovery could help determine whether that is so. It would also allow the Court to see in more detail how Google's specific conduct fits into the current legal landscape on Internet-related Commerce Clause challenges.
Google's motion to dismiss is denied. At the next status hearing, the Court will set the discovery schedule.