The Internet of Things (IoT) is a concept dating back to the 1920s, when Nikola Tesla first described a wireless world “converted into a huge brain.”1 Now, almost a century later, we face the real possibility that we have built this “brain” through the Internet of Things. This raises the question: does the average consumer really know what information they have consented to contribute to it? As the IoT appears to be the next great wave of technological change, the meaning of consent is already evolving.2

A few simple examples illustrate the amount and type of information collected. The Roomba, along with other internet-connected vacuums, gathers data that essentially maps the layout of your home; it also tracks run times, which may reveal lifestyle patterns such as when consumers are or are not at home.3 Although the maker of the Roomba does not currently sell this information, it has reportedly at least considered entering the market for doing so.4

Smart cars are another example. The consultancy Gartner estimates that by 2020, approximately 250 million cars will be connected to the internet.5 Autonomous cars not only require an immense amount of data to operate but also collect an extraordinary amount of data while the vehicle is in use, such as road and route mappings, time spent in the car, and how often a given location is visited, raising privacy concerns.6

Consent, and the amount of information consumers have agreed to expose, appears to have evolved down a slippery slope of consumer complacency: consumers first agreed to share small amounts of information in exchange for the use of a website or product, and that exchange has grown into today’s technological world, which largely could not operate without the collection of this data. Historically, emphasis has been placed on implied consent and on privacy policies that attempt to limit a business’s liability.7 Even now, however, given the amount of information shared over the internet through cookies and similar technologies, consumers seem unaware of, or unconcerned about, the depth of information tracked.

A social experiment highlighted this casual attitude toward sharing personal data when a New York artist offered to trade actual cookies for an individual’s maiden name, fingerprints, or partial social security number.8 Nearly 400 individuals accepted this offer even though the artist refused to say what she would do with the information she collected.9

As early as the 1970s, the Fair Information Practice Principles (FIPPs) were developed by the Secretary’s Advisory Committee on Automated Personal Data Systems within the U.S. Department of Health, Education, and Welfare10 to provide guidance on the collection, maintenance, and dissemination of personal information. Although widely referenced, however, the FIPPs have little binding effect except where they have been codified into statute; in the United States, where privacy statutes are limited and largely industry-specific, the FIPPs remain more a recommendation than an enforcement mechanism.11

Across the Atlantic, the European Union has recently taken a more rigorous stance on consumer consent. Its policy not only requires “unambiguous” consent but also questions whether consent was “freely given” when it is made a condition of the terms of service, except where the data processing is necessary for the service itself.12 The new EU General Data Protection Regulation (GDPR) is garnering considerable attention, suggesting that society may be beginning to shift how consumers, businesses, and regulators treat and enforce consent to data collection and dissemination.13

Today, most consumers have a general sense that data about them is collected but likely have little insight into the degree of information collected or its uses and potential for abuse.14 As the IoT becomes more pervasive, regulators have an opportunity to consider placing more controls on data. Any such controls, however, must weigh the chilling effects of limiting the information a business can collect, and the ways it can be used, against the alternative of waiting for consumers to reach the point at which they no longer impliedly consent to the collection of data and abstain from products that demand too much exposure. Because consumers may be unaware of the types and depth of information collected, the duty likely falls on regulators to balance these tensions, either by placing stricter limits on the types and uses of information or at least by providing mechanisms that aid consumer awareness of the collection, use, and dissemination of data.