Privacy and security are both major concerns for consumers and businesses alike in the evolving IoT landscape. A privacy breach is the misuse of data by an entity that has been granted access to a dataset; privacy thus underpins the relationship between companies and customers, and any breach of that implicit contract is a privacy concern. A security breach, on the other hand, is the unauthorized use and/or access of data by an entity that has not been granted access to a dataset; e.g., hacking and external attacks. Both privacy and security goals will be hard to reconcile with the main aim of IoT development: monitoring, collecting, analyzing, and using massive amounts of data.
Whose job is it to protect sensitive data in these rapidly growing IoT industries? Responsibility for data privacy and security varies by industry and by country. In the US, when companies are not regulated by another agency (e.g., the Department of Health and Human Services for HIPAA rules on medical patient data), this responsibility usually falls under the jurisdiction of the Federal Trade Commission (FTC).
The FTC has conflicting interests to balance. The Commission was created in 1914 to break up the increasingly powerful corporations that controlled the oil, steel, and tobacco industries, with the end goal of protecting consumers from “unfair or deceptive practices.” Conversely, the FTC must avoid “unduly burdening legitimate business activity.” The Commission thus walks a fine line between consumer protection and economic progress.
As with the majority of emerging and semi-defined technologies, the US government has been largely content to let the market shape the development of the IoT services market. Yet the steadily growing stream of privacy concerns (Snapchat, the NSA, Google, Facebook, etc.) and security breaches (Anthem, Blue Cross, Target, Adobe, LastPass, the US Office of Personnel Management) has made it clear that the FTC will need to make its presence felt in the IoT services market sooner rather than later. Many entities simply do not have the proper incentive to thoroughly self-regulate with regard to privacy and security. Data regulation is in its infancy, and it will undoubtedly be a daunting task.
The FTC published corporate guidance on privacy and security practices earlier this year. Let us parse this document to see if we can elucidate any key findings and conclusions. It is important to keep in mind that none of these recommendations carry the weight of law; the report simply “summarizes the workshop and provides staff’s recommendations in this [IoT] area.”
The FTC makes six main security recommendations in order to prevent unauthorized breaches of data. Companies should:

- build security into devices at the outset, rather than as an afterthought (“security by design”);
- train all employees about good security, and ensure that security is managed at an appropriate level within the organization;
- retain service providers that are capable of maintaining reasonable security, and provide oversight of those providers;
- implement a defense-in-depth approach, with security measures considered at several levels;
- implement reasonable access control measures to limit unauthorized access to devices, data, or networks; and
- continue to monitor connected devices throughout their expected life cycle and, where feasible, patch known vulnerabilities.
This is the full extent of the security recommendations. These are all common practices in industry, and the vague language adds little to the discussion of how, specifically, the FTC might regulate data in the IoT market.
In the privacy section of the report, the agency recommends that companies minimize the amount of data they collect, but the recommendation is quite flexible, giving companies the option to collect potentially useful data with consumer consent. But how does a company obtain consent when the device or service has no interface, as will be the case with many embedded devices deployed in the IoT market?
According to the FTC, as long as the use of the data is “expected” and “consistent with the context of the interaction,” a company need not explicitly obtain consent to collect it. This language sets no standard; rather, it is remedial language that can be applied to different situations post-incident. The FTC couples this expected-use language with industry-specific legislation, such as the Fair Credit Reporting Act, which restricts the use of credit data in certain circumstances. In short, under these recommendations a company has nearly full discretion in the collection and use of data, so long as it can show that it is using the data in an “expected” manner relative to the nature and context of its relationship with its customers (barring any industry-specific legislation).
The report notes an interesting idea proposed by MIT Professor Hal Abelson: that data be “tagged” upon collection with its appropriate uses, so that other software could identify and flag any inappropriate uses, providing a layer of protection and forcing the company to think about how it will use the data before collecting it. We expressed a similar view in a recent VDC View document entitled “Beyond ‘Who Owns the Data?’,” suggesting that IoT vendors develop and implement data structures that permit highly flexible assignment of data access rights and usage permissions. Tagging would certainly be one way to segregate usage rights and protect different streams of data.
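To make the tagging idea concrete, here is a minimal sketch of what Abelson-style tagged data might look like in code. This is purely illustrative: the class and function names are our own inventions, not anything specified in the FTC report, and a production system would need cryptographic enforcement rather than an honor-system check. The core idea is simply that each record carries the set of uses consented to at collection time, and every access is checked against that set.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TaggedRecord:
    """A datum bundled with the uses consented to at collection time."""
    value: object
    allowed_uses: frozenset  # e.g. frozenset({"billing", "diagnostics"})

class UsePolicyError(Exception):
    """Raised when software attempts a use the data was not tagged for."""
    pass

def access(record: TaggedRecord, purpose: str):
    """Return the value only if the stated purpose was tagged at collection."""
    if purpose not in record.allowed_uses:
        raise UsePolicyError(f"use '{purpose}' is not permitted for this record")
    return record.value

# A heart-rate reading collected for diagnostics cannot be repurposed
# for advertising without a new consent grant.
reading = TaggedRecord(value=72, allowed_uses=frozenset({"diagnostics"}))
access(reading, "diagnostics")  # permitted
```

The design choice worth noting is that the permission check happens at the point of use, not the point of collection, which is exactly what forces a company to enumerate intended uses up front.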
The FTC states that any IoT-specific legislation would be premature at this point. However, staff recommends that Congress enact “general data security legislation” and “basic privacy protections,” which the Commission cannot mandate itself. In essence, the FTC needs a new legislative basis from which to bring enforcement actions. Congress created an IoT Caucus shortly after the filing of this FTC report, but it has been mostly silent since its inception.
Perhaps the most interesting part of the report is a dissent by one of the FTC’s five commissioners. Commissioner Joshua Wright notes that the FTC generally issues two types of reports: 1) an in-depth, impactful report commissioned by Congress that compels private parties to submit data to the FTC for analysis and review; or 2) a slightly less formal report that details and makes public any workshops conducted by the Commission, concluding with recommendations supported by substantial data and analysis.
Wright contends that this FTC report fits neither category, and goes on to shred it. First, he argues, the IoT is a nascent and far-ranging concept; a one-day workshop cannot generate a sufficient sample of ideas or range of views to support any policy recommendation. Second, he observes that the report “does not perform any actual analysis,” instead relying on its own assertions without qualification or economic backing; in his view it merely pays “lip service” to a few obvious facts. Third, he remains unconvinced that the Fair Information Practice Principles (FIPPs) are a proper framework to apply to the IoT, favoring instead “the well-established Commission view [that] companies must maintain reasonable and appropriate security measures; that inquiry necessitates a cost-benefit analysis. The most significant drawback of the concepts of ‘security by design’ and other privacy-related catchphrases is that they do not appear to contain any meaningful analytical content.” Commissioner Wright clearly has a bone to pick with the method by which the FTC is approaching data regulation in the IoT market.
Corporations and consumers alike in the IoT market would do well to pay attention to the following conclusions that we can draw from the FTC document and Commissioner Wright’s dissent: