A few weeks ago, I met an executive from a major Hollywood studio. One thing I heard from him that I did not expect was that studios are regulated and, more importantly, very concerned about protecting our personal privacy. Honestly, I have grown to expect to hear these words in healthcare, banking and insurance, but not from a Hollywood executive.
As with most things like this, there is an interesting inception point for this regulation. What I learned was that during Robert Bork's Supreme Court confirmation hearings, his video rental history was published without his authorization. While nothing on the list would apparently have created a scandal even at the time, most felt that Bork's privacy had been violated by the release. In response, the Video Privacy Protection Act (VPPA) was passed in 1988. Its aim was to prevent the "wrongful disclosure of video rentals or sales." It also established damages for "video service providers" that disclose rental information outside their ordinary course of business. The per-person penalties are not large, but in the age of cyber hacking, the damages from a mass release could really add up. Interestingly, there is even case law that has applied the act to Facebook, Netflix and Hulu.
With this said, those that collect information on me and my video purchases face an interesting dilemma. They need to use my purchase history to intelligently suggest videos, but at the same time, they need to make sure that this data is not released or misused internally. This means that they need to anonymize data in two directions.
First, they need to ensure that the link between me and my video purchase history is protected. Second, they cannot reveal who all the purchasers of a given video are. Disclosures in both of these scenarios are restricted. Doing this well is difficult unless you create separate databases for purchases and users, but that separation makes it difficult to automate suggestions, a key element in providing a winning digital experience.
If you govern and de-identify a purchaser's identifying information, you can protect purchase history data from internal users who lack authorization. This can even include the computer scientists developing movie suggestion algorithms.
At the same time, you can prevent a mass release of user video purchases and viewing histories. All that is shown to those without authorization is scrambled information, whether they are looking at a customer's purchases or at who has accessed a given video.
Clearly, in an appropriate customer support situation, a user's purchase history can be unscrambled to reveal customer specifics. Imagine a customer service rep needing this information when a client argues that they did not purchase or rent a video. So how would this work?
In intelligent, dynamic, data-centric and person-centric protection (we call this data-centric security), protection is applied to the databases containing customer information. Governance and control of that data are centralized, and de-identification protects customers' identities as they relate to their video rentals and purchases.
De-identification leveraging tokenization uses consistent tokens as substitutes for identifying information so that identities of purchasers are always protected, even during processing. With centralized governance of de-identification, data remains protected wherever it goes.
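To make this concrete, here is a minimal Python sketch of deterministic tokenization. The secret key, in-memory token vault and role check are hypothetical stand-ins for a real key-management service and governance engine; the point is that the same customer always maps to the same token, so purchase records can still be joined and analyzed, while only an authorized role can reverse the mapping.

```python
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-managed-secret"  # stand-in for a key-management service
_vault = {}  # token -> original value; readable only by authorized services

def tokenize(value: str) -> str:
    """Return a consistent token for a value; the same input always yields the
    same token, so recommendation models keep working on de-identified data."""
    token = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]
    _vault[token] = value  # stored so an authorized party can re-identify later
    return token

def detokenize(token: str, role: str) -> str:
    """Re-identify a token, but only for roles the governance policy allows."""
    if role != "customer_support":
        raise PermissionError("role is not authorized to re-identify customers")
    return _vault[token]

# Purchase records carry tokens instead of customer identities.
purchase = {"customer": tokenize("jane.doe@example.com"), "title": "Casablanca"}
print(purchase["customer"])                                  # consistent token for this customer
print(detokenize(purchase["customer"], "customer_support"))  # authorized re-identification
```

This is also how the customer support scenario above could work: the rep's role is allowed to re-identify a single record, while analysts and recommendation systems only ever see the tokens.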
The power of this approach can be understood by considering a healthcare example: doctors should see entire medical records but not financial information, while researchers studying how to derive better care should see entire medical records but not who they belong to.
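As a hedged illustration of that kind of role-based visibility, here is a small sketch of policy-driven field masking. The roles and field names are hypothetical, not any particular product's policy language.

```python
# Each role sees only the fields its policy allows; everything else is masked
# before the record leaves the data layer.
POLICY = {
    "doctor":     {"name", "diagnosis", "medications"},  # clinical data, no financials
    "researcher": {"diagnosis", "medications"},          # clinical data, no identity
    "billing":    {"name", "insurance", "balance"},      # financials, no clinical detail
}

def apply_policy(record: dict, role: str) -> dict:
    """Return a copy of the record with unauthorized fields masked."""
    allowed = POLICY.get(role, set())
    return {field: (value if field in allowed else "***")
            for field, value in record.items()}

patient = {
    "name": "Jane Doe",
    "diagnosis": "hypertension",
    "medications": "lisinopril",
    "insurance": "Acme Health",
    "balance": "$120",
}
print(apply_policy(patient, "doctor"))
print(apply_policy(patient, "researcher"))
```

The same pattern applies to the video case: a recommendation engine sees titles and tokens, while only customer support sees the customer behind the token.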
Customer data matters to all businesses, including studios and organizations renting and selling videos. However, the ability to connect a customer with their video history needs to be available only to authorized data stakeholders. Without these safeguards, the bad guys only need to acquire or compromise one privileged person's credentials to get into your entire store of video purchase histories.
Meeting compliance and privacy expectations means raising the bar to make it harder to access sensitive information. With traditional methods like encryption, if a privileged administrator's credentials are compromised, the attacker has everything. Centralized control of data policies, governance and de-identification allows firms to overcome the limitations of encryption alone.
Article written by Myles Suer