These principles are an attempt to illuminate a belief system in which the seemingly opposing motives of creating corporate profit and respecting individual privacy can live in harmony. Here you may find a meeting ground that enables both your organization and your customers to profit—each in their own way.
Data about people is valuable in and of itself.¶
Data provides commercial value to businesses in addition to its inherent value from a personal perspective. It also provides value as a medium of exchange and as a unique identifier for building social connections. A privacy engineer understands this principle as bedrock and strives to find innovative ways to extend the value of data while protecting its inherent value.
A privacy engineer needs more than just technical skills to protect and extend the value of data.¶
Protecting the inherent value of data that is attained from or attributable to human beings requires a number of different perspectives and skill sets to be effective. The privacy engineer, practicing a modern Renaissance-style discipline, views personal data through legal, creative, and personal lenses.
A privacy engineer draws from artistic creativity and expression to innovate and communicate.¶
Beyond learning from sister disciplines to add to the known world of technology, the privacy engineer seeks to create simplicity, clarity, and beauty to engage and inform users and owners of systems. The tools of engagement can use sound, taste, touch, sight, smell, intuition, or any other artistic medium. Technologies, policies, laws, organization, and metric modalities all have interfaces. Effective interfaces can be engaging, challenging, educational, elegant, emotive, and even beautiful where innovation meets art.
A privacy engineer learns from, but is not bound by, the failures of the past.¶
While building on past successes as well as the remnants of previous attempts at success, a privacy engineer closely regards and incorporates existing tools, policies, and frameworks as scaffolding to create something wonderful. (Borrowed heavily from Intel founder Robert Noyce.) A privacy engineer strives to map and develop data systems in a scientific fashion in order to create new or improved means of delivering value to all parties who have a vested interest in the data.
We are all privacy engineers.¶
We all possess or are the subject of PI (personal information) and have a vested interest in protecting it. Some of us have occasion to operate as “professional privacy engineers,” but all of us at least operate as “citizen privacy engineers” when we act as stewards of our own PI and the PI of others.
For the privacy engineer, with the mantra to innovate comes the mantra to do no harm.¶
The privacy engineer’s goal should be to harness the inherent value of data and innovate to create additional value. But the most basic requirement for the privacy engineer is to do no harm and to plan to eliminate as much secondary or unanticipated harm as possible.
Innovation and complexity need not be the adversary of privacy engineering, although failure of imagination may be.¶
What is not thought of cannot be recognized and therefore cannot be managed. Failures of imagination are thus the biggest enemy of the privacy engineer. Failure to imagine a new possibility means that a value creating opportunity or a risk mitigation opportunity has been missed.
The privacy engineer must be able to understand, calculate, mitigate, and accept risk.¶
The privacy engineer cannot ignore risk or fall prey to the idea that it can be completely eliminated. By embracing both risk and value, the privacy engineer can strive to find solutions that deliver maximum value at an acceptable risk level to the organization and the individual.
Privacy engineering happens inside and outside of code.¶
Coding, building systems, and the business processes that support the product lifecycle are all critical. A foundation of privacy principles and operational business processes can support the development of products that promote privacy. At the same time, the individual developer may see opportunities for innovation that can be envisioned only by one who is at the proverbial drawing board.
A privacy engineer needs to differentiate between bad ideas and bad implementations.¶
A bad idea is one that goes against privacy principles or lacks sound judgment about using and protecting PI. A bad implementation is when the design goal is sound but the implementation is not due to poor usability, unmitigated risks, or an approach that weakens the bond of trust with users. In the latter scenario, a bad implementation that may harm data privacy may be rearchitected or protected in another layered fashion, whereas, in the former, a bad idea should be acknowledged and quickly ended before damage is done.
Michelle Finneran Dennedy, Jonathan Fox, and Thomas R. Finneran. Copyright 2014 Apress.