Privacy Management Reference Model and Methodology (PMRM) Version 1.0

Committee Specification Draft 01

26 March 2012

Specification URIs

This version:

http://docs.oasis-open.org/pmrm/PMRM/v1.0/csd01/PMRM-v1.0-csd01.pdf (Authoritative)

http://docs.oasis-open.org/pmrm/PMRM/v1.0/csd01/PMRM-v1.0-csd01.html

http://docs.oasis-open.org/pmrm/PMRM/v1.0/csd01/PMRM-v1.0-csd01.doc

Previous version:

N/A

Latest version:

http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.pdf (Authoritative)

http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.html

http://docs.oasis-open.org/pmrm/PMRM/v1.0/PMRM-v1.0.doc

Technical Committee:

OASIS Privacy Management Reference Model (PMRM) TC

Chairs:

John Sabo (john.t.sabo@ca.com), CA Technologies

Michael Willett (mwillett@nc.rr.com), Individual

Editors:

John Sabo (john.t.sabo@ca.com), CA Technologies

Michael Willett (mwillett@nc.rr.com), Individual

Peter F Brown (peter@peterfbrown.com), Individual

Dawn N Jutla (dawn.jutla@smu.ca), Saint Mary’s University

Abstract:

The Privacy Management Reference Model and Methodology (PMRM, pronounced “pim-rim”) provides a model and a methodology for:

·         understanding and analyzing privacy policies and their privacy management requirements in defined use cases; and

·         selecting the technical services which must be implemented to support privacy controls.

It is particularly relevant for use cases in which personal information (PI) flows across regulatory, policy, jurisdictional, and system boundaries.

Status:

This document was last revised or approved by the OASIS Privacy Management Reference Model (PMRM) TC on the above date. The level of approval is also listed above. Check the “Latest version” location noted above for possible later revisions of this document.

Technical Committee members should send comments on this specification to the Technical Committee’s email list. Others should send comments to the Technical Committee by using the “Send A Comment” button on the Technical Committee’s web page at http://www.oasis-open.org/committees/pmrm/.

For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the Technical Committee web page (http://www.oasis-open.org/committees/pmrm/ipr.php).

Citation format:

When referencing this specification the following citation format should be used:

[PMRM-v1.0]

Privacy Management Reference Model and Methodology (PMRM) Version 1.0. 26 March 2012. OASIS Committee Specification Draft 01. http://docs.oasis-open.org/pmrm/PMRM/v1.0/csd01/PMRM-v1.0-csd01.html.

 

 

Notices

Copyright © OASIS Open 2012. All Rights Reserved.

All capitalized terms in the following text have the meanings assigned to them in the OASIS Intellectual Property Rights Policy (the "OASIS IPR Policy"). The full Policy may be found at the OASIS website.

This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published, and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this section are included on all such copies and derivative works. However, this document itself may not be modified in any way, including by removing the copyright notice or references to OASIS, except as needed for the purpose of developing any document or deliverable produced by an OASIS Technical Committee (in which case the rules applicable to copyrights, as set forth in the OASIS IPR Policy, must be followed) or as required to translate it into languages other than English.

The limited permissions granted above are perpetual and will not be revoked by OASIS or its successors or assigns.

This document and the information contained herein is provided on an "AS IS" basis and OASIS DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY OWNERSHIP RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

OASIS requests that any OASIS Party or any other party that believes it has patent claims that would necessarily be infringed by implementations of this OASIS Committee Specification or OASIS Standard, to notify OASIS TC Administrator and provide an indication of its willingness to grant patent licenses to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification.

OASIS invites any party to contact the OASIS TC Administrator if it is aware of a claim of ownership of any patent claims that would necessarily be infringed by implementations of this specification by a patent holder that is not willing to provide a license to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification. OASIS may include such claims on its website, but disclaims any obligation to do so.

OASIS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on OASIS' procedures with respect to rights in any document or deliverable produced by an OASIS Technical Committee can be found on the OASIS website. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementers or users of this OASIS Committee Specification or OASIS Standard, can be obtained from the OASIS TC Administrator. OASIS makes no representation that any information or list of intellectual property rights will at any time be complete, or that any claims in such list are, in fact, Essential Claims.

The name "OASIS" is a trademark of OASIS, the owner and developer of this specification, and should be used only to refer to the organization and its official outputs. OASIS welcomes reference to, and implementation and use of, specifications, while reserving the right to enforce its marks against misleading uses. Please see http://www.oasis-open.org/who/trademark.php for above guidance.

 

Table of Contents

1 Introduction

1.1 Context

1.2 Objectives

1.3 Target Audience

1.4 Specification Summary

1.5 Terminology

1.6 Normative References

1.7 Non-Normative References

2 High-Level Privacy Analysis and Use Case Description

2.1 Application and Business Process Descriptions

Task #1: Use Case Description

Task #2: Use Case Inventory

2.2 Applicable Privacy Policies

Task #3: Privacy Policy Conformance Criteria

2.3 Initial Privacy Impact (or other) Assessment(s) [optional]

Task #4: Assessment Preparation

3 Detailed Privacy Use Case Analysis

3.1 Use Case Development

Task #5: Identify Actors

Task #6: Identify Systems

Task #7: Identify Privacy Domains and Owners

Task #8: Identify Roles and Responsibilities within a Domain

Task #9: Identify Touch Points

Task #10: Identify Data Flows

3.2 Identify PI in Use Case Privacy Domains and Systems

Incoming PI

Internally Generated PI

Outgoing PI

Task #11: Identify Incoming/Internally Generated/Outgoing PI

3.3 Specify Required Privacy Controls

Task #12: Specify Inherited Privacy Controls

Task #13: Specify Internal Privacy Controls

Task #14: Specify Exported Privacy Controls

4 Services Supporting Privacy Controls

4.1 Services Needed to Implement the Controls

4.2 Service Details and Function Descriptions

4.2.1 Core Policy Services

1. Agreement Service

2. Usage Service

4.2.2 Privacy Assurance Services

3. Validation Service

4. Certification Service

5. Enforcement Service

6. Security Service

4.2.3 Presentation and Lifecycle Services

7. Interaction Service

8. Access Service

4.3 Services Satisfying the Privacy Controls

Task #15: Identify the Services that conform to the identified privacy controls

4.4 Define the Technical Functionality and Business Processes Supporting the Selected Services

4.4.1 Functions Satisfying the Selected Services

Task #16: Identify the Functions that satisfy the selected Services

4.5 Risk Assessment

Task #17: Conduct Risk Assessment

4.6 Iterative Process

Task #18: Iterate the analysis and refine

5 PMRM Glossary, plus Operational Definitions for Fair Information Practices/Principles (“FIPPs”)

5.1 Operational FIPPs

5.2 Glossary

Appendix A. Acknowledgments

Appendix B. Revision History

 

 


1      Introduction

The Privacy Management Reference Model and Methodology (PMRM) addresses the reality of today’s networked, interoperable capabilities, applications and devices and the complexity of managing personal information (PI)[1] across legal, regulatory and policy environments in interconnected domains.  It is a valuable tool that helps improve privacy management and compliance in cloud computing, health IT, smart grid, social networking, federated identity and similarly complex environments where the use of personal information is governed by laws, regulations, business contracts and other policies, but where traditional enterprise-focused models are inadequate. It can be of value to business and program managers who need to understand the implications of privacy policies for specific business systems and to help assess privacy management risks.

The PMRM is neither a static model nor a purely prescriptive set of rules (although it includes characteristics of both), and implementers have flexibility in determining the level and granularity of analysis required by a particular use case. The PMRM can be used by systems architects to inform the development of a privacy management architecture.  The PMRM may also be useful in fostering interoperable policies and policy management standards and solutions. In many ways, the PMRM enables “privacy by design” because of its analytic structure and primarily operational focus.

1.1 Context

Predictable and trusted privacy management must function within a complex, inter-connected set of networks, systems, applications, devices, data, and associated governing policies.  Such a privacy management capability is needed both in traditional computing and in cloud computing capability delivery environments.  A useful privacy management capability must be able to establish the relationship between personal information (“PI”) and associated privacy policies in sufficient granularity to enable the assignment of privacy management functionality and compliance controls throughout the lifecycle of the PI.  It must also accommodate a changing mix of PI and policies, whether inherited or communicated to and from external domains or imposed internally. It must also include a methodology to carry out a detailed, structured analysis of the application environment and create a custom privacy management analysis (PMA) for the particular use case.

1.2 Objectives

The PMRM is used to analyze complex use cases, to understand and implement appropriate operational privacy management functionality and supporting mechanisms, and to achieve compliance across policy, system, and ownership boundaries.

In addition to serving as an analytic tool, the PMRM can aid the design of a privacy management architecture in response to use cases and as appropriate for a particular operational environment. It can also be used to help in the selection of integrated mechanisms capable of executing privacy controls in line with privacy policies, with predictability and assurance.  Such an architectural view is important, because business and policy drivers are now both more global and more complex and must thus interact with many loosely-coupled systems.

In addition, multiple jurisdictions, inconsistent and often-conflicting laws, regulations, business practices, and consumer preferences, together create huge barriers to online privacy management and compliance. It is unlikely that these barriers will diminish in any significant way, especially in the face of rapid technological change and innovation and differing social and national values, norms and policy interests.

The Privacy Management Reference Model and Methodology therefore provides policymakers, program and business managers, system architects and developers with a tool to improve privacy management and compliance in multiple jurisdictional contexts while also supporting capability delivery and business objectives. In this Model, the controls associated with privacy (including security) will be flexible, configurable and scalable and make use of technical mechanisms, business process and policy components. These characteristics require a specification that is policy-configurable, since there is no uniform, internationally-adopted privacy terminology and taxonomy.

Analysis and documentation produced using the PMRM will result in a Privacy Management Analysis (PMA) that serves multiple stakeholders, including privacy officers and managers, general compliance managers, and system developers. While other privacy instruments, such as privacy impact assessments (“PIAs”), also serve multiple stakeholders, the PMRM does so in a way that is somewhat different from these others. Such instruments, while nominally of interest to multiple stakeholders, tend to serve particular groups. For example, PIAs are often of most direct concern to privacy officers and managers, even though developers are often tasked with contributing to them. Such privacy instruments also tend to change hands on a regular basis. As an example, a PIA may start out in the hands of the development or project team, move to the privacy or general compliance function for review and comment, go back to the project for revision, move back to the privacy function for review, and so on. This iterative process of successive handoffs is valuable, but can easily devolve into a challenge and response dynamic that can itself lead to miscommunication and misunderstandings.

The PMRM process output, in contrast, should have direct and ongoing relevance for all stakeholders and is less likely to suffer the above dynamic. This is because it should be considered as a “boundary object,” a construct that supports productive interaction and collaboration among multiple communities. Although a boundary object is fully and continuously a part of each relevant community, each community draws from it meanings that are grounded in the group’s own needs and perspectives. As long as these meanings are not inconsistent across communities, a boundary object acts as a shared yet heterogeneous understanding. The PMRM process output, if properly generated, constitutes just such a boundary object. It is accessible and relevant to all stakeholders, but each group takes from it and attributes to it what they specifically need. As such, the PMRM can facilitate collaboration across relevant communities in a way that other privacy instruments often cannot.

1.3 Target Audience

The intended audiences of this document and expected benefits to be realized include:

·         Privacy and Risk Officers will gain a better understanding of the specific privacy management environment for which they have compliance responsibilities, as well as of the detailed policy and operational processes and technical systems needed to achieve their organization’s privacy compliance;

·         Systems/Business Architects will have a series of templates for the rapid development of core systems functionality, developed using the PMRM as a tool.

·         Software and Service Developers will be able to identify what processes and methods are required to ensure that personal data is created and managed in accordance with requisite privacy provisions.

·         Public policy makers will be able to identify any weaknesses or shortcomings of current policies and use the PMRM to establish best practice guidelines where needed.

1.4 Specification Summary

The PMRM consists of:

·         A conceptual model of privacy management, including definitions of terms;

·         A methodology; and

·         A set of operational services, together with the inter-relationships among these three elements.


Figure 1 – The PMRM Conceptual Model

In Figure 1, we see that the core concern of privacy protection (by users, policy makers, solution providers, etc.) helps, on the one hand, to drive policy and principles (which in turn influence actual regulation and lawmaking) and, on the other, to inform the use cases that are developed to address the specific architecture and solutions required by the stakeholders in a particular domain.

Legislation in its turn is a major influence on privacy controls – indeed, privacy controls are often expressed as policy objectives rather than as specific technology solutions – and these form the basis of the PMRM Services that are created to conform to those controls when implemented.

The PMRM conceptual model is anchored in the principles of Service-Oriented Architecture (and particularly the principle of services operating across ownership boundaries). Given the general reliance by the privacy policy community on non-uniform definitions of so-called “Fair Information Practices/Principles” (FIP/Ps), a non-normative, working set of operational privacy definitions (see section 5.1) is used to provide a foundation for the Model.  With their operational focus, these working definitions are not intended to supplant or to in any way suggest a bias for or against any specific policy or policy set.  However, they may prove valuable as a tool to help deal with the inherent biases built into current terminology associated with privacy and to abstract their operational features.

The PMRM methodology covers a series of tasks, outlined in the following sections of the document, concerned with:

·         defining and describing use-cases;

·         identifying particular business domains and understanding the roles played by all actors and systems within that domain in relation to privacy issues;

·         identifying the data flows and touch-points for all personal information within a privacy domain;

·         specifying various privacy controls;

·         mapping technical and process mechanisms to operational services;

·         performing risk and compliance assessments.

The specification also defines a set of Services deemed necessary to implement the management and compliance of detailed privacy requirements within a particular use case.  The Services are sets of functions which form an organizing foundation to facilitate the application of the model and to support the identification of the specific mechanisms which will be incorporated in the privacy management architecture appropriate for that use case. The set of operational Services (Agreement, Usage, Validation, Certification, Enforcement, Security, Interaction, and Access) is described in Section 4 below.

The core of the specification is expressed in two normative sections: the High-Level Privacy Analysis and Use Case Description (Section 2) and the Detailed Privacy Use Case Analysis (Section 3). The detailed analysis is informed by the general findings of the high-level analysis.  However, it is much more detail-focused and requires development of a use case which clearly expresses the complete application and/or business environment within which personal information is collected, communicated, processed, stored, and disposed of.

It is also important to point out that the model is not generally prescriptive and that users of the model may choose to adopt some parts of the model and not others.  However, a complete use of the model will contribute to a more comprehensive privacy management architecture for a given capability or application.  As such, the PMRM may serve as the basis for the development of privacy-focused capability maturity models and improved compliance frameworks. The PMRM provides a model foundation on which to build privacy architectures.

Use of the PMRM by and within a particular business domain and context (with a suitable Use Case), will lead to the production of a Privacy Management Analysis (PMA). An organization may have one or more PMAs, particularly across different business units, or it may have a unified PMA. Theoretically, a PMA may apply across organizations, states, and even countries or other geo-political regions.

Figure 2 below shows the high-level view of the PMRM methodology that is used to create a PMA. Although the stages are numbered for clarity, no step is an absolute pre-requisite for starting work on another step and the overall process will usually be iterative. Equally, the process of establishing an appropriate privacy architecture, and determining when and how technology implementation will be carried out, can both be started at any stage during the overall process.


Figure 2 – The PMRM Methodology

1.5 Terminology

References are surrounded with [square brackets] and are in bold text.

The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described in [RFC2119].

A glossary of key terms used in this specification as well as operational definitions for sample Fair Information Practices/Principles (“FIP/Ps”) are included in Section 5 of the document.  We note that words and terms used in the discipline of data privacy in many cases have meanings and inferences associated with specific laws, regulatory language, and common usage within privacy communities.  The use of such well-established terms in this specification is unavoidable. However we urge readers to consult the definitions in the glossary and clarifications in the text to reduce confusion about the use of such terms within this specification.

1.6 Normative References

[RFC2119]               S. Bradner, Key words for use in RFCs to Indicate Requirement Levels, http://www.ietf.org/rfc/rfc2119.txt, IETF RFC 2119, March 1997.

1.7 Non-Normative References

[SOA-RM]               OASIS Standard, "Reference Model for Service Oriented Architecture 1.0”, 12 October 2006. http://docs.oasis-open.org/soa-rm/v1.0/soa-rm.pdf       

[NIST 800-53]          “Security and Privacy Controls for Federal Information Systems and Organizations – Appendix J: Privacy Controls Catalog”, NIST Special Publication 800-53 Draft Appendix J, July 2011.

2      High-Level Privacy Analysis and Use Case Description

The first phase in applying the PMRM methodology requires scoping the application or business service with which personal information (PI) is associated – in effect, identifying the complete environment of the application or capability in which privacy and data protection requirements apply. The extent of the scoping analysis and the definitions of “application” or “business capability” are set by the entity utilizing the PMRM. These may be defined broadly or narrowly, and may include lifecycle (time) elements.

The high level analysis may also make use of privacy impact assessments, previous risk assessments, privacy maturity assessments, compliance reviews, and accountability model assessments as determined by the user of the PMRM. However, the scope of the high level privacy analysis (including all aspects of the capability or application under review and all relevant privacy policies) must correspond with the scope of the second phase, covered in Section 3, “Detailed Privacy Use Case Analysis”, below.

2.1 Application and Business Process Descriptions

Task #1:     Use Case Description

Objective          Provide a general description of the Use Case.

Example

A California utility, whose residential customer base has smart meters installed, wants to promote the increased use of electric vehicles in its service area by offering significantly reduced electricity rates for nighttime recharging of vehicle batteries. The system also permits a customer to use the charging station at another customer’s site (such as at a friend’s house) and have the system bill the vehicle owner instead of the customer whose charging station is used.

The customer plugs in the car and requests “charge at cheapest rates”. The utility is notified of the car’s presence, its ID number, and the approximate charge required (provided by the car’s on-board computer). The utility schedules the recharge to take place during the evening hours and at different times than other EV charging (thus putting diversity into the load).

The billing department now calculates the amount of money to charge the EV customer based on EV rates and for the measured time period.

The same EV customer drives to the home of a friend who also has an EV and requests a quick charge to make sure that he can get back home. When he plugs his EV into his friend’s EV charger, the utility identifies that the EV belongs to a different customer and places the charging bill on the correct person’s invoice.

The billing department now calculates the amount of money to invoice the customer who owns the EV, based on EV rates and for the measured time period.

Task #2:     Use Case Inventory

Objective          Provide an inventory of the capabilities, applications and policy environment under review at the level of granularity appropriate for the analysis covered by the PMRM and define a High Level Use Case which will guide subsequent analysis. In order to facilitate the analysis described in the Detailed Privacy Use Case Analysis in Section 3, the components of the Use Case Inventory should align as closely as possible with the components that will be analyzed in the corresponding detailed use case analysis.

Context            The inventory can include applications and business processes; products; policy environment; legal and regulatory jurisdictions; systems supporting the capabilities and applications; data; time; and other factors impacting the collection, communication, processing, storage and disposition of PI. The inventory should also include the types of data subjects covered by the use case together with individual user privacy options (such as policy preferences or privacy settings), if these are formally expressed.

Example

Systems:       Utility Communications Network, Customer Billing System, EV On Board System…

Legal and Regulatory Jurisdictions:

                     California Constitution, Article 1, section 1 gives each citizen an "inalienable right" to pursue and obtain "privacy."

                     Office of Privacy Protection - California Government Code section 11549.5.

                     Automobile "Black Boxes" - Vehicle Code section 9951.

                     …

Personal Information Collected on Internet:

                     Government Code section 11015.5. This law applies to state government agencies…

                     The California Public Utilities Commission, which “serves the public interest by protecting consumers and ensuring the provision of safe, reliable utility service and infrastructure at reasonable rates, with a commitment to environmental enhancement and a healthy California economy”…

Policy:           The Utility has a published Privacy Policy covering the EV recharging/billing application

 

Customer:      The Data Subject can accept default settings for all customer-facing interfaces or customize the settings.

2.2 Applicable Privacy Policies 

Task #3:     Privacy Policy Conformance Criteria

Objective          Define and describe the criteria for conformance of a system or business process (identified in the use case and inventory) with an applicable privacy policy. As with the Use Case Inventory described in Task #2 above, the conformance criteria should align with the equivalent elements in the Detailed Privacy Use Case Analysis described in Section 3. Wherever possible, they should be grouped by the relevant FIP/Ps and expressed as privacy constraints.

Note that whereas Task #2 itemizes the environmental elements relevant to the Use Case, Task #3 focuses on the privacy requirements specifically.

Example

Privacy Policy Conformance Criteria:

(1) Ensure that the utility does not share data with third parties without the consumer’s consent…etc.

(2) Ensure that the utility supports strong levels of:

(a) Identity authentication

(b) Security of transmission between the charging stations and the utility information systems…etc.

(3) Ensure that personal data is deleted on expiration of retention periods…
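Where helpful, conformance criteria such as these can be captured in machine-readable form so that the controls and Services selected in later tasks can be traced back to a specific criterion. The following is a minimal, hypothetical sketch in Python; the record fields and FIP/P groupings are illustrative assumptions, not defined by this specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConformanceCriterion:
    """One privacy policy conformance criterion, grouped by FIP/P."""
    criterion_id: str
    fipp: str        # hypothetical FIP/P grouping (see Section 5.1)
    constraint: str  # the criterion expressed as a privacy constraint

# The three example criteria above, expressed as structured records.
CRITERIA = [
    ConformanceCriterion("C1", "Use Limitation",
        "Do not share data with third parties without the consumer's consent"),
    ConformanceCriterion("C2", "Security Safeguards",
        "Support strong identity authentication and secure transmission "
        "between charging stations and utility information systems"),
    ConformanceCriterion("C3", "Retention",
        "Delete personal data on expiration of retention periods"),
]
```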

2.3 Initial Privacy Impact (or other) Assessment(s) [optional]

Task #4:     Assessment Preparation

Objective          Prepare an initial privacy impact assessment, or as appropriate, a risk assessment, privacy maturity assessment, compliance review, or accountability model assessment applicable within the scope of analysis carried out in Sections 2.1 and 2.2. Such an assessment can be deferred until a later iteration step (see Section 4.6) or inherited from a previous exercise.

Example

Since the Electric Vehicle (EV) has a unique ID, it can be linked to an individual. Individuals’ whereabouts may be tracked through utility transaction visibility…

The EV charging and vehicle management system may retain data which can be used to identify patterns of charging and location information that can constitute PI.

Unless safeguards are in place and (where appropriate) under the user’s control, there is a danger that intentionally anonymized PI nonetheless becomes PII…

The utility wishes to capture behavioral and movement patterns and sell this information to potential advertisers or other information brokers to generate additional revenue. This information constitutes PII. The collection and use of this information should take place only with the explicit, informed consent of the user.

3      Detailed Privacy Use Case Analysis

3.1 Use Case Development

Goal                 Prepare and document a detailed Privacy Management Analysis of the Use Case which corresponds with the High Level Privacy Analysis and the High Level Use Case Description.

Constraint        The Detailed Use Case must be clearly bounded and must include the following components.

Task #5:     Identify Actors 

Objective          Identify actors having operational privacy responsibilities.

Definition         An Actor is a data subject, or a human or non-human agent, interacting with PI managed by a System within a Privacy Domain.

                        A “domain” covers both physical areas (such as a customer site or home) and logical areas (such as a wide-area network or cloud computing environment) that are subject to the control of a particular domain owner.

Example

Actors Located at the Customer Site:

      Customer, Guest

Actors Located at the EV’s Location:

      Non-Customer Host (Temporary host for EV charging)

Actors Located within the Utility’s domain:

      Service Provider (Utility)

      Contractors and Suppliers to the Utility

Task #6:     Identify Systems

Objective          Identify the Systems where PI is collected, communicated, processed, stored or disposed within a Privacy Domain.

Definition         For purposes of this specification, a System is a collection of components organized to accomplish a specific function or set of functions having a relationship to operational privacy management.

Example

Located at the Customer Site:

         Customer Communication Portal

         EV Physical Re-Charging and Metering System

Located in the EV:

         EV: Device

         EV On-Board System: System

Located within the EV manufacturer’s domain:

         EV Charging Data Storage and Analysis System

Located within the Utility’s domain:

         EV Program Information System (includes Rates, Customer Charge Orders, Customers enrolled in the program, Usage Info etc.)

         EV Load Scheduler System

         Utility Billing System

         Remote Charge Monitoring System

         Partner marketing system for transferring usage pattern and location information

Task #7:     Identify Privacy Domains and Owners

Objective          Identify the Privacy Domains included in the use case together with the respective Domain Owners.

Definition         Privacy Domains are the physical or logical areas within the use case subject to control by Domain Owners.

                          Domain Owners are entities responsible for ensuring that privacy controls and PMRM services are managed in business processes and technical systems within a given Domain.

Context            Privacy Domains may be under the control of individuals or data subjects; data controllers; capability providers; data processors; and other distinct entities having defined operational privacy management responsibilities.

Rationale          Domain Owner identification is important for purposes of establishing accountability.

Example

Utility Domain:

         The physical premises located at…. which includes the Utility’s program information system, load scheduling system, billing system, and remote monitoring system

         This physical location is part of a larger logical privacy domain, owned by the Utility, which extends to the Customer Portal Communication system at the Customer’s site and the EV On-Board software application System installed in the EV by the Utility, together with cloud-based services hosted by…..

Customer Domain:

         The physical extent of the customer’s home and adjacent land as well as the EV, wherever located, together with the logical area covered by devices under the ownership and control of the customer (such as mobile devices).

Example

         The EV On-Board System belongs to the utility Privacy Domain Owner.

         The EV (with its ID Number) belongs to the Customer Domain Owner and the Vehicle Manufacturer Domain Owner, but the EV ID may be accessed by the Utility.

Task #8:     Identify Roles and Responsibilities within a Domain

Objective          For any given use case, identify the roles and responsibilities assigned to specific actors within a specific privacy domain.

Rationale          Any individual or position may carry multiple roles and responsibilities, and these need to be distinguishable, particularly because many functions involved in the processing of PI are assigned to a person or other actor according to explicit roles and authority to act, rather than to the person or actor as such.

Example

Role:                    EV Manufacturer Privacy Officer

Responsibilities:   Ensure that all PI data flows from the EV On-Board System conform both with contractual obligations towards the Utility and with the Collection Limitation and Information Minimization FIP/Ps.

Task #9:     Identify Touch Points

Objective          Identify the touch points at which the data flows intersect with Privacy Domains or Systems within Privacy Domains.

Definition         Touch Points are the intersections of data flows with Privacy Domains or Systems within Privacy Domains.

Rationale          The main purpose for identifying touch points in the use case is to clarify the data flows and ensure a complete picture of all Privacy Domains and Systems in which PI is used.

Example

The Communication Interfaces whereby actors send and receive data are touch points. For instance, the Customer Communication Portal provides an interface via which the Customer communicates a charge order to the Utility.

When the customer plugs into the charging station, the EV On-Board System likewise embeds communication functionality that acts as its touch point for sending the EV ID and EV Charge Requirements to the Customer Communication Portal.

Task #10:  Identify Data Flows

Objective          Identify the data flows carrying PI and privacy constraints among Domains in the Use Case.

Constraint        Data flows may be multidirectional or unidirectional.

Example

When a charging request event occurs, the Customer Communication Portal sends Customer information, EV identification, and Customer Communication Portal location information to the EV Program Information System managed by the Utility.

This application uses metadata tags to indicate whether or not customers’ identification and location data may be shared (and then, only with authorized third parties), and prohibits the sharing of data that reveals customers’ movement history, if derived from an aggregation of transactions.
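As one possible realization of such tagging, the sketch below attaches sharing metadata to each PI item in a data flow and applies it before any disclosure. All names and field choices are hypothetical; the PMRM does not prescribe a tagging format.

```python
from dataclasses import dataclass, field

@dataclass
class PIItem:
    """A unit of PI in a data flow, carrying sharing-metadata tags."""
    name: str
    value: str
    shareable: bool = False                 # may this item be shared at all?
    authorized_parties: set = field(default_factory=set)

def may_share(item: PIItem, recipient: str) -> bool:
    """Share only if the item is tagged shareable AND the recipient
    is an authorized third party."""
    return item.shareable and recipient in item.authorized_parties

# Customer location may go to the utility; aggregated movement history
# is tagged non-shareable, as the policy above prohibits.
location = PIItem("portal_location", "<coordinates>",
                  shareable=True, authorized_parties={"utility"})
movement_history = PIItem("movement_history", "<aggregated transactions>")

assert may_share(location, "utility")
assert not may_share(movement_history, "advertiser")
```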

3.2 Identify PI in Use Case Privacy Domains and Systems

Objective          Specify the PI collected, created, communicated, processed or stored within Privacy Domains or Systems in three categories.

Incoming PI

Definition         Incoming PI is PI flowing into a Privacy Domain, or a system within a Privacy Domain.

Constraint        Incoming PI may be defined at whatever level of granularity is appropriate for the scope of analysis of the Use Case and the Privacy Policies established in Section 2.

Internally Generated PI

Definition         Internally Generated PI is PI created within the Privacy Domain or System itself.

Constraint        Internally Generated PI may be defined at whatever level of granularity is appropriate for the scope of analysis of the Use Case and the Privacy Policies established in Section 2.

Example           Examples include device information, time-stamps, location information, and other system-generated data that may be linked to an identity.

Outgoing PI

Definition         Outgoing PI is PI flowing out of one system to another system within a Privacy Domain or to another Privacy Domain.

Constraint        Outgoing PI may be defined at whatever level of granularity is appropriate for the scope of analysis of the Use Case and the Privacy Policies established in Section 2.

Task #11:  Identify Incoming/Internally Generated/Outgoing PI

Example

Incoming PI:

         Customer ID received by Customer Communications Portal

Internally Generated PI:

         Current EV location logged by EV On-Board system

Outgoing PI:

         Current EV location transmitted to Utility Load Scheduler System
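The resulting PI inventory can be recorded per System and per category, which keeps the later Service selection (Task #15) on the same granular basis. A minimal, hypothetical sketch:

```python
from enum import Enum

class PICategory(Enum):
    INCOMING = "incoming"
    INTERNAL = "internally generated"
    OUTGOING = "outgoing"

# PI inventory keyed by (System, category), mirroring the example above.
PI_INVENTORY = {
    ("Customer Communications Portal", PICategory.INCOMING):
        ["Customer ID"],
    ("EV On-Board System", PICategory.INTERNAL):
        ["Current EV location (logged)"],
    ("EV On-Board System", PICategory.OUTGOING):
        ["Current EV location (to Utility Load Scheduler System)"],
}
```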

3.3 Specify Required Privacy Controls

Goal                 For Incoming, Internally Generated and Outgoing PI, specify the privacy controls required to enforce the privacy policy associated with the PI. Privacy controls may be pre-defined or may be derived. In either case, privacy controls are typically associated with specific Fair Information Practices Principles (FIP/Ps) that apply to the PI.

Definition         Control is a process designed to provide reasonable assurance regarding the achievement of stated objectives.

Definition         Privacy Controls are administrative, technical and physical safeguards employed within an organization in order to protect PI. They are the means by which privacy policies are satisfied in an operational setting.

Task #12:  Specify Inherited Privacy Controls

Objective          Specify the required Privacy Controls which are inherited from Privacy Domains or Systems within Privacy Domains.

Example:

The utility inherits a Privacy Control associated with the Electric Vehicle’s ID (EVID) from the vehicle manufacturer’s privacy policies.

The utility inherits the consumer’s Operational Privacy Control Requirements, expressed as privacy preferences, via a link with the customer communications portal when customer Jane plugs her EV into her friend Rick’s charging station. The utility must apply Jane’s privacy preferences to the current transaction. The Utility accesses Jane’s privacy preferences and learns that Jane does not want her association with Rick exported to the Utility’s third-party partners. Even though Rick’s privacy settings for his own PI differ, Jane’s non-consent to the association being transmitted out of the Utility’s privacy domain is sufficient to prevent commutative association. Thus, if Rick were to charge his car’s batteries at Jane’s, the association between them would likewise not be shared with third parties.
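The consent rule in this example can be stated compactly: an association between data subjects may leave the privacy domain only if every involved party consents, so a single refusal blocks export in either direction. A minimal sketch of that check, with hypothetical preference data:

```python
def may_export_association(consents: dict, parties: list) -> bool:
    """Export an association between data subjects only if every
    involved party has consented; one refusal blocks both directions."""
    return all(consents.get(p, False) for p in parties)

# Hypothetical inherited preferences: Jane refuses, Rick consents.
consents = {"jane": False, "rick": True}

assert not may_export_association(consents, ["jane", "rick"])
assert not may_export_association(consents, ["rick", "jane"])  # commutative
```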


 

Task #13:  Specify Internal Privacy Controls

Objective          Specify the Privacy Controls which are mandated by internal Privacy Domain policies.

Example

Use Limitation Internal Privacy Controls

The Utility complies with California Code SB 1476 of 2010 (Public Utilities Code §§ 8380-8381 Use Limitation).

It implements the 2011 California Public Utility Commission (CPUC) privacy rules, recognizing the CPUC’s regulatory privacy jurisdiction over it and third parties with which it shares customer data.

Further, it adopts NIST 800-53 Appendix J’s “Control Family” on Use Limitation – e.g. it evaluates any proposed new instances of sharing PII with third parties to assess whether they are authorized and whether additional or new public notice is required.

Task #14:  Specify Exported Privacy Controls

Objective          Specify the Privacy Controls which must be exported to other Privacy Domains or to Systems within Privacy Domains.

Example

The Utility exports Jane’s privacy preferences associated with her PI to its third-party partner. One of her privacy control requirements is that her EVID not be shared with marketing aggregators or advertisers.

4      Services Supporting Privacy Controls

Privacy controls are usually stated in the form of a policy declaration or requirement and not in a way that is immediately actionable or implementable. “Services” provide the bridge between those requirements and a privacy management implementation by providing privacy constraints on system-level actions governing the flow of PI between touch points.

4.1 Services Needed to Implement the Controls

A set of operational Services is the organizing structure which will be used to link the required Privacy Controls specified in Section 3.3 to the operational mechanisms necessary to implement those requirements.

Eight Privacy Services have been identified, based on the mandate to support an arbitrary set of privacy policies at a functional level. The eight Services can be logically grouped into three categories:

·         Core Policy: Agreement, Usage

·         Privacy Assurance: Security, Validation, Certification, Enforcement

·         Presentation and Lifecycle: Interaction, Access

These groupings, illustrated below, are meant to clarify the “architectural” relationship of the Services in an operational design. However, the functions provided by all Services are available for mutual interaction without restriction.

[Figure: The eight Services grouped into their three categories – Core Policy Services: Agreement, Usage; Privacy Assurance Services: Validation, Certification, Enforcement, Security; Presentation & Lifecycle Services: Interaction, Access.]

A system architect or technical manager should be able to integrate these privacy Services into a functional architecture, with specific mechanisms selected to implement these functions. In fact, a key purpose of the PMRM is to stimulate design and analysis of the specific functions - both manual and automated - that are needed to implement any set of privacy policies. In that sense, the PMRM is an analytic tool.

The PMRM identifies various system capabilities that are not typically described in privacy practices and principles. For example, a policy management (or “usage and control”) function is essential to manage the PI usage constraints established by the individual, information collector or regulation, but such a function is not explicitly named in privacy principles/practices. Likewise, interfaces (and agents) are not explicit in the privacy principles/practices, but are necessary to represent other essential operational capabilities.

Such inferred capabilities are necessary if information systems are to be made “privacy configurable and compliant.”  Without them, enforcing privacy policies in a distributed, fully automated environment will not be possible, and businesses, individuals, and regulators will be burdened with inefficient and error-prone manual processing, inadequate privacy governance and compliance controls, and inadequate compliance reporting.

A “Service”, as used here, is defined as a collection of related functions and mechanisms that operate for a specified purpose. The eight privacy Services defined are Agreement, Usage, Security, Validation, Certification, Enforcement, Interaction, and Access. Specific operational behavior of these Services is governed by the privacy policy and constraints that are configured in a particular implementation and jurisdictional context.  These will be identified as part of the Use Case analysis.  Practice with use cases has shown that the Services listed above can, together, operationally encompass any arbitrary set of privacy requirements.

The functions of one Service may invoke another Service. In other words, functions under one Service may “call” those under another Service (for example, pass information to a new function for subsequent action). In line with principles of Service-Oriented Architecture (SOA)[2], the Services can thus interact in an arbitrary interconnected sequence to accomplish a privacy management task or set of privacy lifecycle requirements. Use cases will illustrate such interactions and their sequencing as the PMRM is used to solve a particular privacy problem. By examining and solving multiple use cases, the PMRM can be tested for applicability and robustness.
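The fragment below sketches this SOA-style interaction, with one Service invoking functions of another to complete a privacy task. The classes and call sequence are illustrative assumptions only, not a normative design.

```python
class SecurityService:
    def unprotect(self, pi: dict) -> dict:
        # Placeholder: decrypt and verify the integrity of the inbound flow.
        return pi

class UsageService:
    def record(self, pi: dict) -> None:
        # Placeholder: store PI together with its governing agreements.
        print("recorded:", pi)

class InteractionService:
    """Receives an inbound PI flow and 'calls' the other Services."""
    def __init__(self, security: SecurityService, usage: UsageService):
        self.security, self.usage = security, usage

    def receive(self, pi: dict) -> None:
        decrypted = self.security.unprotect(pi)  # Interaction invokes Security
        self.usage.record(decrypted)             # ...which hands off to Usage

InteractionService(SecurityService(), UsageService()).receive({"ev_id": "EV-123"})
```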

The table below provides a description of each Service’s functionality and an informal definition of each Service:

AGREEMENT

Functionality: Define and document permissions and rules for the handling of PI based on applicable policies, individual preferences, and other relevant factors; provide relevant Actors with a mechanism to negotiate or establish new permissions and rules; express the agreements for use by other Services.

Purpose: Manage and negotiate permissions and rules.

USAGE

Functionality: Ensure that the use of PI complies with the terms of any applicable permission, policy, law or regulation, including PI subjected to information minimization, linking, integration, inference, transfer, derivation, aggregation, and anonymization over the lifecycle of the use case.

Purpose: Control PI use.

VALIDATION

Functionality: Evaluate and ensure the information quality of PI in terms of Accuracy, Completeness, Relevance, Timeliness and other relevant qualitative factors.

Purpose: Check PI.

CERTIFICATION

Functionality: Ensure that the credentials of any Actor, Domain, System, or system component are compatible with their assigned roles in processing PI; verify the compliance and trustworthiness of that Actor, Domain, System, or system component against defined policies and assigned roles.

Purpose: Check credentials.

ENFORCEMENT

Functionality: Initiate response actions, policy execution, and recourse when audit controls and monitoring indicate that an Actor or System does not conform to defined policies or the terms of a permission (agreement).

Purpose: Monitor and respond to audited exception conditions.

SECURITY

Functionality: Provide the procedural and technical mechanisms necessary to ensure the confidentiality, integrity, and availability of personal information; make possible the trustworthy processing, communication, storage and disposition of privacy operations.

Purpose: Safeguard privacy information and operations.

INTERACTION

Functionality: Provide generalized interfaces necessary for presentation, communication, and interaction of PI and relevant information associated with PI; encompasses functionality such as user interfaces, system-to-system information exchanges, and agents.

Purpose: Information presentation and communication.

ACCESS

Functionality: Enable data-subject Actors, as required and/or allowed by permission, policy, or regulation, to review their PI held within a Domain and propose changes and/or corrections to their PI.

Purpose: View and propose changes to stored PI.

 

4.2 Service Details and Function Descriptions

4.2.1 Core Policy Services

1.     Agreement Service

·       Define and document permissions and rules for the handling of PI based on applicable policies, individual preferences, and other relevant factors.

·       Provide relevant Actors with a mechanism to negotiate or establish new permissions and rules.

·       Express the agreements for use by other Services.

Example

As part of its standard customer service agreement, a bank requests selected customer PI, with associated permissions for use. The customer negotiates with the bank to modify the permissions and then provides the PI to the bank with the modified, agreed-to permissions. This agreement is signed by both parties and stored in an appropriate representation, and the customer is provided a copy.
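One way to make such an agreement usable by the other Services is to store it in a structured, verifiable representation. In this hypothetical sketch a content digest stands in for the signatures both parties would apply:

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class Agreement:
    """A negotiated PI-handling agreement in a structured representation."""
    parties: tuple
    permissions: dict  # the negotiated rules for handling the PI

    def fingerprint(self) -> str:
        # A digest over the canonical form; in this sketch it stands in
        # for the signatures applied by both parties.
        canonical = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

bank_agreement = Agreement(
    parties=("customer", "bank"),
    permissions={"marketing_use": False, "retention_years": 7},
)
print(bank_agreement.fingerprint())  # retained by both parties as evidence
```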

2.     Usage Service

·       Ensure that the use of PI complies with the terms of any applicable permission, policy, law or regulation,

·       including PI subjected to information minimization, linking, integration, inference, transfer, derivation, aggregation, and anonymization,

·       over the lifecycle of the use case.

Example

A third party has acquired individual PI, consistent with agreed permissions for use. Before using the PI, the third party has implemented functionality ensuring that the usage of the PI is consistent with the permissions.

4.2.2 Privacy Assurance Services

3.     Validation Service

·       Evaluate and ensure the information quality of PI in terms of Accuracy, Completeness, Relevance, Timeliness and other relevant qualitative factors.

Example

PI is received from an authorized third party for a particular purpose. The PI is checked to ensure it is sufficiently current for use. 

4.     Certification Service

·       Ensure that the credentials of any Actor, Domain, System, or system component are compatible with their assigned roles in processing PI;

·       Verify that an Actor, Domain, System, or system component supports defined policies and conforms with assigned roles.

Example

A patient enters an emergency room, presenting identifying credentials. Functionality has been implemented which enables hospital personnel to check those credentials against their prior-patient database. Hospital personnel invoke the certification service’s authentication processes.

5.     Enforcement Service

·       Initiate response actions, policy execution, and recourse when audit controls and monitoring indicate that an Actor or System does not conform to defined laws, regulations, policies or the terms of a permission (agreement).

Example

A magazine’s subscription service provider forwards customer PI to a third party not authorized to receive the information. A routine audit of the service provider’s system reveals this unauthorized disclosure practice, alerting the responsible official (the organization’s privacy officer), who takes appropriate action.

6.     Security Service

·       Make possible the trustworthy processing, communication, storage and disposition of privacy operations;

·       Provide the procedural and technical mechanisms necessary to ensure the confidentiality, integrity, and availability of personal information.

Example

PI is transferred between authorized recipients, using transmission encryption, to ensure confidentiality.

Strong identity and authorization management systems are implemented to conform to data confidentiality policies.

4.2.3 Presentation and Lifecycle Services

7.     Interaction Service

·       Provide generalized interfaces necessary for presentation, communication, and interaction of PI and relevant information associated with PI;

·       Encompasses functionality such as user interfaces, system-to-system information exchanges, and agents.

Example:

Your home banking application uses a graphical user interface (GUI) to communicate with you, including presenting any relevant privacy Notices.

8.     Access Service

·       Enable data-subjects, as required and/or allowed by permission, policy, or regulation, to review their PI held within a Domain and propose changes and/or corrections to it.

Example:

A national credit bureau has implemented an online service enabling individuals to request their credit score details and to report discrepancies in their credit histories.


 

4.3 Services Satisfying the Privacy Controls

The Services defined in Section 4.1 encompass the detailed Functions and Mechanisms needed to transform the privacy controls of Section 3.3 into an operational system design for the use case. Since the detailed use case analysis focused on the data flows – incoming, internally generated, outgoing – between Systems (and Actors), the Service selections should be made on the same granular basis.

Task #15:  Identify the Services that conform to the identified privacy controls.

Perform this task for each data flow exchange of PI between systems.

This detailed conversion into Service operations can then be synthesized into consolidated sets of Service actions per System involved in the Use Case.

On further iteration and refinement, the engaged Services can be further delineated by the appropriate Functions and Mechanisms for the relevant privacy controls.

Examples:

Based upon

a)     Internally Generated PI (Current EV location logged by EV On-Board system), and

b)    Outgoing PI (Current EV location transmitted to Utility Load Scheduler System),

convert to operational Services as follows:

“Log EV location”:

Validation       EV On-Board System checks that location is not previously rejected by EV owner

Enforcement   If location is previously rejected, then notify the Owner and/or the Utility

Interaction       Communicate EV Location to EV On-Board System

Usage              EV On-Board System records EV Location in secure storage, together with agreements

“Transmit EV Location to Utility Load Scheduler System (ULSS)”:

Interaction       Communication established between EV Location and ULSS       

Security          Authenticate the ULSS site; secure the transmission

Certification    ULSS checks the credentials of the EV On-Board System

Validation       Validate the EV Location against accepted locations

Usage              ULSS records the EV Location, together with agreements
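Read as a call chain, the Service actions above compose into a single operational flow. The stubs below are a hypothetical sketch of that chaining; each function stands in for one Service action from the “Transmit EV Location” example, and all names are illustrative.

```python
def connect(endpoint: str) -> str:                            # Interaction
    return f"channel:{endpoint}"

def authenticate_and_secure(channel: str) -> str:             # Security
    return channel + ":tls"                                   # secure the link

def check_credentials(actor: str) -> None:                    # Certification
    assert actor == "EV On-Board System"

def validate_location(location: str, accepted: set) -> None:  # Validation
    assert location in accepted

def record(location: str, agreements: list) -> None:          # Usage
    print("ULSS stored", location, "under", agreements)

def transmit_ev_location(location: str) -> None:
    channel = connect("ULSS")
    channel = authenticate_and_secure(channel)
    check_credentials("EV On-Board System")
    validate_location(location, accepted={"customer site", "host site"})
    record(location, agreements=["EV program agreement"])

transmit_ev_location("customer site")
```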

4.4 Define the Technical Functionality and Business Processes Supporting the Selected Services

Each Service is composed of a set of operational Functions, reflected in defined business processes and technical solutions.

The Functions step is critical because it requires either designating the particular business process or technical mechanism implemented to support the Services required in the use case, or making explicit the absence of such a business process or technical mechanism.

4.4.1 Functions Satisfying the Selected Services

Up to this point in the PMRM methodology, the primary focus of the use case analysis has been on the “what” – PI, policies, control requirements, the Services needed to manage privacy. Here the PMRM requires a statement of the “how” – which business processes and technical mechanisms are identified as providing the expected functionality.

Task #16:   Identify the Functions that satisfy the selected Services

Examples

“Log EV Location” (uses the Validation, Enforcement, Interaction, and Usage Services):

Function:    Encrypt the EV Location and Agreements and store in on-board solid-state drive

“Transmit EV Location to Utility Load Scheduler System (ULSS)” (uses the Interaction, Security, Certification, Validation, and Usage Services):

Function:    Establish a TLS/SSL communication between EV Location and ULSS, which includes mechanisms for authentication of the source/destination
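As an illustration of that Function, the fragment below opens a mutually authenticated TLS connection using Python’s standard ssl module. Host names, ports, and certificate files are hypothetical placeholders, not values defined by this specification.

```python
import socket
import ssl

# Verify the ULSS server against the utility's CA, and present the EV
# On-Board System's own certificate for mutual (two-way) authentication.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH,
                                     cafile="utility_ca.pem")
context.load_cert_chain(certfile="ev_onboard.pem", keyfile="ev_onboard.key")

with socket.create_connection(("ulss.example.net", 8443)) as sock:
    with context.wrap_socket(sock, server_hostname="ulss.example.net") as tls:
        tls.sendall(b"EV_LOCATION <payload>")
```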

4.5 Risk Assessment

Task #17:  Conduct Risk Assessment

Objective          Once the requirements in the Use Case have been converted into operational Services, an overall risk assessment should be performed from that operational perspective.

Constraint        Additional controls may be necessary to mitigate risks within Services.  The level of granularity is determined by the Use Case scope. Provide operational risk assessments for the selected Services within the use case.

Examples

“Log EV location”:

Validation       EV On-Board System checks that the location has not previously been rejected by the EV owner

                       Risk: The On-Board System has been corrupted

Enforcement   If the location was previously rejected, notify the Owner and/or the Utility

                       Risk: The On-Board System is not current

Interaction       Communicate EV Location to EV On-Board System

                       Risk: Communication link not available

Usage              EV On-Board System records EV Location in secure storage, together with agreements

                       Risk: Security controls for the On-Board System are compromised

“Transmit EV Location to Utility Load Scheduler System (ULSS)”:

Interaction       Communication established between the EV On-Board System and the ULSS

                       Risk: Communication link down

Security          Authenticate the ULSS site; secure the transmission

                       Risk: ULSS site credentials are not current

Certification    ULSS checks the credentials of the EV On-Board System

                       Risk: EV On-Board System credentials fail verification

Validation       Validate the EV Location against accepted locations

                       Risk: The list of accepted locations is out of date

Usage              ULSS records the EV Location, together with agreements

                       Risk: Security controls for the ULSS are compromised
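
A non-normative sketch of how such risks might be tracked alongside the selected Services follows; the register format and, in particular, the proposed mitigations are illustrative assumptions only, not PMRM requirements.

    # Non-normative sketch: a minimal operational risk register for the
    # selected Services. The risks mirror the examples above; the
    # mitigations are hypothetical placeholders, not PMRM requirements.
    RISK_REGISTER = [
        # (data flow, Service, risk, hypothetical mitigation)
        ("Log EV location", "Validation",
         "On-Board System has been corrupted", "secure boot / integrity checks"),
        ("Log EV location", "Enforcement",
         "On-Board System is not current", "scheduled policy updates"),
        ("Log EV location", "Interaction",
         "Communication link not available", "store-and-forward queue"),
        ("Log EV location", "Usage",
         "On-Board security controls compromised", "key rotation and audit"),
        ("Transmit EV Location to ULSS", "Interaction",
         "Communication link down", "retry with exponential backoff"),
        ("Transmit EV Location to ULSS", "Security",
         "ULSS site credentials are not current", "certificate revocation checks"),
        ("Transmit EV Location to ULSS", "Certification",
         "EV On-Board System credentials fail verification", "re-enrollment"),
        ("Transmit EV Location to ULSS", "Validation",
         "Accepted-locations list is out of date", "periodic list refresh"),
        ("Transmit EV Location to ULSS", "Usage",
         "ULSS security controls compromised", "hardening and monitoring"),
    ]

    def risks_for_flow(flow_name):
        """Return (Service, risk, mitigation) triples for one data flow."""
        return [(s, r, m) for f, s, r, m in RISK_REGISTER if f == flow_name]

    for service, risk, mitigation in risks_for_flow("Log EV location"):
        print(f"{service}: {risk} -> mitigation: {mitigation}")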


 

4.6 Iterative Process

Goal                 A ‘first pass’ through the Tasks above could be used to identify the scope of the Use Case and the underlying privacy policies and constraints. Additional iterative passes would serve to refine the Use Case and to add detail. Later passes could serve to resolve “TBD” sections that were not previously well understood.

Task #18:  Iterate the analysis and refine.

Iterate the analysis in the previous sections, seeking further refinement and detail.

5      PMRM Glossary, plus Operational Definitions for Fair Information Practices/Principles (“FIPPs”)

As explained in the introduction, every specialized domain is likely to create and use a domain-specific vocabulary of concepts and terms that should be used and understood in the context of that domain. The PMRM is no different, and this section contains such terms.

In addition, a number of “operational definitions” are intended to be used in the PMRM to support development of the “Detailed Privacy Use Case Analysis” described in Section 4.  Their use is completely optional, but may be helpful in organizing privacy policies and controls where there are inconsistencies in definitions across policy boundaries or where existing definitions do not adequately express the operational characteristics associated with Fair Information Practices/Principles.

5.1 Operational FIPPs

The following 14 Fair Information Practices/Principles are composite definitions derived from a comprehensive list of international legislative instruments. These operational FIPPs can serve as a sample set, as needed.    

Accountability

Functionality enabling reporting by the business processes and technical systems which implement privacy policies, to the individual or entity accountable for ensuring compliance with those policies, with optional linkages to redress and sanctions.

Notice

Functionality providing Information, in the context of a specified use, regarding an entity’s privacy policies and practices including: definition of the Personal Information collected; its use (purpose specification); its disclosure to parties within or external to the entity; practices associated with the maintenance and protection of the information; options available to the individual regarding the collector’s privacy practices; retention and deletion; changes made to policies or practices; and other information provided to the individual at designated times and under designated circumstances.

Consent

Functionality, including support for Sensitive Information, Informed Consent, Change of Use Consent, and Consequences of Consent Denial, enabling individuals to agree to allow the collection and/or specific uses of some or all of their Personal Information either through an affirmative process (opt-in) or implied (not choosing to opt-out when this option is provided).

Collection Limitation and Information Minimization

Functionality exercised by the information collector or information user to limit the information collected, processed, communicated and stored to the minimum necessary to achieve a stated purpose and, when required, demonstrably collected by fair and lawful means.

Use Limitation

Functionality exercised by the information collector or information user to ensure that Personal Information will not be used for purposes other than those specified and accepted by the individual or provided by law, and not maintained longer than necessary for the stated purposes.

Disclosure

Functionality enabling the release, transfer, provision of access to, use for new purposes, or divulging in any other manner of Personal Information held by an entity, in accordance with notice and consent permissions and/or applicable laws, and functionality making the information collector’s policies known to external parties receiving the information.

Access and Correction

Functionality allowing individuals having adequate proof of identity to discover, and/or to correct or delete, their Personal Information held by an entity, at specified costs and within specified time constraints; and functionality providing notice of denial of access and options for challenging denial when specified.

Security/Safeguards

Functionality that ensures the confidentiality, availability and integrity of Personal Information collected, used, communicated, maintained, and stored; and that ensures specified Personal Information will be de-identified and/or destroyed as required.

Information Quality

Functionality that ensures that information collected and used is adequate for purpose, relevant for purpose, accurate at time of use, and, where specified, kept up to date, corrected or destroyed.

Enforcement

Functionality ensuring compliance with privacy policies, agreements and legal requirements, and giving individuals a means of filing complaints of compliance violations and having them addressed, including recourse for violations of law, agreements and policies.

Openness

Functionality making available to individuals the information collector’s or information user’s policies and practices relating to their management of Personal Information, and for establishing the existence, nature, and purpose of use of Personal Information held about the individuals.

Anonymity

Functionality which renders personal information anonymous so that an individual is no longer identifiable.

Information Flow

Functionality enabling the communication of personal information across geo-political jurisdictions by private or public entities involved in governmental, economic, social or other activities.

Sensitivity

Functionality that provides special handling, processing, security treatment or other treatment of specified information, as defined by law, regulation or policy.

5.2 Glossary

Actor

A data subject, or a human or non-human agent or (sub)system, interacting with PI within a Privacy Domain or System.

Boundary Object

A sociological construct that supports productive interaction and collaboration among multiple communities.

Control

A process designed to provide reasonable assurance regarding the achievement of stated objectives.

Domain Owner

An entity having responsibility for ensuring that privacy controls and privacy constraints are implemented and managed in business processes and technical systems in accordance with policy and requirements.

Incoming PI

PI flowing into a Privacy Domain, or a system within a Privacy Domain.

Internally Generated PI

PI created within the Privacy Domain or System itself.

Outgoing PI

PI flowing out of one system to another system within a Privacy Domain, or to another Privacy Domain.

PI

Personal Information – any data which describes some attribute of, or that is uniquely associated with, an individual.

PII

Personally identifiable information – any (set of) data that can be used to distinguish or trace an individual’s identity.

Privacy Constraint

An operational mechanism that controls the extent to which PII may flow between touch points.

Privacy Control

An administrative, technical or physical safeguard employed within an organization in order to protect PII.

Privacy Domain

A physical or logical area within the use case that is subject to control by Domain Owner(s).

Privacy Management

The collection of policies, processes and methods used to protect and manage PI.

Privacy Management Reference Model and Methodology (PMRM)

A model and methodology for understanding and analyzing privacy policies and their management requirements in defined use cases; and for selecting the technical services which must be implemented to support privacy controls.

(PMRM) Service

A collection of related functions and mechanisms that operate for a specified purpose.

System

A collection of components organized to accomplish a specific function or set of functions having a relationship to operational privacy management.

Touch Point

The intersection of data flows with Privacy Domains or Systems within Privacy Domains.

Appendix A. Acknowledgments

The following individuals have participated in the creation of this specification and are gratefully acknowledged:

Participants:

Peter F Brown, Individual Member

Gershon Janssen, Individual Member

Dawn Jutla, Saint Mary’s University

Gail Magnuson, Individual Member

Joanne McNabb, California Office of Privacy Protection

John Sabo, CA Technologies

Stuart Shapiro, MITRE Corporation

Michael Willett, Individual Member

Appendix B. Revision History

 

Revision | Date       | Editor          | Changes Made
---------|------------|-----------------|---------------------------------------------
WD01     | 2012-01-17 | Peter F Brown   | Transposition of 5 Jan 2012 draft v09 into official template and re-structuring of document
WD01     | 2012-01-19 | John Sabo       | Completion of Objectives section, other minor edits
WD01     | 2012-01-20 | Peter F Brown   | Completion of document structure and other edits
WD01     | 2012-02-01 | Michael Willett | Edits throughout
WD01     | 2012-02-07 | Michael Willett | Accept/reject edits and create clean copy
WD02     | 2012-02-09 | Peter F Brown   | Capture initial updates from discussions and TC meeting
WD02     | 2012-02-15 | Dawn Jutla      | Insert running Examples
WD02     | 2012-02-16 | Michael Willett | Extensive edits; cleanup
WD02     | 2012-02-21 | Peter F Brown   | Formatting edits, plus some clean-up of text
WD02     | 2012-02-23 | Michael Willett | Review/accept Peter’s edits
WD02     | 2012-02-25 | John Sabo       | Additional edits
WD03     | 2012-02-29 | Peter F Brown   | New clean edit following editorial meeting
WD03     | 2012-03-01 | John Sabo       | Additional edits
WD03     | 2012-03-02 | Peter F Brown   | Incorporation of comments from editors
WD03     | 2012-03-03 | Michael Willett | Reviewed Peter’s edits, plus a few new edits
WD03     | 2012-03-06 | Peter F Brown   | Incorporation of final comments from editors
WD04     | 2012-03-16 | Peter F Brown   | This draft

 

 



[1] There is a distinction between ‘personal information’ (PI) and ‘personally identifiable information’ (PII) – see Glossary. However, for clarity, the term ‘PI’ is generally used in this document and is assumed to cover both. Specific contexts do, however, require that the distinction be made explicit.

[2] See, for example, [SOA-RM].