Privacy Concerns in Upcoming Residential and
Commercial Demand Response Systems
Mikhail Lisovich, Devashree Trivedi, and Stephen Wicker
Department of Electrical and Computer Engineering
Cornell University
TRUST Spring Conference, April 2008
Privacy in the Home
Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organizations.
• Privacy of the Person
• Privacy of Personal Behavior
• Privacy of Personal Communications
• Privacy of Data
Privacy in the Home
Interested Parties:
• Police
• Employers
• Marketers
• Criminals

What they can learn:
• Presence
• Dinner times
• Sleep schedule
• Appliances
• Shower times
• ANY activity involving electricity, water, and gas
Privacy in the Home
Q: How real is the threat?
A: Very. Three contributing factors:
• Technology: AMI/AMR, NILM (Nonintrusive Load Monitoring)
• Precedent for Repurposing: drug-production screening involving the Austin Police Department and others.
• Legal Precedent: Smith v. Maryland; US v. Miller
Outline
• Introduction
  – Main Claim
  – Summary of TRUST Efforts
• Background
  – Brief Overview
  – Interested Parties
  – Abuse Cases
  – Privacy Metric
• Experiment
  – Overview
  – Experimental Setup
  – Algorithms
  – Results
• Discussion
  – Algorithm effectiveness
  – Privacy Implications
Outline
• Introduction
  – Motivation
  – Summary of TRUST Efforts
Motivation

• Next-generation demand-response architectures are increasingly being deployed by major utilities across the US.
• Advantages: cost savings in power generation, increased grid reliability, and new modes of consumer-utility interaction.
• Disadvantage: the increased availability of data creates or exacerbates issues of privacy and security.

Our Main Claim: In a lax regulatory environment, the detailed household consumption data gathered by advanced metering projects can and will be repurposed by interested parties to reveal personally identifying information such as an individual's activities, preferences, and even beliefs.
TRUST Efforts

Cornell and the Berkeley School of Law have focused on the privacy risks arising from the collection of power consumption data in current and future demand-response systems.

• Berkeley: law & policy aspects
  – D. Mulligan and J. Lerner have written an article in the Stanford Technology Law Review chronicling the evolution of court opinion toward energy data privacy and calling for its constitutional protection.
  – Collaborated with the California Public Utilities Commission (CPUC) to develop a set of draft guidelines for a secure and privacy-preserving demand-response infrastructure.

• Cornell: technological aspects
  – Highlighted the importance of NILM algorithms for extrapolating activity.
  – Proposed a formal way of evaluating privacy risks.
  – Conducted a proof-of-concept technical study.
Outline
• Introduction
  – Motivation
  – Summary of TRUST Efforts
• Background
  – Brief Overview
  – Interested Parties
  – Abuse Cases
Technical Overview

• Advanced Metering Infrastructure (AMI)
  – Collects time-based data at daily, hourly, or sub-hourly intervals
Technical Overview (contd.)

• Non-Intrusive Load Monitoring (NILM)
  – NILM is the fundamental tool for extrapolating activity.
Players/Abuse Cases

• Law Enforcement
  – Detecting drug production.
  – Supreme Court boundaries (such as they are):
    1. Kyllo v. US - information obtained, using sensors, about activity within the home that would not otherwise have been available without intrusion constitutes a search.
    2. Smith v. Maryland, US v. Miller - records freely given to third parties are not protected under the 4th Amendment.
• Employers
  – Employee tracking
• Marketing Partners
• Criminals
Outline
• Introduction
  – Motivation
  – Summary of TRUST Efforts
• Background
  – Brief Overview
  – Interested Parties
  – Abuse Cases
  – Privacy Metric
Privacy Metric

Goal: a metric which associates the degree of data availability (accuracy of readings, time resolution, types of readings, etc.) with potential privacy risks, providing a robust and reliable indicator of overall privacy.

Extrapolating activity may be thought of in two stages:
– First stage: NILM, in combination with data from other sensors, is used to extract appliance usage, track an individual's position, and match particular individuals to particular observed events.
– Second stage: the intermediate data is combined with contextual data (such as the number/age/sex of individuals in the residence, tax and income records, and models of typical human behavior).

Performance Evaluation:
– First stage: at most, the gathered information will reveal everything that is happening in the house (precise information about all movements, activities, and even the condition of appliances).
– Second stage: it is more difficult to define an absolute performance metric, since the number of specific preferences and beliefs that can be estimated is virtually limitless. To develop a comprehensive privacy metric, one needs to carefully define a list of 'important' parameters, basing importance both on how fundamental a parameter is (how many other parameters may be derived from it) and on home/business owners' expectations of privacy.

Summary: the list of important second-stage parameters forms the evaluation criteria. Algorithms for estimating the parameters, along with the corresponding data requirements, provide a method for evaluating the sufficiency of available data. Together, these provide a metric for how much information may potentially be disclosed by a particular monitoring system.
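As an illustration only, the following sketch tallies such a disclosure metric as the weighted fraction of 'important' parameters that the available data suffices to estimate. The parameter list, the weights, and the sufficiency flags are hypothetical placeholders, not values taken from the study.

    # Hypothetical sketch of the proposed privacy metric: each 'important'
    # second-stage parameter carries an importance weight, and the available
    # data either does or does not suffice to estimate it. The disclosure
    # score is the weighted fraction of parameters the data could reveal.
    # (Parameter names and weights are placeholders, not study values.)
    IMPORTANT_PARAMETERS = {
        "presence": 1.0,
        "sleep_wake_cycle": 0.8,
        "appliance_usage": 0.6,
        "meal_times": 0.4,
    }

    def disclosure_score(data_sufficient):
        """Weighted fraction of important parameters the available data can reveal.

        data_sufficient maps parameter name -> bool: whether an estimation
        algorithm, given the system's data resolution, can recover it.
        """
        total = sum(IMPORTANT_PARAMETERS.values())
        revealed = sum(w for name, w in IMPORTANT_PARAMETERS.items()
                       if data_sufficient.get(name, False))
        return revealed / total

    # Second-by-second data plus NILM might reveal everything; hourly
    # averages might reveal only presence and the sleep/wake cycle.
    print(disclosure_score({"presence": True, "sleep_wake_cycle": True,
                            "appliance_usage": True, "meal_times": True}))  # 1.0
    print(disclosure_score({"presence": True, "sleep_wake_cycle": True}))   # ~0.64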
Outline
• Introduction
  – Motivation
  – Summary of TRUST Efforts
• Background
  – Brief Overview
  – Interested Parties
  – Abuse Cases
  – Privacy Metric
• Experiment
  – Overview
  – Experimental Setup
  – Algorithms
  – Results
Experiment

• Monitored a student residence continuously over a period of two weeks.
• Gathered electrical data from the breaker panel and visual data from a camera.
• Camera logs included activities such as:
  – Turning household appliances on or off
  – Entering or leaving the residence
  – Sleeping
  – Preparing meals
  – Taking a bath
Experimental Setup
[Figures: floorplan of the residence; data gathering setup]
Setup Photos
Algorithm: Details

• Parameters to be estimated:
  – Presence/absence, number of individuals
  – Appliance usage
  – Sleep/wake cycle
  – Miscellaneous events - breakfast, dinner, shower
• Sample Interval:
Participant Privacy
Evaluation Criteria

Compare behavior extraction results against reference results from the camera data. Two metrics (sketched in code below):

• Event based:
  1. Define the cutoff threshold T_thresh.
  2. For each parameter, examine the sequence of turn-on/turn-off events on both the reference and estimated intervals.
  3. If a camera event occurs but a corresponding electrical event does not occur within T_thresh seconds, declare a Failure to Detect.
  4. If an electrical event occurs but a corresponding camera event does not occur within T_thresh seconds, declare a Misdetection.

• Global perspective:
  – Compute the correctly classified percentage of the reference interval.
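A minimal sketch of these two metrics follows, assuming plain lists of event times and equal-length 0/1 reference and estimated signals; the example event times and the T_thresh value are hypothetical.

    def event_metrics(reference_events, estimated_events, t_thresh):
        """Event-based metric: count Failures to Detect and Misdetections.

        reference_events, estimated_events: lists of event times in seconds
        for one parameter (e.g. refrigerator switching events).
        """
        failures_to_detect = sum(
            1 for r in reference_events
            if not any(abs(r - e) <= t_thresh for e in estimated_events))
        misdetections = sum(
            1 for e in estimated_events
            if not any(abs(e - r) <= t_thresh for r in reference_events))
        return failures_to_detect, misdetections

    def correctly_classified_fraction(reference_signal, estimated_signal):
        """Global metric: fraction of the reference interval classified correctly.

        Both arguments are equal-length 0/1 sequences on the same time grid
        (e.g. presence per second).
        """
        matches = sum(1 for r, e in zip(reference_signal, estimated_signal) if r == e)
        return matches / len(reference_signal)

    # Hypothetical usage with T_thresh = 120 seconds:
    ftd, md = event_metrics([100, 4000, 9000], [130, 8950, 20000], t_thresh=120)
    print(ftd, md)  # 1 Failure to Detect (at t=4000), 1 Misdetection (at t=20000)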
Algorithm: Implementation 1

• Accumulate raw data
• Find switching events (a sketch of this step follows below)
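The accompanying plots are not reproduced in this transcript. As a rough, hypothetical illustration of the switching-event step, the sketch below flags large step changes in a sampled total-power signal; the one-second sample period and the 60 W threshold are assumptions, not parameters from the study.

    def find_switching_events(power_watts, sample_period_s=1.0, min_step_watts=60.0):
        """Detect switching events as large steps in aggregate power.

        power_watts: sequence of total-power readings, one per sample.
        Returns (time_seconds, delta_watts) pairs: positive deltas are
        turn-on events, negative deltas are turn-off events.
        """
        events = []
        for i in range(1, len(power_watts)):
            delta = power_watts[i] - power_watts[i - 1]
            if abs(delta) >= min_step_watts:
                events.append((i * sample_period_s, delta))
        return events

    # Hypothetical trace: a ~150 W appliance turns on at t=3 s and off at t=6 s.
    trace = [40, 42, 41, 190, 192, 191, 43, 41]
    print(find_switching_events(trace))  # [(3.0, 149), (6.0, -148)]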
Algorithm: Implementation 2

• Match events to appliances
• Use heuristics to estimate parameters of interest (a sketch of both steps follows below)
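Continuing the illustration, the sketch below matches detected events to appliances by the size of the power step and applies a toy presence heuristic. The appliance signatures, the matching tolerance, and the presence rule are hypothetical stand-ins, not the heuristics used in the study.

    # Hypothetical appliance signatures: nominal power step in watts.
    SIGNATURES = {"refrigerator": 140.0, "microwave": 1100.0, "heat_lamp": 250.0}

    def match_event(delta_watts, tolerance=0.2):
        """Match a switching event to the appliance with the closest signature,
        within a relative tolerance; return None if nothing matches."""
        best, best_err = None, tolerance
        for name, nominal in SIGNATURES.items():
            err = abs(abs(delta_watts) - nominal) / nominal
            if err < best_err:
                best, best_err = name, err
        return best

    def estimate_presence(events, window_s=3600.0):
        """Toy heuristic: a window is 'occupied' if it contains any event not
        attributed to the refrigerator (which cycles on its own)."""
        times = [t for t, d in events if match_event(d) != "refrigerator"]
        return sorted({int(t // window_s) for t in times})  # occupied hour indices

    events = [(120.0, 139.0), (3700.0, 1080.0), (7300.0, -141.0)]
    print(match_event(1080.0))        # 'microwave'
    print(estimate_presence(events))  # [1] -> only hour 1 is flagged as occupied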
Results
[Figure: electrical data (seconds plot) and estimated presence/sleep-wake intervals over Days 1-4, comparing the reference and estimated sleep/wake and presence signals.]
Performance

For the training data set, 101 of approximately 104 refrigerator events (more than 97%) were correctly classified. Results were similar (97%) for the experimental set.
Outline
• Introduction
  – Motivation
  – Summary of TRUST Efforts
• Background
  – Brief Overview
  – Interested Parties
  – Abuse Cases
  – Privacy Metric
• Experiment
  – Overview
  – Experimental Setup
  – Algorithms
  – Results
• Discussion
  – Algorithm effectiveness
  – Privacy Implications
Discussion
• Our behavior extraction algorithm was a proof-of-concept. Future algorithms will show vast performance improvements.
• Useful data can be extracted with less potent technology.
  – Hourly power averages such as the ones produced by California's AMI system may also be used to determine presence and sleep cycles, although to a coarser degree (see the sketch below). Major appliances with a large steady-state power consumption (e.g., heat lamps) can also be identified.
• Future concerns are not limited to the performance of these systems at the level of an individual household.
  – The algorithms are fully automated, so analysis may be done on extremely large scales.
  – Easy access to such personal and demographic information will inevitably generate a market for it!
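As a rough illustration of the hourly-averages point above, the sketch below downsamples a per-second power trace to hourly means and flags hours well above a baseline as occupied. The baseline and margin values are assumptions chosen only for the example.

    def hourly_presence(power_watts, baseline_watts=60.0, margin_watts=40.0):
        """Flag each full hour as 'occupied' if its average power exceeds
        baseline_watts + margin_watts.

        power_watts: per-second aggregate power readings.
        Returns one boolean per full hour of data.
        """
        flags = []
        for start in range(0, len(power_watts) - 3599, 3600):
            hour = power_watts[start:start + 3600]
            flags.append(sum(hour) / len(hour) > baseline_watts + margin_watts)
        return flags

    # Hypothetical fragment of a day: a quiet hour followed by an hour of activity.
    trace = [55.0] * 3600 + [180.0] * 3600
    print(hourly_presence(trace))  # [False, True]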
Discussion (contd.)
• Data mining of hourly usage data by utilities should be carefully monitored and regulated.
  – The authors of the report to the California Energy Commission advise that utilities should become subject to more stringent rules on the release and re-use of personal data as data mining practices develop and new information in which consumers have a reasonable expectation of privacy is exposed.
• Our paper fleshes out the details of this recommendation:
  1. Our discussion of interested entities and motivations shows that repurposing of consumption data creates real privacy concerns for the consumer, and by extension highlights the reasonable expectations of privacy that he or she should develop.
  2. Our technical discussion and proof-of-concept demonstration show what data mining may be capable of, illustrating the extent to which consumer privacy can be violated.
  3. Finally, our privacy metric framework, in combination with the technical discussions, allows one to more precisely define the permitted and prohibited uses of data mining.
Thank you for your time!

Questions?
Conclusion
"Where, as here, the Government uses a device that is not in general public use, to explore details of the home that would previously have been unknowable without physical intrusion, the surveillance is a 'search' and is presumptively unreasonable without a warrant."
- Justice Scalia, Kyllo v. US