CSA Congress and the Privacy Academy

Last month, Enterprivacy consultant Jason Cronk attended the CSA Congress and the Privacy Academy in San Jose. It was a historic event, with the Cloud Security Alliance and the IAPP recognizing their shared interest in cloud security and privacy by holding a combined conference. Jason blogged about his visit for the Collaborista Blog. You can read that story here.

Jason is a thought leader in the privacy space, a member of the IAPP Faculty, and a consultant, lecturer and writer. To bring Jason’s expertise to your organization, be sure to contact us today.

1NCEMAIL, the disposable email app

How many times, after completing a purchase at a store, have you been offered the option of providing your email address so they can email you a receipt? You wonder to yourself, “What are you doing with my email? Are you going to spam me?” Unbeknownst to you, many of these retailers may be selling your information to data brokers or tracking you across purchases. This is especially a concern where the email is collected by the payment processor (like Square), which may be building a dossier on you and your sales history.

The solution? Enter 1NCEMAIL, a disposable email alias app for your mobile phone. Just released for Android devices, 1NCEMAIL gives you a disposable, one-time-use email address to give to merchants. You get your receipt, and the email becomes worthless. By downloading 1NCEMAIL, you get 10 free aliases to try; additional aliases can be purchased for as little as half a cent apiece using the Internet currency Bitcoin.
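The mechanics of a one-time alias service are easy to illustrate. The sketch below is purely hypothetical (my illustration, not 1NCEMAIL's actual implementation): it mints a random alias that forwards to your real address, then invalidates the alias after a single delivery.

```python
import secrets


class AliasStore:
    """Toy disposable-alias registry (illustrative only)."""

    def __init__(self, domain):
        self.domain = domain
        self.aliases = {}  # alias -> real address; removed after first delivery

    def create_alias(self, real_address):
        # Random local part, so the alias reveals nothing about the owner
        alias = f"{secrets.token_hex(6)}@{self.domain}"
        self.aliases[alias] = real_address
        return alias

    def deliver(self, alias, message):
        """Forward one message, then burn the alias."""
        real = self.aliases.pop(alias, None)
        if real is None:
            return None  # alias already used, or never existed
        return (real, message)


store = AliasStore("example.com")
alias = store.create_alias("me@example.org")
assert store.deliver(alias, "receipt") == ("me@example.org", "receipt")
assert store.deliver(alias, "spam") is None  # second use fails: the alias is spent
```

The point of the design is that the merchant only ever sees the throwaway alias, so nothing useful remains to sell or correlate once the receipt arrives.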

The app is simple and easy to use. Install it at https://play.google.com/store/apps/details?id=com.oncemail.disposableemailapp.

You can read 1NCEMAIL’s privacy policy at http://1ncemail.com/index.cfm?controller=home&action=privacy

1NCEMAIL is a product of Foryte Web Services, Inc., the parent company of Enterprivacy Consulting Group.

It’s all about the location…

Following a Gartner report on the different types of location qualifiers for information (physical, legal, logical and political), Enterprivacy Consulting Group consultant R. Jason Cronk guest blogs on the topic on Intralinks, Inc.’s CollaboristaBlog.

CollaboristaBlog is Intralinks’ enterprise collaboration blog, dedicated to the latest news, advice, trends and research on data protection, compliance, cyber security, data privacy, technology topics and more.
Jason will continue his discussion next month, detailing the importance of user-managed keys.

Data Privacy Day 2014

In honor of Data Privacy Day 2014, Enterprivacy Consulting Group is proud to announce the publication by our very own consultant, Jason Cronk, of a new white paper on Privacy by Design (PbD) and Privacy Engineering. Published by the Ontario Information and Privacy Commissioner’s office and coauthored with Commissioner Ann Cavoukian and Stuart Shapiro of the MITRE Corporation, the paper explores the burgeoning field of Privacy Engineering and how it is supportive of the 7 Foundational Principles of Privacy by Design.

The paper can be downloaded here.
Summary

This paper surveys the emerging discipline of privacy engineering. Privacy engineers require multidisciplinary knowledge and skills. To be effective, they need to have an understanding of both technical and non-technical considerations. Privacy engineers are tasked with managing risks. The paper reviews several risk models that they can adopt, some based on Fair Information Practice Principles and legal compliance, others stemming from user-centric harms and integrity of context. Privacy engineers must then apply systematic risk analyses, using tools such as privacy impact assessments, to measure and quantify identified risks. Finally, privacy engineers must design controls to mitigate those risks, including privacy-respecting architectures, effective privacy policies, and a range of data management methods including minimization, anonymization, aggregation, and the use of privacy-enhancing technologies.

There is a growing understanding that innovation and competitiveness must be approached from a “design-thinking” perspective – namely, a way of viewing the world and overcoming constraints that is at once holistic, interdisciplinary, integrative, creative, innovative, and inspiring. Privacy, too, must be approached from the same design-thinking perspective. Privacy and data protection should be incorporated into networked data systems and technologies by default, and become integral to organizational priorities, project objectives, design processes, and planning operations. Ideally, privacy and data protection should be embedded into every standard, protocol, and data practice that touches our lives. This will require skilled privacy engineers and common methodologies and tools. This paper seeks to promote a broader understanding and deeper practice of privacy engineering.

Enterprivacy Consulting Group is dedicated to promoting the nascent field of privacy engineering and helping corporations embed privacy into their product and service offerings. If you have questions or your company needs help in this area, don’t hesitate to contact us.

Privacy After Hours in Miami, Florida

Enterprivacy Consulting Group is pleased to announce that we will host the IAPP Privacy After Hours event in Miami, Florida this year in honor of Data Privacy Day. Details below:

Date: January 28, 2014 (Data Privacy Day)
Time: 5pm – 7pm
Location: The News Lounge
Address: 5580 NE 4th Court, Suite 4b & 5a
The Courtyard @ 55th Street Station
Miami, FL 33137

Please RSVP to Leah Simon at pah@privacyassociation.org.

There will be a door prize provided by the IAPP.

We chose Miami for our event because that weekend, prior to Data Privacy Day, we will be in Miami attending the North American Bitcoin Conference. Be sure to drop by and say hello to us there.

The perils of investing in technology without considering privacy.

Back in November of 2013, Snapchat, the popular image and video sharing application, spurned a $3 billion offer from Facebook. Facebook, losing ground among teens, is eager to purchase services that appeal to this audience. Though the offer was clearly way above Snapchat’s market value, it made sense in Facebook’s portfolio. Fast forward to Christmas 2013: Gibson Security, a white hat security firm, released code that reverse engineers Snapchat’s API, making it possible to pull the entire database of Snapchat users in less than a day. A few days later, a hacker did just that. Now, it’s not as if Snapchat wasn’t aware of the potential. In fact, Gibson Security had warned them of the problem months earlier! Snapchat blithely ignored them.

Did Facebook narrowly miss another privacy faux pas?
If Facebook had acquired Snapchat, what would the ramifications have been under its FTC consent decree?

There are two lessons to be learned from this incident.

Lesson 1: People (even teens) want privacy!
Snapchat provides limited functionality beyond what most smartphones can already do. It allows you to snap pictures (and video), add a comment and then send it to friends. Guess what: almost every smartphone has MMS (multimedia messaging service) capabilities. So what did Snapchat add? The illusion of privacy. That’s right, I said illusion. Why? First off, the major selling point of Snapchat was the fact that “snaps” you sent were ephemeral and would be deleted from the recipient’s phone within a few seconds of receipt. This is why people flocked by the millions to Snapchat: not to replicate the existing functionality of their phones, but for the perceived privacy benefit of not having to rely on the end user to delete your picture voluntarily. I say this was an illusion because with another simple app, SnapCapture, the recipient could preserve the picture you sent. Equally important, Snapchat didn’t actually delete your images. Furthermore, Snapchat has the ability to intercept data for law enforcement before the recipient opens the image. At least they don’t save the images on their server after they’ve been delivered.

Bottom line: Consumers want privacy but most services fail to deliver it.

Lesson 2: Do your privacy due diligence.
“Snapchat has raised about $73 million in funding to date from investors including Lightspeed Venture Partners, Benchmark Capital, Institutional Venture Partners, SV Angel and General Catalyst Partners.” — USA Today. I understand that most app developers and small businesses have more to focus on than privacy. They have to build their product, make it work, market it, grow it, etc. However, that is NO excuse once you hit the big time and raise millions of dollars from investors. Privacy is a risk. It is a legal risk. It is a compliance risk. It is a MARKET risk. If you’re going to base your product on providing people privacy, you had better be damn sure you provide people privacy. History is replete with failed security and privacy products. Even more on the hook are the investors and venture capital firms. Where were they in their due diligence? Did they completely ignore privacy and security? Were they blinded by the astounding growth in Snapchat’s user base and the faux privacy it was offering?

Fortunately, if you’re reading this, you may have an advantage. If you’re considering investing in a startup or growth company, get a privacy and security due diligence analysis done. Enterprivacy Consulting Group can examine the market, the application, the regulatory environment and the customer demographics, and provide you with a full analysis: not necessarily to stall your investment, but to make sure the company is on the right track and won’t end up the next Snapchat.

Contact us today.

Recent events

Enterprivacy Consulting Group’s lead privacy engineering consultant, R. Jason Cronk, has been busy recently. Next week, on November 19th, 2013, he will be speaking at the Intel Security Conference: “Dealing with the Increasing Complexity of Security and Privacy.” His talk will cover designing for privacy, its principles and practices; he will debut his bubble theory of privacy and discuss Privacy by Design and privacy engineering. In addition, he’ll discuss two projects that built privacy into their systems: IBM’s Sensemaking engine and the Estonian ITL’s use of the Sharemind secure multi-party computation system to calculate financial metrics of the information technology industry in Estonia.

Jason has also been working on several papers and was recently named to the IAPP Faculty. He’ll be training people looking to get their CIPP certification. Last month Jason spoke on privacy engineering to an enthusiastic group at the IAPP Privacy Academy in Seattle, at a pre-conference workshop with Dr. Stuart Shapiro.

Is 2013 the Year of the Privacy Engineer?

[Previously published on the IAPP Privacy Perspectives]

Nascent is a term I often use to describe the field of privacy engineering. Only this fall did the first students of Carnegie Mellon’s Master of Science in Information Technology—Privacy Engineering begin the newly formed one-year program. And only in the past year or so have Google, Microsoft and other techno-centric firms been advertising openings with variations of privacy engineer in the title. Though the term privacy engineering has been around since at least 2001, only recently has the computer science community tried to use it in a concrete and systematic way.

So what is privacy engineering?

Simply put, it is the inclusion and implementation of privacy requirements as part of systems engineering. Those requirements may be functional or nonfunctional. In other words, privacy may be a necessary function of the system (think TOR as an example) or it may be a beneficial additive but not absolutely essential for the system to operate. Most privacy requirements fall in this latter category.

The goal of the privacy engineer is to create and follow a repeatable process, such that application of the process to a given system under the same conditions will lead to consistent results. The first step in this process is to identify the privacy requirements that should be applied. This is done by incorporating standard or baseline privacy requirements and by looking at the privacy risks, not to the organization but rather to the subject of information held by the system. Common frameworks, such as Ryan Calo’s Subjective/Objective harms or Daniel Solove’s Taxonomy of Privacy, can be used to identify potential privacy problems and then a determination would need to be made of probability of occurrence and severity of impact—risk being a function of probability and severity.
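As a toy illustration of that last point (my own sketch, not a method prescribed by the article), risk as a function of probability and severity can be scored with a simple ordinal product, then used to rank identified privacy problems. The problem categories below are hypothetical findings loosely named after Solove-style taxonomy entries, and the scales are arbitrary assumptions:

```python
# Ordinal scales for estimated probability of occurrence and severity of
# impact. These three-point scales are an illustrative assumption, not a
# standard; real assessments often use finer-grained or calibrated scales.
LEVELS = {"low": 1, "medium": 2, "high": 3}


def risk_score(probability, severity):
    """Risk as a function of probability and severity (here, their product)."""
    return LEVELS[probability] * LEVELS[severity]


# Hypothetical findings: (problem category, probability, severity)
findings = [
    ("surveillance", "medium", "high"),
    ("secondary use", "high", "medium"),
    ("disclosure", "low", "high"),
]

# Rank findings so mitigation effort goes to the highest-risk items first
ranked = sorted(findings, key=lambda f: risk_score(f[1], f[2]), reverse=True)

assert risk_score("high", "medium") == 6
assert ranked[-1][0] == "disclosure"  # lowest-scoring finding ranks last
```

The output of such a ranking is only a starting point; as the article notes, each scored risk still needs controls designed against it, and the analysis must be repeated once controls are in place.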

In identifying risks, it is important that the engineer look at the entirety of the system and how people will interact with it. Is there a mutuality of expectations? How do cognitive biases and human irrationality affect their privacy? Does the user experience enhance or detract from privacy? Finally, is the purported benefit of the system legitimate and proportional to the privacy risks? It may be a case where the privacy engineer needs to step back and say, “This isn’t worth the risks and no control can sufficiently mitigate the problems I’ve found.” Assuming this isn’t the case, the next step of the privacy engineer is to identify controls to address the risks.

Controls come in several forms. The one most familiar to the reader will be policy controls, which dictate when and what information can be collected, how it is to be stored and other internal rules that should be followed when dealing with data flowing in, out and within the organization. There are also a host of technical point controls—such as data minimization, encryption, randomization—which can be applied to increase privacy within the system. Finally, and less common, are architectural controls—primarily anonymization and decentralization—which serve to lessen the probability that a harm occurs.
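To make "technical point controls" concrete, here is a small sketch (my example, not drawn from the article) of two such controls: data minimization by coarsening an IP address, and pseudonymization of a direct identifier:

```python
import hashlib


def minimize_ip(ip):
    """Data minimization: zero the last octet so the stored address no longer
    pinpoints a single host (a common coarsening technique for IPv4 logs)."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)


def pseudonymize(user_id, salt):
    """Point control: replace a direct identifier with a salted hash so
    records can still be linked without storing the raw identifier.
    The salt must be kept secret, or the hashes can be reversed by
    brute-forcing likely identifiers."""
    digest = hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()
    return digest[:16]


assert minimize_ip("203.0.113.42") == "203.0.113.0"
# Same input and salt always link; different salts do not
assert pseudonymize("alice", "s1") == pseudonymize("alice", "s1")
assert pseudonymize("alice", "s1") != pseudonymize("alice", "s2")
```

Neither control is a complete answer on its own; as with the architectural controls above, they reduce the probability or severity of a harm rather than eliminating it, which is why the risk analysis must be repeated after controls are applied.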

Of course, once all these controls are in place, a new risk analysis must be done. It is an iterative process. Consider an online bookstore of sensitive topics that identifies a risk to clients of being adversely associated with the books they order. The store then decides not to collect names from purchasers as a control (anonymization). However, this decision has created a new risk. When delivered to the purchaser, other residents at the delivery address may open the delivery because the recipient is unidentified, thus exposing the purchaser at home. This is an instance of a risk-risk trade-off, and this risk must also be managed. Further controls, such as an optional identifier, should then be considered. Even when controls mitigate risks without any side effects, enough residual risk may remain to warrant additional mitigation.

With the daily barrage of news accounts of privacy stumbles and a public growing weary of the constant assault on their information, the role of the privacy engineer is becoming necessary for more and more forward-thinking organizations. It is no longer sufficient in the privacy profession to mitigate for organizational and compliance risks. Personnel must be in place to identify user-centric risks and help design solutions that mitigate those risks and provide the organization the information it needs to operate. The privacy engineer is that person.