Privacy Ethics and Wrongful Collection of Data

A Q&A with Andy Sambandam of Clarip, Inc.

Wrongful collection of private data often occurs unwittingly on the part of both consumers and the companies tracking them. I talked to Andy Sambandam, founder and CEO of Clarip, Inc., a software-as-a-service data privacy platform, about how individuals and organizations can be more savvy about the data collection embedded in everyday internet usage and the risks associated with it.


Don’t Ring the (False) Alarm: When a Data Loss Event Isn’t a Breach

A Q&A with Darin Bielby and Jeremy Batterman of Navigant Consulting’s Information Security & Investigations Practice
During a recent Risk and Insurance Management Society (RIMS) panel discussion, Navigant Managing Director Darin Bielby asserted that 50 percent of the organization’s information security forensic investigations yield evidence that enables legal counsel to advise companies that a data breach did not occur. These findings typically demand no further action or notification about the event, though some organizations proceed with additional precautionary measures. I talked with Bielby and his colleague Jeremy Batterman about the reality of data privacy events and what forensic investigators are seeing.


Using Big Data to Protect Against Cyber Risk

A Q&A with Lance Forbes of LemonFish Technologies
Of all Big Data’s capabilities, the ability to proactively detect cyber breach events is especially intriguing. I spoke with Lance Forbes, chief scientist of LemonFish Technologies, to find out more about how analytics can be used to find lost data across the internet.


Data Breach Events: A Plaintiff Perspective

A Q&A with John Yanchunis of Morgan & Morgan
The legal landscape around data loss is rapidly evolving, and with major events such as the Anthem breach changing the game on a daily basis, it can be a challenge to keep up with the courts’ current thinking. I spoke with plaintiff attorney John Yanchunis of Morgan & Morgan about some of the most recent developments he’s observed.


Microsoft on the Frontier for Legal Privacy Protections

A Q&A with Geff Brown of Microsoft
“Privacy is without a doubt the most exciting area of the law to be involved in right now,” says Geff Brown, assistant general counsel in regulatory affairs at Microsoft. I asked him about the current legal climate for consumers and tech companies around privacy issues and what Microsoft is doing to proactively protect user information.


Mobile App Data Security

A Q&A with Jack Walsh of ICSA Labs
With the proliferation of mobile devices, businesses from all sectors are now offering apps for consumer and employee use. However, data insecurity, the potential for lost personal information and a lack of developer experience pose a major liability for companies providing mobile apps. I talked to Jack Walsh, mobility programs manager of ICSA Labs, about the major security and privacy issues connected to mobile apps.

So many companies are offering mobile apps these days. What are some of the key issues that risk managers should be aware of?
One of the main problems is that apps are typically developed by a third party. In the olden days, when everyone simply used a computer, applications came from large, well-known and respected companies such as Microsoft and IBM, which followed lengthy processes and procedures to better ensure safety, security and functionality. With hundreds of thousands of apps available now, we have many smaller players developing mobile apps, and processes and procedures can differ significantly from one developer to the next. Even if you choose someone with experience, this is a relatively young field and it’s challenging to ensure that things are done properly. The second problem is that no one is able to test all of these apps—part of the trouble is that there are currently few good tools available. The third concern is that apps need to comply with whatever regulatory guidance applies to the app-developing company’s industry, and that is going to remain a concern for many companies.

What kind of security risks are companies facing now?
An app can actually be malicious, doing something it was either intentionally or inadvertently made to do. For instance, there are disreputable ad networks out there that have served up malware, and this is becoming more prevalent. Another area to watch is third-party libraries—a developer who doesn’t want to build everything from scratch might link in an existing library outside of their control, and that can get you into trouble as well. Beyond that, apps might contain vulnerable, dead or recycled code; they might also be overpermissioned or underpermissioned.

Is there a ‘case study’ you can offer as a lesson?
There have been issues with regard to libraries—there are people out there who intentionally create a harmless-looking library that contains hidden malware. It’s not an everyday occurrence, but we’re seeing it more often. In Russia and China we’ve also seen many apps that surreptitiously send SMS messages to premium numbers, billed to the user without the user’s knowledge. This has happened multiple times recently. There are a number of studies of the top 100 free and paid Apple iOS and Google Android apps that describe how those apps—many of which we all use every day—are not safeguarding users’ sensitive information.

What can a company do to mitigate its mobile app risk exposure?
Well, first and foremost, they should ask questions of their developer and find out how the app is being built and what sort of processes and procedures the developer follows. Many times a developer will give what seems like an acceptable answer—“I use a secure platform for app development”—but the risk manager should dig deeper. Press them to find out what’s being done beyond that in the way of testing. Who else, if anyone, reviews the code or tests the resulting app for the developer? It’s always easier for someone else to spot mistakes than the person who wrote the code. Most importantly, does the app adequately protect sensitive information when it stores and transmits it? Does it use encryption and other storage and data-transfer safeguards? Is it possible to defeat the authentication mechanism? Can other apps get access to the data through improper use of APIs in one or more libraries outside the direct control of the app developer? There are best practices out there to avoid these problems. Note, too, that testing should not be a one-time thing. Apps need to be tested throughout their life cycle: any time the code is upgraded or the operating system gets an update, the app should be looked at again. There’s no such thing as too much testing.
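One of the checks Mr. Walsh describes—whether an app is overpermissioned or underpermissioned—can be automated by comparing the permissions an app declares against those its code actually exercises. The sketch below is illustrative only; the permission names and the fictitious app are assumptions for the example, not output from any real testing tool.

```python
# Hypothetical sketch: flag over- and under-permissioning by comparing the
# permissions an app declares in its manifest against those its code uses.
# The permission names and example app below are made up for illustration.

def audit_permissions(declared: set[str], used: set[str]) -> dict[str, set[str]]:
    """Return permissions that are declared but never used (overpermissioned)
    and permissions used but never declared (underpermissioned)."""
    return {
        "overpermissioned": declared - used,    # candidates to remove from the manifest
        "underpermissioned": used - declared,   # likely runtime failures or review flags
    }

# Audit of a fictitious app.
declared = {"INTERNET", "CAMERA", "READ_CONTACTS", "ACCESS_FINE_LOCATION"}
used = {"INTERNET", "CAMERA", "RECORD_AUDIO"}

report = audit_permissions(declared, used)
print(sorted(report["overpermissioned"]))   # unused grants widen the attack surface
print(sorted(report["underpermissioned"]))  # code paths the manifest doesn't cover
```

In practice the “used” set would come from static or dynamic analysis of the app, which is exactly the tooling gap Walsh notes; the set comparison itself is the easy part.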

In summary…
Mr. Walsh, who tests mobile apps for security and safety, raises an important red flag here. All types of businesses and organizations, both tech and non-tech, are deploying newer technologies that can drastically increase their cyber liability risk exposure. Just the other day I saw that even my local hoagie shop was offering its own mobile app—God only knows where that user data is going! 

Privacy matters. Moreover, if your mobile app doesn’t have a privacy policy that is transparent about the kinds of data being collected, stored and shared with third parties, it can land your company in major legal trouble. “Wrongful data collection,” or running afoul of your own privacy policy, is one of the fastest-growing areas of cyber risk litigation. See, for example, the state of California suing Delta Air Lines over its mobile app, pursuing fines of $2,500 per download. Not having a privacy policy in this day and age is simply reckless.

Note: eRisk Hub® users, please see our free mobile app privacy policy template by Ron Raether, Esq. of Faruki Ireland & Cox P.L.L. in the Tools section.

Data Collection Liability and Trends

A Q&A with Dominique Shelton
The area of mobile app customer data collection is the subject of heightened regulatory interest. In the past 18 months, industry leaders have come together to create the Best Practices in Mobile Data Collection guidance, while the California Attorney General released “Privacy on the Go.” To get a better handle on the issue of mobile app data collection, I spoke with Dominique Shelton of Edwards Wildman Palmer, LLP in Los Angeles, CA.

Can you provide an example of a business being sued for wrongful data collection?
It is important to keep in mind that data collection has been challenged by plaintiffs and regulatory agencies when an argument can be raised that consumers did not understand that their data was being collected. The focus of the FTC and certain regulators like the California Attorney General has been on compliance with “privacy by design.” This means giving consumers sufficient notice and the choice to make meaningful decisions about their data and how it’s used. Currently, there are over 176 class actions pending around the country based on behavioral advertising or the tracking of information to create customized products or targeted advertising. One recent example is the case the California Attorney General brought against Delta Air Lines for not posting a mobile privacy disclosure. We have also seen class actions filed against a major studio and a search engine in December, challenging mobile apps that allegedly collect behavioral information from users under the age of 13. So in this environment it’s very important for companies to take a look at this issue.

What types of data can get businesses into trouble?
Certainly, any personally identifiable information, such as name, address, phone numbers or email, that may be collected without disclosure. And now we’re starting to see risk around behavioral data that identifies the user’s mobile device, or a social networking ID associated with, for instance, a Facebook profile. Those identifiers are considered personally identifiable information by regulators. The California AG recently issued the “Privacy on the Go” report, in which personally identifiable information is defined as “any data linked to a person or persistently linked to a mobile device—data that can identify a person via personal information or a device via a unique identifier.” Also, the FTC is moving toward a definition of personal information that includes any data “reasonably linkable” to an individual, on the recognition that unique identifiers can be combined with other information to identify users personally. Of course, we are also seeing financial and health data as an area of interest, and this is considered “sensitive” information as well.
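Because regulators now treat persistent device identifiers as personally identifiable, one common mitigation is to pseudonymize the raw ID before it is logged or shared, so the identifier is no longer “reasonably linkable” across services. Below is a minimal sketch of that idea using a keyed hash; the key, the device ID, and the overall approach are illustrative assumptions, not a statement of any regulator-approved method.

```python
# Illustrative sketch only: pseudonymize a device identifier with a keyed
# hash (HMAC-SHA256) so the raw ID never leaves the app or enters analytics.
# The key and device ID below are invented for the example.
import hashlib
import hmac

def pseudonymize(device_id: str, secret_key: bytes) -> str:
    """Replace a raw device ID with a keyed hash. The pseudonym is stable
    for a given key (so analytics still work), but parties without the key
    cannot reverse it or match it against pseudonyms made with other keys."""
    return hmac.new(secret_key, device_id.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"example-rotating-secret"  # in practice: kept server-side and rotated
token = pseudonymize("IDFA-1234-ABCD", key)
print(token[:16])  # stable pseudonym; the raw identifier is never stored
```

Note the obvious caveat: if the key leaks, or the same key is shared across data sets, linkability returns—which is why guidance like “Privacy on the Go” pairs identifier protection with encryption and security standards rather than relying on hashing alone.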

Are there any states with stricter laws that increase liability?
No. California doesn’t have such a law yet. There is a Do Not Track bill pending, but it hasn’t gone to a full vote. In the context of class actions, most have been filed under older statutes such as the Electronic Communications Privacy Act and the Computer Fraud and Abuse Act, claiming that tracking violates those statutes. Although there is no distinct Do Not Track law, it is important to know that this issue is getting greater attention from regulators, and that in and of itself may create a new standard. For example, “Privacy on the Go” creates a de facto new standard for the country by recommending security, encryption and protection standards for unique identifiers and behavioral data that many companies previously considered non-personally identifiable information, not subject to enforcement. Further, the class actions that have been brought based upon the creation of ad profiles from users’ online behavior also act as a check on how companies should approach compliance. Finally, the SEC’s October 2011 guidance calling for disclosure of all material cyber risks by public companies is useful.

What can a business do to mitigate its risk?
First, take a look at your online and mobile disclosure policies. Supplemental notice is ideal. Make sure that someone in the company is responsible for privacy compliance—it could be the chief privacy officer or someone in the legal department or product development. They should be in touch with the self-regulatory groups that have promulgated guidelines addressing these issues. At a minimum, the company should be in step with its peers, in addition to following the latest guidance materials and knowing which activities will attract the attention of regulators. There needs to be a dialogue among the privacy group, product development and marketing, so that when new products are rolled out their use of customer information is vetted and legal obligations are up to date. Also, conducting annual privacy training is a good idea and is recommended by the California AG.

In summary…
Ms. Shelton has provided an excellent summary of some emerging ‘big data’ issues that affect organizations collecting and using private information without adhering to reasonable privacy principles. She mentions the Delta Air Lines case, which could serve as a bellwether for mobile apps deployed without much regard for privacy policies and principles (i.e., customer notice and consent to the use of personal information). Delta is facing a massive California AG penalty of $2,500 per download (reportedly, Delta’s app had a million downloads). One might argue that what Delta did with its mobile app is common practice, and one that needs to stop immediately.
