Privacy is contextual and social, less legal and technical

Privacy is more than a couple of settings and a consent checkbox on a form somewhere. Privacy and publicity seem like straightforward concepts and, legally, they are treated fairly superficially and defined mechanically. One result is a similarly superficial treatment of privacy and publicity in conversations about social and commercial engagements, which rarely touch on what privacy really means to us. This leaves us confused and conflicted: we have a deeper sense of what privacy means to us, but the typical conversation about privacy lacks the language to describe that deeper sense.

Anil Dash and danah boyd recently published articles on Medium titled “What Is Public?” and “What Is Privacy?”, respectively, which dive deeper into what publicity and privacy mean to us. If you are interested in what privacy and publicity mean in modern times, you should read both articles carefully:

What Is Public? and What Is Privacy?

One of the paragraphs in Dash’s article that stood out for me was this one:

What if the public speech on Facebook and Twitter is more akin to a conversation happening between two people at a restaurant? Or two people speaking quietly at home, albeit near a window that happens to be open to the street? And if more than a billion people are active on various social networking applications each week, are we saying that there are now a billion public figures? When did we agree to let media redefine everyone who uses social networks as fair game, with no recourse and no framework for consent?

I agree more with boyd’s view that privacy is about social convention. I particularly like this extract from boyd’s article:

The very practice of privacy is all about control in a world in which we fully know that we never have control. Our friends might betray us, our spaces might be surveilled, our expectations might be shattered. But this is why achieving privacy is desirable. People want to be in public, but that doesn’t necessarily mean that they want to be public. There’s a huge difference between the two. As a result of the destabilization of social spaces, what’s shocking is how frequently teens have shifted from trying to restrict access to content to trying to restrict access to meaning. They get, at a gut level, that they can’t have control over who sees what’s said, but they hope to instead have control over how that information is interpreted. And thus, we see our collective imagination of what’s private colliding smack into the notion of public. They are less of a continuum and more of an entwined hairball, reshaping and influencing each other in significant ways.

I also think this next extract nicely captures why people become angry with brands and why reputational harm happens at an emotional level. If you represent a brand, you should read this a few times:

When powerful actors, be they companies or governmental agencies, use the excuse of something being “public” to defend their right to look, they systematically assert control over people in a way that fundamentally disenfranchises them. This is the very essence of power and the core of why concepts like “surveillance” matter. Surveillance isn’t simply the all-being all-looking eye. It’s a mechanism by which systems of power assert their power. And it is why people grow angry and distrustful. Why they throw fits over being experimented on. Why they cry privacy foul even when the content being discussed is, for all intents and purposes, public.

Privacy is contextual. Law is also a poor mechanism for protecting it because law tends to be mechanical (it has to be). What we need is a better awareness of what privacy and publicity mean in a social context, and where the line between them lies.

Jeff Jarvis made a statement about privacy in This Week in Google 261 which really caught my attention:

Privacy is a responsibility. It is an ethic of knowing someone else’s information.


Photo credit: Lost in Translation by kris krüg, licensed CC BY-SA 2.0

Apple tells developers not to share health data with advertisers

According to The Guardian, Apple has imposed contractual restrictions on developers that prohibit them from sharing health data they may receive from the anticipated range of health-related apps that iOS 8 will enable through a platform called HealthKit:

Its new rules clarify that developers who build apps that tap into HealthKit, of which Nike is rumoured to be one, can collect the data it holds.

But, they stated, the developers “must not sell an end-user’s health information collected through the HealthKit APIs to advertising platforms, data brokers or information resellers”. Although, the rules add that they could share their data with “third parties for medical research purposes” as long as they get users’ consent.

These sorts of apps have enormous potential to benefit consumers and, at the same time, represent a profound risk because our most intimate personal information is being accessed. How developers and device manufacturers handle this data is bound to inform a new generation of privacy complaints and reputational harm case studies in the years to come.

Sharing more with Facebook to improve its value

Kevin O’Keefe’s article titled “Facebook eliminating the junk in your News Feed”, on Facebook “click bait”, makes an interesting point about using Facebook more to improve its value to you as a user:

All too many lawyers and other professionals I speak with complain about all the junk they see on Facebook. Part of the reason is that they don’t use it enough to help Facebook know what they like. At the same time, Facebook acknowledges they have a problem with “click bait.”

What interests me about this point is that we often assume sharing more with Facebook means even more junk in our News Feed: the more you share on Facebook, the more signals you send to the social network, and these signals inform the ads and suggestions you receive (probably the same with Google).

Instead, what O’Keefe seems to be saying is that using Facebook more helps Facebook’s algorithms refine your experience with more relevant ads and suggestions:

Just as Google wants you to receive what you are looking for on a search or a news program wants to get you the most important news, Facebook wants you to receive what you consider the most important information and news.

Perhaps more importantly, it seems that using Facebook more actively also helps Facebook determine what to show you more of in your News Feed. This is helpful given that you don’t actually see everything your Facebook friends share in your general News Feed, only what Facebook’s algorithms think you want to see more of.

From a privacy perspective, this approach suggests that you should share more of your personal information for an improved and more relevant Facebook experience, not less. It isn’t an approach designed to restrict the use of your personal information as a privacy-protection strategy; rather, it is intended to use more of your personal information in a way that adds more value to you, as well as to Facebook.

It reminds me of Jeff Jarvis’ point from a while ago that brands which know more about you can present a more relevant experience of their services to you. Which would you prefer?

Rewritten WASPA Code better regulates mobile services in SA

The new WASPA Code of Conduct is a complete rewrite of the Wireless Application Service Providers’ Association’s rules which regulate the mobile content and services industry in South Africa. One of the biggest changes is the consolidation of the old Advertising Rules into the Code of Conduct itself, along with a dramatically scaled-down body of rules governing advertising copy. The changes go further than restructuring the old framework. As WASPA’s advisory note presenting an overview of the new version of the Code points out –

The revised Code of Conduct incorporates the most important portions of the Advertising Rules, but without many of the unnecessarily restrictive details in those Rules. The new Code is organized more clearly and logically than previous versions, aims to be less open to interpretation, and intends to function as an effective set of principles for the WASP industry, as it exists a decade after WASPA’s formation.

Of course, this version of the Code doesn’t exist in a vacuum. WASPA’s adjudicators and appeals panellists (who include me) have documented their interpretations of various provisions of older versions of the Code in a substantial library of rulings over the years. One of the challenges in the near term will be harmonising those rulings with the new Code: preserving guidance on a range of topics, including subscription service marketing, service “bundling” and spam, and applying that guidance to the new Code’s clauses.

The WASPA Code is now better aligned with legislation such as the Consumer Protection Act, the Protection of Personal Information Act and the Electronic Communications and Transactions Act, which should translate into improved consistency between the law and the Code as a self-regulatory framework. The new Code also reinforces WASPA’s importance as a regulatory body in the South African mobile content and services industry. This comes at a good time, as the trend towards mobile services is only going to strengthen.

Your future digital government

I had to apply for unabridged birth certificates for our children the other day so I sat down in front of my laptop, browsed to the Department of Home Affairs’ website and logged into the secure Civic Services portal to start the process. I used my new ID card with its embedded personal digital certificate and a one-time code from my smartphone to authenticate myself.

As you can imagine, Home Affairs has all my details and knows who our kids are, so all I really had to do was select the option for the unabridged birth certificates and place the order. The system informed me that because this was the first time I had requested these particular birth certificates there wouldn’t be a charge. I received confirmation of my request, along with digitally signed and locked digital versions of our kids’ unabridged birth certificates, about five minutes after I concluded my request.

The birth certificates were PDFs. I quickly verified that they were signed by Home Affairs using the Department’s current public key (they were) and then forwarded them to the service provider that had requested them from us.

At this point you are probably wondering how I managed to do all of this. You probably had to drive out to your local Home Affairs office, fill out the forms on paper, wait in line to hand them to the person behind the counter and be told you’d have to wait six to eight weeks for the birth certificates to be printed in Pretoria and delivered to that office. You would then have to return to the office with your receipt so you could collect the pages.

My story is completely hypothetical. That process is simply not possible at the moment. This isn’t because the technology doesn’t exist (it does) or because the law doesn’t cater for it (it does). Implementing processes like this requires a different approach to digital government services. In this particular case, the starting point is likely a combination of factors:

  • A secure, complete and accurate citizens’ and residents’ database;
  • A secure portal through which citizens and residents can access government services using a unique digital identity which is linked to the data the government has about them;
  • Digital certificates issued to each citizen and resident along with each person’s national identity;
  • A convenient means of securing and using a digital identity to authenticate each citizen and resident, with both a cross-platform mobile interface and a conventional desktop one (a minimal sketch of the one-time-code element follows this list).
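
To make that last point a little more concrete, the “one-time code from my smartphone” in the opening story is the kind of thing the standard TOTP scheme (RFC 6238) already handles. The sketch below is a minimal, standard-library-only illustration of how those codes are generated and checked; the base32 secret is a made-up example, not a real credential.

```python
# Minimal sketch of RFC 6238 time-based one-time passwords (TOTP), the
# scheme behind most "enter the code from your phone" prompts.
# Standard library only; the base32 secret below is a made-up example.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, timestep: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // timestep             # 30-second time window
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The phone app and the server share the secret and compute the same code;
# authentication succeeds when the submitted code matches the server's value.
shared_secret = "JBSWY3DPEHPK3PXP"   # example secret only
print(totp(shared_secret))
```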

The Electronic Communications and Transactions Act provides a broad framework for much of what would be required, including digital signatures, digital documents, data retention and evidence. The benefits could include radically streamlined government services and citizens empowered to transact more securely and effectively with each other. These benefits are not reserved for government services; they extend to private services too. In fact, a single secure digital identity for South Africa’s inhabitants could serve as a platform for a variety of providers to develop engagement models that could transform how the country functions.
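
The digital-signature element of that framework is ordinary public-key cryptography. As a rough illustration, here is a minimal sketch of verifying a detached signature over a document’s bytes with the Python cryptography library. The file names and the idea of a department publishing its public key in PEM form are assumptions for the example; real signed PDFs embed a CMS/PKCS#7 signature inside the file and need fuller tooling to validate.

```python
# Minimal sketch: verify a detached RSA signature over a document's bytes.
# The file names and PEM public key are hypothetical; real signed PDFs
# carry an embedded CMS/PKCS#7 signature rather than a separate .sig file.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

with open("issuing_department_public_key.pem", "rb") as f:
    public_key = serialization.load_pem_public_key(f.read())

with open("birth_certificate.pdf", "rb") as f:
    document = f.read()

with open("birth_certificate.pdf.sig", "rb") as f:
    signature = f.read()

try:
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("Signature valid: signed by the holder of the matching private key.")
except InvalidSignature:
    print("Signature invalid: the document or signature has been altered.")
```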

So why isn’t such a system being developed (or in place already – much of the technology required has probably existed for some time now). The Verge has an interesting post on this topic titled “Our future government will work more like Amazon” which has a few relevant observations, including this one:

The problem is logistics. Sure, the Postal Service would probably love to have some fresh resources to boost up these facilities. But consolidating many offices into one is never easy. And reappropriating human resources would definitely be controversial. But with good digital systems to reduce paperwork, remember previous encounters with citizens, and greatly reduce the need for people to visit brick and mortar offices in the first place, it’s certainly feasible.

From a legal perspective there are very few barriers to this sort of future. Aside from logistics, the challenge is that our culture is still heavily invested in paper and paper paradigms and the change to digital workflows seems to be prohibitively complicated. That said, there are many benefits to going digital including cost savings, better security and improved redundancy (if you work with paper files, how much redundancy is built into your filing system?).

Simply adopting the necessary technologies isn’t going to solve the problem either. Effective implementation is essential and failing to do this has led to controversies such as the SANRAL consumer data exploits we read about recently. I came across another example of poor implementation when I began writing this post this morning, somewhat ironically from the South African Post Office’s Trust Centre which is charged with delivering trusted digital identity solutions:

[Screenshot: SAPO Trust Centre authentication problem]

Leaving aside what must be an oversight, the Trust Centre delivers a key component in this future digital economy. An advanced electronic signature, for example, opens the door to a range of digital transactions otherwise reserved for paper-based transactions. One of the things I would like to do, as an attorney, is commission affidavits digitally. That is only legally possible if both I, as the attorney, and the person who wants to have an affidavit commissioned have advanced electronic signatures. At the moment this has to be done in person but when both parties have advanced electronic signatures (and have been authenticated by the Trust Centre), this could probably take place remotely. That, alone, represents a cost and time saving. Other transactions which become possible include digital contracts to sell land and even truly digital wills.
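
The cryptographic core of that kind of signature is the mirror image of the verification sketch above: hash the document and sign it with a private key. The sketch below assumes a hypothetical, unencrypted PEM private key and document file; on its own it is not an advanced electronic signature in the legal sense, which also requires an accredited signature product and provider under the ECT Act.

```python
# Minimal sketch of the cryptographic core of signing a document.
# The key and file names are hypothetical; a legally recognised advanced
# electronic signature also requires an accredited authentication product.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

with open("signer_private_key.pem", "rb") as f:
    private_key = serialization.load_pem_private_key(f.read(), password=None)

with open("affidavit.pdf", "rb") as f:
    document = f.read()

# Sign the document bytes; anyone with the matching public key can verify it.
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

with open("affidavit.pdf.sig", "wb") as f:
    f.write(signature)
```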

Going digital can transform how we function and how businesses and government operate. It just takes vision, an understanding of the legalities and risks and sensible technology implementations.

5 suggestions for preserving your digital assets for your heirs after you die

What will happen to your online profiles and data when you die? Before you answer that your digital stuff isn’t all that important so who cares, consider what you are using the digital cloud for:

  1. Email that increasingly includes bank statements, insurance policy information and functions as a backup for when you forget your password for your online profiles;
  2. Document storage and backups for all those policy documents, scans of your ID and passport, accounting records and tax returns;
  3. Photos and videos of your family going back years, decades even (have you maintained your print photos and offline video files to the same extent?);
  4. Various social profiles which you use to keep in touch with friends and family on a daily basis.

The cloud is more than just an incidental part of your life. Unless you are a committed paper-based archivist, you probably have more and more of your life recorded in bits stored on servers around the world, and you are likely the only person who can access that data. When the time comes for you to leave this life, your family will need to access that data for various reasons and, short of a séance, you won’t be in a position to pass along your access credentials if you don’t plan ahead.

Here are 5 suggestions for making sure your family can access your digital assets after you pass on:

  1. Use a password manager like LastPass or 1Password to store all your passwords and key information (I use LastPass and it enables me to store credit card information, ID and passport information and a variety of other sensitive data securely) and use a strong master password to secure your password manager profile (while you’re at it, change your passwords to unique and more secure passwords to protect your profiles better).
  2. Tell your family about your online profiles and how to access them in your will or in a document you leave with your will. If you use a password manager, share the master password with trusted family members or friends so they can unlock your digital assets when the time comes.
  3. Back up your data regularly and automatically. Don’t rely on manual backups; automate them (a minimal automation sketch follows this list). Use whichever secure and reliable backup service you prefer (popular options include Dropbox, Google Drive and more) but make sure your backups include your important stuff and actually work. Storage is becoming cheaper all the time so you should have plenty of space for all your stuff.
  4. Organise your digital archives so they can be easily searched and key documents located by your heirs. One of the first things your family will need to do when you die is report your estate to the relevant authorities and they will need key information to do that. Check with your attorney what they will need and collate that information for them in a convenient folder or location and share that with your family ahead of time.
  5. Make sure you identify all your key online services to your family and explain to them how to access them and your data. Don’t assume that everyone knows the services you use and how to use them effectively. They may not share your passion for those services but you probably don’t want to add to their aggravation by forcing them to stumble around unfamiliar services while grieving for you.
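
On the automation point in suggestion 3, the sketch below shows roughly what a scheduled backup script can look like: it zips a documents folder with a dated filename and drops the archive into a folder that a sync client such as Dropbox already uploads. The paths are placeholders, and you would schedule the script with cron or Task Scheduler rather than remembering to run it yourself.

```python
# Minimal sketch of an automated backup: archive a documents folder with a
# dated name and place it in a folder a sync client already uploads.
# The paths are placeholders; run this on a schedule (cron / Task Scheduler).
import shutil
from datetime import date
from pathlib import Path

SOURCE = Path.home() / "Documents" / "important"   # what to back up
DEST = Path.home() / "Dropbox" / "backups"         # synced destination folder
DEST.mkdir(parents=True, exist_ok=True)

# make_archive appends ".zip" and returns the path of the finished archive
archive_base = DEST / f"important-{date.today():%Y-%m-%d}"
created = shutil.make_archive(str(archive_base), "zip", root_dir=SOURCE)
print(f"Backup written to {created}")
```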

Image credit: ‘Til Death Do Us Part by [n|ck], licensed CC BY 2.0

Facebook Messenger is not the privacy threat you should be concerned about

Many people are focused on the permissions they grant Facebook when they install Facebook Messenger and are concerned that they are giving Facebook excessive access to their devices. This isn’t necessarily the case, and the growing panic may be more a function of how Android permissions have to be obtained than of any real privacy threat lurking in those permissions.

I found myself listening to a discussion on 94.7 this morning about Facebook Messenger. The breakfast team was talking about these permissions that have attracted so much attention as if installing Messenger instantly compromises users and leaves them exposed to all sorts of privacy invasions when microphones and cameras turn on at someone else’s behest.

The panic level rose a few more notches when the breakfast team received a call from an anonymous listener who told the team that part of his work involves remotely accessing people’s devices (presumably part of lawful investigations) and exploiting these sorts of permissions. It wouldn’t be unreasonable to draw the conclusion that giving Facebook these permissions to access your phone’s microphone, camera and other features somehow makes all of those features available to anyone wishing to exploit that level of access and spy on you.

Fortunately it isn’t as simple as that. Leaving aside the risk that Facebook, itself, grants access to your devices to 3rd parties without your knowledge or that its apps have vulnerabilities which are not patched and are exploited by unscrupulous 3rd parties, Facebook isn’t the threat. I spoke to Liron Segev, an IT Consultant and one of the first people I think about when I need some help with the technical aspects of IT security. He explained that the threats to consumers come from various sources and that poor security awareness on consumers’ part is a contributing factor.

To begin with, it is possible for a 3rd party developer to introduce apps to app stores that appear to have a particular functionality but, below the surface, these apps will scan installed apps on your device, attempt to impersonate or even supplant those apps and exploit the access permissions you gave to the legitimate app. These trojan apps would then take advantage of the sorts of permissions you grant Facebook Messenger to access your device microphone, camera and other features. Avoiding this risk largely comes down to only installing apps you trust and how well the app marketplace is regulated and protected from this sort of malware. More and more security experts recommend installing anti-virus software on your mobile devices to help protect you from these sorts of attacks.

A hidden threat few people outside the security industry are aware of comes from the mobile networks we use every day. Mobile networks have the technical ability to gather data from our devices, and even to remotely install applications without our knowledge, and to use that data and access to our devices’ features for a variety of purposes ranging from network performance management to remote surveillance and law enforcement. On the one hand, there are good reasons for networks and governments to have the capability to monitor criminal threats (for example, the somewhat misunderstood capability Google has to monitor Gmail for child porn using an existing database of problematic images). We live in a world where the bad people use advanced encryption and digital tools to plan and conceal their activities. On the other hand, there is also scope for governments and companies to use these capabilities to spy on citizens, infringe their rights and exploit their personal information for profit. As I mentioned in my htxt.africa article “Much ado about Facebook Messenger privacy settings, but is it nothing?” –

Whether you use Messenger should be informed by the extent to which you trust Facebook, not by the very explicit and informative permissions Facebook seeks from you in order to use Messenger. If anything, Facebook is just proving that it has come to a long overdue realisation that there is no benefit in deceiving users.

It is possible that Facebook may turn on your phone’s camera and microphone while you are getting dressed in the morning, but it is highly unlikely. What is more likely is that Facebook requires those permissions to enable Messenger to do what you want and expect it to do. That said, you can’t be complacent and install every app on your device that seems amusing. Take the time to satisfy yourself that an app is from a credible source and look into anti-malware software for your devices. As for mobile networks and governments, there is little you can do except reconsider your device choices if you are concerned about this. Segev pointed out that BlackBerry devices are still secure options and BlackBerry 10.x is a flexible option even if it isn’t popular media’s darling.

Shifting Facebook privacy challenges

When you think about causes for concern when it comes to privacy online, Facebook frequently comes to mind. The world’s largest online social network has roughly 1.32 billion monthly active users with an average of 829 million active daily users in June 2014. It’s no wonder that privacy regulators are watching Facebook and other large providers carefully.

As David Meyer pointed out in his article on GigaOm titled “Facebook has only “pivoted” on one kind of privacy — in other ways, it’s becoming more dangerous“, Facebook has changed, but the change is not necessarily positive:

I must give credit where it is due: As Slate’s Will Oremus wrote in a piece called “Facebook’s Privacy Pivot” a few days ago, the social network has greatly improved its handling of user privacy in recent months. In a sense.

Once a company that seemed to delight in undermining its users’ choice of privacy settings, these days the social network promotes “friends” rather than “public” as its default post setting, it has an “anonymous” version of its site login tool that limits what personal information logged-into services can see, and it’s just generally less… shifty. Hooray for that.

However, there’s privacy and there’s privacy – and the kind that Facebook has decided to no longer play games with is just one facet, albeit an important one. Broadly speaking, it’s the kind that relates to providing a reliable border between private and public spaces. As for privacy from Facebook itself, its advertising customers and surveillance-happy authorities, that’s an entirely different matter.

The challenge facing Facebook users now is that, although Facebook has found a way to better respect users’ sharing preferences, it has found a new revenue option based on leveraging the data it holds about its users. As I pointed out in my post titled “You No Longer Control Your Personal Information, Facebook Does” –

The possible ramifications of this are only starting to become clear. For one thing, personal information is already a valuable commodity, it may even become a sort of currency given its enormous value. Facebook is clearly positioning itself well for this new personal data economy. Secondly, as the world heads closer to a sort of Scrutinised Zone, Facebook’s role could include being a powerful non-governmental power bloc with a “citizenry” rivalling the world’s larger nations in terms of population and economic and social influence.

For now, users should consider the possible ramifications for their privacy. Facebook now has an effective mechanism which it will use to trade access to users’ personal information. Before Anonymous Login, this was more covert and involved programmatically matching ads to Facebook updates and profile data. Anonymous Login goes beyond ads. It is a far more overt sales channel for users’ personal information, with control shifting to Facebook from the people to whom that control should belong – the users, us.

Facebook seems to have realised that forcing users to share more than they would prefer to is not good for business. It has also realised that it no longer needs to do that: it has a tremendous amount of personal information it can profitably exploit in other ways. In some respects, users’ privacy has been eroded even further, and users may not appreciate these shifting Facebook privacy challenges for quite some time. Whether this will come back to bite Facebook will depend on how transparent it is about what it does with users’ personal information.