Which contracts photographers should consider using

Which contracts your clients should sign

A photographer asked a great question about contracts recently:

I would like to redo my contracts. Would like to know what do you get clients to sign before a shoot?

Disclaimer: This note is a fairly broad overview of the major themes you, as a photographer, should think about and the contracts you should consider signing with your clients. It isn’t legal advice or even the best advice for all photographers. It should give you a more informed starting point for a further discussion with your lawyer.

There are two key documents that you should have. The first is a contract governing your services and the other is some sort of privacy statement.

Services contract

The services contract needs to cover a number of themes both for clarity and to make sure you address your common risks. I also refer to services contract provisions as “terms and conditions” in this note.

For starters, using clear, well-defined terminology is really important. It may seem pedantic, but clearly defining key terms is essential for a clear and intelligible contract which, in turn, is more likely to be enforced if you ever have to test it. The content of the contract is obviously very important, but a contract written in confusing language can be very difficult to understand and enforce effectively. You would typically collect this terminology in a glossary in your contract.

Your services contract must obviously deal with your services, how you will communicate them and what you will charge for them. Think about issues like scope creep (where your services change without the changes necessarily being agreed on specifically) and amending your pricing as your scope changes. The model I prefer is a standard set of terms and conditions read together with a separate booking form (which can be an online form or a paper form that your client signs), rather than a lengthy contract that contains all the variables such as client details, services required and pricing. A booking form that refers to the terms and conditions is less intimidating, even though the terms and conditions themselves will be fairly detailed to make sure you deal with all the important themes.

One issue which comes up frequently in photographers’ groups is a cancellation fee. The Consumer Protection Act enables clients to terminate advance bookings subject to reasonable cancellation fees. Define those fees in your contract and set cancellation periods which may attract varying fees. For example, you may agree that if the client cancels a shoot 3 months before, the client will pay Rx; 1 month before, Ry; and 2 weeks before, Rz. This will depend on your booking lead times, whether you can replace that booking and other similar factors. You will also need to align these cancellation fees with the Consumer Protection Act’s mechanisms and intent.
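To make the tiered structure concrete, here is a minimal sketch of how such a fee schedule might be applied. The notice periods and Rand amounts are purely hypothetical placeholders (the Rx/Ry/Rz amounts above are for you and your lawyer to decide, and must still be reasonable under the Consumer Protection Act):

```python
from datetime import date

# Hypothetical cancellation tiers: (minimum days' notice, fee in Rand).
# These periods and amounts are illustrative only; your actual tiers
# should be agreed with your lawyer and aligned with the Consumer
# Protection Act's requirement that cancellation fees be reasonable.
CANCELLATION_TIERS = [
    (90, 500.00),   # cancelled 3 months or more before the shoot
    (30, 1500.00),  # cancelled 1 month or more before the shoot
    (14, 2500.00),  # cancelled 2 weeks or more before the shoot
]
LATE_CANCELLATION_FEE = 4000.00  # less than 2 weeks' notice


def cancellation_fee(shoot_date: date, cancellation_date: date) -> float:
    """Return the fee payable, based on how much notice the client gave."""
    days_notice = (shoot_date - cancellation_date).days
    for minimum_days, fee in CANCELLATION_TIERS:
        if days_notice >= minimum_days:
            return fee
    return LATE_CANCELLATION_FEE


# Example: cancelling a 1 June shoot on 10 May gives 22 days' notice,
# which falls into the "2 weeks or more" tier.
print(cancellation_fee(date(2014, 6, 1), date(2014, 5, 10)))  # 2500.0
```

The point of the sketch is simply that the fee is determined mechanically by how far in advance the client cancels, which makes the clause easy to explain and apply.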

As a photographer the licensing aspects of your work are critical. The Copyright Act generally recognises your clients as the owners of the copyright in your photos if they commission you to do the work and agree on a fee for that work. This is good for your clients because it gives them more control over your deliverables, but you have to consider what you will need to do with the photos. Because, by default, you are not the copyright owner in this context, you are not entitled to share the photos as part of your portfolio, restrict what your clients can do with the photos or exercise much other control over the photos’ use.

The Copyright Act gives you a way to change this default position. You can agree with your client to opt-out of the default copyright ownership mechanism in your contract. It is pretty straightforward but you need to include that in your contract. You may also want to think about including a mechanism in your contract which enables you to withhold your deliverables if your client fails to pay you, for example. This would be a separate clause in your contract.

Other clauses you’d include in your contract would be –

  • fees and payment;
  • privacy (linked to the privacy statement which I discuss below);
  • dispute resolution;
  • breach and the consequences of a breach;
  • termination;
  • common no-variation and similar clauses; and
  • domicilium clauses which can be pretty useful for different situations.

Booking form

A booking form is a convenient way to sign a client. Here are a few things to include (a rough sketch of how these fields might be captured follows the list):

  • Client details (name, contact details, address details);
  • Shoot details (date, times, locations);
  • Fees due (linked to specific deliverables), including deposits due;
  • Your specific deliverables;
  • Cancellation fees (you can include these in your terms and conditions but including these in your booking form makes them more prominent and confusion less likely);
  • Your details;
  • Express confirmation that your client agrees with your terms and conditions and privacy statement;
  • Signature and date fields (the form these will take if you use online forms can vary).
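As a rough illustration of how these fields might be captured in an online booking form’s underlying record, here is one possible structure. The field names and types are my own illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date, time


# A hypothetical booking record mirroring the fields listed above.
@dataclass
class Booking:
    # Client details
    client_name: str
    client_email: str
    client_phone: str
    client_address: str
    # Shoot details
    shoot_date: date
    start_time: time
    end_time: time
    location: str
    # Commercial terms
    deliverables: list[str]              # e.g. ["20 edited images", "online gallery"]
    fee_total: float
    deposit_due: float
    cancellation_fees: dict[str, float]  # e.g. {"90+ days' notice": 500.00, ...}
    # Express agreement to the standard documents
    accepted_terms_and_conditions: bool = False
    accepted_privacy_statement: bool = False
    signed_on: date | None = None
```

The two "accepted" flags mirror the express confirmation mentioned above, so you keep a record of the client agreeing to your terms and conditions and privacy statement at the time of booking.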

Privacy statement

As a photographer you are dealing with a lot of personal information. Using personal information often requires permission from the people the personal information relates to, and the way you typically obtain this permission is through a privacy statement (also known as a privacy policy or data protection policy).

As a starting point integrate your privacy statement with your services contract so when the client agrees to the services contract, s/he also agrees to the privacy statement.

Broadly speaking, the privacy statement must deal with these themes:

  • what personal information you will collect and from which sources (for example, automatically through your website, personal information your client volunteers through your booking form or contact forms and so on);
  • what you will do with that personal information (remember to include adding photos to your portfolio or Facebook page for marketing purposes, for example);
  • under what circumstances you may disclose personal information to third parties (these third parties may include your vendors for printing; law enforcement and other legal authorities); and
  • where you store personal information and, broadly, measures you take to secure the data (this will often mean identifying your hosting provider, especially if you use foreign hosting providers and will be transferring personal information across borders).

You will probably include other people in your photos (especially if you do functions and have the usual group photos) who have not signed your contracts. You should require your clients to obtain permission from the people they want included in these group photos, as well as those people’s agreement with the data practices explained in your privacy statement. How you do this can vary. You can prepare releases for subjects to sign in advance or on the day, or you can prepare something for your clients to have these participants sign. This can be a cumbersome process, so choose the approach with the least friction that still results in these subjects’ permission for you to photograph them and use those photos for different purposes.

This is more important if you intend publishing photos on public platforms (for example, Facebook). Simply taking photos, making prints and handing these to your client probably won’t require you to go to these lengths because a subject who poses for a photograph clearly consents to being photographed. You’ll need to use your discretion.

It is very important to be sensitive about photos of children. You are not permitted to take photos of children and share them without their parents’ advance permission so make sure you obtain clear consents when it comes to children.

Get it in writing

If you capture the terms of your agreement with your clients in writing, you take huge strides towards reducing the likelihood of confusion and disputes. A written contract can be printed on paper. It can also be digital and part of an email or published on a website. Find the best medium for you that strikes a balance between clearly conveying your contract terms and being relatively accessible and convenient for your clients.

I have prepared a service contract and privacy statement for photographers which I’ve since updated. These two versions should give you a fairly decent idea of what your contracts could look like.

Apple tells developers not to share health data with advertisers

According to The Guardian, Apple has imposed contractual restrictions on developers that prohibit them from sharing health data they may receive through an anticipated range of health-related apps which iOS 8 will usher in through a platform called HealthKit:

Its new rules clarify that developers who build apps that tap into HealthKit, of which Nike is rumoured to be one, can collect the data it holds.

But, they stated, the developers “must not sell an end-user’s health information collected through the HealthKit APIs to advertising platforms, data brokers or information resellers”. Although, the rules add that they could share their data with “third parties for medical research purposes” as long as they get users’ consent.

These sorts of apps have enormous potential to benefit consumers and, at the same time, they represent a profound risk to consumers because our most intimate personal information is being accessed. How developers and device manufacturers handle this data is bound to inform a new generation of privacy complaints and reputational harm case studies in the years to come.

Sharing more with Facebook to improve its value

Kevin O’Keefe’s article titled “Facebook eliminating the junk in your News Feed”, about Facebook “click bait”, makes an interesting point about using Facebook more to improve its value to you as a user:

All too often, lawyers and other professionals I speak with complain about all the junk they see on Facebook. Part of the reason is that they don’t use it enough to help Facebook know what they like. At the same time, Facebook acknowledges they have a problem with “click bait.”

What interests me about this point is that we often assume that sharing more with Facebook just means even more junk in our News Feed: the more you share on Facebook, the more signals you send to the social network, and these signals inform the ads and suggestions you receive (probably the same with Google).

Instead, what O’Keefe seems to be saying is that using Facebook more helps Facebook’s algorithms refine your experience with more relevant ads and suggestions:

Just as Google wants you to receive what you are looking for on a search or a news program wants to get you the most important news, Facebook wants you to receive what you consider the most important information and news.

Perhaps more importantly, it seems that using Facebook more actively also helps Facebook determine what to show you more of in your News Feed. This is helpful given that you don’t actually see everything your Facebook friends share in your general News Feed, only what Facebook’s algorithms think you want to see more of.
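To illustrate the underlying idea (and only the idea; this is a toy sketch with made-up data, not Facebook’s actual News Feed algorithm), relevance ranking of this sort boils down to scoring items against the signals you have shared:

```python
# Toy model of profile-driven relevance: rank content by overlap with
# a user's declared interests. Purely illustrative; no real Facebook
# API or ranking model is used here.
user_interests = {"photography", "travel", "running"}

items = [
    {"title": "New mirrorless camera review", "tags": {"photography", "gear"}},
    {"title": "Local council meeting minutes", "tags": {"politics"}},
    {"title": "Trail running routes near Cape Town", "tags": {"running", "travel"}},
]


def relevance(item) -> int:
    """Score an item by how many of its tags match the user's interests."""
    return len(item["tags"] & user_interests)


for item in sorted(items, key=relevance, reverse=True):
    print(relevance(item), item["title"])
# 2 Trail running routes near Cape Town
# 1 New mirrorless camera review
# 0 Local council meeting minutes
```

On this view, sharing more simply gives the ranking more signal to work with, which is the crux of O’Keefe’s point.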

From a privacy perspective, this approach suggests that you should share more of your personal information for an improved and more relevant Facebook experience, not less. It isn’t an approach designed to restrict the use of your personal information as a strategy to better protect your privacy but rather intended to use more of your personal information in a way that adds more value to you, as well as Facebook.

It reminds me of Jeff Jarvis’ point a while ago about how brands that know more about you can present a more relevant experience of their services to you. Which would you prefer?

A transparent approach to privacy policies

[Image: definition of “transparent”]

Richard Beaumont’s article “Transparency Should Be the New Privacy” echoes a point I’ve also been making recently: data protection or privacy is mostly about transparency and trust. Sure, compliance is essential, but from a data subject or consumer’s perspective, how transparent you are about how you process the data subject’s personal information and whether your activities engender trust are arguably as important. Achieving that requires a varied approach to data protection, and one of the key elements is the document you publish about this, namely the privacy policy (also known as a “data protection policy”, “privacy statement” and other titles).

As Beaumont points out:

The website privacy policy is the basis on which organisations can claim they have received consent from customers/visitors to collect whatever data they want and do what they like with it. In a data-driven world, they are important documents. Expensive lawyers are often paid large sums of money to write them in the full knowledge that they will rarely be read. Of necessity, it is written in legalese that most people won’t fully understand, and it is long because it has to cover all eventualities.

Of course, hardly anybody reads them. In the vast majority of cases, it would be a colossal waste of time.

I don’t agree with an approach that obscures privacy policy wording by using legalese and complex language, but privacy policies are typically not read and understood before data subjects share their personal information anyway. On the other hand, Beaumont makes a number of good points about the purpose of most privacy policies:

However, the problem is not really with the privacy policy itself as a document; it is the fact that it has been mis-sold to us. We are led to believe its purpose is to inform. We are told this because consent relies on us being informed about what we are consenting to. It is the basis of almost all privacy law throughout the world.

However, if that were true, it wouldn’t be buried in a link at the bottom of the page and written in dense text that is often also in a smaller font than the rest of the site. Website designers and copywriters know how to inform people online. The privacy policy is the document on any website least likely to inform the visitor in any meaningful way.

The reality is that the privacy policy is designed to protect the owners in the case of a dispute—which is what most legal documents are designed to do. There is nothing wrong with this—these documents are necessary in certain circumstances. It’s just that they don’t fulfil the more common need for accessible information about privacy practices at the company.

He goes on to propose a “transparency policy” as an alternative to a conventional privacy policy. I don’t think we need a new term for the document except where using a new name shifts our perception of the document’s role. I’ve been reading a lot more about more visual legal documents and I like Beaumont’s suggested approach, at least in part:

The transparency statement will be short, clear and simple to understand. It might borrow from the “layered” privacy policy model and would almost certainly involve a strong visual element. It will be easily accessible and you will be encouraged to look at it, especially on a first visit. It will be the basis on which the website will set your expectations for how you and your data will be treated.

His model involves a transparency statement operating alongside a privacy policy which would give the transparency statement important “legal weight”. I don’t think this is necessary, though. I prefer some lawyers’ approach of publishing a “privacy statement” rather than a privacy policy. Although privacy policies are frequently framed as documents you, as a data subject, agree to, they can function just as well as statements of what personal information is being collected, how it is being processed, and under what circumstances (and to whom) that personal information may be disclosed.

When I prepare privacy policies, I usually pair them with a website’s terms and conditions which invoke the privacy policy as an explanation of what personal information is processed and how. The terms and conditions then reference the privacy policy and provide the “legal weight” Beaumont refers to. In that model, a privacy policy could be reframed as a streamlined privacy statement along similar lines to Beaumont’s suggested transparency statement. Inferring agreement with a privacy statement becomes largely unnecessary; it would only really be important to establish that data subjects agreed to the terms and conditions themselves which, in turn, would point to the privacy statement for information about personal information processing.

A streamlined privacy statement would also be better suited to more visual representations of its contents, which would make it far more intelligible and, by extension, make a company’s data processing activities more transparent. With more transparency comes more accountability and trust. In addition –

Because the transparency statement is also more likely to be read, commented on and engaged with, it will likely improve over time, and accepted standards might emerge. This would potentially create a virtuous circle that further improves clarity for consumers.

Emerging standards have further benefits which I find really exciting. The bottom line, though, is Beaumont’s conclusion:

Transparency statements could be the vehicle to enable the majority of people to make better-informed choices than they currently do and use a truly market-driven approach to online privacy practice.

Brands, accurate facial recognition and why transparency is critical

Facebook’s new artificial intelligence group recently published a research paper titled “DeepFace: Closing the Gap to Human-Level Performance in Face Verification” which describes its advances in facial recognition technology. The abstract is pretty technical so I highlighted the big takeaway that may interest you:

In modern face recognition, the conventional pipeline consists of four stages: detect => align => represent => classify. We revisit both the alignment step and the representation step by employing explicit 3D face modeling in order to apply a piecewise affine transformation, and derive a face representation from a nine-layer deep neural network. This deep network involves more than 120 million parameters using several locally connected layers without weight sharing, rather than the standard convolutional layers. Thus we trained it on the largest facial dataset to-date, an identity labeled dataset of four million facial images belonging to more than 4,000 identities, where each identity has an average of over a thousand samples. The learned representations coupling the accurate model-based alignment with the large facial database generalize remarkably well to faces in unconstrained environments, even with a simple classifier. Our method reaches an accuracy of 97.25% on the Labeled Faces in the Wild (LFW) dataset, reducing the error of the current state of the art by more than 25%, closely approaching human-level performance.

According to the MIT Technology Review’s article titled “Facebook Creates Software That Matches Faces Almost as Well as You Do”, human beings recognise faces correctly 97.53% of the time which makes DeepFace just about as accurate as humans when it comes to identifying your face. What does this mean for brands? Quite a lot although probably not right away.
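To give a sense of what “verification” means mechanically, here is a heavily simplified sketch of the final step of the detect => align => represent => classify pipeline the abstract describes. This is not Facebook’s DeepFace code, and the embeddings below are made-up placeholders; the point is simply that once each face has been reduced to a numeric representation, deciding whether two photos show the same person becomes a comparison between vectors, which is the “simple classifier” applied to the learned representations:

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = pointing the same way)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def same_person(embedding_a, embedding_b, threshold: float = 0.8) -> bool:
    # In a real system the embeddings come from a deep network applied to
    # detected and aligned faces, and the threshold is tuned on labelled pairs.
    return cosine_similarity(embedding_a, embedding_b) >= threshold


# Made-up 4-dimensional embeddings purely for illustration; real face
# embeddings are produced by the network and have far more dimensions.
face_1 = np.array([0.9, 0.1, 0.3, 0.2])
face_2 = np.array([0.88, 0.12, 0.28, 0.22])  # similar face
face_3 = np.array([0.1, 0.9, 0.2, 0.7])      # different face

print(same_person(face_1, face_2))  # True  (high similarity)
print(same_person(face_1, face_3))  # False (low similarity)
```

On the figures reported above, DeepFace’s 97.25% accuracy means it gets such same-or-different calls wrong on roughly 2.75% of test pairs, against roughly 2.47% for humans.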

One of the service features that will continue to distinguish brands and their service offerings is a brand’s ability to present its customers with a deeply personal and meaningful service. Brands have been working on ways to personalise their services for quite some time and have used demographics, location, culture and, more recently (and as we have increasingly seen on Facebook and Google properties), your interests. All of this information is being associated with your identity, so when you connect to a site or an app with your Facebook profile, for example, you share your interests, connections and other signals from your profile with the site or the app. The site or the app then customises your experience, tells you which of your friends are also using it (making it more likely that you will continue to use it) or does a number of other things to present a version of itself that is more relevant to you.

Introducing accurate facial recognition into the mix potentially removes the need for you to tell Facebook (or a future Facebook connected site or app) who you are before your data is shared and your experience modified. All you will need to do now is show up and let a camera see you long enough to capture a reasonably clear image of your face. From there you will be identified, placed into a particular context and things will happen. As a brand, there are some interesting opportunities. Imagine your guests arrive at your event and, instead of relying on guests to manually check in, a webcam at the door connected to your Facebook Page recognises the guests as they arrive and posts an update in your stream sharing their arrival. This isn’t happening yet but it is very possible.

Of course whether users allow this will likely depend on Facebook’s (or the relevant service’s) data protection policy (with this sort of technology, the term “privacy policy” is totally inappropriate – privacy is a memory) and the controls Facebook will make available to users to permit the service to automatically identify and tag them more publicly than it does at the moment. The challenge is that most users don’t pay much attention to their privacy settings and don’t customise them to suit their preferences. That doesn’t prevent them from being outraged when brands use their profile data in otherwise permissible ways. This may not seem like a problem but, from a reputation perspective, it can be.

Even though this technology is not implemented particularly widely, accurate facial recognition associated with identities and personal information profiles is probably not far off. It is going to scare consumers who will become aware of the myriad cameras and opportunities for them to be identified and located in specific contexts. The remnants of their privacy (by obscurity) will be whittled down to almost nothing and they won’t expect it. As a brand, this technology offers a number of opportunities to engage with customers in a very meaningful and personal way but catching them by surprise is almost certainly going to backfire, largely because the backlash will be so much more intense, precisely because the possible applications of this technology are so personal.

Preparing customers for implementations of these sorts of technologies and reducing the risk of significant reputational harm requires transparency and a healthy dose of courage to be as open as you need to be about how you intend engaging with your customers. As I pointed out in my talk at the recent SA Privacy Management Summit, brands have little to gain by being opaque. Transparency is a critical risk management tool: it engenders trust and keeps brands accountable and honest. That is scary for brands not accustomed to being in the spotlight, but if they want to engage more effectively with their customers and earn their loyalty, they can’t do it by being evasive and catching their customers by surprise.

Widespread facial recognition will have a fairly profound impact on data protection when businesses adopt it on a larger scale. The opportunities for brands are tremendous and could, literally, revolutionise how a customer perceives a brand. To paraphrase a worn adage, with this great power comes great responsibility, and brands should think carefully about how to introduce these tools to their customers and obtain their buy-in. Even though facial recognition is still in fairly limited use, brands have been using various tools and techniques to leverage customers’ identities and personal data to customise their experiences of a brand’s products and services for some time now. Transparency is more likely to win customers’ trust even though it scares many brands silly. That said –

Courage is not the absence of fear, but rather the judgement that something else is more important than fear.

— James Neil Hollingworth

Your connected home knows you intimately and, soon, so will Google

[Image: Nest thermostat cooling, showing the leaf icon]

Google has just announced that it intends purchasing Nest, a company that produces a connected home thermostat and smoke detector that are very well regarded in the United States. The purchase price is $3.2 billion, apparently in cash. That substantial purchase price is a pretty clear indication of the value Google places on Nest’s technology, which gives its customers the ability to monitor and adjust their home environment. One of the implications of this purchase is that Google could soon have far deeper insights into what Nest’s customers are doing in their homes.

Although this is arguably a trend that is only going to grow, the question to ask is whether companies reaching into customers’ most intimate spaces have adequate protections in place to protect their personal data. Here is one possible integration (there are no announcements about integrations yet, so this is speculation) from Stacey Higgenbotham, writing for GigaOm in her article titled “When Google closes the Nest deal, privacy issues for the internet of things will hit the big time“:

As a user of Google Now, the contextual service that tells me when to leave my house to make it to my next appointment in time, I see no reason Google couldn’t also tell my thermostat to cycle down before I actually leave. Or, based on my movements in my home, Google could start screening my calls. If I’m in the bedroom and motionless maybe Google could block the work calls from my colleague Om Malik.
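Purely to make that speculation concrete, here is a hypothetical sketch of the kind of contextual rule Higgenbotham describes. It uses made-up data and stands in for no real Google Now or Nest API (no such integration has been announced):

```python
from datetime import datetime, timedelta

# Purely hypothetical context data; no real Google Now or Nest API is used.
context = {
    "next_appointment": datetime(2014, 1, 20, 14, 0),
    "travel_time": timedelta(minutes=35),
    "now": datetime(2014, 1, 20, 13, 20),
    "thermostat_mode": "heating",
}


def should_set_away_mode(ctx) -> bool:
    """If the user will have to leave soon, pre-emptively cycle the thermostat down."""
    leave_by = ctx["next_appointment"] - ctx["travel_time"]
    return ctx["now"] >= leave_by - timedelta(minutes=10)


if should_set_away_mode(context):
    context["thermostat_mode"] = "away"  # stand-in for a real device command

print(context["thermostat_mode"])  # "away"
```

Even a rule this trivial depends on combining calendar, location and in-home sensor data, which is exactly why the data protection questions below matter.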

Google’s business model, like many other consumer-facing companies’ business models, is changing to become far more context aware. We’re seeing that in apps that know our location and where we are going next and warn us when to leave to make it on time. That just scratches the surface, and this trend can be tremendously helpful and useful if we can be sure that our personal information is not being abused or vulnerable to exploitation. As Higgenbotham points out –

Nest and the products the company builds could help provide ever more contextual clues to Google that it can use to help make your life better and even save you money. But in doing so we need to hold it, and other companies seeking to enter the connected home market, to a well-defined set of standards around data security and privacy. That means the industry and the regulators need to move past this impasse: where the internet of things is awesome but will also kill you because strangers can hack into your home and control your medical devices.

Don’t place too much emphasis on the Protection of Personal Information Act

With the Protection of Personal Information Act signed and likely to be implemented to some degree sometime this year, it is fashionable to focus on POPI when thinking about data protection and privacy. While POPI is a very important Act, a complete data protection review has to take into account much more. I prepared a diagram to give you a quick overview of what you should be considering when you assess your compliance readiness.

[Diagram: Privacy is more than just POPI (2014-01-14)]