The current row over Facebook's successive changes to its privacy settings has several strategic implications for the way that businesses - not just in the digital sector - relate to their customers. In case you've missed the story: Facebook has repeatedly weakened the default privacy settings for its users since the autumn of last year, meaning that users are likely to be sharing far more personal details across the internet than they previously did. (I've written about this at length elsewhere, but a visit to ouropenbook.org gives a sense of its scale.)

The reason? Well, the company says that 'radical transparency' is good for you, in a moral sense. Others say that it is part of a long campaign by Facebook (two steps forward, one step back, according to Nick Carr) to set itself up as the owner of its users' online identity, which is a more lucrative proposition than being a mere social network, even one with several hundred million members.

So what are the implications of this for businesses?

#1: When the mental map your customers have of your product or service diverges too far from their experience of it, the business suffers. (This is what happened when Gerald Ratner described one of his company's products as 'crap'.) In the case of Facebook, the actual experience is no longer represented by the map. The researcher danah boyd has explained this well:
A while back, I was talking with a teenage girl about her privacy settings and noticed that she had made lots of content available to friends-of-friends. I asked her if she made her content available to her mother. She responded with, “of course not!” I had noticed that she had listed her aunt as a friend of hers and so I surfed with her to her aunt’s page and pointed out that her mother was a friend of her aunt, thus a friend-of-a-friend. She was horrified. It had never dawned on her that her mother might be included in that grouping. Over and over again, I find that people’s mental model of who can see what doesn’t match up with reality.
#2: Privacy isn't dead, although it is fashionable for digerati to say so. People still expect the organisations they do business with to maintain appropriate levels of privacy - and expect to be able to check those levels for themselves. We think this expectation will increase as the web becomes more ubiquitous and more portable, and as the opportunities for breach multiply. At least some users will engage only reluctantly, for fear of theft, fraud, or inappropriate social exchanges. In the digital world, companies which take care of their users' privacy will be less profitable in the short term, but more sustainable in the long term.

#3: Facebook is effectively polluting the "commons" represented by the internet - all of its shared public resources and protocols - through self-interested behaviour. It is possible that other suppliers whose businesses also depend on a trusted internet will intervene; Google, its own privacy problems notwithstanding, has done a little of this recently. But usually, when public-interest goods are polluted by commercial interests, regulation follows. The cases brought against Facebook under trade and competition law, along with the initial responses from privacy regulators, are harbingers of this.

This is also posted at The Futures Company's blog.