Tuesday, August 26, 2014

Right to be forgotten: The failure is in us, not Google (ZDNet)

Summary: There isn't a technical fix for the problem we have with empathy.

We are more than the sum of our search results. We know that about ourselves without being told — how can the fragments of information about us displayed online ever possibly define the complicated, paradoxical, fascinating beings that we are?
And yet we don't apply the same logic when considering other people; we cheerfully judge them based on what we can find in a quick Google search. That means that someone who made a stupid joke a decade ago is still defined by it, or that someone who committed a crime years ago can never put it behind them despite thousands of good deeds which go unrecorded by the internet.
That's because whereas previously embarrassing stories about an individual would have been printed in newspapers and then forgotten (existing only in a yellowing copy of an old paper, or in our own fallible memories), the internet means these stories are visible every time someone searches for their name.
This is what Europe's right to be forgotten tries to remedy — to take the undeserved sting out of these ancient stories. It goes some way towards creating a half-life for information in an age when digital technology allows us to retain everything forever.
There are some very limited scenarios – such as those involving spent convictions which would not normally have to be disclosed – where a right to be forgotten makes sense. But to me, beyond that, it's very hard to see why information which is fair and accurate should be removed from view.
That's because the right to a private life — which the right to be forgotten tries to protect — bumps up against other rights that are necessary for a fully functioning society, such as freedom of expression and freedom of the press.
While it doesn't reduce a journalist's ability to write a story about someone, the right to be forgotten makes it harder for others to find that story through a search engine, which is of course how most people navigate the web. That raises the question: can you really have freedom of speech if no one can hear what you are saying? Freedom of speech implicitly includes the freedom to be heard, and that's what we could be putting at risk here.
As such, the right to be forgotten is too big and too complicated to leave to search engines (who don't really want to police it) and the individuals who want links removed. Many of the decisions made so far on delinking (some of them later reversed) seem hard to defend. We need a better understanding of what the right to be forgotten means before we start turning search indexes — our outsourced collective memory — into Swiss cheese.
The right to be forgotten embodies one of the most profound challenges we face. Humans are by design forgetting machines; our fallible grey matter urges us on by helping us to forget old pains, and by preventing us from perfectly replaying happy memories over and over again. But now we have to deal with the consequences of having the capability to remember almost everything for all time.
The search engine provides the information, but we are the ones that make judgements based on it. It's not a failure on the part of the search engine if we judge someone wrongly based on a scrap of information that might be years or decades old. We need to make more informed decisions, not knee-jerk responses to old information.
The problem with the right to be forgotten is that it takes that choice away from us. It means we don't have the information to make those judgements at all — right or wrong. We can't make intelligent decisions about how to respond to this information if it's hidden from us. Denial and obfuscation are not the right answer to the challenges ahead.
We need to see our fellow humans as we see ourselves — not as a collection of search results but as confused, inconsistent and changeable human beings. Our failure is not one of technology but of empathy, one that no amount of meddling with the search indexes can fix.
Steve Ranger is the UK editor-in-chief of ZDNet and TechRepublic, and has been writing about technology, business and culture for more than a decade.
