On his blog today, Seth Godin deals with a topic close to my heart. We can’t blame computers when they simply enact the rules we told them to follow. We can’t blame our systems when things happen that make no sense or are counterproductive. In a world of systems, technology and the cloud, it’s easy to depersonalise what happens in our organisations. That’s just not good enough.
Here’s how Seth put it:
But where did the algorithm come from?
by Seth Godin
Imagine if the owner of the local bookstore hid books from various authors or publishers. They’re on the shelf, sure, but misfiled, or hidden behind other books. Most of us would find this offensive, and I for one like the freedom I have (for now) to choose a new store, one that connects me to what I need.
The airline tickets I purchased last week are missing. Oh, here they are, in my spam folder. Gmail blames an algorithm, as if it wrote itself.
That person who just got stopped on her way to an airplane, the woman who gets stopped every time she flies: the TSA says it’s the algorithm doing it. But someone wrote that code.
And as AI gets ever better at machine learning, we’ll hear over and over that the output isn’t anyone’s responsibility, because, hey, the algorithm wrote the code.
We need to speak up.
You have policies and algorithms in place where you work, passed down from person to person. Decision-making approaches that help you find good customers, or lead to negative redlining… What they have in common is that they are largely unexamined.
Google’s search results and news are the work of human beings. Businesses thrive or suffer because of active choices made by programmers and the people they work for. They have conflicting goals, but the lack of transparency leads to hiding behind the algorithm.
Which Facebook news stories come up first is the work of a team of people. The defense of “the results just follow the algorithm” is a weak one, because there’s more than one valid algorithm, more than one set of choices that can be made, more than one set of priorities.
The culture (our politics, our standards, our awareness of what’s happening around us) is being aggressively rewired by what we see, and what we see comes via an algorithm, one that was written by people.
Just because it makes the stockholders happy in the short run doesn’t mean it’s the right choice for the people who trust you.
Source: Seth Godin blog
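Seth’s point about the spam folder is worth making concrete. An email “algorithm” is, at bottom, rules that a person chose to write down. Here is a minimal sketch in Python; the word list and threshold are entirely hypothetical, invented for illustration, but that is exactly the point: someone, somewhere, made choices like these.

```python
# A toy spam filter. Every behaviour here traces back to a human decision.
# The words and the threshold below are hypothetical, chosen for illustration.

SUSPICIOUS_WORDS = {"ticket", "confirm", "urgent"}  # a person picked these words


def is_spam(subject: str) -> bool:
    """Flag a message when enough of the human-chosen signals fire."""
    words = subject.lower().split()
    hits = sum(1 for word in words if word in SUSPICIOUS_WORDS)
    return hits >= 2  # a person picked this threshold, too


# A legitimate airline confirmation trips the human-written rules:
print(is_spam("confirm your ticket today"))  # True
print(is_spam("lunch tomorrow?"))            # False
```

When your airline tickets land in spam, nothing inevitable happened: some combination of choices like the ones above fired. “The algorithm did it” just means “decisions a person made, running at scale.”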