When German engineering is digital deception

Remember the good old days when computer criminals were tattooed, hoodie-wearing hackers? At the giant German automaker Volkswagen, the bad guys wear pinstripes and wingtips.

VW’s admission that it deliberately programmed the computers in 11 million cars to cheat on air pollution tests is among the worst crimes of the digital era, precisely because it’s an inside job, approved at the highest levels of a massive global corporation.

It’s also a warning to everybody who’s ever assumed that the stuff on your computer screen is the gospel truth. Our digital devices can easily be used to manipulate, mislead or flat-out lie to us. Humans have always lied to each other, and always will. But we’re used to sniffing out the crude analog deceptions of our fellow man. When computers skew reality, it can be done with such subtlety that we may never realize we’re being played.

In the Volkswagen case, we got lucky. Two years ago, an independent research organization, the International Council on Clean Transportation, began testing VW diesel cars sold in the United States. It found that the cars were very clean indeed, in the testing bay. But the same cars produced up to 40 times more pollution on the road. The reason? Engine control software that altered the engine’s performance to produce less toxic exhaust, but only during emissions tests.
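To see how such a cheat can hide in plain sight, consider this sketch. It is not VW’s actual code, which has never been published; it’s a hypothetical illustration, assuming the widely reported detail that defeat devices inferred a dynamometer test from signals such as a motionless steering wheel while the drive wheels spin. All function and variable names here are invented for illustration.

```python
# Hypothetical sketch of "defeat device" logic -- NOT Volkswagen's real code.
# Assumption: on a test dynamometer, the drive wheels spin while the
# steering wheel never moves and the non-driven wheels stay still.

def looks_like_emissions_test(steering_angle_deg, drive_wheel_kph, other_wheel_kph):
    """Guess whether the car is on a test dynamometer."""
    return (steering_angle_deg == 0
            and drive_wheel_kph > 0
            and other_wheel_kph == 0)

def exhaust_treatment_mode(steering_angle_deg, drive_wheel_kph, other_wheel_kph):
    """Run full pollution controls only when a test is suspected;
    otherwise favor power and fuel economy over clean exhaust."""
    if looks_like_emissions_test(steering_angle_deg, drive_wheel_kph, other_wheel_kph):
        return "full_nox_controls"
    return "reduced_nox_controls"
```

A few lines like these, buried among millions in an engine controller, are enough to make a car clean in the lab and dirty on the road: `exhaust_treatment_mode(0, 50, 0)` picks the clean mode, while the same car cruising on a highway, with its steering wheel and all four wheels in motion, quietly falls back to the dirty one.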

VW is already paying a steep price. Its stock fell by about one third between Friday and Tuesday; on Wednesday, chief executive Martin Winterkorn stepped down.

The scandal is one of many cases in which computers have skewed our view of the world. “In screen we trust,” said Marc Goodman, cybersecurity investigator and author of “Future Crimes,” a superb new book on the subject. “We’re all relying on these screens to show us information, but that information can be manipulated.”

Sometimes it’s with good intentions. When you run a Google search, your results will probably be different from mine. That’s because Google remembers your previous searches, the messages you’ve sent via its Gmail service, the videos you’ve watched on YouTube. It uses all this to find search results tailored to your interests. But there’s a downside. For instance, if you consistently click on left-wing or right-wing political news sites, Google may keep giving you more of the same, cutting out opposing viewpoints. Almost without realizing it, we can find ourselves trapped in an infinite loop of narrow-mindedness.

Other data manipulations are all about the bottom line. Google is fighting antitrust suits from the European Union and Russia, which say the company deliberately slants the results when people search for retailers. Shoppers are directed first to Google’s own shopping service or the company’s Zagat restaurant rating guide. Search results from Amazon.com or the hugely popular Yelp rating service get shoved down the page, where people are less likely to see them.

Google insists it’s doing no harm; both Amazon and Yelp continue to prosper. However, people expect Google, the “don’t be evil” company, to serve up unbiased information. Instead, when it comes to shopping, Google puts its thumb on the scale.

Yelp itself has been accused of lowering the ratings of retailers who don’t buy advertising on the site. Yelp denies the practice, but a federal appeals court in San Francisco ruled last year that even if Yelp did this, it would be perfectly legal.

And who can forget last year’s mini-scandal, when social network Facebook described an experiment to manipulate the moods of nearly 700,000 subscribers? Facebook simply modified the stories appearing on their news feeds. Half got a high percentage of upbeat, positive stories, while the other half saw gloomy, depressing fare. Sure enough, the messages posted by these users became accordingly more positive or negative. So the experiment worked. Hooray.

Or not. Millions of Facebookers freaked, appalled that a company with over a billion users could exercise so much power over people’s minds. Could it happen again? There’s no law against it.

Nearly everything we do is mediated through software. And nobody knows exactly what that software is doing except the people who wrote it. We’re pretty much at their mercy, forced to trust in their competence and honesty. Against this kind of inside job, there’s no defense except integrity, a commodity in short supply in Wolfsburg, Germany.

Hiawatha Bray is a technology reporter for the Boston Globe. E-mail him at h_bray@globe.com.