But I've lately undergone a crisis of confidence: I find it hard to hit the road without consulting my phone. And while I'd like to think the recommended route (from Google, Waze, Hopstop, etc.) is just one influence among many—that I have other preferences their algorithms can't perceive—I'm not too proud to confess that I trust the computer more than I trust myself. The habits, hubris, and quirky predilections that once manipulated my movements are being replaced by the judgments of artificial intelligence.
In this I'm not alone. The rise of mobile navigation technology has, in just a few years, transformed the way we get around cities. In 2011, 35 percent of Americans had smartphones; by 2013, that had grown to 61 percent. Three-quarters of those people now use their phones for directions and location-based services. One in five Americans used the Google Maps app in June; one in eight used Apple Maps. Tens of millions more rely on car-based modules hitched to the satellites of the Global Positioning System.
That is dumbfounding progress. The full precision of GPS was made public only 15 years ago, and as recently as the early 2000s, GPS was considered a tool of "sailors, hikers and other outdoors enthusiasts." Today, nearly every mobile app employs it. Radio traffic reports feel as antiquated as floppy disks.
News is now not just outside newspapers, it is outside newsrooms. It is impossible for humans to filter efficiently the vast numbers of images, videos, tweets and updates created and shared by humans, bots and devices. By 2020, according to consultants Gartner, there will be 20bn devices connected to the internet, and they will all have something to say for themselves. Facebook, Instagram, Twitter, WhatsApp and whatever comes next are making editorial decisions on our behalf, and will continue to do so. Costolo's first editorial stance is significant because he was public and unapologetic about removing material that he felt did cultural and economic damage to Twitter. The Facebook algorithm, and other sorting processes, are both more opaque and less accountable. The decline of the newspaper, and the subsequent closure or shrinking of newsrooms, not only leaves news unbound; it also removes the culture of editorial filtering. Centuries of human debate over cultural values, expressed in everything from intrusive splashes to grandiose editorials, are disappearing, to be replaced by a black box.
Accountability is not part of Silicon Valley’s culture. But surely as news moves beyond paper and publisher, it must become so. For a decade or more, news organisations have been obeisant to the power of corporate technology, nodding and genuflecting at the latest improbably impressive magic. But their editorial processes have something to offer technologists too.
Transparency and accountability have to accompany the vast, important role our key information providers now play in society. It is understandable why platforms such as Facebook strenuously resist being labelled as “publishers”, but it is no longer realistic. It takes very little narrative imagination to grasp the ethical complexities ahead; every policeman wearing a camera, every terror cell with a Twitter feed, every face in a crowd rendered recognisable.
Sheepdogs could lose their jobs to robots after scientists learned the secret of their herding ability.
Rounding up sheep successfully is a simple process involving just two basic mathematical rules, a study found.
One causes a sheepdog to close any gaps it sees between dispersing sheep. The other results in sheep being driven forward once the gaps have sufficiently closed.
A computer simulation showed that obeying these two rules alone allowed a single shepherd – or sheepdog – to control a flock of more than 100 animals.
The discovery has implications for human crowd control as well as the development of robots that can gather and herd livestock, the scientists said. […]
To conduct the study, the researchers fitted a flock of sheep and a sheepdog with backpacks containing highly accurate GPS satnavs.
Movement-tracking data from the devices was programmed into computer simulations to develop the mathematical shepherding model.
Writing in the Journal of the Royal Society Interface, the researchers concluded: “Our approach should support efficient designs for herding autonomous, interacting agents in a variety of contexts.
"Obvious cases are robot-assisted herding of livestock, and keeping animals away from sensitive areas, but applications range from control of flocking robots, cleaning up of environments and human crowd control."
When I close my laptop, it goes to sleep. It’s a curiously domestic metaphor but it also implies that sleep in humans and other animals is just a kind of low-power standby mode. (Do computers dream of electric sleep?) Last year, Apple announced a twist on this idea: a new feature for the Mac operating system called “Power Nap”. Using Power Nap, your computer can do important things even while asleep, receiving updates and performing backups.
The name Power Nap comes from the term describing the thrusting executive’s purported ability to catch a restorative forty winks in 20 minutes, but the functioning of Apple’s feature symbolically implies a yet more ultra-modern and frankly inhuman aspiration: to be “productive” even while dozing. It is the uncanny technological embodiment of the dream most blatantly sold to us by those work-from-home scams online, which promise that you can “make money even while you sleep”.
Sleep, indeed, is a standing affront to capitalism. That is the argument of Jonathan Crary’s provocative and fascinating essay, which takes “24/7” as a spectral umbrella term for round-the-clock consumption and production in today’s world. The human power nap is a macho response to what Crary notes is the alarming shrinkage of sleep in modernity. “The average North American adult now sleeps approximately six and a half hours a night,” he observes, which is “an erosion from eight hours a generation ago” and “ten hours in the early 20th century”.
Back in 1996, Stanley Coren’s book Sleep Thieves blamed insufficient rest for industrial disasters such as the Chernobyl meltdown. Crary is worried about the encroachment on sleep because it represents one of the last remaining zones of dissidence, of anti-productivity and even of solidarity. Isn’t it quite disgusting that, as he notices, public benches are now deliberately engineered to prevent human beings from sleeping on them?
While Apple-branded machines that take working Power Naps are figured as a more efficient species of people, people themselves are increasingly represented as apparatuses to be acted on by machines. Take the popular internet parlance of getting “eyeballs”, which means reaching an audience. “The term ‘eyeballs’ for the site of control,” Crary writes, “repositions human vision as a motor activity that can be subjected to external direction or stimuli … The eye is dislodged from the realm of optics and made into an intermediary element of a circuit whose end result is always a motor response of the body to electronic solicitation.”
You can’t get more “eyeballs” if the people to whose brains the eyeballs are physically connected are asleep. Hence the interest – currently military; before long surely commercial, too – in removing our need for sleep with drugs or other modifications. Then we would be more like efficient machines, able to “interact” with (or labour among) electronic media all day and all night. (It is strange, once you think about it, that the phrase “He’s a machine” is now supposed to be a compliment in the sporting arena and the workplace.)
Many of us cannot help looking because of what Susan Sontag called “the perennial seductiveness of war.” It is a kind of rubbernecking, staring at the bloody aftermath of something that is not an act of God but of man. The effect, as Ms. Sontag pointed out in an essay in The New Yorker in 2002, is anything but certain.
“Making suffering loom larger, by globalizing it, may spur people to feel they ought to ‘care’ more,” she wrote. “It also invites them to feel that the sufferings and misfortunes are too vast, too irrevocable, too epic to be much changed by any local, political intervention.”
So now that war comes to us in real time, do we feel helpless or empowered? Do we care more, or will the ubiquity of images and information desensitize us to the point where human suffering loses meaning when it is part of a scroll that includes a video of your niece twerking? Oh, we say as our index finger navigates to the next item, another one of those.
As war becomes a more remote, mechanized activity, posts and images from the target area have significant value. When a trigger gets pulled or bombs explode, real people are often on the wrong end of it. And bearing witness to the consequences gives meaning to what we see.
“So, what’s the trade-off here? In general, we are safer (automation makes airline flying safer, in general) except in the long-tail: pilots are losing both tacit knowledge of flying and some of its mechanics. But in general, we, as humans, have less and less understanding of our machines—we are compartmentalized, looking at a tiny corner of a very complex system beyond our individual comprehension. Increasing numbers of our systems—from finance to electricity to cybersecurity to medical systems—are going in this direction. We are losing control and understanding which seems fine—until it’s not. We will certainly, and unfortunately, find out what this really means because sooner or later, one of these systems will fail in a way we don’t understand.”—Failing the Third Machine Age: When Robots Come for Grandma — The Message — Medium
“We’ve seen some less-radical attempts to destroy technology in the real world in recent months, mainly in the form of attacks on people wearing Glass or flying drones, or the drone on its own (by hockey fans who reportedly and incorrectly thought it belonged to the LAPD). As in the movie, the destroyers haven’t been identified or punished, with one exception: Andrea Mears, 23, was charged with third degree assault for attacking a teen boy, Austin Haughwout, 17, flying a drone on a Connecticut beach. She got probation this week, as noted by comprehensive drone chronicler Greg McNeal. It’s easy to call these people Luddites, after the British workers who set about destroying machines — and in some cases killing the people who owned them — in the late 1700s and early 1800s in a futile attempt to turn back the tide of mechanization. It led Britain to pass a law making machine-wrecking punishable by death. But the new machine destroyers’ motivations are different. The original Luddites were worried machines would take their jobs; the Neo-Luddites fear machines will steal their privacy.”—The Violent Opt-out: The Neo-Luddites Attacking Drones And Google Glass - Forbes
Unsurprisingly, the blame game is now playing out on Wikipedia, where editors battle to record the polemics that best reflect their side of the story. Earlier this morning, the Russian-language Wikipedia entry for commercial aviation accidents hosted one such skirmish, when someone with an IP address based in Kyiv edited the MH17 record to say that the plane was shot down “by terrorists of the self-proclaimed Donetsk People’s Republic with Buk system missiles, which the terrorists received from the Russian Federation.” Less than an hour later, someone with a Moscow IP address replaced this text with the sentence, “The plane was shot down by Ukrainian soldiers.”
Thanks to a Twitter bot that tracks anonymous Wikipedia edits made from IP addresses used by the Russian government, we know that the second edit to the MH17 article came from a computer at VGTRK, the All-Russia State Television and Radio Broadcasting Company.
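Bots of this kind are straightforward in principle: MediaWiki's recent-changes feed attributes anonymous edits to the editor's IP address, so flagging a suspicious edit reduces to checking that address against a list of network ranges. A minimal sketch, using the standard `ipaddress` module, with documentation-only CIDR blocks standing in for any real government ranges (the feed-polling itself is omitted):

```python
import ipaddress

# Example CIDR blocks standing in for ranges attributed to government
# networks. These particular blocks are RFC 5737 documentation ranges,
# made up for illustration.
WATCHED_RANGES = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/24",
    "198.51.100.0/24",
)]

def flag_edit(edit):
    """Return a report line if an anonymous edit came from a watched range.

    `edit` mimics a recent-changes record: for anonymous edits, the
    `user` field holds the editor's IP address rather than a username.
    """
    try:
        ip = ipaddress.ip_address(edit["user"])
    except ValueError:
        return None  # a registered username, not an anonymous IP edit
    for net in WATCHED_RANGES:
        if ip in net:
            return f"{edit['title']} edited anonymously from {ip} ({net})"
    return None
```

An edit from inside a watched block produces a report line that a bot could tweet; everything else is silently ignored.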
Since the violence escalated on July 7, there have been 209 Palestinian casualties to a single Israeli killed by mortar shrapnel. (The Palestinian equivalent to something like Red Alert would make your phone vibrate consistently but softly—enough that it can’t be ignored, but at a volume inaudible to everyone around you.)
None of this is meant to detract from the danger that the rockets pose to Israelis who live within firing range, as their fear is real. For the Israeli families in Sderot, Ashkelon, or Be’er Sheva (where I once lived), Red Alert is palliative.
But Red Alert commodifies the pain of war, and helps render invisible its toll on Palestinians. It turns the conflict into a monetized app, with Google-powered ads scrolling at the top of the screen and furious, scattershot comments crowding at the bottom. Red Alert, in addition to assisting Israelis on the ground and gathering advertising dollars, serves the purpose of a government that has the privilege of being able to sufficiently protect its citizens. The people of Gaza have no such luxury.
“‘Welcome to the Future of Air Conditioning’, says a poster at Venice airport, straight after passport control. Next to the words is an image of a composite Shanghai/Dubai-like city, made of sealed towers of the kind that would be impossible without artificial air. Any association with this year’s Rolex-sponsored Venice Biennale of Architecture is coincidental, but the poster is an eloquent exhibit of the event’s main theme. This is: thousands of years of architectural history are being changed utterly by modern techniques of constructing and servicing buildings which, predetermined by technical considerations, make architects marginal to their making. If, for example, a fireplace was once an occasion for social gathering and ornamental embellishment, there are now sensors that can track an individual and provide heating specific to that one person. The provision of heat becomes a solitary, dematerialised and invisible affair.”—2014 Venice Architecture Biennale review: put yourself in their space… | Art and design | The Observer
“I’ve made a bot that ‘likes’ everything on Facebook,” said Julien Deswaef […]
While it sounds like an easy project to execute, it turns out that Facebook has its own scripts programmed to penalise ruthless automation. Because of this, Julien has had to mimic the sporadic interactions of humans to keep the bot under the radar. The artist has also had to forfeit his own Facebook account to the bot — you could interpret this as performance art, but Julien calls it software art. Many of his friends instantly complained about having everything liked by him. I follow him/it on Facebook, and yes it’s frustrating, but it is only irritating because it holds up a mirror to how pathetic your Facebook life really is; the bot likes every single mundane trace you leave on the site.
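One plausible way to mimic those sporadic human interactions is to randomise the gaps between actions rather than clicking on a fixed timer. This sketch is an assumption about how such pacing might work, not Julien's actual code: it mixes short, bursty gaps with occasional long absences, since a metronomic rhythm is the most obvious bot signature.

```python
import random

def humanized_delays(n, rng=None):
    """Generate n wait times (seconds) that avoid a machine-regular rhythm.

    Most gaps are short and bursty (exponential, mean ~20s, like someone
    scrolling a feed); roughly one in ten is a long pause, as if the
    person wandered away from the site for a while.
    """
    rng = rng or random.Random()
    delays = []
    for _ in range(n):
        if rng.random() < 0.1:
            delays.append(rng.uniform(600, 3600))   # walked away for a while
        else:
            delays.append(rng.expovariate(1 / 20))  # quick successive clicks
    return delays
```

A bot would then `time.sleep()` for each delay before its next "like", so its activity log shows an irregular, human-looking cadence instead of one action every N seconds.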
“In fact, it’s all about the butts. Because players see their avatars from a third-person perspective from behind, men are confronted with whether they want to stare at a guy’s butt or a girl’s butt for 20 hours a week. Or as the study authors put it in more academic prose, gender-switching men “prefer the esthetics of watching a female avatar form.” This means that gender-switching men somehow end up adopting a few female speech patterns even though they had no intention of pretending to be a woman.”—World of Warcraft gender switching: Why men choose female avatars. (via jomc)
A Hong Kong VC fund has just appointed an algorithm to its board.
Deep Knowledge Ventures, a firm that focuses on age-related disease drugs and regenerative medicine projects, says the program, called VITAL, can make investment recommendations about life sciences firms by poring over large amounts of data.
Just like other members of the board, the algorithm gets to vote on whether the firm makes an investment in a specific company or not. The program will be the sixth member of DKV’s board.
“By analogy, the opening of Charles Dickens’ A Tale of Two Cities is nothing but a string of short phrases. Yet no one could contend that this portion of Dickens’ work is unworthy of copyright protection because it can be broken into those shorter constituent components.”—Judge Kathleen O’Malley compares the short names such as “java.lang.ref” and “java.lang.reflect,” which Oracle uses to name the APIs, to great works of literature - Tech world stunned as court rules Oracle can own APIs, Google loses copyright appeal — Tech News and Analysis
By 1 p.m., Philip would leave the small yellow house in Silver Spring where he lived alone. He walked a half-block, waited for the No. 5 bus, took it to his job as a taxi dispatcher, returned home, cooked a late dinner, watched Charlie Rose and went to sleep. He never locked his front door and often left it wide open. Part was defiance. This is how I live. Part was warmth. Anyone is welcome.
One February night, someone came inside — someone Philip may have known — and beat him to death. The case remains Montgomery’s only unsolved killing this year.
Philip seemed to have no secrets and no enemies. And he left behind no electronic footprints — the text messages, e-mails, cellphone logs and social-media traffic that police routinely use these days as they seek out unknown quarrels and final movements.
“Those records usually help,” said Capt. Marcus Jones, commander of Montgomery County’s major-crimes division. “We don’t have any of that.”
For Philip’s family and friends, the case brings a terrible possibility: Could everything that made the lifelong bachelor so unique, so stubborn, so confounding, so wonderful — a life rooted in rejection of instant communication — be allowing his killer to get away with it?