“According to a BBC News report, data from pedestrian signal buttons may or may not have any real effect on SCOOT-controlled crossing timings, depending on their location and the time of day, and some junctions may be completely automated, with push-buttons which do not have any effect at all, effectively acting as placebo buttons. However, the same report quotes a Transport for London source as stating that the majority of pedestrian junctions in London do respond to the pedestrian signal button.”—Placebo Buttons - Split Cycle Offset Optimisation Technique - Wikipedia, the free encyclopedia
“I feel like it is legitimate to express concern about overuse of devices or social media and how it may alienate some, and I have just chosen to approach the subject from a different angle. The best possible scenario is for everyone, regardless of their varying optimism on the issue, to acknowledge that the new normal involves the pressures and benefits of multiple devices and an unprecedented amount of information flowing through us. There is nothing reactionary in acknowledging that this can be problematic, and it is our role as artists to offer insights as to how best to navigate this predicament. The only people I fundamentally disagree with are those who stubbornly ignore such issues altogether, dip out, and pretend like it’s 1989 or something. I guess the principal thing I stand for is educating oneself about the potentials and pitfalls of contemporary technology such that you can use it for positive ends. Debate around these issues is a crucial part of that.”—Web Exclusive: Interview with Holly Herndon - The Indy
CopTrax, a surveillance vendor, collaborated with the Byron police department to provide officers with Glass during routine traffic enforcement, patrol stops, the issuing of citations, arrests, and firearms practice.
"They had the CopTrax software loaded into the Google Glass and everything recorded with Glass was then recorded back to our camera system and police cruisers," Farris said.
The US government filed a lawsuit against telecom giant Sprint in civil court on Monday, accusing the company of overcharging the government for costs related to court-ordered wiretapping and surveillance.
Sprint, along with Verizon and AT&T, is regularly required to aid government investigations, doing so by facilitating phone surveillance known as pen registers, which capture metadata about a phone call, not its content. In exchange for standing at the ready, the companies are permitted to charge law enforcement agencies for “reasonable expenses” related to the investigation.
In the ’90s, it looked like the Internet might be an exception, that it could be a decentralizing, democratizing force. No one controlled it, no one designed it, it was just kind of assembling itself in an appealing, anarchic way. The companies that first tried to centralize the Internet, like AOL and Microsoft, failed risibly. And open source looked ready to slay any dragon.
But those days are gone. We’ve centralized the bejesus out of the Internet now. There’s one search engine (plus the one no one uses), one social network (plus the one no one uses), one Twitter. We use one ad network, one analytics suite. Anywhere you look online, one or two giant American companies utterly dominate the field.
And there’s the cloud. What a brilliant name! The cloud is the future of online computing, a friendly, fluffy abstraction that we will all ascend into, swaddled in light. But really the cloud is just a large mess of servers somewhere, the property of one American company (plus the clouds no one uses).
Orwell imagined a world with a telescreen in every room, always on, always connected, always monitored. An Xbox One vision of dystopia.
But we’ve done him one better. Nearly everyone here carries in their pocket a tracking device that knows where you are, who you talk to, what you look at, all these intimate details of your life, and sedulously reports them to private servers where the data is stored in perpetuity.
I know I sound like a conspiracy nut framing it like this. I’m not saying we live in an Orwellian nightmare. I love New Zealand! But we have the technology.
When I was in grade school, they used to scare us with something called the permanent record. If you threw a spitball at your friend, it would go in your permanent record, and prevent you getting a good job, or marrying well, until eventually you’d die young and friendless and be buried outside the churchyard wall.
What a relief when we found out that the permanent record was a fiction. Except now we’ve gone and implemented the damned thing. Each of us leaves an indelible, comet-like trail across the Internet that cannot be erased and that we’re not even allowed to see.
The things we really care about seem to disappear from the Internet immediately, but post a stupid YouTube comment (now linked to your real identity) and it will live forever.
And we have to track all this stuff, because the economic basis of today’s web is advertising, or the promise of future advertising. The only way we can convince investors to keep the money flowing is by keeping the most detailed records possible, tied to people’s real identities. Apart from a few corners of anonymity, which not by accident are the most culturally vibrant parts of the Internet, everything is tracked and has to be tracked or the edifice collapses.
What upsets me isn’t that we created this centralized version of the Internet based on permanent surveillance.
What upsets me, what really gets my goat, is that we did it because it was the easiest thing to do. There was no design, forethought, or analysis involved. No one said “hey, this sounds like a great world to live in, let’s make it”. It happened because we couldn’t be bothered.
Making things ephemeral is hard.
Making things distributed is hard.
Making things anonymous is hard.
Coming up with a sane business model is really hard—I get tired just thinking about it.
So let’s take people’s data, throw it on a server, link it to their Facebook profiles, keep it forever, and if we can’t raise another round of venture funding we’ll just slap Google ads on the thing.
"High five, Chad!"
"High five, bro!"
That is the design process that went into building the Internet of 2014.
And of course now we are shocked—shocked!—when, for example, the Ukrainian government uses cell tower data to send scary text messages to protesters in Kiev, in order to try to keep them off the streets. Bad people are using the global surveillance system we built to do something mean! Holy crap! Who could have imagined this?
Or when we learn that the American government is reading the email that you send unencrypted to the ad-supported mail service in another country where it gets archived forever. Inconceivable!
I’m not saying these abuses aren’t serious. But they’re the opposite of surprising. People will always abuse power. That’s not a new insight. There are cuneiform tablets complaining about it. Yet here we are in 2014, startled because unscrupulous people have started to use the powerful tools we created for them.
We put so much care into making the Internet resilient to technical failure, but make no effort to make it resilient to political failure. We treat freedom and the rule of law like inexhaustible natural resources, rather than the fragile and precious treasures that they are.
And now, of course, it’s time to make the Internet of Things, where we will connect everything to everything else, and build cool apps on top, and nothing can possibly go wrong.
“German executives and intelligence officials called Mr. Snowden a hero and said his disclosures had been a boon for business, as N.S.A. suspicions prompted global companies to look for alternatives to American products and services. One German executive said that many clients who had considered moving their services to the cloud were now looking to store their data on hardware inside Germany, given that “the U.S. owns the cloud.””—At the RSA Security Conference, Things Get Testy and Then They Get Awkward - NYTimes.com
While I am far from a Luddite who fetishizes a life without tech, we need to consider the consequences of this latest batch of apps and tools that remind us to contact significant others, boost our willpower, provide us with moral guidance, and encourage us to be civil. Taken together, we’re observing the emergence of tech that doesn’t just augment our intellect and lives — but is now beginning to automate and outsource our humanity.
But let’s take a concrete example. Instead of doing the professorial pontification thing we tech philosophers are sometimes wont to do, I talked to the makers of BroApp, a “clever relationship wingman” (their words) that sends “automated daily text messages” to your significant other. It offers the promise of “maximizing” romantic connection through “seamless relationship outsourcing.”
Now, it’s perfectly possible that this app is a parody (the promo video includes bitcoin creator Satoshi Nakamoto and feminist voice Germaine Greer among the demo contacts), and its creators “James” and “Tom” didn’t share their last names with me. But my 29-year-old interlocutors — one who apparently has a degree in Engineering and Mathematics, the other in Design and Applied Finance — had clearly thought deeply about why relationship management tools are socially desirable and will be increasingly integrated into our everyday lives.
Drawn here and shared with permission is their rationale, which I believe goes beyond just this one app. So even if it’s a parody (indeed, sadly “we can’t tell”), it captures a real automation-app trend and widely held convictions in the tech community we need to pay attention to.
“Like it or not, there’s a Secret Language of Domesticity. In technology terms, it’s the equivalent of “viewing source”: it’s not intentionally secret, it’s just easy to ignore if you’re not interested or don’t understand it. It’s the thing that creates the persistent rhythms of the home, and it’s passed down – by and large – from mother to daughter.”—Domestic Folklore, or Washing Machines for Men | fabric of things
“If you don’t listen to Google’s robot car, it will yell at you. I’m not kidding: I learned that on my test-drive at a Stanford conference on vehicle automation a couple weeks ago. The car wanted its human driver to retake the wheel, since this particular model wasn’t designed to merge lanes. If we ignored its command a third time, I wondered, would it pull over and start beating us like an angry dad from the front seat? Better to not find out.”—The Ethics of Saving Lives With Autonomous Cars Are Far Murkier Than You Think | Wired Opinion | Wired.com (via iamdanw)
A startup is developing machine-learning technology that mimics the way the ear works, which the company believes will make it easier for smartphones and wearable devices to listen constantly for sounds of danger.
One Llama will show some of its capabilities in an app called Audio Aware, which is meant to alert hard-of-hearing smartphone users and “distracted walkers” (an issue previously explored in “Safe Texting While Walking? Soon There May be an App for That”). The app, planned for release in March, will run in the background on an Android smartphone, detecting sounds like screeching tires and wailing sirens and alerting you to them by interrupting the music you’re listening to, for instance. The app will arrive with knowledge of a number of perilous sounds, and users will be able to add their own sounds to the app and share them with other people.
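The article doesn’t describe how One Llama’s ear-inspired models actually work, but the core idea of recognizing a known set of perilous sounds, with room for users to add their own, can be sketched as template matching on audio fingerprints. Everything below (the synthetic signals, the spectrum fingerprint, the threshold, the `classify` helper) is invented for illustration, not One Llama’s method:

```python
# Toy sketch of the kind of matching a danger-sound alert app might do:
# fingerprint known sounds by their magnitude spectra, then compare any
# incoming clip against the library by cosine similarity.
import numpy as np

RATE = 8000  # samples/second for our synthetic clips

def spectrum(clip):
    """Normalized magnitude spectrum: a crude 'fingerprint' of a sound."""
    mag = np.abs(np.fft.rfft(clip * np.hanning(len(clip))))
    return mag / (np.linalg.norm(mag) + 1e-12)

def make_tone(freq, seconds=1.0):
    """A pure sine tone, standing in for a tonal sound like a siren."""
    t = np.arange(int(RATE * seconds)) / RATE
    return np.sin(2 * np.pi * freq * t)

# Stand-ins for "known perilous sounds": a 700 Hz siren-like tone and a
# broadband screech approximated by white noise. A real library could let
# users record and share their own templates, as the article describes.
rng = np.random.default_rng(0)
templates = {
    "siren": spectrum(make_tone(700)),
    "screech": spectrum(rng.standard_normal(RATE)),
}

def classify(clip, threshold=0.5):
    """Return the best-matching danger label, or None if nothing is close."""
    fp = spectrum(clip)
    label, score = max(
        ((name, float(fp @ ref)) for name, ref in templates.items()),
        key=lambda pair: pair[1],
    )
    return label if score >= threshold else None

# A noisy 700 Hz tone should still match the siren template.
incoming = make_tone(700) + 0.1 * rng.standard_normal(RATE)
print(classify(incoming))  # expect "siren"
```

A production system would of course run continuously on microphone input and use far more robust features than a raw spectrum, but the shape of the problem, a small library of reference sounds matched against a live stream, is the same.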
“Our choice is not between “regulation” and “no regulation.” The code regulates. It implements values, or not. It enables freedoms, or disables them. It protects privacy, or promotes monitoring. People choose how the code does these things. People write the code. Thus the choice is not whether people will decide how cyberspace regulates. People—coders—will. The only choice is whether we collectively will have a role in their choice—and thus in determining how these values regulate—or whether collectively we will allow the coders to select our values for us.”—Lawrence Lessig on the increasing regulation of cyberspace | Harvard Magazine Jan-Feb 2000
The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense. Over the past two years, computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronics Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbé, say that they are now removing the papers.
Among the works were, for example, a paper published as a proceeding from the 2013 International Conference on Quality, Reliability, Risk, Maintenance, and Safety Engineering, held in Chengdu, China. (The conference website says that all manuscripts are “reviewed for merits and contents”.) The authors of the paper, entitled ‘TIC: a methodology for the construction of e-commerce’, write in the abstract that they “concentrate our efforts on disproving that spreadsheets can be made knowledge-based, empathic, and compact”. (Nature News has attempted to contact the conference organizers and named authors of the paper but received no reply; however at least some of the names belong to real people. The IEEE has now removed the paper).
“Long years have passed.
I think of goodbye.
Locked tight in the night
I think of passion;
Drawn to for blue, the night
During the page
My shattered pieces of life
watching the joy
shattered pieces of love
My shattered pieces of love
gone stale.”—"Long years have passed", current leader in "Most human-like computer poems" on Leaderboard | bot or not, “a Turing test for poetry. You, the judge, have to guess whether the poem you’re reading is written by a human or by a computer”, via @goto80
Crowdworking is often hailed by its boosters as ushering in a new age of work. With the zeal of high-tech preachers, they cast it as a space in which individualism, choice and self-determination flourish. “CrowdFlower, and others in the crowdsourcing industry, are bringing opportunities to people who never would have had them before, and we operate in a truly egalitarian fashion, where anyone who wants to can do microtasks, no matter their gender, nationality, or socio-economic status, and can do so in a way that is entirely of their choosing and unique to them,” asserts Lukas Biewald, the CEO of CrowdFlower, in an e-mail exchange. (CrowdFlower claims to have “among the largest, if not the largest, crowd” available, with roughly 100,000 workers completing tasks on any given day.)
But if you happen to be a low-end worker doing the Internet’s grunt work, a different vision arises. According to critics, Amazon’s Mechanical Turk may have created the most unregulated labor marketplace that has ever existed. Inside the machine, there is an overabundance of labor, extreme competition among workers, monotonous and repetitive work, exceedingly low pay and a great deal of scamming. In this virtual world, the disparities of power in employment relationships are magnified many times over, and the New Deal may as well have never happened.
As Miriam Cherry, one of the few legal scholars focusing on labor and employment law in the virtual world, has explained: “These technologies are not enabling people to meet their potential; they’re instead exploiting people.” Or, as CrowdFlower’s Biewald told an audience of young tech types in 2010, in a moment of unchecked bluntness: “Before the Internet, it would be really difficult to find someone, sit them down for ten minutes and get them to work for you, and then fire them after those ten minutes. But with technology, you can actually find them, pay them the tiny amount of money, and then get rid of them when you don’t need them anymore.”
Today the National Highway Traffic Safety Administration officially published two recall announcements, one from Tesla Motors and one from GM. Both relate to problems that could cause fires. In the case of GM, trucks left idling can overheat and catch fire; eight fires have been reported. In Tesla’s case, an overheating charger plug seems to have been the cause of a fire in a garage (it’s not clear whether the problem had to do with miswiring of the wall charger, damage to the plug, or something else).
Both problems can be addressed with software updates: in Tesla’s case, the software detects charging problems and decreases charging rates to avoid overheating (GM hasn’t provided details). Owners of 370,000 Chevrolet Silverado and GMC Sierra pickups will need to find time to take their trucks to the dealer to get the software updated. But because Tesla can send software updates to its vehicles wirelessly, the 29,222 affected Model S electric cars have already been fixed.
Your favorite basketball player is about to get one step closer to being a cyborg.
The NBA’s Development League (D-League) will soon begin experimenting with wearable technology on the court, the league announced today. A small disc weighing in at a whopping one ounce—attached either to players’ chests or between their shoulder blades and worn underneath their uniforms—measures vital biological statistics.
Developed in conjunction with STAT Sports, Catapult, and Zephyr, this groundbreaking wearable tech makes available—in real time—individual players’ current state and statistics. The information is relayed to coaching and medical staffs alike in an effort to improve players’ efficiency and effectiveness on the court.
“Researchers at Australia’s Flinders University showed twenty participants smiley faces, along with real faces and strings of symbols that shouldn’t look like faces, all while recording the signals in the region of the brain that’s primarily activated when we see faces. This signal, called the N170 event-related potential, is the highest when people see actual faces, but was also high when people saw the standard emoticon :). “This indicates that when upright, emoticons are processed in occipitotemporal sites similarly to faces due to their familiar configuration,” the researchers write.”—Your brain no longer knows the difference between emoticons and emotion, via @gridinoc