"We curate our lives around this perceived sense of perfection because we get rewarded in these short-term signals — hearts, likes, thumbs up — and we conflate that with value, and we conflate it with truth," he said. "And instead what it really is is fake, brittle popularity that's short-term and that leaves you even more — admit it — vacant and empty before you did it, because then it forces you into this vicious cycle where you're like, 'What's the next thing I need to do now because I need it back?'


If you have a moment, take a look at Facebookistan.

Posted by alexanderhayes

"Silicon Valley's utopians genuinely but mistakenly believe that more information and connection makes us more analytical and informed. But when faced with quinzigabytes of data, the human tendency is to simplify things. Information overload forces us to rely on simple algorithms to make sense of the overwhelming noise. This is why, just like the advertising industry that increasingly drives it, the internet is fundamentally an emotional medium that plays to our base instinct to reduce problems and take sides, whether like or don't like, my guy/not my guy, or simply good versus evil. It is no longer enough to disagree with someone, they must also be evil or stupid...

Nothing holds a tribe together like a dangerous enemy. That is the essence of identity politics gone bad: a universe of unbridgeable opinion between opposing tribes, whose differences are always highlighted, exaggerated, retweeted and shared. In the end, this leads us to ever more distinct and fragmented identities, all of us armed with solid data, a gutful of righteous anger and a digital network of like-minded people. This is not total connectivity; it is total division."

Source: http://www.newsweek.com/how-silicon-valley...

"Think about the computing systems you use every day. All of them represent attempts to simulate something else. Like how Turing's original thinking machine strived to pass as a man or woman, a computer tries to pass, in a way, as another thing. As a calculator, for example, or a ledger, or a typewriter, or a telephone, or a camera, or a storefront, or a cafe. After a while, successful simulated machines displace and overtake the machines they originally imitated. The word processor is no longer just a simulated typewriter or secretary, but a first-order tool for producing written materials of all kinds. Eventually, if they thrive, simulated machines become just machines. Today, computation overall is doing this. There's not much work and play left that computers don't handle. And so, the computer is splitting from its origins as a means of symbol manipulation for productive and creative ends, and becoming an activity in its own right. Today, people don't seek out computers in order to get things done; they do the things that let them use computers. [...] This new cyberpunk dystopia is more Stepford Wives, less William Gibson. Everything continues as it was before, but people treat reality as if it were in a computer."

Source: https://www.theatlantic.com/technology/arc...

"In a small recent study, researchers from New York University found that those who considered themselves in higher classes looked at people who walked past them less than those who said they were in a lower class did. The results were published in the journal of the Association for Psychological Science.

According to Pia Dietze, a social psychology doctoral student at NYU and a lead author of the study, previous research has shown that people from different social classes vary in how they tend to behave towards other people. So, she wanted to shed some light on where such behaviours could have originated. The research was divided into three separate studies.

For the first, Dietze and NYU psychology lab director Professor Eric Knowles asked 61 volunteers to walk along the street for one block while wearing Google Glass to record everything they looked at. These people were also asked to identify themselves as from a particular social class: either poor, working class, middle class, upper middle class, or upper class. An independent group watched the recordings and made note of the various people and things each Glass wearer looked at and for how long. The results showed that class identification, or what class each person said they belonged to, had an impact on how long they looked at the people who walked past them.

During Study 2, participants viewed street scenes while the team tracked their eye movements. Again, higher class was associated with reduced attention to people in the images.

For the third and final study, the results suggested that this difference could stem from the way the brain works rather than from a deliberate decision. Close to 400 participants took part in an online test in which they looked at alternating pairs of images, each containing a face and five objects. Higher-class participants took longer than lower-class participants to notice when the face differed between the alternating images, whereas the time it took to detect a change to one of the objects did not differ between the groups. The team concluded that faces seem to be more effective at grabbing the attention of individuals who come from relatively lower-class backgrounds."

Source: http://www.businessinsider.com.au/rich-peo...
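To make the paradigm concrete, here is a minimal, hypothetical sketch in Python of the timing logic behind a change-detection ("flicker") task of the kind the third study describes. It is not the study's actual code; show_image and check_for_response are stand-ins for whatever display and input machinery a real experiment framework would provide.

import time

# Purely illustrative: alternate two versions of a scene until the viewer reports
# the change, and return how long that took. Comparing latencies on face-change
# trials with latencies on object-change trials, across self-reported class
# groups, is the kind of measure the study describes.
def run_flicker_trial(show_image, check_for_response, image_a, image_b,
                      cycle_seconds=0.5, timeout_seconds=30.0):
    """Alternate image_a and image_b until the viewer responds; return latency in seconds."""
    start = time.monotonic()
    current = image_a
    while time.monotonic() - start < timeout_seconds:
        show_image(current)                    # display one version of the scene
        if check_for_response(cycle_seconds):  # True once the viewer reports the change
            return time.monotonic() - start
        current = image_b if current is image_a else image_a
    return None  # no detection within the timeout; the trial would be discarded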
Posted by Jordan Brown
The CIA claims to be able to predict social unrest days before it happens, thanks to powerful supercomputers of the kind the father of virtual reality, Jaron Lanier, has dubbed Siren Servers.

CIA Deputy Director for Digital Innovation Andrew Hallman announced that the agency has beefed up its “anticipatory intelligence” through the use of deep learning and machine learning servers that can process an incredible amount of data.

“We have, in some instances, been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability some I think as near as three to five days out,” said Hallman on Tuesday at the Federal Tech event, Fedstival.

This Minority Report-type technology has been viewed skeptically by policymakers, as the data crunching hasn’t been perfected; if policy were enacted on the basis of faulty data, the results could be disastrous. Iraq WMDs?

[...]

“I called it a siren server because there’s no plan to be evil. A siren server seduces you,” said Lanier.

In the case of the CIA, however, whether the agency is being innocently seduced or is actively planning to use this data for its own self-sustaining benefit, one can only speculate.

Given the Intelligence Community’s track record of toppling governments, infiltrating the mainstream media, running MK Ultra, and scanning hundreds of millions of private emails, that speculation becomes easier to justify.
Source: http://sociable.co/technology/cia-siren-se...

The Guardian's Julia Powles writes about how, with the advent of artificial intelligence and so-called "machine learning," society is increasingly a world where decisions are shaped more by calculations and data analytics than by traditional human judgement:

Jose van Dijck, president of the Dutch Royal Academy and the conference’s keynote speaker, expands: Datafication is the core logic of what she calls “the platform society,” in which companies bypass traditional institutions, norms and codes by promising something better and more efficient — appealing deceptively to public values, while obscuring private gain.

Van Dijck and peers have nascent, urgent ideas. They commence with a pressing agenda for strong interdisciplinary research — something Kate Crawford is spearheading at Microsoft Research, as are many other institutions, including the new Leverhulme Centre for the Future of Intelligence. There’s the old theory to confront, that this is a conscious move on the part of consumers and, if so, there’s always a theoretical opt-out. Yet even digital activists plot by Gmail, concedes Fieke Jansen of the Berlin-based advocacy organisation Tactical Tech. The Big Five tech companies, as well as the extremely concentrated sources of finance behind them, are at the vanguard of “a society of centralized power and wealth.”

“How did we let it get this far?” she asks. Crawford says there are very practical reasons why tech companies have become so powerful. “We’re trying to put so much responsibility on to individuals to step away from the ‘evil platforms,’ whereas in reality, there are so many reasons why people can’t. The opportunity costs to employment, to their friends, to their families, are so high,” she says.
Source: https://www.theguardian.com/technology/201...

This short video explores how the online world has overwhelmingly become the popular outlet for public rage, briefly illustrating some of the many stories of everyday people who have suddenly become public enemy number one under the most misunderstood of circumstances and the most trivial of narratives. With the web acting like a giant echo chamber, amplifying false stories and feeding on the pent-up aggression of the audience watching the spectacle, The Outrage Machine shows how these systems froth mob mentality into a hideous mess. It is a good example of where the spectacle goes, and of how its intensity has to keep ratcheting up to hold the audience's attention in a culture of dwindling attention spans, distraction and triviality.

Author and filmmaker Jon Ronson recently wrote a book about this topic, So You've Been Publicly Shamed, which is quite good. His TED talk is essentially a 17-minute overview:

And a longer presentation with interview and Q&A from earlier this year:

"I've found my kids pushing the virtual assistant further than they would push a human," says Avi Greengart, a tech analyst and father of five who lives in Teaneck, New Jersey. "[Alexa] never says 'That was rude' or 'I'm tired of you asking me the same question over and over again.'" Perhaps she should, he thinks. "One of the responsibilities of parents is to teach your kids social graces," says Greengart, "and this is a box you speak to as if it were a person who does not require social graces."

[...]

Alexa, tell me a knock-knock joke.

Alexa, how do you spell forest?

Alexa, what’s 17 times 42?

The syntax is generally simple and straightforward, but it doesn’t exactly reward niceties like “please.” Adding to this, extraneous words can often trip up the speaker’s artificial intelligence. When it comes to chatting with Alexa, it pays to be direct—curt even. “If it’s not natural language, one of the first things you cut away is the little courtesies,” says Dennis Mortensen, who founded a calendar-scheduling startup called x.ai.
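To see why the courtesies fall away, here is a toy, hypothetical sketch in Python of a rigid command grammar. It is nothing like Alexa's actual natural-language stack; it simply shows that when a parser is anchored to one exact phrasing, the curt version of a question works and the polite one does not, so users learn to cut the pleasantries.

import re

# Illustrative only: a single hard-coded pattern for "what's X times Y".
MULTIPLY = re.compile(r"^what's (\d+) times (\d+)\??$")

def parse(utterance: str) -> str:
    match = MULTIPLY.match(utterance.strip().lower())
    if match:
        a, b = (int(g) for g in match.groups())
        return f"{a} times {b} is {a * b}"
    return "Sorry, I don't know that."

print(parse("What's 17 times 42?"))                           # -> 17 times 42 is 714
print(parse("Could you please tell me what's 17 times 42?"))  # -> Sorry, I don't know that.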

For parents trying to drill good manners into their children, listening to their kids boss Alexa around can be disconcerting.


It’s this combination that worries Hunter Walk, a tech investor in San Francisco. In a blog post, he described the Amazon Echo as “magical” while expressing fears it’s “turning our daughter into a raging asshole.”

 

Source: http://qz.com/701521/parents-are-worried-t...

"....The problem here is that often people have a misunderstanding of what implantable technology really is. Rather than looking at the technology that currently exists today (and has existed for years), the general public tends to look to the media for examples of what implantable technology might be. We see examples of previously unheard of technology in movies and television and we come to recognize these futuristic devices as the definition of implantable technology. But this really is not the case. Indeed, there have been examples of implantable technology in society for many years, we just have not recognized these devices as such."

Read more - http://realityshifting.tumblr.com/post/102609492758/is-implantable-technology-really-futuristic

Referenced article: http://www.usatoday.com/story/tech/reviewed-com/2014/03/27/implantable-tech-is-the-next-wave/6914363/

Posted by alexanderhayes
Categories: uberveillance