The American Scholar: Saving the Self in the Age of the Selfie – James McWilliams

In 2012, Paul Miller, a 26-year-old journalist and former writer for The Verge, began to worry about the quality of his thinking. His ability to read difficult studies or to follow intricate arguments demanding sustained attention was lagging. He found himself easily distracted and, worse, irritable about it. His longtime touchstone—his smartphone—was starting to annoy him, making him feel insecure and anxious rather than grounded in the ideas that formerly had nourished him. “If I lost my phone,” he said, he’d feel “like I could never catch up.” He realized that his online habits weren’t helping him to work, much less to multitask. He was just switching his attention all over the place and, in the process, becoming a bit unhinged.

Subtler discoveries ensued. As he continued to analyze his behavior, Miller noticed that he was applying the language of nature to digital phenomena. He would refer, for example, to his “RSS feed landscape.” More troubling was how his observations were materializing not as full thoughts but as brief Tweets—he was thinking in word counts.

When he realized he was spending 95 percent of his waking hours connected to digital media in a world where he “had never known anything different,” he proposed to his editor a series of articles that turned out to be intriguing and prescriptive. What would it be like to disconnect for a year? His editor bought the pitch, and Miller, who lives in New York, pulled the plug.

For the first several months, the world unfolded as if in slow motion. He experienced “a tangible change in my ability to be more in the moment,” recalling how “fewer distractions now flowed through my brain.” The Internet, he said, “teaches you to expect instant gratification, which makes it hard to be a good human being.” Disconnected, he found a more patient and reflective self, one more willing to linger over complexities that he once clicked away from. “I had a longer attention span, I was better able to handle complex reading, I did not need instant gratification, and,” he added somewhat incongruously, “I noticed more smells.” The “endless loops that distract you from the moment you are in,” he explained, diminished as he became “a more reflective writer.” It was an encouraging start.

But if Miller became more present-minded, nobody else around him did. “People felt uncomfortable talking to me because they knew I wasn’t doing anything else,” he said. Communication without gadgets proved to be a foreign concept in his peer world. Friends and colleagues—some of whom thought he might have died—misunderstood or failed to appreciate Miller’s experiment.

Plus, given that he had effectively consigned himself to offline communications, all they had to do to avoid him was to stay online. None of this behavior was overtly hostile, all of it was passive, but it was still a social burden reminding Miller that his identity didn’t thrive in a vacuum. His quality of life eventually suffered.

What we do about this predicament may turn out to answer one of this century’s biggest questions. A list of user-friendly behavioral tips—a Poor Richard’s Almanack for achieving digital virtue—would be nice.

But this problem eludes easy prescription. The essence of our dilemma, one that weighs especially heavily on Generation Xers and millennials, is that the digital world disarms our ability to oppose it while luring us with assurances of convenience. It’s critical not only that we identify this process but also that we fully understand how digital media co-opt our sense of self while inhibiting our ability to reclaim it. Only when we grasp the inner dynamics of this paradox can we be sure that the Paul Millers of the world—or others who want to preserve their identity in the digital age—can form technological relationships in which the individual determines the use of digital media rather than the other way around.

The Unbearable Homogeneity of Design — Medium

Section 1: What The Fuck Are We Doing, Tho?
Call it the Dribbblization of design, but we’re all making more or less the same thing.

Certainly, design should follow some basic paradigms to make whatever we’re designing easy to use. All scissors look fundamentally the same because that’s what works.

But digital design—whether it’s for desktop, mobile, VR, games, whatever—is still relatively young. We simply do not know what the best solutions are. At best, we’ve reached a local maximum. And so long as we reward predictable designs, we will never move past this local maximum.

“Empathy” in design doesn’t just mean designing for marginalized people. It can simply mean designing with the understanding that there are other aesthetics and world views than yours.

Why Do I Have to Call This App ‘Julie’? – The New York Times

  • Why does artificial intelligence need a gender?
  • The latest technology is stuck in the oldest stereotypes.

And why does artificial intelligence need a gender at all? Why not imagine a talking cat or a wise owl as a virtual assistant? I would trust an anthropomorphized cartoon animal with my calendar. Better yet, I would love to delegate tasks to a non-binary gendered robot alien from a galaxy where setting up meetings over email is respected as a high art.

Technologies speak with recorded feminine voices because women “weren’t normally there to be heard,” Helen Hester, a media studies lecturer at the University of West London, told me. A woman’s voice stood out. For example, an automated recording of a woman’s voice used in cockpit navigation becomes a beacon, a voice in stark contrast with that of everyone else, when all the pilots on board are men.

…

The product is an interesting idea and easy to use, but interacting with a fake woman assistant just feels too weird. So I shut “her” off. This Stepford app, designed to make my work more efficient, only reminds me of the gendered division of labor that I’m trying to escape.

Watch Data Attack | Are You Addicted To Your Phone? | WIRED Video | CNE

On a typical day, the average person checks their phone 85 times. In total, we spend about 5 hours on our phones each day. Here we explore the fine line between normal phone use and device addiction.

‘The Book of Lost Books,’ by Stuart Kelly – New York Times

“The Book of Lost Books” concerns itself with two main subjects: books that have disappeared, either through negligence, deliberate destruction or the vicissitudes of history; and books that never got written in the first place. Ranging over authors as famous as Homer, Hemingway, Austen and Aristophanes, it also contains chapters devoted to non-marquee names like Widsith the Wide-Traveled, Fulgentius, Ahmad ad-Daqiqi and Faltonia Betitia Proba.

Each chapter contains abundant biographical information about the author in question, then proceeds to explain how one or more of his or her books was lost, stolen, mutilated, bowdlerized, incinerated or abandoned.

Kelly seems to grudgingly accept that we are lucky so much great literature has survived, but that we would be a whole lot luckier if, nearly a millennium and a half ago, cultural pyromaniacs had refrained from burning down the library at Alexandria, where the only complete copies of Aeschylus’ 80 plays had been housed for a thousand years.

Occasionally Kelly gets lost inside his sentences; it’s anyone’s guess what he’s ranting about early in the book when he repeats the accusation by Lasus of Hermione that Onomacritus might have been guilty of misattribution, nay forgery, in his edition of Musaeus. In other places, he can turn pedantic; discussing the language of the “Iliad,” he writes: “Predominantly in the Ionic dialect, it contains traces of the Aeolic, hints of Arcado-Cypriot.” Mr. Kelly: behave!

But these occasional lapses quickly give way to delightful vignettes like the one about a critic thrown off a cliff by “irate Athenians who objected to his carping criticism of the divine Homer.” Today, if anyone got thrown off a cliff, it would be for complaining about Oprah.

IBM 704 – Wikipedia, the free encyclopedia

The programming languages FORTRAN[5] and LISP[6] were first developed for the 704.

MUSIC, the first computer music program, was developed on the IBM 704 by Max Mathews.

In 1962 physicist John Larry Kelly, Jr. created one of the most famous moments in the history of Bell Labs by using an IBM 704 computer to synthesize speech. Kelly’s vocoder (voice recorder synthesizer) recreated the song “Daisy Bell,” with musical accompaniment from Max Mathews. Arthur C. Clarke was coincidentally visiting friend and colleague John Pierce at the Bell Labs Murray Hill facility at the time of this speech synthesis demonstration, and Clarke was so impressed that six years later he used it in the climactic scene of his novel and screenplay for 2001: A Space Odyssey,[7] where the HAL 9000 computer sings the same song.[8]

Edward O. Thorp, a math instructor at MIT, used the IBM 704 as a research tool to investigate the probabilities of winning while developing his blackjack gaming theory.[9][10] He used FORTRAN to formulate the equations of his research model.
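Thorp’s original FORTRAN programs aren’t reproduced here, but the flavor of the computation can be sketched in modern Python. The sketch below is a hypothetical Monte Carlo estimate of how often the dealer busts—a simplified stand-in for Thorp’s far more exhaustive combinatorial analysis—and it assumes an infinite deck and a dealer who stands on all 17s, neither of which matches Thorp’s exact model.

```python
import random

# Infinite-deck approximation: ranks 2-9, four ten-valued cards
# (10, J, Q, K), and the ace counted as 11 until downgrading is needed.
CARDS = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]

def dealer_busts(rng):
    """Play out one dealer hand (hit until 17 or more); return True on a bust."""
    total, soft_aces = 0, 0
    while total < 17:
        card = rng.choice(CARDS)
        total += card
        if card == 11:
            soft_aces += 1
        while total > 21 and soft_aces:  # downgrade an ace from 11 to 1
            total -= 10
            soft_aces -= 1
    return total > 21

def estimate_bust_probability(trials=100_000, seed=42):
    """Estimate the dealer's bust probability by simulation."""
    rng = random.Random(seed)
    return sum(dealer_busts(rng) for _ in range(trials)) / trials
```

Under standard rules the dealer busts a little over a quarter of the time, and the simulation converges toward that figure as the trial count grows; Thorp, working on the 704, computed such probabilities exactly rather than by sampling.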

The IBM 704 was used as the official tracker for the Smithsonian Astrophysical Observatory’s Operation Moonwatch in the fall of 1957. See The M.I.T. Computation Center and Operation Moonwatch. IBM provided four staff scientists to aid Smithsonian Astrophysical Observatory scientists and mathematicians in the calculation of satellite orbits: Dr. Giampiero Rossoni, Dr. John Greenstadt, Thomas Apple and Richard Hatch.

The Custodian of Forgotten Books – The New Yorker

In recent years, many publishers have come to the same realization—that the graveyard of literary history includes many works worth resurrecting. “It’s a pretty striking change in the last decade or so,” Edwin Frank, the editor of the Classics series from New York Review Books, told me.

Frank believes that publishers have the power to change the canon, but only if they’re truly open to lesser-known titles. “Those books are there to search you out,” he said. “They can exist to change your mind about what a book can be.” Paradoxically, the new interest in neglected books can be seen as a reaction to the decline of book culture. Books used to be a centerpiece of both education and entertainment, but television and the Internet have challenged that role.

Frank believes that among book lovers, “there’s a kind of sitting and looking—a kind of assessing the culture” going on. We’ve become more aware of what could be lost forever.