Saving the Self in the Age of the Selfie
By James McWilliams, The American Scholar
In 2012, Paul Miller, a 26-year-old journalist and former writer for The Verge, began to worry about the quality of his thinking. His ability to read difficult studies or to follow intricate arguments demanding sustained attention was flagging. He found himself easily distracted and, worse, irritable about it. His longtime touchstone, his smartphone, was starting to annoy him, making him feel insecure and anxious rather than grounded in the ideas that had formerly nourished him. “If I lost my phone,” he said, he’d feel “like I could never catch up.” He realized that his online habits weren’t helping him to work, much less to multitask. He was just switching his attention all over the place and, in the process, becoming a bit unhinged.
Subtler discoveries ensued. As he continued to analyze his behavior, Miller noticed that he was applying the language of nature to digital phenomena. He would refer, for example, to his “RSS feed landscape.” More troubling was how his observations were materializing not as full thoughts but as brief tweets; he was thinking in character counts.
When he realized he was spending 95 percent of his waking hours connected to digital media in a world where he “had never known anything different,” he proposed to his editor a series of articles that turned out to be intriguing and prescient. What would it be like to disconnect for a year? His editor bought the pitch, and Miller, who lives in New York, pulled the plug.
For the first several months, the world unfolded as if in slow motion. He experienced “a tangible change in my ability to be more in the moment,” recalling how “fewer distractions now flowed through my brain.” The Internet, he said, “teaches you to expect instant gratification, which makes it hard to be a good human being.” Disconnected, he found a more patient and reflective self, one more willing to linger over complexities that he once clicked away from. “I had a longer attention span, I was better able to handle complex reading, I did not need instant gratification, and,” he added somewhat incongruously, “I noticed more smells.” The “endless loops that distract you from the moment you are in,” he explained, diminished as he became “a more reflective writer.” It was an encouraging start.
But if Miller became more present-minded, nobody else around him did. “People felt uncomfortable talking to me because they knew I wasn’t doing anything else,” he said. Communication without gadgets proved to be a foreign concept in his peer world. Friends and colleagues—some of whom thought he might have died—misunderstood or failed to appreciate Miller’s experiment.
Plus, given that he had effectively consigned himself to offline communication, all they had to do to avoid him was to stay online. None of this behavior was overtly hostile; all of it was passive. But it was still a social burden, one that reminded Miller that his identity didn’t thrive in a vacuum. His quality of life eventually suffered.
…
What we do about it may turn out to answer one of this century’s biggest questions. A list of user-friendly behavioral tips—a Poor Richard’s Almanack for achieving digital virtue—would be nice.
But this problem eludes easy prescription. The essence of our dilemma, one that weighs especially heavily on Generation Xers and millennials, is that the digital world disarms our ability to oppose it while luring us with assurances of convenience. It’s critical not only that we identify this process but also that we fully understand how digital media co-opt our sense of self while inhibiting our ability to reclaim it. Only when we grasp the inner dynamics of this paradox can we be sure that the Paul Millers of the world—or others who want to preserve their identity in the digital age—can form technological relationships in which the individual determines the use of digital media rather than the other way around.