DNA/How to Stop Worrying and Learn to Love the Internet

This piece first appeared in the News Review section of The Sunday Times on August 29th 1999.

A couple of years or so ago I was a guest on Start The Week, and I was authoritatively informed by a very distinguished journalist that the whole Internet thing was just a silly fad like ham radio in the fifties, and that if I thought any different I was really a bit naïve. It is a very British trait – natural, perhaps, for a country which has lost an empire and found Mr Blobby – to be so suspicious of change.

But the change is real. I don’t think anybody would argue now that the Internet isn’t becoming a major factor in our lives. However, it’s very new to us. Newsreaders still feel it is worth a special and rather worrying mention if, for instance, a crime was planned by people ‘over the Internet.’ They don’t bother to mention when criminals use the telephone or the M4, or discuss their dastardly plans ‘over a cup of tea,’ though each of these was new and controversial in its day.

Then there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘www DOT … bbc DOT… coDOT… uk SLASH… today SLASH…’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

This subjective view plays odd tricks on us, of course. For instance, ‘interactivity’ is one of those neologisms that Mr Humphrys likes to dangle between a pair of verbal tweezers, but the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

I expect that history will show ‘normal’ mainstream twentieth century media to be the aberration in all this. ‘Please, miss, you mean they could only just sit there and watch? They couldn’t do anything? Didn’t everybody feel terribly isolated or alienated or ignored?’

‘Yes, child, that’s why they all went mad. Before the Restoration.’

‘What was the Restoration again, please, miss?’

‘The end of the twentieth century, child. When we started to get interactivity back.’

The halting genius of science-fiction writer Ted Chiang – The California Sunday Magazine

“Sometimes, people who read my work tell me, ‘I like it, but it’s not really science fiction, is it?’” he says. “And I always feel like, no, actually, my work is exactly science fiction.”

After Star Wars forever made the genre synonymous with what Chiang calls “adventure stories dressed up with lasers,” people forgot that science fiction includes the word “science” for a reason: It is supposed to be largely about exploring the boundaries of knowledge, he says. “All the things I do in my work — engaging in thought experiments, investigating philosophical questions — those are all things that science fiction does.”

Tom Vanderbilt Explains Why We Could Predict Self-Driving Cars, But Not Women in the Workplace

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology.

But when it comes to culture we tend to believe not that the future will be very different from the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.”

This over- and under-predicting is embedded into how we conceive of the future. “Futurology is almost always wrong,” the historian Judith Flanders suggested to me, “because it rarely takes into account behavioral changes.” And, she says, we look at the wrong things: “Transport to work, rather than the shape of work; technology itself, rather than how our behavior is changed by the very changes that technology brings.” It turns out that predicting who we will be is harder than predicting what we will be able to do.

As the theorist Nassim Nicholas Taleb writes in Antifragile, “we notice what varies and changes more than what plays a larger role but doesn’t change. We rely more on water than on cell phones, but because water does not change and cell phones do, we are prone to thinking that cell phones play a larger role than they do.”

5 Things I Learned From Chris Dixon | The Waiter’s Pad

The 3rd way to use Twitter well.

Twitter can be a great tool if you use it well. Here’s what others have said:

The first way to use Twitter well is to be inspired and collaborate. Austin Kleon, John August, and Nicholas Megalis talked about the value of connecting with other people.

The second way to use Twitter well is to check your conclusions. Jason Zweig, Tadas Viskanta, and Tren Griffin all suggested we generate conclusions from multiple perspectives, of which Twitter can be one.

The third way to use Twitter – courtesy of Dixon – is to have it work for you. “Essentially I have two thousand of the smartest people in the world finding information for me and telling me what to read,” Dixon says.

Linguistic relativity – Wikipedia, the free encyclopedia

The essays of Paul Graham explore similar themes, such as a conceptual hierarchy of computer languages, with more expressive and succinct languages at the top. Thus, the so-called blub paradox (after a hypothetical programming language of average complexity called Blub) says that anyone preferentially using some particular programming language will know that it is more powerful than some, but not that it is less powerful than others. The reason is that writing in some language means thinking in that language. Hence the paradox, because typically programmers are “satisfied with whatever language they happen to use, because it dictates the way they think about programs”.[82]
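To make the gap concrete, here is a minimal sketch (mine, not Graham’s) of the same task written twice in Python: once in a deliberately Blub-like style, and once using a construct a programmer thinking in the Blub-like subset would never feel the absence of.

```python
# Illustration only: one computation, two styles of thinking.

words = ["theatre", "music", "sport"]

# Blub-style: the iteration pattern is spelled out by hand every time.
lengths_manual = []
for w in words:
    lengths_manual.append(len(w))

# More expressive style: the pattern itself is a construct of the language.
lengths_comprehension = [len(w) for w in words]

assert lengths_manual == lengths_comprehension  # same result, different thinking
```

Looking down from the comprehension, the manual loop is obviously equivalent but laborious; looking up from the loop, nothing appears to be missing, which is precisely the paradox.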

Save “Save For Web” – Zeldman on Web & Interaction Design

Adobe created a “Save For Web” option (in Photoshop 3, if I remember rightly), and Furbo Filters’s beautiful market was gone in a moment. All that remains as a memento of that time and that product is the domain name furbo.org, which is where Craig keeps his blog.

I was reminded of this during a workplace discussion about the seeming disappearance of “Save For Web” from modern Photoshop.

To be clear, “Save For Web” still exists in Photoshop CC 2015. But it has clearly been deprecated, as is indicated by both UX (“Save For Web” no longer appears in the part of the interface where we’ve been trained to look for it for the past twenty years) and language: when we stumble onto “Save For Web” hiding under Export, after not finding it where we expect it, we’re presented with the words “Save For Web (Legacy),” clearly indicating that the feature is no longer a recommended part of today’s workflow.

Adobe explains: “Because Save for Web is built on the former ImageReady product (now discontinued), the code is too antiquated to maintain and develop new features.” (If Furbo Filters and DeBabelizer didn’t resurrect dead brain cells for some of you, I bet “ImageReady” did. Remember that one? Also, how scary is it for me that half the tools I’ve used in my career only exist today as Wikipedia entries?)

Instead of Save For Web, we’re to use Export > Export As…, which Adobe has built on its Generator platform.