The Year’s Boldest Ideas In User Interface Design | Co.Design

When design historians look back on 2015, they will likely point out two major trends.

The first? The UIs of 2015 effortlessly stride between cyberspace and meatspace, from Microsoft’s HoloLens to MIT’s Lineform, a snakebot that can morph into any gadget you want.

The second: The rise of ambient interfaces, so-called zero UIs that can range from virtual secretaries to clothes that work like touch screens.

Retrotechtacular: Electronic Publishing in the 1930s | Hackaday

We are living in the age of citizen journalism and the 24-hour news cycle. Reports about almost anything newsworthy can be had from many perspectives, both vetted and amateur.

Just a few decades ago, people relied on daily newspapers, radio, and word of mouth for their news. On the brink of the television age, several radio stations in the United States participated in an experiment to broadcast news over radio waves. But this was no ordinary transmission. At the other end, a new type of receiver printed out news stories, line drawings, and pictures on a long roll of paper.

Radio facsimile newspaper technology was introduced to the public at the 1939 World’s Fair at two different booths. One belonged to an inventor named William Finch, and one to RCA. Finch had recently made a name for himself with his talking newspaper, which embedded audio into a standard newspaper in the form of wavy lines along the edges that were read by a special device.

DNA/How to Stop Worrying and Learn to Love the Internet

This piece first appeared in the News Review section of The Sunday Times on August 29th 1999.

A couple of years or so ago I was a guest on Start The Week, and I was authoritatively informed by a very distinguished journalist that the whole Internet thing was just a silly fad like ham radio in the fifties, and that if I thought any different I was really a bit naïve. It is a very British trait – natural, perhaps, for a country which has lost an empire and found Mr Blobby – to be so suspicious of change.

But the change is real. I don’t think anybody would argue now that the Internet isn’t becoming a major factor in our lives. However, it’s very new to us. Newsreaders still feel it is worth a special and rather worrying mention if, for instance, a crime was planned by people ‘over the Internet.’ They don’t bother to mention when criminals use the telephone or the M4, or discuss their dastardly plans ‘over a cup of tea,’ though each of these was new and controversial in their day.

Then there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘www DOT… bbc DOT… co DOT… uk SLASH… today SLASH…’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

This subjective view plays odd tricks on us, of course. For instance, ‘interactivity’ is one of those neologisms that Mr Humphrys likes to dangle between a pair of verbal tweezers, but the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

I expect that history will show ‘normal’ mainstream twentieth century media to be the aberration in all this. ‘Please, miss, you mean they could only just sit there and watch? They couldn’t do anything? Didn’t everybody feel terribly isolated or alienated or ignored?’

‘Yes, child, that’s why they all went mad. Before the Restoration.’

‘What was the Restoration again, please, miss?’

‘The end of the twentieth century, child. When we started to get interactivity back.’

Tom Vanderbilt Explains Why We Could Predict Self-Driving Cars, But Not Women in the Workplace

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology.

But when it comes to culture we tend to believe not that the future will be very different than the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.”

This over- and under-predicting is embedded into how we conceive of the future. “Futurology is almost always wrong,” the historian Judith Flanders suggested to me, “because it rarely takes into account behavioral changes.” And, she says, we look at the wrong things: “Transport to work, rather than the shape of work; technology itself, rather than how our behavior is changed by the very changes that technology brings.” It turns out that predicting who we will be is harder than predicting what we will be able to do.

As the theorist Nassim Nicholas Taleb writes in Antifragile, “we notice what varies and changes more than what plays a larger role but doesn’t change. We rely more on water than on cell phones, but because water does not change and cell phones do, we are prone to thinking that cell phones play a larger role than they do.”

The historian Lawrence Samuel has called social progress the “Achilles heel” of futurism. He argues that people forget the injunction of the historian and philosopher Arnold Toynbee: Ideas, not technology, have driven the biggest historical changes. When technology changes people, it is often not in the ways one might expect: Mobile technology, for example, did not augur the “death of distance,” but actually strengthened the power of urbanism. The washing machine freed women from labor, and, as the social psychologists Nina Hansen and Tom Postmes note, could have sparked a revolution in gender roles and relations. But, “instead of fueling feminism,” they write, “technology adoption (at least in the first instance) enabled the emergence of the new role of housewife: middle-class women did not take advantage of the freed-up time … to rebel against structures or even to capitalize on their independence.” Instead, the authors argue, the women simply assumed the jobs once held by their servants.

Take away the object from the historical view, and you lose sight of the historical behavior. Projecting the future often presents a similar problem: The object is foregrounded, while the behavioral impact is occluded. The “Jetsons idea” of jetpacking and meals in a pill missed what actually has changed: The notion of a stable career, or the social ritual of lunch.

One futurist noted that a 1960s film of the “office of the future” made on-par technological predictions (fax machines and the like), but had a glaring omission: The office had no women. Self-driving car images of the 1950s showed families playing board games as their tail-finned cars whisked down the highways. Now, 70 years later, we suspect the automated car will simply allow for the expansion of productive time, and hence working hours. The self-driving car has, in a sense, always been a given. But modern culture hasn’t.

Meet Margaret Hamilton, the badass ’60s programmer who saved the moon landing – Vox

In the early days, women were often assigned software tasks because software just wasn’t viewed as very important. “It’s not that managers of yore respected women more than they do now,” Rose Eveleth writes in a great piece on early women programmers for Smithsonian magazine. “They simply saw computer programming as an easy job. It was like typing or filing to them and the development of software was less important than the development of hardware. So women wrote software, programmed and even told their male colleagues how to make the hardware better.”

“I began to use the term ‘software engineering’ to distinguish it from hardware and other kinds of engineering,” Hamilton told Verne’s Jaime Rubio Hancock in an interview. “When I first started using this phrase, it was considered to be quite amusing. It was an ongoing joke for a long time. They liked to kid me about my radical ideas. Software eventually and necessarily gained the same respect as any other discipline.”

Hamilton is now 78 and runs Hamilton Technologies, the Cambridge, Massachusetts-based company she founded in 1986. She’s lived to see “software engineering” — a term she coined — grow from a relative backwater in computing into a prestigious profession.

How To Engineer Serendipity – Aspen Ideas – Medium

I’d like to tell the story of a paradox: How do we bring the right people to the right place at the right time to discover something new, when we don’t know who or where or when that is, let alone what it is we’re looking for? This is the paradox of innovation: If so many discoveries — from penicillin to plastics — are the product of serendipity, why do we insist breakthroughs can somehow be planned? Why not embrace serendipity instead?

The final piece is the network. Google has made its ambitions clear — as far as chairman Eric Schmidt is concerned, the future of search is a “serendipity engine” answering questions you never thought to ask. “It’ll just know this is something that you’re going to want to see,” explained artificial intelligence pioneer Ray Kurzweil shortly after joining the company as its director of engineering.

One antidote to this all-encompassing filter bubble is an opposing serendipity engine proposed by MIT’s Ethan Zuckerman. In his book Rewire, he sketches a set of recommendation and translation tools designed to nudge us out of our media comfort zones and “help us understand whose voices we’re hearing and whom we are ignoring.”

As Zuckerman points out, the greatest threats to serendipity are our ingrained biases and cognitive limits — we intrinsically want more known knowns, not unknown unknowns. This is the bias a startup named Ayasdi is striving to eliminate in Big Data. Rather than asking questions, its software renders its analysis as a network map, revealing hidden connections between tumors or terrorist cells, which CEO Gurjeet Singh calls “digital serendipity.”