Information Overload or a Search for Meaning? – The American Interest

The principal response to the anxiety about Information Overload has been a technical one, namely trying to improve the processing and management of information. But the development of new techniques for storing and retrieving information does not relieve users of the burden of interpreting it and understanding what it means. To gain meaning is a cultural accomplishment, not a technical one. Unfortunately, Western society has become estranged from the messy business of engaging with meaning. This sensibility is vividly captured by the oft-repeated expression (“That’s too much information!”), now so common that it is often communicated in texting simply by thumbing out “TMI.” The expression is often used playfully to warn against “over-sharing” personal details or inappropriate sentiments. But the very fact that the ambiguities of everyday encounters are expressed through a language that quantifies personal communication (“too much”) and reduces it to abstract information speaks to a culture that all too readily assigns people the role of passive victims of information overload.

The corollary of Information Overload is the phenomenon of what Nico Macdonald, a British writer on digital culture, has characterized as Paradigm Underload. Macdonald notes that the problem facing society is not the quantity of information but the shortage of conceptual tools and paradigms with which to “filter, prioritise, structure and make sense of information.” Unfortunately, without a paradigm, the meaning of human experience becomes elusive to the point that the worship of Big Data displaces the quest for Big Ideas.

Do You Read Differently Online and in Print?

The Internet may cause our minds to wander off, and yet a quick look at the history of books suggests that we have been wandering off all along. When we read, the eye does not progress steadily along the line of text; it alternates between saccades (little jumps) and brief stops, not unlike the movement of the mouse’s cursor across a screen of hypertext. From the invention of papyrus around 3000 B.C. until about 300 A.D., most written documents were scrolls, which had to be rolled up with one hand as they were unrolled with the other: a truly linear presentation. Since then, though, most reading has involved codices, bound books or pamphlets, a major advantage of which (at least compared to the scroll) is that you can jump around in them: from chapter to chapter (the table of contents had been around since roughly the first century B.C.), from text to marginal gloss and, later, to footnote.

In the age of print, nonlinear reading found its most elaborate support in the “book wheel,” invented by the Italian engineer Agostino Ramelli in 1588: a “rotary reading desk” that allowed the reader to keep a great number of books open at once and to switch between them by giving the wheel a turn. The book wheel was (unfortunately!) a rarity in European libraries, but when you think about all the kinds of reading that print affords, the experience of starting a text at its beginning and reading all the way to the end, which we now associate with “deep” reading, looks less characteristic of print in general than of the novel in particular: the one kind of book in which, we feel, we might be depriving ourselves of something vital if we skipped or skimmed.

The quality of digital media poses one kind of problem for the reading brain; the quantity of information available to the wired reader poses a different and more serious one. But it is worth noting that readers have faced this problem before. Gutenberg printed his first Bible in 1455, and by 1500 some 27,000 titles had been published in Europe, in a total of around 10 million copies. The flood of printed matter created a reading public and changed the way people read.

IBM Design Language | Animation: Fundamentals

Learn how IBM products move with the accuracy and precision of a machine.

For over one hundred years, IBM has crafted business machines for professionals around the world. From the powerful strike of a printing arm to the smooth slide of a typewriter carriage, each movement was fit for purpose and designed with intent. Our software demands the same attention to detail to make products feel lively and realistic.

We take inspiration from our heritage to define our animation style. Machines have solid planes, rigid surfaces and sharp, exact movements governed by physical forces. They don’t go from a full stop to top speed instantly or come to an abrupt halt; instead, they take time to accelerate and decelerate. They have an inherent mass and move at different speeds to accomplish the tasks they were designed for.
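That ease-in, ease-out behaviour maps directly onto a standard easing curve. As a minimal sketch of the idea (our own illustration in TypeScript, assuming a browser environment; none of these names are IBM motion tokens), a cubic ease-in-out function can drive a property from one value to another:

// Cubic ease-in-out: accelerate through the first half, decelerate through the second.
function easeInOutCubic(t: number): number {
  return t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;
}

// Drive a numeric property from `from` to `to` over `durationMs` milliseconds.
function animate(from: number, to: number, durationMs: number,
                 onFrame: (value: number) => void): void {
  const start = performance.now();
  const step = (now: number) => {
    const t = Math.min((now - start) / durationMs, 1); // normalised time, 0..1
    onFrame(from + (to - from) * easeInOutCubic(t));
    if (t < 1) requestAnimationFrame(step);
  };
  requestAnimationFrame(step);
}

// Usage: slide a panel 300px to the right over 400ms.
animate(0, 300, 400, (x) => {
  console.log(x.toFixed(1)); // e.g. element.style.transform = `translateX(${x}px)`
});

Because the curve’s slope starts and ends at zero, velocity builds and tapers rather than jumping, which is precisely the sense of inherent mass described above.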

DNA/How to Stop Worrying and Learn to Love the Internet

This piece first appeared in the News Review section of The Sunday Times on August 29th 1999.

A couple of years or so ago I was a guest on Start The Week, and I was authoritatively informed by a very distinguished journalist that the whole Internet thing was just a silly fad like ham radio in the fifties, and that if I thought any different I was really a bit naïve. It is a very British trait – natural, perhaps, for a country which has lost an empire and found Mr Blobby – to be so suspicious of change.

But the change is real. I don’t think anybody would argue now that the Internet isn’t becoming a major factor in our lives. However, it’s very new to us. Newsreaders still feel it is worth a special and rather worrying mention if, for instance, a crime was planned by people ‘over the Internet.’ They don’t bother to mention when criminals use the telephone or the M4, or discuss their dastardly plans ‘over a cup of tea,’ though each of these was new and controversial in its day.

Then there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘www DOT … bbc DOT … co DOT … uk SLASH … today SLASH …’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

This subjective view plays odd tricks on us, of course. For instance, ‘interactivity’ is one of those neologisms that Mr Humphrys likes to dangle between a pair of verbal tweezers, but the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

I expect that history will show ‘normal’ mainstream twentieth century media to be the aberration in all this. ‘Please, miss, you mean they could only just sit there and watch? They couldn’t do anything? Didn’t everybody feel terribly isolated or alienated or ignored?’

‘Yes, child, that’s why they all went mad. Before the Restoration.’

‘What was the Restoration again, please, miss?’

‘The end of the twentieth century, child. When we started to get interactivity back.’

Linguistic relativity – Wikipedia, the free encyclopedia

The essays of Paul Graham explore similar themes, such as a conceptual hierarchy of computer languages, with more expressive and succinct languages at the top. Thus, the so-called blub paradox (after a hypothetical programming language of average complexity called Blub) says that anyone preferentially using some particular programming language will know that it is more powerful than some, but not that it is less powerful than others. The reason is that writing in some language means thinking in that language. Hence the paradox, because typically programmers are “satisfied with whatever language they happen to use, because it dictates the way they think about programs”.[82]
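As a concrete illustration of the paradox (our own example, not taken from Graham’s essays), consider the same task written in a deliberately “Blub-like” style and in a higher-order style. A programmer whose language lacks first-class functions would not simply avoid the second version; they would be unlikely to think of the problem in those terms at all:

// Our illustrative example in TypeScript, not from Graham's essays.
const prices = [9.99, 14.5, 3.25, 20.0];

// The "Blub" way: explicit indexing and mutation.
function totalWithTaxLoop(items: number[], taxRate: number): number {
  let total = 0;
  for (let i = 0; i < items.length; i++) {
    total += items[i] * (1 + taxRate);
  }
  return total;
}

// The higher-order way: an abstraction that is invisible until your language has it.
const totalWithTax = (items: number[], taxRate: number): number =>
  items.map((p) => p * (1 + taxRate)).reduce((sum, p) => sum + p, 0);

console.log(totalWithTaxLoop(prices, 0.08)); // ~51.56
console.log(totalWithTax(prices, 0.08));     // same result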

Remember Facebook Notes? It’s Back With a Vengeance | WIRED

In all, the new Notes looks quite a bit like wunderkind blogging platform Medium, which may be because the two appear to have been designed with input from the same design team: Teehan + Lax worked on early prototypes of what would evolve into Medium as well as on the final product.

How To Engineer Serendipity – Aspen Ideas – Medium

I’d like to tell the story of a paradox: How do we bring the right people to the right place at the right time to discover something new, when we don’t know who or where or when that is, let alone what it is we’re looking for? This is the paradox of innovation: if so many discoveries, from penicillin to plastics, are the product of serendipity, why do we insist breakthroughs can somehow be planned? Why not embrace serendipity instead?

The final piece is the network. Google has made its ambitions clear — as far as chairman Eric Schmidt is concerned, the future of search is a “serendipity engine” answering questions you never thought to ask. “It’ll just know this is something that you’re going to want to see,” explained artificial intelligence pioneer Ray Kurzweil shortly after joining the company as its director of engineering.

One antidote to this all-encompassing filter bubble is an opposing serendipity engine proposed by MIT’s Ethan Zuckerman. In his book Rewire, he sketches a set of recommendation and translation tools designed to nudge us out of our media comfort zones and “help us understand whose voices we’re hearing and whom we are ignoring.”
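To make the idea concrete, here is a toy sketch (entirely our own; Rewire contains no such code, and the interface, the names and the 20 percent share are all assumptions) of the kind of nudge such a tool might apply: reserving a slice of an otherwise relevance-ranked feed for sources the reader rarely encounters.

// A toy serendipity "nudge" in TypeScript; our assumption, not Zuckerman's design.
interface Item {
  id: string;
  source: string; // the outlet, region or community the item comes from
  score: number;  // predicted relevance to this reader
}

function serendipitousFeed(ranked: Item[], unfamiliar: Item[],
                           size: number, noveltyShare = 0.2): Item[] {
  const noveltyCount = Math.round(size * noveltyShare);
  const familiar = ranked.slice(0, size - noveltyCount);
  // Crude random sample of out-of-bubble items; a real tool would
  // diversify by source and label whose voice each item carries.
  const picks = [...unfamiliar]
    .sort(() => Math.random() - 0.5)
    .slice(0, noveltyCount);
  return [...familiar, ...picks];
}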

As Zuckerman points out, the greatest threats to serendipity are our ingrained biases and cognitive limits: we intrinsically want more known knowns, not unknown unknowns. This is the bias a startup named Ayasdi is striving to eliminate in Big Data. Rather than asking questions, its software renders its analysis as a network map, revealing hidden connections between tumors or terrorist cells, which CEO Gurjeet Singh calls “digital serendipity.”
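Ayasdi’s actual method, topological data analysis, is far more sophisticated than anything sketched here, but the flavour of asking the map rather than asking questions can be suggested with a toy example of our own: link data points whose similarity crosses a threshold, then read off the groups that emerge as connected components.

// A toy "network map" in TypeScript; our illustration, not Ayasdi's algorithm.
type Point = number[];

// Cosine similarity between two feature vectors.
function cosine(a: Point, b: Point): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Connect similar points, then return connected components as candidate groups.
function networkGroups(points: Point[], threshold = 0.9): number[][] {
  const n = points.length;
  const adj: number[][] = Array.from({ length: n }, () => []);
  for (let i = 0; i < n; i++) {
    for (let j = i + 1; j < n; j++) {
      if (cosine(points[i], points[j]) >= threshold) {
        adj[i].push(j);
        adj[j].push(i);
      }
    }
  }
  const seen: boolean[] = new Array(n).fill(false);
  const groups: number[][] = [];
  for (let s = 0; s < n; s++) {
    if (seen[s]) continue;
    const stack = [s];
    const group: number[] = [];
    seen[s] = true;
    while (stack.length > 0) {
      const v = stack.pop()!;
      group.push(v);
      for (const w of adj[v]) {
        if (!seen[w]) { seen[w] = true; stack.push(w); }
      }
    }
    groups.push(group); // each component is one cluster of hidden connections
  }
  return groups;
}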