my blog for Web Layout and Design class (formerly for Digital New Media class).


Tuesday, November 21, 2006

Into the hands of the people

An increasing presence in the public realm, computing literally fell into the laps of laypeople when Alan Kay and Adele Goldberg imagined the future of notebook computing in Personal Dynamic Media. The Dynabook vision rightly envisioned personal computers being used by everyone from businessmen to children, both creatively and educationally. The pair did not foresee, however, today’s networked media use of DVDs and MP3s on personal laptops.

Seymour Papert had an even greater vision for children in his book Mindstorms, published in 1980. At a time when game consoles were becoming widely available for children’s use, Papert saw great educational potential in software that actually engages children in programming, an approach he called “constructionism.” Papert rightly predicted that future toys for children would be as technologically adept as the million-dollar IBMs selling at the time.

In Literary Machines, Ted Nelson brought together his ideas of hypertext and Vannevar Bush’s conception of the memex in something called Xanadu. Xanadu is the “ultimate archive,” with characteristics of both anarchy and navigation. While some aspects of Nelson’s Xanadu vision have come to fruition, others have not; Nelson remains active today, continuing to push forward ideas such as “ZigZag” and “transcopyright.”

The new media technologies realized by visionaries of the 1960s and ’70s spread into the public sphere in the ’80s. In 1983, Ben Bagdikian took on the role of understanding and predicting what hold business would have on the emerging new media sphere in The Endless Chain. Bagdikian was prescient in his prediction of increasing horizontal and vertical integration of companies and emerging monopolies over new media technology and distribution.

Today, direct manipulation interfaces are omnipresent, found in applications such as Photoshop as well as within internet browsers and sites. In his 1983 essay “Direct Manipulation: A Step Beyond Programming Languages,” Ben Shneiderman described how computing would move away from users employing a command language and toward direct manipulation interfaces, where computer activity imitates activity in the user’s world via a system of metaphors.

Direct manipulation interfaces were especially influential in the development of video games, where operators could move control devices and directly affect movement in the game interface. In her 1984 book, The Second Self, Sherry Turkle uses video games as a social laboratory to study human-computer interaction and concludes that humans and the perception of self are influenced by encounters with computers in distinct psychological ways.

Donna Haraway has been a huge presence in the last thirty years with her socialist-feminist inspired theories of social construction in this age of science and technology. Possibly her greatest work, “A Cyborg Manifesto,” was published in 1985. The “mythology” describes the human’s place and position as a cyborg, part organism and part machine. While some people at the time were anxious about society’s movement into the technological unknown, Haraway argued that imagining ourselves as part machine and products of technology allows for blurred boundaries and a better approach to locating positional objectivity.

In 1984, Richard Stallman and other computer programmers saw their free software begin to slip away as AT&T announced that UNIX would no longer be free. Programmers found themselves in a position where they could no longer manipulate programs and, due to proprietary licensing restrictions, they were unable to share programs with others. These changes inspired Stallman to create GNU (GNU’s Not Unix) in 1985, a project that spearheaded “copyleft” and the Free Software Foundation. Stallman and his projects still fight for shared sources and software freedom today.

While so much attention had been focused on Artificial Intelligence and the ability of humans to communicate with computers, Terry Winograd and Fernando Flores wrote Understanding Computers and Cognition in 1986 to emphasize computers as tools for design.

Friday, November 17, 2006

The 70's

Up through the 1960s, computers were still relegated to the realm of central processors for military and university use. Ushering in the next decade, Ted Nelson looked to the future of personal computers in his 1974 book Computer Lib/Dream Machines.

Nelson’s book communicated revolutionary ideas about what the computer could be used for, going beyond its capacity for calculation and into its potential for media and design. In addition, these potentials would be available for everyone to explore in an open publishing network. The network would be flexible and interconnected, reaching back to his 1965 conception of hypertext.

Augusto Boal further blurred the divide between producer and consumer/actor and spectator through theater in the late seventies. In his 1979 work, Theatre of the Oppressed, Boal described techniques for embodying interaction in performance. The concepts of encoder and decoder were consequently melting away in both technology and art.

The ultimate realization of immersing the “decoder” audience in the product began to take shape in the late seventies with the inception of virtual reality. Initially, virtual reality was a great application for architecture, allowing architects to visually experience and graphically design structures before bringing them into reality. Ted Nelson’s vision for computers catering to design came to life with the founding of MIT’s Architecture Machine Group by Nicholas Negroponte in 1967 and the opening of the MIT Media Lab in 1985.

Virtual reality, and the relationship of art and technology, took a great leap forward in 1977 with the introduction of “responsive environments” by Myron Krueger, who would become known as the “father of virtual reality.” Krueger insisted “that the art world was ready to embrace work that focused on response rather than the creation of appealing physical items” as he brought the realms of computer science and art together to deconstruct the “form/content divide.”

Krueger was also adamant that people explore all aspects of their inventions and come to know them, because only then will we understand and be able to “choose what we become as a result of what we have made.” Krueger’s concerns echo those of his contemporary Joseph Weizenbaum, who demanded in 1976 that scientists and technologists take responsibility for computer and machine influences on human society.

Monday, October 16, 2006

Timeline Continued

In the early 1960s, Ivan Sutherland developed the Sketchpad system. Sketchpad was a launching pad for modern conversational interface systems. Allowing users to manipulate objects, magnify their workspace, and perform recursive operations, the interface ushered in the future of user-empowered programs. Sketchpad opened up a new digital world for graphic art.

Roy Ascott observed the impending shift in new media art from linear production to audience participation to full two-way interaction. Ascott wrote The Construction of Change in 1961 to distinguish the interactive potential of new media art from participatory art such as Allan Kaprow's Happenings. Ascott also references cybernetics, as he encourages the artist to fully understand the science of behavioral experience.

A File Structure for the Complex, the Changing, and the Indeterminate returns to textual possibilities in new media. In the essay, published in 1965, Ted Nelson coins the term hypertext. "Hyper...connotes extension and generality." Nelson goes on to define hypertext as "a body of written or pictorial material interconnected in such a complex way that it could not conveniently be presented or represented on paper." Though today's world wide web is woven together with a hyperlinking system, we still have not achieved the whole of what Nelson meant by the word, embodied in his described filing and listing systems, ELF and PRIDE.

Feeling threatened by the mathematical realism of computer science, the literary field responded in 1961 with A Hundred Thousand Billion Poems by Raymond Queneau. The work invited the reader to cut apart the lines and reconstruct the poem, a style predating refrigerator magnet poetry. The method attempted to reconfigure "the relationship between reader, author, and text." Later, computers were utilized to break apart and restructure poetic creations, a move that ultimately married literary art with mathematics as artists began looking at the algorithms of possibilities.
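Queneau's combinatorics are easy to sketch in code: each line position in the poem offers several interchangeable lines, and a finished poem is one choice per position, so the number of possible poems is the product of the choices. Here is a minimal illustration in Python; the stand-in lines are invented for the example, not Queneau's.

```python
import random

# Hypothetical stand-in corpus: each position in the poem offers several
# interchangeable lines, as in Queneau's ten sonnets of fourteen lines each.
variants = [
    ["The marble quarried out of mist and time", "A parrot screams across the harbor wall"],
    ["still echoes with the vendor's morning call", "and folds the evening into one long rhyme"],
    ["no map records the road we walked at all", "the algorithm hums a borrowed chime"],
]

def recombine(variants, rng=random):
    """Assemble one poem by choosing one variant line per position."""
    return "\n".join(rng.choice(options) for options in variants)

# Number of distinct poems = product of the choices at each position.
total = 1
for options in variants:
    total *= len(options)

print(total)            # 2 * 2 * 2 = 8 possible poems here
print(recombine(variants))
```

With Queneau's actual grid of ten alternatives at each of fourteen positions, the same product gives 10^14, the "hundred thousand billion" of the title.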

Monday, October 09, 2006

Chapters 7 & 8 Summary


"The Cut-Up Method of Brion Gysin" by William Burroughs is a call to arms for everyone to begin experimenting with randomness and recombination in writing. An author or artist can acheive the cut-up method explicitly by using scissors to cut up an original work and paste it back together to create a collage. Burroughs says that this method will introduce "a new dimension into writing." The article was published in 1961, paving the way for Ted Nelson's coining of "hypertext" and the further deconstruction of hierarchical text.

One year later, Douglas Engelbart, the genius involved in the development of the internet, word processor, mouse, and window, takes the deconstruction of traditional text much further in his report Augmenting Human Intellect: A Conceptual Framework. The framework focuses on the goal of "increasing the capability of a man to approach a complex problem situation," or increasing human intellectual effectiveness. Engelbart encourages a systems approach to the problem.

He first references Vannevar Bush's conception of the Memex and expounds upon it to illuminate possibilities stemming from a mechanical card system of organization. Here, Engelbart introduces associative linking to connect card A to card B and to develop general grouping classifications.

The next section discusses an electronic computer-based augmentation system. An extensive dialogue is played out in which the subject comes to realize that human intellect does not work linearly like our traditional symbol structures (books, etc.), but rather criss-crosses, feeds back, and operates with substructures and antecedent links. We should ultimately be able to operate computers in a similar way, by manipulating documents to create links among topics and streams of thought. We will achieve better comprehension if human symbol structures (text) mirror human conceptual structures of nodes, branches, and links.
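The node-and-link structure Engelbart describes is, in modern terms, a graph rather than a serial list. A minimal sketch in Python makes the contrast concrete; the class name, note ids, and contents below are invented for illustration, not taken from Engelbart's system.

```python
from collections import defaultdict

class TrailStore:
    """A tiny store of notes joined by associative links, in the spirit of
    Engelbart's criss-crossing structures (names here are hypothetical)."""

    def __init__(self):
        self.notes = {}                 # node id -> text
        self.links = defaultdict(set)   # node id -> ids of linked nodes

    def add(self, node_id, text):
        self.notes[node_id] = text

    def link(self, a, b):
        # Associative links run both ways and criss-cross freely,
        # unlike the fixed serial order of a book.
        self.links[a].add(b)
        self.links[b].add(a)

    def neighbors(self, node_id):
        return sorted(self.links[node_id])

store = TrailStore()
store.add("A", "memex concept")
store.add("B", "mechanical card indexes")
store.add("C", "hypertext")
store.link("A", "B")
store.link("A", "C")
print(store.neighbors("A"))  # ['B', 'C']
```

From any note, the reader can branch to every linked note rather than only to the next page, which is the comprehension gain the dialogue argues for.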
My favorite comment of Engelbart's articulates how valuable such a leap in processing would be:

"I found, when I learned to work with the stuctures and manipulation processes such as we have outlined, that I got rather impatient if I had to go back to dealing with the serial-statement structuring in books and journals, or other ordinary means of communicating with other workers. It is rather like having to project three-dimensional images onto two-dimensional frames and to work with them there instead of in their natural form. Actually, it is much closer to the truth to day that it is like trying to project n-dimensional forms (the concept structures, which we have seen can be related with many many nonintersecting links) onto a one-dimensional form (the serial string of symbols), where the human memory and visualization has to hold and pucture the links and relationships."

Humans naturally think in multiple dimensions, incorporating reversion and association, yet until the 20th century our language and text symbol structures remained a flat, linear representation. Douglas Engelbart imagined a future where humans developed better mechanical and electrical tools in an effort to achieve the great potential of the human intellect.