Tuesday, February 28, 2017

Social Media I

The evolution of social media can be conceived of in many ways -- in one sense, it could be said that language itself was the first social medium. More narrowly, if we consider a "social medium" to be any means of transmitting or recording language over time and space, alphabetic writing could well be seen as the earliest, followed swiftly by the development of the "letter" as a social form, which dates back to at least the seventh century BCE. The ancient Library of Ashurbanipal, King of Assyria from 668 to 627 BCE, included personal letters written in cuneiform on clay tablets.

The telegraph and telephone come next in line; even if, as a recent NY Times article noted, the phone is experiencing a slow decline, it remains our oldest electronic social medium. I'm old enough to remember the "Reach out and touch someone" adverts for Ma Bell, and for a while, there was nothing more direct and personal than a phone call. ARPANET went online in 1969, and electronic mail followed in the early 1970's, but e-mail did not become a common form of communication until the late 1980's; well before then, home computer users were setting up BBS sites where they could post notices and download simple programs. My home town of Cleveland had a huge site, Freenet, where you could also get medical advice from doctors at Case Western Reserve and University Hospitals. The WELL, a large social site based in San Francisco, was an early home of integrated mail, chatroom, and file services; perhaps not coincidentally, it was also the site of the first case of online impersonation to go to court (a man was sued by two women for pretending to be a different, older woman who was a mutual friend).

In academia, the LISTSERV protocol brought people together by field and interest, and made it possible, in effect, to send a message to hundreds of people at once in search of advice or response; LISTSERVs were often associated with archives where you could search through older messages. Early online game spaces such as MUDs go back to the late 1970's (with MOOs following somewhat later), and many became highly social, with tens of thousands of "inhabitants" maintaining spaces there. All of these interactions were exclusively text-based, and the only "graphics" consisted of what could be cobbled together out of ASCII characters.

It wasn't until the arrival of the commercial internet in 1993, and the spread of the World Wide Web and graphical browsers in the years that followed, that social media really took off; by the end of the decade, Six Degrees, LiveJournal, Blogger, and Epinions had launched. In 2003, Second Life offered its users a virtual retake on their first lives, albeit with a graphical interface that looks primitive by today's standards; that same year, MySpace became the first modern social networking platform, and a model for Facebook two years later. With half a billion users, including everyone from the President to the Pope to Adam West, Facebook certainly has the critical mass to change the face of human communication -- and yet, in recent years, the loss of many of its younger ("Millennial" generation) users has some people wondering whether it may someday go the way of MySpace.

Friday, February 24, 2017

History of Computing II: Mouse forward

The move toward the possibility of a computer that could truly be called "personal" begins in many ways with Douglas Engelbart's question: "If in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive, how much value could you derive from that?" The question was posed on December 9, 1968, at what has come to be called the "Mother of all Demos," where Engelbart and his team from the Augmentation Research Center at Stanford Research Institute in Menlo Park, California, presented their work. Unlike anyone else in 1968, Engelbart had some concrete answers to this seemingly abstract question: he was seated at a console which included a chorded keyboard (not unlike that employed by the operator of the "Voder" at the 1939 World's Fair) as well as the first operational three-button mouse, which Engelbart had designed together with Bill English starting in 1963. Using this interface, as well as an audio and video projection system, Engelbart demonstrated the other capacities of his system, which included collapsible and relational menus, a simple mapping system, a text editor, and basic programming tools.

Of course the computers that backed up Engelbart's console were still massive, and required a number of other human operators and technicians (he chats with several of them in the course of the demo). It would be nearly another sixteen years before advances in microchips, display screens, and hardware would enable the production of the Apple Macintosh, the first commercially successful personal computer to incorporate a mouse along with a graphical user interface (GUI) and some degree of WYSIWYG (what you see is what you get) graphics. It was these technologies, much more than earlier screen-and-keyboard machines, that turned the modest interest in home computers into the revolution in personal computers that enabled the "Internet" age.

Interestingly, although Engelbart was in fact depending on a remote set of machines connected over a cable, it would be a long time before the computer became not only an independent platform but also a means of communication. Early modems were slow and unreliable, and took hours to send long files; even then, they more often connected to a remote "host" which was itself isolated from the 'net, such as a BBS system. The earliest version of the Internet, known as ARPANET, was created from plans developed for the US Department of Defense by the RAND Corporation, its architecture designed to link DoD facilities with contractors and research universities through a "distributed" set of nodes, chosen as the arrangement most likely to survive a Soviet nuclear attack. Even well into the late 1980's, when I sent my first e-mail (I was then a grad student doing a work-study job at Brown's Graduate School offices), 90% of the traffic on the Internet went from one big host computer to another at universities and research institutes. I remember sending a message to someone with an odd-sounding hostname, and finding out only later that the user was in Tel Aviv, Israel!

The Internet was not opened to commercial traffic of any kind until 1993, and it was around this time that Sir Tim Berners-Lee's hypertext "world wide web" protocol, and Mosaic, the first widely-used browser, came into general use. This software, because it enabled terminal-to-terminal communication using an interface which worked in much the same way as the GUI's of individual computers (well, Macs in any case!), was the key step toward the 'net becoming a true mass medium. And it was only then that the answer, or rather answers, to Engelbart's question began to become clear; today, with billions of Internet users worldwide, global e-commerce is quickly becoming the dominant means of trade and exchange throughout the developed world. And, of course, the humble "intellectual worker" -- such as yours truly -- has derived, and continues to derive, great value from all this; in the case of my most recent book, which took about six years to research, I'd estimate that, without access to Internet-based historical materials, the project would have taken at least twice as long, and cost tens of thousands of dollars in airfare to travel to and search through archives around the world.

Saturday, February 18, 2017

History of Computing I: The Colossi

The earliest notion of a "computer" that most people had in the nineteenth and twentieth centuries had one key feature: enormous size. Babbage's 1837 "Analytical Engine," widely regarded as the earliest ancestor of the computer, would -- had it ever been completed -- have filled a warehouse-sized room and weighed nearly 30,000 pounds. Babbage's designs used interlocking gears with various ratios to perform calculations, and his system contemplated a punch-card I/O unit, a calculating unit known as the "Mill" (a sort of CPU), and a storage unit he called the "Store." In his search for backers, he enlisted Lord Byron's daughter Ada Lovelace, who translated a French account of the Engine and added her own erudite notes explaining its operations; in 1983, a new computer language designed for the US Department of Defense was christened "Ada" in her honor. Two working models of Babbage's simpler Difference Engine have been built in recent years; you can see one of them in action here.

Few significant advances in computing were made until the late 1930's and early 1940's, when military needs -- calculating target data and (most importantly) breaking secret codes such as Germany's ENIGMA -- provided both the impetus and the funding. In the UK, researchers at Bletchley Park, led by the young mathematical genius Alan Turing, constructed machines they named "bombes," which used electrical relays and motors to run through hundreds of thousands of possible combinations of the wheels and wires of an Enigma machine. Later, they constructed a far more advanced, fully electronic machine, known literally as "Colossus," to break the still more complex Lorenz cipher used by the German high command. Advances in cryptography would eventually render all these computers obsolete -- indeed, a carefully-done "one-time pad" or "Vernam cipher" is unbreakable once the pad text is destroyed (witness a message found on a skeletonized pigeon, though some have claimed to have deciphered it).

Yet other problems, such as calculating trajectories, remained. The first fully digital machine along these lines was ENIAC (short for Electronic Numerical Integrator And Computer), which used vacuum tubes -- more than 17,000 of them! -- as relays and switches. The machine had to be literally re-wired for each different kind of operation; the task was entrusted to a group of young women who, even though many of them had college degrees in mathematics and engineering, were regarded at first as little more than glorified switchboard operators. All of the units together weighed more than 60,000 pounds, and consumed 150 kilowatts of power -- all to perform roughly 5,000 calculations per second. While this was many times faster than any earlier machine, it's equivalent to a CPU speed of about 5 kHz -- some six hundred thousand times slower, in clock speed alone, than the average desktop computer of today (and the gap in actual throughput is far larger).
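To put that gap in perspective, here is a quick back-of-the-envelope comparison -- just a sketch, in which the 3 GHz clock and the instructions-per-cycle figure for a modern desktop processor are assumed round numbers, not measurements:

```python
# Rough comparison of ENIAC's arithmetic rate with a modern desktop CPU.
# The modern-CPU figures are illustrative assumptions, not benchmarks.

eniac_ops_per_sec = 5_000           # ~5,000 additions per second (per the text above)
modern_clock_hz = 3_000_000_000     # assume a typical 3 GHz desktop clock
ops_per_cycle = 4                   # assume ~4 simple integer operations per cycle

clock_ratio = modern_clock_hz / eniac_ops_per_sec
throughput_ratio = (modern_clock_hz * ops_per_cycle) / eniac_ops_per_sec

print(f"Clock-speed ratio: ~{clock_ratio:,.0f}x")       # ~600,000x
print(f"Throughput ratio:  ~{throughput_ratio:,.0f}x")  # ~2,400,000x
```

Either way you count, ENIAC's 150 kilowatts bought it less arithmetic per second than a modern processor performs in a fraction of a millisecond.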

The key invention which began the change from room- and building-sized machines to something that could actually fit in an office or a home was of course the transistor, developed at Bell Labs in 1947. The basic idea was to use a semiconductor sandwiched between more conductive materials; such a device, like a radio tube, could be used either as a signal amplifier or as a switch. There were several key advantages: transistors produced less heat, were cheaper to manufacture, and -- even in their early state -- were much smaller. Each of the women of ENIAC shown in the photo above is holding a unit with the same storage capacity; the first two decreases in size are due to smaller, specially-made tubes, but the last is due to transistors. A typical smart phone today has nearly a million times the number of transistors in this smallest unit.

With the war over, business demands drove the computer market. The first commercial computer for this market was the UNIVAC, introduced in the early 1950's. For around $750,000, you got a CPU speed of 1.9 kHz, about 1.5 Kb of memory, and tape drives, each the size of a small refrigerator, which held about 1.5 Kb per tape. A decade later, IBM introduced its 1401 system; with the top model, one could now have 16 Kb of memory, and perform almost 23,000 calculations per second -- 23 kHz. IBM did not sell the 1401, but you could lease one for around $2,500 a month. Home computing on a practical scale was still far in the future; although SIMON and other kit computers were available throughout this period for home hobbyists, their tiny capacity -- 8 binary switches -- made them useless for any but the most limited tasks.

Friday, February 10, 2017

Ghosts in the Machine: Early Television from 1928

The image to the left is a single frame from the earliest known television recording of a human face, made by the inventor John Logie Baird. The subject, a Mr. Wally Fowlkes, was a young lab assistant undistinguished save by his willingness to sit for lengthy periods under the bright, hot lights required to make television recordings. And, amazingly, these recordings were made almost entirely by mechanical means -- a giant disc with glass lenses was linked directly to a Columbia Records turntable equipped with a cutting stylus -- and they predate any electronic images of humans by several years! They were preserved on discs that look much like audio recordings, and the frequency of the image data is so low that, if played through speakers, a sound in the audible range is produced. Indeed, Baird claimed that he could distinguish, just by listening to them, a recording of a face from, say, a recording of a pair of scissors or a soccer ball. Baird called his process Phonovision, and although he abandoned it as offering too brief a recording time, and posing too many technical obstacles, it was nevertheless the first system of recorded television in history.
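Why could a television signal be cut onto what was essentially an audio disc at all? A rough estimate makes it clear -- a sketch only, using the line count, frame rate, and 7:3 picture proportions usually cited for Baird's 30-line system:

```python
# Back-of-the-envelope estimate of the video bandwidth of a 30-line,
# 12.5 frame-per-second mechanically scanned picture (Baird-style).
# All figures are approximations for illustration.

lines_per_frame = 30
frames_per_sec = 12.5
aspect_ratio = 7 / 3                                  # the 30-line picture was taller than wide

elements_per_line = lines_per_frame * aspect_ratio    # ~70 picture elements per line
elements_per_sec = lines_per_frame * elements_per_line * frames_per_sec

# One cycle of the signal can carry a light/dark pair of picture elements,
# so the highest video frequency is roughly half the element rate.
max_video_freq_hz = elements_per_sec / 2

print(f"Picture elements per second: ~{elements_per_sec:,.0f}")
print(f"Approximate top video frequency: ~{max_video_freq_hz/1000:.1f} kHz")
```

A signal topping out around 13 kHz sits comfortably within the audio band -- hence both the disc recordings and the audible "sound" of a face.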

These recordings were little-known until a few years ago, when recording engineer Donald McLean collected several of them, and transferred their analog signal into digital form. Once this was done, he was able to correct for all kinds of problems that plagued Baird's engineers -- mechanical resonance ("rumble"), pops and scratches on the disc, speed irregularities, and problems with frame registration. The earliest recordings are still quite primitive, but one can at least recognize the faces.

Even more remarkably, in addition to these laboratory discs, there exist home recordings, made using "Silvatone" aluminum discs (one of these was referenced recently in The King's Speech). Silvatone discs used a heavy, weighted cutting stylus, and could record any sort of signal, whether of the human voice or a radio broadcast. And, due to the relatively low frequency of the signal, they could be used to record television broadcasts as well. During the brief period from the late 1920's through to the early 1930's, when Baird was able to send out television signals with the BBC's co-operation, a number of amateur recordings were made; these, too, have been restored by Mr. McLean. There are about a half-dozen different snippets: dancing girls (of course!), a marionette show, and a singer by the name of Betty Bolton. McLean actually located Miss Bolton, by then 92 years old, and she was able to personally identify herself as the subject of the recording!

During this era -- in 1930 -- the BBC broadcast the very first television drama, an adaptation of Pirandello's play "The Man with a Flower in his Mouth." Although this does not survive, there is a re-enacted version, using the exact same script, the original music and title cards, and an identical 30-line Baird camera system -- you can watch it here, along with comments on the original broadcast and the recreation.

Mr. McLean has kindly permitted me to show his restored original Baird recordings to you -- but in class only -- as he is concerned to protect his rights in the restored versions. So look for some haunting images at Wednesday's class!

SIDEBAR: Here's a chart I've prepared showing the relative frequency and bandwidth of television signals, from the days of the Baird discs to HDTV.

ADDITIONAL LINKS: The excellent Television History site; a film of the 1936 Radiolympia demonstration broadcast, as well as the opening ceremony of the "high-definition" service later that year. Both feature versions of the commissioned theme song, with its curious lyrics:
A mighty maze, of mystic, magic rays
Is all about us in the blue
And in sight and sound they trace
Living pictures out of space
To bring this enchantment to you ...
Here also you can see a modern 32-line mechanical TV in action; a 1938 Nazi TV station ident (they named the station after Paul Nipkow, inventor of the Nipkow disc, so as to claim TV as an "Aryan" invention); and lastly, a TV advert for DuMont televisions featuring Wally Cox, later a "Hollywood Squares" regular and the voice of Underdog.

Tuesday, February 7, 2017

3D Movies: Always Just Over the Horizon

From its first appearance in 1922 to the current wave of films today, 3D has always been hailed as a great technical advance which would bring the cinema closer to its future as an all-encompassing form of entertainment. This future, alas, has always remained just over the horizon, and the reason is plain to see: 3D has always required special, add-on technologies that make films more expensive to produce, project, and view. That added cost has led to its being seen as a premium entertainment, which in turn has prevented it from becoming more widely used. Doubtless the current wave of 3D will fade, but in the meantime, it might be educational to take a look at Teleview, the very first 3D system for the cinema, as nearly all of the technological elements -- and all of the hurdles -- were there at the start, nearly ninety years ago.

Basically, there have always been two methods of achieving the effect of 3D. One, analogous to Kinemacolor's use of alternating frames, is an active method that uses alternating frames of the film for left-eye and right-eye views; such systems then require either a polarizing filter (with the projected images also alternating in polarity) or a synchronized electrical shutter for every viewer (this was the method of Teleview, and is seen in the diagram of the viewer above). Oddly, this is not only the earliest but also the latest system: 3D television similarly uses alternating frames, along with a special set of electronic glasses designed so that each eye sees only the frames made from "its" perspective (at $50 a pair, they're hardly cheap).

The other method, the passive one, is to project both left-eye and right-eye perspectives simultaneously, and use either red/blue or polarizing eyeglasses so that the overlapping images are "sorted out" by each eye. This has the advantage of cheap, disposable means of reception, but the disadvantage that the image on the screen will look poor to anyone without the eyewear. While we often associate this system and its red/blue glasses with the earlier heyday of 3D in the 1950's, polarizing glasses were in fact far more commonly used then, primarily because such films did not have to be printed on colored stock, or use color at all.

Today, converting a modern multiplex cinema to 3D costs about $300,000 a screen -- which, at some larger houses, would mean several million dollars. The practice has therefore been to convert only a few screens, which means that any film released in 3D will be on fewer screens, and even with a premium ticket price will make less for both the studios and the exhibitors. The dwindling economic returns, especially in the current recession, have caused some studios, such as Warner Brothers, to pull back from earlier commitments to making films, such as the last Harry Potter features, in 3D. The jury is still out on 3D TV, and my bet is that, before too long, we will once again associate 3D, that magnificent technology of the future, with the past.

Saturday, February 4, 2017

Later Developments in Cinema

The history of the development of cinema after the early portion of the silent era is largely -- though not entirely -- a question of the gradual progress towards both sound and color. Each of these, as we've already seen, started much earlier than generally imagined; sound began with Dickson's "Experimental Sound Film" of 1894, and hand-painted color had already reached a high-water mark with Georges Méliès's 1900 version of Joan of Arc. With sound, the great problem was synchronization; there were all kinds of schemes for keeping sound -- as a phonograph record, an optical code, or any other pre-recorded substrate -- in time with the image. When it came to color, hand-painting -- even with stencils, and armies of (mostly female) colorists -- remained a premium mode without a premium payback. The main use of color in commercial film, in fact, was tinting -- a process in which certain segments of the edited film were run through chemical baths. An emotional scene might be bathed in red, while another encounter would be shown in blue or purple. The advantage of tinting was that all the varied colors could be achieved in post-production, at the director's discretion. Scenes such as the "mellow yellow" frame from an unknown film of this era were common indeed. In some cases, tinted prints survive and have been restored; in others, the indications for tinting have been recreated in restoration.

At the same time, efforts progressed toward a technology that would bring about the appearance (at least) of full color. The pioneer in this field was Charles Urban, an American expat in England who had already achieved success with his black-and-white films in the era of the "Cinema of Attractions." Urban realized that persistence of vision, the same principle that enabled the illusion of motion, could enable an illusion of color as well; this was the basis of his "Kinemacolor" system. Black-and-white film was shot through a special camera with a spinning filter wheel, which exposed alternate frames through red and green filters. Once developed, the film was projected back through alternating red and green filters, so that the "red" frames appeared red and the "green" frames green; the result was something very close to the feeling of full color (though in fact the process missed part of the spectrum, with dark blue being very imperfectly reproduced). Urban's process also had the huge technical advantage that, although special cameras and projectors were needed, the film itself was just ordinary black-and-white stock. Urban promoted his system through ambitious, epic-sized films shown in specially built, luxurious cinemas. Unfortunately for Urban, he was sued by cinema pioneer William Friese-Greene, who (falsely) claimed he had had the idea for this kind of color alternation before. As has happened with modern patent lawsuits, the British judges had no grasp of the technology on which they were ruling, confusing concept with practical art, and Friese-Greene's scheme of staining alternate frames (which produced only a muddy mess) with Urban's far superior pictures. They ruled in favor of Friese-Greene, and Urban was eventually forced into bankruptcy. Friese-Greene was never able to bring his own system to the point of commercial success, though his son Claude, using a process much more like Urban's than his father's, made a number of fine early color films.

Ironically, it was to be one of William Friese-Greene's original concepts -- dyed films glued or bonded together -- which would ultimately be the precursor of modern color processes. The Technicolor company started out with a red/green additive system much like Urban's; they called this "System 1." Films made with Technicolor's early two-color processes have a haunting, greenish-yellowish hue which, while perfect for horror features such as "Dr. X" (1932), was less well suited for dramatic or comedic subjects. They next developed "System 2," a subtractive color process in which two dyed films were cemented together, but the finished film was prone to bubbling and cupping. A third system transferred the dyes to a fresh single film, but was still limited to two colors.

By the mid-1930's, Technicolor shifted to a three-strip system, in which scenes were shot on three separate films, which were then dyed and transferred to produce the final prints. This offered the first commercially successful full-color image, although red and green still had the most zing -- thus Victor Fleming's choice of ruby slippers and green witch's makeup for 1939's The Wizard of Oz. Not many people realize it, but "Color by Technicolor" was a licensed process not owned by the studios; filmmakers had to hire Technicolor's camera operators and technical consultants, as well as entrust post-production to the company's facilities.

Now, as to sound: at nearly the same time, various technologies were being tried to synchronize sound with moving pictures. Emile Berliner was involved with a disc-based system; Edison offered a cylinder-based one, but neither achieved real success. All the various attempts at sound stumbled over the issue of synchronization until the development of optical soundtrack systems, which in turn had to wait until amplified electrical recording became possible in the mid-1920's. These, because they could be recorded onto the actual film, and duplicated along with it, were both reliable and economically feasible, though of course exhibitors would have to invest in new equipment. Although hailed as the first sound picture, 1927's "The Jazz Singer" in fact only had sound in certain portions of the film, and still relied on the old sound-on-disc system. Rival technologies -- RCA's variable-area "Photophone" system and Western Electric's variable-density system -- vied to become the new industry standard.

The introduction of sound to film brought with it a host of technical problems: microphones had limited range, and had to be hidden in potted plants and tableware; camera noise was too easily picked up, and cameras had to be encased in sound-proof coverings. Mary Pickford, one of the greatest stars of her day and a founder of United Artists, had a terrible experience with her 1929 sound film, "Coquette"; she had to strain her voice to get it picked up by the microphones, and the results were far from flattering. Her UA partner Charlie Chaplin, though he eventually embraced the idea of using musical scores on his soundtracks, put off the use of the voice; aside from a phonograph recording, a one-liner ("Get back to work!"), and a nonsense song in 1936's "Modern Times," Chaplin did not use spoken dialogue in any of his films until "The Great Dictator" in 1940, though some years later he recorded narrative voice-overs for some of his early features. Nevertheless, sound, well before color, became a standard feature of film very soon after its introduction.

Next up: 3D film -- in 1922?!