Friday, March 23, 2018

Social Media II

The range and size of social media networks have increased almost exponentially in the early years of the twenty-first century. We've gone from early forums in which only a few hundred people might participate, such as a BBS or a LISTSERV list, to truly mass media such as Facebook and Twitter, which together count billions of users around the globe.

But much more than just size has changed. At a certain 'tipping point,' social media begin to function in ways that, when they were smaller, would have been impossible. Facebook and Twitter have been credited with playing roles in the "Arab Spring" in the Middle East, particularly in Egypt and Tunisia; Facebook's founder has been the subject of a major Hollywood film; and Twitter feeds and cell-phone photos have brought down politicians of every party, sometimes within a matter of hours. It certainly sounds as though these technologies have crossed some threshold, altering the fabric of reality itself -- but then, of course, one can look back at similar claims made about virtual-reality video helmets (anyone remember Lawnmower Man?) and wonder whether these revolutions will still seem so revolutionary a few years from now.

Three key developments have shaped this period: 1) Social media with "presence" -- a main page at which users can add or copy content, offering images, texts, or video of their own making or choosing; 2) Sites with instant linkability -- the ability of users to add (or subtract) active and immediate connections to other users; and 3) Sites that bundle essential tools (e-mail, instant messaging, and other software capabilities). Finally, all of the above, or at least the survivors in this highly competitive field, have gone multi-platform; no social medium of the future will thrive unless it is available on desktops, laptops, tablets, and smartphones, and has some system of synchronizing all its users' preferences and updates.

So what next? The spaghetti is still being hurled at the (virtual) refrigerator wall. Blippy, a site that enabled shoppers to instantly "share" posts about their purchases, was hacked and credit cards compromised -- so much for that! Google tried to launch its own "Wikipedia killer," dubbed Knol, but the site filled up with spam so quickly that it became almost useless, and Google discontinued it; the company also failed to generate "Buzz," a social networking tool that irritated users with its auto-generated list of "contacts." Apple stumbled with Ping, an addition to its popular iTunes platform meant to enable people to share news about music purchases and performances. The latest entry, Pinterest, allows users to "pin" content and share it with one another, with a focus on bargain shopping, and has the unusual distinction that a majority of its users, in many surveys, are women. But will it go the way of the Lifetime network? And what of sites that advertise themselves as 'Pinterest for men'?

It may seem we've already "shared" too much in this era of TMI, and that these social media may be reaching their limits -- but I wouldn't bet on it.

Thursday, March 22, 2018

Social Media I

The evolution of social media can be conceived of in many ways -- in one sense, it could be said that language itself was the first social medium. If, though, we take a "social medium" to be any means of transmitting or recording language over time and space, then writing could well be seen as the earliest, followed swiftly by the development of the "letter" as a social form, which dates back to at least the seventh century BCE. The ancient Library of Ashurbanipal, King of Assyria from 668 to 627 BCE, included personal letters written in cuneiform on clay tablets.

The telegraph and telephone come next in line; even if, as a recent NY Times article noted, the phone is experiencing a slow decline, it remains our oldest electronic social medium. I'm old enough to remember the old "Reach out and touch someone" adverts for Ma Bell, and for a while there was nothing more direct and personal than a phone call. ARPANET went online in 1969, and electronic mail over it and its successors followed soon after, but e-mail did not become a common form of communication until the late 1980's; well before then, home computer users were setting up BBS sites where they could post notices and download simple programs. My home town of Cleveland had a huge site, Freenet, where you could also get medical advice from doctors at Case Western Reserve and University Hospitals. The WELL, a large social site based in San Francisco, was an early home of integrated mail, chatroom, and file services; perhaps not coincidentally, it was also the site of one of the first cases of online impersonation to go to court (a man was sued by two women for pretending to be a different, older woman who was a mutual friend).

In academia, the LISTSERV protocol brought people together by field and interest, and made it possible, in effect, to send a message to hundreds of people at once in search of advice or response; LISTSERVs were often associated with archives where you could search through older messages. Early online game spaces, such as MUDs (and, somewhat later, MOOs), go back to the late 1970's, and many became highly social, with tens of thousands of "inhabitants" maintaining spaces there. All of these interactions were exclusively text-based, and the only "graphics" consisted of what could be cobbled together out of ASCII characters.

It wasn't until the arrival of the commercial internet in 1993, and the spread of the WWW protocol around the same time, that social media really took off; by the end of the decade, Six Degrees, LiveJournal, Blogger, and Epinions had launched. In 2003, Second Life offered its users a virtual retake on their first lives, albeit with a graphical interface that looks primitive by today's standards; that same year, MySpace became the first modern social networking platform, and a model for Facebook the following year. With billions of users, including everyone from the President to the Pope to Adam West, Facebook certainly has the critical mass to change the face of human communication -- and yet, in recent years, the loss of many of its younger ("Millennial" generation) users has some people wondering whether it may someday go the way of MySpace.

Thursday, March 15, 2018

History of Computing II: Mouse Forward

The move toward the possibility of a computer that could truly be called "personal" begins in many ways with Douglas Engelbart's question: "If in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive, how much value could you derive from that?" The question was posed on December 9, 1968, at what has come to be called the "Mother of All Demos," where Engelbart and his team from the Augmentation Research Center at the Stanford Research Institute in Menlo Park, California, presented what they had built. Unlike anyone else in 1968, Engelbart had some concrete answers to this seemingly abstract question: he was seated at a console which included a chorded keyboard (not unlike that employed by the operator of the "Voder" at the 1939 World's Fair) as well as the first operational three-button mouse, which Engelbart had designed together with Bill English starting in 1963. Using this interface, as well as an audio and video projector, Engelbart demonstrated the other capacities of his system, which included collapsible and relational menus, a simple mapping system, a text editor, and basic programming tools.

Of course the computers that backed up Engelbart's console were still massive, and required a number of other human operators and technicians (he chats with several of them in the course of the demo). It would be nearly another sixteen years before advances in microchips, display screens, and hardware would enable the production of the Apple Macintosh, the first commercially successful computer to incorporate a mouse along with a graphical user interface (GUI) and some degree of WYSIWYG ("what you see is what you get") graphics. It was these technologies, much more than earlier screen-and-keyboard machines, that turned the modest interest in home computers into the revolution in personal computers that enabled the "Internet" age.

Interestingly, although Engelbart's console actually depended on a remote set of machines connected over a cable, it would be a long time before the computer became not only an independent platform but also a means of communication. Early modems were slow and unreliable, and took hours to send long files; even then, they more often connected to a remote "host" which was itself isolated from the 'net, such as a BBS system. The earliest version of the Internet, known as ARPANET, was built for the US Department of Defense to link DoD facilities with contractors and research universities; its "distributed" set of nodes drew on plans developed at the RAND Corporation for a network architecture most likely to survive a Soviet nuclear attack. Even well into the late 1980's, when I sent my first e-mail (I was then a grad student doing a work-study job at Brown's Graduate School offices), 90% of the traffic on the Internet went from one big host computer to another at universities and research institutes. I remember sending a message to someone with an odd-sounding hostname, and finding out only later that the user was in Tel Aviv, Israel!

The Internet was not opened to commercial traffic of any kind until 1993, and it was around this time that Sir Tim Berners-Lee's hypertext "world wide web" protocol, first released in 1991, was placed in the public domain, and Mosaic, the first widely used browser, came into use. This software, because it enabled terminal-to-terminal communication using an interface which worked in much the same way as the GUIs of individual computers (well, Macs in any case!), was the key step toward the 'net becoming a true mass medium. And it was only then that the answer, or rather answers, to Engelbart's question became clear, with billions of Internet users worldwide and global e-commerce quickly becoming a major means of trade and exchange throughout the developed world. And, of course, the humble "intellectual worker" -- such as yours truly -- has derived, and continues to derive, great value from all this; in the case of my most recent book, which took about six years to research, I'd estimate that, without access to Internet-based historical materials, the project would have taken at least twice as long, and cost tens of thousands of dollars in airfare to travel to and search through archives around the world.

Thursday, March 8, 2018

History of Computing I: The Colossi

The earliest notion of a "computer" that most people had in the nineteenth and twentieth centuries had one key feature: enormous size. Babbage's 1837 "Analytical Engine," widely regarded as the earliest ancestor of the computer, would -- had it ever been completed -- have filled a warehouse-sized room and weighed nearly 30,000 pounds. Babbage's designs used interlocking gears with various ratios to perform calculations, and his system contemplated a punch-card I/O unit, a calculating unit known as the "Mill" (a sort of CPU), and a storage unit he called the "Store." In his search for backers, he enlisted Lord Byron's daughter Ada Lovelace, who translated a French-language account of the Engine by the Italian engineer Luigi Menabrea and added her own erudite notes on its operations; in 1983, a new computer language designed for the US Department of Defense was christened "Ada" in her honor. Two working models of Babbage's simpler Difference Engine No. 2 have been built in recent years; you can see one of them in action here.

Few significant advances in computing were made until the late 1930's and early 1940's, when military needs -- calculating target data and, most importantly, breaking secret codes such as Germany's ENIGMA -- provided both the impetus and the funding. In the UK, researchers at Bletchley Park, led by the young mathematician Alan Turing, constructed machines they named "bombes," which used electrical relays and motors to run through hundreds of thousands of possible combinations of the wheels and wires of an Enigma machine. Later, Bletchley's engineers built a far more advanced, fully electronic machine, known as "Colossus," to attack the even more complex Lorenz teleprinter cipher. Advances in cryptography would eventually render all these machines obsolete -- indeed, a carefully prepared "one-time pad" or "Vernam cipher," provided the pad is truly random and never reused, is unbreakable once the pad text is destroyed (witness the message found on a skeletonized carrier pigeon, though some have claimed to have deciphered it).
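To make the Vernam idea concrete, here is a minimal sketch in Python -- my own illustration, not anything run on the wartime machines. Encryption is nothing more than XOR-ing each byte of the message with a byte of a truly random, never-reused pad; applying the same pad again decrypts it, and once the pad is gone, the ciphertext is equally consistent with every possible message of the same length.

```python
import secrets

def xor_with_pad(data: bytes, pad: bytes) -> bytes:
    """XOR each byte of data with the corresponding byte of the pad."""
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"ATTACK AT DAWN"                  # illustrative plaintext
pad = secrets.token_bytes(len(message))      # truly random, exactly as long as the message

ciphertext = xor_with_pad(message, pad)      # encrypt
recovered = xor_with_pad(ciphertext, pad)    # decrypt with the very same pad

assert recovered == message
# Destroy the pad and the ciphertext alone tells an eavesdropper nothing:
# for any guessed plaintext, there exists some pad that would produce this ciphertext.
```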

Yet other problems, such as calculating trajectories, remained. The first fully digital machine along these lines was ENIAC (short for Electronic Numerical Integrator And Computer), which used vacuum tubes -- more than 17,000 of them! -- as relays and switches. The machine had to be literally re-wired for each different kind of operation; the task was entrusted to a group of young women who, even though many of them had college degrees in mathematics and engineering, were regarded at first as little more than glorified switchboard operators. All of the units together weighed more than 60,000 pounds and consumed 150 kilowatts of power -- all to perform roughly 5,000 calculations per second. While this was many times faster than any earlier machine, it's the equivalent of a processor running at about 5 kHz -- hundreds of thousands of times slower, by clock rate alone, than the average desktop computer of today, and tens of millions of times slower in raw operations per second.
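For a rough sense of scale, here is a back-of-the-envelope comparison in Python; the modern-CPU figures (a 3.5 GHz clock, 8 cores, about 4 instructions per cycle) are my own illustrative assumptions, not measurements of any particular machine.

```python
# ENIAC versus an assumed modern desktop CPU: a rough, illustrative comparison.
eniac_ops_per_second = 5_000        # ~5,000 additions per second, i.e. roughly 5 kHz

modern_clock_hz = 3.5e9             # assumed ~3.5 GHz clock
cores = 8                           # assumed core count
instructions_per_cycle = 4          # assumed average instructions retired per cycle
modern_ops_per_second = modern_clock_hz * cores * instructions_per_cycle

print(f"By clock rate alone: {modern_clock_hz / eniac_ops_per_second:,.0f}x faster")
print(f"By raw throughput:   {modern_ops_per_second / eniac_ops_per_second:,.0f}x faster")
# Roughly 700,000x by clock rate, and on the order of tens of millions of times
# by raw instruction throughput, under these assumptions.
```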

The key invention which began the change from room- and building-sized machines to something that could actually fit in an office or a home was of course the transistor, developed at Bell Labs in 1947. The basic idea was to use a semiconductor sandwiched between more conductive materials; such a device, like a radio tube, could be used either as a signal amplifier or as a switch. There were several key advantages: transistors produced less heat, were cheaper to manufacture, and -- even in their early state -- were much smaller. Each of the women of ENIAC shown in the photo above is holding a unit with the same storage capacity; the first two decreases in size are due to smaller, specially-made tubes, but the last is due to transistors. A typical smartphone today has nearly a million times the number of transistors in this smallest unit.

With the war over, business demands drove the computer market. The first commercial computer aimed at this market was the UNIVAC, introduced in the early 1950's. For around $750,000, you got a CPU speed of 1.9 kHz, about 1.5 Kb of memory, and tape drives, each the size of a small refrigerator, each of whose tapes held on the order of a million characters. A decade later, IBM introduced its 1401 system; with the top model, one could now have 16 Kb of memory and perform almost 23,000 calculations per second -- 23 kHz. IBM did not sell the 1401, but you could lease one for around $2,500 a month. Home computing on a practical scale was still far in the future; although the SIMON and other home-kit computers were available throughout this period for home hobbyists, their minuscule capacity -- a mere 8 binary switches -- made them useless for any but the most limited tasks.