Based on more than forty interviews with Jobs conducted over two years—as well as interviews with more than a hundred family members, friends, adversaries, competitors, and colleagues—Walter Isaacson has written a riveting story of the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing.
At a time when America is seeking ways to sustain its innovative edge, and when societies around the world are trying to build digital-age economies, Jobs stands as the ultimate icon of inventiveness and applied imagination. He knew that the best way to create value in the twenty-first century was to connect creativity with technology. He built a company where leaps of the imagination were combined with remarkable feats of engineering.
Although Jobs cooperated with this book, he asked for no control over what was written nor even the right to read it before it was published. He put nothing off-limits. He encouraged the people he knew to speak honestly. And Jobs speaks candidly, sometimes brutally so, about the people he worked with and competed against. His friends, foes, and colleagues provide an unvarnished view of the passions, perfectionism, obsessions, artistry, devilry, and compulsion for control that shaped his approach to business and the innovative products that resulted.
Driven by demons, Jobs could drive those around him to fury and despair. But his personality and products were interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is instructive and cautionary, filled with lessons about innovation, character, leadership, and values.
The 27 full papers presented together with 4 invited talks were carefully reviewed and selected from 68 submissions. The papers cover a wide range of topics such as approximation algorithms, computational complexity, computational geometry, data structures, graph algorithms, graph coloring, graph exploration, and online algorithms.
The official book behind the Academy Award-winning film The Imitation Game, starring Benedict Cumberbatch and Keira Knightley.
It is only a slight exaggeration to say that the British mathematician Alan Turing (1912–1954) saved the Allies from the Nazis, invented the computer and artificial intelligence, and anticipated gay liberation by decades—all before his suicide at age forty-one. This New York Times–bestselling biography of the founder of computer science, with a new preface by the author that addresses Turing's royal pardon in 2013, is the definitive account of an extraordinary mind and life.
Capturing both the inner and outer drama of Turing’s life, Andrew Hodges tells how Turing’s revolutionary idea of 1936—the concept of a universal machine—laid the foundation for the modern computer and how Turing brought the idea to practical realization in 1945 with his electronic design. The book also tells how this work was directly related to Turing’s leading role in breaking the German Enigma ciphers during World War II, a scientific triumph that was critical to Allied victory in the Atlantic. At the same time, this is the tragic account of a man who, despite his wartime service, was eventually arrested, stripped of his security clearance, and forced to undergo a humiliating treatment program—all for trying to live honestly in a society that defined homosexuality as a crime.
The inspiration for a major motion picture starring Benedict Cumberbatch and Keira Knightley, Alan Turing: The Enigma is a gripping story of mathematics, computers, cryptography, and homosexual persecution.
Eric Schmidt is one of Silicon Valley’s great leaders, having taken Google from a small startup to one of the world’s most influential companies. Jared Cohen is the director of Google Ideas and a former adviser to secretaries of state Condoleezza Rice and Hillary Clinton. With their combined knowledge and experiences, the authors are uniquely positioned to take on some of the toughest questions about our future: Who will be more powerful in the future, the citizen or the state? Will technology make terrorism easier or harder to carry out? What is the relationship between privacy and security, and how much will we have to give up to be part of the new digital age?
In this groundbreaking book, Schmidt and Cohen combine observation and insight to outline the promise and peril awaiting us in the coming decades. At once pragmatic and inspirational, this is a forward-thinking account of where our world is headed and what this means for people, states and businesses.
With the confidence and clarity of visionaries, Schmidt and Cohen illustrate just how much we have to look forward to—and beware of—as the greatest information and technology revolution in human history continues to evolve. On individual, community and state levels, across every geographical and socioeconomic spectrum, they reveal the dramatic developments—good and bad—that will transform both our everyday lives and our understanding of self and society, as technology advances and our virtual identities become more and more fundamentally real.
As Schmidt and Cohen’s nuanced vision of the near future unfolds, an urban professional takes his driverless car to work, attends meetings via hologram and dispenses housekeeping robots by voice; a Congolese fisherwoman uses her smart phone to monitor market demand and coordinate sales (saving on costly refrigeration and preventing overfishing); the potential arises for “virtual statehood” and “Internet asylum” to liberate political dissidents and oppressed minorities, but also for tech-savvy autocracies (and perhaps democracies) to exploit their citizens’ mobile devices for ever more ubiquitous surveillance. Along the way, we meet a cadre of international figures—including Julian Assange—who explain their own visions of our technology-saturated future.
Inspiring, provocative and absorbing, The New Digital Age is a brilliant analysis of how our hyper-connected world will soon look, from two of our most prescient and informed public thinkers.
Computers have changed since 1981, when The Soul of a New Machine first examined the culture of the computer revolution. What has not changed is the feverish pace of the high-tech industry, the go-for-broke approach to business that has caused so many computer companies to win big (or go belly up), and the cult of pursuing mind-bending technological innovations.
The Soul of a New Machine is an essential chapter in the history of the machine that revolutionized the world in the twentieth century.
WikiLeaks brought to light a new form of whistleblowing, using powerful cryptographic code to hide leakers’ identities while they spill the private data of government agencies and corporations. But that technology has been evolving for decades in the hands of hackers and radical activists, from the libertarian enclaves of Northern California to Berlin to the Balkans. And the secret-killing machine continues to evolve beyond WikiLeaks, as a movement of hacktivists aims to obliterate the world’s institutional secrecy.
This is the story of the code and the characters—idealists, anarchists, extremists—who are transforming the next generation’s notion of what activism can be.
With unrivaled access to such major players as Julian Assange, Daniel Domscheit-Berg, and WikiLeaks’ shadowy engineer known as the Architect, never before interviewed, reporter Andy Greenberg unveils the world of politically-motivated hackers—who they are and how they operate.
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization—the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by “Moore's Law”; and the human-machine interface.
Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word “digital” in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general purpose computer; and ARPANET, the Internet's precursor. Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a “minicomputer” to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.
The Friendly Orange Glow is the first history to recount in fascinating detail the remarkable accomplishments and inspiring personal stories of the PLATO community. The addictive nature of PLATO both ruined many a college career and launched pathbreaking multimillion-dollar software products. Its development, impact, and eventual disappearance provide an instructive case study of technological innovation and disruption, project management, and missed opportunities. Above all, The Friendly Orange Glow at last reveals new perspectives on the origins of social computing and our internet-infatuated world.
Vikram Chandra has been a computer programmer for almost as long as he has been a novelist. In this extraordinary new book, his first work of nonfiction, he searches for the connections between the worlds of art and technology. Coders are obsessed with elegance and style, just as writers are, but do the words mean the same thing to both? Can we ascribe beauty to the craft of writing code?
Exploring such varied topics as logic gates and literary modernism, the machismo of tech geeks, the omnipresence of an "Indian Mafia" in Silicon Valley, and the writings of the eleventh-century Kashmiri thinker Abhinavagupta, Geek Sublime is both an idiosyncratic history of coding and a fascinating meditation on the writer's art. Part literary essay, part technology story, and part memoir, it is an engrossing, original, and heady book of sweeping ideas.
Lessig weaves the history of technology and its relevant laws to make a lucid and accessible case to protect the sanctity of intellectual freedom. He shows how the door to a future of ideas is being shut just as technology is creating extraordinary possibilities that have implications for all of us. Vital, eloquent, judicious and forthright, The Future of Ideas is a call to arms that we can ill afford to ignore.
Forget Apple and IBM. For that matter, forget Silicon Valley. The first personal computer, a self-contained unit with its own programmable processor, display, keyboard, internal memory, telephone interface, and mass storage of data, was born in San Antonio, TX. US Patent number 224,415, filed November 27, 1970, describes a machine that is the direct lineal ancestor of the PC as we know it today. The story begins in 1968, when two Texans, Phil Ray and Gus Roche, founded a firm called Computer Terminal Corporation. As the name implies, their first product, the Datapoint 3300, was a computer-terminal replacement for the mechanical Teletype. They knew all along, however, that the 3300 was only a way to get started, cover for their real intention: to create a programmable, mass-produced desktop computer. They brought in Jack Frassanito, Vic Poor, Jonathan Schmidt, Harry Pyle, and a team of designers, engineers, and programmers to create the Datapoint 2200. While working to reduce the computer's size and power requirements, the team realized that the 2200's processor could be printed on a silicon chip. Datapoint approached Intel, which dismissed the concept as a "dumb idea" but was willing to attempt it under a development contract. Intel belatedly came back with its chip, but by then the Datapoint 2200 was already in production. Intel added the chip to its catalog, designating it the 8008. A later upgrade, the 8080, formed the heart of the Altair and the IMSAI in the mid-seventies. With further development it was used in the first IBM PC, founding the PC revolution's chip dynasty. If you're using a PC, you're using a modernized Datapoint 2200.
John Naughton is The Observer's "Networker" columnist, a prominent blogger, and vice president of Wolfson College, Cambridge. The Times has said of his writing, "[it] draws on more than two decades of study to explain how the internet works and the challenges and opportunities it will offer to future generations," and Cory Doctorow raved that "this is the kind of primer you want to slide under your boss's door." In From Gutenberg to Zuckerberg, Naughton explores the living history of one of the most radically transformational technologies of all time.
From Gutenberg to Zuckerberg is a clear-eyed history of one of the most central features of modern life: the internet. Once a technological novelty and now the very plumbing of the Information Age, the internet is something we have learned to take largely for granted. So, how exactly has our society become so dependent upon a utility it barely understands? And what does it say about us that this is the case?
While explaining in highly engaging language the way the internet works and how it got that way, technologist John Naughton has distilled the noisy chatter surrounding the technology's relentless evolution into nine essential areas of understanding. In doing so, he affords readers deeper insight into the information economy and supplies the requisite knowledge to make better use of the technologies and networks around us, highlighting some of their fascinating and far-reaching implications along the way.
There are companies that create waves and those that ride or are drowned by them. As only he can, bestselling author Ken Auletta takes readers for a ride on the Google wave, telling the story of how it formed and crashed into traditional media businesses—from newspapers to books, to television, to movies, to telephones, to advertising, to Microsoft. With unprecedented access to Google’s founders and executives, as well as to those in media who are struggling to keep their heads above water, Auletta reveals how the industry is being disrupted and redefined.
Using Google as a stand-in for the digital revolution, Auletta takes readers inside Google’s closed-door meetings and paints portraits of Google’s notoriously private founders, Larry Page and Sergey Brin, as well as those who work with—and against—them. In his narrative, Auletta provides the fullest account ever told of Google’s rise, shares the “secret sauce” of Google’s success, and shows why the worlds of “new” and “old” media often communicate as if residents of different planets.
Google engineers start from an assumption that the old ways of doing things can be improved and made more efficient, an approach that has yielded remarkable results: Google will generate about $20 billion in advertising revenues this year, or more than the combined prime-time ad revenues of CBS, NBC, ABC, and FOX. And with its ownership of YouTube and its mobile phone and other initiatives, Google CEO Eric Schmidt tells Auletta his company is poised to become the world’s first $100 billion media company. Yet there are many obstacles that threaten Google’s future, and opposition from media companies and government regulators may be the least of these. Google faces internal threats, from its burgeoning size to losing focus to hubris. In coming years, Google’s faith in mathematical formulas and in slide-rule logic will be tested, just as it has been on Wall Street.
Distilling the knowledge accrued from a career of covering the media, Auletta will offer insights into what we know, and don’t know, about what the future holds for the imperiled industry.
Beyond Deep Blue: Chess in the Stratosphere tells the continuing story of the chess engine and its steady improvement from its victory over Garry Kasparov to ever-greater heights. The book provides analysis of the games alongside a detailed examination of the remarkable technological progress made by the engines – asking which one is best, how good it is, and how much better it can get.
• Presents a total of 118 games, played by 17 different chess engines, collected together for the first time in a single reference
• Details the processor speeds, memory sizes, and the number of processors used by each chess engine
• Reviews Deep Blue’s matches with Garry Kasparov in 1996 and 1997
• Includes games from 10 World Computer Chess Championships, and the three most recent major computer chess tournaments of the Internet Chess Club
• Covers the man-machine matches between Fritz and Kramnik in 2002 and 2006, and between Kasparov and Deep Junior in 2003
• Describes three historical matches between leading engines: Hydra versus Shredder, Junior versus Fritz, and Zappa versus Rybka
This fascinating account of the ongoing evolution of computer chess will appeal to both the general reader and to specialists in A.I. and computing. Chess players and aficionados will also appreciate this remarkable insight into the new superstars of the classic game.
In 1936, when he was just twenty-four years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing's Vision, Chris Bernhardt explains the theory, Turing's most important contribution, for the general reader. Bernhardt argues that the strength of Turing's theory is its simplicity, and that, explained in a straightforward manner, it is eminently understandable by the nonspecialist. As Marvin Minsky writes, “The sheer simplicity of the theory's foundation and extraordinary short path from this foundation to its logical and surprising conclusions give the theory a mathematical beauty that alone guarantees it a permanent place in computer theory.” Bernhardt begins with the foundation and systematically builds to the surprising conclusions. He also views Turing's theory in the context of mathematical history, other views of computation (including those of Alonzo Church), Turing's later work, and the birth of the modern computer.
In the paper, “On Computable Numbers, with an Application to the Entscheidungsproblem,” Turing thinks carefully about how humans perform computation, breaking it down into a sequence of steps, and then constructs theoretical machines capable of performing each step. Turing wanted to show that there were problems that were beyond any computer's ability to solve; in particular, he wanted to find a decision problem that he could prove was undecidable. To explain Turing's ideas, Bernhardt examines three well-known decision problems to explore the concept of undecidability; investigates theoretical computing machines, including Turing machines; explains universal machines; and proves that certain problems are undecidable, including Turing's problem concerning computable numbers.
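Turing's construction is concrete enough to sketch in code. Below is a minimal, hypothetical Python illustration (mine, not from Bernhardt's book) of a Turing machine reduced to a table of rules, each mapping a (state, symbol) pair to a write-move-next-state step, here wired up to increment a binary number; the step limit in the loop hints at why the halting question resists any general solution.

```python
# A minimal sketch (not from the book) of a Turing machine as a rule table.
# This hypothetical example increments a binary number written on the tape.

from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run a Turing machine; rules maps (state, symbol) -> (write, move, next_state)."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # infinite tape, mostly blank
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(cells[i] for i in sorted(cells)).strip(blank)
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("no halt within step limit")  # the hard case, in general

# Rules for binary increment: walk right past the number, then add 1 with carry.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # fell off the right end; turn back
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry = 0, carry propagates
    ("carry", "0"): ("1", "L", "halt"),    # 0 + carry = 1, done
    ("carry", "_"): ("1", "L", "halt"),    # overflow: write a new leading 1
}

print(run_turing_machine(rules, "1011"))  # 1011 + 1 -> 1100
```

A universal machine, in Turing's sense, is essentially this interpreter with the rule table itself encoded on the tape alongside the input.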
This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists—programmers, systems analysts, and other software developers—who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the “computer boys” were taking over, not just in the corporate setting, but also in government, politics, and society in general.
In The Computer Boys Take Over, Nathan Ensmenger traces the rise to power of the computer expert in modern American society. His rich and nuanced portrayal of the men and women (a surprising number of the “computer boys” were, in fact, female) who built their careers around the novel technology of electronic computing explores issues of power, identity, and expertise that have only become more significant in our increasingly computerized society.
In his recasting of the drama of the computer revolution through the eyes of its principal revolutionaries, Ensmenger reminds us that the computerization of modern society was not an inevitable process driven by impersonal technological or economic imperatives, but was rather a creative, contentious, and, above all, fundamentally human development.
Chinese writing is character based, the one major world script that is neither alphabetic nor syllabic. Through the years, the Chinese written language encountered presumed alphabetic universalism in the form of Morse Code, Braille, stenography, Linotype, punch cards, word processing, and other systems developed with the Latin alphabet in mind. This book is about those encounters—in particular thousands of Chinese characters versus the typewriter and its QWERTY keyboard. Thomas Mullaney describes a fascinating series of experiments, prototypes, failures, and successes in the century-long quest for a workable Chinese typewriter.
The earliest Chinese typewriters, Mullaney tells us, were figments of popular imagination, sensational accounts of twelve-foot keyboards with 5,000 keys. One of the first Chinese typewriters actually constructed was invented by a Christian missionary, who organized characters by common usage (but promoted the less-common characters for “Jesus” to the common usage level). Later came typewriters manufactured for use in Chinese offices, and typewriting schools that turned out trained “typewriter girls” and “typewriter boys.” Still later was the “Double Pigeon” typewriter produced by the Shanghai Calculator and Typewriter Factory, the typewriter of choice under Mao. Clerks and secretaries in this era experimented with alternative ways of organizing characters on their tray beds, inventing an input method that was the first instance of “predictive text.”
Today, after more than a century of resistance against the alphabetic, not only have Chinese characters prevailed, they form the linguistic substrate of the vibrant world of Chinese information technology. The Chinese Typewriter, which is not just an “object history” but also grapples with broad questions of technological change and global communication, shows how this happened.
A Study of the Weatherhead East Asian Institute
Darling-Hammond is the Charles E. Ducommun Professor of Education at Stanford University, a chief education advisor to President Obama, Co-Director of the Stanford Center for Opportunity Policy in Education, and Founding Director of the School Redesign Network at Stanford.
The Innovators by Walter Isaacson - A 30-minute Summary
Inside this Instaread Summary:
• Overview of the entire book
• Introduction to the important people in the book
• Summary and analysis of all the chapters in the book
• Key Takeaways of the book
• A Reader's Perspective
Preview of this summary:
Ada Byron, the daughter of poet Lord Byron, was tutored in math by her mother. As a result, she grew up comfortable with the combination of art and science. She met Charles Babbage, a science and math expert. Babbage demonstrated a model of a machine that he built called a Difference Engine that could solve polynomial equations.
Ada was inspired by Babbage’s Difference Engine and decided to undertake advanced lessons in mathematics. Ada became interested in mechanical weaving looms that used punch cards to create patterns in fabric. She recognized the similarity between the looms and Babbage’s Difference Engine. Ada married William King who became the Earl of Lovelace. This made her Ada, Countess of Lovelace, or more commonly, Ada Lovelace.
Babbage had an idea for another machine. He wanted to create a computer that could carry out different operations. He called his concept an Analytical Engine. Babbage wanted to use punch cards in his new machine similar to the ones used in looms.
Ada Lovelace believed in his idea and imagined that it might be used to process other symbolic notations such as for music and art in addition to numbers. From 1842 to 1843, she wrote a translation of notes written by a young military engineer about the Analytical Engine. Her notes became more famous than the engineer’s original article.
Ada’s notes covered four principles of historical significance. The first was that this would be a multi-purpose machine. The second was that it could process and act upon anything that could be expressed in symbols. The third was that the machine would work because of specific instructions given to it. Ada created this sequence of operations herself and wrote it up into a table and diagram. Her creation made her the world’s first computer programmer. The fourth concept Ada wrote about was that computers could not think and could only perform as they were instructed. Babbage’s machine was never built, and Ada never wrote another scientific paper, but their ideas were the beginnings of the digital age that came a century later.
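To make the engines' trick concrete (this sketch is mine, not part of the summary): the Difference Engine exploited the fact that a degree-n polynomial has constant nth differences, so once a few initial values are set, every further value falls out of pure addition, exactly what gears can do. A hypothetical Python rendering:

```python
# A minimal sketch (my illustration, not Instaread's) of the method of finite
# differences that Babbage's Difference Engine mechanized.

def difference_table(values, order):
    """Return [f(0), Δf(0), Δ²f(0), ...] up to the given order."""
    row, table = list(values), []
    for _ in range(order + 1):
        table.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return table

def tabulate(diffs, count):
    """Extend the table: each new value needs only additions, like turning a crank."""
    diffs, out = list(diffs), []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]  # propagate the (eventually constant) differences
    return out

f = lambda x: x * x + x + 41           # a convenient degree-2 example
seed = [f(x) for x in range(3)]        # a degree-2 polynomial needs 3 seed values
print(tabulate(difference_table(seed, 2), 8))  # equals [f(0), ..., f(7)]
```

The Analytical Engine was the leap beyond this: instead of one fixed schedule of additions, it would follow arbitrary instructions supplied on punch cards, which is precisely the generality Ada's notes explored.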
In this book, Arimasa Naitoh, the father of the ThinkPad, collaborates with American business journalist and author William J. Holstein to write candidly about the incredible technological and personal struggles he and fellow engineers faced. And he offers his vision of the future of mobile computing—because this revolution is not even close to being finished.
It began with Dungeons & Dragons, a fantasy role-playing game. Through the next 40 years, computer game developers used these fantasy worlds as archetypes for the budding virtual game worlds. These games would become as varied as books in a library, but the essence of each was built upon community. People gathered and played...together. Dungeons & Dreamers: A story of how computer games created a global community follows the designers, developers, and players who built the virtual games and communities that define today's digital entertainment landscape, and explores the nature of what it means to live and thrive in virtual communities.
Computer graphics (or CG) has changed the way we experience the art of moving images. Computer graphics is the difference between Steamboat Willie and Buzz Lightyear, between ping pong and PONG. It began in 1963 when an MIT graduate student named Ivan Sutherland created Sketchpad, the first true computer animation program. Sutherland noted: “Since motion can be put into Sketchpad drawings, it might be exciting to try making cartoons.” This book, the first full-length history of CG, shows us how Sutherland's seemingly offhand idea grew into a multibillion dollar industry.
In Moving Innovation, Tom Sito—himself an animator and industry insider for more than thirty years—describes the evolution of CG. His story features a memorable cast of characters—math nerds, avant-garde artists, cold warriors, hippies, video game enthusiasts, and studio executives: disparate types united by a common vision. Sito shows us how fifty years of work by this motley crew made movies like Toy Story and Avatar possible.
This book is a treasure trove for anyone interested in the technology of early digital computers and in the impact these machines had, and still have, on our current computer systems.
Part history, part memoir and part cultural study, Network Geeks charts the creation of the Internet and the establishment of the Internet Engineering Task Force, from the viewpoint of a self-proclaimed geek who witnessed these developments first-hand. With boundless enthusiasm and abundant humour, Brian Carpenter leads the reader on a journey from post-war Britain to post-millennium New Zealand, describing how the Internet grew into today’s ubiquitous, global network, including the genesis of the World-Wide Web in the hotbeds of a particle collider at CERN. Illuminating the science and technology behind the apparent “magic trick” of the Internet, Network Geeks opens a window into the initially bewildering world of the Internet engineering geek. After reading this book, you may wish to join this world yourself.
Write-ups, specs, and pictures of over 85 collectible consoles and variant models from 1972 to 2000, from the Magnavox Odyssey right through to the Sega Dreamcast.
Includes the history of the evolution of electronic gaming and advice on how to collect classic consoles.
A comprehensive database of collectible consoles, written by fellow collectors and enthusiasts.
Today, women earn a relatively low percentage of computer science degrees and hold proportionately few technical computing jobs. Meanwhile, the stereotype of the male “computer geek” seems to be everywhere in popular culture. Few people know that women were a significant presence in the early decades of computing in both the United States and Britain. Indeed, programming in postwar years was considered woman's work (perhaps in contrast to the more manly task of building the computers themselves). In Recoding Gender, Janet Abbate explores the untold history of women in computer science and programming from the Second World War to the late twentieth century. Demonstrating how gender has shaped the culture of computing, she offers a valuable historical perspective on today's concerns over women's underrepresentation in the field.
Abbate describes the experiences of women who worked with the earliest electronic digital computers: Colossus, the wartime codebreaking computer at Bletchley Park outside London, and the American ENIAC, developed to calculate ballistics. She examines postwar methods for recruiting programmers, and the 1960s redefinition of programming as the more masculine “software engineering.” She describes the social and business innovations of two early software entrepreneurs, Elsie Shutt and Stephanie Shirley; and she examines the career paths of women in academic computer science.
Abbate's account of the bold and creative strategies of women who loved computing work, excelled at it, and forged successful careers will provide inspiration for those working to change gendered computing culture.
One night in the late 1930s, in a bar on the Illinois–Iowa border, John Vincent Atanasoff, a professor of physics at Iowa State University, after a frustrating day performing tedious mathematical calculations in his lab, hit on the idea that the binary number system and electronic switches, combined with an array of capacitors on a moving drum to serve as memory, could yield a computing machine that would make his life and the lives of other similarly burdened scientists easier. Then he went back and built the machine. It worked. The whole world changed.
Why don’t we know the name of John Atanasoff as well as we know those of Alan Turing and John von Neumann? Because he never patented the device, and because the developers of the far-better-known ENIAC almost certainly stole critical ideas from him. But in 1973 a court declared that the patent on that Sperry Rand device was invalid, opening the intellectual property gates to the computer revolution.
Jane Smiley tells the quintessentially American story of the child of immigrants John Atanasoff with technical clarity and narrative drive, making the race to develop digital computing as gripping as a real-life techno-thriller.
In the 1930s a series of seminal works published by Alan Turing, Kurt Gödel, Alonzo Church, and others established the theoretical basis for computability. This work, advancing precise characterizations of effective, algorithmic computability, was the culmination of intensive investigations into the foundations of mathematics. In the decades since, the theory of computability has moved to the center of discussions in philosophy, computer science, and cognitive science. In this volume, distinguished computer scientists, mathematicians, logicians, and philosophers consider the conceptual foundations of computability in light of our modern understanding.
Some chapters focus on the pioneering work by Turing, Gödel, and Church, including the Church-Turing thesis and Gödel's response to Church's and Turing's proposals. Other chapters cover more recent technical developments, including computability over the reals, Gödel's influence on mathematical logic and on recursion theory, and the impact of work by Turing and Emil Post on our theoretical understanding of online and interactive computing; and others relate computability and complexity to issues in the philosophy of mind, the philosophy of science, and the philosophy of mathematics.
Contributors: Scott Aaronson, Dorit Aharonov, B. Jack Copeland, Martin Davis, Solomon Feferman, Saul Kripke, Carl J. Posy, Hilary Putnam, Oron Shagrir, Stewart Shapiro, Wilfried Sieg, Robert I. Soare, and Umesh V. Vazirani.
The vast majority of all email sent every day is spam, a variety of idiosyncratically spelled requests to provide account information, invitations to spend money on dubious products, and pleas to send cash overseas. Most of it is caught by filters before ever reaching an in-box. Where does it come from? As Finn Brunton explains in Spam, it is produced and shaped by many different populations around the world: programmers, con artists, bots and their botmasters, pharmaceutical merchants, marketers, identity thieves, crooked bankers and their victims, cops, lawyers, network security professionals, vigilantes, and hackers. Every time we go online, we participate in the system of spam, with choices, refusals, and purchases the consequences of which we may not understand.
This is a book about what spam is, how it works, and what it means. Brunton provides a cultural history that stretches from pranks on early computer networks to the construction of a global criminal infrastructure. The history of spam, Brunton shows us, is a shadow history of the Internet itself, with spam emerging as the mirror image of the online communities it targets. Brunton traces spam through three epochs: the 1970s to 1995, and the early, noncommercial computer networks that became the Internet; 1995 to 2003, with the dot-com boom, the rise of spam's entrepreneurs, and the first efforts at regulating spam; and 2003 to the present, with the war of algorithms—spam versus anti-spam. Spam shows us how technologies, from email to search engines, are transformed by unintended consequences and adaptations, and how online communities develop and invent governance for themselves.
Welcome to the software paradox. In this O’Reilly report, RedMonk’s Stephen O’Grady explains why the real money no longer lies in software, and what it means for companies that depend on that revenue. You’ll learn how this paradox came about and what your company can do in response.
This book covers:
• Why it’s growing more difficult to sell software on a standalone basis
• How software has come full circle, from enabler to product and back again
• The roles that open source, software-as-a-service, and subscriptions play
• How software developers have become the new kingmakers
• Why Microsoft, Apple, and Google epitomize this transition
• How the paradox has affected other tech giants, such as Oracle and Salesforce.com
• Strategies your software firm can explore, including alternative revenue models
Conceived in 1943, completed in 1945, and decommissioned in 1955, ENIAC (the Electronic Numerical Integrator and Computer) was the first general-purpose programmable electronic computer. But ENIAC was more than just a milestone on the road to the modern computer. During its decade of operational life, ENIAC calculated sines and cosines and tested for statistical outliers, plotted the trajectories of bombs and shells, and ran the first numerical weather simulations. ENIAC in Action tells the whole story for the first time, from ENIAC's design, construction, testing, and use to its afterlife as part of computing folklore. It highlights the complex relationship of ENIAC and its designers to the revolutionary approaches to computer architecture and coding first documented by John von Neumann in 1945.
Within this broad sweep, the authors emphasize the crucial but previously neglected years of 1947 to 1948, when ENIAC was reconfigured to run what the authors claim was the first modern computer program to be executed: a simulation of atomic fission for Los Alamos researchers. The authors view ENIAC from diverse perspectives—as a machine of war, as the “first computer,” as a material artifact constantly remade by its users, and as a subject of (contradictory) historical narratives. They integrate the history of the machine and its applications, describing the mathematicians, scientists, and engineers who proposed and designed ENIAC as well as the men—and particularly the women—who built, programmed, and operated it.
In 1959, the electronics manufacturer Stromberg-Carlson produced the S-C 4020, a device that allowed mainframe computers to present and preserve images. In the mainframe era, the output of text and image was quite literally peripheral; the S-C 4020—a strange and elaborate apparatus, with a cathode ray screen, a tape deck, a buffer unit, a film camera, and a photo-paper camera—produced most of the computer graphics of the late 1950s and early 1960s. At Bell Laboratories in Murray Hill, New Jersey, the S-C 4020 became a crucial part of ongoing encounters among art, science, and technology. In this book, Zabet Patterson examines the extraordinary uses to which the Bell Labs S-C 4020 was put between 1961 and 1972, exploring a series of early computer art projects shaped by the special computational affordances of the S-C 4020.
The S-C 4020 produced tabular data, graph plotting and design drawings, grid projections, and drawings of axes and vectors; it made previously impossible visualizations possible. Among the works Patterson describes are E. E. Zajac's short film of an orbiting satellite, which drew on the machine's graphic capacities as well as the mainframe's calculations; a groundbreaking exhibit of “computer generated pictures” by Béla Julesz and Michael Noll, two scientists interested in visualization; animations by Kenneth Knowlton and the Bell Labs artist-in-residence Stan VanDerBeek; and Lillian Schwartz's “cybernetic” film Pixillation.
Arguing for the centrality of a peripheral, Patterson makes a case for considering computational systems not simply as machines but in their cultural and historical context.
In 1944, Britain led the world in electronic computing. By 1974, the British computer industry was all but extinct. What happened in the intervening thirty years holds lessons for all postindustrial superpowers. As Britain struggled to use technology to retain its global power, the nation's inability to manage its technical labor force hobbled its transition into the information age.
In Programmed Inequality, Marie Hicks explores the story of labor feminization and gendered technocracy that undercut British efforts to computerize. That failure sprang from the government's systematic neglect of its largest trained technical workforce simply because they were women. Women were a hidden engine of growth in high technology from World War II to the 1960s. As computing experienced a gender flip, becoming male-identified in the 1960s and 1970s, labor problems grew into structural ones and gender discrimination caused the nation's largest computer user—the civil service and sprawling public sector—to make decisions that were disastrous for the British computer industry and the nation as a whole.
Drawing on recently opened government files, personal interviews, and the archives of major British computer companies, Programmed Inequality takes aim at the fiction of technological meritocracy. Hicks explains why, even today, possessing technical skill is not enough to ensure that women will rise to the top in science and technology fields. Programmed Inequality shows how the disappearance of women from the field had grave macroeconomic consequences for Britain, and why the United States risks repeating those errors in the twenty-first century.
Beautiful Security explores this challenging subject with insightful essays and analysis on topics that include:
• The underground economy for personal information: how it works, the relationships among criminals, and some of the new ways they pounce on their prey
• How social networking, cloud computing, and other popular trends help or hurt our online security
• How metrics, requirements gathering, design, and law can take security to a higher level
• The real, little-publicized history of PGP
This book includes contributions from:
• Peiter “Mudge” Zatko
• Jim Stickley
• Elizabeth Nichols
• Chenxi Wang
• Ed Bellis
• Ben Edelman
• Phil Zimmermann and Jon Callas
• Kathy Wang
• Mark Curphey
• John McManus
• James Routh
• Randy V. Sabett
• Anton Chuvakin
• Grant Geyer and Brian Dunphy
• Peter Wayner
• Michael Wood and Fernando Francisco
All royalties will be donated to the Internet Engineering Task Force (IETF).
In Design For How People Learn, Second Edition, you'll discover how to use the key principles behind learning, memory, and attention to create materials that enable your audience to both gain and retain the knowledge and skills you're sharing. Updated to cover new insights and research into how we learn and remember, this new edition includes new techniques for using social media for learning as well as two brand new chapters on designing for habit and best practices for evaluating learning, such as how and when to use tests. Using accessible visual metaphors and concrete methods and examples, Design For How People Learn, Second Edition will teach you how to leverage the fundamental concepts of instructional design both to improve your own learning and to engage your audience.
This lively and fascinating text traces the key developments in computation – from 3000 B.C. to the present day – in an easy-to-follow and concise manner. Providing a comprehensive introduction to the most significant events and concepts in the history of computing, the book embarks upon a journey from ancient Egypt to modern times; taking in mechanical calculators, early digital computers, the first personal computers and 3G mobile phones, among other topics. This expanded and revised new edition also examines the evolution of programming languages and the history of software engineering, in addition to such revolutions in computing as the invention of the World Wide Web.
Topics and features:
• Ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary
• Presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and von Neumann
• Reviews the history of software engineering and of programming languages, including syntax and semantics
• Discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics
• Examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology
• Follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple
This clearly written and broad-ranging text both gives the reader a flavour of the history and stimulates further study in the subject. As such, it will be of great benefit to students of computer science, while also capturing the interest of the more casual reader.
With a novelist's sensitivity, David Leavitt portrays Turing in all his humanity—his eccentricities, his brilliance, his fatal candor—and elegantly explains his work and its implications.
Even with her talents, Bartik met obstacles in her career due to attitudes about women’s roles in the workplace. Her perseverance paid off and she worked with the earliest computer pioneers and helped launch the commercial computer industry. Despite their contributions, Bartik and the other female ENIAC programmers have been largely ignored. In the only autobiography by any of the six original ENIAC programmers, Bartik tells her story, exposing myths about the computer’s origin and properly crediting those behind the computing innovations that shape our daily lives.
Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world of women and men who did the hard computational labor of science. His grandmother's casual remark, "I wish I'd used my calculus," hinted at a career deferred and an education forgotten, a secret life unappreciated; like many highly educated women of her generation, she studied to become a human computer because nothing else would offer her a place in the scientific world.
The book begins with the return of Halley's comet in 1758 and the effort of three French astronomers to compute its orbit. It ends four cycles later, with a UNIVAC electronic computer projecting the 1986 orbit. In between, Grier tells us about the surveyors of the French Revolution, describes the calculating machines of Charles Babbage, and guides the reader through the Great Depression to marvel at the giant computing room of the Works Progress Administration.
When Computers Were Human is the sad but lyrical story of workers who gladly did the hard labor of research calculation in the hope that they might be part of the scientific community. In the end, they were rewarded by a new electronic machine that took the place and the name of those who were, once, the computers.
With hilarious, exasperated acuity, social critic Hal Niedzviecki dives into peep, starting his own video blog, joining every social network that will have him, monitoring the movements of his toddler, selling his secrets on Craigslist, hiring a private detective to investigate him, spying on his neighbors, trying out for reality TV shows, and stripping for the pleasure of a web audience he isn’t even sure exists. Part travelogue, part diary, part meditation and social history, The Peep Diaries explores a rapidly emerging digital phenomenon that is radically changing not just the entertainment landscape, but also the firmaments of our culture and society.
The Peep Diaries introduces the arrival of the age of peep culture and explores its implications for entertainment, society, sex, politics, and everyday life. Mixing first-rate reporting with sociological observations culled from the latest research, this book captures the shift from pop to peep and the way technology is turning gossip into documentary and Peeping Toms into entertainment journalists. Packed with stranger-than-fiction true-life characters and scenarios, The Peep Diaries reflects the aspirations and confusions of the growing number of people willing to trade the details of their private lives for catharsis, attention, and notoriety.
"Take a peek at The Peep Diaries, an erudite (but not too erudite) look at the culture that Facebook, Twitter, et al. have spawned." —Real Simple
"It’s a great read; it mixes frank interviews with people pushing the boundaries of voyeurism and exhibitionism, alongside a bracing critique of the social context that got us into peep culture and the forces that now exploit our participation in it.” —The Globe and Mail
"A snapshot of a world in profound transformation. Compelling and creepy." —Naomi Klein, author of The Shock Doctrine and No Logo
"If you've found yourself obsessively posting to Facebook, Twitter, and YouTube – and becoming a little uneasy about how it's changing your life – you should read this book. The Peep Diaries is a superb investigation into how technology is shifting the landscape of our private lives." —Clive Thompson, Wired magazine columnist
"A cogent and penetrating analysis. I certainly hope, as The Peep Diaries suggests, that the cruel spectacle we're witnessing on the tube most evenings actually holds some hope for a more loving future." —Douglas Rushkoff, author of Media Virus and Life, Inc.
Hal Niedzviecki is the founder of Broken Pencil magazine and has published numerous works of social commentary and fiction, including Hello I’m Special: How Individuality Became the New Conformity and Look Down, This Is Where It Must Have Happened, which is also published by City Lights Publishers.
In kindergartens these days, children spend more time with math worksheets and phonics flashcards than building blocks and finger paint. Kindergarten is becoming more like the rest of school. In Lifelong Kindergarten, learning expert Mitchel Resnick argues for exactly the opposite: the rest of school (even the rest of life) should be more like kindergarten. To thrive in today's fast-changing world, people of all ages must learn to think and act creatively—and the best way to do that is by focusing more on imagining, creating, playing, sharing, and reflecting, just as children do in traditional kindergartens.
Drawing on experiences from more than thirty years at MIT's Media Lab, Resnick discusses new technologies and strategies for engaging young people in creative learning experiences. He tells stories of how children are programming their own games, stories, and inventions (for example, a diary security system, created by a twelve-year-old girl), and collaborating through remixing, crowdsourcing, and large-scale group projects (such as a Halloween-themed game called Night at Dreary Castle, produced by more than twenty kids scattered around the world). By providing young people with opportunities to work on projects, based on their passions, in collaboration with peers, in a playful spirit, we can help them prepare for a world where creative thinking is more important than ever before.
Blended learning has the power to reinvent education, but the transition requires a new approach to learning and a new skillset for educators. Loaded with research and examples, Blended Learning in Action demonstrates the advantages a blended model has over traditional instruction when technology is used to engage students both inside the classroom and online. Readers will find:
• Breakdowns of the most effective classroom setups for blended learning
• Tips for leaders
• Ideas for personalizing and differentiating instruction using technology
• Strategies for managing devices in schools
• Questions to facilitate professional development and deeper learning
This book is ideal for anyone who likes puzzles, brainteasers, games, gambling, and magic tricks, and for anyone who wants to apply math and science to everyday circumstances. Several hacks in the first chapter alone—such as the "central limit theorem," which allows you to know everything by knowing just a little—serve as sound approaches for marketing and other business objectives. Using the tools of inferential statistics, you can understand the way probability works, discover relationships, predict events with uncanny accuracy, and even make a little money with a well-placed wager here and there.
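As a concrete taste of that first-chapter hack (my illustration, not the book's): the central limit theorem says that averages of even modest samples from a skewed population pile up tightly around the true mean, which is what lets you "know everything by knowing just a little."

```python
# A minimal sketch (my example, not from Statistics Hacks) of the central
# limit theorem: means of small samples cluster around the population mean.

import random
import statistics

random.seed(42)

# A deliberately skewed population (exponential), far from bell-shaped.
population = [random.expovariate(1.0) for _ in range(100_000)]

# Take many small samples and record each sample's mean.
sample_means = [
    statistics.mean(random.sample(population, 30))  # "knowing just a little"
    for _ in range(1000)
]

print(f"population mean:      {statistics.mean(population):.3f}")    # ~1.0
print(f"mean of sample means: {statistics.mean(sample_means):.3f}")  # ~1.0
print(f"spread of the means:  {statistics.stdev(sample_means):.3f}") # ~ sigma/sqrt(30)
```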
Statistics Hacks presents useful techniques from statistics, educational and psychological measurement, and experimental research to help you solve a variety of problems in business, games, and life. You'll learn how to:
• Play smart when you play Texas Hold 'Em, blackjack, roulette, dice games, or even the lottery
• Design your own winnable bar bets to make money and amaze your friends
• Predict the outcomes of baseball games, know when to "go for two" in football, and anticipate the winners of other sporting events with surprising accuracy
• Demystify amazing coincidences and distinguish the truly random from the only seemingly random—and even keep your iPod's "random" shuffle honest
• Spot fraudulent data, detect plagiarism, and break codes
• Isolate the effects of observation on the thing observed
Whether you're a statistics enthusiast who does calculations in your sleep or a civilian who is entertained by clever solutions to interesting problems, Statistics Hacks has tools to give you an edge over the world's slim odds.