A Brief History of Computing: Edition 2

Springer Science & Business Media

The history of computing has its origins at the outset of civilization, and the need for increasingly sophisticated calculations has grown as towns and communities evolved.

This lively and fascinating text traces the key developments in computation – from 3000 B.C. to the present day – in an easy-to-follow and concise manner. Providing a comprehensive introduction to the most significant events and concepts in the history of computing, the book embarks upon a journey from ancient Egypt to modern times, taking in mechanical calculators, early digital computers, the first personal computers and 3G mobile phones, among other topics. This expanded and revised new edition also examines the evolution of programming languages and the history of software engineering, in addition to such revolutions in computing as the invention of the World Wide Web.

Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, exercises, and a glossary; presents detailed information on major figures in computing, such as Boole, Babbage, Shannon, Turing, Zuse and Von Neumann; reviews the history of software engineering and of programming languages, including syntax and semantics; discusses the progress of artificial intelligence, with extension to such key disciplines as philosophy, psychology, linguistics, neural networks and cybernetics; examines the impact on society of the introduction of the personal computer, the World Wide Web, and the development of mobile phone technology; follows the evolution of a number of major technology companies, including IBM, Microsoft and Apple.

This clearly written and broad-ranging text both gives the reader a flavour of the history and stimulates further study in the subject. As such, it will be of great benefit to students of computer science, while also capturing the interest of the more casual reader.

More by Gerard O'Regan

From the earliest examples of computation to the digital devices that are ubiquitous in modern society, the application of mathematics to computing has underpinned the technology that has built our world.

This clearly written and enlightening textbook/reference provides a concise, introductory guide to the key mathematical concepts and techniques used by computer scientists. Spanning a wide range of topics – from number theory to software engineering – the book demonstrates the practical computing applications behind seemingly abstract ideas. The work of important figures such as Alan Turing and Robert Floyd is also discussed, highlighting how the theory has been informed by historical developments.

Topics and features: ideal for self-study, offering many pedagogical features such as chapter-opening key topics, chapter introductions and summaries, review questions, and a glossary; places our current state of knowledge within the context of the contributions made by early civilizations, such as the ancient Babylonians, Egyptians and Greeks; examines the building blocks of mathematics, including sets, relations and functions; presents an introduction to logic, formal methods and software engineering; explains the fundamentals of number theory, and its application in cryptography; describes the basics of coding theory, language theory, and graph theory; discusses the concept of computability and decidability; includes concise coverage of calculus, probability and statistics, matrices, complex numbers and quaternions.

This engaging and easy-to-understand book will appeal to students of computer science wishing for an overview of the mathematics used in computing, and to mathematicians curious about how their subject is applied in the field of computer science. The book will also capture the interest of the motivated general reader.

In order to maintain strong levels of customer satisfaction and loyalty, software developers face considerable pressure to meet expectations for high-quality software products that are consistently delivered on time.

Introduction to Software Quality describes the approaches used by software engineers to build quality into their software. The fundamental principles of software quality management and software process improvement are discussed in detail, with a particular focus on the Capability Maturity Model Integration (CMMI) framework.

Topics and features: includes review questions at the end of each chapter; covers both theory and practice, in addition to providing guidance on applying the theory in an industrial environment; examines all aspects of the software development process, including project planning and tracking, software lifecycles, software inspections and testing, configuration management, and software quality assurance; provides detailed coverage of software metrics and problem solving; describes SCAMPI appraisals and how they form part of the continuous improvement cycle; presents an introduction to formal methods and the Z specification language, which are important in the safety-critical field; discusses UML, which is used to describe the architecture of a system; reviews the history of the field of software quality, highlighting the pioneers who made key contributions to this area.

This clearly written and easy-to-follow textbook will be invaluable to students of computer science who wish to learn how to build high-quality and reliable software on time and on budget. Software engineers, quality professionals and software managers in industry will also find the book to be a useful tool for self-study.

Additional Information

Publisher
Springer Science & Business Media
Published on
Mar 5, 2012
Pages
264
ISBN
9781447123590
Language
English
Genres
Computers / General
Computers / History
Science / General
Science / History
Content Protection
This content is DRM protected.

Reading information

Smartphones and Tablets

Install the Google Play Books app for Android and iPad/iPhone. It syncs automatically with your account and allows you to read online or offline wherever you are.

Laptops and Computers

You can read books purchased on Google Play using your computer's web browser.

eReaders and other devices

To read on e-ink devices like the Sony eReader or Barnes & Noble Nook, you'll need to download a file and transfer it to your device. Please follow the detailed Help center instructions to transfer the files to supported eReaders.

How does a scientist go about solving problems? How do scientific discoveries happen? Why are cold fusion and parapsychology different from mainstream science? What is a scientific worldview? In this lively and wide-ranging book, Gregory Derry talks about these and other questions as he introduces the reader to the process of scientific thinking. From the discovery of X rays and semiconductors to the argument for continental drift to the invention of the smallpox vaccine, scientific work has proceeded through honest observation, critical reasoning, and sometimes just plain luck. Derry starts out with historical examples, leading readers through the events, experiments, blind alleys, and thoughts of scientists in the midst of discovery and invention. Readers at all levels will come away with an enriched appreciation of how science operates and how it connects with our daily lives.

An especially valuable feature of this book is the actual demonstration of scientific reasoning. Derry shows how scientists use a small number of powerful yet simple methods--symmetry, scaling, linearity, and feedback, for example--to construct realistic models that describe a number of diverse real-life problems, such as drug uptake in the body, the inner workings of atoms, and the laws of heredity.


Science involves a particular way of thinking about the world, and Derry shows the reader that a scientific viewpoint can benefit most personal philosophies and fields of study. With an eye to both the power and limits of science, he explores the relationships between science and topics such as religion, ethics, and philosophy. By tackling the subject of science from all angles, including the nuts and bolts of the trade as well as its place in the overall scheme of life, the book provides a perfect place to start thinking like a scientist.

FROM THE AUTHOR OF THE BESTSELLING BIOGRAPHIES OF BENJAMIN FRANKLIN AND ALBERT EINSTEIN, THIS IS THE EXCLUSIVE BIOGRAPHY OF STEVE JOBS.

Based on more than forty interviews with Jobs conducted over two years—as well as interviews with more than a hundred family members, friends, adversaries, competitors, and colleagues—Walter Isaacson has written a riveting story of the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing.

At a time when America is seeking ways to sustain its innovative edge, and when societies around the world are trying to build digital-age economies, Jobs stands as the ultimate icon of inventiveness and applied imagination. He knew that the best way to create value in the twenty-first century was to connect creativity with technology. He built a company where leaps of the imagination were combined with remarkable feats of engineering.

Although Jobs cooperated with this book, he asked for no control over what was written nor even the right to read it before it was published. He put nothing off-limits. He encouraged the people he knew to speak honestly. And Jobs speaks candidly, sometimes brutally so, about the people he worked with and competed against. His friends, foes, and colleagues provide an unvarnished view of the passions, perfectionism, obsessions, artistry, devilry, and compulsion for control that shaped his approach to business and the innovative products that resulted.

Driven by demons, Jobs could drive those around him to fury and despair. But his personality and products were interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is instructive and cautionary, filled with lessons about innovation, character, leadership, and values.

It has been upon the shoulders of giants that the modern world has been forged.

This accessible compendium presents an insight into the great minds responsible for the technology which has transformed our lives. Each pioneer is introduced with a brief biography, followed by a concise account of their key contributions to their discipline. The selection covers a broad spread of historical and contemporary figures from theoreticians to entrepreneurs, highlighting the richness of the field of computing.

Topics and features: describes the lives of, and the machines built by, Hermann Hollerith, Vannevar Bush, Howard Aiken, John Atanasoff, Tommy Flowers, John Mauchly, and Konrad Zuse; examines the contributions made by Claude Shannon, John von Neumann, Alan Turing, and Sir Frederic Williams; reviews such pioneers of commercial computing as John Backus, Fred Brooks, Gordon Moore, William Shockley, Vint Cerf, Don Estridge, Gary Kildall, and Tim Berners-Lee; surveys pivotal software engineers, including Robert Floyd, C.A.R. Hoare, Dines Bjørner, Edsger Dijkstra, Tom DeMarco, Michael Fagan, Watts Humphrey, Ivar Jacobson, David Parnas, and Ed Yourdon; discusses key figures in the development of programming languages, such as James Gosling, Grace Murray Hopper, Kenneth Iverson, Donald Knuth, Dennis Ritchie, Ken Thompson, Dana Scott, Christopher Strachey, Bjarne Stroustrup, and Niklaus Wirth; includes significant contributors to the field of artificial intelligence, including John McCarthy, Marvin Minsky, John Searle, and Joseph Weizenbaum; presents a selection of computer entrepreneurs, including Larry Ellison, Bill Gates, Steve Jobs, Ken Olsen, and Thomas Watson Sr. and Jr.

Suitable for the general reader, this concise and easy-to-read reference will be of interest to anyone curious about the inspiring men and women who have shaped the field of computer science.

A NEW YORK TIMES BESTSELLER

The official book behind the Academy Award-winning film The Imitation Game, starring Benedict Cumberbatch and Keira Knightley

It is only a slight exaggeration to say that the British mathematician Alan Turing (1912-1954) saved the Allies from the Nazis, invented the computer and artificial intelligence, and anticipated gay liberation by decades--all before his suicide at age forty-one. This New York Times–bestselling biography of the founder of computer science, with a new preface by the author that addresses Turing's royal pardon in 2013, is the definitive account of an extraordinary mind and life.


Capturing both the inner and outer drama of Turing’s life, Andrew Hodges tells how Turing’s revolutionary idea of 1936--the concept of a universal machine--laid the foundation for the modern computer and how Turing brought the idea to practical realization in 1945 with his electronic design. The book also tells how this work was directly related to Turing’s leading role in breaking the German Enigma ciphers during World War II, a scientific triumph that was critical to Allied victory in the Atlantic. At the same time, this is the tragic account of a man who, despite his wartime service, was eventually arrested, stripped of his security clearance, and forced to undergo a humiliating treatment program--all for trying to live honestly in a society that defined homosexuality as a crime.


The inspiration for a major motion picture starring Benedict Cumberbatch and Keira Knightley, Alan Turing: The Enigma is a gripping story of mathematics, computers, cryptography, and homosexual persecution.
