Blending the informed analysis of The Signal and the Noise with the instructive iconoclasm of Think Like a Freak, a fascinating, illuminating, and witty look at what the vast amounts of information now instantly available to us reveal about ourselves and our world—provided we ask the right questions.
By the end of an average day in the early twenty-first century, human beings searching the internet will amass eight trillion gigabytes of data. This staggering amount of information—unprecedented in history—can tell us a great deal about who we are—the fears, desires, and behaviors that drive us, and the conscious and unconscious decisions we make. From the profound to the mundane, we can gain astonishing knowledge about the human psyche that, less than twenty years ago, seemed unfathomable.
Everybody Lies offers fascinating, surprising, and sometimes laugh-out-loud insights into everything from economics to ethics to sports to race to sex, gender, and more, all drawn from the world of big data. What percentage of white voters didn’t vote for Barack Obama because he’s black? Does where you go to school affect how successful you are in life? Do parents secretly favor boy children over girls? Do violent films affect the crime rate? Can you beat the stock market? How regularly do we lie about our sex lives, and who’s more self-conscious about sex, men or women?
Investigating these questions and a host of others, Seth Stephens-Davidowitz offers revelations that can help us understand ourselves and our lives better. Drawing on studies and experiments on how we really live and think, he demonstrates in fascinating and often funny ways the extent to which all the world is indeed a lab. With conclusions ranging from strange-but-true to thought-provoking to disturbing, he explores the power of this digital truth serum and its deeper potential—revealing biases deeply embedded within us, information we can use to change our culture, and the questions we’re afraid to ask that might be essential to our health—both emotional and physical. All of us are touched by big data every day, and its influence is multiplying. Everybody Lies challenges us to think differently about how we see it and the world.
This book develops a framework that shows how uncertainty in AI expands and generalizes traditional AI. It describes the cloud model, its uncertainties of randomness and fuzziness, and the correlation between them. The book also covers other physical methods for data mining, such as the data field and knowledge discovery state space. In addition, it presents an inverted pendulum example to discuss reasoning and control with uncertain knowledge, and provides a cognitive physics model to visualize human thinking with hierarchy.
With in-depth discussions on the fundamentals, methodologies, and uncertainties in AI, this book explains and simulates human thinking, leading to a better understanding of cognitive processes.
See What’s New in the Fourth Edition:
* Up-to-date information on GNSS and GPS modernization
* Changes in hardware, software, and procedures
* Comprehensive treatment of novel signals on new blocks of satellites (L5 and L2C)
The book minimizes your reliance on mathematical explanations and maximizes use of illustrations and examples that allow you to visualize and grasp key concepts. The author explains the progression of ideas at the foundation of satellite positioning and delves into some of the particulars. He keeps the presentation practical, providing a guide to techniques used in GPS, from their design through observation, processing, real-time kinematic (RTK), and real-time networks. These features and more make it easier for you to meet the challenge of keeping up in this field.
Pinpoint tells the fascinating story of a hidden system that touches nearly every aspect of modern life. Tracking the development of GPS from its origins as a bomb guidance system to its present ubiquity, Greg Milner examines the technology’s double-edged effect on the way we live, work, and travel. Savvy and original, this sweeping scientific history offers startling insight into how humans understand their place in the world.
Geospatial Technology for Earth Observation provides an in-depth and broad collection of recent progress in Earth observation. Contributed by leading experts in this field, the book covers satellite, airborne and ground remote sensing systems and system integration, sensor orientation, remote sensing physics, image classification and analysis, information extraction, geospatial service, and various application topics, including cadastral mapping, land use change evaluation, water environment monitoring, flood mapping, and decision making support.
Geospatial Technology for Earth Observation serves as a valuable training source for researchers, developers, and practitioners in geospatial science and technology industry. It is also suitable as a reference book for upper level college students and graduate students in geospatial technology, geosciences, resource management, and informatics.
The 70 papers presented in this volume were carefully reviewed and selected from 105 submissions. The selected papers covered a wide variety of important topics in the area of data mining, including parallel and distributed data mining algorithms, mining on data streams, graph mining, spatial data mining, multimedia data mining, Web mining, the Internet of Things, health informatics, and biomedical data mining.
Using Python code throughout, Xiao breaks the subject down into three fundamental areas:
* Geometric Algorithms
* Spatial Indexing
* Spatial Analysis and Modelling

With its comprehensive coverage of the many algorithms involved, GIS Algorithms is a key new textbook in this complex and critical area of geography.
The book shows you how satellite, inertial, and other navigation technologies work, and focuses on processing chains and error sources. In addition, you get a clear introduction to coordinate frames, multi-frame kinematics, Earth models, gravity, Kalman filtering, and nonlinear filtering. Providing solutions to common integration problems, the book describes and compares different integration architectures, and explains how to model different error sources. You get a broad and penetrating overview of current technology and are brought up to speed with the latest developments in the field, including context-dependent and cooperative positioning.
Let's face it, SQL is a deceptively simple language to learn, and many database developers never go far beyond the simple statement: SELECT columns FROM table WHERE conditions. But there is so much more you can do with the language. In the SQL Cookbook, experienced SQL developer Anthony Molinaro shares his favorite SQL techniques and features. You'll learn about:
Window functions, arguably the most significant enhancement to SQL in the past decade. If you're not using these, you're missing out
Powerful, database-specific features such as SQL Server's PIVOT and UNPIVOT operators, Oracle's MODEL clause, and PostgreSQL's very useful GENERATE_SERIES function
Pivoting rows into columns, reverse-pivoting columns into rows, using pivoting to facilitate inter-row calculations, and double-pivoting a result set
Bucketization, and why you should never use that term in Brooklyn.
How to create histograms, summarize data into buckets, perform aggregations over a moving range of values, generate running-totals and subtotals, and other advanced, data warehousing techniques
The technique of walking a string, which allows you to use SQL to parse through the characters, words, or delimited elements of a string
Written in O'Reilly's popular Problem/Solution/Discussion style, the SQL Cookbook is sure to please. Anthony's credo is: "When it comes down to it, we all go to work, we all have bills to pay, and we all want to go home at a reasonable time and enjoy what's still available of our days." The SQL Cookbook moves quickly from problem to solution, saving you time each step of the way.
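The window functions and running-total techniques highlighted above can be sketched in a few lines; the following is a minimal illustration (not code from the book) using Python's built-in sqlite3 module, whose SQLite engine has supported aggregate window functions since version 3.25. The table and column names are invented for the example.

```python
import sqlite3

# Minimal illustration of a window-function running total
# (requires SQLite >= 3.25, bundled with recent Python builds).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 10), (2, 20), (3, 30)])

rows = conn.execute("""
    SELECT day,
           amount,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
""").fetchall()

for day, amount, total in rows:
    print(day, amount, total)
# → 1 10 10
#   2 20 30
#   3 30 60
```

The same `OVER` clause, with `PARTITION BY` added, generates per-group subtotals without collapsing the rows the way `GROUP BY` would.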
Based on an MBA course Provost has taught at New York University over the past ten years, Data Science for Business provides examples of real-world business problems to illustrate these principles. You’ll not only learn how to improve communication between business stakeholders and data scientists, but also how to participate intelligently in your company’s data science projects. You’ll also discover how to think data-analytically, and fully appreciate how data science methods can support business decision-making.
* Understand how data science fits in your organization—and how you can use it for competitive advantage
* Treat data as a business asset that requires careful investment if you’re to gain real value
* Approach business problems data-analytically, using the data-mining process to gather good data in the most appropriate way
* Learn general concepts for actually extracting knowledge from data
* Apply data science principles when interviewing data science job candidates
This textbook emphasizes an important shift in conceptualization and directs it toward students with prior knowledge of optical remote sensing: the author dispels any linkage between microwave and optical remote sensing. Instead, he constructs the concept of microwave remote sensing by comparing it to the process of audio perception, explaining the workings of the ear as a metaphor for microwave instrumentation.
This volume takes an “application-driven” approach. Instead of describing the technology and then its uses, this textbook justifies the need for measurement then explains how microwave technology addresses this need.
Following a brief summary of the field and a history of the use of microwaves, the book explores the physical properties of microwaves and the polarimetric properties of electromagnetic waves. It examines the interaction of microwaves with matter, analyzes passive atmospheric and passive surface measurements, and describes the operation of altimeters and scatterometers. The textbook concludes by explaining how high resolution images are created using radars, and how techniques of interferometry can be applied to both passive and active sensors.
Using the open source R language, you can build powerful statistical models to answer many of your most challenging questions. R has traditionally been difficult for non-statisticians to learn, and most R books assume far too much knowledge to be of help. R for Everyone, Second Edition, is the solution.
Drawing on his unsurpassed experience teaching new users, professional data scientist Jared P. Lander has written the perfect tutorial for anyone new to statistical programming and modeling. Organized to make learning easy and intuitive, this guide focuses on the 20 percent of R functionality you’ll need to accomplish 80 percent of modern data tasks.
Lander’s self-contained chapters start with the absolute basics, offering extensive hands-on practice and sample code. You’ll download and install R; navigate and use the R environment; master basic program control, data import, manipulation, and visualization; and walk through several essential tests. Then, building on this foundation, you’ll construct several complete models, both linear and nonlinear, and use some data mining techniques. After all this you’ll make your code reproducible with LaTeX, RMarkdown, and Shiny.
By the time you’re done, you won’t just know how to write R programs, you’ll be ready to tackle the statistical problems you care about most.
Coverage includes:
* Explore R, RStudio, and R packages
* Use R for math: variable types, vectors, calling functions, and more
* Exploit data structures, including data.frames, matrices, and lists
* Read many different types of data
* Create attractive, intuitive statistical graphics
* Write user-defined functions
* Control program flow with if, ifelse, and complex checks
* Improve program efficiency with group manipulations
* Combine and reshape multiple datasets
* Manipulate strings using R’s facilities and regular expressions
* Create normal, binomial, and Poisson probability distributions
* Build linear, generalized linear, and nonlinear models
* Program basic statistics: mean, standard deviation, and t-tests
* Train machine learning models
* Assess the quality of models and variable selection
* Prevent overfitting and perform variable selection, using the Elastic Net and Bayesian methods
* Analyze univariate and multivariate time series data
* Group data via K-means and hierarchical clustering
* Prepare reports, slideshows, and web pages with knitr
* Display interactive data with RMarkdown and htmlwidgets
* Implement dashboards with Shiny
* Build reusable R packages with devtools and Rcpp
Register your product at informit.com/register for convenient access to downloads, updates, and corrections as they become available.
This book contains the latest developments in the implementation and application of Kalman filtering. Authors Grewal and Andrews draw upon their decades of experience to offer an in-depth examination of the subtleties, common pitfalls, and limitations of estimation theory as it applies to real-world situations. They present many illustrative examples including adaptations for nonlinear filtering, global navigation satellite systems, the error modeling of gyros and accelerometers, inertial navigation systems, and freeway traffic control.
Kalman Filtering: Theory and Practice Using MATLAB, Fourth Edition is an ideal textbook in advanced undergraduate and beginning graduate courses in stochastic processes and Kalman filtering. It is also appropriate for self-instruction or review by practicing engineers and scientists who want to learn more about this important topic.
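The predict-and-update cycle at the heart of the subject can be sketched in one dimension. The toy filter below is an illustration only, not the book's MATLAB code; the noise variances and initial values are assumptions chosen for the example. It estimates a constant true value from noisy measurements.

```python
# A minimal one-dimensional Kalman filter sketch (illustrative only).
def kalman_1d(measurements, q=1e-5, r=0.01):
    """q: process-noise variance, r: measurement-noise variance."""
    x, p = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += q               # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain: trust in the new measurement
        x += k * (z - x)     # update the estimate with the innovation
        p *= (1 - k)         # update (shrink) the estimate's variance
        estimates.append(x)
    return estimates

# Noisy readings of a true value of 1.0
est = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.02])
```

Each pass tightens the variance `p`, so later measurements move the estimate less than early ones, which is exactly the recursive averaging behavior the theory predicts.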
See What’s New in the Second Edition:
* Summaries at the end of each chapter
* Worked examples of techniques described
* Additional material on matrices and vectors
* Further material on map projections
* New material on spatial correlation
* A new section on global positioning systems
Written for those who need to make use of geographic information systems but have a limited mathematical background, this book introduces the basic statistical techniques commonly used in geographic information systems and explains best-fit solutions and the mathematics behind satellite positioning. By understanding the mathematics behind the gathering, processing, and display of information, you can better advise others on the integrity of results, the quality of the information, and the safety of using it.
Updated to reflect recent advances in MySQL and InnoDB performance, features, and tools, this third edition not only offers specific examples of how MySQL works, it also teaches you why this system works as it does, with illustrative stories and case studies that demonstrate MySQL’s principles in action. With this book, you’ll learn how to think in MySQL.
* Learn the effects of new features in MySQL 5.5, including stored procedures, partitioned databases, triggers, and views
* Implement improvements in replication, high availability, and clustering
* Achieve high performance when running MySQL in the cloud
* Optimize advanced querying features, such as full-text searches
* Take advantage of modern multi-core CPUs and solid-state disks
* Explore backup and recovery strategies—including new tools for hot online backups
Advanced Methods for Data Analysis and Visualization
Featuring more than 500 color illustrations, this unique and visually powerful book outlines the required elements of a conceptual site model and provides numerous examples of supporting charts, cross-sections, maps, and 3D graphics. The authors describe advanced analytical methods such as geospatial processing, kriging, and groundwater modeling through practical real-life examples. They also present numerous case studies in groundwater supply and remediation to help explain key engineering design concepts.
Data-Driven Assessments of Groundwater Management Policy
The authors tackle controversial topics, ranging from technical impracticability of groundwater remediation to sustainable management of groundwater resources. They encourage discussion and independent thought about how current environmental policies and practices can evolve to achieve better outcomes at less cost to society.
Practical Strategies for Communicating Your Findings to the General Public
While the book is technical in nature, equations and advanced theory are kept to a minimum. The text focuses on practical strategies to help you create easy-to-understand data tables, graphs, maps, and illustrations for technical and nontechnical audiences alike. A companion DVD includes animations, reference material, modeling software, and more.
The book is divided into four sections. The first deals with various sensors, systems, or sensing operations using different regions of wavelengths. Drawing on the data and lessons learned from the U.S. Landsat remote sensing programs, it reviews key concepts, methods, and practical uses of particular sensors/sensing systems. Section II presents new developments in algorithms and techniques, specifically in image preprocessing, thematic information extraction, and digital change detection. It gives correction algorithms for hyperspectral, thermal, and multispectral sensors, discusses the combined method for performing topographic and atmospheric corrections, and provides examples of correcting non-standard atmospheric conditions, including haze, cirrus, and cloud shadow.
Section III focuses on remote sensing of vegetation and related features of the Earth’s surface. It reviews advancements in the remote sensing of ecosystem structure, process, and function, and notes important trade-offs and compromises in characterizing ecosystems from space related to spatial, spectral, and temporal resolutions of the imaging sensors. It discusses the mismatch between leaf-level and species-level ecological variables and satellite spatial resolutions and the resulting difficulties in validating satellite-derived products.
Finally, Section IV examines developments in the remote sensing of air, water, and other terrestrial features, reviews MODIS algorithms for aerosol retrieval at both global and local scales, and demonstrates the retrieval of aerosol optical thickness (AOT). This section rounds out coverage with a look at remote sensing approaches to measure the urban environment and examines the most important concepts and recent research.
* Contains recent UV applications not previously available in book form such as ozone, auroral images, and ionospheric sensing
* Features broad coverage of fundamentals of atmospheric geophysics with values for fluxes, cross-sections, and radiances
* Covers techniques that illustrate principles of measurements with typical values
* Contains numerous references to original literature
New and Updated in the Second Edition:
* Web-based image viewing with Google Earth
* Aerial platforms
* Existing digital photogrammetric software systems, including Intergraph image station, Autodesk, and Oracle Spatial
* Land management and cadaster
* Imaging sensors such as laser scanning, image spectrometry, radar imaging, and radar interferometry
With the advent of high-resolution satellite systems in stereo, the theory of analytical photogrammetry restituting 2D image information into 3D is of increasing importance, merging the remote sensing approach with that of photogrammetry. This text describes the fundamentals of these approaches in detail, with an emphasis on global, regional, and local applications. It provides a short introduction to the GPS satellite positioning system in the context of data integration.
An extensive overview of the basic elements of GIS technologies and data management approaches, as well as the widely employed positioning systems such as GPS and GSM networks, complete the presentation of the technological framework for geoinformation. Appropriate for GIS courses at all levels, the book proceeds beyond the science and technology to tackle cost considerations and practical implementation issues, giving you a starting point for multidisciplinary new activities and services in the future.
Hyperspectral Remote Sensing of Vegetation integrates this knowledge, guiding readers to harness the capabilities of the most recent advances in applying hyperspectral remote sensing technology to the study of terrestrial vegetation. Taking a practical approach to a complex subject, the book demonstrates the experience, utility, methods and models used in studying vegetation using hyperspectral data. Written by leading experts, including pioneers in the field, each chapter presents specific applications, reviews existing state-of-the-art knowledge, highlights the advances made, and provides guidance for the appropriate use of hyperspectral data in the study of vegetation as well as its numerous applications, such as crop yield modeling, crop and vegetation biophysical and biochemical property characterization, and crop moisture assessment.
This comprehensive book brings together the best global expertise on hyperspectral remote sensing of agriculture, crop water use, plant species detection, vegetation classification, biophysical and biochemical modeling, crop productivity and water productivity mapping, and modeling. It provides the pertinent facts, synthesizing findings so that readers can get the correct picture on issues such as the best wavebands for their practical applications, methods of analysis using whole spectra, hyperspectral vegetation indices targeted to study specific biophysical and biochemical quantities, and methods for detecting parameters such as crop moisture variability, chlorophyll content, and stress levels. A collective "knowledge bank," it guides professionals to adopt the best practices for their own work.
Secondly, a MOO-based two-level spatial planning of land use is proposed. The spatial planning aims at managing and coordinating the land use at different geographic extents and involves spatial layouts and structures of land use at different levels. In spatial planning, GIS and Remote Sensing are used to evaluate, analyze, and measure environmental, economic and social issues. The quantitative relationships between these objectives and spatial land use allocation are then used as rules in the MOO process to simulate environmental conditions under different spatial land use allocation scenarios. The book features a case study of Shenzhen city, the most important Special Economic Zone in China.
This book will be of interest to academics and professionals in the fields of urban planning, land resource management, remote sensing and geographic information systems.
The book begins with an introduction to the basic processes that ensure the acquisition of space-borne imagery and provides an overview of the main satellite observation systems. It then describes visual and digital image analysis, highlights various interpretation techniques, and outlines their applications to science and management. The latter part of the book covers the integration of remote sensing with GIS for environmental analysis.
Based on the first English version published in 2010, this latest edition has been written to reflect a global audience, and factors in international debates and legal issues surrounding EO, as well as future developments and trends.
New in the Second Edition:
* Includes additional illustrations, now in full color
* Uses sample images acquired from different ecosystems at different spatial resolutions to illustrate different interpretation techniques
* Updates information on recent satellite missions (Landsat-8, Sentinel-2, hyperspectral and hyperspatial programs)
* Covers near-ground missions (including UAV) and ground sensors (spectro-radiometers, cameras, LIDAR, etc.) to support EO analysis
* Offers analysis of image spatial properties
* Presents material on visual analysis, time series analysis, and data fusion
* Provides examples of EO data that cover different environmental problems, with particular relevance to global observation
Fundamentals of Satellite Remote Sensing: An Environmental Approach, Second Edition details the tools that provide global, recurrent, and comprehensive views of the processes affecting the Earth and is a must-have for researchers, academics, students, and professionals involved in the field of environmental science.
If you have an aptitude for mathematics and some programming skills, author Joel Grus will help you get comfortable with the math and statistics at the core of data science, and with the hacking skills you need to get started as a data scientist. Today’s messy glut of data holds answers to questions no one’s even thought to ask. This book provides you with the know-how to dig those answers out.
* Get a crash course in Python
* Learn the basics of linear algebra, statistics, and probability—and understand how and when they're used in data science
* Collect, explore, clean, munge, and manipulate data
* Dive into the fundamentals of machine learning
* Implement models such as k-nearest neighbors, Naive Bayes, linear and logistic regression, decision trees, neural networks, and clustering
* Explore recommender systems, natural language processing, network analysis, MapReduce, and databases
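As a taste of the from-scratch style described above, here is a minimal k-nearest-neighbors classifier in pure Python. It is an illustration, not the book's own code, and the labeled points are made up for the example.

```python
from collections import Counter
import math

# A minimal k-nearest-neighbors classifier from scratch (illustrative).
def knn_predict(k, labeled_points, new_point):
    """labeled_points: list of ((x, y), label) pairs."""
    # Sort training points by Euclidean distance to the query point.
    by_distance = sorted(labeled_points,
                         key=lambda pl: math.dist(pl[0], new_point))
    # Take the labels of the k closest points and majority-vote.
    k_labels = [label for _, label in by_distance[:k]]
    return Counter(k_labels).most_common(1)[0][0]

points = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]
print(knn_predict(3, points, (1, 0)))  # → a
```

The choice of `k` trades off noise sensitivity (small `k`) against blurring class boundaries (large `k`), which is the kind of bias/variance trade-off the book develops.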
Companies such as Google, Microsoft, and Facebook are actively growing in-house deep-learning teams. For the rest of us, however, deep learning is still a pretty complex and difficult subject to grasp. If you’re familiar with Python, and have a background in calculus, along with a basic understanding of machine learning, this book will get you started.
* Examine the foundations of machine learning and neural networks
* Learn how to train feed-forward neural networks
* Use TensorFlow to implement your first neural network
* Manage problems that arise as you begin to make networks deeper
* Build neural networks that analyze complex images
* Perform effective dimensionality reduction using autoencoders
* Dive deep into sequence analysis to examine language
* Learn the fundamentals of reinforcement learning
Time-of-Flight (TOF) range imaging (RIM) is an emerging sensor technology able to deliver depth and intensity maps of the scene under observation at the same time. Featuring different sensor resolutions, RIM cameras serve a wide community with many applications, such as monitoring, architecture, life sciences, and robotics. This book brings together experts from the sensor and metrology sides to collect the state-of-the-art research in these fields involving RIM cameras. All aspects of the acquisition and processing chain are addressed, from recent updates concerning the photo-detectors to an analysis of calibration techniques, while also offering a perspective on new application domains.
As mobile computing devices become more and more prevalent and powerful, they are becoming more and more useful in the field of law enforcement investigations and forensics. Of all the widely used mobile applications, none have more potential for helping solve crimes than those with geo-location tools.
Written for investigators and forensic practitioners, Google Earth Forensics comes from an investigator and trainer with more than 13 years of experience in law enforcement, who shows you how to use this valuable tool anywhere: at the crime scene, in the lab, or in the courtroom.
* Learn how to extract location-based evidence using the Google Earth program or app on computers and mobile devices
* Covers the basics of GPS systems and the usage of Google Earth, and helps sort through data imported from external evidence sources
* Includes tips on presenting evidence in compelling, easy-to-understand formats
This book is ideal for data scientists who are familiar with C++ or Python and perform machine learning activities on a day-to-day basis. Intermediate and advanced machine learning implementers who need a quick guide they can easily navigate will find it useful.

What You Will Learn:
* Become familiar with the basics of the TensorFlow machine learning library
* Get to know Linear Regression techniques with TensorFlow
* Learn SVMs with hands-on recipes
* Implement neural networks and improve predictions
* Apply NLP and sentiment analysis to your data
* Master CNN and RNN through practical recipes
* Take TensorFlow into production

In Detail
TensorFlow is an open source software library for Machine Intelligence. The independent recipes in this book will teach you how to use TensorFlow for complex data computations and will let you dig deeper and gain more insights into your data than ever before. You'll work through recipes on training models, model evaluation, sentiment analysis, regression analysis, clustering analysis, artificial neural networks, and deep learning – each using Google's machine learning library TensorFlow.
This guide starts with the fundamentals of the TensorFlow library which includes variables, matrices, and various data sources. Moving ahead, you will get hands-on experience with Linear Regression techniques with TensorFlow. The next chapters cover important high-level concepts such as neural networks, CNN, RNN, and NLP.
Once you are familiar and comfortable with the TensorFlow ecosystem, the last chapter will show you how to take it to production.

Style and approach
This book takes a recipe-based approach where every topic is explicated with the help of a real-world example.
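The linear-regression recipes mentioned above amount to fitting parameters by gradient descent on a loss. The sketch below shows that idea in plain Python rather than TensorFlow, so it runs without dependencies; the synthetic data, learning rate, and iteration count are invented for illustration and are not from the book.

```python
import random

# Fit y = w*x + b by gradient descent on mean squared error
# (a dependency-free stand-in for a TensorFlow regression recipe).
random.seed(0)
xs = [random.uniform(0, 10) for _ in range(200)]
ys = [3.0 * x + 2.0 + random.gauss(0, 0.5) for x in xs]  # true w=3, b=2

w, b, lr, n = 0.0, 0.0, 0.01, len(xs)
for _ in range(2000):
    # Gradients of MSE with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b
# w and b should end up close to 3.0 and 2.0
```

In TensorFlow the loop body is replaced by automatic differentiation and an optimizer step, but the underlying computation is the same.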
The book covers all of the components needed to develop an LBIS. It discusses cellular phone programming using the Java ME platform, positioning technologies, databases and spatial databases, communications, client- and server-side data processing, and real-time data visualization via Google Maps and Google Earth. Using freely available software, the authors include many code examples and detailed instructions for building your own system and setting up your entire development environment.
A companion website at www.csee.usf.edu/~labrador/LBIS provides additional information and supporting material. It contains all of the software packages and applications used in the text as well as PowerPoint slides and laboratory examples.
Although LBIS applications are still in the beginning stages, they have the potential to transform our daily lives, from warning us about possible health problems to monitoring pollution levels around us. Exploring this novel technology, Location-Based Information Systems describes the technical components needed to create location-based services with an emphasis on nonproprietary, freely available solutions that work across different technologies and platforms.
See What’s New in the Second Edition:
* All project instructions are in ArcGIS 10.2 using geodatabase datasets
* New chapters on regionalization methods and Monte Carlo simulation
* Popular tasks automated as a convenient toolkit: Huff Model, 2SFCA accessibility measure, regionalization, Garin-Lowry model, and Monte Carlo based spatial simulation
* Advanced tasks now implemented in user-friendly programs or ArcGIS: centrality indices, wasteful commuting measure, p-median problem, and traffic simulation
Each chapter has one subject theme and introduces the method (or a group of related methods) most relevant to the theme. While each method is illustrated in a special case of application, it can also be used to analyze different issues. For example, spatial regression is used to examine the relationship between job access and homicide patterns; systems of linear equations are analyzed to predict urban land use patterns; linear programming is introduced to solve the problem of wasteful commuting and allocate healthcare facilities; and Monte Carlo technique is illustrated in simulating urban traffic.
The book illustrates the range of computational methods and covers common tasks and major issues encountered in a spatial environment. It provides a platform for learning technical skills and quantitative methods in the context of addressing real-world problems, giving you instant access to the tools to resolve major socio-economic issues.
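The Monte Carlo technique mentioned above rests on one idea: estimate a quantity by drawing many random samples and counting outcomes. The book applies it to urban traffic; the classic stand-in below, estimating pi from random points in the unit square, is an illustration of the sampling idea only.

```python
import random

# Monte Carlo estimate of pi: the fraction of random points in the
# unit square that land inside the quarter circle approaches pi/4.
random.seed(42)
n = 100_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)
pi_est = 4 * hits / n
```

The estimate's error shrinks like 1/sqrt(n), which is why spatial simulations of this kind trade long run times for tighter confidence intervals.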
NoSQL Distilled is a concise but thorough introduction to this rapidly emerging technology. Pramod J. Sadalage and Martin Fowler explain how NoSQL databases work and the ways that they may be a superior alternative to a traditional RDBMS. The authors provide a fast-paced guide to the concepts you need to know in order to evaluate whether NoSQL databases are right for your needs and, if so, which technologies you should explore further.
The first part of the book concentrates on core concepts, including schemaless data models, aggregates, new distribution models, the CAP theorem, and map-reduce. In the second part, the authors explore architectural and design issues associated with implementing NoSQL. They also present realistic use cases that demonstrate NoSQL databases at work and feature representative examples using Riak, MongoDB, Cassandra, and Neo4j.
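The map-reduce model listed among those core concepts can be illustrated with a minimal in-memory sketch. This is a generic illustration of the programming model, not any particular database's API: a map phase turns each record into key–value pairs, and a reduce phase combines each key's values independently, which is what lets the work be distributed across nodes.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal single-process map-reduce (illustrative sketch)."""
    # Map phase: each record emits zero or more (key, value) pairs.
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    # Reduce phase: each key is combined independently of the others,
    # so in a real store this step can run in parallel across nodes.
    return {key: reducer(key, values) for key, values in groups.items()}

# Example: total order value per customer (hypothetical data).
orders = [
    {"customer": "ann", "total": 30},
    {"customer": "bob", "total": 10},
    {"customer": "ann", "total": 5},
]
totals = map_reduce(
    orders,
    mapper=lambda o: [(o["customer"], o["total"])],
    reducer=lambda k, values: sum(values),
)
print(totals)  # {'ann': 35, 'bob': 10}
```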
In addition, by drawing on Pramod Sadalage’s pioneering work, NoSQL Distilled shows how to implement evolutionary design with schema migration: an essential technique for applying NoSQL databases. The book concludes by describing how NoSQL is ushering in a new age of Polyglot Persistence, where multiple data-storage worlds coexist, and architects can choose the technology best optimized for each type of data access.
Across broad areas of the environmental and social sciences, simulation models are an important way to study systems inaccessible to scientific experimental and observational methods, and also an essential complement to those more conventional approaches. The contemporary research literature is teeming with abstract simulation models whose presentation is mathematically demanding and requires a high level of knowledge of quantitative and computational methods and approaches. Furthermore, simulation models designed to represent specific systems and phenomena are often complicated, and, as a result, difficult to reconstruct from their descriptions in the literature. This book aims to provide a practical and accessible account of dynamic spatial modelling, while also equipping readers with a sound conceptual foundation in the subject, and a useful introduction to the wide-ranging literature.
Spatial Simulation: Exploring Pattern and Process is organised around the idea that a small number of spatial processes underlie the wide variety of dynamic spatial models. Its central focus on three ‘building blocks’ of dynamic spatial models – forces of attraction and segregation, individual mobile entities, and processes of spread – guides the reader to an understanding of the basis of many of the complicated models found in the research literature. The three building-block models are presented in their simplest form and are progressively elaborated and related to the real-world processes that can be represented using them. Introductory chapters cover essential background topics, particularly the relationships between pattern, process and spatiotemporal scale. Additional chapters consider how time and space can be represented in more complicated models, along with methods for the analysis and evaluation of models. Finally, the three building-block models are woven together in a more elaborate example to show how a complicated model can be assembled from relatively simple components.
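The first of these building blocks, forces of attraction and segregation, is classically captured by Schelling-style models, in which agents relocate when too few of their neighbours are like them. The sketch below is a minimal, hypothetical rendering of one update sweep in Python; it is not code from the book or its NetLogo models.

```python
import random

def segregation_step(grid, threshold=0.5, rng=random.Random(0)):
    """One sweep of a minimal Schelling-style segregation model.

    `grid` is a square list-of-lists holding agent labels (e.g. 'A', 'B')
    or None for empty cells. Any agent whose share of like neighbours
    (on a wrap-around grid) falls below `threshold` moves to a random
    empty cell. Illustrative sketch only.
    """
    n = len(grid)

    def unhappy(x, y):
        me = grid[x][y]
        nbrs = [grid[(x + dx) % n][(y + dy) % n]
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]
        occupied = [v for v in nbrs if v is not None]
        if not occupied:
            return False
        return sum(v == me for v in occupied) / len(occupied) < threshold

    empties = [(x, y) for x in range(n) for y in range(n)
               if grid[x][y] is None]
    for x in range(n):
        for y in range(n):
            if grid[x][y] is not None and unhappy(x, y) and empties:
                ex, ey = empties.pop(rng.randrange(len(empties)))
                grid[ex][ey], grid[x][y] = grid[x][y], None
                empties.append((x, y))
    return grid
```

Iterating this step from a random start typically produces clustered, segregated patterns even at modest thresholds, which is the kind of simple-process-to-striking-pattern result the book builds on.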
To aid understanding, more than 50 specific models described in the book are available online at patternandprocess.org for exploration in the freely available NetLogo platform. This book encourages readers to develop intuition for the abstract types of model that are likely to be appropriate for application in any specific context. Spatial Simulation: Exploring Pattern and Process will be of interest to undergraduate and graduate students taking courses in environmental, social, ecological and geographical disciplines. Researchers and professionals who require a non-specialist introduction will also find this book an invaluable guide to dynamic spatial simulation.