Seasonal Adjustment Methods and Real Time Trend-Cycle Estimation


This book explores widely used seasonal adjustment methods and recent developments in real time trend-cycle estimation. It discusses in detail the properties and limitations of X12ARIMA, TRAMO-SEATS and STAMP - the main seasonal adjustment methods used by statistical agencies. Each method is illustrated with several real-world cases, and real data examples are followed throughout the text. Trend-cycle estimation is presented using nonparametric techniques based on moving averages, linear filters and reproducing kernel Hilbert spaces, taking recent advances into account. The book provides a systematic treatment of results that to date have been scattered throughout the literature. Seasonal adjustment and real time trend-cycle prediction play an essential part at all levels of activity in modern economies. They are used by governments to counteract cyclical recessions, by central banks to control inflation, by decision makers for better modeling and planning, and by hospitals, manufacturers, builders, transportation companies, and consumers in general to decide on appropriate action.

This book appeals to practitioners in government institutions, finance and business, macroeconomists, and other professionals who use economic data as well as academic researchers in time series analysis, seasonal adjustment methods, filtering and signal extraction. It is also useful for graduate and final-year undergraduate courses in econometrics and time series with a good understanding of linear regression and matrix algebra, as well as ARIMA modelling.


About the author

Estela Bee Dagum is currently a Research Professor in the Department of Statistical Sciences of the University of Bologna, Italy, where she was a Full Professor for 10 years until 2007 (appointed per chiara fama, an Italian system for appointing internationally recognized scientists of the very highest caliber). From 2007 until December 2009 she was appointed as Alumna of the Business Survey and Methodology Division at Statistics Canada, serving as a consultant on time series issues, particularly on linkage, benchmarking, trend estimation and seasonal adjustment. Previously, Estela Bee Dagum was Director of the Time Series Research and Analysis Centre of Statistics Canada, where she worked for 21 years (1972-1993). In 1980, she developed the X11ARIMA seasonal adjustment method, later modified to X12ARIMA, which is currently used by most of the world's statistical agencies. In 1994, she jointly developed a benchmarking regression method that is currently used by Statistics Canada and other agencies for benchmarking, interpolation, linkage and reconciliation of time series systems. Estela Bee Dagum has served as a consultant to a large number of government and private entities, published 19 books on topics related to time series analysis, and written more than 150 papers in leading scientific and statistical journals.

Silvia Bianconcini is an Associate Professor at the Department of Statistical Sciences, University of Bologna, where she received her PhD in Statistical Methodology for Scientific Research. Her main research interests are time series analysis with an emphasis on signal extraction, longitudinal data analysis based on latent variable models, and statistical inference for generalized linear models.


Additional Information

Published on
Jun 20, 2016
Business & Economics / Econometrics
Business & Economics / Economics / Macroeconomics
Business & Economics / General
Business & Economics / Statistics
Mathematics / Probability & Statistics / General
Social Science / Research
Social Science / Statistics

Sample data alone never suffice to draw conclusions about populations. Inference always requires assumptions about the population and sampling process. Statistical theory has revealed much about how the strength of assumptions affects the precision of point estimates, but has had much less to say about how it affects the identification of population parameters. Indeed, it has been commonplace to think of identification as a binary event - a parameter is either identified or not - and to view point identification as a precondition for inference. Yet there is enormous scope for fruitful inference using data and assumptions that partially identify population parameters. This book explains why and shows how.

The book presents in a rigorous and thorough manner the main elements of Charles Manski's research on partial identification of probability distributions. One focus is prediction with missing outcome or covariate data. Another is decomposition of finite mixtures, with application to the analysis of contaminated sampling and ecological inference. A third major focus is the analysis of treatment response. Whatever the particular subject under study, the presentation follows a common path. The author first specifies the sampling process generating the available data and asks what may be learned about population parameters using the empirical evidence alone. He then asks how the (typically) set-valued identification regions for these parameters shrink if various assumptions are imposed. The approach to inference that runs throughout the book is deliberately conservative and thoroughly nonparametric. Conservative nonparametric analysis enables researchers to learn from the available data without imposing untenable assumptions. It enables the establishment of a domain of consensus among researchers who may hold disparate beliefs about what assumptions are appropriate.

Charles F. Manski is Board of Trustees Professor at Northwestern University. He is author of Identification Problems in the Social Sciences and Analog Estimation Methods in Econometrics. He is a Fellow of the American Academy of Arts and Sciences, the American Association for the Advancement of Science, and the Econometric Society.
In modern economies, time series play a crucial role at all levels of activity. They are used by decision makers to plan for a better future, by governments to promote prosperity, by central banks to control inflation, by unions to bargain for higher wages, by hospitals, school boards, manufacturers, builders, and transportation companies, and by consumers in general.

A common misconception is that time series data originate from the direct and straightforward compilations of survey data, censuses, and administrative records. On the contrary, before publication time series are subject to statistical adjustments intended to facilitate analysis, increase efficiency, reduce bias, replace missing values, correct errors, and satisfy cross-sectional additivity constraints. Some of the most common adjustments are benchmarking, interpolation, temporal distribution, calendarization, and reconciliation.

This book discusses the statistical methods most often applied for such adjustments, ranging from ad hoc procedures to regression-based models. The latter are emphasized because of their clarity, ease of application, and superior results. Each topic is illustrated with many real case examples. To facilitate understanding of the properties and limitations of the methods discussed, a real data example, the Canada Total Retail Trade Series, is followed throughout the book.

This book brings together the scattered literature on these topics and presents it using consistent notation and a unifying view. The book will promote better procedures by large producers of time series, e.g. statistical agencies and central banks. Furthermore, knowing what adjustments are made to the data, which techniques are used, and how they affect the trend, business cycle, and seasonality of a series will enable users to perform better modeling, prediction, analysis, and planning.

This book will prove useful to graduate students and final year undergraduate students of time series and econometrics, as well as researchers and practitioners in government institutions and business.

From the reviews:

"It is an excellent reference book for people working in this area." - B. Abraham, Short Book Reviews of the ISI, December 2006

A New York Times bestseller

"Brilliant, funny…the best math teacher you never had." —San Francisco Chronicle

Once considered tedious, the field of statistics is rapidly evolving into a discipline Hal Varian, chief economist at Google, has actually called "sexy." From batting averages and political polls to game shows and medical research, the real-world application of statistics continues to grow by leaps and bounds. How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more.

For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions.

And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal—and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.

The Black Swan is a standalone book in Nassim Nicholas Taleb’s landmark Incerto series, an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision-making in a world we don’t understand. The other books in the series are Fooled by Randomness, Antifragile, Skin in the Game, and The Bed of Procrustes.

A black swan is a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. The astonishing success of Google was a black swan; so was 9/11. For Nassim Nicholas Taleb, black swans underlie almost everything about our world, from the rise of religions to events in our own personal lives.
Why do we not acknowledge the phenomenon of black swans until after they occur? Part of the answer, according to Taleb, is that humans are hardwired to learn specifics when they should be focused on generalities. We concentrate on things we already know and time and time again fail to take into consideration what we don’t know. We are, therefore, unable to truly estimate opportunities, too vulnerable to the impulse to simplify, narrate, and categorize, and not open enough to rewarding those who can imagine the “impossible.”
For years, Taleb has studied how we fool ourselves into thinking we know more than we actually do. We restrict our thinking to the irrelevant and inconsequential, while large events continue to surprise us and shape our world. In this revelatory book, Taleb explains everything we know about what we don’t know, and this second edition features a new philosophical and empirical essay, “On Robustness and Fragility,” which offers tools to navigate and exploit a Black Swan world.
Elegant, startling, and universal in its applications, The Black Swan will change the way you look at the world. Taleb is a vastly entertaining writer, with wit, irreverence, and unusual stories to tell. He has a polymathic command of subjects ranging from cognitive science to business to probability theory. The Black Swan is a landmark book—itself a black swan.
Praise for Nassim Nicholas Taleb
“The most prophetic voice of all.”—GQ
Praise for The Black Swan
“[A book] that altered modern thinking.”—The Times (London)
“A masterpiece.”—Chris Anderson, editor in chief of Wired, author of The Long Tail
“Idiosyncratically brilliant.”—Niall Ferguson, Los Angeles Times
“The Black Swan changed my view of how the world works.”—Daniel Kahneman, Nobel laureate
“[Taleb writes] in a style that owes as much to Stephen Colbert as it does to Michel de Montaigne. . . . We eagerly romp with him through the follies of confirmation bias [and] narrative fallacy.”—The Wall Street Journal
“Hugely enjoyable—compelling . . . easy to dip into.”—Financial Times
“Engaging . . . The Black Swan has appealing cheek and admirable ambition.”—The New York Times Book Review