The book explains how to generate an adequate description of uncertainty, how to justify
semiheuristic algorithms for processing uncertainty, and how to make these algorithms
more computationally efficient. It explains in what sense the existing approach to
uncertainty as a combination of random and systematic components is only an
approximation, presents a more adequate three-component model with an additional
periodic error component, and explains how uncertainty propagation techniques can
be extended to this model. The book provides a justification for a practically efficient
heuristic technique (based on fuzzy decision-making). It explains how the computational
complexity of uncertainty processing can be reduced. The book also shows how to
take into account that, in real life, the information about uncertainty is often only partially known, and, using several practical examples, explains how to extract the missing information about uncertainty from the available data.
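To give a concrete feel for what propagating such a three-component error model can involve, here is a minimal Monte Carlo sketch in Python; the data-processing function, error magnitudes, and distributions below are illustrative assumptions, not formulas taken from the book. The total measurement error is modeled as the sum of a random term, a systematic offset shared by all inputs, and a periodic term A·sin(ωt + φ) with unknown phase.

# Illustrative sketch (assumptions, not the book's formulas): propagate a
# three-component measurement error -- random + systematic + periodic --
# through a data-processing function by Monte Carlo simulation.
import numpy as np

def process(x):
    # hypothetical data-processing algorithm
    return x[0] ** 2 + 0.5 * x[1]

rng = np.random.default_rng(0)
t = np.array([0.0, 1.0])        # times at which the two inputs were measured
x_meas = np.array([1.2, 3.4])   # measured input values

n_sim = 10_000
results = np.empty(n_sim)
for i in range(n_sim):
    random_err = rng.normal(0.0, 0.05, size=2)            # independent random component
    systematic_err = rng.uniform(-0.1, 0.1)               # one offset shared by all inputs
    phase = rng.uniform(0.0, 2 * np.pi)                   # unknown phase of the periodic term
    periodic_err = 0.02 * np.sin(2 * np.pi * t + phase)   # periodic component
    x_true = x_meas - (random_err + systematic_err + periodic_err)
    results[i] = process(x_true)

print("nominal result:", process(x_meas))
print("spread due to all three error components:", results.std())

The spread of the simulated results reflects all three sampled error components; in the traditional two-component picture, the periodic term would simply be lumped into the random and systematic parts, which is why that picture is only an approximation.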
The book presents extended versions of selected papers from the annual International Workshops on Constraint Programming and Decision Making (CoProd’XX) from 2013 to 2016. These workshops, held in the US (El Paso, Texas) and in Europe (Würzburg, Germany, and Uppsala, Sweden), have attracted researchers and practitioners from all over the world.
It will be of interest to practitioners who can benefit from the new techniques, to researchers who want to extend the ideas from these papers to new application areas and/or further improve the corresponding algorithms, and to graduate students who want to learn more – in short, to anyone who wants to make more effective decisions under constraints.
The need for such methods stems from the fact that, when we have to decide where to place sensors or which algorithm to use for processing the data, we mostly rely on experts' opinions. As a result, the selected knowledge-related methods are often far from ideal. To make better selections, it is necessary to first create easy-to-use models of knowledge-related processes. This is especially important for big data, where traditional numerical methods are unsuitable.
The book offers a valuable guide for everyone interested in big data applications: students looking for an overview of related analytical techniques, practitioners interested in applying optimization techniques, and researchers seeking to improve and expand on these techniques.
This book shows how symmetries can be used in all classes of algorithmic problems of science and engineering: from analysis to prediction to control. Applications cover chemistry, geosciences, intelligent control, neural networks, quantum physics, and thermal physics. Specifically, it is shown how the approach based on symmetry and similarity can be used in the analysis of real-life systems, in algorithms for prediction, and in algorithms for control.
Our explanation is that since human abilities to process information are limited, we operate not with the exact values of relevant quantities, but with “granules” that contain these values. We show that optimization under such granularity indeed leads to observed human behavior. In particular, for the first time, we explain the mysterious empirical dependence of betting odds on actual probabilities.
This book can be recommended to all students interested in human decision-making, to researchers whose work involves human decisions, and to practitioners who design and employ systems involving human decision-making, so that they can better utilize our ability to make decisions under uncertainty.
The following articles have been published:
Neutrosophic Systems and Neutrosophic Dynamic Systems;
Tri-complex Rough Neutrosophic Similarity Measure and its Application in Multi-attribute Decision Making;
Generalized Neutrosophic Soft Multi-attribute Group Decision Making Based on TOPSIS;
When Should We Switch from Interval-Valued Fuzzy to Full Type-2 Fuzzy
Neutrosophic Index Numbers: Neutrosophic Logic Applied In The Statistical
Structural Properties of Neutrosophic Abel-Grassmann's Groupoids;
Neutrosophic Actions, Prevalence Order, Refinement of Neutrosophic Entities, and Neutrosophic Literal Logical Operators.
The following papers have been published:
From Unbiased Numerical Estimates to Unbiased Interval Estimates;
New Distance and Similarity Measures of Interval Neutrosophic Sets;
Lower and Upper Soft Interval Valued Neutrosophic Rough Approximations of An
Cosine Similarity Measure of Interval Valued Neutrosophic Sets;
Reliability and Importance Discounting of Neutrosophic Masses;
The Handbook of Granular Computing offers a comprehensive reference source for the granular computing community, edited by and with contributions from leading experts in the field. It includes chapters covering the foundations of granular computing, interval analysis and fuzzy set theory; hybrid methods and models of granular computing; and applications and case studies. The handbook is divided into five sections: Preliminaries, Fundamentals, Methodology and Algorithms, Development of Hybrid Models, and Applications and Case Studies. It presents the flow of ideas in a systematic, well-organized manner, starting with the concepts and motivation and proceeding to detailed design that materializes in specific algorithms, applications, and case studies, and it provides the reader with a self-contained reference that includes all prerequisite knowledge, augmented with step-by-step explanations of more advanced concepts.
The Handbook of Granular Computing represents a significant and valuable contribution to the literature and will appeal to a broad audience, including researchers, students, and practitioners in the fields of computational intelligence, pattern recognition, fuzzy sets and neural networks, system modelling, operations research, and bioinformatics.
This versatile volume helps practitioners learn how to apply new techniques of the econometrics of risk, and helps researchers further improve existing models and come up with new ideas on how best to take economic risks into account.