Reverse Engineering the Universe: via Post-Structuralism

UberMann

Ground Zero Earth

It appears that the standard solutions to our basic problems have deteriorated into yelling and screaming. I think there is a better way!

I have always felt that there must be some important conflict, some large out-of-equilibrium process, that drives philosophical development. In the past that important conflict was often a traumatic major war like the Civil War or World War I. The driving conflict could also be glaring inconsistencies in social-cultural assumptions. This driving conflict pushes the social-cultural realm far enough out of equilibrium, over a long enough period, to cause major change in thought. For the new age I could not, at first, find such a traumatic conflict directly visible, such as a war. What I have finally concluded is that the conflict of the new age is not any particular war but the fear of the war to end all wars. The new age has been driven by the out-of-equilibrium fear of the nuclear apocalypse. Thus the need for a new meta-physic.
 
So the new agers needed to go elsewhere to seek out sources of knowledge. The standard sources seemed to know only the old physical world (at least the learned sources appeared to want to know only the old physical world), which is definitely passing away in a horrible nuclear way. The physical apocalypse that the new agers saw was far worse than anything in the Apocalypse of John. Remember, although we all knew about this problem, we were not able to talk about it generally. It was too horrible. So it remains the subconscious assumption behind our thoughts and our own philosophical development. It is obvious that this driving force would engender more interest in another reality, a reality that could not be affected by the final nuclear event. The social-cultural realm was far from equilibrium, in a hidden way, and thus we were driven to seek new approaches to resolving our inner conflict. The new age had to happen, and it had to question the meta-physics.

Dr. Jerome Heath, Ph.D.


About the author

Dr. Jerome Heath received his Ph.D. from the University of Hawaii in Communication and Information Science, an interdisciplinary degree. He has worked as an application developer and taught programming and computer systems for many years. He has written many textbooks. His studies have been in hermeneutics, based on the archeology of communicative systems (the conservative view) and communicative action (the liberal view). This approach offers a more balanced and thorough view of the underlying systems.

(Also see books by Dr. Jerome Heath: sites.google.com/site/jbhcontextcalculus/jbhcontextcalculus)



Additional Information

Publisher: UberMann
Published on: Jan 1, 2016
Pages: 138
Language: English
Genres: Philosophy / Metaphysics
Content Protection: This content is DRM protected.

Dr. Jerome Heath
Do you want to know how a computer works?

This book discusses many hardware and software aspects of computing. It covers the design of circuits, logic gates, data and data storage, counting and arithmetic, buses and bus designs, network connectivity, the CPU and its design, the computer's instruction set, the modern use of microprocessors, assembly language and the interpretation of human-readable computer code, the activities of the operating system, and computer peripherals. There is a lot to cover, so the book is quite intense. But it is easy to read, and you can read it at your own pace (unless this is part of a course, in which case the professor sets the pace).

It is meant as a preparation for research, development, design, building, and testing of computer equipment.

The book is about how computers and parts of computers are designed, and how connecting these parts together makes the computer capable of processing internally held instructions that can run the computer independently. It covers how data and instructions are processed, stored, and communicated by computers. It also concerns how data is communicated among electronic components, and how software is written and compiled.
 

Finally -

This textbook is a comprehensive work and provides a basis for a detailed understanding of how the computer works. Descriptions include detailed drawings that clarify what the text describes. The book is written for easy understanding, so everyone can read it. But it is quite detailed in its coverage of computer design issues. I think you will enjoy reading it, and it will inform you about computer design as no other textbook can.

The book can be easily understood by anyone who takes the time to study each diagram and its description.

Dr. Jerome Heath

Dr. Jerome Heath

The Etiology of Natural Patterns 
 
Symmetry is pervasive in living things. Nature provides a number of different patterns: spirals, ripples, patterns on birds' feathers, and spots and stripes on animals.

In high-speed photography we can see the crown-shaped splash pattern formed when a drop falls into a pond. Five-fold symmetry is found in such creatures as starfish, sea urchins, and sea lilies. Snowflakes have striking six-fold symmetry. Dunes may form crescents, very long straight lines, stars, domes, parabolas, and longitudinal or sword shapes. There are also other recurring patterns: trees, spirals, meanders, waves, foams, cracks, spots, and stripes.

From the common understanding of entropy, we expect most things in this world to be random instead of ordered, and these random distributions should show dissipation and not order. It takes energy to create order.

Ramsey Theory says that order is the inevitable result of a large number of random trials. Hungarian biologist Aristid Lindenmayer and French-American mathematician Benoît Mandelbrot showed how the mathematics of fractals could create natural-looking patterns in computer output. These are just the beginnings of understanding the harmonics of nature.
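As a minimal sketch of Lindenmayer's idea (my illustration, not an example from the book; the production rule is a standard textbook plant-like one), a single parallel rewriting rule grows a branching, fern-like structure from a one-symbol seed:

```python
# Minimal L-system sketch: a parallel rewriting rule grows a
# complex, plant-like string from a simple seed.
rules = {"F": "F[+F]F[-F]F"}  # a classic plant-like production
axiom = "F"

def rewrite(s, rules):
    """Apply every production rule to each symbol in parallel."""
    return "".join(rules.get(ch, ch) for ch in s)

s = axiom
for _ in range(3):
    s = rewrite(s, rules)

# Read as turtle graphics (F = draw forward, + / - = turn,
# [ ] = save / restore position), the string traces a fern-like shoot.
print(len(s), s[:40] + "...")
```

Each pass elaborates the same branching theme, which is why the results look grown rather than drawn.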

This book looks into the patterns in nature. Instead of just listing the interesting patterns, I am concerned with demonstrating a general etiology of those patterns. This is a new way of looking at the physical universe itself, to understand not only the etiology of its harmonics but the general physics of those patterns. Thus we can see a set of characteristics that allows us to understand, predict, and use the processes behind these patterns.

Entropy and Energy

At equilibrium the distribution of energy is not related to the physical dimensions of the container (or any other physical limitation). The bell curve of the energy distribution comes from graphing energy against energy level, not against the position of those energy levels in the container. That bell curve of energy versus energy level is a characteristic of equilibrium. Inside our container there is no bell-shaped distribution of energy over distance. The motions of the molecules are quite chaotic. The system is at equilibrium, but the individual molecules do not show this.
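As an illustrative sketch (my assumption, not stated in the text: that the "bell curve of energy versus energy level" behaves like the Maxwell-Boltzmann picture), sampling Gaussian velocity components and histogramming kinetic energy shows a distribution that depends only on energy level, never on position in the container:

```python
import random

# Assumed model (my illustration): velocity components are Gaussian
# at equilibrium, so kinetic energy settles into a fixed distribution
# over energy levels while individual molecules stay chaotic.
random.seed(1)
N = 100_000
BIN = 0.5
bins = [0] * 10
for _ in range(N):
    vx, vy, vz = (random.gauss(0.0, 1.0) for _ in range(3))
    e = 0.5 * (vx * vx + vy * vy + vz * vz)  # kinetic energy, mass = 1
    if e < BIN * len(bins):
        bins[int(e / BIN)] += 1

# Crude text histogram: the shape is a property of the whole system,
# not of where any one molecule happens to be.
for i, count in enumerate(bins):
    print(f"{i * BIN:3.1f}-{(i + 1) * BIN:3.1f}  {'#' * (count // 1000)}")
```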

On the whole the system is in equilibrium, but the energy differences in local areas of the system make it possible for local disturbances (perturbations) to affect the system in a special way. The perturbation can affect the distribution of energy in the system without having an effect on the equilibrium. The local imbalances help the perturbation produce wave-like forms, which maintain the energy average that equilibrium requires while providing information to the system about the perturbation and about the system. The wave form balances local low and high energy so as to produce no change in overall energy or its distribution. The wave signal goes out and about the system and provides information about the perturbation and, through reflected waves, about the system itself. All this happens without affecting the equilibrium of the system. The process involves and balances local energy; it does not affect global energy.

These local imbalances can violate the equilibrium of thermodynamic entropy because individual areas of the process become isolated into small groups by perturbations arising from initiating circumstances and activities. At that point small groups of molecules (or other forms of small groups) do not follow the entropy processes of the laws of thermodynamics, since those laws are based on probabilities and the law of large numbers. The system as a whole must still remain at thermodynamic equilibrium (on average).

The building blocks of closed systems in nature are the constraints. What we see of such nature (what derives from the constraints) is based on the harmonics that are tunable to the constraints, on those constraints themselves, and on how the constraints shape a probability distribution. These harmonics are not traceable, as they would be under an equal and opposite reaction in Newtonian processes, since they are created by constraints, the constrained probability distribution, and some energy process not directly related to the result (e.g. air blown across the end of a tube). The results are not specific (not based on an equal and opposite reaction), but are a series of possibilities describable as theme and variation.

Harmonics of Nature

1. The first rule of Harmonic processes is that small groups of entities (including individual entities) do not necessarily follow thermodynamic equations [the rule of large numbers].

2. The second rule of Harmonic processes is that there are one or more constraints and/or initiating activities that ensure the entities within the process are separated into small groups, so that Harmonic processes (i.e. symmetric behavior) become more important than entropy (i.e. dissipation) [the rule of isolation].

3. The third rule of Harmonic processes is that the constraints on the process provide a theoretical envelope or box which determines the ‘space’ the Harmonic processes can act in [the rule of constraints].

4. The fourth (and most important) rule of Harmonic processes is that the Harmonic processes that fit neatly into the envelope of the constraints will occur [the rule of symmetry].

5. The fifth rule (call it the Macro-Harmonics rule) of Harmonic processes is that larger groupings, isolated from each other enough to limit functional interrelationships between the groups, can show Harmonic behavior expressed above normal entropic behavior [the rule of separation].
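Rules 3 and 4 can be made concrete with the tube example mentioned earlier. This is my sketch, with an assumed tube length and speed of sound: only the standing waves that fit neatly into the constraint envelope occur, giving a theme (the fundamental) and its variations (the overtones):

```python
# Sketch of the rule of constraints and the rule of symmetry for a
# tube open at both ends (illustrative; length and sound speed assumed).
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
LENGTH = 0.5            # m, assumed length of the tube

def allowed_modes(length, n_modes, v=SPEED_OF_SOUND):
    """Allowed frequencies f_n = n * v / (2 * L) for an open-open pipe."""
    return [n * v / (2.0 * length) for n in range(1, n_modes + 1)]

# The envelope (the tube) admits only half-wavelengths that fit it:
# theme (mode 1) and variations (modes 2, 3, ...).
for n, f in enumerate(allowed_modes(LENGTH, 5), start=1):
    print(f"mode {n}: {f:7.1f} Hz")
```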

This short book is an excerpt of chapters from Reverse Engineering the Universe via Post-Structuralism.

Dr. Jerome Heath

Dr. Jerome Heath
Demythologizing Jung

Demythologizing and deconstruction are the territory of the post-structuralist. But reconstruction should be the goal of such endeavors. Here the deconstruction of Jung's archetypes is reconstructed into a meaningful, workable, and useful concept of how the mind works.

The process of understanding conversation is to compare the text of a sentence with the contextual information we have. The question is: “How do we store and retrieve the context in our grammar?” It is not stored using relational algebra, the method computer databases use to store and retrieve data efficiently. Relational data storage is not fast enough, and it is not broad enough in its combinatorial strength, to explain the mind's process.

The mind has a way of producing mental objects out of the interpretation of external information. A fresh encounter with the outer world is analyzed by a neural network. The information is carried by nerves from the sensing point. These nerve signals are then filtered through neural networks.

The archetype [Jung] for that area of mental processing is the link with the conscious. From this link, a memory object can be extended from the archetype (as base class). Then the extended archetype layer becomes the output layer of the neural network. Note that the archetype layer serves both as the layer determining the interpretation function (how the input is interpreted) and as the base class that is extended and instantiated into a memory object (based on the neural interpretation).

This is a probabilistic process operating under constraints. The process is probabilistic, but the constraints provide limitations, so the result is controlled by those limitations, producing a meaningful pattern. Thus the constraints prevent dissipation and encourage meaningful results. The constraints in the young child are the archetypes. As we grow older our minds develop aggregate (abstract) classes that are useful as though they were archetypes. These constrain the mental process so that meaningful patterns result from the interpretation process.

The features of the archetypal classes, corresponding to the attributes and methods of a class, are then the same as the neural network's activation functions. With input (our nerves send these signals about our present context), these features are used to interpret the signals (our internal program adapts them to the interpretation of the input signals). When applied to a memory object in our conscious mind, the features (activation functions) are used in a way that makes the memory object useful and meaningful in our thought process. Remember, the class here is a (hidden) layer of the neural network, not a single node. Also, an abstract class can be extended into a memory object (a real class).
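As a toy sketch of this class analogy (my illustration, not the author's code; the archetype name and the sigmoid feature are hypothetical choices), an archetype can be modeled as an abstract base class whose shared features act like activation functions, extended and instantiated into a concrete memory object:

```python
from abc import ABC, abstractmethod
import math

class MemoryObject:
    """Concrete memory object instantiated from an interpretation."""
    def __init__(self, meaning):
        self.meaning = meaning

class Archetype(ABC):
    """Abstract base class: constrains how raw nerve signals are read."""

    @staticmethod
    def activation(signal):
        """A feature shared by the whole layer, like a sigmoid activation."""
        return 1.0 / (1.0 + math.exp(-signal))

    @abstractmethod
    def interpret(self, signals):
        """Turn raw input signals into a memory object."""

class MotherArchetype(Archetype):  # hypothetical archetype name
    def interpret(self, signals):
        # The archetype both filters the input (activation functions)
        # and serves as the base from which the memory object is built.
        return MemoryObject([self.activation(s) for s in signals])

memory = MotherArchetype().interpret([0.2, -1.5, 3.0])
print(memory.meaning)  # an interpreted, constrained pattern
```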
 
(Also see books by Dr. Jerome Heath: https://sites.google.com/site/jbhcontextcalculus/)

Dr. Jerome Heath

Dr. Jerome Heath
Do you want to know how software is developed?

Agile is the new world view of systems development. Structured design is being relegated to systems where development time is short, the way to develop the software is already known (so there is little need for design), and the system will not change in any way during development.
 
Agile methodologies developed over time as developers found success by rejecting the ideas of the structured methodology and the waterfall style of project management.
 
The main strengths of Agile methods are:
Visibility (through the looking glass)
Adaptability (context calculus)
Business Value (incrementally increasing the value)
Less Risk (changes are made on a Just-In-Time basis)
 
The biggest problems with the waterfall techniques are:
Risky and expensive.
Inability to deal with changing requirements.
Problems with late integration.
Extensive rework was always required to make the software usable.
 
Business advantages of Agile development:
Benefits can be realized early.
First to market, with early and regular releases.
Testing is integrated so there is early recognition of any quality issues.
Excellent visibility for key stakeholders ensures expectations are managed.
Customer satisfaction through project visibility; customers own the project.
Incremental releases reduce risks.
Change is accepted, even expected.
Cost control - the scope and features are variable, not the cost.
Developers feel that they are part of the project and enjoy doing the work.
 
In agile development you are using post-modernist methodologies. Agile is post-modern, or post-structural; its goals are the exact goals of post-modernism. Agile and quality-productivity are the most effective post-modernist movements.
 
Agile methodology sidesteps the structured approach and the detailed-documentation part of a project. The concentration is on people: first the customers, then the developers. Structured methodology tried to use “hard science” principles in code development. But users could not understand the science part of the process. So the users were asked some questions (by different people than the ones doing the coding), and then the team went into hiding. They had to develop the code without user interference, since the user would not understand the science of the process. The code was developed in black-box mode. Each module was a black box; the code was secret to everyone but the coder. The only requirement was that the code do only and exactly what the specs called for. The process also required extensive documentation. The black-box and back-room process is ripe for some kind of sleight of hand, so the process needs extensive documentation to prove the work was done scientifically.
 
The structured methodology was always behind schedule. With structured methodology the users or customers were always less than satisfied. In fact, the code was always a surprise compared to what they expected. The back-room and black-box methodology would guarantee that kind of surprise.
 
The Archeology of Communicative Systems Augmented by Communicative Action (an Agile Methodology)

Discourse analysis is being used to critique activities in literate communities.  The texts of the communities are analyzed from the viewpoint that these texts are a part of the discourse in the communities.  That means the discourse and not the text itself is the focal point of the investigation.

It is not too great a leap to see the problems of a computer system and its code as being amenable to such analysis.

The concept of layers is a common method of dealing with cultural issues. To branch into the archeology of a system is a new concept. To some extent we use some of the same methods of analysis, but with an importantly new perspective.

The methods of Foucault are aimed at analyzing the discourse of a literate group in order to see how their discourse has developed and changed over time. He called this archeology because the results are similar to those of an archeological study. He also developed the concept of layers in an archeology of the discourse of a field, layers which, like those of standard archeology, appear as historical layers in the discourse.

Habermas developed the concept of communicative action, which moves discourse analysis into the realm of development. He also developed concepts of meaning in discourse that allow us to examine the discourse for subtleties of meaning, in order to define and change the discourse. The change is to be used to get agreement on the meaning of words and of concepts, in order to lead to an overall agreement.

Ricoeur developed a concept called distanciation that allows us to look at a system and its texts from a less structured approach. He mediated the interpretation of the text by using poetry. This opened the meaning of the text in special ways.

All this can be used to develop a more dynamic, if perhaps less structured, systems analysis.

Such a systems analysis can be effective, particularly in systems where structure has not provided a solution to the issues at hand, or may even have caused the very problems we are trying to solve.

Dr. Jerome Heath, Ph.D.
