Epistemological Context in Social Simulation: Merging Empirical Realities
and Data for Validation Strategies Using Supercomputers, DNA Microarray
Analysis, and Case-Based Reasoning

by Dallas F. Bell, Jr.
Abstract: Social simulation is increasingly proving useful for human
decision making by facilitating the sorting of possible behavioral options
into credible probabilities. The effort to make simulation more accurate
has brought focus to the epistemological foundations of decision making.
Locke, Berkeley, Hume, and other philosophers provide historical answers
to questions concerning how we know. With their reasoning tools of
induction and deductive confirmation, the societal universal center of
gravity can be reflected much as Galileo's principle encompassed motion
and matter. Beginning at the highest societal level, the nation-state, we
may examine the nation-state's institutions and the institutions'
demographics. The common need levels of each individual can then be
observed as being pursued by behavior, or motion, that indicates the
individual's epistemological rationality based on a core theological
belief. That structure, or matter, provides the simulation parameters of
potential for both the human brain and supercomputers. Recent examples of
merging empirical realities and data in DNA microarray analysis and
case-based reasoning illustrate the success of the validation strategies.
Thereby, epistemology in social simulation can be seen in its proper
context.
Keywords: epistemology,
social simulation, supercomputers, systematic political science
1. Introduction
Social simulation can be useful to human decision making by facilitating
the sorting of possible behavioral options into likely courses of action.
Conversely, the actual occurrences of the simulated behavior can validate
the accuracy of the decision-making foundations supplied as input.
Understanding the foundations of decision making has brought due focus to
epistemology. Galileo played a key role in the history of science and
philosophy. He provided a principle that encompassed both the motion and
the nature of matter (Galilei 1638/2000). The social sciences must have
the same kind of universal center of gravity for accurate simulations.
Simulators must ask how the subject, the simulator, can know the object
of the simulation. They must also determine the motives of both the
subject and the object.
2. Epistemology
2.1 General Context
If a car is observed to be blue, it is not in reality blue. The car is
absorbing the spectrum of all colors except blue, which is being
reflected and perceived by the observer. Simulation must reflect reality
and not mere perception. If humans are observed fleeing from an office
building that is filling with smoke, the scene would appear to reflect
panic. Those office workers are not, in reality, inherently panicky.
They have processed the input of smoke and expressed an output, fleeing
the building, based on inherent decision-making abilities.
The philosopher David Hume said we do not see the cause of the universe
but only the effect. We know that if there is an effect, there must have
been an antecedent cause. Years earlier, John Locke had stated that
correspondence meant truth comes from reality and not from human
imagination, contrary to Plato, Augustine, and other rationalists (Sproul
2000). Rationalists held that small children innately understand the law
of noncontradiction, the idea that something cannot be both here and
there, or both this and that, simultaneously. Aristotle, Thomas Aquinas,
and other empiricists acknowledged that all possible examples cannot be
known and that the senses can be deceived. Therefore, a standard of 100
percent certainty is not a viable goal for simulation. Later, George
Berkeley added that truth corresponds to reality as perceived by the
infinite God. Thus, reality is objective and not subjectively based on
finite perceptions. Simulation accuracy then requires more than the
inclusion of materialism, things that can be experienced through the
senses. Simulations must also realistically consider the nonmaterial
(e.g., love and justice). If the standard required empirical
verification, the standard itself could not be verified.
Love and justice, two examples of nonmaterial reality, cannot be seen,
but their effects on behavior make their existence a factor in behavioral
motivations. The alternative of "no" love and "no" justice is untrue
because humans innately compare via negationis, meaning love or "not"
love and just or "not" just. This example of Boolean logic reinforces
linguistic efforts currently being undertaken to mine and analyze related
corpora, which provide valuable input for simulations. Additionally, the
truth of a cause may be questioned, but that query process should not
treat an unseen or inexactly known cause as a violation of the law of
causation.
In the 1700s, Immanuel Kant merged rationalism and empiricism into the
understanding that we know through the senses and experience together
with a priori knowledge held in intuitive categories in the brain. That
balanced idea was biblically expressed in the writings of David, who said
that the heavens declare the glory of God (David 1004-965 B.C.), and in
Paul's words that the invisible things of Him (God) from the creation of
the world are clearly seen (Paul 55-56 A.D.). David and Paul held that
the senses and experience confirm the innate knowledge of the infinite
God, unlike Kant, ironically named Immanuel, which is Hebrew for "God
with us," who thought the seen could not be mixed with the unseen.
Kant's agnostic view led Georg W. F. Hegel to merge, or synthesize, the
seen and unseen with a dialectical approach in which a thesis that
generates an antithesis is synthesized into a higher truth. Simply put,
those ideas of a subject and its object are thought to be evolving into
the highest truth: that all things are made up of the parts of a finite
god.
Hegel's method was adopted by those like Karl Marx, as evidenced in
dialectical materialism, which formulated communism from its socialist
utopian eschatology. Communism and socialism have thus far resulted in a
minimum of 100 million murders (Courtois 1997/1999). This deadly
eschatological view continues to be held by many today. Their general
atheist premise is that anything that works, no matter how temporarily,
is to be considered truth. Each such truth is thought to relate
chaotically to other truths rather than being a subset of a total truth.
History is replete with the destruction of both individuals and societies
that have implemented those failed theological beliefs. The emergence of
a prevalent world view that is likewise unanchored to natural law will
cause chaos from the corresponding conflict. This should eventually
produce a world leader to fill the vacuum of peace. Such an opportunist
could garner rapid popular support by declaring a utopian end to
conflict. Though he may offer the hope of world peace, his hidden
nefarious agenda would be revealed in time, as with all other despots
before him.
2.2 Social Simulation Context
The universal center of gravity for social simulation may be found by
inducing the structure of societies, beginning at the level of
nation-states and their dominant eschatology. Nation-states are composed
of the descending institutions of government, business, church, and
family. These institutions are made up of subset categories of
individuals. Individuals must meet their common ascending needs of
survival, economic security, love and affection, status and self-esteem,
and self-actualization. Those individual needs are pursued through the
innate epistemological process of storing input based on what is
considered good, or rational, and what is considered evil, or irrational.
These epistemological values are calibrated by the individually chosen
divine authority and standard for what is considered good and/or evil.
This reflects the universal center of gravity for social science and its
philosophers--theology.
Systematic political science confirms the previously discussed structure
of societal induction by charting its conclusion with deduction,
beginning from any one of three tracks of theological choices. The
natural laws of freewill (NLF) provide the anchors for the Manifold
Equation of Theological Asymmetry (META) game theory process. The
theological tracks correspond to the epistemological track, which
corresponds to the individual and subsequent nation-state track formed
from the eschatological belief of the originating theology (Bell
2002-2006).
On a nonspecific micro societal level, the theological categories of
people fleeing an office building filled with smoke may not be known, but
estimates can be made from appropriate sample data, though such data
should not be considered stable. It would be useful for crisis managers
and risk analysts to know whether the office workers held a theology like
that of Soren Kierkegaard, who promoted an epistemology of
self-sacrifice, which would make them likely to assist fellow workers
injured by smoke or fire. On the other hand, it would be equally
important to identify office workers holding a theology like that of
Friedrich Nietzsche, the clinically diagnosed madman, and his followers
such as Adolf Hitler, Sigmund Freud, Carl Jung, Alfred Adler, William
Butler Yeats, and George Bernard Shaw (Wicks 2004). Nietzsche
dialectically reasoned that man must become a Superman and embrace the
will to power. This involved epistemologically rejecting the
self-sacrificing attributes of pity, mercy, grace, etc., and would make
such believers very unlikely to assist co-workers in any situation that
is not self-serving.
There are groups, such as college fraternities, whose membership can be
specifically known and whose data can be considered stable. On a macro
societal level, more data is available and provides more stability for
nation-state simulation. For example, the likelihood, or probability, of
a government either rejecting or becoming an ally of terrorists could be
reasonably determined.
3. Supercomputers and the Human Brain
The human brain contains billions of neurons, each connected to on the
order of 10,000 synapses, depending on the specific neuroanatomy. That
cluster of neurons forms a parallel information-processing system. A
person can simultaneously write a letter while listening to a class
lecture and smelling the cologne of another student who is scratching his
back on his chair. That ability contrasts with conventional computers,
which execute a single series of instructions on a single processor.
Estimates of the number of action potentials and synaptic potentials in
the human brain are generated from a combination of morphology, imaging
during activity, and energy consumption. It is difficult to calculate
the brain's floating-point operations per second (flops) because,
according to the National Center for Supercomputing Applications (NCSA),
the coding is analog, meaning the coding modality is the frequency of
action potentials rather than information coded in each one (Jakobsson
2006). Estimates of human processing ability range from 100 teraflops
(Tflops), or 100 trillion flops, to 100 petaflops (Pflops), or 100
quadrillion flops. Studies conducted on the intellectual ability of
identical twins raised in different environments indicate that their
inherited problem-solving ability (IQ) remained the same relative to
their ages (Garlick 2002). That range of IQ potential may be achieved by
stimulation that causes the neurons to adapt, a process referred to as
neural plasticity.
In the future, supercomputers are expected to reach the processing
ability of humans. Artificial intelligence (AI) systems will never be
grounded in reality, which requires the human quality of self-will that
motivates behavior. However, supercomputers will be useful, and they
will need to react epistemologically to the incomplete input of their
finite human programmers by means of probabilities. Those probabilities
may be used in a Bayesian sense to enhance decision making. The
theological values the programmer inputs for a simulation will determine
its accuracy and benefit: the more the programmer is aligned with the
realities of natural law, the higher the probability of accuracy. Just
as a well-known chess program assigns a processor to each of the 64
squares on a chess board, an ultimate supercomputer for social simulation
at the highest levels would need to assign a processor section, minimally
equal to a human brain, to each of the 512 subsets of the systematic
political science model. Of course, isolated or limited
microsimulations, such as modeling the traffic patterns in an office
building, could operate on a much smaller system.
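As an illustration of the Bayesian updating mentioned above, the
following minimal Python sketch revises the probability of a hypothesis
after an observation. The hypothesis, prior, and likelihood values are
hypothetical placeholders, not figures from this paper.

    # Minimal sketch of Bayesian updating for a simulation probability.
    # The hypothesis, prior, and likelihoods below are hypothetical.

    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        """Return P(H|E) via Bayes' rule: P(E|H)P(H) / P(E)."""
        p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
        return p_e_given_h * prior / p_e

    # Hypothesis H: a simulated actor will cooperate during a crisis.
    prior = 0.50              # estimate before any observation
    p_signal_if_h = 0.80      # chance of the observed signal if H is true
    p_signal_if_not_h = 0.30  # chance of the same signal if H is false

    posterior = bayes_update(prior, p_signal_if_h, p_signal_if_not_h)
    print(f"Updated probability: {posterior:.2f}")  # about 0.73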
4. DNA Microarray Analysis
To ensure that emails sent over the Internet are not changed in
transmission, a "check sum" protocol was developed. Bits of information,
each beginning as a 0 or 1 value, may become garbled during transmission
by being inverted to the opposite value. The communication is checked by
counting the number of 1s in the message. If the number is odd, the last
bit is set to 1; otherwise, it is set to 0. By comparing the number of
1s from the sender with the value of the last bit on the receiving end,
the receiving computer can determine whether the message was accurate.
If it was not, the receiver's computer can request that the sending
computer resend the email.
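The parity-style check described above can be sketched in a few lines of
Python; the message bits are arbitrary examples.

    # Sketch of the check described above: append a final bit recording
    # whether the count of 1s is odd, then verify it on receipt.

    def add_check_bit(bits):
        return bits + [1 if sum(bits) % 2 == 1 else 0]

    def verify(received):
        payload, check = received[:-1], received[-1]
        return check == (1 if sum(payload) % 2 == 1 else 0)

    sent = add_check_bit([1, 0, 1, 1, 0, 0, 1])  # four 1s, so check bit is 0
    print(verify(sent))        # True: message arrived intact

    garbled = list(sent)
    garbled[2] ^= 1            # one bit inverted in transit
    print(verify(garbled))     # False: the receiver requests a resend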
Similarly, DNA microarray data points in a time series are now being
compared against a summary of the temporal response (Bar-Joseph 2005).
If the two sets of results are equal, the DNA microarray time series is
considered real. If not, a gene's activation is considered to have been
missed. This method can be applied to social simulation. A time series
of human institutions can be sampled and compared to the static
systematic political science template. If there are inconsistencies, an
inquiry would be needed and adjustments made. This process will also aid
in overcoming synchronization loss. Like large groups of living cells,
humans that begin at similar points in time eventually become
asynchronized in their activity. If specific data is known, all relevant
individuals can be appropriately tagged and their asynchronization
observed.
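A minimal sketch of that comparison step follows, assuming the sampled
series and the static template can both be represented as simple numeric
sequences; the values and tolerance are hypothetical.

    # Sketch of comparing a sampled time series against a static template
    # and flagging the points that call for an inquiry. Data are invented.

    def find_inconsistencies(samples, template, tolerance=0.1):
        return [t for t, (s, ref) in enumerate(zip(samples, template))
                if abs(s - ref) > tolerance]

    template = [0.2, 0.4, 0.6, 0.8, 1.0]  # static template values
    samples  = [0.2, 0.4, 0.9, 0.8, 1.0]  # sampled institutional series

    flagged = find_inconsistencies(samples, template)
    print(flagged)  # [2]: an inquiry and adjustment would be needed there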
5. Case-Based Reasoning
Case-based reasoning (CBR) has been documented to improve prediction
accuracy on microarray data sets by approximately 10 percent (Arshadi and
Jurisica 2005). The problem paradigm of CBR does not rely solely on
general knowledge of a problem domain, or on associations between
descriptors and conclusions. CBR can also use the specific knowledge of
previously experienced problems, or cases. A new problem may be solved
by finding a similar past problem and reusing that problem's solution.
Each time a problem is resolved, its data become available to be
reapplied to future problems.
If a government's ambassador were negotiating a difficult situation, a
previous case could be applied and a decision reached. That
problem-solving and learning feature of CBR should also incorporate all
types of knowledge within the domain of systematic political science.
Thereby, post hoc tendencies, or the fallacy of arguing from a temporal
sequence to a false causal relation, may be avoided. A common humorous
example is to wrongly surmise that a rooster causes the sun to rise
because the sun is observed rising each morning after the rooster has
crowed.
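A minimal sketch of the retrieve-reuse-retain cycle, applied to a
negotiation scenario like the one above, might look as follows; the case
features, solutions, and distance measure are all hypothetical.

    # Sketch of the CBR cycle: retrieve the most similar stored case,
    # reuse its solution, and retain the new case. All data are invented.
    import math

    class CaseBase:
        def __init__(self):
            self.cases = []  # (feature_vector, solution) pairs

        def retain(self, features, solution):
            self.cases.append((features, solution))

        def retrieve(self, features):
            # Nearest stored case by Euclidean distance.
            return min(self.cases,
                       key=lambda c: math.dist(c[0], features))[1]

    cb = CaseBase()
    cb.retain([0.9, 0.1], "offer economic incentives")  # past negotiation
    cb.retain([0.2, 0.8], "apply diplomatic pressure")  # another past case

    problem = [0.85, 0.2]
    solution = cb.retrieve(problem)  # reuse: "offer economic incentives"
    cb.retain(problem, solution)     # retain the resolved case for reuse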
6. Conclusion
The introduction of this paper briefly reviewed the philosophy of how we
know. The societal model for simulation was demonstrated by the
systematic political science template. That was followed by a
description of the potential of knowing for humans and for supercomputer
AI. DNA microarray analysis was shown to have been enhanced by seemingly
unrelated computing methods and by CBR; in turn, these may improve the
art of social simulation. It is becoming increasingly clear that when
empirical realities and data are properly merged, they can serve to
validate simulation strategies that would be wholly inaccurate without
their epistemological context.
References
Arshadi, Niloofar and Jurisica, Igor (2005) Data Mining for Case-Based
Reasoning in High-Dimensional Biological Domains. IEEE Transactions on
Knowledge and Data Engineering, Volume 17, 1127-1137.
Bar-Joseph, Ziv (December 2005)
Carnegie Mellon-Led Research Team Transforms DNA Microarray Analysis
with Ideas from a Standard Internet Communications Protocol. A
press release by Carnegie Mellon University.
Bell, Dallas F., Jr. (2002-2006)
Parts I-IX. A series of papers on systematic political science.
Courtois, Stephane, editor (1997/1999) Le Livre Noir du Communisme [The
Black Book of Communism], Cambridge, Massachusetts: Harvard University
Press.
Galilei, Galileo (1638/2000)
Dialogues Concerning Two New Sciences, Toronto: Wall and Emerson.
Garlick, Dennis (2002) Understanding the Nature of the General Factor of
Intelligence: The Role of Individual Differences in Neural Plasticity as
an Explanatory Mechanism. Psychological Review 109, 116-136.
Jakobsson, Eric (February 2006) An email exchange initiated by Dallas F.
Bell, Jr. and facilitated by Thom Dunning, the director of the National
Center for Supercomputing Applications (NCSA).
King James Version Bible: (David 1004-965 B.C.) Psalms chapter 19, verse
1; (Paul 55-56 A.D.) Romans chapter 1, verse 20.
Robinson, Thomas (1992)
The Bible Timeline, Nashville: Thomas Nelson Publishers.
Sproul, R. C. (2000) The
Consequences of Ideas: Understanding the Concepts That Shaped Our World,
Wheaton, Illinois: Crossway Books.
U.S. Census Bureau (2005) Population
Division, International Programs Center.
Wicks, Robert (2004) Friedrich
Nietzsche, Stanford Encyclopedia of Philosophy.
ALL RIGHTS RESERVED © 2006 DALLAS F. BELL, JR.