22. Simulation and Replication

     Real systems, as represented in thought, are products of definition, of imposing formal structure on the world-in-itself. Any definition is selective and intentional, like projecting patterns into an inkblot.

     Two systems are believed functionally equivalent when they are considered to behave alike in selected ways. A set of propositions is abstracted to express the formalism of which they are both examples. They may differ, however, in a potential infinity of details that are treated as irrelevant and indeterminate with regard to the definition of the system, which is supplied by some intentional agent for its own purposes.

     Science, as an intentional creation, may be a completely formalizable system, but this does not mean that Nature is. That physical processes are rule-governed, and that these rules can be formally expressed in an exhaustive way, are mere assumptions. Physical processes may appear rule-governed for the same reason that they are recognized at all: because they correspond to categories which are intentional creations. Nature may be unfathomable because it is infinitely complex in one or both directions of scale. Alternatively, thought itself may be interminable, condemned to essential incompleteness.

     It may be an error to regard two natural systems as structurally or functionally identical simply because they are indistinguishable when viewed through formalizing spectacles. It is merely an assumption that structure can be exhaustively specified, and a further assumption that function resides in structure.

     These considerations should lead us to regard the concept of modelling with some reserve. Simulation-- which implies only selective functional equivalence-- must be clearly distinguished from replication, which means producing an exact structural copy. If the artificial replication of a natural system is to proceed by formally exhausting the being of the original-- by fully axiomatizing it in some "blueprint"-- then it could in principle be doomed to failure. Reality resists being caged in definitions.

     Some simulations may be adequate to their given purposes, while others are not because the models which generate them omit critical aspects. The closeness of the simulation, like the adequacy of the model, is relative.

     Information as a technical concept is independent of meaning in the way that a formal system is independent of reality and proof is independent of truth. The concept of information assumes an intentionally defined situation imposed upon a potential infinity of possibilities. Information is defined as the number of binary decisions needed to reduce alternatives to one. This presumes a decision procedure for a finite set of defined alternatives. Information can only be as determinate as the situation. And the concept of information parallels that of proof or derivation of a theorem: each binary decision (bit of information) is analogous to a step in a proof. Each step narrows the margin between premise and conclusion, just as each binary decision reduces the alternatives by half.
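
     In the simplest case, where the situation defines N equally likely alternatives (the standard Shannon-Hartley idealization, offered here only to make the arithmetic concrete), this count of binary decisions reduces to a single formula:

          I = log_2(N) bits

     Eight alternatives, for example, require log_2(8) = 3 binary decisions, each halving the field of possibilities, just as three successive inferences might carry a derivation from premise to conclusion.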

     Information is also taken to refer to the order in a system, perceivable by any subject. This notion of information, as an absolute, presupposes an objective situation whose structure is fully (and not merely selectively) accountable in some model. It takes for granted a reality beyond any formalization, while tacitly assuming that this reality is the one given in some human cognitive domain.

     The same considerations apply to behavior as to structure, function and information. Simulating behavior is not the same as fully replicating it. Formal models, and mechanisms constructed to embody them, capture only aspects of the behavior of natural systems. Behaviorism assumes an organism to be a mechanism and, hence, the embodiment of a formalizable system. Inadvertently, it takes selected aspects of the totality of the creature's activity to represent its behavior. This is rather like assuming that a pitcher for a national baseball team and a machine for pitching practice balls engage in the same "behavior". Just as the action of the machine is a grossly chunked version of the man's, behaviors identified by the laboratory observer are parodies of whatever the creature-in-itself does.

     Some philosophers hold that zombies are logically possible: "The very conceivability of a zombie shows that behavior can be explained in terms that neither involve nor imply the existence of experience".[22] (A zombie is defined as a being functionally identical to a conscious person, except that the zombie is not conscious.) I suspect that what is conceived as the zombie is not actually identical to what is conceived as the conscious person. There is a functional difference-- precisely the function known subjectively as consciousness. Perhaps we do not yet know how to fully specify that function as behavior. But this is due to inadequate investigation rather than to the absence of such a function.

     The zombie (or "absent qualia") argument against functionalism does not work, because it implicitly identifies certain limited aspects of behavior as "function". It depends on mistaking simulation for replication or identity. On the other hand, it could be argued that the functionalist account does not work either, for the same reason: the true or complete functioning of an organism cannot be known exhaustively.

     If it could be proven that two organisms (or an organism and an artificial system) are exhaustively identical in their functioning, and one of them happens to be conscious, then the other would necessarily also be conscious. The zombie concept tacitly assumes that the zombie is merely a simulation of the conscious person-- or else one altered in specific ways that render the two not functionally equivalent.