**Introduction**

Computer science is formulated with concepts borrowed from mathematics. Even though mathematics defines mathematical computation and computer science is about computation, it is argued here that there are fundamental differences between the two, that computer science is not well served by the borrowed concepts, and that there exists a conceptual grounding that more effectively addresses the goals and problems of computer science.

At the center of mathematics is a mathematician performing computations. At the center of computer science are computations performing all by themselves. Computer science, quite reasonably, was founded on computational concepts borrowed from mathematics. These concepts, however, are guides for the mathematician with pencil and paper, who provides liveness, flow coordination and memory; they were never intended to stand on their own apart from the mathematician. In computer science, these concepts are realized by emulating a mathematician. This might seem perfectly reasonable; indeed, computer science was founded with a machine explicitly emulating a mathematician with pencil and paper. But nature computes without reference to mathematician, pencil or paper, and computer science strives to encompass all computation, including the computations of nature. A mathematician’s brain, for instance, does not compute by emulating a mathematician.

The ghost of the mathematician that haunts computer science is an impediment to understanding computation in itself, and while we have gotten a lot of mileage out of emulating the mathematician, we can get even more out of transcending the mathematician. Part 1 focuses on difficulties with two concepts borrowed from mathematics, Boolean logic and the sequential process; Part 2 suggests an alternative abstraction that better serves the goals of computer science.

**Boolean Logic**

In mathematics, Boolean logic, in conjunction with a mathematician, is considered a theoretically fundamental primitive of computation. But enlivened Boolean logic on its own, without a mathematician, is an incomplete model of computation: it cannot determine when a resolution has begun or when a resolution has completed, and it can generate incorrect results.

If an input is presented to a Boolean logic network that causes no transition activity (the same input as the previous input), the logic cannot, itself, determine that a resolution has begun or that a resolution has ended.

If an input is presented that causes transition activity in the network (different from the previous input) and there are delays associated with the functions of the network, the functions will temporarily output incorrect results, which are presented to successor functions. The network asserts a chaos of incorrect transitions (glitches) before the input completely propagates and the network settles, asserting correct result values for the presented input. Again, the logic cannot determine, amidst the chaotic transitioning, when an input is completely presented or when a resolution is completed.
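The glitching can be seen in a small simulation. The following Python sketch is illustrative only (the circuit, the unit delays and the tick counts are assumptions, not anything from the text): it models a unit-delay network computing y = (a AND b) OR ((NOT a) AND c), a classic static-1 hazard. With b = c = 1 the settled output is 1 for either value of a, yet when a falls the inverter's delay leaves both AND gates low for one tick and the output glitches to 0.

```python
# Unit-delay simulation of a static-1 hazard in y = (a AND b) OR ((NOT a) AND c).
# Each tick, every gate computes its new output from the PREVIOUS tick's values,
# modeling one unit of gate delay. Names and delays are illustrative assumptions.

def step(state, a, b, c):
    """Advance every gate one unit delay, reading last tick's values."""
    n, g1, g2, y = state
    return (1 - a,     # n  = NOT a
            a & b,     # g1 = a AND b
            n & c,     # g2 = (NOT a) AND c, using n from the previous tick
            g1 | g2)   # y  = g1 OR g2, using g1, g2 from the previous tick

# Settle the network with a = 1, b = c = 1.
state = (0, 1, 1, 1)
for _ in range(4):
    state = step(state, 1, 1, 1)
assert state[3] == 1            # stable, correct output

# Drop a to 0 and record the output each tick.
trace = []
for _ in range(5):
    state = step(state, 0, 1, 1)
    trace.append(state[3])
print(trace)   # -> [1, 0, 1, 1, 1]: the 0 is the glitch; the final 1s are correct
```

Nothing in the network itself marks the moment the trace becomes trustworthy; only counting ticks from outside distinguishes the transient 0 from a correct result.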

There must be an extra-logical time interval referent, coordinated with the actual delay behavior of the Boolean network, to control the presentation of input (the beginning) and to wait long enough for the worst-case chaotic transitions of the network to settle to a stable output before accepting that output (the end). The waiting long enough is accomplished with explicit memory elements (registers, acting as data lifeboats) which, at the beginning of each interval, present stably maintained input to the Boolean network, which during the interval isolate their contents from the indeterminate chaos of the transitioning Boolean network, and which at the end of the interval accept the stably presented output data of the Boolean network into the memory elements.
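A minimal sketch of this coordination, assuming a hypothetical unit-delay hazard circuit and an assumed worst-case settling time of 3 ticks (none of these specifics come from the text): inputs are held stable for a whole interval, and the output is sampled only when the interval ends, so the mid-interval glitching is never observed.

```python
# Sketch: a clock interval sized to the worst-case settling time tames a
# glitching network. Circuit: y = (a AND b) OR ((NOT a) AND c), unit delays.
# Simplification: each interval starts from a cleared network.

def settle(a, b, c, ticks=3):
    """Hold (a, b, c) stable (the input register's job) and let the
    unit-delay gates transition until the interval ends."""
    n = g1 = g2 = y = 0
    for _ in range(ticks):
        # all right-hand sides read the previous tick's values
        n, g1, g2, y = 1 - a, a & b, n & c, g1 | g2
    return y   # sampled by the output register only at the interval's end

# With b = c = 1 the correct output is 1 for either value of a.
# Sampling only at interval boundaries, the register never sees a glitch.
sampled = [settle(a, 1, 1) for a in (1, 0, 1, 0)]
print(sampled)   # -> [1, 1, 1, 1]
```

The interval and the registers do the ghost-mathematician's work: they decide when a resolution begins, wait out the chaos, and remember the result.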

The recurring time interval and the memories emulate the flow coordination and memory of a mathematician with pencil and paper in relation to an enlivened Boolean logic network. Boolean logic cannot compute on its own logical merits without this emulated ghost of a mathematician.

So what, you say? It works. But what if there is a logic that can compute on its own logical merits, that has no need of any extra-logical referent such as the ghost’s time interval and memories? Might it serve better than Boolean logic, both conceptually and practically?

**The Sequential Process**

In mathematics, step by step sequence is fundamental to proof. If a proof proceeds in any manner other than one humanly understandable step at a time, it cannot be accepted. Sequence of computation is not as fundamental but is deeply ingrained by history, practice and concept. Historically, the 12th century place-value arithmetic algorisms specified sequences of steps that anybody could follow to perform a computation. This meshed with a central feature of mathematical practice: the rare resource of a single mathematician, which must be used over and over in sequence to perform a computation. Computing by a sequence of steps became standard practice.

The sequence of proof and the sequence of computation became conflated in Turing’s machine, which was a proof about computation in which sequential computation was an integral part of the proof. Turing’s machine became a definition of mathematical computation, which eventually became the notion of the algorithm, which, in turn, became a fundamental paradigm of computation in computer science.

*“The notion of the algorithm is basic to all computer programming…”(1)*

*“One of the concepts most central to computer science is that of an algorithm.”(2)*

The notion of the algorithm is typically defined by simply presenting a list of properties that an expression must possess to qualify as an algorithm. The following is typical.

1. An algorithm is a sequence of steps that must terminate in a finite number of steps.

2. Each step must be precisely defined.

3. An algorithm must effectively and deterministically yield a correct solution. (3)

It is easy to see how this list of restrictive characteristics serves to define what might be acceptable as a mathematical proof or computation, but what conceptual service does the distinction algorithm/not algorithm perform for computer science? Is the expression of one fundamentally different from the expression of the other? Is one interpreted differently from the other? Are algorithms first-class citizens in some sense and non-algorithms second-class citizens?

Important programs do not qualify as algorithms. An operating system is not supposed to terminate nor does it yield a singular “solution”. A program with truly random input is not an algorithm. No program with a bug can be an algorithm and it is generally accepted that no significant program can be demonstrated to be bug free. Programs and computers that utilize concurrency are not algorithms. What does it mean when a sequential program qualifying as an algorithm is parallelized by a vectorizing compiler, and no longer qualifies as an algorithm?

These questions have not gone unnoticed.

*“…there is an extension of the notion of algorithm (called nondeterministic algorithms).”(4)*

*“Any computer program is at least a semi-algorithm and any program that always halts is an algorithm.”(5)*

*“A procedure which has all of the characteristics of an algorithm except that it possibly lacks finiteness may be called a ‘computational method.’”(6)*

*“There is another word for algorithm which obeys all of the above properties except termination and that is computational procedure.”(7)*

*“An algorithm A is a probabilistically good algorithm if the algorithm “almost always” generates either an exact solution or a solution with a value that is “exceedingly close” to the value of the optimal solution.”(8)*

*“The procedure becomes an algorithm if the Turing machine always halts”.(9)*

*“By admitting probabilistic procedures in algorithms….”(10)*

*“…if, after executing some step, the control logic shifts to another step of the algorithm as dictated by a random device (for example, coin tossing), we call the algorithm random algorithm.”(11)*

*“An algorithm which is based on such convergence tests is called an infinite algorithm.”(12)*

*“Algorithms that are not direct are called indirect.”(13)*

*“We drop the requirement that the algorithm stop and consider infinite algorithms.”(14)*

With these exercises in labeling, a computation can now not terminate, not be deterministic and not yield a specific solution (that is, not be an algorithm). All that is left is “sequence of precisely defined steps”. Instead of trying to “fix” the notion of the algorithm, perhaps we should consider the possibility that the notion is not as fundamental to computer science as once supposed.

Mathematicians and computer scientists are pursuing fundamentally different aims.

*Mathematics is concerned with the nature of specific computations
independent of how they might be expressed.*

*Computer science is concerned with the nature of the expression of computation
independent of any specific computation.(15)*

The primary questions of computer science are not of computational possibilities but of expressional possibilities.

*“computer science itself becomes no more (and no less) than the discipline of constructing appropriate descriptive languages.”(16)*

Is limiting description to “a sequence of precisely defined steps” a help, a hindrance, or simply beside the point?

What is a step? How is it sequenced? What does it do? The notion of the algorithm remains silent. A good engineer, however, can fill in the blanks. If there is sequence, there must be a sequence controller to instantiate each step in the proper order. Each step must perform some behavior which must ultimately accumulate to realize the computation, so there must be an accumulating memory equally accessible to all steps in the sequence. Again, the engineer, filling in the blanks of an incomplete model, emulates the liveness, flow control and memory of the mathematician with pencil and paper.

**The Borrowing**

Boolean logic and the sequential process are not incomplete because of an inadequate application of thought. Both models of computation serve quite adequately in association with a mathematician. Given these adequate models of mathematical computation, all that was needed was to make them work without a mathematician, which can be considered an engineering problem, in contrast to a conceptual problem. Methods of emulating the behavior of the mathematician were quickly developed, methods so serviceable and economically viable that they still serve today, albeit with rapidly decreasing economy. So discovering a coherent model of computation from first principles that do not include a mathematician was not a burning concern. Early efforts in this direction, such as asynchronous circuits, cellular automata, neural networks and the perceptron, did not thrive.

We have been very successful at emulating the mathematician, but can computer science dispense with the ghost of the mathematician and model computation with its own unique conceptual foundations? We must first understand why the ghost of the mathematician persists.

**The Confusions of Sequentiality**

Emulating the mathematician results in concepts of computation locked in a tangle of mutual dependencies that collectively resist the evolution of any individual concept.

**The Inherent Concurrency of Sequentiality**

Sequence, in and of itself, does not specify a computation. A sequence has to realize the correct dependency relations among the steps. A computation is specified as a progression of dependent operations, as with an algebraic equation, a flow chart or a programmer’s jottings. This dependency specification inherently specifies operations that can occur simultaneously or in any order. The “simultaneously” specifies possible concurrency. The “in any order” means that the possible concurrency of the dependency specification can be mapped to a sequence of operations. But it also means that there can be a multitude of correct sequential orders that realize the dependency relations of the computation.

Below at left is a dependency specification. A and B can occur in any order before D, but if either A or B occurs after D the sequence is incorrect. C can occur in any order with A, B and D. At right are the eight correct sequences derivable from just the shaded portion of the dependency specification. The other 16 possible sequences are all incorrect. The only means of distinguishing a correct sequence from an incorrect sequence in a typically enormous set of possible sequences (one might call this the chaos of sequentiality) is to refer to the dependency specification with its expression of concurrency.
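The eight-of-twenty-four count can be checked mechanically. This Python sketch (the step names follow the text; the set-based representation of the dependencies is an assumption) enumerates every ordering of A, B, C and D and keeps those in which both A and B precede D, C being unconstrained:

```python
from itertools import permutations

# Dependency relation from the text: A and B must both precede D;
# C may occur anywhere.
deps = {'D': {'A', 'B'}}   # D depends on A and B

def respects(order):
    """True if every step appears after all the steps it depends on."""
    pos = {step: i for i, step in enumerate(order)}
    return all(pos[p] < pos[s] for s, preds in deps.items() for p in preds)

orders = list(permutations('ABCD'))
correct = [o for o in orders if respects(o)]
print(len(correct), len(orders) - len(correct))   # -> 8 16
```

The count of correct orders can only be produced by consulting the dependency relation itself; the sequences alone carry no mark of their own correctness.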

The dependency relations and their possible concurrency must be understood to construct a correct sequence. The expression of sequence derives from the expression of concurrency. It cannot be otherwise.

It would seem that the dependency specification might make a more concise and coherent object of primitive consideration for computer science than sequence. So, why is it not? Blame the concurrency.

**The Fear of Concurrency**

Time indeterminacy:

*“The introduction of concurrency into computation opens Pandora’s box to release the possibility of non determinacy and a host of other complications, including deadlock and livelock …. Events within concurrent systems do not necessarily occur at predetermined times, but may occur in different orders depending upon the relative speeds of different system parts. The usual assumption, which is also physically realistic, is that relative speeds cannot be controlled so precisely that the designer or programmer can depend upon them to enforce sequencing requirements.”(17)*

Speed indeterminacy:

*“Unfortunately, asynchronous circuits are difficult to design because they are concurrent systems. When trying to understand a synchronous circuit, it is possible to pretend that events occur in lock-step. Variations in the speeds of functional elements can be ignored, since the clock is timed to allow for the worst-case delays. The lock-step model does not work for speed-independent designs — a correct design must exhibit proper behavior no matter what the speeds of its components. When several components are operating concurrently, there may be a very large number of possible execution paths, each corresponding to a different set of delays in the components. Non determinism resulting from unknown or varying delays is the essential problem in all concurrent systems. For a property to hold for a concurrent system it must hold for every possible execution.”(18)*

Execution path indeterminacy:

*“Although the architectural freedom of asynchronous systems is a great benefit, it also poses a difficult challenge. Because each part sets its own pace, that pace may vary from time to time in any one system and may vary from system to system. If several actions are concurrent, they may finish in a large number of possible sequences. Enumerating all the possible sequences of actions in a complex asynchronous chip is as difficult as predicting the sequences of actions in a schoolyard full of children. This dilemma is called the state explosion problem. Can chip designers create order out of the potential chaos of concurrent actions?”(19)*

To understand what these authors are talking about, consider again the Boolean logic combinational network. It is a directly instantiated dependency specification with all the attendant concurrency, and it delivers incorrect behavior (it glitches). The relative speeds of the functions can vary, and their outputs occur in different orders, effecting a variety of execution paths and causing a flurry of incorrect outputs (the exploding state space).

The above quotes imply that the only way to avoid the glitching is for all input transitions to arrive simultaneously or in specific orders, neither of which can be expressed in terms of Boolean logic nor assured by timing in a practical Boolean logic network. The stock solution is to isolate and contain the indeterminate chaos (glitching) of the combinational network by bounding it with a time interval within which the Boolean network will stabilize to a correct output. The time interval, recurring, sequences the behavior of the network, turning the Boolean network as a whole into a timed step of sequential computation. The chaos of concurrency is tamed with time and sequence. Sequence thus appears to be more fundamental than concurrency. But concurrency lurks in the Boolean combinational network within each sequential step. So, is not concurrency more fundamental than sequence?

From the mathematician’s point of view, any concurrent expression (a so-called partial ordering) can be reduced to a sequential expression (a so-called total ordering). The chaos of concurrency can be tamed simply by mapping it to a controlled sequence. Sequentiality appears to reside at a reductive conceptual bottom. But sequence control cannot be primitive. The most primitive sequence controller cannot itself be sequence controlled, or “controlled” in any sense, but must proceed in terms of its own intrinsic behaviors, which may include concurrent behaviors that cannot be reduced to controlled sequence. So concurrency must, ultimately, reside at the reductive conceptual bottom.

On one hand, it appears that concurrency can and should be characterized in terms of sequentiality: cooperating sequential processes(20), communicating sequential processes(21), interleaved sequential processes(22), and so on.

On the other hand, as shown, sequentiality derives from concurrency, sequential steps contain concurrency, and sequential control cannot preclude all concurrency. Sequentiality cannot fully characterize concurrency and so cannot be primitive.

The difficulty is that characterizing concurrency in terms of sequentiality, which is itself derived from concurrency, circularly begs the question of primitivity and builds a house of cards in the clouds. Concurrency is fundamental to computer science no matter how you cut the pie, and it must be confronted directly and primitively.

Is the chaos of concurrency an existential inevitability or is it an artifact of a conceptual point of view? Why is a Boolean combinational network chaotic?

**The Mathematical Function**

A Boolean combinational network glitches because it is a network of mathematical functions. A mathematical function is a mapping rule for the mathematician that was never intended to behave on its own; when directly instantiated, it becomes a continuously responsive mapping behavior, continually presented with input values and continually presenting a result value. A continuously responsive behavior can temporarily output an erroneous result if its inputs do not transition simultaneously. The erroneous outputs of functions early in a network of functions will present incorrect result values to subsequent functions, resulting in a blizzard of incorrect values rippling through the network, followed by a wavefront of correct values that eventually stabilizes the output of the network to the correct resolution of the presented input values. There is no way to determine, from the network behavior itself, when the stabilization occurs and, as lamented above, the blizzard of incorrectness cannot be avoided either in terms of the logic or with timing control. An external influence must determine when the computation begins and when the computation ends.

But what if a function behavior could appreciate when a mapping starts and when a mapping finishes, and output only a correct result? Might a concurrent network composed of such behaviors exhibit fully determined behavior?

But first, why is a function continually presented with data values to map?

**Passive Symbols: Variables and Their Values**

The notion of a variable continually asserting a value is not a mathematical concept. An algebraic variable does not assert a value but is an empty place holder to be substituted with a value when the equation is used(23). The mathematician substituting the input variables of a function (filling their emptiness) is the start of a computation, and when the mathematician writes down the result value, substituting the output variable (filling its emptiness), the computation is finished.

The notion of constantly asserted data values arises from a confluence of binary representation in mathematics, the two-state representation of digital electronics, the correspondence of Boolean logic with binary switching, and the notion of state space from physics.

A binary variable has two values mapped to the two states of electronic representation, so an electronic binary variable is always asserting a data value and cannot not assert a data value. Consequently, an electronic Boolean logic switching function is continually presented with data values by its electronic variables.

In physics, a state space consists of a collection of variables that continually assert values and are continually interacting and transforming those values. Physicists imagine sampling a dynamic state space at an instant to observe a static state.

*“We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.”(24)*

The “state” is the record of progress of the computation of the universe, and state space has come to be generally regarded as the formal record of progress of any computation. Computer science tailored this dynamic notion of a state space of variables continually asserting values to allow periodic instances of sampleable stability. The values of variables transition only during a computation step, and the state space of variable values is static between steps. This allows predicting and observing the expected state after each step, which brings us to the crux of the matter: computational confidence.

**The Mutual Dependency of State Space and Sequence**

For a state space to be reliably observable, there must be intervals of stability with no transitions occurring. The notion of controlled sequence provides such intervals. Sequence, in turn, relies on intervals of stable state. Each sequential step observes a part of the state, transitions a part of the state and leaves a stable state for the next step to observe.

If sequential steps are allowed to overlap (to be concurrent), the possibility arises that a step might observe a transitioning state (a race) and, given the enormous variety of correct sequences, it becomes difficult to ensure the integrity of state observation for overlapping steps.
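The race can be made concrete with a toy example (hypothetical, not from the text): two steps each increment a shared counter, but each step is really a read of the state followed by a write. Enumerating the interleavings that preserve each step's internal order shows that a step observing a transitioning state can lose an update:

```python
from itertools import permutations

# Two steps P and Q each increment a shared counter, but each step is two
# micro-operations: read the state, then write read+1. Enumerate the
# interleavings that keep each step's own read-before-write order and
# collect the possible final counter values.
ops = ['P_read', 'P_write', 'Q_read', 'Q_write']

results = set()
for order in permutations(ops):
    if order.index('P_read') > order.index('P_write'):
        continue   # violates P's internal order
    if order.index('Q_read') > order.index('Q_write'):
        continue   # violates Q's internal order
    counter, local = 0, {}
    for op in order:
        step, kind = op.split('_')
        if kind == 'read':
            local[step] = counter        # observe the (possibly stale) state
        else:
            counter = local[step] + 1    # write based on what was observed
    results.add(counter)
print(sorted(results))   # -> [1, 2]: overlapped steps can lose an update
```

The fully sequential interleavings always yield 2; only the overlapping ones, where a step reads while the other is mid-transition, can yield 1.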

If the state space fragments, the possibility of simultaneous observation arises, leading to the possibility of overlapping steps.

If any inherently concurrent behavior does exist, such as in a Boolean logic network, it must be isolated within a sequential step bounded by a time interval, a semaphore state, a channel access and so on.

**Computational Confidence**

Predictable observability is the touchstone of human computational confidence. The computer science notion of state space, entwining time, memory and sequence control in a tangle of mutual dependencies, is all about predictable observability. Concurrency messes with this touchstone of computational confidence in ways that effectively destroy it. Hence, the fear of concurrency. Yet concurrency exists and is fundamental to computer science. Might there be another path to computational confidence that does not preclude concurrency?

Continued in part 2.

References

1. Donald E. Knuth, *Fundamental Algorithms* (Reading, Mass., Addison-Wesley, 1969) p. 1.

2. Zenon W. Pylyshyn, “Theoretical Ideas: Algorithms Automata and Cybernetics,” in *Perspectives on the Computer Revolution*, ed. by Zenon W. Pylyshyn (Englewood Cliffs, N. J., Prentice-Hall, 1970) pp. 60-68.

3. Donald E. Knuth, *Fundamental Algorithms* (Reading, Mass., Addison-Wesley, 1969) p. 4-6.

4. M.S. Carberry, H.M. Khalil, J.F. Leathrum and L.S. Levy, *Foundations of Computer Science* (Potomac, MA, Computer Science Press, 1979) p.16.

5. R.R. Korfhage, “Algorithm,” in *Encyclopedia of Computer Science*, 1st ed., eds. Anthony Ralston, Chester Meek (New York, Petrocelli/Carter, 1976) p. 49.

6. Donald E. Knuth, *Fundamental Algorithms* (Reading, Mass., Addison-Wesley, 1969) p. 5.

7. Ellis Horowitz and Sartaj Sahni, *Fundamentals of Computer Algorithms* (Potomac, Computer Science Press, MA, 1979) pp. 1-2.

8. Benjamin W. Wah, C. V. Ramamoorthy, “Theory of Algorithms and Computation Complexity with Applications to Software Design,” in *Handbook of Software Engineering*, ed. by Charles R. Vick and C. V. Ramamoorthy (New York, Van Nostrand Reinhold, 1984) p. 88.

9. Kurt Maly, Allen R. Hanson, *Fundamentals of the Computing Sciences* (Englewood Cliffs, N.J., Prentice-Hall 1978) p. 41.

10. F. S. Beckman, *Mathematical Foundations of Programming* (Reading, Mass., Addison-Wesley,1980) p. 398.

11. E. V. Krishnamurthy, *Introductory Theory of Computer Science* (New York, Springer-Verlag, 1983) p. 3.

12. E. V. Krishnamurthy, *Introductory Theory of Computer Science* (New York, Springer-Verlag, 1983) p. 3.

13. John K. Rice, John R. Rice, *Introduction to Computer Science* (New York, Holt, Rinehart and Winston, 1969) p. 47.

14. John K. Rice, John R. Rice, *Introduction to Computer Science* (New York, Holt, Rinehart and Winston, 1969) p. 49.

15. Karl M. Fant, *Computer Science Reconsidered: The Invocation Model of Process Expression*, (Hoboken, New Jersey, Wiley Interscience, 2007). p. 7

16. Harold Abelson, Gerald Jay Sussman with Julie Sussman, *Structure and Interpretation of Computer Programs* (Cambridge Ma., MIT Press, New York, McGraw-Hill, 1985) p. 295.

17. Charles Seitz ed., *Resources in Parallel and Concurrent Systems*, ACM Press, New York, 1991, introduction, page ix.

18. David L. Dill, *Trace Theory for Automatic Hierarchical Verification of Speed-Independent Circuits*, (Cambridge, MIT Press, 1989). pp. 2,3.

19. Ivan E. Sutherland and Jo Ebergen, “Computers Without Clocks,” *Scientific American*, Vol. 287, No. 2, August 2002, pp. 62-69.

20. E. W. Dijkstra, “Cooperating Sequential Processes,” in *Programming Languages*, ed. F. Genuys (New York, Academic Press, 1968) pp. 42-112.

21. C. A. R. Hoare, *Communicating Sequential Processes*, (Englewood Cliffs, New Jersey, Prentice Hall International, 1985).

22. M. Ben-Ari, *Principles of Concurrent Programming*, (Englewood Cliffs, New Jersey, Prentice Hall International, 1982). p. 18.

23. W. Gellert, H. Kustner, M. Hellwich, H. Kastner, eds., *The VNR Concise Encyclopedia of Mathematics* (New York, Van Nostrand Reinhold, 1977) p. 40.

24. Pierre Simon Laplace, *A Philosophical Essay on Probabilities*, trans. F. W. Truscott and F. L. Emory (New York, Dover, 1951) p. 4.