General-Purpose Computers use statically bound imperative commands acting on binary data. The software is defined by mutable binary data, a shared medium open to attack, and static procedural programs cannot detect mistakes in this mutable binary medium. When John von Neumann proposed this design, he ignored the scientific alternative developed by his colleague at Princeton University. Alonzo Church defined his λ-calculus to scale Alan Turing's digital computer as independent atomic functions, using a universal model of computation found in nature. Turing's imperative instructions are encapsulated as dynamically bound mathematical symbols by the other half of the Church-Turing Thesis. The λ-calculus compartmentalizes mathematics as independent digital objects, dynamically bound into a DNA-like structure by a sequence of mathematical expressions, as found on a university chalkboard.
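To make the contrast concrete, here is a minimal Haskell sketch (an illustration, not part of the historical record; the names `square` and `increment` are mine) of computation built from independent λ-abstractions: each function is a self-contained symbol, and application binds them dynamically, one expression at a time.

```haskell
-- Each function is a self-contained λ-abstraction with no shared
-- mutable state, so a mistake in one cannot corrupt another.
square :: Int -> Int
square = \x -> x * x

increment :: Int -> Int
increment = \x -> x + 1

-- Application binds arguments dynamically, expression by expression,
-- the way symbols are chained across a chalkboard.
main :: IO ()
main = print ((square . increment) 4)  -- (λx. x*x) ((λx. x+1) 4) = 25
```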
Shared binary data is endlessly stretched to virtualize industrial society without the strength to survive the attacks, mistakes, and design flaws that take place as software interacts in cyberspace. Over decades of unbelievable change, from isolated batch processing to an advanced global society, binary data has proved unreliable and unsafe. When batch-processing mainframes became the computer on a chip, networking exposed society to globally inspired attacks. Without experts to follow best practices, the blue screen of death frustrated personal computer users, and hours, days, and even weeks of hard work were lost. Next, malware tricks emerged, allowing remote hackers to prey on communities and service organizations. Private secrets held in networked systems, which concentrate personal data in large monolithic configurations, are attacked for commercial gain.

The shared binary computer assumptions made by von Neumann in WWII prove inadequate for a society digitally interconnected through cyberspace. The age of global cyber society requires fail-safe computer science for a citizen's democracy to function; the requirements of the paying passengers, the citizens, must come first. Simple binary computers based on monolithic compilations do not meet this need. The machine not only shares data, but the data is unguarded and easily corrupted. Worse yet, monolithic compilations exchange binary data that can fraudulently attack the binary conventions that differ between separate compilations. Thus the rings of privilege that separate any two compilations can be confused and attacked without detection. This is how a simple email can freeze a corporation or steal emails from a presidential campaign.
These new requirements cannot be implemented in software. Unlike the Titanic, the engine room of cyberspace must remain dry: it must be data-tight to remain media-tight. Any leak in the engine room leads to errors and consequent disaster. A monolithic physical computer lacks functional digital boundaries; when the engine room floods, everything else fails. Shared physical memory means binary data cannot be trusted. Monolithic software survived only behind locked doors, and everything changed beyond expectation with personal computing and the Internet. To sail and survive in a sea of malware and hackers, computers must evolve to support the requirements of global civilization. The single hull of the pioneering design must be enhanced to support the functional boundaries of software, and these functional boundaries are defined by the symbols on a teacher's chalkboard.
Computers began with the Abacus and the slide rule. These functional machines are explained by the λ-calculus, which defines the rules of computational binding as found in nature. When Alonzo Church tutored his graduate student Alan Turing, the λ-calculus application, the atomic form of computation, was expressed as a dynamically bound single-tape Turing machine. The binary architecture of General-Purpose Computer Science is not the same. Without the λ-calculus, von Neumann overstretched and distorted Turing's a-machine. These binary computers lack digital boundaries. Any mistake (a wrong instruction, some invalid data, or deliberate human sabotage) is immediately shared within the monolithic build. The engine room of the compilation is disabled for normal operation, and the application takes the hit.
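To show how small this atomic model of computation really is, the following Haskell sketch implements the untyped λ-calculus in a few lines. The names `Term`, `subst`, and `eval` are my own, and the substitution is capture-naive, adequate only for closed example terms; it is a teaching sketch, not a production interpreter.

```haskell
-- A minimal sketch of the untyped λ-calculus: the entire model of
-- computation is three constructors and one rewrite rule (β-reduction).
data Term = Var String | Lam String Term | App Term Term
  deriving Show

-- Capture-naive substitution, sufficient for the closed term below.
subst :: String -> Term -> Term -> Term
subst x s (Var y)   = if x == y then s else Var y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

-- β-reduction: applying an abstraction binds the argument dynamically.
eval :: Term -> Term
eval (App f a) = case eval f of
                   Lam x b -> eval (subst x (eval a) b)
                   f'      -> App f' (eval a)
eval t         = t

main :: IO ()
main = print (eval (App (Lam "x" (Var "x")) (Lam "y" (Var "y"))))
-- Lam "y" (Var "y"): the identity function returns its argument intact.
```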
The binary computer is outdated. The pioneering assumptions made for batch processing do not satisfy the needs of individuals, societies, and nations living as the paying passengers in cyberspace. The engine room of cyberspace must detect infections, prevent attacks, and operate safely. Neither fail-safe software nor obscure best practices mattered when computers were guarded by experts behind the locked doors of top-secret computer rooms. Attacks were prevented by the 'air gap' between individually locked rooms, and software problems were hidden by patches to home-grown programs. Infections could not spread beyond the locked doors of a physical room as long as the staff could be trusted.
Everything changed with the computer on a chip, quickly followed by the Internet. Costs collapsed and personal computing took off. Programming paradigms changed as off-the-shelf and cloud-based software replaced bespoke, home-grown software on private machines. Despite significant hardware savings, software costs only grow, because software compiled for a binary computer is inflexible, wedded to the past by a statically bound memory space. Complexity grows as the network expands, and software fails because the details are monolithically exposed. Binary computers expose compiled software to attack.
The functional modularity of mathematical symbols is vital for mathematics to be understood and to work reliably. As taught in school, children can perform mathematics without the mistakes that take place in a General-Purpose Computer. The Abacus and the slide rule are machines that follow the same rules learned at school. Fail-safe computers, including children, the Abacus, and the slide rule, are all functional machines. Although it was unknown at the time, both the Abacus and the slide rule use the dynamic binding rules of the λ-calculus; they apply the rules of the Church-Turing Thesis in form as well as function.
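The slide rule's dynamic binding can be sketched in a few lines of Haskell, as a worked example assuming nothing beyond the standard Prelude: multiplication is performed by adding lengths on a logarithmic scale, a pure composition of independent functions with no shared state to corrupt.

```haskell
-- A slide rule multiplies by adding logarithmic lengths:
--   a * b = exp (log a + log b)
-- Each step is an independent pure function, bound only at application.
slideRuleMultiply :: Double -> Double -> Double
slideRuleMultiply a b = exp (log a + log b)

main :: IO ()
main = print (slideRuleMultiply 3 4)
-- 12.000000000000002: correct to slide-rule precision; the tiny error
-- stays inside the function and cannot leak into any other computation.
```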
The reason binary computers fail and monolithic software is so hard to debug is that the λ-calculus is missing from the General-Purpose Computer as a functional machine. It is the binary details of General-Purpose Computer Science that interfere with and expose monolithic software compilations to attack.
The binary computer is shared without individual security. One mistake reverberates throughout the statically compiled, monolithic image and then infects other builds, both local and remote. Worse still, the system is beset by rapids that require data portage between images, because functional boundaries do not exist in von Neumann's binary computer.
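To ground the claim, here is a minimal, deliberately unsafe Haskell sketch of what a flat, shared binary memory permits. The eight-byte layout and the off-by-one mistake are hypothetical, but the point is general: nothing in the medium itself separates "instructions" from "data", so a single wrong write silently crosses the boundary.

```haskell
import Foreign.Marshal.Alloc (mallocBytes, free)
import Foreign.Ptr (Ptr)
import Foreign.Storable (pokeByteOff, peekByteOff)
import Data.Word (Word8)

main :: IO ()
main = do
  -- One flat allocation standing in for the monolithic image:
  -- bytes 0-3 play the role of "instructions", bytes 4-7 of "data".
  buf <- mallocBytes 8 :: IO (Ptr Word8)
  mapM_ (\i -> pokeByteOff buf i (0x90 :: Word8)) [0 .. 3]  -- "instructions"
  mapM_ (\i -> pokeByteOff buf i (0x41 :: Word8)) [4 .. 7]  -- "data"
  -- An off-by-one mistake: a write meant for byte 4 lands on byte 3.
  -- The flat binary medium has no functional boundary to stop it.
  pokeByteOff buf 3 (0x42 :: Word8)
  byte <- peekByteOff buf 3 :: IO Word8
  print byte  -- 66 (0x42): an "instruction" byte was silently overwritten
  free buf
```

In a functional machine, by contrast, the same mistake would be rejected at the boundary of the object, because each dynamically bound symbol owns its own namespace.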