Regulating Cyberspace


By K. J. Hamer-Hodges, FIEE

Cyberspace is the global utility of Civilization. It must be regulated to protect individuality and constitutional democracy. At present, the pilots of Cyber-society are the Cyber-titans. Their branded laws dictate the corrupt rules of Cyber-society. Instead, free and equal citizens must own their own data to make unfettered, independent choices within the law of the land. However, the titans confiscate access to private data and tilt the rules in favor of branded dictatorships.

Digital-democracy must work for individuals: citizens free to act, unhindered by the rules of self-interested Cyber-titans. This handful of global dictators rules medieval digital kingdoms with the iron glove of dictatorial global empires. Through fear of malware and hacking, they confiscate private data that rightfully belongs to individuals in return for the false promise of security. Identity theft, malware, and hacking thrive in the digital cracks of branded Cyberspace.

Titans farm society as digital peasants living in tithed hovels, indentured servants in Cyberspace. Individuality and freedom of choice are lost because private data is trapped by search engines, app stores, payment channels, data warehouses, and operating systems. Cyberspace lists like a sinking ship engulfed by criminals, commanded by paid henchmen, plagued by digital leaks, intimidated by unknown fears that serve the branded commercial interests of the Cyber-titans. 

Democracy in Cyberspace must stay true to the ideals enshrined by the constitution and consecrated by the sacred blood of legendary patriots. Freedom in the future must protect the innocent individual who has no special skills in computer science. Everyone will benefit when hidden bias, criminal intimidation, and dictatorial forms of corruption are punished. Democracy is transparent, built on human rules of law and order, and engineered for worst-case conditions.

Undetected crime and opaque, branded corruption through crafted malware and global hacking build on outdated computer science developed after WW2 for batch-processing mainframes. Software crimes and digital hacking grew as networked software products took advantage of the default architecture of isolated mainframes. In Cyberspace, computers become the first responders to program errors and crafted attacks. General-purpose computers using von Neumann's shared binary memory architecture do not meet the need because binary data meets no accepted scientific tests.

Civil society cannot be run on binary data branded by Cyber-titans like Google, Amazon, Apple, and Microsoft. Instead of branded binary data, computers must adopt the scientific model taught to children at school: free and independent individuals who use mathematics on their own data, kept under lock-and-key access controls. This is the scientific way to share in Cyberspace, as demonstrated from the Kindergarten schoolroom to the greatest university. To share a written constitution and a citizen's bill of rights through Cyberspace will be a great achievement of the 21st Century. To fail will lead to international catastrophe.

In America, explicit declarations define that power belongs to 'We the people.' But power in Cyberspace belongs to the titans. To satisfy American democracy, constitutional rights must remain certain. Every generation must fight for equality and justice, both at home and abroad, because the evil forces of human nature are reborn with each generation. Democracy is an endless fight that moved from the street into global software to return as monstrous Cyber-dictatorships.

Transparent computer science is a fight for the infallible automation of mathematics: error-free mathematics calculated from symbols on a teacher's chalkboard. Symbolic computation, infallibly automated at school, demonstrates the power of mathematical perfection from flawless expressions purged of human error. This schoolroom model of computer science is devoid of malware and hacking.

Released from the authoritarian controls needed for binary operating systems to work at all, individuals acquire infallibly automated, mathematical perfection. Individual symbols deconstruct software into protected function abstractions. Each is represented by a symbol in a scientific namespace of related function abstractions. Mathematics is computed this way at school as (a + b = c) or later at university as E = mc^2.

Computer science implemented this way uses object-oriented machine code. Infallible automation is achieved by flawless expressions of logic as a DNA tree of computations. Each mathematical namespace has hierarchical computational paths, navigated by calculations made along the way. An object-oriented namespace is modular; the atomic, molecular components are precise scientific equations. The DNA structure is computed in parallel, for example by children at school navigating the expressions on the chalkboard. This fully transparent model purges all opaque power and eradicates malware. Power is decentralized as a level playing field of individual namespace systems. Hackers and dictators can only attack themselves. Freedom exists for every individual to flawlessly automate mathematical needs of infinite complexity for any private need.

Cutting to the chase, democracy in Cyberspace depends on the infallible automation of mathematics. Like the Abacus and the slide rule, software must adopt the schoolroom model. Flawless mathematical calculations level the playing field for all. The universal computational model is the digital solution for scalable Cyber-democracy worldwide.

Nature's universal model of computation replaces the well-intentioned, best-effort design of John von Neumann after WW2. Pervasive malware, unrestrained hacking, and dictatorial architectures are unacceptable characteristics for the survival of democracy morphed into Cyberspace. The criminal framework is inadequate for a global Cyber-society to interact, grow, and prosper. Cyber-society is the future of Civilization. It must be purged of crime and dictators. The future depends on trusted software, industrial-strength software that automatically resists malware and prevents dictatorial hacks. The schoolroom model of computer science is nature's only acceptable approach.

Trusted software uses secure, hardware-protected function abstractions to enforce national standards of law and order as enshrined by the constitution in Cyberspace. Each nation can protect its own standards independently on a common scientific platform. Extending the laws of the land into Cyberspace is vital for nations and individuals to blossom as a Cyber-democracy instead of a Cyber-dictatorship. Trust depends on the full force of engineered science to enforce the schoolroom model.

Computer science changes everything. It bridges the gap between abstract ideas and physical reality. Thus, as the physical world merges with the abstractions of Artificial Intelligence and Virtual Reality through global connectivity, the laws of human interaction change. The traditional physics of communication is rewritten by the laws of dictatorial Cyber-titans. Corrupt administration will never protect nor preserve the laws that anchor each nation to Civilization.

A program implements an algorithm as a personal human expression compiled into binary data that controls a computer. These parochial predilections become a working function of a product. It is the binary interactions that define the software as a product. However, the Cyber-titans are intent on global commercial domination, where domination is the keyword. They use the unfair privileges of dictatorial operating systems to tilt activity in their favor by spying and restricting free choice. Since the beginning of Civilization, dictatorial architectures have been used to dominate society. Systemic corruption allows malware and hacking to thrive. This is antithetical to American ideals of freedom and equality, and it leads in the wrong direction.

American ideals, established after the revolution over King George, placed power with the people. Without authorization by the people, there is no centralized government power. The tokens of power are term-limited keys and official chains of office. The written constitution and the bill of rights defend these sacred ideals. For Cyber-society to succeed, Cyberspace must preserve and protect the constitution and all the laws of the land. Power belongs to the people as enshrined in the constitution, yet no such guarantee exists in Cyberspace. Human interactions must be trusted to operate under the laws of the land, but Cyberspace cannot be trusted. Deep-fakes are the latest example of why general-purpose computers cannot be trusted.

The source of this problem is meaningless binary data. Binary information is meaningless. Software preferences are subject to systemic binary corruption. Corruption is rampant because general-purpose computers see no value in binary information. Autocratic software and digital corruption are unavoidable. When, circa 1945, John von Neumann defined his architecture, still used today, his only objective was a batch process to automate table production for WW2 gunners. After seventy years of unprecedented change, binary data, adequate for the pioneers of general-purpose computers, still governs computer science, distorting life, first by subverting the laws of the land, and second by perverting the transparent course of justice. Cyber-society cannot progress this way, run by dictators using the inadequate science defined by the prior, pre-electronic age of batch processing.

Unreliable binary data and privileged, dictatorial operating systems make this progression unavoidable. The dilemma of dictatorship surfaces from the privileged operating systems, while unguarded binary data allows undetected and unpunished Cybercrimes of huge magnitude. Either of these unchecked problems is fatal to civil society. However, they are both cured by the democratic standards of law and order, enforced by equal justice under the laws of the land.

This fight for democracy in Cyberspace starts by purging the default privileges in general-purpose computers. These outdated machines remain damaged by the architecture of empires and dictators. These tilted machines are dominated by proprietary operating systems, wielding autocratic, unfair, and unguarded privileges that exist by default in the binary architecture of these outdated WW2 machines.

Unfair advantages offer unguarded criminal opportunities to the powerful and the crooked. They tilt cyberspace in favor of corruption and dictatorship. Already, Cyber-titans have emerged as medieval Barons, laws unto themselves in Cyberspace. Their parochial choices are then enforced by paid administrators, superuser henchmen who promote tribal fear of any alternative. 

The level playing field for computer science is the schoolroom model of mathematics. No default privileges exist. Mathematics is universally fair, and as taught to children and as used by scientists worldwide, there are no proprietary advantages. The best practices are all scientific. Mathematics is equally fair to one and all. In the schoolroom model, the children are the computer. They interact symbolically, following the programs the teacher writes on a chalkboard. The students infallibly automate the mathematics through the science of immutable, abstract symbols.

Each immutable symbol is flawlessly resolved by impeccable mathematical expressions. If the expression is false or the calculation wrong, the error is discovered, and the expression is corrected. The best computer, which in this model is the brilliant child, is flawlessly programmed as the teacher writes on the classroom chalkboard. The child infallibly automates an exact mathematical result.
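As a minimal sketch of this schoolroom model (the function and names below are invented for illustration, not part of any existing system), an expression such as (a + b = c) can be held as immutable symbols, resolved by substituting values, and corrected the moment a calculation fails to balance:

    # Hypothetical sketch of the schoolroom model: immutable symbols,
    # substitution of values, and on-the-spot error detection.
    from types import MappingProxyType

    def check_expression(values):
        """Verify the chalkboard expression (a + b = c) for the given values."""
        symbols = MappingProxyType(dict(values))   # immutable view of the symbols
        if symbols["a"] + symbols["b"] == symbols["c"]:
            return "expression verified"
        # A false expression or a wrong calculation is discovered immediately.
        raise ValueError(f"error detected: {symbols['a']} + {symbols['b']} != {symbols['c']}")

    print(check_expression({"a": 2, "b": 3, "c": 5}))   # flawless result
    # check_expression({"a": 2, "b": 3, "c": 6})        # raises ValueError on the spot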

This is the accepted, universal model of computation that exists in nature. It is a dynamic conjunction between science and life, exemplified by the honeybee. In a hive, all bees are programmed by the DNA from the current Queen, selected to lay eggs. The same DNA strings lead to hexagonal honeycombs and to working in roles as a community that survives for generations on end. The Queen is not dynastic or autocratic; she is chosen by the hive just to lay eggs. This role is sustained by a scent she emits. The scent is a physical token of her exclusive authority to lay eggs.

The success of this role-playing, universal computational model is demonstrated by the eternal Abacus. It is reinforced by the multifunctional slide rule. These flawless mathematical machines, like the hive, pass the tests of time. They survive generation upon generation into an endless future because, like DNA, all human error sources are detected and corrected scientifically, ranging from algorithmic errors to disruptive sabotage, whether by malware or by deliberately crafted, local and remote, human hacks.

At the peak of the Industrial Age, from 1820 to 1860, Charles Babbage proved infallible automation by applying the schoolroom model to a polynomial Difference Engine. His clockwork engine perfectly printed pages of results, as flawless tables to more than sixteen-digit accuracy. Later, his Analytical Engine defined the epitome of infallible automation for computer science.

This perfect union between software and hardware is object-oriented—a machine framework for modular software. The framework is as strong as the wood and wire of an Abacus, yet as flexible as the embedded, multifunctional scales on a slide rule. A digital function abstraction equally well implements the laws of mathematics, the laws of physics, and the laws of the land with certainty. Each abstraction is a flawless atomic machine bound objectively by Alonzo Church's λ-calculus as the universal computation model found throughout nature.

Function abstraction is the science in computer science, the junction between the two alternative visions, united by the Church-Turing Thesis. The interlocking machinery of infallible automation binds software material to hardware functions, bringing the algorithms to life using the object-oriented machine code of a Church-Turing Machine. 

The infallible automation of software is only achieved, function by function, using nature's universal computational model that navigates a DNA species defining a living, dynamic creature. Each creature is a Thread of life, from birth to death, that navigates a DNA string. Nature guarantees the free individuality and independent equality of each living creature. They all have the instinct to survive in hostile conditions. Hacking and malware exemplify human hostility in Cyberspace. Individual Threads must also survive to implement society's dreams as a democracy with freedom and justice for all.

Cyber-society is run by software dictators. Cyber-titans like Google, Amazon, Microsoft, and Apple, and computer manufacturers like Intel, AMD, NVidia, and ARM run digital dictatorships using manipulative, unreliable binary computers. The command-oriented machines are one-sided. The general-purpose software is a dangerous drug that addicts the population and sustains their monopolies. They ignore and override the laws of both science and mathematics to operate as medieval Barons ruling their own digital kingdoms. To Cyber-titans, people are tithed peasants, farmed like sheep. They destroy national standards as they strive for global supremacy through worldwide uniformity.

The Civilization of Cyberspace is a fight against dictatorship and for digital democracy. Cyber-society must draw a hard line between software development by program developers and program computation. Only one side of this line is programmed. The other side is the universal computational model, nature's perfect solution to life everlasting. A hard line separates theory from practice in a scientific computational framework first defined by Alonzo Church in 1936 as the laws of the λ-calculus. The hard line binds program to function as a digital object to keep the calculations in line and on track. Errors are detected on the spot by boundary checks, the electric fence of computer science that limits the framework of active computation.

The first mathematical framework for calculations began at the birth of civilization as the Abacus. The universal and public acceptance of the Abacus built global trade, but the Abacus only adds and subtracts. After the Scotsman John Napier published the powerful mathematical secrets of logarithms in 1614, William Oughtred invented the slide rule, circa 1625. This object-oriented machine embeds logarithmic scales onto sliding rods. The functions of the slide rule drove the Age of Enlightenment into the Industrial Age and, after WW2, took America to the moon. These computers succeed through universal acceptance, and they survive forever because they infallibly automate eternal mathematics as a flawless machine.

That flawless survival was defined in 1936 by Alonzo Church, the founder of the λ-calculus, and his doctoral student Alan Turing, the founder of the Turing-Machine. The Turing-Machine, when combined with the λ-calculus, creates a Church-Turing Machine. Like the Abacus and the slide rule, the framework of a digital computer is humanized as a digital slide rule: an easy-to-use, self-standing, symbolic machine that infallibly automates mathematical science through software function abstractions.

The Church-Turing Thesis explains this formula of function abstractions as a Church-Turing Machine. The future is controlled by citizens who unlock the access rights to private, individual, functional frameworks. By holding the keys to computational paths in Cyberspace, individuals run computer science. The Cyber-titans provide the function abstractions, but each individual is free to build their own destiny in Cyber-society, guided by scientific expressions, the same ones a teacher writes as symbols on a chalkboard. When citizens control the computations, they cannot be cut out of critical decisions like the 737 Max or overshadowed by a Cyber-titan who commandeers, steals, markets, and spies on any and all data worldwide.

The oxymoron of Virtual Reality (VR) has taken control of the 21st Century, but like any contradiction, it cannot be trusted. Deep Fakes add the mystical powers of superhuman 'Artificial-Intelligence' to further confuse the conundrums of software-driven AI-VR that will run Cyber-society into a ditch. The combination strikes the heart and mind of innocent civilians. Blindly trusting software is a conundrum for the 21st Century. How can truth be told from fiction if all of history stored in Cyberspace can change? Revising history upsets stability. The story of civilized progress must be protected to be preserved.

Consider another variation of software-defined Virtual Reality, the deadly changes made to the Boeing 737 Max. MCAS, the Maneuvering Characteristics Augmentation System, exemplifies the future to expect with general-purpose computer science. Corrupt binary data and naive programming, combined with incomplete testing and outside attacks, allow a centralized operating system to usurp critical aspects of Cyber-society. Virtualizing the life-supporting details, for example, the center of gravity of an aircraft, hides necessary changes, in this case, the engine's weight and wing-mounting position. MCAS not only blindsided the crew. It dictatorially seized control from the pilots. Helplessly condemned, they witnessed the horrific vertical crash that killed all aboard.

The cavalier treatment of AI, VR, Cybercrimes, and Cyberwar destroys the written constitution. The growing complexity of software illusions will replay the MCAS disaster repeatedly if computers do not improve. The problems of corruption start in the shifting sand of binary data. VR cannot be trusted because the binary data is tainted by a program that can neither be detected nor traced and where all power is coercive. Coercive powers allow malware and hackers to corrupt and destroy faith in the detailed workings of 21st Century Cyber-society. Trust is vital to the progress of society and the nation. When control is surrendered to unreliable, dictatorial software, even the written constitution is replaced, unknowingly and incrementally, by unreliable, coercive software.

The problem exists because binary data and the imperative binary commands of machine code lack the industrial strength of engineered disciplines in the tradition of Charles Babbage. At the height of the Industrial Age, two mechanical computers, the Difference Engine and the Analytical Engine, proved flawless computer science. Mathematics drives foolproof automation, and it drove Babbage's seminal machines. It actually began with the Abacus and later the slide rule, but the software in a general-purpose computer is not infallibly automated, and consequently, it cannot be trusted. A general-purpose computer lacks the mechanics needed for mathematical integrity. As children learn in school, the essential mechanism of infallible automation is a symbolic function abstraction. Without the essential function abstraction mechanism, binary data is openly exposed, and foolproof automation is out of reach. Corruption is confirmed by pervasive, unresolved malware and undetected human hacking.

Binary data cannot be verified or validated. The coercive power of privileged commands unavoidably and permanently stains the result of every machine code step. These unfair forces support the medieval digital dictatorships run by the Cyber-titans. This cannot be the future of Cyber-society. Society led in this wrong direction surrenders a civil democracy for corruption and tyranny. It led to death on the Boeing 737 Max; it results in massive data thefts and stolen intellectual property, as well as identity theft, malware, hacking, election manipulation, and lost Cyberwars. It could lead to a nuclear conflict since enemies like Russia, China, Iran, and North Korea are intent on America's downfall by manipulating the world through Cyberwars.

Trusted software engineered with calibrated, qualified, provable industrial strength is vital in a world driven by VR. Corrupt software, including human-inspired VR like Deep Fakes, all originates from machine code. These are the digital leaks in the basement of computer science. To detect and prevent corruption, the machine for machine code must be engineered. To guarantee each function's scientific accuracy, each part must be bound as an abstraction that hides the implementation details from corruption. Like pure mathematics, object-oriented machine code detects errors, including fraud and forgery, symbolically. Symbolic modularity prevents accidental errors as well as deliberate malware, hacking, and a lost Cyberwar. Error detection takes place dynamically, on the spot, as non-slip, fail-safe software modules touch hardware.

The skilled privileges needed by centralized operating systems make hacked or paid administration accounts ideal partners for dictators worldwide. Still, computer science, as defined scientifically by Alonzo Church and Alan Turing, predates and resolves von Neumann's carelessly shared architecture. Indeed, the Abacus, the first industrial-strength computer, dates from Babylon and the birth of Civilization, while the slide rule, circa 1625, is responsible for the Industrial Revolution, the infallible automation of factories, and the first moon landing by Apollo 11 in July 1969. In every case, integrity and industrial strength are achieved by foolproof automation, a cyclic machine's flawless operations. The engineering detail is a mathematical framework for the dynamic acts of calculating scientific expressions like (a + b = c) by substituting values for variables, as taught at school. Such scientific names exclude malware and hacking by default.

However complex and for whatever purpose, each scientific expression becomes an atomic machine, coded as digital objects in a Church-Turing Machine. This is no different from the mathematics taught to children or as mechanized by the Abacus and the slide rule. All these computers are object-oriented machines: machines that are corruption-free, without special skills, infallibly automated for civilians. The symbol names are immutable digital keys, called capability keys by Jack Dennis in 1965. Capability keys are immutable digital assets. They move the fulcrum of software from dictators to users as strictly limited access rights to Cyberspace.

Human-inspired corruption originates from both careless and crafted machine code. Human errors and deliberate misuse allow systemic malware, nefarious hackers, and phishing emails to rob individuals, pervert elections, destroy presidential candidates, and subvert digital media. Deep-Fakes, False News, digital fraud, and forged media in many forms originate as hardware meets software. Corruption stems from the execution of invalid machine instructions. Sin is rooted in the easy destruction of binary data and the silent theft of digital secrets from individuals, industry, government agencies, and even the NSA. Ransomware alone cost city governments and service providers over $140 million in the first six months of 2020.

Everything from malware to Deep Fakes gets dramatically worse with the arrival of Artificial Intelligence. The takedown of democracy is incremental, but a massive data breach can punish several hundred millions of innocent citizens at a time. Unregulated cybercrime is a social disaster, a government responsibility, and counterproductive. When hidden by unfair privileges, society and industry reach a breaking point. Cybercrime and Cyberwars are unresolved and evolving national nightmares that spill onto the streets of major US cities. As the crisis boils, traditional law enforcement fails, and extremists, aided by False News, allow Cyber-dictators to fill the void. 

National discontent is aggravated by the mysterious powers and fraudulent privileges in Cyberspace. Cult experts and dictatorial superusers run Cyber-society off the rails from inside Cyberspace. The emergence of AI as a superhuman force only increases and hastens an unavoidable consequence. Society is lost in the baffling fog of corrupt or unreliable VR. Trusting citizens are cut adrift to search alone for Lorelei in the hope of salvation. 

The problem is far more profound than computer scientists admit. The illusions of Cyberspace all depend on the integrity of binary information. Infallible automation begins and ends as binary machine code first touches digital hardware. This is where the nuts and bolts of computer science interact. The lack of clockwork integrity in binary machine code leaves society helplessly exposed to digital corruption. Digital corruption is like a leaking basement. Corruption spreads like damp, growing mildew, and damaging mold, making every room in Cyberspace, connected to the Internet, uninhabitable.

All criminal fraud and every case of deliberate forgery, including Deep Fakes, emanate from easy binary corruption and the inability to guarantee binary computations have not been manipulated. The binary data on a computational surface is where software and hardware meet. For infallible automation, these binary actions must be both data-tight and function-tight. This is the only hope for trusted software and the foolproof automation of computer science. The Church-Turing Thesis explains how data-tight and function-tight computation apply the laws of the λ-calculus as a meta-machine for modular, fail-safe software and fault-tolerant hardware. This is the industrial strength needed for the future of the nation.

General-purpose computers lack the modularity needed for trusted digital protection. Cyberspace is filled with leaking hulls of sinking ships, ships without water-tight compartments. Binary data is easily corrupted because the machine code is despotic and transmits infections instantly around the world. Theft, fraud, forgery, and spying all thrive in the basement of computer science. Trusted software is impossible, and the dictatorial architecture crashes because the citizens, the pilot, and co-pilots of Cyber-democracy are cut out of the loop in a software crisis. 

The nature of arbitrary machine code must change before digital integrity is achieved. A Church-Turing Machine that binds software to hardware using the laws of the λ-calculus solves the problem of a function-tight, data-tight basement using object-oriented machine code. Object-oriented software took off when Steve Jobs adopted Objective-C for his NeXT computer, circa 1985. However, it began in 1965 with Capability-based computer science, refined as Object-Oriented Machine (OOM) code by the PP250 in 1972. OOM encapsulated binary machine code as data-tight and function-tight digital objects. Software errors are detected by the mathematical warning from the typed interactions between software and hardware. A λ-calculus meta-machine crosschecks for dangerous actions when data is used or changed. Digital history is protected, and functionality is corroborated at every step. Any attempts by crooks, thieves, rogues, gangsters, or global enemies are detected, and fraud and forgery are prevented. When software integrity is guaranteed in computer science's binary basement, Cyber-democracy will grow and prosper forever.

The general-purpose computer was designed after WWII when VR and AI were over the horizon. Blind trust in dubious software guarantees more civilians will die, but death is only one of the dire consequences. As Cyberwars are lost, the Cyber-titans steal America's written constitution and enslave the population. Dictatorial, proprietary, cult software takes charge by the Cyber-titans' rules. Freedom is exchanged for the incomprehensible best-practices needed to survive everyday life. Justice dies as the titans' software allows criminals to escape without a trace.

General-purpose computers, filled with crime and deception, undermine America's future. Competing Cyber-titans ignore national priorities like the written constitution and three independent government branches. Instead, Apple, Google, Microsoft, and Amazon rule Cyberspace as fiefdoms to expand their global grip. Worse, these general-purpose computers allow censorship and spying on the global population. To stop cult dictators and corrupt henchmen who trade fear for obedience, individuals must run Cyberspace as a citizens' democracy. A democratic Cyber-society prevents cybercrime, keeping private data private by enforcing nature's universal computation model and returning democratic control over our collective destiny.

Nature's universal computational model is made for individual creatures, the particular instances of a species. A species is not constituted as a shared monolithic compilation. Instead, each animal has a private copy of the species blueprint as structured DNA. The DNA replaces the centralized, dictatorial operating system with modules of object-oriented machine code. The DNA defines a symbolic hierarchy of functional relationships, bound dynamically by the laws of the λ-calculus. It is a decentralized, replicated, individual approach to calculation and computation. Each individual inherits the DNA blueprint from the symbolic mathematics as a living, organic structure of types. This universal computational model is bound dynamically by the laws of the λ-calculus as incorporated in the Church-Turing Thesis (CTT).

The CTT compares two alternative computer science methods, developed in 1936 by Alonzo Church and his doctoral student Alan Turing. They are complementary ideas that exactly match in atomic form but differ in materials. When combined, modular boundaries exist that tame the dynamic forces of wild software as programmed function abstractions. A λ-calculus meta-machine navigates the DNA to bind each function abstraction, in turn, to the universal thread of computation.
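A loose illustration of this idea, with class names and a toy hierarchy invented purely for the example, shows a thread navigating a 'DNA' of named function abstractions and binding each one in turn:

    # Hypothetical sketch: a thread of computation navigates a hierarchy ("DNA")
    # of named function abstractions, binding each one in turn.
    from dataclasses import dataclass, field
    from typing import Callable, Dict

    @dataclass
    class Abstraction:
        """A named function abstraction with its child abstractions."""
        name: str
        function: Callable[[float], float]
        children: Dict[str, "Abstraction"] = field(default_factory=dict)

    def run_thread(root: Abstraction, path: list, value: float) -> float:
        """Bind and apply each abstraction along the path, one step at a time."""
        node = root
        value = node.function(value)
        for name in path:
            node = node.children[name]        # navigation is limited to the hierarchy
            value = node.function(value)      # dynamic binding of the next abstraction
        return value

    # A tiny example species: scale, then offset.
    root = Abstraction("root", lambda x: x)
    root.children["scale"] = Abstraction("scale", lambda x: x * 2)
    root.children["scale"].children["offset"] = Abstraction("offset", lambda x: x + 1)

    print(run_thread(root, ["scale", "offset"], 10))  # (10 * 2) + 1 = 21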

Operating systems replaced the λ-calculus when they took control of computers in the formative period leading to batch-processing mainframes, exemplified by the superuser invented for Unix and other time-sharing systems, as used today.

The scientific balance between hardware and software distributes power uniformly and universally through object-oriented, capability-limited machine code. Each atomic thread of computation navigates the organized function abstractions in a DNA hierarchy—a λ-calculus meta-machine chains event variables to DNA nodes, the scientific function abstractions of a species. Combined as a Church-Turing Machine (CTM), the playing field across networked Cyberspace is leveled. Monolithic, corrupt authorities evaporate, and crimes that orbit around unfair privileges disappear.

Unguarded binary data is the biggest threat to the future of the nation. Criminals and enemies use the binary sewer in general-purpose computers to infect virtualized life, causing problems that no one sees coming, that no one can prevent, and that never end. Cyberwars are lost, and the written constitution is redrafted by the invisible, uncertain, obscure, and questionable workings of privileged, globally networked software.

This is bad enough, but law and order are further distorted by enemy attacks on exposed binary data. Malware and hackers roam wild, planting ransomware as the latest form of highway robbery. Still, when artificial intelligence (A.I.) directs crimes, the sewer will overflow and flood Cyber-society. The Romans knew that plumbing is vital to a healthy community. At great expense, they flowed clean water from rivers and lakes, across unbelievable aqueducts, to cities, streets, and baths. The dirty water flowed out to farms and oceans.

Software is Cyberspace's water, but without any functional plumbing, a general-purpose computer is deadly dangerous. Alonzo Church's universal model of computation flows data between software functions with the industrial strength of mathematics to every corner of a Cyber-civilization. This mathematical plumbing starts with the capability-limited, object-oriented machine code. For example, a machine code program is protected: it is read-only binary data, while read-write binary data can change only under strictly limited, functional conditions. Malware cannot hide in a CTM because it is detected and rejected at once, red-handed. Each OOMC object is processed by type-rules expressed by the object in a Church-Turing Machine. These objects are once again function abstractions where the uppercase methods (LOAD, SAVE, CHANGE, CALL, and RETURN) are firewall instructions between software modules, as sketched after the list below.

  • λ-calculus Namespace (LOAD, Import, Release, Export, SAVE, Clean, Switch Namespace)
  • Computational Thread (Schedule, CHANGE, Halt, Kill, Synchronize)
  • Function Abstraction (CALL a λ-calculus Abstraction and RETURN Symbol Set)
  • Program (execute machine code)
  • Binary Data Words (read and/or write)
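The sketch below loosely illustrates the list above; the type table and permissions are assumptions made for this example, not the PP250's actual encoding. Each object carries a type, and the machine honors only the firewall instructions that the type permits:

    # Hypothetical sketch: typed digital objects and the firewall instructions
    # each type permits. The table is illustrative, not a real machine definition.
    from enum import Enum, auto

    class ObjType(Enum):
        NAMESPACE = auto()    # λ-calculus namespace
        THREAD = auto()       # computational thread
        ABSTRACTION = auto()  # function abstraction
        PROGRAM = auto()      # executable machine code (read-only)
        DATA = auto()         # binary data words (read and/or write)

    # Which firewall instructions each object type accepts.
    ALLOWED = {
        ObjType.NAMESPACE:   {"LOAD", "SAVE"},
        ObjType.THREAD:      {"CHANGE"},
        ObjType.ABSTRACTION: {"CALL", "RETURN"},
        ObjType.PROGRAM:     {"EXECUTE"},
        ObjType.DATA:        {"READ", "WRITE"},
    }

    def firewall(instruction: str, target: ObjType) -> None:
        """Reject, on the spot, any instruction the target's type does not permit."""
        if instruction not in ALLOWED[target]:
            raise PermissionError(f"{instruction} rejected on {target.name} object")

    firewall("CALL", ObjType.ABSTRACTION)   # permitted
    # firewall("WRITE", ObjType.PROGRAM)    # raises PermissionError: programs are read-only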

When these function-tight and data-tight objects are recognized by the machine code, they form a programmable computational framework. A λ-calculus meta-machine binds each data object to the machine to transparently limit the thread to a task. While naked binary machine code in a general-purpose computer has unlimited power, mathematical plumbing focuses a CTM on the job. It detects errors in advance of any theft or damage. A CTM fights off malware and hackers case by case, on the spot. Once the undetected infection rate is reduced to zero, the software inherits the reliability of the hardware.

Patches do not limit new problems discovered as the network and its functions evolve. A λ-calculus meta-machine from the Church-Turing Thesis provides the formal standards of plumbing that solve this problem. Patching, critical upgrades that remain uninstalled, and major updates with incompatible changes are all avoided. Indeed, all the so-called ‘best-practices’ that lead to a two-class society are removed if software types are managed by machine code. The best practices needed to sustain the proprietary systems all fall by the wayside. Until then, modern life remains plagued by infections from the open binary sewer that carries undetected and unpunished digital crimes through Cyberspace. The hidden absorption of corruption and the silent disease of malware crimes, mixed with the secret weapons of global Cyberwar, all have the unfair advantage of superuser privileges and surprise attacks.

There is no science to an open digital sewer, binary machine code has no rules, and Cyber-society cannot build high or even survive on a sinking, stinking foundation. The unavoidable and unstoppable infections threaten the health of the nation and the happiness of the citizens. It is simply unacceptable, and the government is ultimately responsible. However, neither government nor the software in Cyberspace can cope with surprise ‘zero-day’ attacks. When everything is virtualized by unreliable software, only the centralized dictators who run general-purpose computer science, the henchmen, and the criminals have the upper hand. They will end America’s experiment with a citizen’s democracy, law and order, and the written constitution. Government by ‘We the People’ and three equal branches of government will degenerate to picking from a handful of global Cyber-titans who spy on the global population while they farm them like flocks of sheep.

The plain truth, and the rock-bottom problem, is that software is the water of Cyberspace. It cannot be trusted, as a concrete image, to work as designed. The general-purpose computer was designed before malware and hacking surfaced and lacks a hygienic system to allow the world to share a global Cyberspace in which to live and work.

Every day, day after day, malware, hacking, and, most recently, ransomware mix with, contaminate, and damage civil society's workings. Attempting to stop Cybercrimes by adding software only works if the software can recognize the signs of an impending attack. Still, the software cannot see the problem of the open binary sewers ignored in every general-purpose computer's basement. No matter how much is spent, binary sewers run down every street in the global digital village.

Digital crimes stem from an information-typing problem that is only solved by hardware type recognition and dynamic binding in machine code. The general-purpose computer treats software as linear, monolithic, indistinguishable binary data. Instead, the software must be engineered as a liquid, not a solid. Without plumbing, open sewers are unavoidable. The plumbing cannot leak and must begin where it matters most, in computer science's hardware basement, on the liquid surface of computation where hardware and software fuse together.

Digital corruption spreads when information is not recognized as liquid binary data. While this ‘liquid’ has no physical structure, there is a mathematical structure. Unfortunately, this structure is obfuscated by a compiler and ignored by the general-purpose computer. This is a mistake that was recognized by the birth of object-oriented programming. A Church-Turing Machine goes to the next and ultimate step of infallibly automating these objects as mathematical symbols bound by the laws of the λ-calculus. These binary objects hide their details as digital types, linked by names in a λ-calculus namespace. The program structure is washed out by compilers but recovered by the interpretative Church-Instructions of a Church-Turing Machine. Structured object-oriented machine code navigates a directional DNA hierarchy of symbolic names that shape an application as a formal, dynamic species of type-limited software components.

The plumbing of object-oriented machine code is governed by the universal model of computation and the laws of the λ-calculus. This was side-stepped in 1945 when von Neumann turned the Turing Machine into the general-purpose computer. The atoms of mathematics are named λ-calculus variables, and their movement is governed by the binding rules of the λ-calculus. Not the laws of gravity, like water, but the force of machine code bound dynamically by a λ-calculus meta-machine. The plumbing starts with the individual Church-Instructions bound dynamically to named objects by Capability Keys instead of statically compiled into linear memory pages. When the physical nature of a general-purpose computer is decomposed into named digital objects, computations can be scattered throughout Cyberspace, broadly speaking in the same way a browser works, but now secured and protected by the capability-limited, object-oriented machine code.

Digital pandemics thrive through unchecked infection and remote interference, while the ordinary course of life is made more complicated. Computer problems hang like fear over an innocent citizen's life, but life cannot thrive in fear. For Cyber society to blossom, the software must be trusted, and the only way to measure trust is through qualified functional components. These trusted components are the digital objects of the object-oriented machine code. System reliability is achieved in-depth and in detail by the type-qualified digital boundaries of each item as enforced by a λ-calculus meta-machine. 

For trust to grow, fear must evaporate. Therefore, the statically compiled binary images of monolithic software must be replaced by the object-oriented machine code of a Church-Turing Machine. In a Church-Turing Machine, the software is engineered like water, not like concrete. When piped and focused atomically, it works like water flowing through a water mill; each function's power is maximized, precisely defined by the individual symbols written on a mathematician's chalkboard. The symbols are the atoms of software, bound together by the laws of the λ-calculus that apply digital boundary types through the digital technology called Capability-Limited Addressing.

This solution binds encapsulated digital symbols together with limited access rights defined by scientific expressions, as a balanced, testable equation in a set of related equations forming a coherent mathematical namespace. The implementation details of Capability-Limited Addressing are explained in my book about the PP250, the first Capability-Based computer that hit the market in 1972. Sadly, before the approach proved its full worth, the semiconductor computer killed all innovation. Backward compatibility became the computer industry's mantra, and since then, the binary machine, using von Neumann’s 1945 shared memory architecture, has hardly evolved. General-purpose computers are totally unready to meet the challenge of an A.I.-driven world.

For example, the PP250 addressed centralized privileges, undetected malware, and remote hacking as a fail-safe computer for secure global communications. The plumbing starts in the basement to prevent all leaks as a data-tight, function-tight software machine. It was used for military communication by the UK Army, served in the first Gulf War, and achieved decades of calibrated software reliability. The success is atomic over monolithic because the software is controlled as a liquid instead of a solid. 

The digital plumbing of a λ-calculus meta-machine in a Church-Turing computer closes the open sewers in a general-purpose computer. Infections are detected and isolated as a malware type that causes errors. The source of each error is immediately removed from the namespace, updating the reliability of the objects involved. All forms of digital corruption are healed by correcting and regenerating any unreliable items. Thus, a Church-Turing Machine is like the Abacus, the slide rule, and Charles Babbage’s two mechanical computers. They all use mathematical machine code and object-oriented technology to connect the physical world to industrial-strength mathematics. The solution of a Church-Turing Machine applies the universal model of computation to keep data private in an individual's private namespace.

In a Church-Turing Machine, Cybercrimes are caught on the spot by capability-limited addressing. Likewise, digital theft is prevented, and justice is both immediate and transparent. Suffice it to say, a crude form of binary capability is employed every time one browses the web or uses the cloud, but these software identities are not fraud- and forgery-resistant. They are strings of binary characters, easily forged or sabotaged in a clickjacking attack, or hidden in a phishing email.

Every Cybersecurity attack takes place in the gap between theory and practice. A gap exists between calibrated hardware and unreliable software. The hole in 1945 was small but is now vast, and digital crimes start, grow, and thrive in this expanding gap between good ideas and concrete implementations. The gap between theory and practice is closed by the data-tight and function-tight architecture of a Church-Turing Machine.

Cybercrime is the fastest-growing free enterprise globally, and together with the increasingly dangerous weapons of Cyberwar, it brings American freedom, equality, and justice to an end. For a Cyber-society to thrive and blossom, Cybercrime and Cyberwar must first be detected to be punished and regulated to be stopped. The law and order of a citizen’s Cyber-democracy meet the standards of the Church-Turing Thesis, not the centralized dictatorship of the Cyber-titans. Closing the gap between democratic theory and institutional practice brings civil law and order to computer science. When Capability-Limited Addressing is the essence of computer science, the object-oriented machine code stabilizes the software so that Cyberspace can be trusted. Because each symbol's reliability is calibrated at run time, the weakest link is always known and quickly addressed. When protected by digital boundaries, the characters in Cyberspace are fully plumbed as typed and bound digital objects that implement individual mathematical abstractions as calibrated, insulated mathematical (digital) items.

Like everything else in the world, software, when executed in digital form as object-oriented machine code, together with its programs and data, is subordinate to mother nature. Thus, in this digital form, the digital objects obey the laws of mathematical science. It is the object-oriented digital form of machine code in a Church-Turing Machine that recognizes the liquid form of atomic software and manages binary data as types.

Water is defined by two hydrogen atoms and one oxygen atom, known by the short code H2O. A computational atom, as determined by both Alan Turing and Alonzo Church, is a program with substituted variables. The expression can be written symbolically in many ways, but (a + b = c) is where most children start. When digital software is objectified this way, the limited-type symbols match object-oriented digital units with typed, limited access rights that are enforced individually. By implementing the digital boundaries, software obeys a λ-calculus meta-machine, and as with every other paramount concern, scientifically defined law and order are achieved. In a Church-Turing Machine, a λ-calculus meta-machine closes the open sewers and tames any wild rivers that flow through a global Cyber-society.

The plumbing laws of computer science demand function-tight, data-tight computation, as expressed by the alternative sides of the Church-Turing Thesis. Since 1936, these two alternative computational mechanisms, the Turing Machine and the λ-calculus, have been considered self-standing, diametric alternatives. This view is different. They are the two sides of a universal computation model that exists throughout the known universe as defined by nature here on earth. The atomic coin of mathematics is the symbol. The mathematical theory meets the concrete implementation on the computational surface of a computer. When a computer is networked, the laws of the Church-Turing Thesis demand that a Church-Turing Machine be used. When considered this way, the two alternatives mesh together, like clockwork, as the plumbing of a Church-Turing Machine.

The symbols of the λ-calculus exactly match the fraud- and forgery-resistant digital Capability-Keys that program the access rights of capability-based, object-oriented machine code. Now software functions compute on the high-speed rails of mathematics, precisely as taught at school. The networked rails, set in a mathematical namespace, are defined by the linkages between symbols in mathematical expressions. Events drive computation steps as the protected calculations of a single equation with locally substituted values for each symbol. The equations are chained together by Capability-Keys that define the access rights to a symbol as a directional DNA hierarchy. Every time a symbol is characterized by an equation, a linkage exists in the programmable DNA. This directional DNA hierarchy dynamically defines the application's framework as a living instance of a species of software. Each instance is a protected, anatomically living creature of typed functional symbols, the type-limited digital components of a scientifically secure, mathematical namespace.

By individually implementing each symbol in any expression as a typed, functional digital object, an object-oriented programming language engineers the laws of the λ-calculus. Dynamic binding is materialized by a λ-calculus meta-machine. The rails of computation are restricted to navigating the directional DNA hierarchy of symbols in the high-order mathematical expressions. For example, the PP250, as a telecommunication computer, could evaluate the equation:

[myCall = connectTo(myMother);] 

defined by one Church-Instruction as an indivisible (incorruptible) step of object-oriented machine code. 

This Church-Instruction explicitly uses three Capability-Keys, one for each symbolic name. Each symbol (myCall, connectTo, myMother) is a node in the directional DNA hierarchy of a namespace. The telecommunication objects are imported to support that application. Other applications can coexist in the namespace that belongs to an individual. All the individual's private data can be housed in this secure, multi-function namespace.
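A rough sketch of that single step might look like the following; the key identifiers, namespace contents, and access rights are invented for illustration and differ from the PP250's real encoding:

    # Hypothetical sketch: one Church-Instruction resolving three symbolic names
    # through Capability-Keys held in an individual's namespace.
    from dataclasses import dataclass

    @dataclass(frozen=True)             # capability keys are immutable
    class CapabilityKey:
        symbol: str                     # the symbolic name it grants access to
        rights: frozenset               # strictly limited access rights

    namespace = {
        "connectTo": (CapabilityKey("connectTo", frozenset({"CALL"})),
                      lambda callee: f"circuit to {callee}"),
        "myMother":  (CapabilityKey("myMother", frozenset({"READ"})), "+44 1234 567890"),
        "myCall":    (CapabilityKey("myCall", frozenset({"WRITE"})), None),
    }

    def church_instruction(ns, target, func, arg):
        """myCall = connectTo(myMother): one indivisible, capability-checked step."""
        f_key, f_obj = ns[func]
        a_key, a_obj = ns[arg]
        t_key, _ = ns[target]
        if "CALL" not in f_key.rights or "READ" not in a_key.rights or "WRITE" not in t_key.rights:
            raise PermissionError("capability check failed; instruction rejected")
        ns[target] = (t_key, f_obj(a_obj))   # the only state change permitted

    church_instruction(namespace, "myCall", "connectTo", "myMother")
    print(namespace["myCall"][1])            # circuit to +44 1234 567890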

Moreover, an individual can manage another λ-calculus namespace that is shared by a society of users. Some of these might implement civil applications, for example, ‘Homeland Security,’ Wall Street Corporations, or legal entities in a City. None of these functional namespaces are overshadowed by Cyber-titans. A unique namespace is the private digital shadow of any legal entity in a Cyber-society. The namespace's peculiar directional DNA hierarchy extends the legal entity privately, safely, and securely into global Cyberspace.

Each symbol supports or implements a mathematical expression that provides Cyberspace's shared, private, and civil functionality. Other namespace services are addressed by other unique Capability-Keys, the immutable symbols used to access functional digital objects, both local and remote, in the Church-Turing network called Cyberspace. The public and private functions of a namespace can all be written on a chalkboard. As first demonstrated by Babbage and later documented by Lovelace, mathematical machine code survives because it implements a virtual chalkboard of infinite knowledge.

Unlike binary data, both algorithmic and atomic security are simultaneously expressed by the chalkboard's functional symbols. The plumbing of networked Cyberspace is made both clear and complete. Each name, including its machine type and its scientific relationships, is checked and regulated by the scientific laws of λ-calculus binding. The individual Capability-Keys and the λ-calculus meta-machine infallibly automate these scientific rules. The Church-Instructions are defined dynamically by the Capability-Keys and offer functional firewalls between the object-oriented machine code objects. The Church-Instructions transparently navigate threads of computation through the namespace's symbols organized by the directional DNA hierarchy. Every digital movement in, among, and between symbols is verified and validated in advance by the capability-limited addressing that shelters the object-oriented machine code. Errors are detected on the spot. The type-safe digital boundaries prevent harm by enforcing atomic justice, equality for all.

Prosperity and security are achieved by the flawless, infallible automation of computer science as pure mathematics. By taming any wild forces of Cyberspace conflict, free progress is achieved. Civilization began when artists, scientists, and engineers worked together to build a better life. Strong walls and locked doors guarded the beautiful Ishtar Gate into Babylon. Here, the beauty of the Abacus, the first infallible machine, flawlessly simplified addition and subtraction for traders in markets throughout the world.

These computer science pioneers secured the atomic structure of functional decimal numbers as a handful of beads on each Abacus rail. The abstraction of four fingers, each representing the number 1, and a thumb representing the number 5 counts from zero to nine, while the rails can scale to any chosen size. By enforcing the liquid structure of a decimal number in a solid wooden framework, anyone learns to add and subtract without any of the special proprietary skills needed by a general-purpose computer, which lacks a computational framework that maps from theory to practice.
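As a toy illustration of that abstraction (the functions below are invented for this essay, not any abacus standard), one rail's beads map to a decimal digit and back:

    # Toy sketch: one abacus rail as four unit beads ("fingers") plus one
    # five bead ("thumb"), together representing a single decimal digit 0-9.
    def rail_to_digit(thumb: int, fingers: int) -> int:
        """Convert bead positions on one rail to its decimal digit."""
        if thumb not in (0, 1) or not 0 <= fingers <= 4:
            raise ValueError("not a legal bead pattern")   # the solid frame enforces the structure
        return 5 * thumb + fingers

    def digit_to_rail(digit: int):
        """Convert a decimal digit back into (thumb, fingers) bead positions."""
        if not 0 <= digit <= 9:
            raise ValueError("one rail holds a single decimal digit")
        return divmod(digit, 5)

    assert rail_to_digit(*digit_to_rail(7)) == 7   # thumb=1, fingers=2 -> 7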

The industrial strength of the Abacus correctly supports decimal arithmetic. The simplified machine is accepted worldwide and endures forever. The decimal structure never changes, and thus the Abacus lasts flawlessly into an endless future. The in-depth and detailed mathematical framework serves arithmetic and every member of society equally, consistently, and faithfully.

Conversely, monolithic software compiled for a general-purpose computer is proprietary. Each compilation creates a one-off binary build. Each build must be exhaustively tested, which is only dependable when devoid of any external source of corruption, something only possible for isolated mainframes. The lack of a programmable framework to keep computation on track exposes the binary details that allow criminals and hackers to attack the fluid binary image, protected as a solid, without plumbing.

Without a computational structure interred deep in the machine code, industrial-strength cannot be achieved. Without physical resistance, hackers and criminals simply reach across the digital network to raid, attack, and destroy their chosen targets. Without on-the-spot detection, criminals, hackers, and enemies stretch their attacks far and wide, using unprotected binary machine code to achieve their goal.

The cost required to defend general-purpose computers from remote attacks is never-ending, and conversely, the productivity gain in fixing the exposure is incalculable. The benefit to society and the nation accumulate forever and removes the fear hanging over life. The attempts to fix the problem with networked software always fail, cost far more than can be afforded, and cannot be sustained by innocent civilians. Furthermore, encryption, the favorite solution of the Cyber-titans, only protects data in a static state. At the same time, the fight on the surface of Cyberspace is dynamic, incompatible with encryption.  

The architectural problem with the general-purpose computer is an inadequate framework for computation. Monolithic compilations lack essential digital boundaries to detect and reject digital corruption. When code is downloaded from a tainted server, it becomes a spanner dropped into the works. It results in undetected and unconstrained errors that crash the life-supporting application of a democratic society. Attackers just take advantage of the assumption of blind trust regarding the open binary sewers in every general-purpose computer. Blind faith was only adequate after WW2 when everything, including the software, was homegrown. John von Neumann and other inventors and followers defined the cults of general-purpose computer science. 

Attacks remain undetected by a general-purpose computer for weeks, months, even years before a crime is reported, typically by the press. The offenses seldom, if ever, result in an arrest or punishment. Consider the 448-page Report on the Investigation into Russian Interference in the 2016 Presidential Election. It was released by the DOJ in April 2019, roughly four years after the attack. Special Counsel Robert S. Mueller III explains how Hillary Clinton’s election team was hacked as early as April 2015. Hundreds of thousands of documents were stolen by the GRU in Russia. Many were published on WikiLeaks. Despite this attack's scale and significance, it took years to identify, investigate, and document. It cost over thirty million dollars with no constructive end.

This is only one of many similar attacks that succeed every day. Cyber-society fails when tormented by such unresolved strife. The example proves three critical things. First, it is easy to send hundreds or thousands of ‘spearphishing’ emails to targeted individuals; after one hasty click, the malware is installed. Second, the lack of on-the-spot detection substantially escalates the consequences. Third, the time, cost, and effort to follow up and prosecute each loss are overwhelming.

Social media is filled with crimes that distort digital data to achieve some devious end. The most insidious example is the Deep Fake, which uses AI to forge and fraudulently misrepresent another person's speech and opinions, the more famous the better. Only when private data is reliably watermarked will this problem be tamed. It is another case of in-depth, detailed plumbing. The plumbing hardware of capability-limited addressing includes a watermark of origin namespace, date, and time to immediately identify fraud and forgery in Cyberspace. Rapid error detection avoids data loss and prevents any follow-on attacks using the stolen data, be they security credentials or video images.
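Purely as an illustration, the sketch below uses ordinary software, a Python function and a hash, to stand in for what the text argues belongs in the addressing hardware: a watermark bound to origin namespace, date, and time, so that a forged or altered copy fails the check on contact. The names and scheme are hypothetical.

# Hypothetical watermark bound to origin namespace, timestamp, and content.
import hashlib
from datetime import datetime, timezone

def watermark(origin, content, when):
    stamp = f"{origin}|{when.isoformat()}".encode()
    return hashlib.sha256(stamp + content).hexdigest()

def verify(origin, content, when, mark):
    return watermark(origin, content, when) == mark

issued = datetime(2020, 10, 31, tzinfo=timezone.utc)
mark = watermark("press.example/office", b"official statement", issued)

assert verify("press.example/office", b"official statement", issued, mark)    # genuine copy
assert not verify("press.example/office", b"forged statement", issued, mark)  # forgery detected on contact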

The technology of capability-limited addressing was proposed in 1965 at MIT by Jack Dennis. It was soon deployed for high-reliability global communications networks, where the PP250 used object-oriented machine code as flawlessly as Babbage’s infallible mechanical engines. The built-in type checking performed by a λ-calculus meta-machine code kept the software on track. Data-tight and function-tight error detection prevents data theft and limits internal digital corruption to one mathematical symbol at a time. Significantly, object-oriented machine code goes much further: by decoupling functionality from computation it reverses the slide into Cyber-dictatorship, and by watermark tracking of critical data it clarifies ownership to prevent fraud and forgery.
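The following sketch assumes nothing about the actual PP250 instruction set and uses hypothetical names; it only illustrates the principle in Python: every reference is an unforgeable key carrying explicit rights, so an access outside those rights is rejected on the spot instead of corrupting neighbouring state.

# Hypothetical capability key: names one object and the rights granted over it.
class Capability:
    def __init__(self, obj, rights):
        self._obj = obj
        self.rights = frozenset(rights)

    def invoke(self, right, *args):
        if right not in self.rights:
            raise PermissionError(f"capability lacks the '{right}' right")
        return getattr(self._obj, right)(*args)

class Ledger:
    def __init__(self):
        self.entries = []
    def read(self):
        return list(self.entries)
    def append(self, item):
        self.entries.append(item)

ledger = Ledger()
read_only = Capability(ledger, {"read"})
read_only.invoke("read")                    # permitted
try:
    read_only.invoke("append", "tampered")  # detected and rejected on the spot
except PermissionError as error:
    print(error)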

Capability-Keys place the immutable digital tokens of mathematical power, over both functionality and private information, into the individual user's hands. Capability-Keys thereby reverse and deconstruct the centralized architecture of the general-purpose computer. This is the framework for a citizens' democracy in Cyberspace. As the functional medium of a secure Cyber-society, it liberates every aspect of society.

Critically, three things follow. First, untrained civilians without any special skills can learn to write fail-safe, flawless software in the same way they were taught at school: programs that detect errors on contact. Second, no obscure, expert best practices are needed to write programs that remain safe; the software is flawless, infallibly automated to the standard of Charles Babbage. Third, every civil abstraction that matters to society can exist securely in Cyberspace in ways that can never be undermined or bypassed. For example, a gold chain is a badge of office that authorizes certain functions, such as those of a city mayor. The set of mayoral authorities is valid only for a term; the local community grants these time-bound powers to the abstract machine that acts for the mayor.
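To make the mayoral example concrete, here is a hedged sketch in Python, with hypothetical names rather than any mechanism from the text: the authority is issued as a time-bound token that simply stops working when the term of office ends.

# Hypothetical time-bound authority, like a mayor's chain of office.
from datetime import datetime, timezone

class TimeBoundAuthority:
    def __init__(self, holder, powers, expires):
        self.holder = holder
        self.powers = frozenset(powers)
        self.expires = expires          # end of the term of office

    def exercise(self, power, now=None):
        now = now or datetime.now(timezone.utc)
        if now >= self.expires:
            raise PermissionError("the term of office has ended")
        if power not in self.powers:
            raise PermissionError("this power was never delegated")
        return f"{self.holder} exercises '{power}'"

term_end = datetime(2024, 1, 1, tzinfo=timezone.utc)
mayor = TimeBoundAuthority("Mayor", {"sign_ordinance"}, term_end)

print(mayor.exercise("sign_ordinance", datetime(2023, 6, 1, tzinfo=timezone.utc)))  # within the term
try:
    mayor.exercise("sign_ordinance", datetime(2024, 6, 1, tzinfo=timezone.utc))     # term expired
except PermissionError as error:
    print(error)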


The price of entrusting life-supporting systems to monolithic binary builds is illustrated by the following report from The Washington Post:

NASA’s new rocket would be the most powerful ever. But it’s the software that has some officials worried.

[Illustration made available by NASA showing the Space Launch System during liftoff. (AP)]

By Christian Davenport, Oct. 31, 2020 at 7:00 a.m. EDT

NASA’s newest moon rocket is powered not only by four RS-25 engines that, combined, unleash 2 million pounds of thrust, but by two solid fuel side boosters that burn six tons of propellant a second at such enormous temperatures that during a recent test fire in the Utah desert, the flames turned sand to glass.



When it launches, NASA’s Space Launch System rocket, a towering 322-foot behemoth — taller than the Statue of Liberty — would be the most powerful rocket ever flown, eclipsing both the Saturn V that flew astronauts to the moon and SpaceX’s Falcon Heavy, which has launched commercial and national security satellites as well as founder Elon Musk’s Tesla Roadster on a trip to Mars.


But as NASA moves toward the SLS’s first flight, putting the Orion spacecraft in orbit around the moon, it’s not the rocket’s engines that concern officials but the software that will control everything the rocket does, from setting its trajectory to opening and closing individual valves.



Computing power has become as critical to rockets as the brute force that lifts them out of Earth’s atmosphere, especially rockets like the SLS, which is really an amalgamation of parts built by a variety of manufacturers: Boeing builds the rocket’s “core stage,” the main part of the vehicle. Lockheed Martin builds the Orion spacecraft. Aerojet Rocketdyne and Northrop Grumman are responsible for the RS-25 engines and the side boosters, respectively. And the United Launch Alliance handles the upper stage.


All of those components need to work together for a mission to be successful. But NASA’s Aerospace Safety Advisory Panel (ASAP) recently said it was concerned about the disjointed way the complicated system was being developed and tested.


At an ASAP meeting last month, Paul Hill, a member of the panel and a former flight and mission operations director at the agency, said the “panel has great concern about the end-to-end integrated test capability and plans, especially for flight software.”



Instead of one comprehensive avionics and software test to mimic flight, he said, there is “instead multiple and separate labs; emulators and simulations are being used to test subsets of the software.”


“As much as possible, flight systems should be developed for success, with the goal to test like you fly. In the same way that NASA’s operations teams train the way you fly and fly the way you train,” Hill said.


Also troubling to the safety panel was that NASA and its contractors appeared not to have taken “advantage of the lessons learned” from the botched flight last year of Boeing’s Starliner spacecraft, which suffered a pair of software errors that prevented it from docking with the International Space Station as planned and forced controllers to cut the mission short.


NASA has since said that it did a poor job of overseeing Boeing on the Starliner program and has since vowed to have more rigorous reviews of its work, especially its software testing.



The SLS software concerns are the latest red flags for a program that has struggled to overcome cost overruns and setbacks. A slew of government watchdog reports over the years have painted a troubling picture of mismanagement.


Three years ago, the NASA inspector general reported in an audit that NASA had spent more than $15 billion on SLS, the Orion spacecraft and their associated ground systems between 2012 and 2016. It estimated the total would reach $23 billion.


The report chided Boeing, the main contractor, which it said “consistently underestimated the scope of the work to be performed and thus the size and skills of the workforce required.”


Another report, by the Government Accountability Office last year, found that despite Boeing’s poor performance, NASA continued to pay it tens of millions of dollars in “award fees” for scoring high on evaluations.



NASA says now that the program is finally on track, with the vehicle undergoing a series of rigorous tests known as the “Green Run” at the Stennis Space Center in Mississippi that will culminate with a “hot fire” — the ignition and eight-minute burn of its engines scheduled for later this year.


Then it would be moved to the Kennedy Space Center in Florida, ahead of its first launch, scheduled for late 2021. NASA administrator Jim Bridenstine said “all of the elements that we need for a successful 2024 moon landing are underway as part of the agency’s Artemis program. And we’re moving rapidly to achieve that goal” — a dramatic White House-ordered acceleration of the original timetable that foresaw a moon landing in 2028.


For that deadline to be achieved, however, the flight software has to work perfectly. The first test is expected to come late next year, when the SLS would fly for the first time in the Artemis I mission, putting the Orion spacecraft, without any crew on board, in orbit around the moon.



“When it all comes down to it, flight software is the functional integration piece of the rocket,” Dan Mitchell, NASA’s senior technical leader for SLS avionics and software engineering, said in an interview. “The rocket doesn’t fly without flight software. The software commands all the valves and the engines. It takes readings of all the parameters inside the vehicle, the navigation and position information and uses all that information to control the flight.”


There was perhaps no better illustration of the significant role software plays in space flight, and how flaws in the coding can have severe consequences, than Starliner’s test flight.


Shortly after it reached orbit, the spacecraft, which had no astronauts on board, ran into trouble because the spacecraft’s flight computers were 11 hours off. With the spacecraft thinking it was at an entirely different point in the mission, it attempted to correct its course, burning precious fuel and forcing controllers to end the mission early without completing the main goal: docking with the International Space Station. Controllers later found another software problem that could have caused the service module to collide with the crew capsule after separation, potentially endangering astronauts, if any had been on board.



Boeing was able to diagnose the problem, send up a software fix and ultimately bring the spacecraft down safely. Later, Boeing said its testing of the software was deeply flawed, allowing the two problems to go undetected in the spacecraft’s one million lines of code. It was an admission reminiscent of the software problems that plagued its 737 Max airplane, which suffered two crashes that killed 346 people combined and remains grounded worldwide.


Boeing officials have said that during the test flight, the Starliner was pulling its time from the rocket. During testing, officials were mainly focused on making sure the two vehicles were communicating correctly, but cut short the test so that it never uncovered that the spacecraft was reading the wrong time.


If the test had continued, “we would have caught it,” John Mulholland said earlier this year, when he was the Starliner program manager for Boeing. He’s since transferred to Boeing’s space station program.



During the software test for the service module separation, Boeing didn’t use the actual hardware but rather an “emulator,” a computer system designed to mimic the service module. The problem was the emulator had the wrong thruster configuration programmed in at the time of the test, Mulholland said.


NASA officials in charge of the SLS program said they are confident the testing protocols for the SLS rocket and Orion spacecraft are far more robust. For starters, the program is set up differently. Boeing owns and operates the Starliner spacecraft and uses it to perform a service for NASA — namely flying its astronauts to the space station.


On the SLS program, by contrast, NASA owns and will operate the rocket, and is responsible for all the integrated testing.


Mitchell, the NASA senior technical leader, said the SLS team took the Starliner mishap “to heart.” As a result, they spent four days testing the various interfaces between the SLS and Orion, he said. “We methodically walked through requirement by requirement. ... It was a very, very detailed and fruitful interaction that we had across all the interfaces,” he said.



The review turned up one issue with how the rocket’s second stage interpreted data from the first stage, he said, but that “has been determined to be a benign issue” that doesn’t require any modifications at this time.


NASA pushed back on the safety panel’s findings, saying in a statement that “all software, hardware, and combination for every phase of the Artemis I mission is thoroughly tested and evaluated to ensure that it meets NASA’s strict safety requirements and is fully qualified for human spaceflight.”


The agency and its contractors are “conducting integrated end-to-end testing for the software, hardware, avionics and integrated systems needed to fly Artemis missions,” it said.


Once the vehicle is moved to the Kennedy Space Center, testing will continue with a “countdown demonstration and wet dress rehearsal [by fueling the rocket] with the rocket, spacecraft, and ground systems prior to the Artemis I launch.”


Speaking to reporters in October, John Shannon, a Boeing vice president who oversees the SLS program, said the core stage holds “the brains” of the rocket, the avionics, flight computers and “the systems to control the vehicle.”


But he said the company’s portion of software development and testing was limited to what’s called the “stage controller,” or “ground software that commands the vehicle itself.”


Shannon said the systems have been “completed, tested in integration facilities at [NASA’s] Marshall Space Flight Center. We’ve had independent verification and validation on it to show that it works well with the flight software and the stage controller software. And it’s all ready to go.”

