The Fulcrum of Cyberspace

The seminal book "Understanding Media: The Extensions of Man," written in 1964 by Marshall McLuhan (1911-1980), argues that the form of a medium changes civilization's development more than its content. Instead of focusing on the content transmitted through the media of his day, the electric light, radio, and television, one should focus on the electronic medium itself; today, that medium is cyberspace. Consider, for example, how social media warps a child's development, and how trust and lies change the way adults perceive and understand the world. Is it information content that polarized society into warring clans, or do individuals hounded by crime and disinformation look for alternatives?

Further, McLuhan explained that inventions are extensions of the body, as with the wheel, and of the mind, through media that began with speech and continued with written, printed, and transmitted words. He died before cyberspace and A.I. changed individuals, families, communities, and society, but those changes prove his ideas prescient. However, the changes caused by cyberspace running A.I. software will move further and faster than can be appreciated today. Together, they become a weapon of mass destruction, like the atomic bomb, that brings civilization to an unpleasant end.

Civilization, like freedom and democracy, depends on the support of individuals of every moral character and every shade of human color. But individuality is overwhelmed in a polarized A.I. cyber society, driven by fear of criminal gangs into the malevolent arms of industrialists and governments. Individuality is vital to preserving freedom, equality, and justice, and it must exist in cyberspace for democracy to survive. Only the power of individual freedom, equality, and justice sustains democracy and powers the progress of civilization. A polarized society lives in turmoil unless and until it is run by a dictator. Polarization is the state of play as democracy is undermined by the dictatorial binary computer.

The digital medium spreads these existential messages because power in the binary computer is centralized. Individuals are helpless, crushed by opaque complexity, fearful of undetected, unprosecuted crimes, and locked to a gang of binary dictators. They inevitably turn to tribes and cults to regain some power over the future. The resulting instability is why binary computers foster and encourage dictatorships in China, Russia, Iran, and North Korea. Individuals are further suppressed by the centralized power system, aided by technology like deepfakes, cameras everywhere, and easy cybercrime. The medium drives Big Brother's message; polarization and tribalism are early warning signs of social catastrophe.

Experts run cyberspace, toying with superuser powers that abuse individual freedom, equality, and justice: stealing away privacy, publicly sharing secrets, eavesdropping and spying, redefining history, and enslaving users to colluding industrialists, criminals, and government tyrants, the warlords who rule tribal cyberspace. McLuhan was right; the medium and the message are loud and clear, and the results will only worsen for innocent citizens and democratic society.

The binary computer is defined by industrial dictators to stay in power. Backward compatibility undermines progress, polarizing society in the process. It is the machine code that rings this message. Centralized binary computers offer individuals no power, privacy, or security to defend their independence. Everything hinges on dangerous digital sharing that began when hardware and memory systems were expensive. Centralized, shared memory is the technology of dictators and criminals. Ignoring the Church-Turing Thesis and the Lambda Calculus is an unforgivable scientific mistake that now threatens humanity's future and civilization's progress.

It is still possible to replace the binary computer if government action is taken quickly. A combination of urgent regulations and well-considered laws is needed to enforce individuality in computer science. Machine code is the fulcrum around which everything turns. Binary machine code is the lowest but the most significant of all programming languages. Every machine code is different, proving a lack of science and causing the computer's criminal unreliability. In absolute terms, machine code defines a digital computer's functionality, quality, performance, reliability, and security. This power is inherited from the hard edge of a computer using machine registers and microprogrammed control commands. Step-by-step commands expend electrical energy as programmed work. Mistakes pass unnoticed because binary commands are blindly trusted. The opportunity to try again, as learned at school, is not an option. The instructions are built for speed, but the far better scientific model of functional computation defined by the Lambda Calculus more than compensates, replacing the dangerous binary instructions with a functional fail-safe.

At this lowest level, the machine code is executed directly by the hardware. It is not a virtual machine. It is a physical computer with real-world consequences and hard limitations. Machine code is the only language a computer understands, and it is used to materialize everything in virtual reality. So, for example, special instructions are invented to support the superuser and administer page-based virtual memory. These privileged instructions were intended only for the superuser but are constantly attacked using ransomware and malware to invade the superuser's powers.

The process of creating machine code is called assembly. It involves translating imperative statements into machine code by hand or using an assembler. While binary machine code is powerful and efficient, it is also extremely dangerous and difficult to write and debug. This is where programming languages like C++, Java, and Python help, turning the shared physical complexity of the exposed binary machine into logical virtual machines that are easier to understand and program. These time-shared virtual machines share the same computer hardware but are unscientifically controlled by a dictatorial operating system working as an unfair superuser.

This proprietary approach results in many brands of computers, operating systems, unfair superusers, and digital cracks and logical gaps used by hackers and malware crisscrossing cyberspace. Branded binary computers have a recurring cost problem of endless unscientific change and vendor incompatibility. There are different clouds, and data is formatted and stored in inconvenient, problematic ways. Look back in time, almost 200 years, to when Ada Lovelace wrote the first program. Until then, calculating machines were purely mechanical, but Charles Babbage proposed a programmable alternative directly using mathematics and logic as the machine code. Ada's program is found here. This machine code is universal and never ages. Ada's code lasts forever and can stretch uniformly and globally without any of the cracks and gaps needed by criminals and dictators. A mathematical, scientific implementation of cyberspace has the same flawless ability, with individual privacy, as first taught and learned at one's own school desk.

This is the same desirable result enforced scientifically by the Lambda Calculus. It is the perfection of computer science defined by the Church-Turing Thesis, achieved by encapsulating, and thus hiding, volatile binary machine code within the functional mathematical and logical powers of the Lambda Calculus. Science simplifies, secures, and democratizes computers and programs as transparent and reusable machines for unskilled citizens. Further, the vital benefits of readability, privacy, security, functionality, productivity, and reliability add enhanced functional performance, automatically built into a simplified world without the complexity of opaque, incompatible compilers or an unfair superuser operating system. The inherent simplicity of Turing's a-machine is preserved and protected transparently by removing all the opaque baggage from decades of attempts to salvage von Neumann's shared binary architecture. Just one algorithm in one abstraction runs at a time, from a limited list of named relationships to specific digital objects, required using the need-to-know rules of top security systems.

The PP250 successfully demonstrated these improved, dependable results decades ago, in the 1970s and 80s, addressing the demanding requirements for fault-tolerant hardware and fail-safe, modular function abstractions as the first global computer architecture to enforce the laws of science. The telephone network, as a public utility, demanded fail-safe software and fault-tolerant hardware. Cyberspace should be regulated the same way, as a public utility. This single step would lead to sanity in cyberspace. ARM has seen the light, but a deeper understanding is needed to cut the cord, walk away from von Neumann's mistake, and design an updated Dream Machine.

For this, the traditional binary 'Turing commands' were augmented with six new 'Church instructions', creating a Church-Turing Machine: a computer using Lambda Calculus machine code. This machinery replaces a linear shared address space with named tokens that modularize and protect every digital boundary. Thus, the PP250 used the computational model of a Lambda Calculus Namespace with concurrent but private execution Threads; secure computational Calls to Lambda Calculus Function abstractions, with a corresponding secure Return instruction; an unlock-access-rights instruction to Load the typed binary constraints of individually approved digital objects; and finally, a Save instruction to store a freshly minted token in a list of immutable tokens when dynamically created digital objects are required.
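
As a rough illustration of how such an instruction set might look, the Python sketch below models the six Church instructions as data. Every name, operand shape, and the dotted token syntax is a hypothetical assumption for this sketch; it is not the PP250's actual binary encoding.

```python
# A hypothetical sketch of the six Church instructions described above.
# Names, operands, and token syntax are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum, auto

class ChurchOp(Enum):
    NAMESPACE = auto()  # switch to another application Namespace
    THREAD = auto()     # start or resume a private, concurrent execution Thread
    CALL = auto()       # secure call into a Function abstraction
    RETURN = auto()     # the matching secure return to the caller
    LOAD = auto()       # unlock a token's typed access rights into a register
    SAVE = auto()       # store a freshly minted token for a new digital object

@dataclass(frozen=True)
class ChurchInstruction:
    op: ChurchOp
    token: str  # an immutable named token, never a raw shared-memory address

# Example: a protected call addressed by name, not by physical location.
protected_call = ChurchInstruction(ChurchOp.CALL, "BankApp.Ledger.PostEntry")
```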

This Lambda Calculus model of computation is functional, distributed, individual, concurrent, and private, fundamentally different from the monolithic, shared, centralized architecture of a traditional von Neumann binary computer. The PP250 became a Church-Turing Machine needing no superuser or centralized operating software; it distributes power democratically, evenly, uniformly, and scientifically to each Namespace user, provided they own the key or keys to correctly unlock a Namespace. The machine code translates symbolic names into a private, limited address space, for one algorithm at a time, as Alan Turing first proposed. These computations are safe from spies, crooks, strangers, and outsiders. Trust is materialized because, as in the real world, individuals are in control. The PP250 machine code used named tokens to authorize the approved access rights to approved digital objects. A Namespace table converted the immutable tokens into typed binary access rights held in machine registers as needed.
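
A minimal sketch of such a Namespace table follows; the object types, rights, and token names are hypothetical, and the real PP250 held these rights in machine registers, not a Python dictionary.

```python
# A minimal sketch of a Namespace table translating immutable tokens into
# typed access rights, as described above. All entries are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRights:
    object_type: str   # e.g. a function or data object
    rights: frozenset  # e.g. {"execute"} or {"read"}
    base: int          # start of the object's private, limited space
    limit: int         # its size: nothing outside it is addressable

namespace_table = {
    "Ledger.PostEntry": AccessRights("function", frozenset({"execute"}), 0x4000, 0x0200),
    "Ledger.Balance": AccessRights("data", frozenset({"read"}), 0x6000, 0x0040),
}

def load(token: str) -> AccessRights:
    """LOAD: unlock a token's typed rights; an unknown token simply fails."""
    rights = namespace_table.get(token)
    if rights is None:
        raise PermissionError(f"no need-to-know for token {token!r}")
    return rights
```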

Now each sequence of binary commands is assembled as a well-ordered, programmed list of imperative 'do this and then do that' instructions, but encapsulated by the typed and dynamically bound digital limits of a hierarchical security structure, where Functions belong to named Abstractions that belong to a Namespace. Every binary program has a meaningful compound long name that can be used globally and a short name used locally, with a local memory map formalized to store the digital objects in the namespace. The named objects replace physical addressing, making the machine code easily readable, with names explaining everything programmed to happen. The machine is functional because functional programming is built in as the default hardware mechanism, as first demonstrated by Charles Babbage in the 1830s when Ada Lovelace wrote the first functional program that could pass expressions instead of fixed (binary) values.
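
The naming hierarchy itself can be sketched in a few lines; the dotted format and the example names are assumptions for illustration only.

```python
# A sketch of the compound naming hierarchy described above: a Function
# belongs to a named Abstraction that belongs to a Namespace.
def compound_name(namespace: str, abstraction: str, function: str) -> str:
    """Build the globally meaningful long name from the hierarchy."""
    return f"{namespace}.{abstraction}.{function}"

# Locally, a program uses only the short name; its memory map supplies the rest.
local_map = {"PostEntry": compound_name("BankApp", "Ledger", "PostEntry")}
assert local_map["PostEntry"] == "BankApp.Ledger.PostEntry"
```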

These binary programs are no longer hand-carved only by skilled, well-trained experts but can also be composed by trial and error, written by students and amateurs as first learned at school, using transparent symbols and clear functional meanings. While the resulting binary blocks of digital information are still stored in computer memory, their individuality, privacy, and security are ensured by the type-limited digital boundaries the PP250 enforced using Capability-Based Addressing.

Grouping software as individual, modular functions in the named abstractions of a private application namespace limits each function to the objects it requires; enforced as a digital structure of carefully linked tokens, this prevents malware interference. The function abstractions are assembled as object-oriented programs in a class hierarchy. Malware is consequently excluded; even malware downloads are digitally isolated by the hierarchy. Device driver abstractions that connect to attached physical hardware coexist within the Namespace without requiring a privileged mode of computation. A need-to-know of each compound name is the only required privilege, confirmed as program assembly occurs. So, unlike the shared arrangements of a binary computer, these digital objects are hidden as function abstractions using Alonzo Church's magical Lambda Calculus. The process of running a binary program is always constrained by the digital ironwork of immutable tokens, while the Namespace map limits access to the locally listed tokens that belong to a function abstraction.
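
The isolation can be pictured with a short sketch, assuming a hypothetical token list fixed at assembly time: a downloaded abstraction that holds no tokens can name nothing, and so can touch nothing.

```python
# A sketch of need-to-know isolation as described above: a function
# abstraction can reach only the tokens listed for it at assembly time.
# All names and table contents are illustrative assumptions.
NAMESPACE_MAP = {
    "Ledger.Balance": "typed read-only data object",
    "Ledger.PostEntry": "typed executable function object",
}

class FunctionAbstraction:
    def __init__(self, name: str, tokens: frozenset):
        self.name = name
        self.tokens = tokens  # the immutable need-to-know list

    def access(self, token: str) -> str:
        if token not in self.tokens:  # no token, no name, no access
            raise PermissionError(f"{self.name} holds no token for {token!r}")
        return NAMESPACE_MAP[token]

ledger = FunctionAbstraction("BankApp.Ledger", frozenset(NAMESPACE_MAP))
download = FunctionAbstraction("Unknown.Plugin", frozenset())  # no tokens

ledger.access("Ledger.Balance")      # allowed: the token is listed
# download.access("Ledger.Balance") # -> PermissionError: digitally isolated
```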

The Lambda machine code of a Church-Turing machine is the fulcrum around which everything turns, shaping digital technology in the form of defined software structures that exclude malware. Complex digital forms dynamically evolve incrementally in a preordained sequence that can grow and shrink but cannot change shape. When the Lambda machine blocks malware, even A.I. malware is blocked, and all that is left to deal with is traditional fraud and forgery. Lambda machine code is vital to constrain A.I. to ethical tasks and keep it from engaging in malware attacks.

Nevertheless, even if all machine instructions were fail-safe and function-safe, it would still be possible for attackers to exploit vulnerabilities in other parts of the system and find ways to circumvent security controls. This means that a comprehensive approach to cybersecurity must cover secure coding practices, fail-safe machine instructions, robust security controls, regular security testing, and continuous monitoring, all part of a broad cybersecurity strategy.

Depending on one mechanism alone is a mistake, and each mechanism must be robust, reducing the range and number of failures. A fail-safe computer is an essential part of this comprehensive cybersecurity strategy, reducing the range of potential failures and vulnerabilities and increasing the reliability and security of software, computer systems, and cyberspace.

A complete solution must include best practices, such as secure coding, regular security testing, and user education; these are further critical components of a comprehensive cybersecurity strategy. In conjunction, all mechanisms need thorough testing and validation to ensure they work as intended and do not introduce new vulnerabilities or unintended consequences.

Implementing a fail-safe computer is insufficient but is an essential first step in improving digital security by reducing the range of potential failures and vulnerabilities in computer systems. A fail-safe computer prevents and mitigates errors or failures by ensuring that critical operations and functions are executed in a fail-safe manner.

Moreover, a fail-safe computer must be carefully designed, implemented, and tested to ensure it works as intended and does not introduce new vulnerabilities or unintended consequences. To ensure their effectiveness and safety, it is essential to thoroughly test and validate the fail-safe mechanisms.

Investing time and effort to ensure the computer is fail-safe and functionally correct reduces all other cybersecurity costs. By reducing the range of failures and vulnerabilities, a fail-safe system minimizes the risk of cyber attacks, data breaches, and other security incidents that are costly in financial and reputational terms.

However, cybersecurity is an ongoing process, and the threat landscape constantly evolves. Attackers are continually developing new techniques and tools to exploit vulnerabilities in computer systems, and it is essential to stay vigilant and adapt to new threats as they emerge. Beyond the cost of the hardware, other costs include training, incident response planning, and compliance requirements; they all contribute to the overall cost of a comprehensive cybersecurity strategy that addresses all aspects of cybersecurity risk.

A functionally fail-safe design can reduce the likelihood of errors and vulnerabilities in machine code, thus reducing the need for secure coding practices, regular security testing, and user education. A fail-safe design can help ensure that the code is correct, rather than relying on developers to identify and fix errors.

Moreover, if the machine hardware works as a teacher, it can provide feedback to developers and users on how to write code securely, reducing the need for extensive user education and training. The key test is to allow criminals, amateur programmers, and children to write their own code without trauma, catastrophe, or any undetected crimes.

Even with a functionally data-tight, function-safe, application-safe, network-safe, and fail-safe design where the machine acts as a teacher, the need for regular security testing and monitoring remains. As the network expands and the software evolves, vulnerabilities change over time, and it is essential to identify and address new vulnerabilities as they emerge.

Even with a fail-safe design and a machine that works as a teacher, there is still a need for secure coding practices. While a fail-safe design can help reduce the likelihood of errors, developers must write secure code that follows best practices. This is where typed boundary checks are vital as a security control to guarantee best practices during execution.

Best secure coding practices are nominally based on industry standards, academic research, and practical experience that, when generalized, can be built into the computer hardware. For example, the hardware can include memory protection and privilege separation to help prevent vulnerabilities such as buffer overflows and privilege escalation. Furthermore, the hardware can provide feedback on how to write and use code securely. For example, the device can provide warnings or error messages when the code does not adhere to best practices, even indicating how to make it safe.
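
As a simple illustration of such a boundary check, the sketch below wraps every write in a limit test and reports the violated rule in plain language; the class and its teaching-style message are hypothetical.

```python
# A sketch of a typed boundary check acting as a security control, as
# described above. The class and the message wording are hypothetical.
class BoundedBuffer:
    """A buffer whose declared limit is checked on every access."""

    def __init__(self, limit: int):
        self.limit = limit
        self.cells = [0] * limit

    def write(self, offset: int, value: int) -> None:
        if not 0 <= offset < self.limit:
            # The machine as teacher: the error explains the rule it enforced.
            raise IndexError(
                f"offset {offset} is outside the declared limit {self.limit}; "
                "a buffer overflow was blocked at the boundary")
        self.cells[offset] = value

buffer = BoundedBuffer(16)
buffer.write(3, 42)    # within the boundary: allowed
# buffer.write(16, 7)  # -> IndexError: detected, not silently exploited
```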

It is essential that practices are built into the hardware based on scientific standards instead of branded industry standards. When the machine obeys the namespace and function abstractions of lambda calculus, there are no scientific gaps in the digital scope of cyberspace.

Lambda calculus is a mathematical model of computation proposed by Alonzo Church as a foundation for the theory of programming. It describes the behavior of programs as a theoretical model. When combined into the Church-Turing Thesis, a practical implementation is created in hardware, where the bottom-up Turing Machine acts as the lambda in the lambda calculus. Machine code errors are now limited to one function abstraction at a time.
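
A minimal evaluator makes the model concrete. The sketch below implements the three classic term forms with closures, so each abstraction sees only the environment it was given; this is a textbook construction, not the PP250 hardware.

```python
# A minimal lambda calculus evaluator: variables, abstractions, and
# applications, evaluated with closures so each function abstraction
# sees only its own environment.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def evaluate(term, env):
    if isinstance(term, Var):
        return env[term.name]   # a name resolves only if it was introduced
    if isinstance(term, Lam):
        return (term, env)      # a closure: the body plus its private world
    if isinstance(term, App):
        lam, closed_env = evaluate(term.fn, env)
        argument = evaluate(term.arg, env)
        return evaluate(lam.body, {**closed_env, lam.param: argument})
    raise TypeError(f"not a lambda term: {term!r}")

# (lambda x. x) applied to (lambda y. y) yields the identity closure.
identity = Lam("x", Var("x"))
result = evaluate(App(identity, Lam("y", Var("y"))), {})
```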

Moreover, lambda calculus changes the addressing structure of computer science from purely physical at the machine code level to logical at all higher levels. This logical system is defined by names that remain unknown secrets until an introduction takes place. The introduction is always based on a need-to-know, which ensures all code is well-structured and follows best practices for functional programming. Based on need-to-know, cybersecurity automatically prevents unwanted interference at every level in cyberspace: atomically, functionally, and between applications, subsystems, systems, networks, and humans. Now a common mechanism exists at every level of cybersecurity, comprehensive enough to facilitate network security, user security, easy education, and automated incident response plans not directly related to the lambda calculus or the binary Turing Machine.

Therefore, lambda calculus adds the tools to design and implement secure and well-structured code, essential to the broader context of cybersecurity as a comprehensive strategy addressing every risk created by cyberspace.

This includes a scientific model to reason about concurrent and distributed systems where the essential mechanisms exist in hardware as six additional lambda calculus machine instructions to change the application namespace, the computational thread, the function abstraction in a thread, and redefine the lambda machine without complexity or abstract challenges.

Furthermore, lambda calculus ensures all code is well-structured and follows best practices for concurrent and distributed programming, including solutions that sustain network security, simplifying firewalls, intrusion detection systems, secure authentication, authorization, and encryption as approved point-to-point connections.

The human factor problems can only be solved when the computer is trusted. Trust is a critical factor in cybersecurity, and trust in the computer is essential to building trust in the digital ecosystem of an A.I.-enabled cyber society. It requires secure hardware and firmware, safe software development practices, regular security testing and auditing, and other measures, all built into the computer as a future-safe cyber ship.

This cyberspace ship addresses all human factors, including guard rails that facilitate user education, awareness, and behavior, all critical to trusting the digital ecosystem. It involves digital watermarks so users can recognize and avoid fraud, forgery, phishing, and social engineering attacks, along with strong access controls and a digital security culture built on trusted computers, organizations, and communities. This demands the namespace separation between groups with divergent interests that is only found with lambda calculus.

While lambda calculus and fail-safe computing offer many advantages, expecting them to completely replace the binary computer is challenging because binary computing is cyberspace's backbone, deeply entrenched in our technology infrastructure. Thus, the pragmatic approach integrates fail-safe computing and lambda calculus into existing systems to improve their power, security, and reliability. It involves new hardware or firmware to support these technologies and building software that exploits them, as demonstrated by ARM and Morello.

At the same time, it is essential to research and develop other computing models and technologies that offer improved security and resilience. This may include approaches to distributed computing, quantum computing, or other emerging technologies that might transform the digital landscape. The way forward is to integrate these technologies into existing systems while continuing to invest in the research and development of new technologies that improve security and resilience in the digital ecosystem.