Peer-reviewed publications of David Lynn Abel

Recent peer-reviewed science journal publications, book chapters, books, lectures, and invited internet SciTopic pages by David Lynn Abel, Director of The Gene Emergence Project, relating to Primordial BioCybernetics, Protocellular Metabolomics, and Initial BioSemiotics.

(Each listing below is followed by its Abstract.)

   Abel, David Lynn. Selection in molecular evolution. Studies in History and Philosophy of Science. 2024;107:54-63.
Abstract:

Evolution requires selection. Molecular/chemical/preDarwinian evolution is no exception. One molecule must be selected over another for molecular evolution to occur and advance. Evolution, however, has no goal. The laws of physics have no utilitarian desire, intent or proficiency. Laws and constraints are blind to “usefulness.” How then were potential multi-step processes anticipated, valued and pursued by inanimate nature? Can orchestration of formal systems be physico-chemically spontaneous? The purely physico-dynamic self-ordering of Chaos Theory and irreversible non-equilibrium thermodynamic “engines of disequilibria conversion” achieve neither orchestration nor formal organization. Natural selection is a passive and after-the-fact-of-life selection. Darwinian selection reduces to the differential survival and reproduction of the fittest already-living organisms. In the case of abiogenesis, selection had to be 1) Active, 2) Pre-Function, and 3) Efficacious. Selection had to take place at the molecular level prior to the existence of non-trivial functional processes. It could not have been passive or secondary. What naturalistic mechanisms might have been at play?
 (Open full PDF of Paper in New Window)

   Abel, David Lynn. Why is Abiogenesis Such a Tough Nut to Crack? Archives of Microbiology and Immunology. 2024;8:338-364.
Abstract:

Natural science explores the roles of the four known forces of physics, statistical mechanics, mass/energy phase changes, mass transfer, and the application of the laws of physics and chemistry to most any problem. But there is one problem a purely physico-chemical approach does not and logically cannot address: abiogenesis’ pursuit and acquisition of functionality. The laws of motion do not perceive, value or pursue “usefulness.” The physics definition of “work” has absolutely nothing to do with utility. Pragmatism is not an issue in an inanimate environment. Yet, every process in life is highly functional and extremely sophisticated in its achievement of function. No basis for evolution exists yet in abiogenesis. Neither molecular stability nor mass self-replication of an RNA analog produces the slightest “biosystem,” let alone a proto-metabolism. Mere complexity doesn’t DO anything. Any hope of real advancement in abiogenesis research requires addressing the problem of an inanimate environment having valued and pursued “usefulness” and “functionality” prior to computational success (the “halting problem”). What is our naturalistic mechanism for this?
 (Open full PDF of Paper in New Window)

   Abel, David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press-Academic; 2011.
Abstract:

“The First Gene: The Birth of Programming, Messaging and Formal Control” is a peer-reviewed anthology of papers that focuses, for the first time, entirely on the following difficult scientific questions: *How did physics and chemistry write the first genetic instructions? *How could a prebiotic (pre-life, inanimate) environment consisting of nothing but chance and necessity have programmed logic gates, decision nodes, configurable-switch settings, and prescriptive information using a symbolic system of codons (three nucleotides per unit/block of code)? The codon table is formal, not physical. It has also been shown to be conceptually ideal. *How did primordial nature know how to write in redundancy codes that maximally protect information? *How did mere physics encode and decode linear digital instructions that are not determined by physical interactions? All known life is networked and cybernetic. “Cybernetics” is the study of various means of steering, organizing and controlling objects and events toward producing utility. The constraints of initial conditions and the physical laws themselves are blind and indifferent to functional success. Only controls, not constraints, steer events toward the goal of usefulness (e.g., becoming alive or staying alive). Life-origin science cannot advance until first answering these questions: *1-How does nonphysical programming arise out of physicality to then establish control over that physicality? *2-How did inanimate nature give rise to a formally-directed, linear, digital, symbol-based and cybernetic-rich life? *3-What are the necessary and sufficient conditions for turning physics and chemistry into formal controls, regulation, organization, engineering, and computational feats? “The First Gene” directly addresses these questions.
 (Book is here on Amazon Site)

   Abel, David Lynn. Primordial Prescription: The Most Plaguing Problem of Life Origin Science. New York, N.Y.: LongView Press-Academic; 2015.
Abstract:

This is the second major work by this author (The First Gene: The Birth of Programming, Messaging and Formal Control) and it addresses the most fundamental questions remaining for life origin research: How did molecular evolution generate metabolic recipe and instructions using a representational symbol system? How did prebiotic nature set all of the many configurable switch-settings to integrate so many interdependent circuits? How did inanimate nature sequence nucleotides to spell instructions to the ribosomes on how to sequence amino acids into correctly folding protein molecular machines? How did nature then code these symbol-system instructions into Hamming block codes, to reduce noise pollution in the Shannon channel? What programmed the error-detection and error-correcting software that keeps life from quickly deteriorating into non-life from so many deleterious mutations? In short, which of the four known forces of physics organized and prescribed life into existence? Was it gravity? Was it the strong or weak nuclear force? Was it the electromagnetic force? How could any combination of these natural forces and force fields program decision nodes to prescribe future utility? Why and how would a prebiotic environment value, desire or seek to generate utility? Can chance and/or necessity (law) program or prescribe sophisticated biofunction? The most plaguing problem of life origin science remains: What programmed, in a prebiotic environment, the Primordial Prescription and Processing of such sophisticated, integrated biofunction? That is the subject of this book.  (Book is here on Amazon)

   Abel, David Lynn. Constraints vs. Controls: Progressing from description to prescription in systems theory. Open Cybernetics and Systemics Journal. 2010;4:14-27.
Abstract:

The terms constraints and controls should not be used interchangeably. Constraints refer to the cause-and-effect deterministic orderliness of nature, to local initial conditions, and to the stochastic combinatorial boundaries that limit possible outcomes. Bits, bifurcation points and nodes represent “choice opportunities”, not choices. Controls require deliberate selection from among real options. Controls alone steer events toward formal pragmatic ends. Inanimacy is blind to and does not pursue utility. Constraints produce no integrative or organizational effects. Only the purposeful choice of constraints, not the constraints themselves, can generate bona fide controls. Configurable switch-settings allow the instantiation of formal choice contingency into physicality. While configurable switches are themselves physical, the setting of these switches to achieve formal function is physicodynamically indeterminate—decoupled from and incoherent with physicodynamic causation. The mental choice of tokens (physical symbol vehicles) in a material symbol system also instantiates nonphysical formal Prescriptive Information (PI) into physicality.  (Open full PDF of paper in New Window)

   Abel, David Lynn. Moving 'far from equilibrium' in a prebiotic environment: The role of Maxwell’s Demon in life origin. In: Seckbach J, Gordon R, eds. Genesis - In the Beginning: Precursors of Life, Chemical Models and Early Biological Evolution. Dordrecht: Springer; 2012:219-236.
Abstract:

Can we falsify the following null hypothesis?

    “A kinetic energy potential cannot be generated by Maxwell’s Demon from an ideal gas equilibrium without purposeful choices of when to open and close the partition’s trap door.”

If we can falsify this null hypothesis with an observable naturalistic mechanism, we have moved a long way towards modeling the spontaneous molecular evolution of life. Falsification is essential to discount teleology. But life requires a particular version of “far from equilibrium” that explains formal organization, not just physicodynamic self-ordering as seen in Prigogine’s dissipative structures. Life is controlled and regulated, not just constrained. Life follows arbitrary rules of behavior, not just invariant physical laws. To explain life’s origin and regulation naturalistically, we must first explain the more fundamental question, “How can hotter, faster moving, ideal gas molecules be dichotomized from cooler, slower moving, ideal gas molecules without the Demon’s choice contingency operating the trap door?”  (Open full PDF of Paper in New Window)
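[Editor's note: the dichotomization the abstract asks about can be made concrete with a toy numerical sketch. This is not from the paper; the particle count, speed distribution, and threshold below are invented for illustration. The only point it shows is that sorting fast from slow molecules rests on a per-encounter decision rule at the trap door.]

```python
import random

def demon_sort(speeds, threshold):
    """Toy Maxwell's-Demon gate: each particle arriving at the
    partition is admitted to the 'hot' chamber only if it is faster
    than the threshold -- a decision applied per encounter."""
    hot, cold = [], []
    for v in speeds:
        (hot if v > threshold else cold).append(v)
    return hot, cold

random.seed(1)
# Invented toy distribution of molecular speeds (arbitrary units).
speeds = [random.gauss(1.0, 0.3) for _ in range(10_000)]
hot, cold = demon_sort(speeds, threshold=1.0)

mean = lambda xs: sum(xs) / len(xs)
# The gating rule produces a temperature-like disparity between chambers.
print(mean(hot) > mean(cold))  # True
```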

   Abel, David Lynn. The capabilities of chaos and complexity. Int J Mol Sci. 2009;10 (Special Issue on Life Origin):247-291.
Abstract:

To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition, their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (nonphysical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction alone)? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over nonfunction, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, control and regulate existing systems toward greater efficiency? The question is pursued of whether there might be some yet-to-be-discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization?  (Open full PDF of Paper in New Window)

   Abel DL, Trevors JT. Self-organization vs. self-ordering events in life-origin models. Physics of Life Reviews. 2006;3:211-228.
Abstract:

Self-ordering phenomena should not be confused with self-organization. Self-ordering events occur spontaneously according to natural “law” propensities and are purely physicodynamic. Crystallization and the spontaneously forming dissipative structures of Prigogine are examples of self-ordering. Self-ordering phenomena involve no decision nodes, no dynamically-inert configurable switches, no logic gates, no steering toward algorithmic success or “computational halting”. Hypercycles, genetic and evolutionary algorithms, neural nets, and cellular automata have not been shown to self-organize spontaneously into nontrivial functions. Laws and fractals are both compression algorithms containing minimal complexity and information. Organization typically contains large quantities of prescriptive information. Prescriptive information either instructs or directly produces nontrivial optimized algorithmic function at its destination. Prescription requires choice contingency rather than chance contingency or necessity. Organization requires prescription, and is abstract, conceptual, formal, and algorithmic. Organization utilizes a sign/symbol/token system to represent many configurable switch settings. Physical switch settings allow instantiation of nonphysical selections for function into physicality. Switch settings represent choices at successive decision nodes that integrate circuits and instantiate cooperative management into conceptual physical systems. Switch positions must be freely selectable to function as logic gates. Switches must be set according to rules, not laws. Inanimacy cannot “organize” itself. Inanimacy can only self-order. “Self-organization” is without empirical and prediction-fulfilling support. No falsifiable theory of self-organization exists. “Self-organization” provides no mechanism and offers no detailed verifiable explanatory power. 
Care should be taken not to use the term “self-organization” erroneously to refer to low-informational, natural-process, self-ordering events, especially when discussing genetic information. (Open full PDF of Paper in New Window)

   Trevors JT, Abel DL. Chance and necessity do not explain the origin of life. Cell Biol Int. 2004;28(11):729-739.
Abstract:

Where and how did the complex genetic instruction set programmed into DNA come into existence? The genetic set may have arisen elsewhere and was transported to the Earth. If not, it arose on the Earth, and became the genetic code in a previous lifeless, physical-chemical world. Even if RNA or DNA were inserted into a lifeless world, they would not contain any genetic instructions unless each nucleotide selection in the sequence was programmed for function. Even then, a predetermined communication system would have had to be in place for any message to be understood at the destination. Transcription and translation would not necessarily have been needed in an RNA world. Ribozymes could have accomplished some of the simpler functions of current protein enzymes. Templating of single RNA strands followed by retemplating back to a sense strand could have occurred. But this process does not explain the derivation of “sense” in any strand. “Sense” means algorithmic function achieved through sequences of certain decision-node switch-settings. These particular primary structures determine secondary and tertiary structures. Each sequence determines minimum-free-energy folding propensities, binding site specificity, and function. Minimal metabolism would be needed for cells to be capable of growth and division. All known metabolism is cybernetic; that is, it is programmatically and algorithmically organized and controlled. (Open full PDF of Paper in New Window)

   Abel, David Lynn. The Universal Plausibility Metric (UPM) & Principle (UPP). Theoretical Biology and Medical Modelling. 2009;6(27):1-10.
Abstract:

BACKGROUND: Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, “Yes.” A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric of ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. Conclusion: No low-probability hypothetical plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).  (Open full PDF of Paper in New Window)
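[Editor's note: the ξ < 1 test can be illustrated with a minimal numeric sketch. This is not the paper's derivation: the sequence length and the universal trial bound below are placeholders invented only to show how the inequality would be applied to a chance hypothesis.]

```python
def upm(p_event: float, max_trials: float) -> float:
    """Sketch of a plausibility metric: the expected number of
    successes, given a per-trial probability and an assumed hard
    upper bound on how many trials could ever have occurred."""
    return p_event * max_trials

# Hypothetical numbers for illustration only: one specific
# 150-residue sequence over a 20-letter amino-acid alphabet,
# against a placeholder universal trial bound of 10^140.
p = 20.0 ** -150
trials = 10.0 ** 140
xi = upm(p, trials)

# UPP reading: xi < 1 means the chance hypothesis is implausible
# even granting every trial the bound allows.
print(xi < 1)  # True
```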

   Abel, David Lynn. The GS (Genetic Selection) Principle. Frontiers in Bioscience. 2009;14:2959-2969.
Abstract:

The GS (Genetic Selection) Principle states that biological selection must occur at the nucleotide-sequencing molecular-genetic level of 3'5' phosphodiester bond formation. After-the-fact differential survival and reproduction of already-living phenotypic organisms (ordinary natural selection) does not explain polynucleotide prescription and coding. All life depends upon literal genetic algorithms. Even epigenetic and “genomic” factors such as regulation by DNA methylation, histone proteins and microRNAs are ultimately instructed by prior linear digital programming. Biological control requires selection of particular configurable switch-settings to achieve potential function. This occurs largely at the level of nucleotide selection, prior to the realization of any isolated or integrated biofunction. Each selection of a nucleotide corresponds to pushing a quaternary (four-way) switch knob in one of four possible directions. Formal logic gates must be set that will only later determine folding and binding function through minimum-free-energy sinks. These sinks are determined by the primary structure of both the protein itself and the independently prescribed sequencing of chaperones. Living organisms arise only from computational halting. Fittest living organisms cannot be favored until they are first computed. The GS Principle distinguishes selection of existing function (natural selection) from selection for potential function (formal selection at decision nodes, logic gates and configurable switch-settings). (Open full PDF of Paper in New Window)

   Abel, David Lynn. Is life unique? Life. 2012;2(1):106-134.
Abstract:

Is life physicochemically unique? No. Is life unique? Yes. Life manifests innumerable formalisms that cannot be generated or explained by physicodynamics alone. Life pursues thousands of biofunctional goals, not the least of which is staying alive. Neither physicodynamics, nor evolution, pursue goals. Life is largely directed by linear digital programming and by the Prescriptive Information (PI) instantiated particularly into physicodynamically indeterminate nucleotide sequencing. Epigenomic controls only compound the sophistication of these formalisms. Life employs representationalism through the use of symbol systems. Life manifests autonomy, homeostasis far from equilibrium in the harshest of environments, positive and negative feedback mechanisms, prevention and correction of its own errors, and organization of its components into Sustained Functional Systems (SFS). Chance and necessity—heat agitation and the cause-and-effect determinism of nature’s orderliness—cannot spawn formalisms such as mathematics, language, symbol systems, coding, decoding, logic, organization (not to be confused with mere self-ordering), integration of circuits, computational success, and the pursuit of functionality. All of these characteristics of life are formal, not physical. (Open full PDF of Paper in New Window)

   Abel DL, Trevors JT. Three subsets of sequence complexity and their relevance to biopolymeric information. Theoretical Biology and Medical Modelling. 2005;2:29-45.
Abstract:

Genetic algorithms instruct sophisticated biological organization. Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC). FSC alone provides algorithmic instruction. Random and Ordered Sequence Complexities lie at opposite ends of the same bi-directional sequence complexity vector. Randomness in sequence space is defined by a lack of Kolmogorov algorithmic compressibility. A sequence is compressible because it contains redundant order and patterns. Law-like cause-and-effect determinism produces highly compressible order. Such forced ordering precludes both information retention and freedom of selection so critical to algorithmic programming and control. Functional Sequence Complexity requires this added programming dimension of uncoerced selection at successive decision nodes in the string. Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC).  (Open full PDF of Paper in New Window)
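[Editor's note: the compressibility contrast between ordered and random sequences is easy to demonstrate computationally. The sketch below uses zlib's compressed length as a crude, computable upper-bound proxy for Kolmogorov complexity — a standard substitution, since true Kolmogorov complexity is uncomputable; the alphabet and sequence length are arbitrary.]

```python
import random
import zlib

def compressed_size(seq: str) -> int:
    """Length of the zlib-compressed sequence: a rough, computable
    upper-bound stand-in for Kolmogorov complexity."""
    return len(zlib.compress(seq.encode()))

random.seed(0)
alphabet = "ACGU"
n = 4000

# Law-like repetition (OSC-like): highly compressible.
ordered = "ACGU" * (n // 4)
# Stochastic sequence (RSC-like): nearly incompressible.
rand = "".join(random.choice(alphabet) for _ in range(n))

print(compressed_size(ordered) < compressed_size(rand))  # True
```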

   Abel, David Lynn. Complexity, self-organization, and emergence at the edge of chaos in life-origin models. Journal of the Washington Academy of Sciences. 2007;93(4):1-20.
Abstract:

“Complexity,” “self-organization,” and “emergence” are terms used extensively in life-origin literature. Yet precise and quantitative definitions of these terms are sorely lacking. “Emergence at the edge of chaos” invites vivid imagination of spontaneous creativity. Unfortunately, the phrase lacks scientific substance and explanatory mechanism. We explore the meaning, role, and relationship of complexity at the edge of chaos along with self-organization. We examine their relevance to life-origin processes. The high degree of order and pattern found in “necessity” (the regularities of nature described by the “laws” of physics) greatly reduces the uncertainty and information-retaining potential of spontaneously-ordered physical matrices. No as-of-yet undiscovered law, therefore, will be able to explain the high information content of even the simplest prescriptive genome. Maximum complexity corresponds to randomness when defined from a Kolmogorov perspective. No empirical evidence exists of randomness (maximum complexity) generating a halting computational program. Neither order nor complexity is the key to function. Complexity demonstrates no ability to compute. Genetic cybernetics inspired Turing’s, von Neumann’s, and Wiener’s development of computer science. Genetic cybernetics cannot be explained by the chance and necessity of physicodynamics. Genetic algorithmic control is fundamentally formal, not physical. But like other expressions of formality, it can be instantiated into a physical matrix of retention and channel transmission using dynamically-inert configurable switches. Neither parsimonious law nor complexity can program the efficacious decision-node logic-gate settings of algorithmic organization observed in all known living organisms.  (Open full PDF of Paper in New Window)

   Abel, David Lynn. The biosemiosis of prescriptive information. Semiotica. 2009;174:1-19.
Abstract:

Exactly how do the sign/symbol/token systems of endo- and exobiosemiosis differ from those of cognitive semiosis? Do the biological messages that integrate metabolism have conceptual meaning? Semantic information has two subsets: Descriptive and Prescriptive. Prescriptive information instructs or directly produces nontrivial function. In cognitive semiosis, prescriptive information requires anticipation and “choice with intent” at bona fide decision nodes. Prescriptive information either tells us what choices to make, or it is a recordation of wise choices already made. Symbol systems allow recordation of deliberate choices and the transmission of linear digital prescriptive information. Formal symbol selection can be instantiated into physicality using physical symbol vehicles (tokens). Material symbol systems (MSS) formally assign representational meaning to physical objects. Even verbal semiosis instantiates meaning into physical sound waves using an MSS. Formal function can also be incorporated into physicality through the use of dynamically-inert (dynamically-incoherent or -decoupled) configurable switch-settings in conceptual circuits. This article examines the degree to which biosemiosis conforms to the essential formal criteria of prescriptive semiosis and cybernetic management.  (Open full PDF of Paper in New Window)

   Abel DL, Trevors JT. More than Metaphor: Genomes are Objective Sign Systems. In: Barbieri M, ed. BioSemiotic Research Trends. New York: Nova Science Publishers; 2007:1-15.
Abstract:

Genetic cybernetics preceded human consciousness in its algorithmic programming and control. Nucleic acid instructions reside in linear, resortable, digital, and unidirectionally read sign sequences. Prescriptive information instructs and manages even epigenetic factors through the production of diverse regulatory proteins and small RNAs. The “meaning” (significance) of prescriptive information is the function that information instructs or produces at its metabolic destination. Constituents of the cytoplasmic environment (e.g., chaperones, regulatory proteins, transport proteins, small RNAs) contribute to epigenetic influence. But the rigid covalently-bound sequence of these players constrains their minimum-free-energy folding space. Weaker H-bonds, charge interactions, hydrophobicities, and van der Waals forces act on completed primary structures. Nucleotide selections at each locus in the biopolymeric string correspond to algorithmic switch-settings at successive decision nodes. Nucleotide additions are configurable switches. Selection must occur at the genetic level prior to selection at the phenotypic level, in order to achieve programming of computational utility. This is called the GS Principle. Law-like cause-and-effect determinism precludes freedom of selection so critical to algorithmic control. Functional Sequence Complexity (FSC) requires this added programming dimension of freedom of selection at successive decision nodes in the string. A sign represents each genetic decision-node selection. Algorithms are processes or procedures that produce a needed result, whether it is computation or the end products of biochemical pathways. Algorithmic programming alone accounts for biological organization.  (Open full PDF of Paper in New Window)

   Abel, David Lynn. The ‘Cybernetic Cut’: Progressing from Description to Prescription in Systems Theory. The Open Cybernetics and Systemics Journal. 2008;2:252-262. Open Access.
Abstract:

Howard Pattee championed the term “epistemic cut” to describe the symbol-matter, subject-object, genotype-phenotype distinction. But the precise point of contact between logical deductive formalisms and physicality still needs elucidation. Can information be physical? How does nonphysical mind arise from physicality to then establish formal control over that physicality (e.g., engineering feats, computer science)? How did inanimate nature give rise to an algorithmically organized, semiotic and cybernetic life? Both the practice of physics and life itself require traversing not only an epistemic cut, but a Cybernetic Cut. A fundamental dichotomy of reality is delineated. The dynamics of physicality (“chance and necessity”) lie on one side. On the other side lies the ability to choose with intent what aspects of ontological being will be preferred, pursued, selected, rearranged, integrated, organized, preserved, and used (cybernetic formalism).  (Open full PDF of Paper in New Window)

   Abel, David Lynn. The Formalism > Physicality (F > P) Principle. In: Abel DL, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press-Academic, Biological Research Division; 2011:447-492.
Abstract:

The F > P Principle states that “Formalism not only describes, but preceded, prescribed, organized, and continues to govern and predict Physicality.” The F > P Principle is an axiom that defines the ontological primacy of formalism in a presumed objective reality that transcends both human epistemology, our sensation of physicality, and physicality itself. The F > P Principle works hand in hand with the Law of Physicodynamic Incompleteness, which states that physicochemical interactions are inadequate to explain the mathematical and formal nature of physical law relationships. Physicodynamics cannot generate formal processes and procedures leading to nontrivial function. Chance, necessity and mere constraints cannot steer, program or optimize algorithmic/computational success to provide desired nontrivial utility. As a major corollary, physicodynamics cannot explain or generate life. Life is invariably cybernetic. The F > P Principle denies the notion of unity of Prescriptive Information (PI) with mass/energy. The F > P Principle distinguishes instantiation of formal choices into physicality from physicality itself. The arbitrary setting of configurable switches and the selection of symbols in any Material Symbol System (MSS) is physicodynamically indeterminate— decoupled from physicochemical determinism. (Open full PDF of Paper in New Window)

   Abel, David Lynn. Is Life Reducible to Complexity? In: Palyi G, Zucchi C, Caglioti L, eds. Fundamentals of Life. Paris: Elsevier; 2002:57-72.
Abstract:

What exactly is complexity? Is complexity an adequate measure of 'genetic instructions' and 'code'? How do complex stochastic ensembles such as random biopolymers come to 'specify' function? All known life is instructed and managed by bio-information. The first step in understanding bioinformation is to enumerate the different types of complexity. Since biopolymers are linear sequences of monomers, emphasis in this chapter is placed on different types of sequence complexity. Sequence complexity can be 1) random (RSC), 2) ordered (OSC), or 3) functional (FSC). OSC is on the opposite end of the spectrum of complexity from RSC. FSC is paradoxically close to the random end of the complexity scale. FSC is the product of non-random selection pressure. FSC results from the equivalent of a succession of algorithmic decision node 'switch settings.' FSC alone instructs sophisticated metabolic function. Self-ordering processes preclude both complexity and sophisticated function. Bio-information is more than mere complexity or a decrease in comparative uncertainty in an environmental context. Life is also more than the self-replication of gibberish. Life is the 'symphony' of dynamic and highly integrated algorithmic processes which yields homeostatic metabolism, development, growth, and reproduction. Apart from our non-empirical protolife models, algorithmic processes alone produce the integrated biofunction of metabolism. All known life and artificial life are program-driven. Shannon-based 'information theory' should have been called 'signal theory.' It cannot distinguish 'meaningful' signals from gibberish. In biology, meaningful signals are metabolically functional signals. Shannon theory lacks the ability to recognize whether a sequence is truly instructional. It cannot distinguish quantitatively between introns and exons. Nucleic acid is the physical matrix of recordation of the switch settings that constitute genetic programming. 
Progress in understanding the derivation of bioinformation through natural process will come only through elucidating more detailed mechanisms of selection pressure 'choices' in biofunctional decision-node sequences. The latter is the subject of both 'BioFunction theory' and the more interdisciplinary 'instruction theory'.  (Open full PDF of Paper in New Window)

   Durston KK, Chiu DK, Abel DL, Trevors JT. Measuring the functional sequence complexity of proteins. Theoretical Biology and Medical Modelling. 2007;4:47.
Abstract:

Background: Abel and Trevors have delineated three aspects of sequence complexity, Random Sequence Complexity (RSC), Ordered Sequence Complexity (OSC) and Functional Sequence Complexity (FSC) observed in biosequences such as proteins. In this paper, we provide a method to measure functional sequence complexity. Methods and Results: We have extended Shannon uncertainty by incorporating the data variable with a functionality variable. The resulting measured unit, which we call Functional bit (Fit), is calculated from the sequence data jointly with the defined functionality variable. To demonstrate the relevance to functional bioinformatics, a method to measure functional sequence complexity was developed and applied to 35 protein families. Considerations were made in determining how the measure can be used to correlate functionality when relating to the whole molecule and submolecule. In the experiment, we show that when the proposed measure is applied to the aligned protein sequences of ubiquitin, 6 of the 7 highest value sites correlate with the binding domain. Conclusion: For future extensions, measures of functional bioinformatics may provide a means to evaluate potential evolving pathways from effects such as mutations, as well as analyzing the internal structural and functional relationships within the 3-D structure of proteins. (Open full PDF of Paper in New Window)
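The Fit measure described here admits a compact sketch. Assuming a uniform 20-amino-acid ground state, the functional uncertainty observed at each aligned site is subtracted from the ground-state uncertainty and the site values are summed; the four-sequence "family" below is hypothetical, and the published method's treatment of gaps and site weighting is omitted.

```python
import math
from collections import Counter

def site_entropy(column):
    """Shannon uncertainty of one aligned site, from observed residue frequencies."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def fits(alignment, alphabet_size=20):
    """Functional bits (Fits): sum over sites of H(ground) - H(functional site),
    with H(ground) = log2(alphabet_size) for a uniform null state."""
    h_ground = math.log2(alphabet_size)
    return sum(h_ground - site_entropy(col) for col in zip(*alignment))

# Hypothetical aligned "protein family" of four 3-residue sequences:
family = ["ACD", "ACE", "ACD", "ACE"]
# Sites 1 and 2 are invariant: each contributes log2(20) ≈ 4.32 Fits.
# Site 3 splits D/E evenly: contributes log2(20) - 1 ≈ 3.32 Fits.
print(round(fits(family), 2))  # 11.97
```

Fully conserved sites contribute the most Fits, which is how the measure highlights functionally constrained regions such as the ubiquitin binding domain mentioned above.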

   Abel, David Lynn, What is ProtoBioCybernetics? In: Abel DL, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press-Academic: Biolog. Res. Div.; 2011:1-18
Abstract:

Cybernetics addresses control rather than mere constraints. Cybernetics incorporates Prescriptive Information (PI) into various means of steering, programming, communication, instruction, integration, organization, optimization, computation and regulation to achieve formal function. “Bio” refers to life. “Proto” refers to “first.” Thus, the scientific discipline of ProtoBioCybernetics specifically explores the often-neglected derivation through “natural process” of initial control mechanisms in the very first theoretical protocell. Whether an RNA World, Peptide World, Lipid World, or other composomal Metabolism-First model of life-origin is pursued, selection for biofunction is required prior to the existence of a living organism. For gene emergence, selection for potential biofunction (programming at decision nodes, logic gates and configurable switch-settings) quickly becomes the central requirement for progress. (Open full PDF of Paper in New Window)

   Abel, David Lynn, The three fundamental categories of reality. In: Abel DL, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press-Academic: Biolog. Res. Div.; 2011:19-54
Abstract:

Contingency means that events could unfold in multiple ways in the midst of, and despite, cause-and-effect determinism. But there are two kinds of contingency: Chance and Choice/Selection. Chance and Necessity cannot explain a myriad of repeatedly observable phenomena. Sophisticated formal function invariably arises from choice contingency, not from chance contingency or law. Decision nodes, logic gates and configurable switch settings can theoretically be set randomly or by invariant law, but no nontrivial formal utility has ever been observed to arise as a result of either. Language, logic theory, mathematics, programming, computation, algorithmic optimization, and the scientific method itself all require purposeful choices at bona fide decision nodes. Unconstrained purposeful choices must be made in pursuit of any nontrivial potential function at the time each logic gate selection is made. Natural selection is always post-programming. Choice Contingency (Selection for potential (not yet existing) function, not just selection of the best already-existing function) must be included among the fundamental categories of reality along with Chance and Necessity. (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Cybernetic Cut and Configurable Switch (CS) Bridge. In: Abel DL, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic, Biol. Res. Div.; 2011:55-74
Abstract:

The Cybernetic Cut delineates perhaps the most fundamental dichotomy of reality. The Cybernetic Cut is a vast ravine. The physicodynamics of physicality (“chance and necessity”) is on one side. On the other side lies the ability to choose with intent what aspects of ontological being will be preferred, pursued, selected, rearranged, integrated, organized, preserved, and used to achieve sophisticated function and utility (cybernetic formalism). The Cybernetic Cut can be traversed across the Configurable Switch (CS) Bridge. Configurable switches are especially designed and engineered physical devices that allow instantiation of nonphysical formal programming decisions into physicality. The flow of traffic across the CS Bridge is one-way only. Physicodynamics never determines formal computational and control choices. Regulation, controls, integration, organization, computation, programming and the achievement of function or utility always emanate from the Formalism side of the Cybernetic Cut.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, What utility does order, pattern or complexity prescribe? In: Abel DL, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic, Biol. Res. Div.; 2011:75-116
Abstract:

“Order,” “pattern,” “complexity,” “self-organization,” and “emergence” are all terms used extensively in life-origin literature. Sorely lacking are precise and quantitative definitions of these terms. Vivid imagination of spontaneous creativity ensues from mystical phrases like “the adjacent possible” and “emergence at the edge of chaos.” More wish-fulfillment than healthy scientific skepticism prevails when we become enamored with such phrases. Nowhere in peer-reviewed literature is a plausible hypothetical mechanism provided, let alone any repeated empirical observations or prediction fulfillments, of bona fide spontaneous “natural process self-organization.” Supposed examples show only one of two things: 1) spontaneous physicodynamic self-ordering rather than formal organization, or 2) behind-the-scenes investigator involvement in steering experimental results toward the goal of desired results. The very experiments that were supposed to prove spontaneous self-organization only provide more evidence of the need for artificial selection. Patterns are a form of order. Neither order nor combinatorial uncertainty (complexity) demonstrates an ability to compute or produce formal utility. Physical laws describe low-informational physicodynamic self-ordering, not high-informational cybernetic and computational utility. (Open full PDF of Paper in New Window)

   Abel, David Lynn, Linear Digital Material Symbol Systems (MSS). In: Abel David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic, Biol. Res. Div.; 2011:135-160
Abstract:

Nonphysical, formal, linear digital symbol systems can be instantiated into physicality using physical symbol vehicles (tokens) in Material Symbol Systems (MSS). Genetics and genomics employ an MSS, not a two-dimensional pictorial “blueprint.” Highly functional molecular biological MSSs existed prior to human consciousness in tens of millions of species. Genetic code is conceptually ideal. Not all signals are messages. Encoding employs a conversionary algorithm to represent choices using a symbol system. Encoding/decoding is formal, not physicodynamic. Symbols must be purposefully chosen from alphabets of symbols to generate meaning, instructions, and control. Formal rules must first be generated, and then both sender and receiver must voluntarily adhere to those arbitrary rules. Neither law nor random variation of duplications can generate a meaningful/functional MSS. All known life is cybernetic (controlled, not just constrained) and semiotic (message dependent). Even protocells would require controls, biosemiosis, regulation, and an extraordinary degree of organization that mere mass/energy interactions, or chance and necessity, cannot produce. (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Genetic Selection (GS) Principle. In: Abel David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic; 2011:161-188
Abstract:

The GS (Genetic Selection) Principle states that biological selection must occur at the nucleotide-sequencing molecular-genetic level of 3'5' phosphodiester bond formation. After-the-fact differential survival and reproduction of already-programmed, already-living phenotypic organisms (natural selection) does not explain polynucleotide sequence prescription and coding. All life forms depend upon exceedingly optimized genetic algorithms. Biological control requires selection of particular physicodynamically indeterminate configurable switch settings to achieve potential function. This occurs largely at the level of nucleotide selection, prior to the realization of any isolated or integrated biofunction. Each selection of a nucleotide corresponds to a quaternary (four-way) switch setting. Formal logic gates must be set initially that will only later determine folding and binding function through minimum Gibbs free-energy sinks. The fittest living organisms cannot be favored until they are first programmed and computed. The GS Principle distinguishes selection of existing function (undirected natural selection) from selection for potential function (formal selection at decision nodes, logic gates and configurable switch-settings). (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Birth of Protocells. In: Abel David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic, Biol. Res. Div.; 2011:189-230
Abstract:

Could a composome, chemoton, or RNA vesicular protocell come to life in the absence of formal instructions, controls and regulation? Redundant, low-informational self-ordering is not organization. Organization must be programmed. Intertwined circular constraints (e.g. complex hypercycles), even with negative and positive feedback, do not steer physicochemical reactions toward formal function or metabolic success. Complex hypercycles quickly and selfishly exhaust sequence and other phase spaces of potential metabolic resources. Unwanted cross-reactions are invariably ignored in these celebrated models. Formal rules pertain to uncoerced (physicodynamically indeterminate) voluntary behavior. Laws describe and predict invariant physicodynamic interactions. Constraints and laws cannot program or steer physicality towards conceptual organization, computational success, pragmatic benefit, the goal of integrated holistic metabolism, or life. The formal controls and regulation observed in molecular biology are unique. Only constraints, not controls, are found in the inanimate physical world. Cybernetics should be the cornerstone of any definition of life. All known life utilizes a mutable linear digital material symbol system (MSS) to represent and record programming decisions made in advance of any selectable phenotypic fitness. This fact is not undone by additional epigenetic formal controls and multi-layered Prescriptive Information (PI) instantiated into diverse molecular devices and machines. (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Universal Plausibility Metric and Principle. In: Abel David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control. New York, N.Y.: LongView Press--Academic; 2011:305-324
Abstract:

Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, “Yes.” A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric of ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. No low-probability hypothetical plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).  (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Formalism > Physicality (F > P) Principle. Scirus Sci-Topic Paper 2011
Abstract:

The F > P Principle states that “Formalism not only describes, but preceded, prescribed, organized, and continues to govern and predict Physicality.” The F > P Principle is an axiom that defines the ontological primacy of formalism in a presumed objective reality that transcends human epistemology, our sensation of physicality, and physicality itself. Formalism set in motion and controls physicality.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, Examining specific life-origin models for plausibility. In: Abel David Lynn, ed. The First Gene: The Birth of Programming, Messaging and Formal Control: LongView Press Academic; 2011:231-272
Abstract:

All models of life-origin, whether Protometabolism-First or pre-RNA/RNA World early informational self-replicative models, encounter the same dead-end: no naturalistic mechanism exists to steer objects and events toward eventual functionality. No insight, motive, foresight or impetus exists to integrate physicochemical reactions into a cooperative, organized, pragmatic effort. Inanimate nature cannot pursue the goal of homeostasis; it cannot scheme to locally and temporarily circumvent the 2nd Law. This deadlock affects all naturalistic models involving hypercycles, composomes and chemotons. It precludes all spontaneous geochemical, hydrothermal, eutectic, and photochemical scenarios. It affects the Lipid, Peptide and Zinc World models. It pertains to Co-evolution and all other code-origin models. No plausible hypothetical scenario exists that can convert chance and/or necessity into an organized protometabolic scheme. In this paper the general principles of previous chapters are applied to the best specific models of life origin in the literature. Tibor Ganti’s chemoton model and the pre-RNA and RNA World models receive more attention, as they are the best-developed and most preferred scenarios.  (Open full PDF of Paper in New Window)

   D'onofrio DJ, Abel DL, Redundancy of the genetic code enables translational pausing. Frontiers in Genetics. 2014;5:140
Abstract:

The codon redundancy (“degeneracy”) found in protein-coding regions of mRNA also prescribes Translational Pausing (TP). When coupled with the appropriate interpreters, multiple meanings and functions are programmed into the same sequence of configurable switch-settings. This additional layer of Ontological Prescriptive Information (PIo) purposely slows or speeds up the translation-decoding process within the ribosome. Variable translation rates help prescribe functional folding of the nascent protein. Redundancy of the codon to amino acid mapping, therefore, is anything but superfluous or degenerate. Redundancy programming allows for simultaneous dual prescriptions of TP and amino acid assignments without cross-talk. This allows both functions to be coincident and realizable. We will demonstrate that the TP schema is a bona fide rule-based code, conforming to logical code-like properties. Second, we will demonstrate that this TP code is programmed into the supposedly degenerate redundancy of the codon table. We will show that algorithmic processes play a dominant role in the realization of this multi-dimensional code. (Open full PDF of Paper in New Window)
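The codon redundancy the paper builds on can be tabulated directly. The sketch below encodes the standard genetic code (DNA codons, translation table 1) and counts synonymous codons per amino acid; it illustrates the degeneracy itself, not the TP code the authors argue is layered onto it.

```python
# Standard genetic code (DNA codons, translation table 1), grouped by amino acid.
CODONS = {
    "Phe": ["TTT", "TTC"],
    "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "Ile": ["ATT", "ATC", "ATA"],
    "Met": ["ATG"],
    "Val": ["GTT", "GTC", "GTA", "GTG"],
    "Ser": ["TCT", "TCC", "TCA", "TCG", "AGT", "AGC"],
    "Pro": ["CCT", "CCC", "CCA", "CCG"],
    "Thr": ["ACT", "ACC", "ACA", "ACG"],
    "Ala": ["GCT", "GCC", "GCA", "GCG"],
    "Tyr": ["TAT", "TAC"],
    "His": ["CAT", "CAC"],
    "Gln": ["CAA", "CAG"],
    "Asn": ["AAT", "AAC"],
    "Lys": ["AAA", "AAG"],
    "Asp": ["GAT", "GAC"],
    "Glu": ["GAA", "GAG"],
    "Cys": ["TGT", "TGC"],
    "Trp": ["TGG"],
    "Arg": ["CGT", "CGC", "CGA", "CGG", "AGA", "AGG"],
    "Gly": ["GGT", "GGC", "GGA", "GGG"],
    "Stop": ["TAA", "TAG", "TGA"],
}

# Degeneracy: how many synonymous codons map to each amino acid.
degeneracy = {aa: len(codons) for aa, codons in CODONS.items()}

assert sum(degeneracy.values()) == 64  # all 64 codons accounted for
print(degeneracy["Leu"], degeneracy["Ser"], degeneracy["Met"])  # 6 6 1
```

Six-fold degenerate residues such as leucine, serine, and arginine leave the most room in codon choice for a second, simultaneous prescription of the kind the paper describes.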

   D'Onofrio DJ, Abel DL, Johnson DE. Dichotomy in the definition of prescriptive information suggests both prescribed data and prescribed algorithms: biosemiotics applications in genomic systems. Theor Biol Med Model. 2012;9(1):8
Abstract:

The fields of molecular biology and computer science have cooperated over recent years to create a synergy between the cybernetic and biosemiotic relationship found in cellular genomics to that of information and language found in computational systems. Biological information frequently manifests its “meaning” through instruction or actual production of formal bio-function. Such information is called Prescriptive Information (PI). PI programs organize and execute a prescribed set of choices. Closer examination of this term in cellular systems has led to a dichotomy in its definition suggesting both prescribed data and prescribed algorithms are constituents of PI. This paper looks at this dichotomy as expressed in both the genetic code and in the central dogma of protein synthesis. An example of a genetic algorithm is modeled after the ribosome, and an examination of the protein synthesis process is used to differentiate PI data from PI algorithms.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Cybernetic Cut and Configurable Switch (CS) Bridge. Scirus Sci-Topic Page 2012
Abstract:

The Cybernetic Cut [1,2] delineates one of the most fundamental dichotomies of reality. Physicodynamics (physicality: Jacques Monod’s “chance and necessity;” mass/energy interactions alone) lies on one side of a great divide. On the other side lies formalism—the abstract, conceptual, nonphysical ability to choose with intent what aspects of ontological being will be preferred, pursued, selected, rearranged, integrated, measured, calculated, computed, and organized into pragmatic utility.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, Prescriptive Information (PI) [Scirus SciTopic Page]. 2009;
Abstract:

Semantic (meaningful) information has two subsets: Descriptive and Prescriptive. Prescriptive Information (PI) instructs and programs. When processed, PI is used to produce nontrivial formal function.1 Merely describing a computer chip does not prescribe or produce that chip. Thus mere description needs to be dichotomized from prescription.
Computationally halting cybernetic programs and linguistic instructions are examples of Prescriptive Information. “Prescriptive Information (PI) either tells us what choices to make, or it is a recordation of wise choices already made.” (Open full PDF of Paper in New Window)

   Abel, David Lynn, The Genetic Selection (GS) Principle [Scirus SciTopic Page]. 2009;
Abstract:

The Genetic Selection (GS) Principle states that selection must occur at the molecular/genetic level, not just at the fittest phenotypic/organismic level, to produce and explain life. In other words, selection for potential biofunction must occur upon formation of the rigid phosphodiester bonds in DNA and RNA sequences. This is the point at which functional linear digital polynucleotide syntax is prescribed. The selection of each nucleotide out of a phase space of four options constitutes the setting of a quaternary (four-way) configurable switch. The specific setting of these configurable switches in nucleic acid primary structure (monomeric sequencing) determines not only amino acid sequencing in protein primary structure, but also translational pausing (TP). TP in turn determines how translated biopolymeric strings will fold into three-dimensional molecular machines.  (Open full PDF of Paper in New Window)
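The quaternary-switch arithmetic in this abstract is straightforward to make explicit. Assuming nothing beyond the four-nucleotide alphabet (the 100-nucleotide strand length below is purely illustrative):

```python
import math

# Each nucleotide selection picks one of four options: a quaternary switch
# worth log2(4) = 2 bits per setting.
bits_per_nucleotide = math.log2(4)

# A hypothetical 100-nucleotide strand spans a phase space of 4**100
# possible sequences -- equivalently 200 binary switch settings.
n = 100
phase_space = 4 ** n

print(bits_per_nucleotide)          # 2.0
print(phase_space == 2 ** (2 * n))  # True
```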

   Abel, David Lynn, The Universal Plausibility Metric (UPM) & Principle (UPP) [Scirus SciTopic Page]. 2010;
Abstract:

The Universal Plausibility Metric1-3 is an objective quantification of the plausibility of extremely low-probability chance hypotheses, models, theories and scenarios. The fact that a possibility has an extremely low probability of occurrence does not necessarily establish its implausibility. Mere possibility, on the other hand, is not an adequate basis for asserting scientific plausibility. Thus a method of objectively measuring the plausibility of any improbable hypothesis is needed. This is provided by The Universal Plausibility Metric (UPM: ξ [xi, pronounced “zai” in American English, “sai” in UK English]). (Open full PDF of Paper in New Window)
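The metric described here reduces to a simple inequality check. In the sketch below, xi is the single-trial probability p scaled by the available probabilistic resources Omega; the Omega value used is purely illustrative, not the figure derived in the paper.

```python
def upm(p, omega):
    """Universal Plausibility Metric: xi = omega * p."""
    return omega * p

def upp_falsified(p, omega):
    """Universal Plausibility Principle: the chance hypothesis is
    operationally falsified when xi < 1."""
    return upm(p, omega) < 1

omega_illustrative = 10 ** 140  # assumed probabilistic-resource bound (illustrative)

print(upp_falsified(1e-150, omega_illustrative))  # True:  xi = 1e-10 < 1
print(upp_falsified(1e-130, omega_illustrative))  # False: xi = 1e+10 >= 1
```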

   Abel, David Lynn, The Law of Physicodynamic Incompleteness [Scirus SciTopics Page]. 2010;
Abstract:

Many versions of a certain null hypothesis have been published in peer-reviewed scientific literature over the last fifteen years. The world’s scientific community repeatedly has been invited to falsify this null hypothesis.
“If decision-node programming selections are made randomly or by law (chance and/or necessity), rather than with purposeful intent, no nontrivial (sophisticated) function will spontaneously arise.”
If only one exception to this null hypothesis were published, the hypothesis would be falsified. Falsification would require an experiment devoid of behind-the-scenes steering. Any artificial selection hidden in the experimental design would disqualify the experimental falsification.  (Open full PDF of Paper in New Window)

  Abel, David Lynn, The Universal Determinism Dichotomy (UDD) [Scirus SciTopics Page]. 2011;
Abstract:

The Universal Determinism Dichotomy (UDD) states that all effects arise from one of two categories of causation: either Physicodynamic Determinism, or Choice Determinism. “Chance and necessity” (mass/energy interactions) comprise the Physicodynamic Determinism category of causation. Chance, however, is generally not regarded as a true cause of any effect. It is merely a probabilistic description of what might happen as a result of complex, poorly understood, interactive Necessity (physical law-like determinism).
The classic cause-and-effect chains involving initial conditions, the effects of force fields and the laws of motion are aspects of Physicodynamic Determinism (PD). Although the physical world seems ruled by physical cause-and-effect determinism, a seemingly independent phenomenon, contingency, is also frequently observed. Contingency1-11 means that events can occur in multiple ways despite the monotonous/redundant constraints of physical law, constant initial conditions, and set probability bounds. But, there are two kinds of Contingency: 1) Chance Contingency and 2) Choice Contingency.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, Functional Sequence Complexity (FSC) [Scirus SciTopics Paper] 2011
Abstract:

Sequence complexity has three subsets: Random (RSC), Ordered (OSC), and Functional (FSC).1 Functional Sequence Complexity is measured in “Fits.”
Fits are “functional bits.”2-4 To understand Functional Sequence Complexity (FSC), one must first digest the essence of the other two subsets of sequence complexity. Random Sequence Complexity (RSC) lies at the opposite end of a bi-directional sequence complexity vector from Ordered Sequence Complexity (OSC).
 (Open full PDF of Paper in New Window)

   Abel, David Lynn, Constraints vs. Controls [Scirus SciTopic Paper] 2011
Abstract:

“Constraints” refer to the cause-and-effect deterministic orderliness of nature, to local initial conditions, and to the stochastic combinatorial boundaries that limit possible outcomes.2-4 The only “options” offered by constraints are slight statistical variation (distribution curves). “Necessity” is the result of physicodynamic cause-and-effect determinism. The physical laws contribute to overall constraints. Constraints severely limit degrees of freedom. Empirical evidence is sorely lacking for unchosen forced physical constraints producing nontrivial formal function or organization of any kind.

“Controls,” on the other hand, steer events toward the goal of formal utility and final function.5-11 Controls first require the uncertainty that can only come from freedom from constraint. In addition, controls require the exercise of choice contingency. Mere freedom from constraint is not sufficient to generate bona fide controls. Deliberate purposeful selections must be made from among real options to produce formal and final function. At the moment of purposeful selection of one option over others, a true control is introduced.
 (Open full PDF of Paper in New Window)

   Abel, David Lynn, Life origin: The role of complexity at the edge of chaos. Lecture given at the Headquarters of the National Science Foundation, Arlington, VA, Jerry Chandler and Kay Peg, Chairmen. 2006
Abstract:

Points of confusion in scientific literature
“Complexity” is a garbage-can catch-all term we use to explain everything we don’t understand and can’t reduce.
What exactly is "Complexity"?
(This is a pdf of the PowerPoint Presentation.)
 (Open full PDF of Paper in New Window)

   Abel, David Lynn, Trevors JT. More than metaphor: Genomes are objective sign systems. Journal of BioSemiotics. 2006;1(2):253-267
Abstract:

Genetic cybernetics preceded human consciousness in its algorithmic programming and control. Nucleic acid instructions reside in linear, resortable, digital, and unidirectionally read sign sequences. Prescriptive information instructs and manages even epigenetic factors through the production of diverse regulatory proteins and small RNAs. The “meaning” (significance) of prescriptive information is the function that information instructs or produces at its metabolic destination. Constituents of the cytoplasmic environment (e.g., chaperones, regulatory proteins, transport proteins, small RNAs) contribute to epigenetic influence. But the rigid covalently-bound sequence of these players constrains their minimum-free-energy folding space. Weaker H-bonds, charge interactions, hydrophobicities, and van der Waals forces act on completed primary structures. Nucleotide selections at each locus in the biopolymeric string correspond to algorithmic switch-settings at successive decision nodes. Nucleotide additions are configurable switches. Selection must occur at the genetic level prior to selection at the phenotypic level, in order to achieve programming of computational utility. This is called the GS Principle. Law-like cause-and-effect determinism precludes freedom of selection so critical to algorithmic control. Functional Sequence Complexity (FSC) requires this added programming dimension of freedom of selection at successive decision nodes in the string. A sign represents each genetic decision-node selection. Algorithms are processes or procedures that produce a needed result, whether it is computation or the end products of biochemical pathways. Algorithmic programming alone accounts for biological organization.  (Open full PDF of Paper in New Window)

   Abel, David Lynn, The capabilities of chaos and complexity. Society for Chaos Theory in Psychology and the Life Sciences; Aug 8-10, 2008; International Conference at Virginia Commonwealth University, Richmond, VA.
Abstract:

There is no Abstract currently available for this paper. The Lecture is NOT Downloadable at this time.

   Abel, David Lynn, To what degree can we reduce "life" without "loss of life"? In: Palyi G, Caglioti L, Zucchi C, eds. Workshop on Life: a satellite meeting before the Millennial World Meeting of University Professors. Vol Book of Abstracts. Modena, Italy: University of Modena; 2000:4.
Abstract:

There is no Abstract currently available for this paper. The Lecture is NOT Downloadable at this time.

   Abel, David Lynn, Short definitions of life. In: Palyi G, Zucchi C, Caglioti L, eds. Fundamentals of Life. Paris: Elsevier; 2002:15.
Abstract:

There is no Abstract currently available for this paper. This is NOT Downloadable at this time.