Search for the Primitive c. 1500–1900
The Renaissance Revolution: The Archimedean Revival
The Scientific Revolution was neither a sudden rupture nor a rejection of the past; it was the victory of Archimedes over Aristotle.
Galileo Galilei (1564–1642) explicitly aligned himself with Archimedes, whom he called “the divine.” His early work La Bilancetta (The Little Balance) was a reconstruction of Archimedes’ method for measuring specific gravity. In his De Motu (On Motion), Galileo used Archimedean hydrostatics to attack Aristotelian physics, arguing that bodies rise or fall due to their specific gravity relative to the medium, not because of absolute “lightness” or “heaviness.”
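A minimal sketch of the hydrostatic method that La Bilancetta refines, assuming a simple two-weighing procedure (the numbers are illustrative, not Galileo’s own data):

```python
# Archimedes' principle: a submerged body is buoyed up by the weight of the
# fluid it displaces, so weighing an object in air and in water gives its
# specific gravity directly.
def specific_gravity(weight_in_air: float, weight_in_water: float) -> float:
    """Specific gravity relative to water from a hydrostatic weighing."""
    return weight_in_air / (weight_in_air - weight_in_water)

# Illustrative measurement of a suspect "gold" object (assumed values, newtons):
print(specific_gravity(7.84, 7.26))   # ~13.5 -> between silver (~10.5) and gold (~19.3)
```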
Galileo’s genius was to idealize the world. He imagined “thought experiments” in a void, a conceptual space he derived from the atomists but treated with the rigor of geometry. By abstracting away friction, he realized that the “Ultimate It” of motion was conservation. He famously demonstrated that the path of a projectile is a parabola, a synthesis of uniform horizontal motion (inertia/impetus) and accelerating vertical motion (gravity). Galileo removed the “quality” from physics; motion was not a change in the body’s nature (like an apple turning red), but simply a change in relation to space.
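In modern notation, Galileo’s synthesis amounts to composing the two independent motions (a compact restatement, not Galileo’s own geometric argument), where $v_0$ is the launch speed and $g$ the gravitational acceleration:

$$x(t) = v_0 t, \qquad y(t) = -\tfrac{1}{2} g t^2 \;\;\Longrightarrow\;\; y = -\frac{g}{2 v_0^2}\,x^2,$$

a parabola: the horizontal component persists unchanged (inertia) while the vertical component accelerates uniformly (gravity).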
While Galileo focused on the kinematics of particles, René Descartes (1596–1650) attempted to reconstruct the ontology of the “It.” Descartes rejected the void entirely, returning to a Plenum theory but one stripped of Aristotelian qualities. For Descartes, space was matter (extension). To explain planetary motion without a void, he proposed “Vortices,” swirling whirlpools of subtle matter (ether) that carried planets like boats in a river.
Descartes’ contribution was the strict mechanization of the universe. There were no “souls” in magnets, no “desires” in stones, and no “sympathies.” There was only matter in motion, transferring motion through direct contact. This stripped the “Ultimate It” of any remaining mystical properties, setting the stage for a purely mathematical treatment.
The Newtonian Synthesis: The Limit of the World
Isaac Newton (1642–1727) stands as the synthesizer who integrated the discrete atoms of Democritus, the void of the Stoics, the inertia of Buridan/Galileo, and the mathematics of Archimedes into a single coherent system. But Newton was also an alchemist and a theologian, and his physics was deeply informed by his quest for the divine structure of reality.
Newton was not a sterile materialist. He spent more time on alchemy and biblical chronology than on physics. His alchemical studies, influenced by the Arab alchemist Jabir ibn Hayyan (Geber) and the text Summa Perfectionis, conditioned him to think about “active principles” in matter, forces that could operate across space, like fermentation or attraction. This alchemical mindset likely made him more open to the concept of Gravity, an invisible force acting across a void, which the strict mechanists like Descartes rejected as “occult.”
Newton faced a metaphysical problem. If he accepted the Cartesian view that space is just the relation between bodies, then motion is relative. But Newton believed in true motion (inertial forces like centrifugal force). To anchor physics, Newton introduced Absolute Space and Absolute Time, containers that exist independently of the matter within them.
This concept was heavily influenced by the Cambridge Platonist Henry More (1614–1687). More argued against Descartes, claiming that if God is infinite/omnipresent, and space is infinite, then Space must be an attribute of God. More called space the “Spirit of Nature.” Newton adopted this, viewing Absolute Space as the “Sensorium of God,” the infinite, immovable stage upon which the divine will operates and the atoms move. This allowed Newton to embrace the Void of the atomists without succumbing to their atheism; the Void was not “nothing,” it was the presence of God.
Newton’s definition of the “Ultimate It” was formalized as Mass (quantity of matter). He stripped the impetus theory of its medieval baggage. Motion was no longer a quality inside the body; it was a state (status). A body in motion is just a body in a different relationship to Absolute Space.
However, Newton introduced a “ghost” back into the machine: Gravity. Unlike the contact mechanics of Descartes or Democritus, Gravity acted across the void. Newton himself was uncomfortable with the cause of gravity (“I feign no hypotheses”), as it smelled of the “unseen force” (Adrishta) of the Vaisheshika. Yet the mathematics worked.
In the Principia (1687), Newton united the celestial and the terrestrial. The force that pulled the Vaisheshika atom, the Mohist arrow, and the Galilean cannonball was the same force that held the Moon in its orbit. The “stuff” of the universe was discrete corpuscles, moving in a divine, absolute Void, governed by immutable mathematical laws.
The Metaphysical Insurrection: Leibniz, Monads, and the Proto-Information Age
Long before the quantum revolution dissolved matter into wave functions, Gottfried Wilhelm Leibniz mounted a formidable challenge to the materialist atomism that underpinned Newtonian physics. While Newton envisioned a universe of absolute space filled with hard, impenetrable particles acting under divine laws, Leibniz proposed a reality constructed of Monads, simple, immaterial substances that perceived the universe from their unique perspectives.
The divergence between the Newtonian “atom” and the Leibnizian “monad” constitutes a fundamental disagreement about the nature of existence. Newton’s atoms were physical “stuff” occupying absolute space, inert lumps waiting for a force to move them. In contrast, Leibniz’s monads possessed no spatial extension and no constituent parts. They were, in a modern sense, units of proto-information, defined not by their shape or mass, but by their internal state of perception.
Leibniz’s assertion that “one cannot in any way distinguish one place from another, or one bit of matter from another bit of matter in the same place” without reference to their internal properties foreshadows the indistinguishability of particles in quantum mechanics. However, a more striking insight arises when viewing the monad through the lens of information theory. Modern theorists like Gregory Chaitin have drawn parallels between Leibniz’s metaphysics and Algorithmic Information Theory (AIT). The monad does not just “exist”; it computes its state based on an internal program, reflecting the entire universe. This “It from Bit” perspective suggests that the fundamental building block of reality might be a logical unit instead of a particle.
Leibniz’s binary arithmetic, which he invested with the theological conviction that “1” represented God and “0” the void, and his Characteristica Universalis, a universal language of calculation meant to resolve all disputes, together sketch a computational metaphysics that anticipates the universal Turing machine centuries before its time. The monad, therefore, can be reinterpreted as a processor of information, a proto-qubit, where perception is the output of a specific algorithm running within the substance.
The distinction between monads was also a question of complexity. In his Monadology, Leibniz addresses the problem of “bare” monads versus “souls” or “minds.” He posits the famous “Mill Argument”: if we could blow up the brain to the size of a mill and walk inside, we would see mechanical parts pushing against one another, but we would find no “perception.” This suggests that consciousness or information processing is an emergent property of the monad’s unity, not a mechanical result of aggregate matter. The “mill” lacks the unified internal state that defines the monad. This effectively argues that a purely materialist description of the universe (like a mill or a clock) fails to account for the presence of information and perception.
Furthermore, Leibniz’s conception of the monad was intrinsically linked to his denial of the vacuum. If monads are the centers of force and perception, and space is merely the relation between them, then a “void” is a logical absurdity. This contrasts with the Newtonian requirement of a vacuum to allow atoms to move without infinite drag. Leibniz’s “plenum,” a universe full of matter/force, required a different physics, one of continuous pressure and transmission, which would eventually resurface in the field theories of the 19th century.
The Leibniz-Clarke correspondence (1715–1716) crystallized the conflict over the “container” of this matter. Samuel Clarke, acting as Newton’s proxy, defended Absolute Space as a sensorium of God, a rigid stage upon which the drama of physics unfolded. Leibniz countered with a relational view: space is nothing but the order of coexistences, and time is merely the order of successions.
Leibniz argued that if space were absolute, God would have had to make an arbitrary choice about where to place the universe (e.g., why here and not five meters to the left?), violating the Principle of Sufficient Reason. If the universe were shifted five meters in absolute space, and all relations between objects remained identical, the two states would be indiscernible. By the Identity of Indiscernibles, they must be the same state. Therefore, absolute space is a fiction.
This relational framework lay dormant for two centuries until the crisis of the ether and the advent of General Relativity vindicated the idea that space has no existence independent of the matter it contains. The persistence of the Newtonian absolute frame, however, would drive the 19th-century obsession with the luminiferous ether, a theoretical dead-end that required a complete conceptual revolution to escape.
The Theology of Efficiency: The Principle of Least Action
While Leibniz debated the nature of substance, a parallel revolution was occurring in the description of motion. The transition from vector mechanics (forces) to analytical mechanics (energy and action) began not with a mathematical postulate, but with a theological assertion regarding the budget of Creation.
Pierre Louis Moreau de Maupertuis, seeking to unify the laws of light and matter, proposed the Principle of Least Action in 1744. He defined “Action” as the product of mass, velocity, and distance ($m\,v\,s$), and asserted that “Whenever there is any change in nature, the quantity of action necessary for that change is the smallest possible.”
For Maupertuis, this was proof of a wise Creator. A blind mechanism might be inefficient, but a divine Architect would surely operate with maximum economy. This teleological nature, where a particle seems to “know” its destination and chooses the optimal path, stood in stark contrast to the causal chains of Newtonian force. It introduced a “final cause” into physics, suggesting that the future state of a system determines its current trajectory.
Maupertuis framed “Action” not merely as a physical quantity but as Nature’s “budget” or “fund.” He argued that nature saves up this quantity, treating action as a resource that must be expended sparingly. This economic metaphor was radical; it shifted the focus from the instantaneous push-and-pull of forces to a holistic assessment of the entire path of motion. The particle does not just react to the immediate force; it minimizes the cost of the entire journey.
The reception of Maupertuis’s principle was not universally reverent. It sparked one of the most vicious intellectual feuds of the Enlightenment, culminating in the Diatribe of Doctor Akakia by Voltaire.
Voltaire, a champion of Newtonian empiricism and a skeptic of metaphysical overreach, viewed Maupertuis’s grandiose theological claims as vanity masquerading as science. The conflict was exacerbated by a priority dispute involving Johann Samuel König, whom Maupertuis, using his power as President of the Berlin Academy, had declared a forger for claiming Leibniz had anticipated the principle. This abuse of institutional power incensed Voltaire.
In Dr. Akakia, Voltaire mercilessly lampooned Maupertuis. He did not attack the mathematics of the principle but rather the metaphysical arrogance of its author. He mocked Maupertuis’s proposals to dig a hole to the center of the Earth to study its rotation, to build a city where only Latin was spoken to preserve the language, and to dissect giants in Patagonia to understand the nature of the soul. Voltaire treated the Principle of Least Action as the delusion of a man who “contends that the existence of God can only be proved by an algebraic formula.”
The satire was so devastating that Frederick the Great, Maupertuis’s patron, ordered the pamphlet burned and Voltaire arrested, effectively ending Maupertuis’s public credibility. This episode illustrates a crucial moment in the history of physics: the purging of overt theology from physical laws. While the principle survived, its metaphysical baggage was jettisoned by the mathematicians who followed. The “Action” remained, but the “God” who minimized it was slowly replaced by the abstract requirements of the calculus of variations.
Leonhard Euler, though a friend and defender of Maupertuis, began the process of stripping the principle of its theological gloss. In his Methodus inveniendi of 1744, Euler formulated a variational principle for mechanics and laid the groundwork for the calculus of variations. He showed that the path of a particle minimizes the integral of momentum over distance. However, Euler maintained a geometric, intuitive approach, relying on diagrams and the geometric interpretation of small variations.
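In modern notation, the Maupertuis–Euler principle says that, among paths with the same endpoints and the same total energy, the true path makes the momentum integral stationary (a compact restatement, not Euler’s own wording):

$$\delta \int m\,v\,\mathrm{d}s \;=\; 0.$$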
It was Joseph-Louis Lagrange who transformed this into a purely analytical machine. In his Mécanique Analytique (1788), Lagrange boasted that “No figures will be found in this work.” This was a deliberate methodological break. Lagrange sought to liberate mechanics from geometry (which was tied to intuition) and ground it entirely in analysis (algebra and calculus). He introduced generalized coordinates and the Lagrangian function

$$L(q, \dot{q}, t) = T - V,$$

showing that the equations of motion could be derived simply by extremizing the action integral

$$S = \int_{t_1}^{t_2} L(q, \dot{q}, t)\,\mathrm{d}t, \qquad \delta S = 0.$$
This shift was profound. The “force” central to Newton’s schema, a vector pushing a body, was replaced by a scalar quantity, “energy,” defined over a field of possibilities. The particle does not “feel” a force; it explores the landscape of energy and “selects” the path of stationarity. This formulation allowed for the solution of complex systems (like fluids or constrained rigid bodies) where identifying individual vector forces was intractable.
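A minimal sketch of Lagrange’s recipe in modern software terms (using the SymPy library; the one-dimensional harmonic oscillator is an assumed illustrative system, not an example Lagrange treats this way):

```python
import sympy as sp
from sympy.physics.mechanics import dynamicsymbols, LagrangesMethod

m, k = sp.symbols('m k', positive=True)
q = dynamicsymbols('q')        # generalized coordinate q(t)
qd = dynamicsymbols('q', 1)    # its time derivative dq/dt

# Lagrangian L = T - V for a one-dimensional harmonic oscillator
L = sp.Rational(1, 2) * m * qd**2 - sp.Rational(1, 2) * k * q**2

# Form d/dt(dL/dq_dot) - dL/dq = 0 symbolically
lm = LagrangesMethod(L, [q])
print(lm.form_lagranges_equations())   # Matrix([[k*q(t) + m*Derivative(q(t), (t, 2))]])
```

The printed result, $m\ddot{q} + kq = 0$, is Newton’s force law for the spring recovered without ever writing a force vector.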
William Rowan Hamilton brought this evolution to its zenith in the 19th century. He recognized a deep formal analogy between geometric optics and classical mechanics. Just as light follows the path of least time (Fermat’s Principle), matter follows the path of least action. Hamilton’s formulation

$$H(q, p) = \sum_i p_i \dot{q}_i - L$$

and his canonical equations, $\dot{q}_i = \partial H / \partial p_i$ and $\dot{p}_i = -\partial H / \partial q_i$, treated position and momentum on equal footing, creating a phase space that would later become the natural language of quantum mechanics.
Hamilton’s “characteristic function” (essentially the Action as a function of coordinates) described surfaces of constant action propagating through space, exactly like wave fronts in optics. In this view, the particle’s trajectory is merely the “ray” perpendicular to these wave fronts. This was a ghost of a wave theory of matter, haunting classical mechanics nearly a century before De Broglie. The “teleology” was no longer divine foresight but a property of the wave fronts propagating through configuration space, a concept that lay dormant until Schrödinger awakened it in 1926 to construct wave mechanics.
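In modern form, this wave-front picture is captured by the Hamilton–Jacobi equation (a standard later restatement, not Hamilton’s original notation):

$$\frac{\partial S}{\partial t} + H\!\left(q, \frac{\partial S}{\partial q}\right) = 0,$$

where surfaces of constant $S$ play the role of wave fronts and the trajectory follows their normal, $p = \partial S / \partial q$, just as an optical ray runs perpendicular to surfaces of constant phase.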
Maupertuis’s principle of least action, stripped of its theological justification by Euler, Lagrange, and Hamilton in turn, replaced vector forces with a scalar variational principle and revealed the formal identity of optics and mechanics, thereby planting the seed of wave mechanics a century early.
The Thermodynamics of Reality: Energy, Entropy, and the Statistical Turn
While analytical mechanics refined the description of reversible motion, a separate revolution was dismantling the concept of the eternal, static universe. The laws of thermodynamics introduced the concept of Energy as the fundamental currency of physical interactions and Entropy as the arbiter of time’s direction.
In the mid-19th century, the caloric theory, which treated heat as a subtle, indestructible fluid, collapsed under the weight of experimental anomalies. Julius Robert Mayer, James Prescott Joule, and Hermann von Helmholtz independently converged on the principle of the conservation of energy.
Mayer, a physician, arrived at the concept via physiology, noting the color of venous blood in the tropics and deducing a relationship between heat and work. He formulated the indestructibility of “force” (energy), stating that “Energy can be neither created nor destroyed.” Joule provided the experimental rigor, measuring the mechanical equivalent of heat with paddle wheels and falling weights. Helmholtz generalized this to all physical forces, including electricity and magnetism.
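A back-of-the-envelope version of the paddle-wheel logic, with illustrative numbers (not Joule’s own), shows how small the heating produced by mechanical work is, and hence why his thermometry had to be so precise:

```python
# Falling weights do mechanical work W = m*g*h; if all of it ends up as heat in
# the calorimeter water, the temperature rises by W / (m_water * c_water).
g = 9.81               # m/s^2
m_weight = 13.0        # kg, assumed falling mass
h = 1.6                # m, assumed drop height per descent
n_descents = 20        # the weights are wound up and dropped repeatedly

work = n_descents * m_weight * g * h          # joules of mechanical work

m_water = 6.0          # kg of water in the calorimeter (assumed)
c_water = 4186.0       # J/(kg*K), specific heat of water
delta_T = work / (m_water * c_water)
print(f"{work:.0f} J of work -> temperature rise of about {delta_T*1000:.0f} mK")
```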
This unification had a metaphysical cost: it implied a universe of constant quantity but degrading quality. The first law promised eternal energy; the second law, formulated by Clausius and Thomson (Lord Kelvin), promised inevitable decay. Rankine’s “mechanical theory of heat” attempted to bridge these by proposing molecular vortices, but the trend was clear: the universe was running down.
Ludwig Boltzmann’s contribution was to bridge the chasm between the deterministic dynamics of atoms and the irreversible behavior of heat. By interpreting entropy ($S$) as a measure of statistical probability,

$$S = k \ln W,$$

where $W$ counts the microscopic arrangements compatible with a given macroscopic state, Boltzmann introduced a radical shift: the laws of thermodynamics are statistical regularities, not immutable certainties.
This challenged the deterministic worldview inherited from Newton and Laplace. In Boltzmann’s statistical mechanics, a system could theoretically spontaneously order itself (e.g., all air molecules rushing to one corner of the room), but it is overwhelmingly unlikely to do so. This marked the erosion of the “It” as a definitive, trackable entity. In a gas, the individual particle loses its narrative importance, replaced by distribution functions and probabilities.
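A toy calculation along Boltzmann’s lines makes the point quantitative (an illustrative model in which each of N molecules sits independently in the left or right half of a box; the tiny N is an assumption chosen for readability):

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
N = 100                # a deliberately tiny "gas"

W_mixed  = 2**N        # every left/right arrangement counted as accessible
W_corner = 1           # all N molecules crowded into one half

# S = k ln W: the entropy cost of the ordered arrangement
delta_S = k_B * (math.log(W_mixed) - math.log(W_corner))
print(f"Delta S ≈ {delta_S:.2e} J/K")

# Probability that the ordered state occurs spontaneously
print(f"P(all in one half) = 2^-{N} ≈ {2.0**-N:.1e}")
```

Even for a hundred molecules the probability is of order $10^{-30}$; for Avogadro-scale numbers it is, for every practical purpose, zero, which is precisely Boltzmann’s sense of a statistical law.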
Boltzmann faced fierce opposition from critics such as Zermelo and Loschmidt. Loschmidt posed the “Reversibility Paradox”: if the laws of motion are time-reversible (as Newton’s are), how can they produce an irreversible increase in entropy? Zermelo posed the “Recurrence Paradox”: given infinite time, any bounded mechanical system must return arbitrarily close to its initial state (Poincaré recurrence), rendering a permanent increase in entropy impossible. Boltzmann’s defense, that the timescales for such recurrence are astronomically longer than the age of the universe, introduced a new kind of physical reality: the “statistically emergent” reality, in which the macroscopic “It” behaves differently from its microscopic constituents.
The Field and the Ether: The Crisis of Propagation
Newtonian gravity assumed action-at-a-distance: mass influenced mass instantaneously across the void. This “spooky” interaction was philosophically repugnant even to Newton, who called it an absurdity, but it was mathematically successful. The 19th century saw the rise of Field Theory, which sought to fill the void with a medium of transmission, returning to a Cartesian plenum but with sophisticated mathematics.
Michael Faraday, lacking formal mathematical training, visualized lines of force permeating space. For Faraday, the “field” was the primary physical reality, not the bodies it acted upon. He rejected action-at-a-distance, proposing that magnetic and electric effects were transmitted contiguously through a medium. He viewed charge not as an inherent property of a particle, but as a state of tension in the field, a “polarized pair.”
James Clerk Maxwell translated Faraday’s intuition into the language of differential equations. Maxwell’s equations demonstrated that electric and magnetic fields propagated as waves at the speed of light, unifying optics and electromagnetism. However, this triumph birthed a new “It”: the Luminiferous Ether.
If light is a wave, it must wave something. The ether was postulated as an all-pervasive, elastic solid that filled the vacuum. It had to be rigid enough to support high-frequency transverse waves (light) yet tenuous enough to allow planets to pass through it without drag.
Maxwell himself spent considerable effort constructing mechanical models of the ether. He utilized analogies of “molecular vortices” and “idle wheels” to explain how the stress of the magnetic field could be transmitted through a mechanical medium. These were not meant to be literal descriptions, but they reinforced the conviction that the “field” was a state of a mechanical substance.
The “Ether Drag” hypothesis attempted to reconcile the motion of matter through this medium. Augustin-Jean Fresnel proposed a partial drag coefficient

$$f = 1 - \frac{1}{n^2},$$

where $n$ is the refractive index of the medium, to explain why Arago’s experiments failed to detect the earth’s motion. This coefficient suggested that the ether was entrained inside moving transparent bodies. When Fizeau tested this experimentally by passing light through moving water, he confirmed Fresnel’s coefficient. This seemed to validate the ether, but it resulted in a bizarre physical picture: a solid ether that was stationary in the vacuum but partially dragged by moving glass or water. By the late 19th century, the ether had become a monster of mechanical contradictions, a “chimerical thing,” to borrow Leibniz’s phrase, yet it was the unquestioned foundation of physics.
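A quick numerical check of the Fresnel–Fizeau picture (illustrative values: water’s refractive index taken as 1.333 and a flow speed of roughly 7 m/s, of the order Fizeau used):

```python
# Fresnel's partial-drag coefficient and the first-order speeds of light
# in water flowing with and against the beam.
n = 1.333                 # refractive index of water
f = 1 - 1 / n**2          # Fresnel drag coefficient ≈ 0.437
c = 2.998e8               # speed of light in vacuum, m/s
v = 7.0                   # water flow speed, m/s (assumed, roughly Fizeau's)

u_with    = c / n + f * v     # light carried along with the flow
u_against = c / n - f * v     # light fighting the flow
print(f"f = {f:.3f}, speed difference = {u_with - u_against:.1f} m/s")
```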
Global Interludes: Forgotten Vanguards of the Fin de Siècle
The narrative of physics is often confined to the axis of London, Berlin, and Paris. However, crucial advancements in the understanding of matter and waves were occurring elsewhere, challenging the Western monopoly on scientific innovation and anticipating technologies that would not be realized for decades.
In Calcutta, Sir Jagadish Chandra Bose was conducting experiments that would not be matched in the West for nearly half a century. While Marconi was focusing on long-wave radio for trans-Atlantic communication (using wavelengths of hundreds of meters), Bose was exploring the optical properties of “invisible light” in the millimeter range (wavelengths of 5 mm to 2.5 cm, corresponding to frequencies of roughly 12 to 60 GHz).
Bose’s apparatus was a marvel of miniaturization and precision. He developed “collecting funnels,” what we now call pyramidal horn antennas, to direct these waves, and dielectric lenses to focus them. Perhaps most remarkably, he constructed polarizers using twisted jute fibers. This work on the optical rotation of microwaves in twisted structures pioneered the study of chiral media, effectively anticipating the field of artificial dielectrics and metamaterials by a century.
In 1895, Bose demonstrated these waves publicly in Calcutta, ringing a bell and igniting gunpowder remotely through walls and the body of the Lieutenant Governor. When invited to the Royal Institution in London in 1897 by Lord Rayleigh, Bose impressed the scientific elite with his compact millimeter-wave spectrometer. However, Bose’s philosophy diverged sharply from the commercialism of the West. He refused to patent his inventions, believing that scientific knowledge was a public good to be shared freely. In a letter to Rabindranath Tagore, he expressed disdain for the “greed for money” he witnessed in Europe, where a telegraph company proprietor urged him to withhold details from his lecture to secure a patent.
Crucially, Bose invented the mercury coherer, a self-recovering detector that was vastly superior to the filings-based coherers used by Marconi. While Marconi’s design required a mechanical “tapper” to reset the device after every signal, Bose’s mercury device restored itself automatically. Evidence suggests Marconi’s receiving device in his famous transatlantic transmission was a direct copy of Bose’s design, a fact obscured by Bose’s refusal to engage in patent wars. While Marconi received the Nobel Prize and commercial dominance, Bose’s work laid the true foundational physics for high-frequency communication, radar, and Wi-Fi, contributions that were only formally recognized by the IEEE nearly a century later.
In 1904, the same year J.J. Thomson was promoting his “Plum Pudding” model (where electrons were embedded in a diffuse sphere of positive charge like raisins in a cake), Japanese physicist Hantaro Nagaoka proposed a radically different architecture: the “Saturnian Model.”
Nagaoka visualized the atom as a massive, positively charged central sphere surrounded by a ring of electrons, analogous to Saturn and its rings. He derived this model not from scattering data (which didn’t exist yet) but from a theoretical investigation into the stability of rings, drawing on Maxwell’s earlier work on the stability of Saturn’s rings. Nagaoka argued that such a system would be quasi-stable and could explain spectral lines through the vibrations of the electron ring.
While the model was criticized for its ultimate instability (classical electrodynamics predicted that the radiating electrons would lose energy and spiral into the nucleus), it correctly anticipated the existence of the atomic nucleus seven years before Rutherford. Western historiography often treats the nuclear model as a purely Rutherfordian discovery, yet the conceptual leap to a dense core was already present in Nagaoka’s work.
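The instability can be made quantitative with the standard textbook estimate (a later calculation, not Nagaoka’s own): an electron radiating according to the Larmor formula spirals from a Bohr-radius orbit into the nucleus in about $t = a_0^3 / (4 r_e^2 c)$.

```python
# Classical spiral-in time for a radiating electron starting at the Bohr radius.
a0  = 5.29e-11    # Bohr radius, m
r_e = 2.82e-15    # classical electron radius, m
c   = 3.0e8       # speed of light, m/s

t_collapse = a0**3 / (4 * r_e**2 * c)
print(f"classical collapse time ≈ {t_collapse:.1e} s")   # about 1.6e-11 s
```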
Crucially, when Ernest Rutherford published his seminal paper on the scattering of alpha particles in 1911, he explicitly cited Nagaoka. Rutherford noted that the electrostatic potential required to explain the large-angle scattering of alpha particles was identical to the potential in Nagaoka’s “central attracting mass.” In fact, the “Rutherford Model” and the “Nagaoka Model” are mathematically indistinguishable regarding the central potential; Rutherford supplied the experimental proof for the structure Nagaoka had theoretically conceived. Nagaoka also pioneered the use of spectroscopy to investigate the nucleus itself, studying hyperfine interactions in mercury lines decades before nuclear spin was understood, a contribution that makes him a grandfather of nuclear structure physics.
The Collapse of Classical Certainty (1887–1905)
As the 20th century approached, the mechanical worldview faced a crisis from which it would never recover. The Ether, intended to be the absolute reference frame of the universe, the “It” that held the light, refused to be found.
The 1887 Michelson-Morley experiment was designed to detect the “ether wind” created by the Earth’s motion. Using an interferometer of unprecedented sensitivity (floating on a pool of mercury to dampen vibrations), Michelson and Morley compared the speed of light along perpendicular arms. They expected a shift in the interference fringes as the earth plowed through the stationary ether. They found nothing.
This “null result” was catastrophic for the ether theory. It suggested that the earth was always at rest relative to the ether, which was physically impossible given its orbit around the sun. To save the phenomena, George Francis FitzGerald and Hendrik Lorentz independently proposed a radical hypothesis: matter contracts in the direction of motion through the ether. This “Lorentz contraction” was initially proposed as a physical deformation, the pressure of the ether physically squashed the atoms, shortening the interferometer arm just enough to hide the effect of the wind.
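The scale of the effect being hunted, and of the contraction invoked to hide it, can be sketched with round numbers (the arm length and wavelength are approximate values for the 1887 apparatus; treat them as assumptions):

```python
import math

c = 2.998e8        # speed of light, m/s
v = 3.0e4          # Earth's orbital speed, m/s
beta2 = (v / c)**2

# FitzGerald-Lorentz contraction factor sqrt(1 - v^2/c^2)
print(f"contraction factor ≈ {math.sqrt(1 - beta2):.12f}")

# Classically expected fringe shift on rotating the interferometer:
# delta_N ≈ 2 * L * (v/c)^2 / lambda
L   = 11.0         # effective optical arm length, m (approximate)
lam = 5.9e-7       # wavelength of the light used, m (approximate)
print(f"expected fringe shift ≈ {2 * L * beta2 / lam:.2f} fringes")
```

The expected shift of roughly 0.4 fringe was well within the instrument’s resolution; it never appeared.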
This period saw a flurry of experiments attempting to detect the ether through second-order effects. The Trouton-Noble experiment (1903) looked for a torque on a charged capacitor moving through the ether; the Rayleigh-Brace experiment (1902) looked for double refraction in moving media. All returned null results. The ether had become a “conspiracy theory” of nature: a medium that existed but arranged every possible physical effect to make itself undetectable.
To make Maxwell’s equations invariant in moving frames, Lorentz introduced a mathematical variable he called “Local Time,”

$$t' = t - \frac{vx}{c^2}$$

(to first order in $v/c$).
For Lorentz, this was a mathematical fiction, a calculation trick to simplify the equations. He did not believe that time actually slowed down; “true” time remained the absolute time of Newton. The “local time” was just what a moving observer thought was time because their clocks were affected by the ether wind.
Henri Poincaré came agonizingly close to formulating Special Relativity before Einstein. In his 1904 address at the St. Louis Congress of Arts and Science, Poincaré formulated the “Principle of Relativity” as a general law of nature. He argued that no experiment, mechanical or electromagnetic, could ever detect absolute motion.
Poincaré interpreted Lorentz’s “local time” physically. In 1900, he proposed a thought experiment: observers in a moving frame synchronize their clocks by exchanging light signals. If they assume light travels at the same speed in both directions (ignoring the ether wind), they will set their clocks to “local time” rather than “true time.” Poincaré realized that this synchronization error was exactly what was needed to make the principle of relativity hold.
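A first-order sketch of Poincaré’s point (modern notation; the frame moves at speed $v$ through the ether and the two clocks sit a distance $x$ apart along the motion): the outbound and return signals take times $x/(c - v)$ and $x/(c + v)$, but observers who assume equal one-way times set the distant clock to the midpoint of the round trip. The resulting mis-synchronization is

$$\frac{1}{2}\left(\frac{x}{c - v} - \frac{x}{c + v}\right) = \frac{v x}{c^2 - v^2} \approx \frac{v x}{c^2},$$

exactly Lorentz’s local-time offset.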
Poincaré also identified that the Lorentz transformations formed a mathematical group, meaning fully consistent physical laws could be built upon them. He even derived the correct transformation equations (which he named after Lorentz) and noted that nothing can exceed the speed of light. In his 1905/1906 paper “Sur la dynamique de l’électron,” he essentially possessed the entire mathematical apparatus of Special Relativity.
However, Poincaré never fully abandoned the Ether. He viewed it as a “convenient hypothesis” or a convention, rather than discarding it entirely. He treated the relativity of simultaneity as a result of “perfect compensation” by the ether, rather than a fundamental property of spacetime. He maintained a distinction between “apparent” phenomena (measured by observers) and “real” phenomena (in the ether). It remained for the patent clerk in Bern to take the final, radical step: to declare the Ether superfluous and make “local time” the only time, dissolving the absolute container once and for all.
The Archimedean revival displaced Aristotelian qualities with mathematical relations, culminating in Newton’s synthesis of inertial matter in absolute space. Leibniz countered with a relational, informational ontology, while the Principle of Least Action recast motion as a global optimization rather than local causation. Thermodynamics dissolved determinism into probability, and field theory replaced particles as the primary reality, yet the ether’s failure exposed the final limit of classical substance. By 1900, the “Ultimate It” had become unstable: neither matter, nor force, nor medium, but something deeper awaiting reformulation.