EPOCH
EPOCH © 2025 by Stéphane Fosse - This book is published under the terms of the CC BY-SA 4.0 license
Chapter 1
Before 1930
The Dawn of Computing
The period from 1700 to 1930 radically transformed humanity. These two centuries and more, marked by industrialization and its social upheavals, laid the foundations for the computing that was about to emerge.
The 18th century saw Europe ignited by an unprecedented intellectual fervor. The Enlightenment imposed a worldview where science and reason took the throne. Inventors, driven by frenzied curiosity, sought to tame the universe through numbers and mechanisms. Scientific academies multiplied like buzzing hives of new ideas, weaving their networks across the old continent.
Around 1760, Great Britain was shaken by the first convulsions of the industrial revolution. Nascent factories swallowed workers whole and machines gradually replaced human hands. Industrialists, confronted with the growing complexity of their enterprises, demanded precise tools to juggle their figures. This thirst for arithmetic precision pushed inventors to devise mechanical solutions to the most laborious calculations.
The end of the century witnessed the emergence of the first mechanized devices in textile mills. Joseph-Marie Jacquard transformed weaving in 1801 with his punched cards that dictated patterns for the looms to reproduce. These seemingly innocuous perforated cardboard rectangles carried within them the seed of future computer systems. Each hole, each solid space represented binary information that the machine interpreted without error, a striking prefiguration of digital memories.
The 19th century opened in a whirlwind of change. Empires stretched their tentacles to the four corners of the globe, trade intensified, creating a visceral need for rapid communications. The railway, that steel monster devouring space, imposed clock synchronization and rigorous schedule management. Modern states swelled their administrations, which soon drowned beneath mountains of paperwork.
George Boole, an English mathematician with an incisive mind, published his theory of logical algebra in 1847. Though seemingly of interest only to a few eccentric scholars, his work contained the key that would one day unlock the doors of computing. His binary system, reducing every proposition to true or false, patiently awaited visionary minds to seize upon it.
The electric telegraph, copper wires stretched from one continent to another, shattered the barriers of time and space after 1840. What had taken months now required only minutes. Samuel Morse imposed his code of dots and dashes, the first standardization of binary language. Submarine cables, copper serpents laid on the ocean floor, connected Europe to America, fleetingly in 1858 and durably from 1866, creating the first global network of instantaneous communication.
The adoption of universal time in 1884 at the International Meridian Conference in Washington perfectly illustrated how technology reshaped our perception of the world. The 24 time zones imposed an unprecedented temporal discipline on humanity. The entire world set its watches to Greenwich, subject to the gentle tyranny of synchronized clocks.
The final decades of the 19th century saw companies grow into bureaucratic behemoths. Banks, insurance companies, and department stores collapsed beneath registers and files. The 1890 American census illustrated this problem: the federal administration had taken nearly eight years to process the data from the previous census. Herman Hollerith, a brilliant young engineer, designed his electromagnetic tabulating machine. Thanks to it, a first count of the 1890 census was available in only six weeks. His Tabulating Machine Company, founded in 1896, would later become IBM.
The Great War radically transformed the relationship with information technologies. In the shadow of the trenches, mathematicians and linguists broke enemy codes. In France, Lieutenant Georges Painvin deciphered the German ADFGVX code in 1918, thwarting the Spring Offensive and perhaps saving Paris. Ballistic calculations, necessary for long-range artillery like the German Paris-Geschütz, demanded unprecedented precision. The war had shown that victory belonged to those who mastered information.
New methods of work organization had emerged around the turn of the century: Frederick Taylor dissected workers' movements with a stopwatch while Henry Ford imposed his assembly line. In the interwar period, the office followed the factory in this race toward rationalization. Burroughs accounting machines, National cash registers, and Monroe calculators invaded administrations. The noise of keys, the chime of totals, and the sliding of carriages now punctuated the daily life of white-collar workers.
The 1920s saw the explosion of telecommunications. Radio broadcasting, born of wireless telegraphy, carried voices and music into every home. The BBC (British Broadcasting Corporation) broadcast its first regular programs in 1922, followed by numerous stations around the world. The telephone, Alexander Graham Bell's 1876 invention, left behind its status as a gadget for the wealthy and began its conquest of households. Switchboard operators, predominantly women, became the first human interfaces between machines and their users, a striking metaphor for future human-machine relationships.
Social transformations accompanied these technological developments, and mass literacy changed the relationship with knowledge. In France, the Jules Ferry laws of 1881-1882 imposed compulsory education. In the United States, the literacy rate exceeded 80% as early as 1870. This educated population demanded ever more books, newspapers, and information. Knowledge became a commodity that now circulated at the speed of electricity.
Galloping urbanization created unprecedented organizational challenges. Paris grew from 500,000 to 3 million inhabitants between 1800 and 1930. New York exploded from 60,000 to 7 million souls during this period. These sprawling metropolises demanded colossal infrastructure. Baron Haussmann redesigned Paris, imposing his straight avenues on the old medieval quarters. In New York, the suspension bridge designed by engineer John Augustus Roebling and completed by his son Washington linked Manhattan and Brooklyn in 1883. These technical feats required diabolically precise calculations and methodical resource management.
The world of work metamorphosed under these intersecting influences. The tertiary sector swelled to dominate the American economy by the 1920s. Information workers—accountants, stenographers, operators—formed a new social class. Women flooded into these professions, altering gender relations in society. The typist, bent over her Remington, became an emblematic figure of this bureaucratic modernity.
In 1929, on the eve of the stock market crash that would shake the global economy, computing remained officially unborn. Yet all the ingredients for its future explosion were assembled. The need for calculation intensified in every field. The necessary mathematical theories had matured. Nascent electronics promised extraordinary performance. The ground was ready for the advent of programmable calculators that would soon transform the world more radically than the steam engine had done a century earlier.
Binary System
The historical journey of the binary system reveals how this mathematical idea evolved into the foundation of modern computing. Thomas Harriot, an English mathematician, sketched around 1600 the first outlines of a numerical representation based solely on 0 and 1. His research on combinations led him to observe that any number could be expressed as a sum of powers of 2. These works, never published, remained in obscurity until the 1920s.
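For the modern reader, Harriot's observation is easy to restate: pick out the distinct powers of 2 whose sum is the number, and you have its binary expansion. A minimal sketch in Python (the function name is ours, purely for illustration):

```python
# A minimal sketch of Harriot's observation (the function name is
# ours): every natural number is a sum of distinct powers of 2, and
# that decomposition is exactly its binary expansion.

def powers_of_two(n):
    """Return the distinct powers of 2 that sum to n."""
    powers, bit = [], 1
    while n:
        if n & 1:
            powers.append(bit)
        n, bit = n >> 1, bit * 2
    return powers

print(powers_of_two(45))   # [1, 4, 8, 32]: 45 = 2^0 + 2^2 + 2^3 + 2^5
```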
Almost a century later, Gottfried Wilhelm Leibniz rediscovered this system independently. In the 1670s, while he was working intensively on divisibility and prime numbers, binary notation appeared to him as an elegant solution for certain mathematical calculations. Initially, he used it simply to illustrate his theorems without imagining its practical use. In his 1697 letter to the Duke of Brunswick, Leibniz proposed creating a commemorative medal and developed a theological reading of binary where 1 symbolizes being and 0 represents nothingness. This medal never came to fruition, but the letter launched a series of publications on this system.
In 1703, in the Mémoires de l’Académie Royale des Sciences, Leibniz published his “Explanation of Binary Arithmetic”. This text outlined the principles of the system and demonstrated how to perform basic operations while acknowledging the cumbersomeness of these long sequences of 0s and 1s for everyday use.
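Those operations amount to schoolbook arithmetic transposed to base 2. The sketch below is a small worked example in the spirit of Leibniz's tables, not a transcription of his paper:

```python
# Schoolbook addition in base 2, where the only new rule to remember
# is 1 + 1 = 10 (write 0, carry 1).

def add_binary(a, b):
    """Add two bit strings digit by digit, exactly as done by hand."""
    a, b = a.zfill(len(b)), b.zfill(len(a))
    carry, digits = 0, []
    for x, y in zip(reversed(a), reversed(b)):
        total = int(x) + int(y) + carry
        digits.append(str(total % 2))
        carry = total // 2
    if carry:
        digits.append("1")
    return "".join(reversed(digits))

print(add_binary("1011", "110"))   # 10001, i.e. 11 + 6 = 17
```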
The 18th century saw mathematicians like Johann Bernoulli and Leonhard Euler take up the binary system. Euler notably used it to study the properties of numbers of the form 2ⁿ + 1. Other researchers established connections between this system and other number bases such as octal and hexadecimal.
The practical application of binary emerged in the 19th century with the advent of telecommunications. Morse's telegraph, developed from 1837, relied on two electrical states, presence or absence of current, to transmit information. Its code combined dots and dashes, foreshadowing the future use of binary in digital communications. Émile Baudot took this a step further in the 1870s with his explicitly binary telegraphic code, in which each character was a fixed pattern of five on-or-off units. In 1901, Charles Bouton demonstrated the usefulness of binary for analyzing the game of Nim, paving the way for its application in game theory.
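Bouton's theorem can be stated compactly: XOR the pile sizes together (the "nim-sum"); the player about to move loses under best play exactly when that sum is zero. A minimal sketch of the idea, with function names of our own choosing:

```python
# A sketch of Bouton's 1901 result: the player about to move loses
# under best play exactly when the nim-sum is zero; otherwise some move
# restores a zero nim-sum and hands the losing position to the opponent.

def nim_sum(piles):
    s = 0
    for p in piles:
        s ^= p
    return s

def winning_move(piles):
    """Return (pile index, new size) restoring nim-sum 0, or None."""
    s = nim_sum(piles)
    if s == 0:
        return None                 # a losing position: no good move exists
    for i, p in enumerate(piles):
        if p ^ s < p:               # this pile can be shrunk to p ^ s
            return i, p ^ s

print(nim_sum([3, 4, 5]))           # 2: the first player can win
print(winning_move([3, 4, 5]))      # (0, 1): reduce the pile of 3 to 1
```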
The revolution came with 20th-century electronics. The Eccles-Jordan flip-flop circuit, created in 1919, gave physical existence to the 0 and 1 states. This crucial invention made the first binary memory circuits possible. In 1937, Claude Shannon demonstrated in his thesis the connection between Boolean algebra and the binary system for designing electronic switching circuits.
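The behaviour of such a circuit is easy to mimic in software. The sketch below is an idealization built from NOR gates, not a model of the Eccles-Jordan triode pair, but it exhibits the same essential property: two stable states that remember a 0 or a 1.

```python
# An idealized set-reset latch: two cross-coupled NOR gates settling
# into one of two stable states, the physical embodiment of a bit.

def sr_latch(s, r, q=0, nq=1):
    """Settle a cross-coupled NOR pair and return the stable (q, nq)."""
    for _ in range(3):                # a few passes reach the fixed point
        q = int(not (r or nq))
        nq = int(not (s or q))
    return q, nq

state = sr_latch(s=1, r=0)            # "set": the latch now holds a 1
print(state)                          # (1, 0)
state = sr_latch(0, 0, *state)        # inputs released: the bit is kept
print(state)                          # (1, 0)
state = sr_latch(0, 1, *state)        # "reset": back to 0
print(state)                          # (0, 1)
```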
Early computers like ENIAC (1946) still operated in decimal, each digit stored in a ring counter built from vacuum tubes. The 1947 Burks-Goldstine-von Neumann report changed the game by demonstrating binary's advantages: simpler circuits, increased reliability, and natural harmony with logical operations. This recommendation influenced all subsequent computers.
The 1950s and 1960s saw the rise of electronic memories, strengthening binary’s dominance. Magnetic core memories and later semiconductor memories naturally relied on two distinct states. Some manufacturers attempted to explore four- or sixteen-level memories in the 1970s and 1980s; Intel experimented with four-level ROMs in certain processors, but these attempts remained marginal compared to the robustness of binary storage.
Binary extended beyond computer hardware. Digital telecommunications adopted it for its noise resistance and signal regeneration capability. Optical media such as CD-ROMs and DVD-ROMs also used it. Its success stems from its conceptual simplicity, which makes circuits more reliable, its compatibility with Boolean algebra, which makes it the natural language of logical operations, and its robustness against disturbances, explained by the distinction of only two states.
Research on other numerical systems has not disappeared, however. The ternary system (with three states) was the subject of extensive studies in the Soviet Union in the 1950s. More recently, quantum computing has opened new horizons with its qubits capable of existing in a superposition of states. Despite these alternatives, the binary system remains the soul of classical computing. Its ability to encode any information using only two symbols gives it remarkable universality.
Punch Card
In 1801, Joseph-Marie Jacquard, a Lyon-based weaver, was searching for a way to automate his looms. He invented perforated cardboard cards to encode weaving patterns. Without knowing it, he had just laid the first stone of a major idea. These cards were not intended for calculations, but they already embodied the powerful concept of transforming an instruction into holes readable by a machine.
It was almost a century later, in 1890, that the U.S. Census Bureau faced a colossal problem. The population was growing at a breakneck pace and the previous census had stretched over eight years. A statistician named Herman Hollerith then proposed the bold solution of adapting Jacquard’s principle to counting people. His system reduced processing time to just two and a half years. A staggering gain that did not go unnoticed at the time. The success was such that Hollerith founded his own company in 1896, the Tabulating Machine Company. It would later merge with others to give birth to a giant we all know: IBM.
At the beginning of the 20th century, the punch card became established in the administrative landscape of large organizations. No more manual bookkeeping! Treasurers and accountants marveled at these machines capable of sorting, counting and adding in record time. Inventory management, payroll calculations, everything went through them.
Hollerith’s first cards had only 45 columns, a dimension quickly deemed insufficient. In 1928 at IBM, Clair D. Lake replaced circular holes with rectangular ones, thus making it possible to go from 45 to 80 columns on the same surface. An innovation that would leave a lasting mark on computing.
This 80-column standard left a surprising imprint on our current digital world. When IBM designed its first terminals in the 1960s, such as the famous 3270, each line displayed exactly 80 characters. A choice dictated by compatibility with existing equipment. This technical constraint propagated like a shock wave through the decades. Even today, many code editors limit lines to 80 characters by default, or at least offer it as an option. Terminal emulators perpetuate this tradition. Programming best practice guides recommend this limit to improve readability. The shadow of the punch card still looms over our screens.
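The inheritance can even be written down explicitly in a modern project's configuration. As a hypothetical illustration, an .editorconfig file can pin the limit project-wide (max_line_length is an optional EditorConfig property, honored by some editors and plugins but not by all implementations):

```
# Hypothetical example: the 80-column legacy written into a config
# file. max_line_length is supported only by some EditorConfig hosts.
[*]
max_line_length = 80
```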
Between the two world wars, a true mechanographic data processing industry developed. IBM dominated in the United States, while in Europe, companies like Powers (future Remington Rand) or Bull in France attempted to carve out a market share. Their business model relied on renting machines rather than selling them, ensuring stable revenue and strict control over this strategic technology.
This new technique created new professions. Mechanographic rooms came alive under the nimble fingers of keypunch operators, mostly women, who transformed raw data into punch cards with the precision of goldsmiths. Others verified each card, while operators adjusted sorters and tabulators. These departments became the beating heart of large organizations.
The heyday of the punch card came in the 1950s and 1960s. Even after the arrival of the first electronic computers, it remained the preferred medium for data input and storage. IBM's first models, such as the 650 or the 1401, were designed to integrate with existing punch card installations, gently accompanying the transition to the electronic era.
The history of this technology also includes its dark chapters. During World War II, punch card systems were used for census and population control in occupied Europe. Deutsche Hollerith-Maschinen Gesellschaft, IBM’s German subsidiary, supplied machines used to organize deportations. A chilling reminder that any technology can also be used for terrible purposes.
The decline began in the 1970s. The arrival of interactive terminals and magnetic media gradually spelled the end for these cardboard rectangles. The punch card established the principle of binary information storage (hole or no hole, 1 or 0), clearly separated the data medium from the processing machine, and demonstrated the importance of standardization. The organizational methods developed around it influenced the design of early computers and structured the organization of work in computing centers.
Charles Babbage
Charles Babbage was born into a wealthy London household in 1791. His education at Cambridge, where he received rigorous mathematical training, made him aware of the limitations of manual calculation. With some classmates, he founded the Analytical Society to modernize mathematics education at the university. This first experience already reflected his desire to renew established methods.
One evening in 1821, Charles was discussing with his friend John Herschel. Both were exhausting themselves verifying mathematical tables riddled with errors. “Imagine these calculations performed by a steam engine”, Babbage suggested. This remark, tossed into the conversation, marked the beginning of an extraordinary intellectual adventure.
Babbage set to work. His first creation, the difference engine, was designed to automatically calculate polynomials using the method of finite differences. The 1822 prototype worked well enough to convince the British government to fund a more ambitious version. A real tour de force since the device not only performed operations but also printed the results, avoiding any transcription errors.
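The method itself is disarmingly simple: for a polynomial of degree n, the n-th differences are constant, so once a few initial values are known, every further value follows from additions alone, exactly what gears do well. A sketch of the principle, not of Babbage's mechanism (the function names are ours):

```python
# The method of finite differences: after a short setup, every new
# table value is produced by additions alone, as in the engine.

def leading_differences(values, degree):
    """First column of the difference table: f(x0), Δf, Δ²f, ..."""
    row = list(values[: degree + 1])
    firsts = [row[0]]
    for _ in range(degree):
        row = [b - a for a, b in zip(row, row[1:])]
        firsts.append(row[0])
    return firsts

def tabulate(firsts, count):
    """Generate `count` values using only additions."""
    regs = list(firsts)                  # one register per difference order
    out = []
    for _ in range(count):
        out.append(regs[0])
        for i in range(len(regs) - 1):   # each register absorbs the next
            regs[i] += regs[i + 1]
    return out

# f(x) = x^2 + x + 41, a polynomial Babbage reportedly used in demos.
f = lambda x: x * x + x + 41
seed = [f(x) for x in range(3)]          # a degree-2 table needs 3 seeds
print(tabulate(leading_differences(seed, 2), 8))
# [41, 43, 47, 53, 61, 71, 83, 97] -- matches f(0) .. f(7)
```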
The cogwheels meshed together in a complex mechanical ballet. But London artisans, despite their expertise, could not manufacture parts with the required precision. The project became bogged down. Babbage, far from being discouraged, turned to an even bolder idea.
In 1834, his analytical engine took shape on paper. This new machine no longer followed a fixed program but could be reprogrammed for different calculations. Its architecture bore a striking resemblance to 20th-century computers: a computing unit (the mill), memory (the store), and input and output devices.
Fascinated by Jacquard looms, Babbage borrowed the principle of punched cards to control his machine. These cards defined both the data and the instructions to be executed. The analytical engine incorporated much of what constitutes the essence of modern computing: a program supplied on cards, loops, conditional branching, and subroutines. It compared numbers, made decisions, and executed the four basic arithmetic operations.
His meeting with Ada Lovelace gave a new dimension to the project. Daughter of Lord Byron and a brilliant mathematician, Ada translated an Italian article describing the machine. Her notes added to the translation detailed an algorithm for calculating Bernoulli numbers. This text remains etched in history as the first computer program ever written.
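Her program is tied to the engine's architecture, but the quantity it computes is easy to reproduce with the classical recurrence on Bernoulli numbers. The sketch below is therefore an illustration of the target, not a transcription of Note G:

```python
# Bernoulli numbers from the classical recurrence
# sum_{j=0}^{m} C(m+1, j) * B_j = 0 with B_0 = 1 (so B_1 = -1/2).
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```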
The analytical engine would never progress beyond the stage of plans and diagrams. Victorian England, despite its industrial might, could not manufacture a mechanism of such complexity and precision. The project’s cost frightened public financiers. Yet until his death in 1871, Babbage never stopped refining his drawings, obsessed with his vision.
A hundred years later, Howard Aiken created the Harvard Mark I, drawing direct inspiration from Babbage’s work. The designers of ENIAC and the first electronic computers adopted his architectural principles without always knowing their origin. Babbage had understood everything: the separation between hardware and software, the notion of debugging, code reuse. He had also anticipated the questions of reliability and performance that still haunt computing today.
In 1991, the Science Museum in London finally built a working version of his Difference Engine No. 2. It functioned perfectly, proving the accuracy of Babbage's calculations and the coherence of his mechanical design. What we rediscover is not merely an inventor, but a multifaceted thinker.
His writings on industrial economics, his critiques of the British university system, his interest in applying mathematics to industry reveal a mind ahead of its time. Between the mechanical calculators of the 17th century and electronic computers, Charles Babbage’s work traced a visionary path, too often overlooked.
Boolean Algebra
At the beginning of the 19th century, the stagnation of British mathematics contrasted sharply with continental ferment. A dispute between Newton’s and Leibniz’s disciples had paralyzed work across the Channel. While Europe calculated using Leibnizian differential notation, the British clung to Newtonian fluxions, which were less practical and less fruitful.
Around 1810, a few innovative minds founded the Analytical Society at Cambridge. Babbage and his colleagues broke the intellectual isolation by importing methods from the continent. This called into question the very foundations of algebra: what did negative numbers and imaginary quantities truly mean? An answer emerged in symbolic algebra. It no longer derived its legitimacy from operations on numbers, but established formal laws applicable to any symbols whatsoever.
It was in this intellectual atmosphere that George Boole forged his work. Born in 1815 in Lincoln into modest circumstances, never a university graduate, he taught himself mathematics with astonishing rigor. Initially a schoolmaster, he became professor at Queen’s College Cork in 1849. A controversy between Augustus De Morgan and William Hamilton on quantification of the predicate prompted him to publish The Mathematical Analysis of Logic in 1847. His insight was to apply algebra to logical reasoning.
Boole transformed logic into a formal system with his algebraic notation. Literal symbols like x or y represented classes of objects, combinable through operators (+, ×). These operations followed precise rules: commutativity (xy = yx), distributivity (x(u + v) = xu + xv), and this singular property: x² = x. This last rule marked a break with ordinary numerical algebra, where only 0 and 1 satisfy it.
His system translated logical propositions into equations. Thus, "all X are Y" became x(1 − y) = 0. The use of the symbol 1 for the universe of discourse and 0 for the empty class constituted another discovery.
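Boole's rules are easy to verify once his symbols are read over the two values 0 and 1, with multiplication as "and" and 1 − x as negation. A minimal check, with names of our own choosing:

```python
# A verification of Boole's rules over the two values 0 and 1. The
# encoding of "all X are Y" as x * (1 - y) = 0 follows his notation.
from itertools import product

for x, y, u, v in product((0, 1), repeat=4):
    assert x * y == y * x                    # commutativity
    assert x * (u + v) == x * u + x * v      # distributivity
    assert x * x == x                        # the singular law x^2 = x

def all_x_are_y(x, y):
    """Nothing is in X without also being in Y."""
    return x * (1 - y) == 0

print(all_x_are_y(1, 1))   # True: this X is also a Y
print(all_x_are_y(1, 0))   # False: an X outside Y violates the equation
```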
Augustus De Morgan, Boole’s friend and contemporary, was developing his own logical theories in parallel. His Formal Logic appeared on the same day as Boole’s work in 1847. De Morgan introduced the notion of variable universe of discourse, breaking with the fixed Aristotelian universe. He also formulated the famous laws that bear his name: the negation of a conjunction equals the disjunction of negations, and vice versa.
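Both laws can be confirmed mechanically by trying every pair of truth values, which is all a modern machine needs to do:

```python
# An exhaustive check of De Morgan's two laws over Python booleans:
# not (p and q) == (not p) or (not q), and dually for "or".
from itertools import product

for p, q in product((False, True), repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))
print("both laws hold for every pair of truth values")
```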
Boolean notation suffered from limitations. Boole required that addition apply only to disjoint classes, complicating the expression of certain relations. His definition of subtraction required that the subtracted class be included in the initial one.
In subsequent decades, mathematicians refined this system. Charles Sanders Peirce made decisive contributions in the 1880s, notably demonstrating that all Boolean operations could be reduced to a single one: NAND (not-and) or NOR (not-or).
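Peirce's reduction is concrete enough to demonstrate in a few lines: define NAND, then rebuild NOT, AND, and OR from it alone (the construction from NOR is symmetrical). A minimal sketch:

```python
# Every Boolean operation reconstructed from NAND alone.
from itertools import product

def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

# Exhaustive check over both truth values.
for a, b in product((False, True), repeat=2):
    assert not_(a) == (not a)
    assert and_(a, b) == (a and b)
    assert or_(a, b) == (a or b)
print("NOT, AND and OR rebuilt from NAND alone")
```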
Modern notation took shape gradually. Bertrand Russell introduced the symbol ∨ (or) in 1906, while Arend Heyting proposed ∧ (and) in 1930. The expression “Boolean algebra” was first used by Henry Maurice Sheffer in 1913.
In 1936, Marshall Harvey Stone took a step toward abstraction by unifying earlier work under the concept of Boolean ring. He established the isomorphism between Boolean algebra and this structure, creating a fundamental theoretical bridge.
Practical applications of Boolean algebra exploded in the 20th century. In his master's thesis at MIT, written in 1937 and published in 1938, Claude Shannon demonstrated that these principles enabled the analysis and design of electrical switching circuits. This discovery created a link between mathematical logic and electronic design, founding modern computer science.
Today, Boolean algebra permeates all of computer science. It structures the design of logic circuits, formal verification of programs, and query optimization in databases. Every microprocessor relies on optimization techniques derived from this theory.
The history of Boolean algebra shows how an abstract theory, born from questions about logic, transforms an entire technological domain. It underscores the importance of mathematical notations in which an adapted symbolism reveals previously invisible relations. George Boole dreamed of a “calculus of thought” mechanizing logical reasoning. While this philosophical project did not succeed as such, his mathematical tools became the secret language of the machines that shape our world.
Submarine Cable
In the mid-19th century, crossing the Atlantic required at least ten days at sea. In 1840, Samuel Morse launched an idea that seemed like fantasy: connecting Europe to America with a telegraph cable at the bottom of the ocean. This almost delirious vision would transform our relationship with time and space.
A businessman named Cyrus W. Field seized on this ambition. After building his fortune in the paper industry, he took an interest in telegraphy during the 1850s. His first project, modest in scope, aimed to connect St. John’s, Newfoundland to New York, taking over from engineer Frederic Newton Gisborne’s work. Funding ran short. No matter, Field thought bigger and redirected everything toward a transatlantic link.
The Atlantic Telegraph Company was born in 1856 on the initiative of Field and engineers John Watkins Brett and Charles Tilston Bright, two Britons experienced in submarine telecommunications. London and Washington participated in the financing. The cable’s design sparked passionate discussions among specialists. Morse, supported by Michael Faraday, argued for a thin wire limiting signal delay. William Thomson, the future Lord Kelvin, defended a thick copper core reducing electrical resistance.
The Gutta Percha Company finally manufactured this revolutionary cable. Its structure ultimately consisted of seven twisted copper wires creating a 2.1-millimeter conductor, wrapped in three layers of gutta-percha, a resin extracted from an Indonesian tree. A protective layer of tarred hemp and a metal armor completed the assembly, for a final diameter of about 1.6 centimeters.
No ship could carry the necessary 3,200 kilometers of cable alone. The operation therefore mobilized two vessels: the British HMS Agamemnon and the American USS Niagara. Loading took three weeks, already attracting journalists' attention and public curiosity.
The first attempt in 1857 was a failure. The cable broke near the Irish coast. The following year, three attempts followed in succession. On July 29, 1858, the ships met in the middle of the ocean, connected their cable sections and departed in opposite directions. On August 5, the continents were finally linked between Bull’s Arm Bay in Newfoundland and Telegraph Field on the Irish island of Valentia.
On August 16, 1858, Queen Victoria and President James Buchanan exchanged messages. Transmitting the sovereign’s 98 words took nearly sixteen hours, a delay that represented a prodigious advance at the time.
Euphoria erupted on both sides of the ocean. New York organized a parade and fireworks so enthusiastic they partially set City Hall ablaze. Trinity Church celebrated the event in lower Manhattan. In London, Atlantic Telegraph Company shares doubled in value while Charles Tilston Bright received a knighthood.
Jeweler Tiffany & Co. sensed an opportunity and purchased the hundreds of kilometers of unused cable brought back by the USS Niagara. The company transformed them into souvenirs, ten-centimeter segments with brass ends and descriptive plates sold for fifty cents apiece. Other merchants followed with cufflinks, earrings and letter openers.
The magic was short-lived: within weeks, the cable stopped functioning and transmissions became illegible. Edward Orange Wildman Whitehouse, technical director at the Irish terminus, had applied excessive voltages of up to 2,000 volts, convinced that intensity needed to increase with distance. A 1985 analysis by historian Donard de Cogan would also reveal manufacturing problems, the off-center central conductor passing dangerously close to the outer metal armor. The combination of these defects with the imperfect insulation of the gutta-percha and material impurities sealed the cable's fate.
During its brief existence, this first link transmitted 732 messages. Among them, the British order to cancel the deployment of two regiments to India, the local rebellion having been brought under control. This decision saved between 50,000 and 60,000 pounds sterling, roughly one-seventh of the cable’s cost.
The technical failure provoked rumors and accusations of fraud. Cyrus W. Field, far from being discouraged and still supported by the British government, persevered. The Atlantic Telegraph Company finally installed a permanent link in 1866. This project, completed at a time when running water and electricity remained inaccessible to most people, testifies to an extraordinary vision.
This first electrical crossing of the Atlantic marks the history of communications. It proved the possibility of instantaneous intercontinental exchanges and heralded the architecture of the global network of submarine cables on which all of today’s Internet rests.
Herman Hollerith
In the 19th century, the United States was drowning in data. The Census Bureau struggled to keep pace with an exploding population. Herman Hollerith would create a machine that addressed the challenge of processing these mountains of forms.
In 1879, fresh out of Columbia School of Mines, he joined the Census Bureau. He met Dr. John Shaw Billings, who headed the vital statistics division, and who suggested the idea of mechanizing data processing. Indeed, the stakes became obvious when looking at the 1880 census. Fifty million Americans to count. Operators laboriously processing twenty characteristics per minute with rudimentary tools. The figures were not published until 1889, barely a year before the next census. The situation was untenable.
Hollerith then left the Bureau to follow General Francis Walker to MIT. Having become an instructor in mechanical engineering, he began developing his first prototypes. He drew inspiration from Colonel Charles W. Seaton’s work, who had created a machine using a roll-feed system comparable to that of a mechanical piano. But Hollerith would take a different direction.
After this academic interlude, he returned to Washington, joined the Patent Office, then established himself as an independent expert. This experience gave him the keys to protect his future inventions. He first worked on electric brakes for railroads, outperforming George Westinghouse during tests in 1887. But the railroads doubted electricity's reliability as a power source. No matter.
His data processing system relied on punched cards measuring 6.5 × 3.25 inches. The operator, reading census forms, could encode up to 17 different characteristics among 240 possible positions. Standard codes, such as those for districts, were preset and reproduced on entire stacks of cards.
The tabulating machine revealed Hollerith’s ingenuity. In the upper jaw, rows of spring-loaded pins; in the lower jaw, tiny cups of mercury. The pins, connected to an electrical current, faced cups linked to 40 counters. The employee inserted a card into the press, closed the jaws, and the pins passing through the holes plunged into the mercury. The current flowed to the electromagnetic counters, advancing each relevant needle. A complete rotation of the units needle activated the hundreds needle. The card was then placed in a compartment of the sorter before moving to the next one.
The numbers spoke for themselves: 700 punched cards per day per operator, 7,000 cards processed daily by a machine user, 250 characteristics analyzed per minute. Through relays, the system performed more sophisticated analyses, such as identifying “white men born abroad”.
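The logic of the machine, stripped of its pins and mercury, reduces to set membership and conjunction. The sketch below is a loose software analogy with invented field names, not a model of the actual hardware; the combined counter mirrors the relay wiring behind queries such as "white men born abroad":

```python
# Each card is the set of its punched positions; each counter tallies
# the cards that close its circuit.

cards = [
    {"male", "white", "born_abroad"},
    {"female", "white", "born_us"},
    {"male", "white", "born_abroad"},
    {"male", "black", "born_us"},
]

simple = {}                          # one counter per punch position
combined = {"white men born abroad": {"male", "white", "born_abroad"}}
tallies = {name: 0 for name in combined}

for card in cards:
    for position in card:            # a pin passes through each hole
        simple[position] = simple.get(position, 0) + 1
    for name, required in combined.items():
        if required <= card:         # all relays in series are closed
            tallies[name] += 1

print(simple["male"])                # 3
print(tallies)                       # {'white men born abroad': 2}
```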
The major trial by fire came with the 1890 census. The Bureau ordered fifty machines to count nearly 63 million Americans. Such success that the administration would use improved versions of the system for six decades, until the arrival of UNIVAC I.
Hollerith’s influence spread beyond the census. Statistical services, insurance companies, railroads—all wanted this technology that drastically reduced data processing times. In a rapidly changing industrial society, it addressed a critical need. Moreover, some technical choices he made had surprising longevity. His punched card format matched the dimensions of banknotes of the era, simply because filing cabinets adapted to them already existed. His positional coding system, designed for the 1901 agricultural census, would influence IBM and other companies through the 1970s.
The patents filed by Hollerith laid the foundations of the data processing industry. For sixty years, his machine bridged the gap between manual processing and electronic computing. It transformed how we manage information while initiating the techniques of the modern computer industry. An invention born from a very concrete problem.