Stéphane FOSSE

EPOCH

EPOCH © 2025 by Stéphane Fosse - This book is published under the terms of the CC BY-SA 4.0 license

Chapter 7
1980

Computing Enters the Home

The 1980s opened on a fractured world. The Cold War had lost none of its vigor, and the East-West opposition continued to structure international relations. Yet another battlefield was taking shape: the mastery of information. The United States and the USSR pursued their space and military duel while Japan, an emerging economic power, conquered market after market through its electronics industry.

Information had become a strategic issue. The breakup of AT&T in the United States in January 1984 transformed the telecommunications sector. This break from the monopolistic model stimulated innovation. On the European side, people began questioning the grip of the PTTs on communications. Winds of change were sweeping through the economic landscape.

The service sector, expanding rapidly in Western economies, demanded appropriate tools for offices. Typists put away their typewriters, replaced by word processing systems. Accountants abandoned their calculators for electronic spreadsheets. Productivity was the watchword in the face of increasingly fierce international competition.

The purchasing power of Western households reached unprecedented levels. Color televisions now sat in most living rooms. This familiarity with consumer electronics prepared the ground for the arrival of other devices, including game consoles, video recorders and, above all, microcomputers. Computing, previously confined to businesses, was inviting itself into private homes.

Schools did not escape this technological wave. Political leaders grasped the importance of training young people in these new tools. The French plan "Computers for All" launched in 1985 illustrated this awareness. Thousands of Thomson TO7 and MO5 computers invaded schools. Other countries followed similar paths, creating a substantial educational market.

In businesses, two worlds coexisted. Large centralized systems remained present, but saw their hegemony challenged by microcomputers. Users gained independence, much to the dismay of traditional IT departments. This decentralization raised unprecedented questions about compatibility between all these systems and data sharing. Standardization was the solution.

The financial world shifted into the electronic era. Stock exchanges abandoned manual quotation for computerized systems. Transactions accelerated, their volume exploded. However, this automation revealed its limits during "Black Monday" on October 19, 1987. Automated selling programs spiraled out of control, amplifying the ongoing crash. The incident revealed the fragility of overly autonomous systems.

Communication networks modernized at breakneck speed. Optical fiber progressively replaced copper. Telecommunications satellites multiplied in orbit. In companies, people no longer settled for isolated computers—they connected them. The first commercial electronic messaging systems appeared, harbingers of an interconnected world.

Popular culture seized upon the computing phenomenon. Hollywood portrayed hackers and artificial intelligences. Tron (1982) plunged its viewers inside a computer. WarGames (1983) evoked the risk of a nuclear war triggered by a computer error. Literature was not to be outdone with the emergence of cyberpunk, of which William Gibson and his Neuromancer (1984) were the leading figures.

Software publishing established itself as a major economic sector. The multiplication of platforms created a fragmented market where each machine had its own ecosystem. Publishers competed ingeniously to win over the general public. Ergonomics became a decisive selling point. In the shadow of this commercial effervescence, the first computer viruses made their appearance, heralds of future digital battles.

A new relationship with technology developed. Computing was no longer the preserve of specialists in white coats. Amateur clubs flourished, magazines like Hebdogiciel and Byte disseminated know-how and programs to type in yourself. This democratization was accompanied by a particular philosophy when Richard Stallman launched the GNU project in 1983, the first stone in a free software movement that challenged the grip of commercial publishers.

Silicon Valley became the symbol of a new generation of entrepreneurs. Young people like Steve Jobs, Bill Gates, and Michael Dell disrupted established giants. The image of the computing genius working in a garage before conquering the world became embedded in the collective imagination. Investors flocked in, sensing the sector’s growth potential.

States gradually became aware of the sovereignty issues linked to computing. Controlling digital technologies meant ensuring independence. Mastery of electronic components, operating systems, and networks was a strategic objective. Export restrictions on sensitive equipment to the Soviet bloc testified to this geopolitical dimension.

This decade initiated a transformation of society. The computer was no longer merely a tool for scientific calculation or management; it had become a means of creation, communication, and entertainment. Its grip extended to all domains of human activity.

The end of the 1980s revealed a glimpse of a world where information would circulate without barriers. Technical innovations responded to the social aspirations of the era, between the desire for individual autonomy, the need for increased exchanges, and a thirst for modernity. Personal computing fit into this dynamic, translating into silicon the dreams of a generation.


Apple III

In May 1980, at the National Computer Conference in Anaheim, Apple unveiled its new machine with unusual fanfare. The company rented out Disneyland for five hours, spent $42,000, and chartered British double-decker buses to transport 7,000 conference participants. This grandiose staging announced the Apple III, Apple’s first computer aimed at professionals.

Times seemed promising. By late 1978, the Apple II was selling beyond all expectations. Silicon Valley engineers competed for positions at Apple, some accepting pay cuts in exchange for stock options. Richard Jordan, who left Hewlett-Packard during the summer of 1978, recounts this period of collective euphoria. With each stock split, teams felt invincible. Failure seemed impossible to them.

The project began in this exhilarating context. The Apple III featured a Synertek 6502A 8-bit microprocessor running at 2 MHz, twice the speed of the Apple II. Supporting up to 128 KB of RAM, it included a keyboard with integrated numeric keypad and a Shugart 5.25-inch floppy drive with 143 KB capacity. Four internal expansion slots remained compatible with Apple II cards, complemented by two serial ports. The machine could run in Apple II emulation mode, but revealed its true nature with its proprietary operating system SOS (Sophisticated Operating System), which insiders nicknamed “applesauce.” An integrated real-time clock completed the equipment, while the display handled 24 lines of 80 columns in text mode and 560 by 192 pixels in monochrome graphics.

Steve Jobs led the project and imposed his vision. No fan: the aluminum chassis would suffice to dissipate heat. The size and shape of the case were decided without consulting the engineers, who then had to cram components into a space that was too small with inadequate ventilation. Feature creep worsened the situation: marketing, engineering, industrial design, manufacturing—each department added its requirements. The product swelled beyond its initial design.

Jerry Manock and Dean Hovey, the industrial designers, conceived the chassis before the motherboard was finished. Manock anticipated strict FCC standards on electromagnetic interference and chose a rather solid aluminum chassis. He contracted with Doehler-Jarvis, an automotive parts manufacturer in Toledo. The external aesthetics—45-degree bevels, tilted keyboard, brown color—were meant to create a “house identity” for future Apple products.

Delays threatened the initial public offering scheduled for December 1980. Managers ignored engineers’ warnings and launched production. By November, during the first deliveries, the nightmare began. In February 1981, Apple abandoned the real-time clock: the National Semiconductor chip failed to deliver on its promises. The price dropped to $4,190, with a $50 rebate for early customers.

March 1981 marked the beginning of volume deliveries and the full extent of the disaster. Twenty percent of machines arrived dead at customers’ sites, with chips having dislodged during transport. Those that initially worked soon failed: heat caused components to expand and pop out of their sockets. Apple then recommended a solution that would become legendary: lift the front of the computer fifteen centimeters and let it drop to reseat the chips.

Other defects accumulated. Faulty connectors, case screws that pierced internal cables, a motherboard so dense it caused short circuits. The rush also affected software: programmers discovered the machine only nine weeks before shipment. Manuals were proofread the day they were sent to the printer, allowing so many errors through that an addendum became necessary.

Apple attempted a rescue in November 1981 with a revised version at $3,495. The company maintained that problems stemmed from manufacturing and quality control, not design. Yet the new version incorporated different chip sockets, updated software, memory expandable to 256 KB, and offered an optional 5 MB hard drive. Of the 7,200 original Apple IIIs, 2,000 were replaced free of charge.

Sales remained disappointing. Analysts estimated that Apple sold 3,000 to 3,500 units monthly, one-tenth of Apple II sales. In December 1983, InfoCorp counted 75,000 installed units versus 1.3 million Apple IIs. The poor reputation and lack of software exploiting SOS deterred buyers.

A final stand came in December 1983 with the Apple III Plus at $2,995. This version included 256 KB of RAM as standard, a functional clock, a new motherboard, SOS version 1.3, improved ports with DB-25 connectors, and a case facilitating card installation. Too late: on April 24, 1984, Apple discontinued development. After $60 million in losses, the line disappeared definitively from the catalog in September 1985.

Ironically, this failure did not diminish Apple’s success. On December 12, 1980, the initial public offering broke all records. Shares issued at $22 climbed to $29 and sold out within minutes. At the close of the first day, Apple was worth $1.2 billion, achieving the largest IPO since Ford in 1956. Jobs’s 15% stake would soon exceed $250 million.

The Apple III experience shattered the myth of invincibility born from the Apple II’s success. This setback forced a restructuring. In January 1981, president Mike Scott divided development into three groups: personal desktop systems (Apple II/III), peripherals (drives, printers, modems), and professional systems (Lisa division). The R&D budget tripled to $21 million. The tensions surrounding this reorganization cost Scott his position; Mike Markkula replaced him, while Jobs took the chairman role.

IBM’s entry into the personal computer market in 1981 would paradoxically serve Apple. The giant legitimized this emerging sector. Apple acknowledged this with humor in a full-page advertisement in the Wall Street Journal: “Welcome IBM. Seriously.”


Intel 8086/8088

When Intel decided to take the leap into 16-bit processors in the late 1970s, the company had no idea it would become a technological empire. The 8086, launched in 1978, entered a market dominated by Intel’s 8-bit chips, the 8080 and 8085. This transition to 16 bits represented a considerable technical challenge, but Intel adopted a bold strategy: maintain compatibility with existing systems while multiplying performance.

The new processor pushed the boundaries of its time. Where its predecessors maxed out at 64 kilobytes of memory, the 8086 addressed up to 1 megabyte through its 20-bit address bus. Its 16-bit data bus processed double words in a single cycle. Intel designed a dual-unit architecture: the bus interface unit handled external exchanges while the execution unit focused on calculations. This division of labor enabled pipeline operation, where the processor could begin processing the next instruction before completing the previous one.

Manufactured in HMOS technology with its 29,000 transistors, the 8086 ran at 5 MHz in its base version. Intel later pushed the frequency to 8 and then 10 MHz. Its fourteen 16-bit registers were cleverly distributed among data, pointers, flags, and segments. The latter represented the most notable innovation: they divided memory into distinct zones and facilitated program organization.
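To make the segment mechanism concrete: in real mode, a 16-bit segment and a 16-bit offset combine into a 20-bit physical address by shifting the segment left four bits and adding the offset. A minimal sketch in Python, with arbitrary example values:

```python
# Illustrative sketch of 8086 real-mode addressing: physical = segment * 16 + offset,
# kept within the 20-bit (1 MB) address space. The example values are arbitrary.
def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # 20-bit mask

print(hex(physical_address(0x1234, 0x0010)))  # 0x12350
print(hex(physical_address(0x1235, 0x0000)))  # 0x12350 -- another pair, same byte
```

The second call shows a side effect of the scheme: many different segment:offset pairs designate the same physical byte.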

But here’s the catch. Migrating from 8-bit to 16-bit was expensive for manufacturers. They had to redesign motherboards, adapt all peripheral components, and revise machine architectures. Faced with market resistance, Intel released a stripped-down version in 1979: the 8088. This chip retained the 8086’s internal 16-bit architecture but reduced its external data bus to 8 bits.

The compromise seemed awkward at first glance. The 8088’s 16-bit transfers required two cycles instead of one, limiting performance. Yet this technical limitation became a commercial advantage because manufacturers could reuse their existing 8-bit components: controllers, memory, support circuits. Development costs dropped dramatically.

IBM seized the opportunity in 1981. For its first personal computer, the company sought a fast and economical solution. The 8088 was a natural choice. It worked perfectly with the 8288 bus controller, the 8259A interrupt manager, and the entire range of proven circuits. Its 4-byte prefetch queue partially compensated for the bottleneck of the reduced bus.

As the market standard, the IBM PC became a massive success within a few years. Microsoft developed MS-DOS specifically for this architecture, creating a software ecosystem that locked out the competition. IBM clones proliferated, all equipped with the 8088. Intel’s chip became the best-selling processor, far surpassing its more powerful sibling, the 8086.

This commercial success masked the underlying technical shift. The 8088/8086 architecture invented concepts that would span decades. Memory segmentation would survive in all subsequent x86 generations. The extended instruction set, with its string operations and hardware multiplication, would serve as the foundation for future extensions. The 80286’s protected mode would build on these foundations.

Planned from the outset to coexist, the two models shared the same instruction set and the same programming model; only the external bus differed. This compatibility became a true trademark of the x86 family: each new processor would execute software from its predecessors while adding its own innovations.

For sophisticated calculations, Intel offered the 8087 math coprocessor. This optional circuit connected directly to the main processor and handled floating-point operations. Its specialized instruction set would later be natively integrated into chips starting with the 80486.

The 8086/8088’s modular architecture also enabled multiprocessing. In maximum mode, multiple chips shared the system bus via the 8288 controller. This possibility, ignored in consumer PCs, found its place in industry and servers.

Production spanned more than ten years. Superseded by more powerful processors, the 8086 and 8088 still equipped embedded systems and industrial controllers for a long time. Their simplicity and energy efficiency suited these unobtrusive applications perfectly.

Forty years later, their influence endures. By establishing x86 as the de facto standard for personal computers, they charted a technological path that continues today. The need to preserve compatibility with their instruction set still weighs on contemporary processors.

This backward compatibility, initially perceived as a burden, became an ultimate weapon. It guaranteed developers and users that their programs would survive hardware evolution. This stability fueled the expansion of the IBM-compatible PC market.

The triumph of the 8088 in the IBM PC illustrates an eternal truth of the technology industry: the theoretically optimal solution doesn’t always win. Economic pragmatism and practical constraints often dictate choices. Intel understood this and built its strategy on this lesson.


Smalltalk-80

By the late 1970s, Xerox PARC already housed several versions of Smalltalk, but these creations remained confined within the laboratory walls. They only ran on Xerox’s specialized hardware: the Alto and Dorado, expensive machines inaccessible to ordinary computer scientists. This frustrating situation prompted Adele Goldberg to take charge of an ambitious project: bringing Smalltalk out of its gilded cage so it could flourish on other architectures.

The undertaking proved delicate. First, they needed to formalize with surgical precision the workings of a Smalltalk “virtual machine” that would serve as a universal foundation, much like Java’s would many years later. This technical abstraction concealed a novel concept: everything is an object in this new universe. Numbers, characters, and classes behave as autonomous entities capable of receiving and sending messages. This radical uniformity disrupted the habits of programmers accustomed to carefully separating data and procedures.

In this unified world, inheritance weaves invisible links between classes. A hierarchy emerges naturally, where each descendant enriches or specializes the behaviors of its ancestors. Class methods orchestrate shared behaviors while instance methods give each object its own personality. This architecture fostered a form of code reuse previously unknown.

The development environment broke with the traditional cycle that required editing, compiling, then executing. Smalltalk-80 offered a continuous experience where programmers modified their code while the program was running. This fluidity transformed the act of programming into a direct conversation with the machine. The class browser and debugger, themselves written in Smalltalk, were living tools that users shaped according to their needs.

The graphical interface introduced innovative concepts for the time: windows overlapped, menus appeared with right-clicks, objects were manipulated directly on screen. These innovations established a visual language that still influences our daily interactions with computers. Steve Jobs grasped this immediately when he visited PARC in 1979; he understood the potential of these ideas for the general public.

Under the hood, automatic memory management finally freed programmers from a tedious technical chore. The garbage collector tracked abandoned objects to recycle their memory. Peter Deutsch and Allan Schiffman refined these mechanisms until they achieved acceptable performance on standard hardware, proving that conceptual elegance could coexist with practical efficiency.

Implementation across different platforms demanded engineering prowess. The PARC team designed a virtual machine with its own instruction set, leaving integrators to translate these abstract commands into native code. This strategy guaranteed portability while preserving the possibility of specific optimizations. Method caches accelerated searching through the class hierarchy: when an object received a message, the system memorized the association between the selector and the corresponding method for subsequent calls.
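As a rough analogy for this caching idea (in Python rather than Smalltalk, with invented class names), the sketch below walks the class hierarchy on the first lookup of a selector and reuses the memorized method on later lookups:

```python
# Hypothetical illustration of a method cache: the first (class, selector) lookup
# walks the ancestors; subsequent lookups reuse the stored association.
method_cache = {}  # (class, selector) -> function

def lookup(cls, selector):
    key = (cls, selector)
    if key in method_cache:                 # cache hit: skip the hierarchy walk
        return method_cache[key]
    for klass in cls.__mro__:               # ancestors, most specific first
        if selector in vars(klass):
            method_cache[key] = vars(klass)[selector]
            return method_cache[key]
    raise AttributeError(f"{cls.__name__} does not understand {selector!r}")

class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side * self.side

s = Square(3)
print(lookup(type(s), "area")(s))           # 9; the next lookup of "area" hits the cache
```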

The management of execution contexts revealed another facet of the object approach. These contexts, objects manipulable by the program, would enable sophisticated mechanisms like exception handling and reflection to emerge. This conceptual uniformity pushed object logic into the most intimate workings of the system.

The publication of Smalltalk-80 in a special issue of Byte magazine in August 1981, followed by the books of Adele Goldberg and David Robson, disseminated these ideas well beyond academic circles. Objective-C, Ruby, Python, and many others drew directly from this source. The influence extended beyond the realm of languages: modern development environments retain traces of this vision where everything is accessible and modifiable in real time.

The Smalltalk-80 interface spread into the commercial world thanks to Jobs’s insight. The Lisa and Macintosh popularized these concepts with the general public, creating a de facto standard that endures. The optimization techniques developed for Smalltalk-80 are now standard practices in implementing object-oriented languages.

The exhaustive documentation of the system, including virtual machine specifications and implementation details, constituted an invaluable gift to the computing community. Other teams would thus reinvent the system according to their needs, accelerating the propagation of these pioneering ideas.


UDP

In 1980, David P. Reed was working on a seemingly simple problem: how to transmit data between computers without all the complexity that TCP already imposed? The Internet architecture was taking shape around the TCP/IP suite, but certain applications called for something more direct, faster. Reed then conceived UDP, the User Datagram Protocol, which would become one of the most enduring protocols on the Internet.

Where TCP built solid connections, verified each packet, and guaranteed the order of data arrival, UDP deliberately chose the opposite. No persistent connection, no delivery guarantees, no packet ordering control. This minimalist approach is reflected in RFC 768, which defines the protocol: three short pages suffice where other specifications stretch across hundreds of pages.

This brevity was not negligence. Reed and his colleagues understood that the Internet would need two types of tools: sophisticated protocols like TCP for critical data, and lightweight mechanisms like UDP for other uses. The first service to truly exploit this philosophy was the Internet Name Service, the ancestor of DNS. When a computer requests the IP address of a website, it doesn’t need to establish a complex connection; it sends its query and waits for the response. If the response is lost, it asks again. Simple, straightforward.
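A minimal sketch of that query-and-retry pattern, in Python with a placeholder server address and payload, might look like this:

```python
# Hedged sketch of the UDP "send, wait, ask again" pattern; the server address,
# port, and payload are placeholders, not a real service.
import socket

def query(payload: bytes, server=("192.0.2.53", 9999), retries=3, timeout=2.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # no connection to set up
    sock.settimeout(timeout)
    try:
        for _ in range(retries):
            sock.sendto(payload, server)      # fire the datagram
            try:
                data, _ = sock.recvfrom(512)  # wait briefly for an answer
                return data
            except socket.timeout:
                continue                      # lost? simply ask again
        return None                           # give up after a few attempts
    finally:
        sock.close()
```

All the reliability logic lives in the application, exactly as the protocol intends.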

DNS still uses this approach, and for good reason—a server handles thousands of queries per second without maintaining state for each client. This resource economy proves invaluable when you consider that a root DNS server can receive millions of queries per hour.

The history of UDP took an unexpected turn with the emergence of multimedia communications. When Voice over IP took its first steps in the 1990s, developers discovered that UDP perfectly matched their needs. In a phone conversation, it’s better to lose a few milliseconds of audio than to wait for a missing packet to be retransmitted. User experience takes precedence over technical perfection. This logic naturally extended to videoconferencing, then to online gaming. In an online shooter game, if the information that a player has fired arrives late, it’s useless. Developers handle these temporal issues themselves with UDP, according to their application’s specifics.

UDP’s elegance lies in this flexibility. The protocol doesn’t dictate how to handle errors or packet ordering; it leaves that responsibility to applications. Some games retransmit critical information multiple times to ensure reception. Others simply ignore lost packets and focus on the most recent data.

This approach has inspired new innovations. DCCP offers the same kind of lightweight datagrams with congestion control built in, to avoid overloading the network, and DTLS brings cryptographic security to UDP traffic. These protocols show that the datagram model serves as a stable foundation for building specialized solutions.

Performance measurements confirm Reed’s initial intuitions. A study conducted between 2004 and 2014 on a transpacific link between Japan and the United States revealed variations in average IP packet size. With its 8-byte header and no acknowledgment traffic, UDP generates less overhead than TCP and contributes to more efficient bandwidth usage.

The emergence of sensor networks has given UDP new life. These devices, often battery-powered, periodically transmit temperature, humidity, or position measurements. They don’t need TCP’s robustness; they prefer to conserve their energy by simply transmitting their data without establishing a complex connection.

Cloud computing and microservices have rediscovered UDP’s virtues. When an application decomposed into dozens of services must communicate quickly, UDP offers minimal latency. Netflix thus uses UDP for certain internal communications, Spotify for synchronizing playlists between devices.

The evolution toward the UDP-Lite variant demonstrates this adaptability: by letting the checksum cover only part of each datagram, it tolerates errors in the payload, which is useful in wireless communications where a few corrupted bits don’t justify rejecting an entire packet.

More than four decades after its creation, UDP continues to thrive on the Internet. Its longevity stems from a simple principle: rather than trying to solve all problems, it solves a few of them perfectly well. This specialization has allowed it to endure through the ages, from early university networks to modern data centers. This constancy makes it one of the discreet but indispensable pillars of the Internet.

Commodore VIC-20

In 1980 near London, Jack Tramiel called his teams together for a meeting that would change everything. The head of Commodore expressed a concern that haunted him: “The Japanese are coming, so we’re going to become the Japanese”. This pithy statement summed up the entire philosophy that would govern the birth of the VIC-20.

Tramiel saw far ahead. His Commodore empire certainly ruled in Europe, but in the United States, Apple and Radio Shack maintained an edge. More worryingly, Texas Instruments was eating away at its positions in the calculator market. The executive feared above all that Japanese manufacturers would arrive with subsidized machines that would crush the competition. His response? Create a color personal computer accessible to the general public, a machine that would complement the already established PET range.

This bold vision rested on a technical component that had been forgotten in MOS Technology’s drawers for two years. In 1978, Alan Charpentier had designed the VIC (Video Interface Chip) to attract arcade cabinet manufacturers. The reception had been cold, and the chip had been dormant ever since. But now it finally found its calling in this mass-market computer project.

Mike Tomczyk arrived in April 1980 as Tramiel’s assistant. In a few weeks, this energetic man came to embody the soul of the VIC-20 project, to the point of earning the nickname “VIC Czar”. Visits to Germany and Japan within a single month laid the foundations of his plan to restructure marketing.

Meanwhile, two teams competed in ingenuity. Robert Yannes, who would later create the famous SID sound chip, cobbled together a first prototype at MOS Technology from PET components and a calculator case. His vision leaned more toward a sophisticated game console. For their part, Bill Seiler and John Feagans assembled another prototype by mixing different elements. They added a 9-pin joystick port and a cartridge connector inspired by the Atari 2600, but above all insisted on integrating BASIC. Their conviction: a computer must allow programming, not just playing.

The final prototype married these two visions. The resulting machine delivered on its technical promises: a MOS 6502A processor running at about 1 MHz, 5 KB of RAM with 3.5 KB usable, cassette and disk connectors, a joystick port, a versatile user port, and four mono audio channels generated by the VIC chip. The whole thing fit into a compact case with clean lines.

Commodore first tested its mettle in Japan. In October 1980, the VIC-1001 was released at 69,800 yen. This launch let the company fine-tune production while gauging the reaction of Japanese manufacturers, and the strategy proved worthwhile. In May 1981, the VIC-20 arrived in the United States at $299.95, and in the United Kingdom in September at £199.99.

Tomczyk then deployed his marketing strategy. He trademarked the phrase “the friendly computer” and bypassed specialized retailers to invest in big-box stores. This approach deliberately placed the VIC-20 in competition with game consoles, highlighting its personal computer capabilities. The documentation reflected this philosophy: the manual favored simplicity and left technical subtleties to the Programmer’s Reference Guide.

William Shatner, the unforgettable Captain Kirk of Star Trek, lent his charismatic face to the advertising campaign. His television appearances made an impression and gave the VIC-20 unexpected notoriety. The results followed. In 1982, Commodore factories produced up to 9,000 units per day to meet exploding demand. Revenue reached $300 million.

The software ecosystem expanded at breakneck speed. Games naturally dominated, with notable creations like Jelly Monsters, an unauthorized but successful adaptation of Pac-Man, or Sword of Fargoal which cleverly exploited the machine’s capabilities. Jeff Minter, future indie game guru, made his first works on the VIC-20 with Matrix and Laser Zone. The VICModem, the first modem sold under $100, opened the doors to the CompuServe network via the Commodore Information Network. Cartridges of 3, 8, or 16 KB extended memory up to 32 KB and multiplied possibilities.

The success exceeded all expectations. The VIC-20 was the first personal computer to break the one million units sold mark. This historic performance testified to the accuracy of Tramiel’s vision: the home computer could appeal far beyond the circle of enthusiasts.

Yet this triumph carried the seeds of its own end. In August 1982, Commodore launched the Commodore 64, a more powerful machine that cannibalized sales of its little brother. VIC-20 production stopped at the end of 1984, and the last units disappeared from shelves in early 1985. About 2.5 million units would find buyers in four years of commercial existence, a remarkable performance but eclipsed by the longevity of the C64 or ZX Spectrum.

This premature disappearance in no way diminished the VIC-20’s legacy. The machine financed the development of the Commodore 64, prepared the acquisition of the Amiga, and established the codes of home computing in the 1980s. Its design had a lasting influence on the industry: peripheral compatibility, importance of ergonomics, deliberate democratization. An entire generation of programmers cut their teeth on this machine, particularly in the United Kingdom where it catalyzed creativity.

A passionate community keeps the flame alive and new programs regularly see the light of day, such as Astro Nell or Game Theory, which reveal unsuspected capabilities despite hardware constraints. These recent creations testify to the soundness of the initial concept: a simple and accessible computer that liberates creativity rather than hindering it.


WordPerfect

In the summer of 1979, Alan Ashton, a computer science professor at Brigham Young University, decided to devote his vacation to a project close to his heart: creating software that would display text exactly as it would appear on paper. At the time, existing systems worked with unsightly formatting codes that cluttered the display. In his laboratory in Orem, Utah, Ashton laid the foundations for what would become one of the most resounding successes in personal computing.

Bruce Bastian, a former university orchestra conductor who had switched to computer science, brought his technical expertise to the project. Together, they developed SSI*WP, the acronym for Satellite Software International Word Processor. The name choice revealed their ambitions: they wanted to transcend the traditional geographical boundaries of software.

Marketing began in 1980 with a price that gave one pause: $5,500 for a license intended for Data General computers. The market reacted cautiously. Sales remained modest but sufficient for the small team to maintain control of its destiny. Don Owens joined the venture as sales manager and began building a network of resellers. Pete Peterson, a former accountant, took charge of financial matters in 1981. This founding team would split over a question: should they accept external investment to accelerate growth?

Owens argued for fundraising that would provide the means for rapid expansion. Ashton and Bastian preferred a more cautious approach, even if it meant progressing more slowly. These incompatible visions created tensions that erupted in 1982 with Owens’ departure. In retrospect, this choice of financial independence would prove decisive for what followed.

The arrival of the IBM PC changed everything. This machine democratized personal computing and opened unprecedented possibilities. WordPerfect released its first PC version at the end of 1982. The software stood out from the competition through its remarkable stability and polished ergonomics. Crashes were rare, features numerous. The integrated spell checker caused a sensation, as did the native footnote management.

Version 4.2, launched in 1986, propelled WordPerfect to unexpected heights. More than half of word processing users now chose this solution. This success owed nothing to chance. The company, renamed WordPerfect Corporation, revolutionized industry standards by offering free telephone technical support. This commercial innovation upended established practices: until then, obtaining help was expensive and time-consuming.

WordPerfect’s user interface cultivated its distinctiveness. Function keys replaced the pull-down menus already featured in other software. A small cardboard template, placed on the keyboard, served as a reminder of available shortcuts. This approach was initially disconcerting but quickly revealed its advantages. Thanks to the “Reveal Codes” function, which displayed all hidden formatting codes in the document, the efficiency of experienced users doubled. Lawyers and notaries became enamored with this transparency that guaranteed them complete control over the formatting of their documents.

The company’s growth defied comprehension. From 11 employees in 1981, it grew to more than 4,500 in 1992. Revenue reached $533 million in 1991. This meteoric expansion generated internal debates about organization. Peterson advocated for a flat structure with few hierarchical levels, while other executives called for more traditional management. These disagreements led to Peterson’s departure in 1992, depriving the company of a coherent organizational vision.

Microsoft Windows 3.0 arrived in 1990. This modern graphical interface won users over with its user-friendliness. WordPerfect then made a major strategic mistake: the company was slow to adapt its software to this environment, remaining attached to MS-DOS text mode. Microsoft Word for Windows took advantage of this delay to establish itself. Integrated into the Office suite, it benefited from visual and functional consistency that WordPerfect struggled to match.

When WordPerfect for Windows finally came out in 1991, the damage was done. The software suffered from performance and usability problems. Users, seduced by the modernity of the graphical interface, switched en masse to Microsoft’s solution. This transition marked the beginning of WordPerfect’s decline.

Novell purchased WordPerfect Corporation in 1994 for $1.4 billion. This pharaonic acquisition turned into a disaster. The networking giant understood nothing about the word processing market. Sales collapsed in the face of Microsoft Office’s growing hegemony. Two years later, Corel acquired WordPerfect for only $158 million, nearly ten times less than the initial purchase price.

This spectacular fall illustrates the brutality of reversals in the software industry. WordPerfect had dominated its market through technical excellence and impeccable customer service. But the company failed to anticipate the importance of the graphical interface. It underestimated Microsoft’s firepower, which controlled both the operating system and could impose its office suite as the de facto standard.

WordPerfect retains a loyal user base, concentrated in the legal world. Legal professionals still appreciate its precision in formatting complex documents. The “Reveal Codes” function is unique and alone justifies the loyalty of certain users.

With WordPerfect, the company established free customer support as an industry standard. Its innovations in WYSIWYG display and spell checking inspired all its successors. In computing, technical leadership is not enough: one must also know how to adapt to technological disruptions and understand users’ changing expectations. The software industry is merciless. A company can dominate a market for years and then collapse in a few months if it misses a technological turn.


ICMP

In 1981, engineers working on ARPANET faced a recurring problem: how to inform a sender that their packet had been lost in transit. The IP protocol, in its initial design, provided no mechanism for signaling errors. Packets would disappear into the maze of the network without a trace, leaving the sender in the dark about the fate of their data.

Jon Postel and Steve Crocker observed this glaring gap from UCLA as ARPANET expanded. The absence of feedback paralyzed fault diagnosis. An overloaded router would drop a packet, but the sender remained unaware. A destination was unreachable, but there was no way to know. This situation led to the creation of ICMP, the Internet Control Message Protocol.

RFC 792 formalized this solution by defining two major families of messages: those signaling errors and those querying network status. The first category informs the sender of problems encountered: destination unreachable, packet time-to-live expired, incorrect parameters. The second family serves to probe the network, verify that a host responds, or measure packet round-trip time.

ICMP’s technical architecture is particularly surprising. Rather than positioning itself above IP as a higher-level protocol, ICMP integrates directly into it. Control messages travel in ordinary IP datagrams, taking the same paths as the data they concern. This approach ensures that error messages follow an identical route to the failed packets.

Mike Muuss revolutionized ICMP usage in 1983 with his ping utility. This small application exploits “echo request” and “echo reply” messages to create the Internet’s most popular connectivity test. The principle appeals through its simplicity: send a message to a machine and wait for its response. If it responds, the connection works. Otherwise, a problem exists somewhere along the path.
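To show what ping actually puts on the wire, the following sketch builds an ICMP echo request (type 8, code 0) with the standard Internet checksum. Sending it would require a raw socket and elevated privileges; the identifier and payload chosen here are arbitrary.

```python
# Construct an ICMP "echo request" the way ping does; only the bytes are built here.
import struct

def internet_checksum(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"                              # pad to an even length
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    total = (total >> 16) + (total & 0xFFFF)         # fold the carries
    total += total >> 16
    return ~total & 0xFFFF

def echo_request(identifier=0x1234, sequence=1, payload=b"ping"):
    header = struct.pack("!BBHHH", 8, 0, 0, identifier, sequence)  # checksum = 0 at first
    checksum = internet_checksum(header + payload)
    return struct.pack("!BBHHH", 8, 0, checksum, identifier, sequence) + payload

print(echo_request().hex())
```

The receiving host answers with an echo reply (type 0) carrying the same identifier, sequence number, and payload.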

The success of ping inspired other tools. Traceroute cleverly hijacks “time exceeded” messages to map network routes. By sending packets with progressively increasing time-to-live values, the tool forces each intermediate router to return an error message, thereby revealing its identity and position along the path.
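The same hijacking can be sketched in a few lines of Python: raise the time-to-live hop by hop, send a throwaway UDP probe, and listen on a raw ICMP socket for the router’s answer. This is only a sketch of the classic technique; it needs root privileges, and the port and hop limit below are conventional values rather than anything fixed by a standard.

```python
# Minimal traceroute-style probe: each TTL forces one router to reveal itself.
import socket

def traceroute(dest, max_hops=30, port=33434, timeout=2.0):
    dest_addr = socket.gethostbyname(dest)
    for ttl in range(1, max_hops + 1):
        recv = socket.socket(socket.AF_INET, socket.SOCK_RAW,
                             socket.getprotobyname("icmp"))
        send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM,
                             socket.getprotobyname("udp"))
        recv.settimeout(timeout)
        recv.bind(("", port))
        send.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, ttl)  # expire after ttl hops
        send.sendto(b"", (dest_addr, port))
        hop = "*"
        try:
            _, addr = recv.recvfrom(512)     # "time exceeded" (or "port unreachable")
            hop = addr[0]
        except socket.timeout:
            pass
        finally:
            send.close()
            recv.close()
        print(f"{ttl:2d}  {hop}")
        if hop == dest_addr:                 # reached the destination itself
            break

traceroute("example.org")                    # requires root for the raw ICMP socket
```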

Path MTU discovery illustrates ICMP’s adaptability to emerging needs. This technique uses “fragmentation needed” messages to determine the maximum size of packets traversing a network path without fragmentation. The process significantly improves performance by avoiding the division of packets into smaller fragments.

But ICMP also attracts the attention of hackers. “Ping of death” attacks exploit vulnerabilities in the handling of oversized ICMP packets, causing vulnerable systems to crash. Other attacks use ICMP to flood networks with control messages. These malicious exploits push many administrators to filter or block certain ICMP messages, sometimes creating undesirable side effects on normal network operations.

The arrival of IPv6 redefines ICMP’s role. The new version, ICMPv6, absorbs functions previously handled by other protocols. Neighbor discovery, for example, replaces ARP and integrates directly into ICMPv6. This consolidation simplifies network architecture while strengthening ICMP’s importance in modern infrastructures.

The ICMP protocol influences the design of sophisticated network monitoring tools that analyze control messages to detect anomalies and measure performance. Network administrators learn to interpret these signals to diagnose failures and anticipate problems.

This transformation of a simple control protocol into a cornerstone of Internet infrastructure testifies to its judicious design. Its flexibility has allowed it to adapt to the changing needs of computer networks for over forty years. Today, every ping launched on the Internet perpetuates the legacy of those early ARPANET engineers who simply wanted to know why their packets were getting lost.


Acorn BBC Micro

In 1980, the BBC was grappling with questions about the United Kingdom’s digital future. Officials at the British public broadcaster watched with concern as American companies dominated the microcomputer market. Available machines either cost a fortune or offered laughably limited capabilities. Faced with this situation, the BBC decided to take matters into its own hands and launched an ambitious project: to create a personal computer that would accompany its educational programming.

The specifications established by the BBC reflected this educational ambition. The machine would need to incorporate the BASIC language, feature a proper keyboard worthy of the name, display color graphics, and accept extensions such as teletext. Above all, its price would need to remain within reach of schools and motivated individuals. A considerable technical and commercial challenge.

The tender issued in late 1980 attracted several candidates. Among them was Acorn Computers, a small Cambridge company working on a prototype called Proton. This machine impressed the BBC evaluators, who favored it over the Sinclair ZX-81, still in gestation. Acorn won the contract and officially renamed its prototype the “BBC Microcomputer”.

The collaboration between Acorn’s engineers and the BBC’s teams proved fruitful. The initial prototype underwent numerous improvements that far exceeded the original specifications. ICL and Cleartone handled manufacturing. The architecture was based on the 6502A processor clocked at 2 MHz, a reliable and proven chip. Two specialized integrated circuits completed the package: one handled graphics, the other serial interfaces.

Two versions were released simultaneously in early 1982. Model A sold for £235 with 16 KB of RAM, while Model B cost £335 for 32 KB. The latter featured more connectivity: serial and parallel interfaces, an 8-bit user port, analog inputs, and an expansion bus. One technical innovation deserves mention: the “Tube” interface, which allowed connection of a second processor.

The integrated BASIC was surprisingly rich. Similar in philosophy to Microsoft BASIC, it distinguished itself through numerous specific extensions. The operating system and BASIC occupied 32 KB of ROM, a generous allocation. The professional keyboard featured 64 keys arranged in the QWERTY standard, complemented by 10 programmable function keys.

Graphics performance constituted a major asset. Eight different modes allowed mixing of text and high-resolution graphics as needed. Mode 7, teletext-compatible, produced color graphics while consuming only one kilobyte of memory. A sound chip generated three simultaneous notes and various effects, with sophisticated software control of sound envelopes.

Extensibility set the BBC Micro apart from its competitors. Interfaces for floppy disk drives and Econet networking were integrated directly on the motherboard. A speech synthesis option, developed with Richard Baker, the BBC’s star presenter, could pronounce approximately 150 pre-programmed words. Prestel and teletext adapters, planned for spring 1982, opened the way for program downloads.

The expansion possibilities appealed to advanced users. A second 6502 processor, a Z-80 for running CP/M, or a 16-bit National 16032 processor capable of addressing 16 MB of memory could be connected via the Tube interface. Dedicated monitors completed the range: a monochrome model at £105 and a color model at £288.

The BBC commissioned professional applications: word processor, spreadsheet, domestic database, computer-aided design. Third-party publishers developed educational and entertainment software. A comprehensive user manual accompanied each machine, along with a cassette of 16 demonstration programs.

A few flaws marred the picture. The plastic case lacked durability. The 32 KB of user memory sometimes proved insufficient, with the operating system consuming up to 8 KB. In high-resolution graphics mode, only 2 KB remained available. Teletext mode sometimes displayed erroneous characters due to its specific character set. No cassette cable was supplied with the machine.

These minor defects did not detract from the product’s overall excellence. Display quality was impressive: 80 characters per line remained perfectly legible on a standard black-and-white television thanks to a specially designed font. Teletext mode offered one of the best visual renderings of its generation.

The BBC Micro surpassed most Japanese and American productions of the era through its technical richness and modularity. Commercial success followed: 100,000 units were planned for the first year. This achievement resulted from the meeting of Acorn’s technical expertise and the BBC’s educational vision.

The machine became the standard educational tool in British schools during the 1980s. It introduced an entire generation to computing and fostered a national digital culture. The experience Acorn gained in processor design would later lead to the ARM architecture, present in virtually all modern mobile devices on the planet.


IBM PC-5150

When IBM announced its Personal Computer 5150 in August 1981, the company entered a market six years behind Apple, Atari, Commodore, and Radio Shack. This late arrival forced Big Blue to break with tradition: internal development gave way to external partnerships. A team of just twelve people would design this machine in one year, a record for IBM.

The approach was surprising. Intel supplied the processor, Microsoft delivered PC-DOS and BASIC. Distribution also broke with tradition: instead of its own channels, IBM chose Computerland and Sears. For advertising, the company abandoned its austere image and featured Charlie Chaplin’s Tramp character in a series of offbeat television spots.

At $1,565 for the base model, without floppy drive or screen, the PC 5150 initially targeted individuals. They could connect their television as a monitor and use a tape recorder to save their data. Business users, however, paid $4,500 for a complete configuration with 64 KB of memory, two floppy drives, color display, and dot matrix printer. Adjusted for inflation and converted to current Swiss francs, these amounts would reach 6,000 and 14,000 CHF.

Technically, nothing spectacular. The Intel 8088 processor ran at 4.77 MHz, a thousand times slower than our current chips. This economical derivative of the 8086 made do with an 8-bit data bus while addressing up to 1 MB of memory. The motherboard accommodated between 16 and 64 KB of RAM; the case housed up to two 5.25-inch drives of 160 KB each.

A few details nevertheless marked a break. The detached keyboard was exceptional at a time when most computers integrated it into the main unit. Its 83 keys weighed 2.7 kg and produced that characteristic click that would delight nostalgic users. The system unit with its drives weighed 12.7 kg, the monitor 7.7 kg. Total: 23 kg of home computing.

No hard drive as standard; that would require the 5161 expansion unit. No mouse either; it would not become widespread until later. Storage used audio cassettes via a specialized connector or floppy disks. Early versions used single-sided 160 KB media, then double-sided 360 KB for the Model B marketed as early as 1983.

IBM surprised with its technical transparency. The documentation provided with the PC 5150 revealed everything: electronic schematics, architecture diagrams, interface specifications, BIOS source code. This unusual openness boosted the creation of expansion cards and compatible software. But it also opened the door to cheaper clones that would proliferate.

Five internal slots accepted expansions via what IBM called the “I/O Channel”, predecessor of the ISA standard. These 62-pin connectors accommodated floppy controllers, graphics adapters, parallel interfaces for printers, or RS-232 serial ports for modems. Contemporary PCs have abandoned these parallel buses for faster serial connections like PCI Express or USB.

PC-DOS managed files with the FAT system, without subdirectories in its first iteration. All files spread out in a flat structure: 64 maximum on single-sided floppy, 112 on double-sided. Computer viruses and piracy, though they existed, had not yet infected the planet. The manual addressed only physical security: storing diskettes safely, backing up regularly, controlling machine access. The first virus for IBM-compatible PCs, Brain, wouldn’t appear until 1986.

Production of the 5150 stopped in 1987, replaced by the 5160 model. Internationalization began in 1983. Its architecture still influences our current machines, from connected objects to enterprise cloud servers. IBM’s choices—standardized components, comprehensive documentation—created an open ecosystem that would dominate the market.


MS-DOS

Until then focused on large systems, IBM set its sights on the personal computer market in 1980. The Armonk firm was looking for an operating system for its future machine. It turned to Microsoft, a small company that Bill Gates and Paul Allen had created a few years earlier. Problem: Microsoft didn’t have a proper operating system.

The solution came from Seattle Computer Products. This company had developed QDOS – Quick and Dirty Operating System – under Tim Paterson’s leadership. The story of QDOS deserves attention. In May 1979, Seattle Computer Products designed an 8086 processor board for the S-100 bus. The company was counting on Digital Research to deliver CP/M-86, but the promises dragged on. Tired of waiting, the engineers decided to create their own system. Paterson tackled the task in April 1980. Two months later, QDOS was born, and the system worked remarkably well for such rapid development.

Microsoft sensed a good opportunity. The company acquired a non-exclusive license for QDOS for $25,000 in December 1980. A few months later, it purchased all rights for an additional $50,000. Tim Paterson joined Microsoft’s team to transform QDOS into MS-DOS.

The first IBM PC came out in 1981, equipped with MS-DOS 1.0. This rudimentary operating system operated entirely in text mode. Users typed commands: DIR to list files, COPY to duplicate them, ERASE to delete them. No directories, just 160 KB floppy disks. Austere, certainly, but functional.

Improvements arrived. MS-DOS 2.0 appeared in 1983 with support for 10 MB hard drives and a hierarchical file structure. This version borrowed some ideas from UNIX, notably a more sophisticated batch command processor. MS-DOS 3.0 followed in 1984, tailored for the IBM PC AT. The FAT16 file system made its appearance, managing partitions up to 32 MB.

The technical architecture relied on three main components. The hidden file MSDOS.SYS contained the system core and processed high-level program requests. IO.SYS handled the hardware side, acting as the abstraction layer between software and hardware. COMMAND.COM provided the user interface, that command prompt millions of users would come to know.

The FAT file system revolutionized disk space management. An allocation table tracked the usage of each allocation unit, or cluster, a group of several consecutive sectors. This approach avoided external fragmentation but generated internal fragmentation: the last cluster of a file was never completely filled. A technical compromise that would prove its worth.
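A quick back-of-the-envelope calculation makes that compromise visible: a file always occupies a whole number of clusters, so the tail of its last cluster is wasted. The cluster size below is a hypothetical example, not a value tied to any particular DOS version.

```python
# Internal fragmentation under cluster-based allocation (illustrative sizes only).
import math

def fat_usage(file_size: int, cluster_size: int = 4096):
    clusters = max(1, math.ceil(file_size / cluster_size))   # whole clusters only
    allocated = clusters * cluster_size
    wasted = allocated - file_size                            # slack in the last cluster
    return clusters, allocated, wasted

for size in (1, 500, 4096, 10_000):
    clusters, allocated, wasted = fat_usage(size)
    print(f"{size:>6} bytes -> {clusters} cluster(s), {allocated} bytes allocated, {wasted} wasted")
```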

Microsoft played its commercial hand strategically. The agreement with IBM authorized the sale of MS-DOS to other manufacturers. This strategy established MS-DOS as the de facto standard for all IBM-compatible PCs. When Windows 3.0 arrived in 1990, the graphical interface relied on MS-DOS. Success followed.

MS-DOS 6.0 arrived in 1993 with its share of innovations: DoubleSpace for disk compression, an integrated antivirus, a backup system. But DoubleSpace posed problems. Data corruptions prompted Microsoft to release version 6.2, supplemented with ScanDisk to repair damaged disks. The year didn’t end well: Stac Electronics sued Microsoft for patent infringement on the compression algorithm. Microsoft lost and had to remove DoubleSpace. Version 6.22 came out in 1994 with DriveSpace, using a new algorithm to avoid any dispute.

Windows 95 marked the end of MS-DOS’s golden age. The system integrated MS-DOS 7.0 as a simple boot and compatibility component. Windows 98 and Me still kept MS-DOS, but its influence waned. Windows XP, in 2001, definitively closed this chapter, retaining only minimal compatibility to run stubborn old applications.

The MS-DOS system standardized the PC universe, accompanying the emergence of an extraordinarily rich software ecosystem. From word processors to spreadsheets, from databases to games, an entire world of applications was born. Companies like Lotus, Borland, and WordPerfect built their fortunes on this platform.

MS-DOS democratized computing in its own way. Its command-line interface, austere and demanding, trained an entire generation of users. Learning MS-DOS meant understanding the machine’s internal logic. This school of rigor forged skills that remained valuable after the arrival of graphical interfaces: the Windows command prompt still retains many commands inherited from MS-DOS, and DIR, COPY, and DEL still work. The concepts of file management, memory allocation, and program execution established at that time still form the foundations of current systems. DOSBox and other emulators preserve this history, allowing nostalgics to rediscover their old software.

MS-DOS is an important milestone in personal computing. This humble and efficient system made the PC explosion possible. Without it (and QDOS), Microsoft’s history would have taken a different course, and consumer computing might have taken other paths.


Osborne 1

Adam Osborne had grasped something that few entrepreneurs understood in 1980. An author of computer books and head of a specialized publishing house, he observed a booming market where microcomputers remained confined to offices and laboratories. His idea seemed simple: create a computer that could be taken anywhere.

In April 1981, in the pages of Creative Computing Magazine, Osborne laid out his vision with remarkable clarity. He identified five fundamental needs that a personal computer had to satisfy: an affordable price, ease of use without sacrificing versatility, portability, software compatibility through standardized systems, and adaptability to developers’ requirements. This analysis stemmed directly from his industry experience and his frustration with existing machines’ limitations.

The Osborne 1 embodied this thinking. At 10.9 kg, the machine adopted a bold all-in-one design: a 5-inch screen, two floppy drives, a detachable keyboard, and power supply integrated into a single portable case. This approach represented a radical departure from traditional systems that required multiple components connected by a tangle of cables.

Technically, the computer proved sound. Its Zilog Z80A processor running at 4 MHz, 64 KB of RAM, and CP/M operating system offered respectable performance for the era. The screen displayed 52 characters across 24 lines, with horizontal scrolling capability up to 128 characters. Each 5.25-inch floppy drive provided 102 KB of storage. This configuration represented a judicious compromise between performance and mobility in 1981.

Osborne had especially understood the importance of software. For $1,795, the buyer received the machine along with a complete suite including WordStar for word processing, SuperCalc as a spreadsheet, and BASIC and CBASIC for programming. This software bundling strategy was unprecedented and transformed the purchase of simple hardware into acquisition of a complete working environment.

The commercial target revolved around three distinct segments. Mobile professionals found a means to manage their documents and spreadsheets while traveling, before synchronizing their work at the office via modem. The education sector discovered an economical solution for equipping institutions with computer equipment. Individual consumers finally accessed an affordable alternative to traditional desktop computers.

The success exceeded all predictions. Osborne Computer Corporation experienced meteoric growth, becoming one of the fastest-growing companies in American history. Sales reached 10,000 units monthly, generating over $70 million in revenue in the first year alone.

The machine’s flaws became apparent. The 5-inch screen, constantly criticized, forced users into perpetual horizontal scrolling to view standard documents. Floppy storage proved insufficient for certain professional applications. Osborne responded by developing a double-density option increasing capacity to 200 KB per disk.

In 1982, the company announced a major improvement: the SCREEN-PAC, an 80/104 column display option. This hardware upgrade allowed display of up to 104 characters per line, bringing the Osborne 1 closer to industry standards. However, the premature announcement of this unavailable improvement triggered a perverse effect: sales of the existing model plummeted. This phenomenon, later dubbed the “Osborne effect,” illustrated the dangers of poorly managed product communication.

The Osborne 1’s influence on the computer industry far exceeded its commercial success. It established several lasting standards: complete component integration, systematic inclusion of software with hardware, and accessible pricing. Its impact showed directly in the first portable computers from Compaq, IBM, and other manufacturers who subsequently dominated the market.

The machine introduced remarkable technical innovations. Its keyboard used full travel membrane technology offering excellent typing quality while resisting dust and liquid infiltration. Its modular design simplified repairs. Adoption of the S-100 bus and CP/M system guaranteed compatibility with a vast existing software ecosystem.

The device’s ruggedness was impressive. Tests documented in The Portable Companion reported its survival of stairway falls and extreme operating conditions. This durability, coupled with its portability, made it a perfectly suited tool for mobile professionals.

The Osborne 1 transformed workplace organization in businesses. It enabled the emergence of the “mobile office” concept, allowing employees to carry their digital environment while traveling. This evolution foreshadowed the changes workplace organization would undergo with the widespread adoption of portable computing.

The fall was as swift as the rise had been meteoric. In September 1983, Osborne Computer Corporation filed for bankruptcy, victim of increased competition, production difficulties, and strategic errors. The company had underestimated the market’s pace of evolution and the need for constant innovation. Within two years, the Osborne 1 went from revolutionary innovation to obsolete technology. Its impact on democratizing portable computing and establishing industry standards remains undeniable nonetheless.

The Osborne experience underscores the importance of portability, the necessity of a complete software ecosystem, and the delicate balance between technological innovation and market expectations. The Osborne 1 nevertheless wrote an essential chapter in the history of personal computing.

LZW

The LZW compression algorithm tells a singular story, that of a theoretical discovery that took years to find its practical form. It all began in 1977, when Abraham Lempel and Jacob Ziv published an article with a forbidding title in the IEEE Transactions on Information Theory: “A Universal Algorithm for Sequential Data Compression”. Their work presented a method using phrase dictionaries that slide over already-read text, progressively building new patterns by adding symbols to existing sequences.

The following year, the two researchers returned to the same journal with a refined version of their algorithm. These publications remained nonetheless highly abstract, laying theoretical foundations but leaving the reader wanting when it came to concrete applications. It wasn’t until 1984 that a third party entered the scene and transformed this academic work into a usable technology.

Terry Welch was working at the Sperry Research Center when he published his article “A Technique for High-Performance Data Compression” in Computer magazine. His contribution revolutionized Lempel and Ziv’s approach. Welch had the idea of pre-populating the dictionary with the 256 possible single-byte characters rather than starting with an empty dictionary. This modification enabled immediate processing of all simple characters, including on their first appearance. The remaining codes, from 256 to 4095 in a 12-bit dictionary, could thus serve exclusively for newly discovered sequences.

The LZW algorithm was born, bearing the initials of its three creators. Its strength lay in its ability to detect and eliminate redundancies while guaranteeing perfect restoration of the original data. Unlike other methods, LZW never sent raw characters but only codes, generating substantial gains on files containing repetitive patterns.
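To make the mechanism concrete, here is a minimal Python sketch of such an encoder, under the assumptions described above: a dictionary seeded with the 256 single-byte values, codes 256 to 4095 reserved for newly discovered sequences, and nothing but codes ever emitted. It illustrates the principle rather than any particular historical implementation.

    def lzw_compress(data: bytes) -> list[int]:
        """Encode a byte string as a list of LZW codes using a 12-bit dictionary."""
        dictionary = {bytes([i]): i for i in range(256)}  # seeded with every single-byte value
        next_code, max_code = 256, 4096                   # codes 256..4095 are for new sequences
        current = b""
        codes = []
        for byte in data:
            candidate = current + bytes([byte])
            if candidate in dictionary:
                current = candidate                       # keep extending the current match
            else:
                codes.append(dictionary[current])         # emit the code of the longest match
                if next_code < max_code:                  # grow the dictionary until it is full
                    dictionary[candidate] = next_code
                    next_code += 1
                current = bytes([byte])
        if current:
            codes.append(dictionary[current])             # flush the final match
        return codes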

A remarkable characteristic of this algorithm was its autonomy during decompression. The decoder didn’t need to receive the dictionary used for compression; it could reconstruct it identically by applying the same rules as the compressor. This property made LZW memory-efficient and fast to execute.

The computer industry embraced this innovation. In 1987, CompuServe launched the GIF (Graphics Interchange Format) format, which exploited LZW to compress images. This format quickly became indispensable on early networks and then on the Internet, notably thanks to its ability to store numerous images in a single file and manage transparency. UNIX also integrated the algorithm into its compress command, while the TIFF and PDF formats adopted it for their specific needs.

LZW’s performance varied considerably depending on the type of data processed. Natural language texts, rich in redundancies, suited it perfectly, as did images containing vast uniform areas. On such content, the algorithm commonly achieved compression rates of 50% or more. Binary files, however, yielded more unpredictable results: some compressed better than ordinary text, while others stubbornly resisted any processing.

The practical implementation of LZW required solving some technical puzzles. The trickiest arose during decompression, when the decoder encountered a code before having been able to define it. This baroque situation occurred with sequences of the type (string, character, string, character, string), where the algorithm emitted a code that the decompressor hadn’t yet had the opportunity to create. A specific exception mechanism was therefore needed to handle these particular cases.
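Under the same assumptions, a companion sketch of the decoder shows both properties discussed above: the dictionary is rebuilt on the fly rather than transmitted, and the (string, character, string, character, string) case is caught when a code arrives before the decoder has had a chance to define it. It reuses the lzw_compress function sketched earlier for a round-trip check.

    def lzw_decompress(codes: list[int]) -> bytes:
        """Rebuild the original bytes from LZW codes, reconstructing the dictionary on the fly."""
        if not codes:
            return b""
        dictionary = {i: bytes([i]) for i in range(256)}
        next_code, max_code = 256, 4096
        previous = dictionary[codes[0]]
        output = [previous]
        for code in codes[1:]:
            if code in dictionary:
                entry = dictionary[code]
            elif code == next_code:
                # The tricky case: the encoder emitted a code it had only just created.
                entry = previous + previous[:1]
            else:
                raise ValueError("corrupt LZW stream")
            output.append(entry)
            if next_code < max_code:
                dictionary[next_code] = previous + entry[:1]  # same growth rule as the encoder
                next_code += 1
            previous = entry
        return b"".join(output)

    # Round trip on a repetitive string:
    assert lzw_decompress(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT")) == b"TOBEORNOTTOBEORTOBEORNOT"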

LZW’s success story took an unexpected turn in the 1990s. Unisys, holder of a patent on the algorithm, began demanding royalties for any commercial use. This sudden claim provoked an outcry in the computing community and prompted the development of the PNG (Portable Network Graphics) format as a free alternative to GIF. The patent finally expired in 2003 in the United States and in 2004 elsewhere, at last freeing the algorithm from these legal constraints.

Dictionary size directly influenced compression efficiency. A 12-bit dictionary, capable of storing 4096 entries, often represented a satisfactory balance between compression rate and memory consumption. The algorithm gained performance with large files, as the dictionary had more opportunities to identify and reuse recurring patterns.

On small files, LZW showed its weaknesses. The storage cost of the dictionary could exceed the compression benefit, making the compressed file larger than the original. Variants like LZWS (Lempel-Ziv-Welch-Setia) attempted to dynamically adapt the strategy according to data size, with mixed success.

The LZW algorithm established enduring principles in the field of compression: the use of adaptive dictionaries, lossless compression, the importance of autonomy during decompression. It occupies a special place in the computing ecosystem. More recent methods certainly offer better compression rates, but its relative simplicity makes it an excellent pedagogical tool for understanding the fundamental mechanisms of data compression. It remains used in certain specific applications where its robustness and execution speed compensate for its limitations. In the history of information technologies, LZW remains one of the first compression algorithms to have achieved lasting commercial success and truly universal adoption.

AutoCAD

In 1982, computer-aided design was the preserve of the privileged few. CAD systems cost a fortune – several hundred thousand dollars – and required specialized machines that only large corporations could afford. A handful of programmers decided to change the game.

John Walker gathered around him about fifteen developers, most from Information Systems Design, a company working on UNIVAC systems. They created Autodesk with a simple but bold idea: develop CAD software for personal microcomputers that were beginning to flood offices. The organization they established was unusual for the time: everyone worked from home, communicating by telephone and mail. With an initial investment of $59,000, they launched a handful of parallel projects, convinced that one of them would eventually break through.

The breakthrough came from Mike Riddle and his Interact program, written in a programming language he had designed himself. The Autodesk team purchased the source code and undertook a complete rewrite in C. Dan Drake and Greg Lutz handled this delicate conversion, progressively transforming this drawing software into what would become AutoCAD. The choice of C, still uncommon in the commercial world, proved judicious for the software’s future portability.

In late 1982, at the Comdex show in Las Vegas, AutoCAD made its public debut. The software ran on an IBM PC equipped with 64 KB of memory and two floppy drives. To make an impression, the team organized a demonstration with an HP plotter borrowed for the occasion. The announced price caused a sensation: $1,000, certainly substantial for PC software, but trivial compared to existing solutions.

Mike Ford developed an original commercial strategy based on a network of trained resellers. These handled first-line technical support, thus freeing Autodesk from the constraints of direct customer service. This decentralized approach aligned well with the company’s DNA and its remote work philosophy.

AutoCAD won over its target clientele: small architecture and engineering firms. Its ability to run on different hardware platforms and exchange files seamlessly represented a major asset in a highly compartmentalized computing world. Users appreciated this flexibility, which prevented them from being locked into a particular manufacturer.

The technical team maintained a frantic pace of innovation. Support for Intel’s 8087 math coprocessor significantly accelerated calculations. Integration of a LISP language offered advanced users the ability to extend the software’s functionality according to their specific needs. Dimensioning tools became increasingly sophisticated. Each week, during collegial technical meetings, the team decided on the product’s future directions.

Three years after its creation, Autodesk went public. The company had never sought venture capital, developing solely through its own revenues. For two consecutive years, Business Week ranked it the fastest-growing company. John Walker, who had little taste for management, handed the reins to Al Green, then to Carol Bartz in 1992.

The arrival of Windows 3.0 in 1990 transformed the technical landscape. Until then, AutoCAD ran primarily under DOS and was ported to numerous UNIX variants. Management decided to focus its efforts on Windows, anticipating this system’s future dominance. This decision coincided with the emergence of 3D modeling as an essential feature, developed under Scott Heath’s leadership.

AutoCAD succeeded where others failed through different ingredients: an exceptional technical team obsessed with quality and compatibility, an effective commercial strategy through resellers, accessible pricing, and a remarkable ability to follow the rapid evolution of computer hardware. By democratizing CAD and making it accessible to small organizations, the software transformed design practices in industry and architecture.

CD-ROM

At the turn of the 1980s, personal computing operated on floppy disk time. These small plastic squares, fragile and limited to a few hundred kilobytes, were the primary means of exchanging programs and data. The idea that a single disk could contain the equivalent of several hundred floppies belonged to the realm of science fiction. Yet this revolution was already taking shape in Philips laboratories.

The story begins with optical storage research conducted by the Dutch company during the 1970s. Philips engineers were working on the VLP system, a technology that etched analog video signals onto a plastic disc read by laser. These experiments, initially intended to replace video cassettes, would open a completely unexpected path. From this work would emerge first the audio Compact Disc, then its computing descendant: the CD-ROM.

The collaboration between Philips and Sony resulted in the 1983 announcement of the CD-ROM, an acronym for Compact Disc Read-Only Memory. The two electronics giants jointly defined the technical specifications in the famous “Yellow Book,” a reference document that established the rules for the new format. The concept adopted the physical dimensions of the audio CD: a polycarbonate disc 12 cm in diameter and 1.2 mm thick, scanned by an infrared laser that deciphers information engraved as microscopic pits along a spiral track.

But where the audio CD stores music, the CD-ROM organizes data differently. It offers two operating modes. Mode 1 prioritizes reliability for computer data through an enhanced error correction system. Mode 2 is intended for compressed audio and video content, accepting minor errors imperceptible to the human ear or eye.

The storage capacity revolutionized habits: 650 MB at once, equivalent to 450 standard floppies. With this sudden abundance, software publishers could finally design ambitious applications without worrying about space constraints. Gone were the days of having to cut features to fit on a few floppies.
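The figure of roughly 450 floppies follows from simple arithmetic. As a back-of-the-envelope check in Python, assuming a 74-minute disc read at 75 sectors per second, 2,048 user bytes per Mode 1 sector, and 1.44 MB high-density floppies as the point of comparison:

    sectors = 74 * 60 * 75                # 74 minutes x 60 seconds x 75 sectors per second
    user_bytes = sectors * 2048           # Mode 1 carries 2,048 bytes of user data per sector
    megabytes = user_bytes / (1024 * 1024)
    floppies = megabytes / 1.44           # 3.5-inch high-density floppy
    print(f"{megabytes:.0f} MB, about {floppies:.0f} floppies")   # prints: 650 MB, about 452 floppies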

The first CD-ROM drives hit the market in 1985. These expensive and relatively sluggish machines displayed a transfer speed of 150 KB/s, dubbed “single speed” or “1X” in technical jargon. Adoption was initially limited, reserved for professionals and affluent enthusiasts. It would take until the early 1990s for the technology to truly find its audience.

The rise of multimedia changed everything. Video game and educational software developers discovered the possibilities offered by this new storage capacity. Where Windows 95 required no fewer than 13 floppies in its basic version, the CD-ROM version fit on a single disc and drastically simplified installation. No more juggling a stack of floppies, monitoring insertion order, or restarting the procedure when one proved defective.

The video game industry seized the opportunity. The CD-ROM enabled the integration of cinematic-quality video sequences, orchestral music, and detailed graphics previously unthinkable. Sony built the success of its PlayStation on this technology, abandoning expensive cartridges to bet on CD-ROM. Games took on new dimensions, studios could tell complex stories with multiplied production resources.

The race for speed was on. Manufacturers competed ingeniously to accelerate data reading. From 1X, speeds progressed to 2X, 4X, 8X and far beyond, reaching 52X in the early 2000s. This surge in performance was accompanied by a spectacular drop in prices. Drives, initially sold for several thousand francs, became accessible to the general public. Blank discs followed an identical trajectory, transforming the CD-ROM into an affordable archival medium.

Technical evolution didn’t stop there. In 1988, CD-R appeared, finally allowing users to burn their own discs. This innovation democratized the production of custom CDs, paving the way for individual backups and data exchange between individuals. The CD-RW storage medium, introduced in 1997, took an additional step toward versatility by allowing data erasure and rewriting.

Standardization was the key to this success. Technical specifications were subject to strict international standards, with ISO 9660 notably defining the universal file system. This approach, inherited from the 1985 “High Sierra” format, guaranteed that a CD-ROM burned on one machine could be read on any other, regardless of the drive manufacturer.

The CD-ROM spawned a large family of derivative formats. CD-i targeted interactive multimedia applications, Video CD attacked the home video market, Photo CD revolutionized digital image storage. These variations met with mixed fortunes, but testified to the conceptual richness of the original medium.

The CD-ROM made possible the distribution of multimedia encyclopedias like Encarta, exhaustive documentary collections, and voluminous professional databases. It directly contributed to the multimedia explosion of the 1990s, a period when the personal computer ceased being a simple word processing machine to become a family entertainment center.

This hegemony began to falter in the early 2000s. The DVD, with its superior storage capacity, gradually relegated the CD-ROM to secondary status. The arrival of high-speed Internet and the widespread adoption of USB flash drives accelerated the process. The dematerialization of software, downloaded directly from publishers’ servers, dealt an additional blow to physical media.

Intel 80286

February 1, 1982 marked the debut of the Intel 80286, officially designated iAPX 286 in Intel’s terminology. This 134,000-transistor chip represented a turning point in the evolution of 16-bit microprocessors. Its destiny became intimately tied to that of the IBM PC/AT in 1984, and subsequently to the entire ecosystem of PC/AT compatibles that would dominate the computing landscape until the early 1990s.

The performance history of the 80286 tells a story of gradual power increase. Intel launched its first models at 6 MHz and 8 MHz, before offering faster versions reaching 12.5 MHz. AMD and Harris would later push this architecture to unexpected heights: 20 MHz for the former, 25 MHz for the latter. These figures mask a more nuanced reality: the processor executes on average only 0.21 instructions per clock cycle. Translated into raw power, this rate yields 0.9 MIPS for the 6 MHz model, 1.5 MIPS at 10 MHz, and 1.8 MIPS at 12 MHz.

Under the hood, Intel completely rethought the architecture. The 80286 nearly doubled performance per cycle compared to its predecessors, the 8086 and 8088. This technical achievement was no magic trick: it resulted from targeted optimizations, such as handling complex base+index addressing modes through a dedicated circuit rather than the general arithmetic unit. Demanding mathematical calculations, particularly multiplication and division, now required fewer cycles.

The most spectacular leap concerned memory addressing. With its 24 address bits, the 80286 theoretically managed up to 16 MB of RAM, shattering the one-megabyte barrier that constrained the 8086. This capacity remained largely theoretical in daily practice, however. The prohibitive cost of RAM and the scarcity of compatible software limited most machines to a single megabyte. Accessing this extended memory from traditional real mode also imposed a significant performance penalty.

The 80286’s major contribution lay in its protected mode. This innovation elevated it to the level of professional processors of the era. Beyond exploiting 16 MB of physical memory through its integrated memory management unit, this mode opened a logical addressing space of one gigabyte. Gone were the crashes caused by undisciplined applications writing anywhere. Memory protection became real. The system organized memory into distinct segments for data, code, and stack, with a privilege hierarchy that prevented low-level programs from interfering with higher-level ones.

This elegant mechanism concealed a critical flaw, however. Once switched to protected mode, the 80286 could only return to 8086-compatible real mode through a complete reset. IBM deployed considerable ingenuity in the PC/AT to work around this limitation: external circuits, specialized code in the ROM BIOS, a convoluted instruction sequence that triggered the reset while preserving memory and control. The solution worked, but performance suffered heavily.

These technical constraints directly influenced the software ecosystem. In January 1985, Digital Research partnered with Intel to present Concurrent DOS 286, an operating system designed to natively exploit protected mode in a multi-user and multitasking environment. The project ran into harsh realities: 8086 emulation on production chips revealed malfunctions. Intel responded by correcting documented errors in the E-1 stepping, and actually modifying the microcode in the E-2 stepping to accelerate emulation. These improvements allowed IBM to adopt DR Concurrent DOS 286 as the foundation for its IBM 4680 OS in 1986, intended for IBM Plant System products and point-of-sale terminals.

Criticism rained down on the 80286. Bill Gates, never short on memorable phrases, called it a “brain-dead chip,” anticipating Windows’s inability to run multiple MS-DOS applications in parallel. This position accentuated the rift between Microsoft and IBM, the latter stubbornly developing OS/2 for the 286 in text mode, a project initially shared by both giants.

Despite its glaring imperfections, the 80286 democratized memory protection mechanisms previously reserved for mainframes and minicomputers. Where its NS320xx and M68000 competitors required external components to manage the MMU, the 80286 integrated these functions directly on its chip. This integration, coupled with substantial performance gains, propelled the x86 architecture and IBM PCs from entry-level systems to high-end workstations and servers.

The processor also enriched computing capabilities by handling different types of numbers: unsigned packed decimal, unsigned binary, unsigned unpacked decimal, signed binary, and floating point. These functionalities, combined with its multitasking capability, oriented it toward communications applications, real-time process control, and multi-user systems.

The 80286 saga perfectly illustrates the complexity inherent in microprocessor development, where each architectural choice resonates across decades of software evolution. Its imprint endures through subsequent generations of x86 processors, which retain its fundamental concepts while smoothing over its growing pains.

SMTP

In August 1982, Jon Postel published RFC 821. This technical specification, which might seem unremarkable in the constant flow of normative documents, would transform global electronic communication. It gave birth to the Simple Mail Transfer Protocol, the invisible architecture that carries billions of messages daily today.

The idea seems simple: create a common language between machines to exchange electronic mail. Before SMTP, each messaging system spoke its own dialect. A message sent from a UNIX server could not reach a recipient on an incompatible system. The interconnection of computer networks required a digital Esperanto.

RFC 821 established the rules of the game. An SMTP client enters into conversation with a remote server, identifies itself, specifies the sender and recipients, then transmits the content. Each step generates standardized numerical response codes. This technical choreography, repeated millions of times every second across the planet, orchestrates the global exchange of messages.
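By way of illustration only, here is a minimal Python sketch of that conversation using the standard smtplib module; the addresses and server name are placeholders, not details drawn from the text.

    import smtplib
    from email.message import EmailMessage

    # Placeholder sender, recipient and server, for illustration only.
    message = EmailMessage()
    message["From"] = "alice@example.org"
    message["To"] = "bob@example.net"
    message["Subject"] = "Greetings"
    message.set_content("Hello over SMTP.")

    # Underneath, the classic dialogue unfolds with numeric replies:
    #   220 greeting, EHLO -> 250, MAIL FROM -> 250, RCPT TO -> 250,
    #   DATA -> 354, message then "." -> 250, QUIT -> 221.
    with smtplib.SMTP("mail.example.net", 25) as server:
        server.ehlo()                 # the client identifies itself (HELO on pre-ESMTP servers)
        server.send_message(message)  # issues MAIL FROM, RCPT TO and DATA on our behalf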

The protocol sailed through the 1980s and 1990s without major modification. Its robustness was impressive: for nearly twenty years, the same technical specification supported the explosion of the Internet. But this longevity also revealed its flaws. Hackers discovered how to exploit open relays to flood the network with unsolicited mail. Spam was born and proliferated.

In April 2001 came RFC 2821. It consolidated ESMTP (Extended SMTP), an enhanced version of the original protocol whose extension mechanism had been specified a few years earlier. The new features changed the game: automatic delivery notifications, detailed error messages, support for non-standard characters in headers. Most importantly, ESMTP made it possible to negotiate encryption of communications between servers. Security finally entered the equation.

This evolution nevertheless preserved a cardinal principle: backward compatibility. All ESMTP servers accept classic SMTP connections. This strategy ensured a gradual transition, avoiding the chaos of an abrupt break with the existing infrastructure.

RFC 5321, published in October 2008, consolidated these achievements. It refined the rules for using TLS to secure exchanges and clarified certain ambiguous technical aspects of previous versions. But the protocol now faced challenges far more complex than its original specifications.

Spam transformed the Internet into a battlefield. Malicious bots systematically tested address combinations to harvest valid targets. Directory Harvest Attacks automated this prospecting. Malware spread via attachments. Spoofing techniques enabled the falsification of message origins, sowing confusion about the true identity of senders.

Faced with this escalation, the technical ecosystem responded. With the SPF framework, domains officially declared which servers could send emails on their behalf. DKIM added cryptographic signatures to messages to authenticate their origin. Anti-spam filters analyzed content, consulted blacklists, applied sophisticated heuristics to separate the wheat from the chaff.

Yet the SMTP protocol carries certain structural vulnerabilities. It does not natively verify whether the sender’s address matches their declared domain. Automatic notifications can confirm the existence of email addresses to spammers. The MIME format, initially designed to enrich messages with different types of content, also serves to conceal malicious content. A dual-use technology.

Despite these weaknesses, SMTP still dominates global electronic messaging. Its longevity stems first from its conceptual simplicity: developers quickly grasp how it works. Second, its flexibility: the protocol accepts extensions without breaking with existing implementations. Finally, the inertia of the immense installed base of servers and clients that use it.

This technical history reveals a broader truth about the evolution of computing standards. A protocol rarely survives on its intrinsic qualities alone. It must adapt to new uses while integrating emerging security constraints and preserving compatibility with its ecosystem. SMTP illustrates this delicate balance between innovation and continuity.

Current concerns focus on systematic encryption of communications, strengthened sender authentication, and adaptation to mobile messaging. The protocol continues to evolve, forty years after its birth. This exceptional longevity in the computing universe testifies to its robust initial design and its capacity for continuous adaptation.

The complex architecture now surrounding SMTP—anti-spam filters, authentication systems, security protocols—forms a sophisticated technical ecosystem. This growing complexity contrasts with the protocol’s original simplicity.

Commodore 64

In 1982, Jack Tramiel launched a bold gamble with the Commodore 64. Known for his philosophy “computers for the masses, not the classes”, the businessman set the price at $595—half that of the competition. The machine took its name from its 64 KB of RAM, an impressive amount that foreshadowed unprecedented possibilities.

The secret to this economic performance lay in a decision made six years earlier. By acquiring MOS Technology in 1976, Commodore secured control of its components: the MOS 6510 processor, derived from the famous 6502, but also two innovative chips. The VIC-II handled display with its 16 colors, 320x200 pixel resolution, and especially its eight sprites, those small graphical objects that would bring arcade games to life. As for the SID, this integrated synthesizer offered three synthesis voices with precise control of ADSR sound envelopes. Few computers of the era could rival such audio capabilities.

The C64’s architecture revealed particular ingenuity. The system juggled between ROM and RAM as needed, freeing up memory space when necessary. The PETSCII character set included semi-graphical symbols that transformed the screen into a true creative canvas. A novice programmer could design rudimentary interfaces without knowing a single line of machine code. The integrated BASIC and KERNAL system offered the accessibility Tramiel sought, without limiting experts who explored the machine’s most technical depths.

Commercial innovation accompanied technical innovation. Commodore abandoned specialized stores to invest in big-box retailers like K-Mart. This groundbreaking strategy brought the C64 into American, then European, households. Parents discovered an affordable “family” computer. Children found a sophisticated game console. This dual nature appealed to a much broader audience than computer enthusiasts alone.

The numbers speak for themselves: between 12 and 17 million units sold through 1994 made the C64 the best-selling personal computer in history. This exceptional twelve-year longevity testified to a robust and scalable design. When IBM-compatible PCs dominated the professional market, the little Commodore continued to reign over home entertainment.

The software library literally exploded. Thousands of titles emerged, from classics like Impossible Mission to amateur creations distributed via BBS networks. These electronic bulletin boards, ancestors of our forums, connected users by modem. People exchanged programs, tips, and shared passions. Tools like the Shoot’em Up Construction Kit democratized video game creation. No need to be a programmer to design your own shooter!

The EasyScript word processor and CalcResult spreadsheet proved that the C64 wasn’t limited to gaming. The arrival of GEOS in 1986 revolutionized the user interface. This graphical system offered windows, icons, and drop-down menus, rivaling Apple’s Macintosh in an otherwise different market segment.

But it was the community that truly forged the C64’s soul. Companies like HAL Laboratory continually pushed technical boundaries. The “demoscene” was born from this creative emulation. These artist-programmers created stunning visual demonstrations, blending impossible graphical effects, fluid animations, and refined musical compositions. Their works proved that a modest machine could rival far more expensive systems, provided one mastered every register, every processor cycle.

With the SID, a true electronic musical instrument, composers like Rob Hubbard and Martin Galway created a distinctive musical style: “SID music”. Their melodies left a lasting mark on video game soundtracks and influenced emerging electronic music. Who would have imagined that a home computer would spawn a new musical genre?

Sinclair ZX Spectrum

In 1980, the personal computer remained a mysterious object for most people. Machines cost a fortune and seemed reserved for experts or businesses with deep pockets. Clive Sinclair harbored a different ambition. This British entrepreneur, who had already made his mark with pocket calculators, wanted to create a computer that everyone could afford.

In April 1982, Sinclair Research Ltd launched the ZX Spectrum, a small black box that would challenge many assumptions. With its price of 125 pounds sterling for 16 KB of memory or 175 pounds for the 48 KB version, this machine literally shattered market conventions. The design was disarmingly simple: a compact case with a membrane keyboard that plugged directly into any family television.

Under this minimalist hood beat the heart of a Zilog Z80A at 3.5 MHz. Sinclair had made clever technical choices. The real breakthrough lay in color management: eight different hues with two brightness levels each, hence the name "Spectrum" which sounded like a promise. The screen displayed 256 × 192 pixels, a decent resolution for the time. Sound was limited to a simple buzzer, but no matter! Programs loaded from ordinary audio cassettes, the very same ones used for listening to music.

The system embedded a version of the BASIC language adapted by Nine Tiles Ltd, designed to leverage the machine’s graphics and sound capabilities. No frills in the operating system, just the essentials to make everything work. This stripped-down approach actually concealed remarkable intelligence: making computing accessible without oversimplifying it.

Marketing began through direct sales from Sinclair’s offices, before reaching British store shelves. The reception exceeded all predictions. People discovered they could finally afford a "real" computer, not a gadget. The enthusiasm was such that a video game industry emerged almost instantly around this little black machine.

The Timex factory in Dundee, Scotland, ran at full capacity. Production climbed to 50,000 units per month in 1983 to meet unwavering demand. The milestone of one million units sold was reached as early as 1983, then two million the following year. The Spectrum became the best-selling personal computer in Europe.

This commercial success enabled Sinclair to develop an entire universe around its machine. The ZX Printer thermal printer, joystick interfaces, and especially the Microdrive drives launched in 1983 completed the offering. While their high price limited adoption, these small cartridges offered faster and more reliable storage than cassettes.

Specialized press flourished with magazines entirely dedicated to the Spectrum. Readers found programs to copy line by line and accessible programming courses. A true software industry took shape around publishers like Ultimate Play The Game, Ocean, or Imagine. Games grew in sophistication, exploiting every available byte of memory.

The British education system did not remain indifferent to the phenomenon. Seeing in this machine a way to prepare an entire generation for computing challenges, the government subsidized Spectrum purchases in schools. Thousands of schoolchildren thus discovered programming on these small membrane keyboards.

Facing intensifying competition, Sinclair launched the Spectrum+ in October 1984. Same electronics, but in a redesigned case with a real mechanical keyboard inspired by that of the QL, the brand’s high-end computer. Sold at 179.95 pounds, it appealed to those who had been put off by the original’s membrane keyboard. An upgrade kit was offered to owners of the first models.

The software catalog grew day by day, exceeding 10,000 titles in 1985. While games dominated largely, professional and educational applications could also be found. The developer community shared its discoveries and tricks, creating remarkable creative excitement around the machine’s technical limitations.

But the market evolved quickly. The arrival of the Commodore 64 and Amstrad CPC changed the game. These competitors offered better keyboards and superior multimedia capabilities for comparable prices. Spectrum sales began to flag as early as 1985.

Sinclair Research’s financial difficulties worsened. The dismal failure of other projects like the QL computer or the C5 electric vehicle weighed down the accounts. In 1986, Amstrad purchased the computer division. Spectrum production continued under this new owner until 1992, totaling more than 5 million units sold across all versions.

The Spectrum literally democratized personal computing in Europe, taking it out of laboratories and installing it in family living rooms. Entire generations of programmers and video game creators cut their teeth on this little membrane keyboard. Its simplicity encouraged experimentation and understanding of computing mechanisms.

A passionate community continues to create for the Spectrum. Emulators make it possible to relive the experience on modern machines, while developers enjoy pushing the limits of an architecture now more than forty years old. The Spectrum has become the symbol of a golden era when programming remained within everyone’s reach.

Its cultural impact is measured less in market share than in vocations inspired. This little black box demystified computing by showing that a computer could be simple, affordable, and powerful all at once. Clive Sinclair’s vision, making technology accessible to as many people as possible, finds its finest illustration here. Other computers certainly succeeded it, more powerful and more comfortable, but none will have had a comparable social impact.

Thomson TO/MO

In 1979, in the workshops of Thomson’s factory in Moulins, José Henrard handled electronic circuit board prototypes. This economist and sociologist, a CNRS researcher, discovered computing through the machine of his brother, a trainer by profession. Thomson recruited him with an ambitious mission: to design a French home microcomputer. His vision went beyond a mere technological gadget. He wanted to create a machine that would demystify computing, teach, and communicate.

His prototype integrated a Motorola 6809 8-bit microprocessor, considered among the most powerful of its time. With 8 KB of RAM expandable to 32 KB and a graphical resolution of 320×200 pixels in 8 colors, the TO7 stood out with its light pen included as standard. Users could now interact directly with the screen, whereas most computers still relied solely on a keyboard.

In September 1980, Henrard presented his prototype to Thomson’s executives. Their bewilderment was evident when faced with what they perceived as a gadget. Nevertheless, the project received approval. Production began in Moulins. Two years later, Thomson, freshly nationalized under Alain Gomez’s leadership, unveiled the first 100 TO7s at SICOB. The specialized press applauded: remarkable graphics quality, acknowledged entertainment dimension, robust Microsoft Basic. The only drawback: the price of 7,000 francs.

The following year was strategic. Thomson created SIMIV (International Society for Microcomputing and Video) to orchestrate its microcomputer policy. Jean Gerothwohl, a former advertising executive and Alain Gomez’s fellow student at Sciences Po, took charge. His strategy prioritized education and adopted a long-term approach. Sales took off thanks to bold marketing campaigns: mass-market advertising, targeted promotions, comprehensive press kits.

1984 saw the birth of two new models. The MO5, priced at 2,390 francs, challenged British machines like the Oric or Spectrum. The TO7/70 came equipped with 64 KB of RAM. Thomson secured a decisive contract with UGAP: 40,000 machines over five years for the Ministry of Education. This success paved the way for a greater ambition.

In January 1985, Laurent Fabius, then Prime Minister, announced the "Computers for All" plan. The objective, introducing all French students to computing and training 110,000 teachers in a single year, bordered on the utopian. Thomson was the main supplier with a massive order of 108,400 machines intended to equip 24,000 educational establishments. The Lille-based company Léanord developed the "Nanoréseau," an ingenious system allowing up to 31 Thomson computers to connect to a more powerful master machine.

The TO7/70 brought its own technical innovations. Its RAM reached 48 KB, expandable to 64 KB. It now handled 16 colors versus 8 previously. Its proprietary operating system interfaced with Basic and Logo. The computer offered a wealth of interfaces: a cassette recorder for storing programs, a cartridge port for software, a SCART socket for the television, an audio interface, and connectors for game controllers.

A rich software library developed around education, particularly for learning French and mathematics. Business applications arrived with the Colorcalc spreadsheet and the Pique-Fiche database. Colorpeint democratized computer-assisted drawing. Games completed this diversified offering.

However, Thomson struggled against international competition. Export attempts failed, except for some sales in Italy and a contract with Algeria. In 1986, the company launched the TO9, a more powerful model priced at 8,990 francs, but compatibility with previous models posed problems. Users grew doubtful, developers hesitated.

The arrival of low-cost PC compatibles, led by Amstrad, brutally disrupted the market. Thomson abandoned its proprietary standard strategy in 1986 to venture into PC compatibles with the TO16 range. Too late. Sales collapsed: 160,000 machines in 1986, 100,000 in 1987, 60,000 in 1988. The Saint-Pierre-Montlimart factory closed its doors, eliminating 450 jobs.

On January 27, 1989, Thomson announced the definitive end of its microcomputing activity. This decision, made without consultation with users or the Ministry of Education, closed nearly a decade of French industrial adventure. The final withdrawal occurred on January 1st, 1990. Thomson preferred to refocus on consumer electronics and defense.

The legacy of the Thomson computers left its mark on the computerization of French society. The "Computers for All" plan enabled the first large-scale introduction of computing in schools. Their design, which prioritized direct interaction via the light pen and the educational dimension, testified to a distinctive vision of personal computing. The bold technical choices, from the 6809 microprocessor to the Nanoréseau, illustrate the innovative capacity of French industry in the 1980s.

The commercial failure resulted from a national market that was too narrow, the absence of international breakthrough, high prices, and especially the relentless emergence of the PC standard. This adventure nevertheless represents a significant stage in the French computer industry, blending technological ambition, political will for national independence, and democratization of access to computing.

PostScript

In 1982, two researchers from Xerox PARC decided to strike out on their own. John Warnock and Charles Geschke had just spent months developing a printing technology called Interpress, capable of describing a page to be printed without worrying about the type of printer used. The concept was innovative, but Xerox saw no commercial interest in it. The two men then founded Adobe Systems and got to work on their own version: PostScript.

The era was hardly conducive to this kind of venture. The 1970s were steeped in a purely computational conception of computing. IBM boasted in its advertisements about the derisory cost of multiplications on its machines. But at Xerox PARC, another idea was germinating: the computer would become a communication tool. Everyone would soon have their own personal machine, connected to printers shared over the network.

Adobe’s initial plan was to manufacture printers and open printing service centers. Bill Hambrecht, their investor, persuaded them to change course. Why not instead design complete systems combining workstations and laser printers? That’s when Steve Jobs burst onto the scene.

Jobs discovered PostScript and immediately grasped its potential. His Macintosh revolutionized the user interface, but printing was a disaster. The ImageWriter barely exceeded 72 dots per inch, a derisory resolution for serious use. In 1985, Apple released the LaserWriter, the first printer to embed PostScript. This machine contained more computing power than the Macintosh: 512 KB of read-only memory just for the PostScript code, a veritable fortune.

The announced price of $7,000 shocked Apple’s sales staff. Three times the price of the Macintosh for a printer! Jobs held firm, and the opportune drop in memory chip prices ultimately vindicated his gamble.

PostScript solved a complex technical problem with a strikingly simple approach. The language treated text as a special case of graphics. This unification allowed characters to be manipulated with total freedom: they could be rotated, resized, and deformed at will. The idea was all the more appealing because it transcended the resolution of the output device. A page description could be printed on different types of printers while maintaining its quality.

Adobe didn’t stop at creating a printing standard. In 1987, the company launched Illustrator, a vector drawing program. The move was bold: at the time, hardly any graphic designers used computers. Illustrator’s interface, based on Bézier curves and their control points, was born from a family anecdote. Warnock’s wife, a professional graphic designer, regularly asked him to hand-code in PostScript the logos she had to create. This domestic experience directly inspired the software’s interface.

The following year, Adobe acquired a small program developed by the Knoll brothers. This software would become Photoshop. The gamble seemed crazy: the largest hard drives didn’t exceed 20 megabytes, just enough to store one high-resolution photo. Adobe was betting on hardware evolution, and history proved them right.

In 1989, PostScript’s dominant position faltered. Apple and Microsoft joined forces to create TrueType, a competing format for fonts. Adobe responded by rapidly developing ATM (Adobe Type Manager) for displaying fonts on screen. The technical quality of the solution and the trust Adobe inspired allowed it to maintain its lead.

Two years later, Warnock had a revelation. Why not use PostScript to create portable documents? All they had to do was capture the PostScript stream generated by any application to obtain a file readable on any platform. This idea gave birth to Acrobat and the PDF format. Success would have to wait for the Internet to develop, when exchanging formatted documents became an obvious need.

Adobe’s triumph rested on remarkable insights. The company identified market needs early and maintained impeccable technical standards in typographic rendering. Its commercial strategy, licensing PostScript to manufacturers rather than producing its own machines, proved judicious.

The PostScript language established principles that remain relevant today, such as the separation between document description and its rendering, freedom from hardware constraints, and uniform treatment of text and graphics. PDF, the direct heir of PostScript, reigns over global document exchange. Beyond these technical aspects, PostScript transformed the computer into a tool for graphic creation and communication, definitively wresting it from its status as a mere calculator.

SunOS

In 1982, when Bill Joy left the corridors of the University of California at Berkeley, he took with him far more than a master’s degree in electrical engineering. He had spent years working intensively on BSD, the version of UNIX born in California’s university laboratories. He created the vi editor, developed the C shell, and knew the system’s inner workings better than anyone. He went on to co-found a new company, Sun Microsystems, whose name came from the Stanford University Network project, shortened to Sun. It was an era when AT&T was just beginning to commercialize UNIX, and when computing was experiencing a rare period of excitement.

The first version of SunOS appeared in 1983. Joy and his colleagues took BSD as their foundation, but they didn’t simply reproduce what existed. They grafted remarkably efficient networking tools onto it and developed the Network File System, that NFS which would later become indispensable in the industry. Meanwhile, Sun worked on windowing solutions for UNIX. Their objective? To make this system, known for its austerity, accessible to a wider audience.

The 1980s established SunOS as a reference, especially in advanced technical fields. Computer-aided design engineering firms eagerly competed for Sun workstations. These machines, powered by SunOS, vastly outperformed the personal computers of that era. They offered a stable multi-user environment, which changed everything for businesses accustomed to the limitations of single-user systems.

But in 1988, a bombshell shook the small world of UNIX. AT&T acquired equity in Sun Microsystems. Competitors were outraged, convinced that Sun would benefit from privileged information about UNIX’s evolution. They responded by creating the Open Software Foundation, while AT&T, Sun, Data General, and Unisys established UNIX International. These maneuvers revealed the tensions running through the computer industry, torn between technical standardization and commercial warfare.

The landscape changed again in 1993. Sun announced that SunOS 4.1.4 would be the last version built on BSD. The company switched to System V Release 4, the result of collaboration with AT&T that combined the strengths of System V and BSD. This new version inherited the commercial name Solaris, with SunOS becoming the technical designation for the kernel. A strategic shift that was part of a broader standardization effort in the UNIX world.

The graphical interface represented another arena for innovation and battle. Sun developed the OPEN LOOK interface with AT&T, hoping to create a unified user experience for all UNIX variants. But this interface clashed with OSF/Motif, built on X Window, a system developed at MIT that offered a significant advantage: the ability to run programs on a remote machine while displaying their output locally, regardless of hardware architecture or operating system.

The confrontation went in Motif’s favor. Sun took it in stride and adapted: starting with Solaris 2.5.1, the company offered a package combining both interfaces under the name Common Desktop Environment. This ability to adapt demonstrated Sun’s agility in the face of market changes and the emergence of new standards.

On the technical front, SunOS brought innovations that left a lasting mark on computing. The fast file system, introduced in BSD 4.2 thanks to DARPA funding, revolutionized performance and combated file fragmentation. NFS, a purely Sun creation, became the reference solution for network file sharing. These advances, coupled with deep integration with Sun hardware, explain the system’s commercial success.

The history of SunOS reflects that of computing as a whole. Born in the academic world as a BSD variant, heir to that research tradition characteristic of UNIX, it evolved into a complete commercial solution. It integrated emerging standards without denying its technical specificities. This transformation accompanied that of Sun Microsystems, which moved from daring startup status to industry heavyweight.

When SunOS gave way to Solaris, a chapter closed, but the legacy remained. The innovations it carried, particularly in networking and distributed file systems, continue to influence the development of modern operating systems. Its trajectory illustrates how technical choices, commercial strategies, and industry standards intertwine to shape technological evolution.

SunOS helped establish UNIX as a credible platform for professional applications, particularly in technical and scientific sectors. Its success in universities and research centers trained an entire generation of computer scientists in UNIX concepts.

Agat

In 1980, while the Apple II was enjoying resounding success in American schools, the Soviet Union embarked on an ambitious yet obstacle-ridden technological venture. Kremlin leaders became aware of a worrying gap: their children were growing up without ever touching a computer keyboard. Faced with this reality, they decided to create the Agat, an almost exact copy of Steve Wozniak’s machine.

The Agat story begins with a painful realization for Soviet officials. In a society that prided itself on being at the forefront of scientific progress, personal computing remained largely unexplored territory. The Ministry of Education understood that swift action was needed to train a generation of computer users as a priority. Rather than starting from scratch, Soviet engineers made the pragmatic choice to replicate what was working elsewhere.

This decision revealed a broader strategy that the USSR had been practicing for decades. While this approach raised questions about the country’s technological autonomy, copying Western innovations saved time on research. The Agat fit into this logic of accelerated catch-up, with all that this implied.

Production therefore began in factories discovering the intricacies of consumer electronics. The first machines rolled off the assembly lines with a design strongly reminiscent of the Apple II, but the similarities often stopped at appearance. The machine’s heart beat to the rhythm of a K588 processor, the Soviet version of the famous 6502. RAM ranged from 16 to 32 kilobytes depending on the model, respectable figures for the time.

Released in 1983, the machine accumulated problems from its first months. Locally produced electronic components failed to deliver their promised reliability. Failure rates climbed, turning each computer delivery into a technological lottery. Schools receiving these machines discovered that some simply didn’t work, or broke down after a few weeks of use.

The software challenge proved even more formidable. Reproducing hardware was possible with good engineers and perseverance, but creating a complete software ecosystem required resources and expertise that were lacking. Programs designed for the Apple II stubbornly refused to run properly on the Agat. Each adaptation required considerable work, slowing down the development of a worthy software library.

The production figures speak for themselves. In three or four years, barely 6,000 units left Soviet factories. For a country with tens of thousands of schools, these volumes remained derisory. The goal of massively computerizing education became a pipe dream. Only a few pilot institutions benefited from these rare machines, creating a glaring inequality among Soviet students.

This scarcity stemmed from the structural limitations of Soviet industry. Central planning, so effective at producing steel or tractors, struggled to adapt to the demands of precision electronics. Delivery times stretched out, specifications changed mid-course, and coordination between different component suppliers turned into a logistical nightmare.

Technological isolation worsened the situation. Western embargoes on sensitive technologies deprived the USSR of advanced components and precision machine tools. Soviet engineers had to reinvent solutions proven elsewhere, losing precious time in a race where every month counted.

The Soviet state harbored ambiguous feelings toward the computerization it officially promoted. On one hand, leaders understood the strategic importance of these technologies. On the other, they feared the uncontrolled circulation of information that computer democratization might encourage. This mistrust showed through in the usage restrictions imposed on the Agat: no question of leaving them in the hands of unsupervised users.

Education officials also discovered that owning computers wasn’t enough. Training teachers, creating adapted educational programs, maintaining the computer fleet: so many ancillary questions that hadn’t been anticipated. Many Agats ended up in closets for lack of qualified personnel to use them.

By the mid-1980s, initial enthusiasm gave way to realism. Soviet officials began looking toward IBM PC clones, easier to produce and benefiting from a richer software ecosystem. The Agat, which was meant to symbolize Soviet technological independence, progressively came to symbolize failure.

Perestroika accelerated this reassessment. The gradual opening to Western technologies made the computer self-sufficiency project obsolete. Why persist in reinventing the wheel when you could buy or copy more efficiently what already existed? This question, long obscured by ideology, resurfaced with force.

The abandonment of the Agat marked the end of an illusion. The USSR discovered that catching up technologically required more than political will and massive investments. Modern technological ecosystems rest on complex networks of innovations, skills, and know-how that cannot be decreed from Moscow.

This experience left traces in the post-Soviet technological landscape. The delay accumulated in the 1980s still weighs on ex-USSR countries today. Russia remains largely dependent on Western computer technologies, a situation whose origins lie in the failure of projects like the Agat.

Beyond its purely technical aspects, the Agat adventure tells the story of a system confronted with its own contradictions. The USSR wanted to modernize its education without losing control of information, innovate without challenging its rigid structures, catch up with the West without drawing inspiration from its methods. These tensions, long masked by space and military successes, came to light with the emergence of personal computing.

The Agat bears witness to an era when technological innovation was a major geopolitical issue. Its failure was only the beginning of broader difficulties that the USSR would face with digital technology. In this technological race, the rules of the game had changed: individual creativity, organizational flexibility, and freedom of information took precedence over central planning and state secrecy.

The history of this Soviet computer reminds us that technologies never develop in a vacuum. The Agat, in its successes as in its failures, is a mirror of a USSR in transition, torn between its ambitions of power and the limits of its political and economic model.


Apple Lisa

In the corridors of Apple, late 1979, the excitement was palpable. A delegation had just returned from Xerox PARC in Palo Alto, and their faces betrayed barely contained enthusiasm. What they had seen there would radically transform a project that had seemed well underway. The Lisa, whose development had begun the previous year as a simple professional computer with a custom processor, would undergo a radical metamorphosis.

Larry Tesler had shown them Smalltalk. This graphical interface, with its overlapping windows and mouse for navigation, represented everything Apple had been seeking without knowing it. Tesler, captivated by the visitors’ sharp intellect, would join them in July 1980. From the initial project, only the code name would remain, along with a few components and a handful of engineers. Everything else would be reimagined.

Apple was betting big on this venture. Fifty million dollars vanished into the development labyrinth, the equivalent of 200 person-years of work. The team swelled to 300 people, a hundred of whom devoted themselves to the project’s core: making hardware and software communicate in perfect harmony. The custom processor gave way to the Motorola 68000, better suited to the demands of multitasking and graphics.

Apple’s engineers invented what would become our everyday computing experience. The menu bar appeared for the first time, accompanied by a single-button mouse. The clipboard concept was born in these laboratories, as was that famous trash can that would swallow our unwanted files. True, the Xerox Star already used icons, but the Lisa went further by allowing them to be grabbed with the mouse, moved, and opened with a double-click. Windows naturally overlapped, creating this electronic desktop effect.

The obsession with detail pushed the team to heights of perfectionism. Novice users paraded through the laboratories to test each interface element. Debates over terminology dragged on: what to call this button, that function? Translators spent months seeking equivalents in other languages, aware that every word mattered in the user experience.

Seven applications accompanied the operating system: LisaWrite for text, LisaDraw for drawing, LisaCalc for organized tables. LisaGraph translated data into meaningful curves, LisaProject orchestrated complex projects, LisaList organized information into structured databases, and LisaTerminal opened doors to the outside world. All shared a common philosophy: consistency trumped originality.

The 12-inch screen displayed its black-and-white pixels with unprecedented precision. The Motorola 68000 processor pulsed through 1 MB of memory, expandable to double. The “Twiggy” drives, named after the era’s slender model, proved temperamental. Sony would replace them with its more reliable 3.5-inch drives. The 5 MB ProFile hard drive completed the ensemble, a colossal amount of storage for the time.

The multitasking operating system stood as a technical feat. Virtual memory and error protection demonstrated a maturity rare in the world of personal computers. Bill Atkinson etched his name in history with QuickDraw, a graphics engine capable of displaying 4,000 characters and 800 lines per second. The file system’s redundancy mechanisms promised to limit catastrophes in case of sudden failure.

January 1983: Apple unveiled its creation to the world. The price: $9,995. An astronomical sum that made even the wealthiest companies raise an eyebrow. Sales started reasonably well, roughly 80,000 units sold in eighteen months. But enthusiasm quickly faded. The price tag discouraged buyers, third-party applications were slow to arrive, and most importantly, a little brother appeared: the Macintosh, more affordable and more compact, stole the show from 1984 onward.

April 1985 marked the end of the Lisa adventure. Production ceased, leaving Apple with an embarrassing inventory. The computer was briefly reborn as the Macintosh XL, modified to run its little brother’s applications thanks to the MacWorks program. Sun Remarketing bought the unsold units and artificially extended the machine’s career. As for the last remaining units, they ended up in a landfill, collateral victims of a shareholder lawsuit.

Yet the Lisa had sown seeds that would germinate everywhere. Its DNA found its way into the Macintosh, then into Windows, GEM, and Framework. Pull-down menus, dialog boxes, drag-and-drop became obvious features. QuickDraw migrated intact to the Macintosh, MacProject replaced LisaProject, LisaDraw transformed into MacDraw. The desktop manager inspired the Finder. Lisa Pascal continued its career as a reference language.

Beyond purely technical aspects, the Lisa redefined our relationship with computers. It proved that a complex machine could remain accessible, that power doesn’t exclude simplicity. Computing stepped out of its circles of insiders to address the masses. The principles it established—direct manipulation, visual consistency, immediate feedback—crossed decades without aging. But innovation alone isn’t enough: one must still find the right moment, the right price, the right audience. The Lisa failed commercially but triumphed conceptually.


Coleco Adam

Coleco Industries decided to leave the familiar world of video game consoles in 1983 to venture into the personal computer market. This American company, which had achieved success with its ColecoVision, launched the ADAM, a machine meant to be revolutionary in its design.

The initial concept was appealing: why force families to buy a computer, printer, keyboard, and storage media separately when everything could be integrated into a single product? The ADAM thus arrived on the market as a complete package for $750. In the box, buyers found the central unit equipped with 80 KB of RAM (expandable to 144 KB), a detachable 75-key keyboard with a professional feel, a daisy-wheel printer capable of producing quality correspondence, and two game controllers to honor the brand’s gaming origins.

The storage system was one of the machine’s most distinctive features. Coleco had developed a proprietary format called the “digital data pack,” a sort of large high-density magnetic cassette capable of storing up to 500 KB of data. This format promised storage capacities exceeding the 5.25-inch floppy disks of the era, but this promise would turn into a nightmare.

From a technical standpoint, the ADAM adopted a highly original architecture. Rather than concentrating all the intelligence in the central unit, Coleco had distributed multiple microprocessors throughout the system. The computer contained two, the keyboard had its own, and the printer also had its own electronic brain. This fairly visionary network approach was theoretically supposed to enable multitasking operations that no other home computer offered at the time.

SmartWriter, the ADAM’s flagship software, was directly embedded in the machine’s ROM. This integrated word processor represented an undeniable competitive advantage: where Apple II or Commodore 64 owners had to pay hundreds of additional dollars for WordStar or similar software, the ADAM offered this functionality right out of the box. SmartBASIC completed the base software offering, with announced compatibility with Apple’s AppleSoft BASIC. Coleco promised future support for the CP/M operating system, the gateway to the professional world.

The ADAM’s software catalog reflected Coleco’s educational ambitions. SmartLOGO, developed in collaboration with Seymour Papert from MIT, offered programming instruction for younger users. Sophisticated games like Buck Rogers Planet of Zoom sat alongside personal management utilities. Backward compatibility with ColecoVision cartridges added a welcome gaming dimension to this package.

Unfortunately, reality would dampen the initial enthusiasm. From the first months of commercialization, user reports multiplied signaling repeated malfunctions. The printer, presented as a major asset, broke down regularly. The magnetic tape drives proved temperamental, and the provided technical documentation was largely insufficient for understanding this complex machine’s operation.

The tape storage system crystallized all of the ADAM’s design problems on its own. Users discovered the hard way that cassettes had to be removed before turning the machine on or off, otherwise their data would be permanently corrupted. This constraint, never clearly explained in the manual, caused the loss of countless hours of work and fueled the machine’s poor reputation.

The network architecture, so attractive on paper, generated unexpected limitations. The printer served as the general power supply for the entire system, making it impossible to replace with a faster or more reliable model. The much-vaunted multitasking was actually quite limited: it was impossible to do anything else while printing was in progress.

Coleco’s policy toward third-party developers did not foster the emergence of a rich software ecosystem. Unlike Apple, which actively encouraged external developers, Coleco imposed restrictive licensing agreements and provided little technical information about its system. This closed approach limited opportunities to enrich the available software catalog.

Performance tests conducted by BYTE magazine in 1984 confirmed early users’ mixed impressions. While SmartBASIC proved indeed faster than Apple’s AppleSoft BASIC for certain calculations, tape read and write operations were dramatically slow compared to floppy disk drives. The absence of an automatic backup system and the difficulty copying data from one medium to another significantly complicated the machine’s daily use.

Faced with this failure, Coleco attempted to salvage the situation. A 5.25-inch floppy disk drive was announced, along with a 300/1200 baud modem and a 64 KB memory expansion. The company promised a complete overhaul of the technical manual and fixes for the most troublesome software bugs. But these belated improvements could not save a machine whose reputation was already permanently tarnished.

The ADAM disappeared from shelves in 1985, after less than two years of chaotic commercial existence. The considerable financial losses from this project precipitated Coleco toward difficulties that would lead to its bankruptcy in 1988. This story cruelly illustrates the pitfalls awaiting new entrants to the personal computer market of the 1980s. Technical complexity, the requirement for absolute reliability, and the need for a solid software ecosystem constituted significant barriers to entry.

In hindsight, the ADAM had nevertheless introduced forward-thinking concepts to home computing. The integration of a word processor directly into ROM foreshadowed our modern operating systems with their preinstalled applications. The “all-in-one” package concept would durably influence personal computer design.

The Coleco Adam adventure is valuable testimony to the importance of execution quality in a technological innovation’s success. It also reminds us how crucial an active developer community is for establishing a viable computing platform. These lessons remain entirely relevant for understanding current technology market dynamics.


Compaq Portable

While IBM dominated the market with its PC, a young Texas-based company, Compaq Computer Corporation, was preparing to shake up the rules of the game in 1983. Its bet: to create the first portable computer fully compatible with the IBM PC architecture.

The idea wasn’t completely new. Two years earlier, Adam Osborne had already commercialized his Osborne 1, presented as the first portable computer to achieve genuine commercial success. But Osborne’s machine suffered from a fatal flaw: it couldn’t run software designed for IBM PCs, which were establishing themselves as the market standard. This was exactly the problem that Compaq’s engineers would solve.

The Compaq Portable came in the form of a 12 kg briefcase, equipped with a sturdy handle. Its design heralded what would soon be called “luggables,” machines that one dragged more than carried. The 9-inch CRT screen stood at the center of the case, the keyboard detached to provide access to the 5.25-inch floppy drives. An external “power brick” completed the package, adding its considerable weight to that of the machine.

Under the hood, there was an Intel 8088 processor, the same one that equipped the IBM PC. Thanks to this total technical compatibility, a portable computer could for the first time run exactly the same programs as a desktop machine. The monochrome screen, expandable RAM, and floppy drive completed a package sold for $3,590, a price that clearly positioned it in the high-end segment.

Success came quickly. Professionals finally discovered a machine that gave them the freedom to work on the go without abandoning their software habits. The build quality and system reliability won over a demanding clientele, accustomed to IBM standards. Compaq also relied on a carefully selected distribution network and impeccable technical support.

In 1986, the Texas firm made a major splash with the Portable 386. This new generation integrated the Intel 80386 processor clocked at 20 MHz, a world first that propelled Compaq to the rank of technological pioneer. Performance soared: 25% faster than 16 MHz machines, and up to 3.5 times faster than computers based on the 80286 at 8 MHz.

The machine came in two versions: the Model 40 with its 40-megabyte hard drive, and the Model 100 which doubled the storage capacity. The most spectacular innovation lay in the adoption of a dual-mode plasma screen, with a resolution of 640 x 400 pixels unprecedented for this type of machine. One megabyte of RAM equipped the base configuration, expandable to 10 megabytes.

Ergonomics received careful attention. The detachable keyboard gained 12 programmable function keys, ventilation doubled in efficiency, and shock-resistant mounts protected the hard drives. Dimensions shrank: 24.8 x 40.6 x 19.8 centimeters for a weight reduced to 9 kg, a significant decrease compared to the initial model.

Connectivity expanded with serial and parallel ports, an RGBI output for external monitor, and the possibility of installing an internal 1,200 or 2,400 baud modem. An optional expansion module added two 8/16-bit slots. For backup, a 40-megabyte magnetic tape unit completed the accessories offering.

These successes propelled Compaq to the heights of American industry. Between 1983 and 1986, the company broke all records by becoming the fastest company to reach the Fortune 500 ranking. This meteoric rise validated the existence of a market for high-end professional portable computers and prompted the competition to join the race.

The “luggable” concept naturally evolved toward greater lightness. In 1988, Compaq introduced the SLT, its first portable adopting the “clamshell” format that characterizes our modern laptops. At 6 kg, the SLT marked a step toward true mobility, even though current standards remained distant.

The adventure of the Compaq Portable reveals the importance of compatibility in the computer economy. By offering machines 100% compatible with the IBM PC standard, Compaq established a business model that the entire industry would adopt. This strategy accelerated market standardization of PCs and gave rise to a genuine personal computing industry.

The Compaq Portable matched the performance of a desktop machine, a true revelation at the time. Commercially, it validated the specialized distribution model and established quality standards that would durably influence the sector.

This period of intense innovation saw manufacturers seeking the balance between power, autonomy, and portability. The compromises of the Compaq Portable, with its power “brick” and its 12 kg, may seem archaic by our current standards. Yet, this machine paved the way toward modern mobile computing and proved that successful technological innovation always results from the convergence of technical advances, an accurate vision of the market, and a coherent commercial strategy.

Thanks to this winning combination, Compaq established itself as a major player in the computer industry, a position it would maintain until its absorption by Hewlett-Packard in 2002. The story of the Compaq Portable is that of a company that knew how to transform a technical constraint—IBM compatibility—into a decisive competitive advantage.


GNU Project

The story begins in 1983 in the corridors of MIT’s artificial intelligence laboratory. Richard Stallman, a seasoned programmer, experiences something that will change his life and, consequently, the history of computing. Faced with a malfunctioning Xerox printer, he encounters an outright refusal: no access to the source code to fix the problem. This frustration crystallizes a deeper unease: the software industry is shifting toward a closed model that breaks with the tradition of sharing that had characterized university laboratories until then.

This realization drives Stallman to announce, that same year of 1983, the launch of a project of rare ambition: GNU. The recursive acronym “GNU’s Not UNIX” conceals a revealing objective: to build a complete operating system, UNIX-compatible certainly, but freed from any proprietary constraints. Every user must be able to study, modify, and share the programs they use daily.

To give substance to this vision, Stallman takes a decisive step in 1984: he resigns from MIT. This break guarantees him intellectual property rights over his future work. He first tackles a text editor, GNU Emacs, which achieves unexpected success in the computing community. The momentum created encourages what follows: GCC, the compiler, emerges in 1987, followed by the GDB debugger and a full range of system utilities.

In 1985, an institutional turning point occurs with the creation of the Free Software Foundation. This organization goes beyond mere technical support for the GNU project: it forges the philosophical and legal foundations of an emerging movement. Stallman defines four freedoms he considers inalienable: to use, study, modify, and redistribute software. These principles find their legal expression in 1989 with the General Public License, a license that cleverly turns copyright against itself to guarantee the permanence of the freedoms granted.

Development is organized according to a model unprecedented for the time. Small teams of geographically dispersed volunteers, coordination through mailing lists, file exchanges via FTP: the Internet makes possible this decentralized collaboration, which prefigures our modern software development methods. Each team works on specific components according to a meticulous plan, an approach that contrasts with the improvisation often criticized in community projects.

Around 1990, the GNU edifice approaches completion. Compiler, debugger, editors, system utilities: nearly all elements of an operating system are in place. Yet the centerpiece is missing: the core of the system, its kernel. HURD, the GNU kernel, becomes mired in unexpected technical complexity. This is where a young Finnish student, Linus Torvalds, comes in, launching the development of Linux in 1991. The meeting between GNU tools and this new kernel gives birth to a complete and free system that Stallman names GNU/Linux, thus emphasizing his project’s contribution.

The GPL gradually becomes a de facto standard, adopted by thousands of developers worldwide. Companies like Cygnus Solutions, founded in 1989, prove that a viable business model can be built around free software by selling support and services. The collaborative methodology initiated by GNU inspires an entire generation of projects.

The questions raised by Stallman and the FSF fuel broader debates about intellectual property in the digital age. The concepts of software freedom spread to other domains: free culture, open data, open science. This conceptual expansion reveals the philosophical scope of an initiative born from technical frustration.

The 1990s and 2000s confront the GNU project with new challenges. Software patents, digital rights management, emergence of a more pragmatic “open source” movement: so many developments that test the movement’s doctrinal coherence. The FSF maintains its uncompromising stance on freedom issues, sometimes at the cost of relative marginalization compared to less radical approaches.

Yet the facts prove this perseverance right. GNU/Linux establishes itself massively in web servers, conquers supercomputers and, through Android, infiltrates billions of mobile devices. GCC remains an essential technical reference. GNU tools continue to equip the majority of modern development environments.

The GNU project has demonstrated, beyond its technical achievements, that an initiative driven by strong ethical convictions could durably transform an entire industrial sector. By simultaneously creating high-performance tools and a coherent legal-philosophical framework, Stallman and his collaborators founded a viable alternative ecosystem.

This historical contribution illuminates our contemporary issues. At a time when technology shapes our societies with growing intensity, the questions raised by the GNU project about freedom and control in the digital world resonate with particular acuity. The technical utopia of 1983 has transformed into economic and social reality, proving that ideas, even the most radical ones, can change the world when they meet the appropriate technical tools.


DNS

The history of the Domain Name System begins with a file. A simple text file called HOSTS.TXT that the SRI Network Information Center laboriously distributed across ARPANET in the early 1980s. This makeshift solution worked as long as the network remained limited to a few dozen machines, but the explosion in the number of connected computers transformed this system into an administrative nightmare.

Paul Mockapetris and his team at the USC Information Sciences Institute observed this drift with concern in 1982. The HOSTS.TXT file grew larger every day, its distribution was costly, and nobody could keep up with the pace of modifications. Technological evolution worsened the situation: whereas the original ARPANET primarily connected shared central systems, the arrival of individual workstations exponentially multiplied the number of machines to reference. The scale shifted from the number of organizations to the number of users—a dizzying prospect for the time.

Faced with this impasse, Mockapetris explored existing alternatives. He studied DARPA’s IEN 116, examined Xerox’s Grapevine, but no system truly met the specific needs of the Internet. The team then decided to design an entirely new architecture, distributed by nature: the Domain Name System. The first specifications appeared in 1983 in RFCs 882 and 883, technical documents that would transform the Internet.

The central idea of DNS lies in its tree structure. Each node of the tree carries a label, and a computer’s full name is constructed by ascending from node to node up to the root. This natural hierarchy makes delegation of authority elegant because each organization manages its portion of the namespace without depending on a central body for routine modifications. The first top-level domains reflected this pragmatic philosophy: .edu for education, .com for businesses, .fr and .uk for countries.
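
To make this structure concrete, here is a small illustrative C++ sketch (the host name is a made-up example) that splits a domain name into its labels and rebuilds the chain of zones from the root down to the host, the path along which authority is delegated:

```cpp
// Illustrative only: split a domain name into labels and walk from the
// root down to the host, showing how each zone contains the next.
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main() {
    const std::string name = "www.example.com";  // hypothetical host name

    // Cut the name on the dots to recover the individual labels.
    std::vector<std::string> labels;
    std::stringstream ss(name);
    for (std::string label; std::getline(ss, label, '.');) {
        labels.push_back(label);
    }

    // Print the chain of zones: (root) -> com -> example.com -> www.example.com
    std::string zone;
    std::cout << "(root)\n";
    for (auto it = labels.rbegin(); it != labels.rend(); ++it) {
        zone = zone.empty() ? *it : *it + "." + zone;
        std::cout << zone << "\n";
    }
    return 0;
}
```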

The technical architecture rests on two pillars. On one side, name servers store and distribute information for their zone of responsibility. On the other, resolvers integrated into client systems query these servers to locate the requested information. Between the two, a sophisticated caching system temporarily stores responses according to a time-to-live defined by each administrator. This caching constitutes DNS’s performance secret: it avoids endlessly repeating the same queries to distant servers.
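
On the client side, most programs never speak to a name server directly; they go through the stub resolver provided by the operating system. A minimal sketch, assuming a POSIX system and using a placeholder host name, might look like this:

```cpp
// Minimal sketch of the client side: ask the system's stub resolver to
// translate a name into addresses. Assumes a POSIX system; the host
// name is just a placeholder.
#include <arpa/inet.h>
#include <netdb.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <iostream>

int main() {
    addrinfo hints{};
    hints.ai_family = AF_UNSPEC;      // accept IPv4 or IPv6 answers
    hints.ai_socktype = SOCK_STREAM;

    addrinfo* results = nullptr;
    if (int err = getaddrinfo("example.org", nullptr, &hints, &results); err != 0) {
        std::cerr << "resolution failed: " << gai_strerror(err) << "\n";
        return 1;
    }

    // Each result corresponds to one address record returned for the name.
    for (addrinfo* p = results; p != nullptr; p = p->ai_next) {
        char buf[INET6_ADDRSTRLEN] = {};
        void* addr = (p->ai_family == AF_INET)
            ? static_cast<void*>(&reinterpret_cast<sockaddr_in*>(p->ai_addr)->sin_addr)
            : static_cast<void*>(&reinterpret_cast<sockaddr_in6*>(p->ai_addr)->sin6_addr);
        inet_ntop(p->ai_family, addr, buf, sizeof(buf));
        std::cout << buf << "\n";
    }
    freeaddrinfo(results);
    return 0;
}
```

Whether the answer comes from a nearby cache or from an authoritative server several levels down the tree is invisible to the calling program, which is precisely the point of the design.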

The transition from HOSTS.TXT did not happen overnight. The University of California at Berkeley took the lead in 1985, becoming the first organization to completely abandon the old system in favor of DNS. This pioneering migration immediately revealed practical problems: users struggled to adapt to new addressing formats, and applications had to be reprogrammed to handle the temporary failures inherent in any distributed system.

In 1987, approximately 300 domains had been delegated by the SRI-NIC. A year later, this number already exceeded 650, including 400 for normal namespaces and 250 for network address spaces. Seven root servers ensured the system’s global redundancy, judiciously distributed across the Internet’s main backbone networks. This exponential growth validated Mockapetris’s intuitions: DNS absorbed without difficulty a load that the old system could never have supported.

The 1990s saw DNS usage explode with the arrival of the Web and the democratization of the Internet. The system proved its robustness in the face of this unexpected surge in demand. New record types were progressively added to handle new functionalities, such as MX records that revolutionized email routing. One of the protocol’s major strengths lies in this capacity for evolution without breaking compatibility.

But success also attracted malicious interest. In the early 2000s, cache poisoning attacks (DNS spoofing) multiplied, and domain hijackings made headlines in specialized publications. The technical community responded with cryptography through DNSSEC, a protocol for securing DNS exchanges. Its deployment began symbolically in 2010 for the DNS root, before progressively extending to top-level domains and then to organizations.

Internationalization raised other issues, both technical and political. How could Chinese, Arabic, or Russian users be allowed to use their own characters in domain names? IDNs (Internationalized Domain Names) provided an elegant answer in the 2000s, but their implementation required profound modifications to ensure compatibility with billions of existing systems.

DNS governance underwent a major evolution in 1998 with the creation of ICANN. This non-profit organization took over from the U.S. government to coordinate the global domain name system. ICANN oversees the allocation of new top-level domains, establishes naming policies, and attempts as best it can to reconcile technical and commercial interests.

Between 2012 and 2024, DNS went through a new expansion phase. ICANN’s new gTLD program released hundreds of novel extensions: .shop, .paris, .blog, and thousands of others. Simultaneously, privacy concerns drove the development of DNS-over-HTTPS and DNS-over-TLS. These protocols encrypt DNS queries, limiting the possibilities for surveillance and censorship by intermediaries.

Forty years after its conception, DNS continues to operate on its original principles. This exceptional longevity is explained by Mockapetris’s architectural choices: favoring simplicity over sophistication, flexibility over premature optimization. The caching mechanisms and hierarchical distribution of authority remain relevant despite technological evolution. DNS illustrates how an infrastructure can grow while preserving its compatibility with existing systems.

In 2024, DNS manages billions of domain names and responds to hundreds of billions of daily queries. This invisible infrastructure supports all global Internet traffic. Its current challenges concern security against denial-of-service attacks, query privacy, and root server resilience.

The DNS adventure tells how a technical innovation can restructure the Internet’s architecture, transforming a simple directory into a critical global infrastructure. Its modular and evolutionary design is a model for all protocols aspiring to longevity on the Internet.


C++

In the 1970s, Bjarne Stroustrup discovered at Cambridge the joys and frustrations of Simula. He liked the language for its ability to structure code with classes, but was exasperated by its slow execution. For his thesis on distributed systems, he had to abandon Simula’s elegance and return to BCPL, faster but less organized. This contradiction left a mark: why should one have to choose between performance and code clarity?

When Stroustrup joined AT&T’s Bell Labs in 1979, he kept this question in mind. The projects he worked on—analysis of the UNIX kernel, network distribution—demanded both efficiency and structure. He set himself the bold challenge of grafting Simula’s classes onto the C language. Thus was born “C with Classes,” a hybrid that retained C’s speed while adding object-oriented organization.

The first steps were modest. Stroustrup added classes, then constructors and destructors to automatically manage object initialization and destruction. Public-private access control appeared next, followed by virtual functions that enabled polymorphism. Each addition answered a specific need encountered in the field.
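
In today’s notation, the features listed above look roughly like this (a minimal illustration, not code from the period):

```cpp
#include <iostream>

class Shape {
public:
    virtual ~Shape() = default;       // virtual destructor for safe cleanup
    virtual double area() const = 0;  // virtual function: resolved at run time
};

class Rectangle : public Shape {
public:
    Rectangle(double w, double h) : width_(w), height_(h) {}  // constructor
    double area() const override { return width_ * height_; }
private:
    double width_, height_;           // private data, hidden from client code
};

int main() {
    Rectangle r(3.0, 4.0);
    const Shape& s = r;               // a Rectangle seen through its base class
    std::cout << s.area() << "\n";    // prints 12: the call is dispatched dynamically
    return 0;
}
```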

In 1983, Rick Mascitti suggested a new name: C++. C’s increment operator perfectly symbolized this evolution of the language. The name stuck and history would remember this designation. Two years later, in 1985, the first commercial version emerged from Bell Labs, accompanied by Stroustrup’s foundational book, The C++ Programming Language. The Cfront compiler translated C++ code into C, an elegant solution to ensure maximum portability.

Success came quickly. Programmers discovered that they could retain their C habits while accessing the benefits of object orientation. No abrupt change, but a smooth transition. The language didn’t slow down execution compared to C, didn’t require a specialized environment like Smalltalk. This pragmatic approach won over a growing community.

Between 1985 and 1989, C++ grew richer. Multiple inheritance appeared in 1989, templates opened the way to generic programming, exception handling brought greater robustness. Each feature matured first in use before being integrated into the language. This caution forged C++’s reputation for stability.

The growth was impressive: from a few hundred users in 1985, the number rose to 400,000 in 1991. Application domains multiplied: operating systems, video games, graphical interfaces, scientific computing. Development environments appeared, specialized libraries enriched the ecosystem.

In 1989, the standardization process began under the auspices of ANSI and ISO. The work culminated in 1998 with the publication of the first international standard (ISO/IEC 14882:1998). This standardization freed C++ from its creator and ensured its longevity beyond Bell Labs.

The Standard Template Library deserves special mention. Alexander Stepanov developed a transformative approach: generic containers (vector, list, map) and reusable algorithms that work with all these containers. This library transformed the way C++ was written by favoring the reuse of proven code.

The iostream input-output system advantageously replaced the printf and scanf inherited from C. Safer with respect to types and more extensible, it integrated naturally into the C++ philosophy. These additions to the standard library made the language richer without burdening the compiler core.
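
A short sketch gives the flavor of this standard library style: a generic container, a reusable algorithm, and type-safe iostream output (the values are arbitrary):

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> values = {42, 7, 19, 3};   // generic container
    std::sort(values.begin(), values.end());    // algorithm reused across containers
    for (int v : values) {
        std::cout << v << ' ';                  // type-safe output, no format strings
    }
    std::cout << '\n';
    return 0;
}
```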

Stroustrup defended constant principles throughout the evolution: no performance degradation compared to C, backward compatibility preserved, support for varied programming styles. C++ accepts classic procedural programming, object-oriented with classes and inheritance, generic programming with templates. This flexibility explains its adoption in very diverse contexts.

C++ popularized object orientation in domains traditionally reserved for procedural languages. Java and C# would draw heavily from its concepts, even if they made different architectural choices. Generic programming with templates inspired many subsequent languages.

C++ continues its evolution through successive standards. C++11 modernized the language with lambda expressions and smart pointers. C++14, C++17 and C++20 continue to expand possibilities: modules, coroutines, concepts. The language adapts to multicore architectures, parallel computing needs, and modern programming constraints.
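
Two of these modern additions can be shown in a few lines; the following sketch (values chosen arbitrarily) uses a lambda expression to customize an algorithm and a smart pointer to manage a resource automatically:

```cpp
#include <algorithm>
#include <iostream>
#include <memory>
#include <vector>

int main() {
    std::vector<int> v = {5, 2, 9, 1};
    // Lambda expression: an anonymous function defined at the point of use.
    std::sort(v.begin(), v.end(), [](int a, int b) { return a > b; });

    // Smart pointer: ownership is explicit and release is automatic.
    std::unique_ptr<int> biggest(new int(v.front()));
    std::cout << *biggest << "\n";   // prints 9
    return 0;
}
```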

This longevity stems from its designers’ pragmatic approach. Rather than revolutionizing for revolution’s sake, they always favored practical utility and preservation of the existing. C++ remains true to its origins: a powerful tool for those who want fine-grained control over resources without sacrificing code expressiveness. The history of this language illustrates how a good idea, nourished by field experience, crosses decades by adapting without losing its identity.


Objective-C

Brad Cox and Tom Love probably never imagined that their decision to hybridize the C language with Smalltalk concepts would give birth to one of the most enduring technologies in modern computing. Objective-C, created in 1983, would traverse the decades by constantly adapting.

The starting point was nevertheless simple. Dennis Ritchie had designed C at Bell Labs in the early 1970s, building on Ken Thompson’s earlier work, to develop UNIX. This procedural language offered an interesting compromise between performance and readability, but its limitations became apparent in building complex systems. On the other side, at Xerox PARC, Alan Kay and Dan Ingalls were revolutionizing the programming approach with Smalltalk and its objects that encapsulated data and behaviors.

Cox and Love envisioned marrying these two worlds. Rather than creating a new language from scratch, they extended the existing C by grafting Smalltalk’s object-oriented programming onto it. This pragmatic approach preserved the investment in C code while opening new possibilities for software design. The Free Software Foundation recognized the value of this innovation and integrated it into its toolbox.

Objective-C became a niche language appreciated by curious computer scientists. When Steve Jobs left Apple in 1985 to launch NeXT, he bet on Objective-C to develop NeXTSTEP. This technical choice transformed the language’s destiny. NeXT pushed collaboration to the point of working with Sun Microsystems on OpenStep, a standardized version of their system. The Free Software Foundation followed suit with GNUstep.

Apple, bogged down in the 1990s, desperately sought to modernize Mac OS. After several failed attempts, two options presented themselves to the company: acquire Be, Inc. from Jean-Louis Gassée or buy NeXT. The choice of NeXT in 1996 sealed Jobs’s return and made Objective-C the technical heart of Mac OS X. A simple experimental language suddenly became the foundation of Apple’s future.

The iPhone arrived in 2007. iOS, derived from Mac OS X, propelled Objective-C onto millions of mobile devices. The iPad three years later amplified the phenomenon even further. Developers worldwide discovered this hybrid language to create applications reaching hundreds of millions of users. This unexpected explosion revealed the robustness of the architecture imagined by Cox and Love twenty-five years earlier.

Faced with this massive adoption, Apple introduced substantial improvements, notably Automatic Reference Counting (ARC), which drastically simplified memory management. Gone were memory leaks and crashes related to lost pointers: ARC automated these tedious and error-prone tasks.

Objective-C’s syntax bears the traces of this turbulent history. Traditional C coexists with Smalltalk’s brackets in a cohabitation that surprises at first glance but reveals its coherence in use. The message system, inherited from Smalltalk, replaces direct method calls with a more flexible mechanism where objects communicate with each other. This approach enables dynamic method binding and facilitates system extension.
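
The flavor of this message mechanism can be approximated, for readers unfamiliar with Objective-C, by a deliberately rough C++ sketch in which a method is looked up by name at run time rather than bound at compile time (all names here are invented for illustration, and this is not how Objective-C is implemented):

```cpp
// Not Objective-C: a rough analogy of message sending, where behavior is
// found through a "selector" table consulted at run time.
#include <functional>
#include <iostream>
#include <map>
#include <string>

class Receiver {
public:
    Receiver() {
        handlers_["greet"] = [] { std::cout << "hello\n"; };  // message name -> behavior
    }
    void send(const std::string& selector) {
        auto it = handlers_.find(selector);
        if (it != handlers_.end()) {
            it->second();                                      // message understood
        } else {
            std::cout << "does not respond to " << selector << "\n";
        }
    }
private:
    std::map<std::string, std::function<void()>> handlers_;
};

int main() {
    Receiver obj;
    obj.send("greet");      // dispatched by name at run time
    obj.send("unknown");    // a dynamic system can handle this gracefully
    return 0;
}
```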

Frameworks accompany this syntactic richness. Foundation provides the basic building blocks: data manipulation, collections, file operations. These libraries form a coherent whole that accelerates development and standardizes practices. The id type perfectly illustrates the language’s philosophy: it allows generic manipulation of objects while preserving type safety when necessary.

The ecosystem developed beyond Apple’s borders. GNUstep maintains Objective-C’s cross-platform portability, demonstrating that the concepts transcend the proprietary implementation. Xcode enriches the developer experience with Interface Builder and its visual tools. The community creates libraries, frameworks, and best practices that nurture a virtuous circle.

Swift arrived in 2014 as Apple’s new champion. This announcement did not sound the death knell for Objective-C but testified to its maturity. Apple organized a gradual coexistence rather than an abrupt break, implicitly acknowledging the value of its historical language. The millions of lines of Objective-C code will continue to function for years.

Objective-C demonstrated that extending an existing language could succeed where creating from scratch often fails. Its message model inspired numerous object-oriented frameworks. Its ability to mix procedural and object paradigms opened paths explored by other languages.

Forty years after its birth, Objective-C remains a lesson in pragmatic engineering. Cox and Love knew how to identify the strengths of two different worlds and intelligently merge them. Their creation survived trends, acquisitions, mobile devices, and continues to power applications used daily by hundreds of millions of people. This longevity testifies to a solid design that knew how to adapt without losing its identity.


IPX/SPX

In the mid-1980s, when Novell was working on NetWare, its engineers faced a challenge: creating a network protocol capable of enabling communication between computers distributed throughout the enterprise. They didn’t start from scratch. The XNS system that Xerox had developed a few years earlier served as their model. This pragmatic approach gave birth to IPX/SPX, a protocol suite that would leave its mark on an entire decade of network computing.

The era was marked by the first steps in networking. Companies were discovering the value of sharing files and printers across multiple workstations. NetWare offered an answer to these needs with its client-server architecture, where IPX/SPX played the role of messenger between machines. The protocol was structured in two distinct layers: IPX for routing packets from one point to another, SPX for guaranteeing their proper reception.

Novell had found an elegant trick to simplify network configuration. Where other protocols required complex mapping tables, IPX used the network card’s MAC address directly as the machine identifier. Each IPX address combined a 32-bit network number, defined by the administrator, with this famous 48-bit MAC address. Ultimately, there was no longer any need for an additional protocol to link the logical and physical identities of the machine.
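
A small illustrative sketch (with made-up values) shows how such an address is simply the concatenation of the two parts, printed in the usual network:node notation:

```cpp
// Illustrative only: compose an IPX-style address from a 32-bit network
// number and a 48-bit node identifier (the card's MAC address).
#include <cstdint>
#include <cstdio>

int main() {
    std::uint32_t network = 0x00000042;  // chosen by the administrator
    std::uint8_t  node[6] = {0x00, 0x1B, 0x44, 0x11, 0x3A, 0xB7};  // MAC address

    std::printf("%08X:", static_cast<unsigned>(network));
    for (int i = 0; i < 6; ++i) {
        std::printf("%02X", node[i]);
    }
    std::printf("\n");   // e.g. 00000042:001B44113AB7
    return 0;
}
```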

Routing relied on RIP (Routing Information Protocol), but not IP’s RIP. The two protocols shared the name without being compatible. Every minute, IPX routers exchanged their routing tables using a particular metric called a “tick”. A tick corresponded to one-eighteenth of a second, providing an estimate of the delay on each link. If two routes showed an identical number of ticks, the hop count would decide.
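
The selection rule can be summarized in a few lines; the following sketch (with invented routes) compares candidates by ticks first and breaks ties on the hop count:

```cpp
// Illustrative sketch of IPX RIP route selection: fewest ticks wins,
// and the hop count decides between equal-tick routes.
#include <algorithm>
#include <iostream>
#include <vector>

struct IpxRoute {
    int ticks;   // estimated delay, in 1/18-second units
    int hops;    // number of routers crossed
};

int main() {
    std::vector<IpxRoute> routes = {{6, 4}, {6, 2}, {7, 1}};
    auto best = std::min_element(routes.begin(), routes.end(),
        [](const IpxRoute& a, const IpxRoute& b) {
            if (a.ticks != b.ticks) return a.ticks < b.ticks;  // fewer ticks wins
            return a.hops < b.hops;                            // then fewer hops
        });
    std::cout << "best route: " << best->ticks << " ticks, "
              << best->hops << " hops\n";   // prints: 6 ticks, 2 hops
    return 0;
}
```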

SAP (Service Advertising Protocol) added an extra dimension to the IPX/SPX ecosystem. This service announcement protocol transformed the network into a vast bulletin board where each server proclaimed its capabilities: file sharing, printing, messaging. As with RIP, announcements repeated every minute. Routers maintained a list of all available services, ready to inform any client seeking a particular resource.

IPX/SPX’s adaptability manifested in its ability to coexist with different Ethernet standards. The protocol accepted multiple encapsulation types: Novell’s proprietary format, IEEE 802.2 and 802.3 standards, Ethernet II, and SNAP. This flexibility facilitated integration into heterogeneous environments, provided consistency was maintained in the configuration of all equipment.

At the top of the protocol stack, NCP (the NetWare Core Protocol) handled the higher-level services: file access, print control, security. NetBIOS found its place through an emulation layer that allowed existing applications to continue functioning. NLM modules extended the system’s possibilities with new protocols, communication services, or database access.

The 1990s saw IPX/SPX flourish in enterprises. Its ease of implementation and good performance made it a natural choice for network administrators. But the Internet was growing and TCP/IP was establishing itself as the universal language of digital communications. Faced with this rise in power, Novell had to revise its strategy.

The company gradually integrated TCP/IP into NetWare. Later versions offered the ability to encapsulate IPX in UDP/IP packets to traverse networks based on the Internet protocol. The arrival of NDS in NetWare 4.0 reduced dependence on the SAP protocol by allowing clients to locate services through a centralized directory.

The transition accelerated in the 2000s. New applications were born with TCP/IP in their DNA, relegating IPX/SPX to the status of legacy technology. Novell eventually abandoned development of its protocol suite to turn toward IP-based solutions.

This evolution shouldn’t obscure IPX/SPX’s lasting influence. The direct use of MAC addresses, automatic service discovery—these concepts would resurface in more recent technologies. Modern protocols like mDNS or UPnP revisit ideas that SAP had already explored.

The story of IPX/SPX tells the tale of an era when network protocols competed to impose their vision of communication. Faced with the universal standardization of TCP/IP, proprietary solutions, whether technically accomplished or not, could only fade away. This lesson extends beyond the purely technical realm: in an interconnected world, openness and interoperability often prevail over local excellence.

IPX/SPX’s fate ultimately illustrates an immutable truth of computing: no technology, however brilliant, survives without adapting to market changes. TCP/IP didn’t triumph solely through its intrinsic qualities, but because it became the lingua franca of the Internet. In this battle of protocols, universality had the last word.


SMB

In 1984, IBM created a protocol that would transform how computers share their resources: SMB, or Server Message Block. At the time, no one imagined this technology would become the backbone of modern enterprise networks. Yet the concept seemed simple: enable remote machines to share files and printers via NetBIOS.

Microsoft seized this innovation. As early as 1987, the Redmond firm integrated SMB into its LAN Manager program, followed the next year by IBM with its OS/2 LAN server. These initial implementations marked the beginning of a system that would soon colonize offices worldwide. Local network resource sharing had finally found its path.

Nine years later, Microsoft reached a decisive milestone. Windows NT 4.0 arrived in 1996 with CIFS, the Common Internet File System. This dialect of SMB adapted to increasingly complex enterprise networks. A draft specification was submitted to the IETF the following year, while SNIA proposed a technical specification in 1999. The protocol gradually emerged from its proprietary constraints.

The millennium changed, and with it the protocol’s designation. Windows 2000 returned to the SMB name while bringing considerable extensions. A dedicated TCP port (445) was added to the traditional NetBIOS ports (137-139). This evolution accompanied NetBIOS’s migration to TCP/IP, reflecting the growing maturity of network infrastructures.

The year 2006 marked a turning point. Windows Vista introduced SMB 2.0, a version that revolutionized the protocol’s architecture. Microsoft streamlined the design: the 75 commands of the old version were reduced to 19 instructions. Asynchronous operations appeared, reads and writes gained efficiency. HMAC SHA-256 hashing replaced the antiquated MD5, strengthening the security of a protocol that had become indispensable.

Windows 7 refined this foundation with SMB 2.1 in 2009. Cache management improved, high-bandwidth networks benefited. The file leasing system replaced the opportunistic locking mechanism, reducing network chatter while improving metadata management.

But 2012 witnessed SMB’s finest transformation. Windows 8 and Windows Server 2012 delivered SMB 3.0, a version that finally addressed modern enterprise needs. High availability became possible through transparent failover. Multichannel aggregated multiple network connections, multiplying performance. The RDMA protocol integrated naturally, achieving previously unimaginable throughput. AES-CCM encryption secured the most sensitive exchanges.

The evolution didn’t stop there. SMB 3.02 arrived with Windows 8.1 the following year, bringing support for asymmetric clusters and perfecting RDMA. Windows 10 introduced version 3.1.1, further hardening security against growing threats.

Meanwhile, another story unfolded in the free software world. Andrew Tridgell developed Samba, an alternative implementation that reconciled the Windows and UNIX universes. This titanic reverse-engineering project forced Microsoft to publish its specifications, democratizing a long-guarded protocol. Samba transformed SMB into a de facto standard, transcending operating system boundaries.

This four-decade evolution tells the story of enterprise networking. From a simple local sharing mechanism, SMB evolved into a complex system managing performance, security, and high availability. Each version responded to shifts in enterprise computing: virtualization, cloud, data protection.

Security remains the protocol’s Achilles’ heel. The early LM and NTLM authentication mechanisms showed their weaknesses. NTLMv2 and packet signing, introduced with Windows NT 4.0 SP3, patched the gaps. Recent versions integrate robust encryption and anti-attack protections, addressing contemporary threats.

Having become the universal standard for file sharing, SMB equips all modern operating systems. Its constant adaptation accompanies distributed architectures, cloud storage, and new cybersecurity threats.

The SMB relay attack demonstrated by the Cult of the Dead Cow in 2001, only fully patched years later (CVE-2008-4037), necessitated major fixes and the addition of specific protections. Each incident forged a more resilient protocol.

Modern IT environments demand ever more: increased performance, enhanced security, adaptation to new storage architectures. This perpetual capacity for evolution explains the longevity of a protocol born forty years ago, still indispensable to 21st-century enterprises.


Lotus 1-2-3

VisiCalc reigned supreme in the spreadsheet market when Mitch Kapor and Jonathan Sachs decided to create their own software. It was 1982, and these two men knew well the universe they were about to disrupt. Kapor had worked at Personal Software, VisiCalc’s publisher, where he designed VisiPlot and VisiTrend for graphical visualization. Sachs had already programmed spreadsheets, notably for Data General. This field knowledge gave them a head start: they knew exactly what existing solutions were missing.

The IBM PC, launched the previous year, brought new credibility to microcomputing and offered technical capabilities far beyond 8-bit computers. Its memory, addressable up to 640 KB, and its 16-bit processor opened unprecedented possibilities. Kapor and Sachs decided to bet on this platform and design a spreadsheet that would fully exploit its resources.

Their technical approach broke with their competitors’. While others used high-level languages, they chose assembly language to maximize performance. They programmed direct interface with video hardware, bypassing the BIOS to accelerate display. They rethought the cell recalculation order to optimize the updating of interdependent formulas, resulting in execution roughly five times faster than competing spreadsheets.
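
The idea behind dependency-ordered recalculation can be sketched in a few lines of modern C++; this is purely illustrative (the cell names, formulas, and absence of cycle detection are simplifications), not Lotus’s actual algorithm:

```cpp
// Illustrative sketch: recalculate cells in dependency order, so each
// formula is evaluated only after the cells it refers to are up to date.
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Cell {
    std::vector<std::string> depends_on;                          // referenced cells
    std::function<double(std::map<std::string, double>&)> formula;
};

// Depth-first visit: compute a cell's precedents before the cell itself.
void recalc(const std::string& name, std::map<std::string, Cell>& sheet,
            std::map<std::string, double>& values) {
    if (values.count(name)) return;                               // already up to date
    for (const auto& dep : sheet[name].depends_on) {
        recalc(dep, sheet, values);
    }
    values[name] = sheet[name].formula(values);
}

int main() {
    std::map<std::string, Cell> sheet;
    std::map<std::string, double> values;
    sheet["A1"] = {{}, [](auto&) { return 10.0; }};
    sheet["A2"] = {{}, [](auto&) { return 32.0; }};
    sheet["B1"] = {{"A1", "A2"}, [](auto& v) { return v["A1"] + v["A2"]; }};

    recalc("B1", sheet, values);
    std::cout << "B1 = " << values["B1"] << "\n";                 // prints 42
    return 0;
}
```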

The user interface benefited from remarkable innovations. Pull-down menus with mobile cursor and contextual prompts made the software accessible to novices. Graphics integration instantly transformed data into diagrams, creating spectacular demonstrations that sparked prospects’ enthusiasm. Keyboard macros, added despite Sachs’s initial reluctance, gave experienced users the ability to automate their tasks and develop true applications.

Lotus Development Corporation didn’t settle for a good product: the company also innovated in its commercial strategy. It trained its resellers, took care of customer support, and accompanied the software with a demonstration disk featuring an interactive tutorial. The documentation was carefully crafted. Above all, the company invested heavily in advertising in the business press, an unusual approach for a software publisher at that time.

The launch at Comdex in fall 1982 exceeded all expectations: $900,000 in orders in the first few days. In 1983, the first full year of commercialization, revenue reached $53 million. The following year, it jumped to $156 million, and staff grew from 20 to 750 people in just two years, making Lotus the world’s largest software publisher.

This success was explained by a perfect alignment of the stars. The product answered exactly the needs of companies discovering microcomputing. Its compatibility with VisiCalc files facilitated migration. The IBM PC installed base was experiencing explosive growth. Product quality and commercial execution left no room for competitors.

But this lightning success created its own difficulties: the company’s growth became hard to manage. Original teams moved to other projects. Development of new products proved complex. Symphony, an attempt to create an integrated office suite, disappointed. Jazz, the integrated suite for the Macintosh, failed against Microsoft’s Excel.

Diversification attempts multiplied with Metro, Express, and Manuscript, without ever recapturing the magic of 1-2-3. Only Notes, developed by Ray Ozzie, met with real success, and it was largely for Notes that IBM acquired Lotus in 1995.

Technological evolution eventually caught up with the leader. The transition to Windows proved late and laborious. The assembly code, initially an asset, ultimately became a burden for evolving the product. Microsoft Excel, designed from the outset for graphical interfaces, gradually gained market share and established itself as the new standard.

Lotus 1-2-3 demonstrated the gigantic commercial potential of the professional software market. It established the standards for modern user interfaces with its pull-down menus and keyboard shortcuts. Macros enabled programming by end users. Support for the DIF (Data Interchange Format) facilitated data exchange between applications.

Lotus 1-2-3 transformed the perception of personal computers in business, from experimental tool to indispensable instrument for individual productivity. This transformation accelerated the computerization of organizations and founded a true software industry. Kapor and Sachs’s spreadsheet wasn’t just a product: it was a catalyst that precipitated the business world’s entry into the digital era.

Microsoft Word

November 1983. Readers of PC World discovered with surprise a diskette tucked inside their magazine. On that diskette: Multi-Tool Word, an unknown word processor from Microsoft. This bold distribution marked the commercial debut of software that would radically transform our relationship with digital writing.

The story begins ten years earlier, however, in the refined laboratories of Xerox PARC. Charles Simonyi, a young Hungarian computer scientist fresh from Stanford, joined this innovation hub in 1972. He worked on Bravo, a project that challenged conventions: for the first time, a word processor displayed text exactly as it would appear on paper. Gone were cryptic formatting codes, gone were surprises at printing. The concept bore a name that would make its fortune: WYSIWYG, “What You See Is What You Get.”

Bravo remained confined to the laboratories. Xerox, the photocopying giant, failed to grasp the commercial potential of this research. Bill Gates, however, had his eye on it. In 1981, he recruited Simonyi with the specific mission of creating Microsoft’s office applications. The Hungarian drafted a three-and-a-half-page memorandum that defined the entire strategy, an integrated software suite for individuals and businesses. This document, now kept in a vault, outlined what would become Office.

Word was born in this context. Its first MS-DOS version broke with contemporary practices. Where WordStar, the market leader, made do with the keyboard, Word bet on the Microsoft mouse. Advertisements touted this rudimentary graphical interface that displayed bold, italic, or underlined text. Novel for MS-DOS, but limited since changing fonts was impossible. Sales struggled and WordStar kept its faithful users, accustomed to its keyboard shortcuts.

Everything shifted with the Macintosh’s arrival in 1984. Apple imposed its graphical interface, and Microsoft seized the opportunity. Simonyi’s team, no more than twenty developers, adapted Word for the Mac. This time, Bravo’s concepts found their medium of expression: multiple fonts, sophisticated layout, complete WYSIWYG. The software finally revealed its potential.

This development philosophy illustrated Microsoft’s approach in the 1980s. Spot promising innovations, adapt them to the mass market, iterate relentlessly. Simonyi cultivated this deliberately small organization to maintain its coherence and agility, reminiscent of his years at PARC.

With Windows, Microsoft transposed the Mac version’s achievements to its platform. Windows 3.0, launched in 1990, propelled Word to the top. The commercial strategy, combining bundling deals around Windows with slashed prices, proved successful. WordPerfect and WordStar, though well established, lost their footing. Word was now indispensable.

Microsoft didn’t merely dominate the word processing market. The company orchestrated integration among its applications: Word, Excel, PowerPoint now formed Office. This coherence met the expectations of businesses seeking compatible tools. The .doc format naturally established itself as the reference for exchanging documents.

The 1990s saw Word enriched endlessly: spell checker, thesaurus, mail merge, tracked changes. This accumulation of features proved divisive. Casual users got lost in endless menus. Professionals appreciated this functional richness. Simonyi questioned this drift of Word toward ever-increasing complexity.

Beyond the tool itself, Word transformed our writing practices. Sophisticated formatting, once the preserve of typographers and publishers, became democratized. Anyone could create professional-looking documents. This accessibility had its downside: built-in templates standardized presentation. Word influenced how we structure thought, how we organize ideas.

This criticism has accompanied Word since its beginnings. Does the software shape our creativity? Does it standardize our expression? These questions have traversed forty years of technological evolution. From MS-DOS to the cloud, from diskette to smartphone, Word has accompanied our digital transformations.

Office 365 propelled Word into the collaborative era. Documents live in the cloud, are shared in real time, follow the user across all devices. This omnipresence crowns four decades of continuous evolution.

Charles Simonyi left Microsoft in 2002. His creation survives him, transformed but faithful to its founding principles. Word, the reference word processor, is used daily by hundreds of millions of people. Its history encapsulates that of consumer computing, from laboratory innovation to universal tool, a technology that shapes our daily actions.


MSX

Personal computing in the 1980s resembled a technological wild west. Each manufacturer developed its own machines, incompatible with one another, turning the purchase of a family computer into a veritable obstacle course. Software written for a Commodore wouldn’t run on an Amstrad, and vice versa. This technological cacophony frustrated Kazuhiko Nishi, founder of ASCII Corporation in Japan (no relation to American Standard Code for Information Interchange) and vice president of emerging technologies at Microsoft.

The man had carefully observed the success of the VHS standard in the video industry. Where Betamax and VHS clashed, victory had gone to the format that managed to unite the greatest number of manufacturers. Drawing from this lesson, Nishi conceived the MSX in 1983. His idea boiled down to a few words: create a technical standard that different manufacturers could adopt, thus guaranteeing total compatibility between all machines bearing the MSX logo.

The chosen name revealed the project’s ambition. “Machines with Software eXchangeability”, according to its creator, expressed this desire to enable software exchange between computers of different brands. Other interpretations circulated: “MicroSoft eXtended BASIC” or “Matsushita Sony X-machine”, testifying to the technological giants’ commitment behind the initiative.

The MSX’s technical architecture relied on already proven components. A Zilog Z80A processor running at 3.58 MHz formed the heart of the system, supported by a TMS9918 video chip capable of displaying 16 colors at 256x192 resolution, and an AY-3-8910 sound generator. RAM ranged from 8 to 128 KB, typically 32 or 64 KB depending on the model. This configuration allowed the use of three different media: cassettes, floppy disks, and cartridges.

Microsoft developed MSX-DOS, an operating system inspired by the MS-DOS of IBM PCs, while every machine shipped with MSX BASIC, an extended Microsoft BASIC that simplified programming and guaranteed compatibility between machines. In a market where each manufacturer typically cobbled together its own solutions, this hardware and software standardization represented a genuine innovation.

Japanese electronics giants took the bait. Sony, Toshiba, Panasonic, Canon, Yamaha: all adopted the standard. Sony kicked things off with its HB-10 in 1983, initially dressed in red and white to appeal to the Japanese market, then in black for export. Over twenty manufacturers produced their own MSX computers, some like Sony and Philips offering up to a dozen models in their lineup.

In Japan, success was immediate. Consumers appreciated this compatibility guarantee that spared them unpleasant surprises. Video game publishers sensed the opportunity: Konami developed its Metal Gear series there, while Castlevania and Gradius found a privileged platform for expression on MSX. It was on this platform that the famous “Konami Code” was born, that legendary button combination in the video game universe.

The phenomenon spread far beyond Japanese borders. South Korea, Brazil, and Chile massively adopted the MSX. In Europe, the Netherlands became its stronghold, with Philips making it their flagship product. The Soviet Union succumbed: the Ministry of Education acquired hundreds of MSX1 and MSX2 computers to equip school computer labs, training an entire generation of programmers behind the Iron Curtain.

Yet America resisted. Only the SpectraVideo and the Yamaha CX-5M (marketed as a musical instrument) managed to cross the Atlantic. In 1984, the Commodore 64 reigned supreme over American territory, and the market saw little interest in adopting what it considered a technically inferior product. In the United Kingdom, competition from the ZX Spectrum and the prohibitive price of cartridge games compared to cheap cassettes severely limited its adoption.

Faced with this resistance, the standard evolved. The MSX2, launched in 1986, brought substantial improvements: more memory, a more powerful graphics chip, and new display modes. It maintained compatibility with its predecessor and integrated a floppy disk drive, an economical alternative to cartridges. In 1988, as European distribution ceased, the MSX2+ appeared in Japan and Korea, followed by the MSX TurboR, the ultimate evolution of the standard.

The TurboR marked a breakthrough by adopting a 16-bit processor while preserving Z80 compatibility. Alas, its high price and compatibility problems with certain earlier software broke its momentum. It was only distributed in Japan, with an exclusively Japanese BIOS that further limited its reach.

The emergence of the IBM PC and Microsoft’s focus on Windows ultimately sounded the death knell for the MSX. ASCII Corporation attempted to keep the standard alive, developing the visionary idea of networking MSX computers to create a “primitive internet”. But the refusal to adopt the CD-ROM format definitively buried hopes for the MSX3.

The MSX legacy endures in the retro computing enthusiast community. Its standardization philosophy and rich software library continue to inspire developers and hobbyists. Its influence can be found in chiptune music and retrogaming culture. Projects like the 1chipMSX, which concentrates all MSX logic into a single FPGA chip, perpetuate its spirit of innovation.

The MSX adventure illustrates a bold attempt at standardization that anticipated issues still relevant today. Though the standard didn’t achieve its goal of world domination, it demonstrated the viability of a collaborative approach to computer development.

Novell NetWare

In Utah in 1979, four men embarked on a computing adventure without imagining they would revolutionize how machines communicate with each other. Dennis Fairclough, George Canova, Darin Field, and Jack Davis founded Novell Data Systems Inc. in Orem, far from Silicon Valley’s laboratories. Their financial backer, Safeguard Scientifics, bet on this team to develop CP/M-based systems. The early steps were arduous: competition was fierce and sales struggled to take off.

The idea that would change everything emerged almost out of necessity. Why not connect these microcomputers proliferating in offices, rather than persist in a saturated market? This insight didn’t come from nowhere. Several team members, including newcomers Drew Major, Dale Neibaur, and Kyle Powell, had worked on government contracts that familiarized them with ARPANET. They understood the power of networks before the general public discovered the Internet.

The road remained fraught with obstacles. Safeguard’s board of directors nearly shut down the operation, but the company secured new financing at the last minute. Raymond Noorda arrived as CEO in 1983. This experienced businessman renamed the company Novell Inc. and bet everything on NetWare, their network operating system.

NetWare’s approach stood out from what existed at the time. One machine was designated as the dedicated network server, orchestrating access to the hard drives and printers shared by the other workstations. The star topology relied on twisted-pair cabling, a more reliable technology than competing solutions. NetWare, initially called ShareNet, came in two versions: one for Intel 8086 processors, the other for Motorola.

The 1980s saw personal computing massively installed in businesses. NetWare rode this wave and became the reference for departmental local area networks. In 1986, Novell expanded its territory with GroupWise, a messaging and collaborative tools solution long before Microsoft Office 365.

The early 1990s established NetWare’s dominance: nearly 70% of the network operating system market belonged to it. Version 4 radically transformed the system’s capabilities. Novell Directory Services revolutionized administration by consolidating multiple servers under unified management. The NetWare Administrator interface simplified network administrators’ daily work. NetWare’s file and print services had no equivalent at the time.

Everything shifted with the Internet explosion in the late 1990s. Local area networks interconnected, extending beyond department boundaries to embrace entire buildings, then continents via WAN connections. Novell responded with intraNetWare, which included an improved version of NetWare 4.11. This version facilitated installation, boosted performance, and introduced the first 32-bit client for Windows.

IntraNetWare adapted to new uses with an IPX/IP gateway for connecting workstations to IP networks, an integrated web server, native support for DHCP and DNS. NetWare still excelled in its area of expertise, but companies criticized its weakness in hosting business applications.

Microsoft chose this moment to strike hard. The Redmond firm ended its partnership with 3Com and developed Windows NT with integrated network features. This vertical integration strategy directly threatened Novell’s model. Facing this offensive, the Utah company made a strategic mistake: it acquired WordPerfect and Quattro Pro to confront Microsoft on its own turf. This diversification scattered efforts and resources across markets that Novell poorly understood.

Microsoft played a war of attrition. Its considerable financial resources gave it the means to persevere until Windows NT prevailed over NetWare. Novell, engaged on too many fronts, had to divest its acquisitions and constantly revise its strategy without regaining its former glory.

The turn of the 2000s saw Novell embrace open-source philosophy. NetWare was reborn as Open Enterprise Server (OES), available in two versions: one retained the traditional NetWare kernel to ensure compatibility, the other relied on SUSE Linux Enterprise Server. NetWare 6.5 marked the end of an era: it corresponded to OES-NetWare Kernel with Support Pack 2. From then on, Novell recommended running NetWare in a Xen virtual machine hosted on SUSE Linux Enterprise Server.

This thirty-year trajectory tells the story of enterprise networking evolution. NetWare was born from an insight: connecting isolated machines to multiply their utility. It accompanied organizations’ computerization, transforming standalone workstations into collaborative networks. Its fall against Microsoft illustrates the computing industry’s brutality, where dominant position guarantees nothing against a determined and well-funded competitor. The final conversion to open source testifies to the sector’s transformations, where openness and interoperability have also become imperatives for survival.

Oric-1

Cambridge, October 1979. Dr Paul Johnson and Barry Muncaster had just founded Tangerine Computer Systems in the effervescent atmosphere that characterized the microcomputer industry. Their company’s name reflected a somewhat whimsical trend: after Apple, the age of fruit in computing had arrived. Their first offspring, the Microtan 65, paved the way for what would soon become a far more ambitious adventure.

Paul Kaufman joined the team during the summer of 1981 and took charge of the Tansoft Gazette. The company grew and restructured. In early 1982, after splitting off its Tandata Prestel division, Tangerine relocated to Cambridge Science Park and launched Tansoft, its software arm. It was at this point that Oric Products International was born in April 1982, with Tangerine serving as the research laboratory.

Initially, however, the idea seemed quite different: to design a desktop computer for executives, capable of interfacing with the Prestel network. Kaufman drafted specifications for a hypothetical Microtan 2, enhanced with sound and graphics capabilities. These reflections gave birth in late 1981 to the Tangerine Tiger project, an ambitious machine equipped with three processors: a Z80 for CP/M, a 6809 to manage peripherals, and a dedicated graphics chip. But this Tiger would never roar: H.H. Electronics acquired the project, which would sink into obscurity.

On 27 January 1983, it was in the sumptuous setting of Coworth Park Manor, the former residence of Lord and Lady Derby, that Oric Products International officially unveiled the Oric-1. Peter Harding, the 34-year-old commercial director, set the tone with disconcerting confidence: six major contracts signed with retail chains for over 200,000 units. His objective was to beat Clive Sinclair by offering much more for much less money.

The pricing strategy seemed bold. One had to shell out £129 for the 16k version, £169.95 for the 48k model. Behind this machine, a team of talented developers worked tirelessly. Andy Brown and Chris Shaw, two seasoned programmers, tackled the main ROM. Peter Halford developed the cassette routines and Oric Mon, while Kaufman coded the sound portions on a Microtan in Forth before Andy Brown transposed them into machine language.

The first reactions from the specialist press remained cautious. The Centronics printer port caused a sensation: “unusual, even unique for a machine at this price,” noted one critic. But ROM bugs and cassette loading failures tarnished the image. Oric’s reaction was swift: Cosma Sales, the cassette duplicator, was shown the door, accused of being responsible for the tens of thousands of defective cassettes in circulation.

In the United Kingdom, success was slow to come. France proved more welcoming, with A.S.N. signing an exclusive distribution contract on 29 June 1983 based on 4,000 units per month. The French market developed a quality software ecosystem, making the Oric the French equivalent of the British Spectrum. An unexpected success in a country with a reputation for being difficult to conquer.

This upturn was short-lived. February 1985: Edenspring placed Oric into receivership. The figures were chilling: £5.5 million in debt against only £3 million in assets. The fall seemed inevitable when, in June 1985, S.P.I.D. emerged from France and acquired the company for “several hundred thousand pounds.” The Atmos, successor to the Oric-1, came back to life in a Norman factory.

The new French owners attempted to relaunch the machine with the Telestrat, specially designed for the Minitel network. The idea seemed attractive on paper, but the high price and market specificity limited ambitions. December 1987: Oric International fell back into receivership, this time with substantial tax debts on the books. A year later, in December 1988, production lines shut down permanently.

The Oric-1 was based on the popular 6502 processor, a choice dictated as much by its reputation as by its moderate cost. It offered text and high-resolution display modes alongside remarkable sound capabilities. Its serial graphics attributes proved well-suited to games, a significant asset. The machine featured a modified Microsoft BASIC and a UHF modulator for television connection.

The Oric story reveals the pitfalls of the 1980s market. Despite honorable technical specifications and attractive prices, the company failed to control its growth and consistently supply the market with quality software. Broken promises accumulated, notably concerning the Microdisc floppy drive, announced in 1983 but only available in 1984, when it was almost too late.

Yet the Oric survived its own demise thanks to a community of passionate users. Publications like Oric User Monthly continued to appear long after the company’s end. In France as in the United Kingdom, user groups still maintain the memory of this machine which, despite its flaws, introduced an entire generation to personal computing.

The Oric adventure embodies that era when technical boldness rubbed shoulders with commercial improvisation. While the ultimate failure was perhaps inevitable, it testifies to the creative vitality of the British computer industry in the 1980s, capable of offering credible alternatives to industry giants in a constantly evolving market.

Amstrad CPC 464

In 1984, Alan Sugar was doodling on a paper napkin during a transatlantic flight. This sketch would give birth to one of the most significant machines in British home computing: the Amstrad CPC 464. The idea seemed simple on paper, but its technical implementation required far more than a few pen strokes. Roland Perry, Amstrad’s technical director, inherited this sketch and transformed it into reality.

The challenge was substantial. Sugar wanted a computer that would work straight out of the box, without the usual headache of multiple connections that discouraged so many families. The CPC 464 integrated everything into a single unit: keyboard, cassette drive, and direct connection to its dedicated monitor. A single power cable sufficed. This simplicity contrasted with the practices of the era, when setting up a computer system often proved an ordeal.

Perry had initially set his sights on the 6502 processor, but British developers were more familiar with the Z80. This Zilog processor, running at 4 MHz, already powered numerous machines across the Channel. The decision proved judicious: the 64 KB of RAM came with remarkable graphics capabilities. Twenty-seven available colors, several display modes—enough to appeal to gaming enthusiasts and aspiring programmers alike.

Amstrad’s commercial strategy deserves attention. Sugar offered two configurations: £200 for the monochrome screen version, £300 for color. These prices cleverly positioned between the Sinclair Spectrum—cheaper but limited—and the BBC Micro—more powerful but out of reach for many budgets. Most importantly, Sugar anticipated the pitfall that had killed many promising machines: the lack of software at launch.

Forty prototypes went to developers. Training sessions were organized, and a substantial software library awaited the first buyers. This meticulous preparation paid off. Users discovered a reliable, easy-to-install machine that delivered on its promises. While patience remained necessary with the interminable access times of magnetic storage, the integrated cassette drive simplified program loading.

This limitation prompted Amstrad to react quickly. 1985 saw the birth of the CPC 664, equipped with a 3-inch floppy drive. But this intermediate model gave way to the CPC 6128, featuring 128 KB of memory and more professional ambitions. Evolution was underway, but the original 464 retained its appeal for the general public.

Games made the CPC’s reputation. The “Roland” series paid tribute to Perry, while the graphics and sound capabilities enabled quality adaptations. The integrated BASIC opened the doors of programming to beginners, supported by careful documentation. Some dared venture into serious applications: Protext demonstrated that a proper word processor could run on the machine.

The international market was not forgotten. In France, Amstrad paid attention to localization: adapted keyboards, translated manuals, organized distribution. This attention to national specificities bore fruit in several European countries. The CPC established itself durably in households, opening the doors of the computer market to Amstrad. The company followed up with the PCW 8256 in 1985, then launched PC compatibles as early as 1986.

As the years passed, Amstrad attempted to reignite the flame. 1990 saw the arrival of the Plus models, 464 Plus and 6128 Plus, with improved graphics. A derived console, the GX4000, failed against the Japanese giants. Production faltered in the early 1990s, but the story did not end there.

An active community preserves the CPC’s legacy. Emulators resurrect the software of yesteryear, while amateur developers create new programs. This loyalty testifies to the machine’s enduring qualities: coherent design, comprehensive documentation, unshakeable robustness. Amstrad’s pragmatism hit the mark: ease of use rather than a performance race.

Apple Macintosh

The story begins in 1979 in the corridors of Apple, where Jef Raskin, a veteran of the Apple II team, was mulling over an idea that seemed crazy at the time. He wanted to create a computer as simple as a toaster, sold for a thousand dollars and mass-produced. Apple’s management, absorbed by the Apple III’s troubles and the costly development of the Lisa workstation, greeted this proposal with a polite shrug.

Everything changed when Burrell Smith joined the team. This Apple II repairman from the maintenance department was hiding technical genius. After a visit to Silicon Valley that had electrified him, he had started tinkering with microprocessors. His makeshift prototype—a Motorola 6809 jury-rigged onto an Apple II with a TV screen—impressed Raskin so much that he immediately brought him into the project. Steve Jobs, then vice president but sidelined from the Lisa project for lack of experience, sensed the opportunity and took charge of the Macintosh.

Jobs set up his team far from headquarters, in offices nicknamed “Texaco Towers” because they were located behind a gas station. The place looked more like a student apartment share than corporate offices. This physical proximity created a particular dynamic: Andy Hertzfeld, in charge of the operating system, worked side by side with Smith on the hardware. This close collaboration allowed them to exploit every available bit of memory.

The team had set themselves an ambitious goal: 300,000 units per year. Smith, obsessed with cost reduction, set out to revolutionize the architecture. Where the competition multiplied connectors and expansion slots, the Macintosh would have only two circuit boards. No buffers, no superfluous slots. Vital functions would be burned into read-only memory for improved reliability.

The approach was radical: specialized architecture versus general-purpose flexibility. The team methodically dissected competing products and found that their multiple connectors slowed performance while inflating prices. The Macintosh would rely on a fast serial port for future expansions.

The user interface drew inspiration from the Lisa, but in a streamlined version. Graphical windows allowed multiple programs to be displayed, icons replaced obscure commands, the mouse became king. However, everything had to run with less memory than the Lisa, and faster. The team rewrote the software from the ground up, abandoned the Lisa’s multitasking in favor of an illusion created by small system programs like the calculator.

Bill Atkinson bequeathed them QuickDraw, his graphics management software created for the Lisa. This toolkit, stored in ROM, became the soul of the Macintosh. Developers could enhance the system by loading additional definitions from disk, thus preserving the limited memory.

The choice of the Motorola 68000 divided the team. This expensive processor far exceeded immediate needs, but Jobs bet on falling prices. The team optimized everything: only one processor mode used out of four available, three interrupt levels out of seven, file system with a single index.

Manufacturing required a revolutionary factory in Fremont, California. Inspired by Japanese methods, it applied just-in-time production and zero defect tolerance. Suppliers delivered small batches frequently, errors were analyzed immediately. The factory was designed in three evolutionary phases, anticipating increased production rates.

In January 1984, the Macintosh arrived at $2,495. The price was far from the thousand dollars Raskin had dreamed of, but the computer made an impression. The Super Bowl commercial, a pastiche of Orwell’s 1984, presented Apple as a liberator against IBM uniformity. The user-friendly graphical interface and compact design created a new standard.

Criticisms were plentiful: high price, network limitations, insufficient memory. But the Macintosh had planted a seed. It demonstrated that a computer could be something other than a calculating machine: a design object, an extension of human thought.

The original team had established principles that would survive all trends: design first, deep hardware-software integration, obsessive simplicity. Jobs left Apple in 1985 after a conflict with the board of directors. The company sank into incoherence, multiplying models without a common thread.

Jobs’s return in 1997 put these fundamentals back at the center. Streamlined lineup, refined design, total integration: these principles, applied to the iPod and then to the iPhone, allowed Apple to conquer the world. That beige machine from 1984, born from the meeting between a brilliant repairman and an obsessive visionary, had changed our relationship with technology.

HP-UX

In the early 1980s, Hewlett-Packard decided to venture into the UNIX world. The market was buzzing with activity, with manufacturers vying to offer their own variant of the system. In 1984, HP released the first version of HP-UX, a reworked UNIX System V that ran exclusively on HP 9000 servers in the 200, 300, and 400 series. These machines featured the Motorola 68000 processor, already proven in other contexts.

HP did more than simply adapt UNIX. The engineers immediately introduced access control lists, a more refined approach than the traditional UNIX permissions system. A logical volume manager was also part of the basic equipment, simplifying life for administrators facing storage challenges. These additions were no gimmicks: they addressed the concrete needs of professional users.

The evolution followed the pace of HP hardware. Version 5.0 from 1985 extended to the Integral PC, then to the 200/300 and 500 series. HP began unifying its approach, seeking consistency across its different product lines. This strategy paid off when the PA-RISC architecture arrived toward the end of the decade.

PA-RISC, the family of RISC processors developed internally by HP, would guide the system’s technical evolution for years to come. In 1988, HP juggled two development lines: versions 3.x for the 600/800 series and 6.x for the 300 series. This situation reflected the delicate transition between the old architectures and the new PA-RISC generation.

Unification became concrete in 1990 with HP-UX 7.x, which covered the 300/400 and 600/700/800 series. Users gained in simplicity, administrators in consistency. A year later, version 8.x brought shared libraries, significantly improving performance and memory usage. Version 9.x from 1992 enriched the arsenal with the System Administration Manager and Logical Volume Manager, two tools that modernized system administration.

HP-UX 10.0 came out in 1995. This version unified the 700 and 800 series while adopting the SVR4 directory structure, aligning with the UNIX standards of the time. The following year, HP-UX 10.20 crossed into 64-bit territory with PA-RISC 2.0 processors. A technical feat that opened new horizons.

HP-UX 11.00 achieved a remarkable accomplishment in 1997: supporting 64-bit addressing without abandoning 32-bit applications. This backward compatibility spared companies the abrupt disruption of a migration. Three years later, HP-UX 11i v1 (11.11) introduced operating environments, customizing the system according to the specific needs of each type of usage.

The architecture underwent a new upheaval in 2001 with HP-UX 11i v1.5 (11.20), which integrated Intel Itanium support. This strategic decision accompanied the gradual transition from PA-RISC to Intel processors. HP-UX 11i v2 (11.23) from 2003 strengthened virtualization, while version 11i v3 (11.31) from 2007 added native multipathing and further refined virtual capabilities.

Over successive versions, HP-UX earned a solid reputation in IT services, software publishing, and finance. The VxFS file system and LVM manager became essential references. System administrators appreciated their flexibility and efficiency in managing storage, especially in complex environments.

Security remained a consistent strength: HP progressively integrated security and intrusion-detection features directly into the kernel, developed secure partitioning, and implemented role-based access control. These elements elevated HP-UX among the most secure UNIX systems on the market, a decisive advantage for sensitive applications.

Virtualization occupied an increasingly important place in the system’s evolution. nPartitions arrived first, followed by vPars (Virtual Partitions), and finally Integrity Virtual Machines. These technologies allowed companies to optimize their hardware resources and consolidate their servers. The transition from PA-RISC to Itanium required considerable technical effort, but HP skillfully preserved application compatibility.

Clustering with HP Serviceguard and continuous improvements to administration tools kept HP-UX competitive against the changing demands of organizations. But Linux gained ground and cloud computing changed the rules of the game. HP-UX’s market position found itself progressively called into question.

In 2017, HP announced its roadmap through 2025, reassuring its loyal user base. This announcement addressed concerns from companies relying on HP-UX for their critical operations. However, the aging of compatible hardware posed a problem: HP 9000 and Integrity servers were difficult to maintain, spare parts were becoming scarce.

In response, new approaches emerged. Hardware emulation, notably with Charon-PAR from Stromasys, offered an interesting alternative. Companies could continue running their HP-UX applications on modern x86 hardware or in the cloud, thereby escaping the constraints of aging proprietary hardware.

The history of HP-UX demonstrates constant adaptation to enterprise needs. Its innovations in logical volume management, security, and virtualization inspired the evolution of professional operating systems. Its reputation for reliability and robustness made it the tool of choice for numerous critical applications, from finance to healthcare and industry.

NFS

In 1985, Sun Microsystems released a technology that would transform how computers share files across networks: the Network File System, better known by the acronym NFS. At the time, the idea of connecting distant machines to access shared data posed a significant technical challenge. Yet this innovation would become one of the cornerstones of distributed systems.

The story begins with SunOS 2.0, which integrated the first public version of the protocol, NFSv2. But behind this release lay Sun’s bold strategic vision. Where other companies would have kept their technology proprietary, Sun bet on openness. The NFS protocol only defined the message formats exchanged between clients and servers, leaving everyone free to implement their own solution. This decision spawned a thriving ecosystem: NetApp, EMC, IBM and many others joined the race, creating their own NFS servers while maintaining interoperability.

This strategy contrasted sharply with previous approaches. SVR3’s Remote File System, for example, established a direct correspondence between client and server system calls. The approach worked, certainly, but revealed its limitations as soon as a server crashed: clients often found themselves stuck, forced to reboot to resume their activities.

Sun’s engineers took a radically different direction. They designed NFS as a "stateless" system, where the server keeps no record of ongoing operations. Each request arrives with all the information needed for its processing. This architecture, counterintuitive at first glance, conceals formidable elegance: if the server stops abruptly, clients need only resend their requests once it has restarted. No complex state reconstruction, no delicate synchronization.

At the heart of this mechanism lies the "file handle," a true identity card for each file on the server. In NFSv2, this descriptor gathers a volume identifier, an inode number, and a generation number. This last element prevents confusion when the system reuses inode numbers: each file thus maintains its unique identity, even after deletion and recreation.
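
To make the idea concrete, here is a minimal sketch in C of the kind of structure an NFSv2 server might pack into its 32-byte opaque handle. The field names and layout are illustrative assumptions rather than the actual wire format, which each server is free to define, since the client treats the handle as opaque and simply returns it with every request.

    #include <stdint.h>

    /* Illustrative layout of an NFSv2 file handle (32 opaque bytes).
       The client never interprets these fields; it returns the handle
       verbatim with each request, which is part of what lets the
       server remain stateless. */
    struct nfs_file_handle {
        uint32_t volume_id;     /* which exported file system on the server */
        uint32_t inode_number;  /* which file within that file system */
        uint32_t generation;    /* bumped when an inode is reused, so a
                                   handle to a deleted file reads as stale */
        uint8_t  reserved[20];  /* padding up to the fixed 32-byte size */
    };

Because every request carries such a handle along with everything else needed to process it, the server has nothing to reconstruct after a crash.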

Of course, querying the server at every file access would have generated unsustainable network traffic. NFS therefore integrates client-side caching mechanisms, temporarily storing accessed data and metadata. This optimization immediately raises a thorny question: how to ensure that all clients see the same information? The answer lies in the "close-to-open" model. Modifications travel up to the server when files are closed, while clients verify data state at each opening. This approach favors performance while maintaining reasonable consistency.

The protocol would undergo several metamorphoses over the years. In 1995, NFSv3 arrived with performance improvements, notably better handling of asynchronous writes. But it was NFSv4, standardized in 2000, that truly revolutionized the architecture. Paradoxically, this version abandoned the stateless principle in favor of a stateful protocol, introduced cache delegations and considerably improved Internet compatibility.

Security was the Achilles’ heel of early versions. The authentication system relied on blind trust between clients and server, simply using UNIX numerical identifiers. This naivety was gradually corrected through Kerberos integration, then by the more sophisticated mechanisms of NFSv4.

NFS development also necessitated the creation of the Virtual File System interface, an abstraction layer allowing different file systems to coexist within the UNIX kernel. This innovation, still present in modern systems, greatly simplified the integration of new file systems.

The 1990s saw fierce competition emerge. Microsoft pushed its Server Message Block, later renamed Common Internet File System, which naturally established itself in the Windows universe. More ambitious systems like Andrew File System or, more recently, Lustre for supercomputers, offered advanced features for specific niches.

Yet NFS withstood these assaults. Its relative simplicity, proven robustness and near-universal availability make it a sensible choice for many organizations. Storage manufacturers continue to innovate around the protocol. NetApp, notably, developed hardware optimizations to accelerate synchronous writes, historically NFS’s weak point.

NFS’s technical legacy still informs the design of distributed systems. The stateless architectural principle, initially controversial, now inspires numerous network protocols. The lessons learned about cache consistency continue to fuel academic and industrial research.

As cloud storage reshapes computing’s landscape, the challenges NFS addressed remain relevant. While solutions evolve, distribution, consistency and fault tolerance are issues that transcend time. The NFS adventure demonstrates that a pragmatic approach favoring robustness over sophistication can endure. In a field where technological changes succeed one another at a breakneck pace, this longevity commands respect.

POP3

In 1984, electronic communications faced a problem that is difficult for us to imagine today. Messages arrived successfully on servers through the SMTP protocol, but how could users retrieve them without a permanent Internet connection? This question concerned engineers of the era, who sought a practical solution for users still accustomed to intermittent telephone connections.

The POP (Post Office Protocol) emerged to address this need. The first version appeared in 1984. The idea was simple: create an electronic “mailbox” system where messages wait until the user comes to collect them. POP2 followed in 1985 with some improvements, and finally POP3 arrived in 1988, introducing extension mechanisms and more robust authentication systems.

The operation of POP3 demonstrates a simplicity that took precedence over sophistication. The client connects to the server via TCP port 110, authenticates, and uses a handful of basic commands. STAT to check the number of messages, LIST to view their size, RETR to retrieve one, QUIT to leave. Nothing more, nothing less. This economy of means has largely contributed to its success among developers.
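
To give an idea of how little machinery is involved, here is a rough sketch in C of a complete POP3 session over a plain TCP socket. The server name and the credentials (mail.example.com, alice, secret) are placeholders, error handling is minimal, and a modern server would expect TLS rather than the historical clear-text port 110.

    /* Minimal POP3 session over a raw TCP socket -- a sketch only.
       Host, user name and password are placeholders; multi-line
       replies (LIST, RETR) may need more than one recv() in practice. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    static void chat(int fd, const char *cmd)
    {
        char reply[1024];
        ssize_t n;

        if (cmd) {                               /* send one command... */
            send(fd, cmd, strlen(cmd), 0);
            printf("C: %s", cmd);
        }
        n = recv(fd, reply, sizeof(reply) - 1, 0);   /* ...and read the answer */
        if (n > 0) {
            reply[n] = '\0';
            printf("S: %s", reply);              /* replies start with +OK or -ERR */
        }
    }

    int main(void)
    {
        struct addrinfo hints = {0}, *res;
        int fd;

        hints.ai_socktype = SOCK_STREAM;         /* POP3 runs over TCP */
        if (getaddrinfo("mail.example.com", "110", &hints, &res) != 0)
            return 1;
        fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
            return 1;

        chat(fd, NULL);                          /* server greeting */
        chat(fd, "USER alice\r\n");              /* placeholder account */
        chat(fd, "PASS secret\r\n");
        chat(fd, "STAT\r\n");                    /* message count and total size */
        chat(fd, "LIST\r\n");                    /* per-message sizes */
        chat(fd, "RETR 1\r\n");                  /* download the first message */
        chat(fd, "QUIT\r\n");                    /* end of session */

        freeaddrinfo(res);
        close(fd);
        return 0;
    }

Each command receives a one-line status reply beginning with +OK or -ERR; commands such as LIST and RETR then continue with a multi-line body terminated by a lone dot, which a real client keeps reading until it sees that terminator.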

The integration of POP3 into Microsoft Outlook and other mainstream email clients sealed its fate. The protocol spread like wildfire, becoming the reference solution for individuals and small organizations. Its logic perfectly matched the habits of the time: people checked their mail on their personal computer, just as they emptied their physical mailbox.

But this approach conceals a trap that would prove problematic. By default, POP3 deletes messages from the server after downloading them. Once transferred to the living room computer, it becomes impossible to access them from the office or, later, from a mobile phone. This limitation, trivial in the 1990s, became a genuine handicap with the explosion of mobile computing.

Faced with these constraints, the IMAP protocol established itself as the modern alternative. Where POP3 empties the server’s mailbox, IMAP keeps it intact. Messages remain accessible from any device, organized in folders, available offline if necessary. IMAP also offers server-side search functions and partial downloading of large messages.

Yet POP3 refuses to disappear. Its technical lightness remains an undeniable asset: it consumes few server resources and is easy to use. For certain specific uses, particularly when the user works primarily on a single device, POP3 retains its advantages. Messages stored locally can be accessed without an Internet connection and do not clutter the servers.

This persistence illustrates an interesting phenomenon in technological evolution. Two philosophies clash here: POP3 makes the personal computer the center of the digital universe, while IMAP favors a centralized vision where the server is the sole repository. Each approach addresses different needs and directly influences the user experience.

Recent history has proven IMAP right. The increase in storage capacity, the democratization of permanent broadband connections, and especially the simultaneous use of multiple devices have made synchronization indispensable. Who would accept today being able to check their emails only from their laptop?

The protocol has not remained static. Security improvements have emerged, such as the APOP authentication mechanism that strengthens exchange protection. These developments demonstrate an ability to adapt to contemporary requirements.

Standardization by the IETF played a decisive role in POP3’s longevity. This normalization guaranteed its interoperability and facilitated its integration into numerous software applications. Paradoxically, this ubiquity may explain its survival: replacing such a widely adopted standard requires considerable effort.

The story of POP3 reveals that a technical solution does not need to be the most advanced to endure; it must primarily address a concrete need effectively. POP3 brilliantly solved the problem of email for the intermittent connections of the 1980s. The fact that this problem has since largely disappeared does not erase this initial success.

This persistence is also explained by very human factors. User familiarity with a tool, resistance to change, preference for proven solutions constitute obstacles to the adoption of new technologies. System administrators know this well: migrating to a new protocol requires time, money, and training.

The trajectory of POP3 is part of a broader dynamic in Internet evolution. Protocols are born, evolve, coexist, and sometimes disappear according to logics that transcend purely technical considerations. The history of email shows how different approaches can survive side by side, each retaining its niche of use.

This coexistence is not accidental but a characteristic of mature technological ecosystems. POP3 and IMAP respond to different philosophies of personal data management. One prioritizes autonomy and local control, the other relies on centralization and universal accessibility. Both visions remain legitimate depending on the context of use.

Understanding the history of POP3 means grasping how technologies become rooted in their time while maintaining a limited capacity for adaptation. It also means measuring the importance of initial design choices, which can determine the success or obsolescence of a standard over several decades.

IBM PCjr

In 1983, IBM dominated the professional market with its PC. Riding high on this success, the American giant cast its eye toward home computing, a territory dominated until then by machines like the Apple II or the Commodore 64. The idea emerged to adapt the PC for families by creating a scaled-down version dubbed “Peanut” internally. Rumors multiplied in the specialized press for months until the official announcement of the PCjr on November 1st, 1983.

Expectations ran high. IBM’s research budget at the time exceeded that of certain nations, and its name alone was enough to trigger public enthusiasm. The machine reused the Intel 8088 processor from the PC and maintained substantial software compatibility with its bigger sibling. This deliberate connection aimed at two objectives: reassuring buyers about the sustainability of their investment and protecting sales of the professional PC.

Two versions came to market. The base model, priced at $669, featured 64 KB of memory. The enhanced model climbed to $1,269 with its 128 KB and integrated floppy disk drive. These prices immediately positioned the machine at the high end, well above the direct competition.

The PCjr’s main innovation lay in its wireless keyboard, a world first for a personal computer. The infrared connection made the machine usable from up to 6 meters away, transforming the living room into an improvised office. But this innovation came with a questionable choice: flat keys inspired by calculators, devoid of inscriptions. The symbols and letters appeared on the chassis, between the keys, complicating typing and straining the eyes.

Performance-wise, IBM equipped its machine with remarkable graphics capabilities. Several display modes coexisted, from 320×200 pixels in 4 colors to 640×200 in two shades. The enhanced model supported up to 16 simultaneous colors. Sound was equally impressive with a sophisticated circuit producing three synthesis channels over 7 octaves, complemented by a white noise generator.

Sales began in January 1984, but deliveries remained limited until March. IBM deployed unprecedented marketing firepower. Charlie Chaplin, already the PC’s spokesperson, returned in television commercials. Over a hundred books appeared before the actual release, and four specialized magazines launched. Rarely had a machine benefited from such media hype.

Yet sales struggled to take off. Early reviews revealed glaring flaws. With its execution speed cut in half compared to the PC, the PCjr disappointed users accustomed to its elder’s performance. Memory, capped at 128 KB, severely limited upgrade possibilities. The internal architecture made adding a second floppy drive or hard disk impossible. Proprietary connectors forced buyers to purchase overpriced adapters to connect standard peripherals.

Facing this criticism, IBM attempted to right the ship in July 1984. A new, more conventional keyboard replaced the flat-key model. Prices dropped significantly. A major promotional campaign accompanied the 1984 holiday season, temporarily boosting sales to 17% of the home market. But the upturn was short-lived: by January 1985, that figure had plummeted to 4%.

The ax fell on March 19, 1985. IBM announced the end of production, just fifteen months after commercial launch. Between 100,000 and 350,000 unsold machines piled up in warehouses. The company slashed prices on remaining stock, even giving them away to schools purchasing PCs. The bill came to $45 million in losses.

Paradoxically, this commercial failure masked some lasting successes. Sierra On-Line developed King’s Quest specifically for the PCjr, revolutionizing the graphic adventure genre and launching a legendary franchise. Tandy Computer drew directly on IBM’s experience to create its Model 1000, correcting the major flaws in the process: professional keyboard, performance equivalent to the PC, expansion capabilities. This Tandy 1000 achieved considerable success and extended the PCjr’s technical legacy.

A community of enthusiasts keeps the memory of this unique machine alive. Expansions like the jrIDE card add a hard drive and 640 KB of memory to surviving units. The DOSBox emulator resurrects its software catalog. In Japan, IBM had attempted a variant called the JX, developed with Matsushita, but this competing project sank even deeper with only 8,000 units sold.

The PCjr story illustrates the pitfalls of a poorly executed derivative strategy. By deliberately constraining their home machine’s capabilities to protect professional PC sales, IBM’s engineers created a flawed product, too expensive and too limited against agile and inventive competition. Exclusive distribution through professional networks, ill-suited to the family market, sealed the fate of a venture that marked the end of IBM’s ambitions in home computing.

GNU Emacs

In 1974, in the hallways of MIT’s Artificial Intelligence Laboratory, Richard Stallman tackled a problem that frustrated all programmers of the era: the TECO text editor, while powerful, remained cumbersome to use. He got the idea to improve this tool by drawing inspiration from the E editor developed at Stanford. He added real-time display and redesigned the command system.

Initially, this improvement didn’t change much about TECO. Users launched the visual editing mode with Control-R, modified their text, then exited for other tasks. But one feature would transform everything: the ability to redefine commands to call programs written directly in TECO. This freedom unleashed users’ creativity as they developed their own command systems. TECMAC, MACROS, RMODE, TMACS, Russ-mode, and DOC flourished in the community. Usage evolved so that exiting the editing mode became unnecessary, and newcomers no longer learned to manipulate TECO directly.

However, TECO showed its limitations. It lacked decent programming structures: no named functions, no proper variables. The first editors built on this foundation quickly became unmanageable. In 1976, the TMACS system attempted to add these missing features. The experiment worked, but performance suffered. Stallman observed, analyzed, and decided to create EMACS.

His approach combined two challenges: enriching TECO with the libraries and self-documentation necessary for readable programming, while designing an entirely new set of editing commands. He studied the numerous TECO-based editors and sought to minimize the number of keystrokes for common operations. By the end of 1976, the first EMACS was operational.

Development continued on Digital Equipment Corporation’s PDP-10, under MIT’s ITS system. Most of the new code was written in TECO, with Stallman adding only the bare minimum to the language: optimizations for search loops, symbolic expression parsing, and a few new input-output interfaces. In 1977, interest grew beyond MIT’s walls. Mike McMahon from SRI International adapted EMACS to Digital’s Twenex system. A hundred sites were already using it.

The term “Emacs” then became generic, designating an entire family of extensible editors. One of the most notable versions emerged under UNIX, developed by James Gosling. This implementation brought two remarkable innovations. First, it exploited UNIX’s process management capabilities: filtering text through external programs, running different tools in separate windows. Second, it introduced MLisp (Mock Lisp), an extension language independent of the C used for implementation.

This separation between implementation language and extension language revolutionized usage. With MLisp, simple and elegant, users created extensions without compilation or linking. The development cycle accelerated, encouraging the small improvements that make all the difference in a user interface. The language offered an abstract view of text, strings, processes, and windows.

In 1984, Stallman launched GNU Emacs as part of his GNU project. This time, he chose true Lisp as the extension language, more powerful and coherent than MLisp. GNU Emacs became dominant. An active community developed extensions for everything: email, discussion forums, specialized editing for programming languages, debugging interfaces.

Emacs proved that extensible and customizable software can adapt to everyone’s needs. This model inspired numerous subsequent developments: a stable core combined with a powerful extension language. Modern development environments adopted this approach, integrating programming languages, interface customization, and task automation.

Emacs’s legacy is found in its self-documentation, which provides access to command and function explanations from within the editor, and has become standard for online help in interactive software.

GNU Emacs continues to evolve, faithful to its original philosophy. In a world where technologies become obsolete in a few years, its longevity testifies to the soundness of its principles: extensibility, power, and flexibility. It adapts to contemporary needs without renouncing what made it successful.

Microsoft Paint

Some software programs leave their mark through technical sophistication or innovative prowess. Microsoft Paint is not one of them. Yet since 1985, this modest drawing application has accompanied millions of Windows users and has shaped, almost inadvertently, an entire digital aesthetic.

The adventure begins in the 1970s, when two philosophies clash in the field of computer graphics: vector programs that work with precise geometric objects, and digital painting software that directly manipulates pixels. Dick Shoup, at Xerox PARC, develops SuperPaint in 1973, a system that captures hand movements on a pixel grid. This bitmap approach, less rigorous than vector graphics, offers a spontaneity that brings the computer closer to traditional artistic gesture.

Microsoft discovers this path when Apple triumphs with MacPaint on its first Macintosh computers. In 1985, Windows 1.0 includes Windows Paint, coded by Dan McCabe. The program offers 24 tools in a spartan interface: pencil, brush, geometric shapes, along with some advanced functions like Bézier curves or isometric drawing. But these ambitions run up against the technical realities of the era. Computers struggle, screens lack resolution, and Windows 1.0 doesn’t really convince anyone.

In the following years, Microsoft turns away from Paint. The company collaborates with IBM on Presentation Manager, a graphical operating system project that ultimately fails. Paint stagnates in Windows 2.0, victim of this strategic reorientation that leads nowhere.

The awakening comes with Windows 3.0 in 1990. Microsoft abandons the idea of developing all its applications in-house and calls upon existing publishers. Microsoft Paintbrush replaces Windows Paint, based on ZSoft’s PC Paintbrush, created by Mark Zachmann. The transformation is spectacular: color makes its appearance with 256 shades available, the tool palette migrates to the left side, and colors line up at the bottom of the screen. The interface gains coherence and readability.

Windows 95 definitively transforms Paint. Renamed Microsoft Paint (or MS Paint for short), it becomes a standard component of every Windows installation. This ubiquity coincides with the explosion of the Internet, which mutates from a textual medium into a visual universe. Between 1995 and 2000, nearly one in two Americans discovers the Internet, and Paint is their first contact with digital graphic creation.

The tool then develops its particular aesthetic signature. The exclusive use of the mouse, combined with the bitmap approach, produces shaky lines, approximate contours, color areas with sharp borders. These technical “flaws” create a visual language recognizable among thousands. Where other software aims for perfection, Paint embraces its imperfections and transforms them into expressive characteristics.

Curiously, Microsoft leaves Paint in this state for more than ten years. From 1995 to 2007, features evolve little. This inertia is explained by a paradigm shift: Windows no longer needs to seduce through its integrated applications. Paint fulfills its role as a basic tool for simple tasks, without pretending to compete with Photoshop or Illustrator.

In the 2000s, the Internet sees the birth of the first visual memes. Communities on Something Awful or 4chan adopt Paint’s “amateur” aesthetic as a banner. The 2008 Rage Comics perfectly embody this movement: rudimentary comic strips with expressive characters, entirely created with Paint. The imprecision is deliberate, the amateurism claimed.

Microsoft does attempt to modernize the experience in 2012 with Fresh Paint, a touch application that simulates traditional painting. But this smooth and sophisticated version masks its digital nature instead of embracing it. Meanwhile, tools like Rage Maker proliferate to facilitate meme creation in authentic Paint style.

Paint’s story reveals how constraints shape creativity. Its technical limitations, which could have constituted handicaps, have almost become distinctive assets. Paint has never figured in innovation rankings, but it has left its own mark on visual culture.

This trajectory illuminates the importance of “ordinary” technologies in computer evolution. Spectacular innovations capture attention, while modest tools quietly structure our daily practices. Paint survives in Windows, witness to a period when creating digitally became accessible to the masses, with its awkwardness and particular poetry.

X Window System

In 1984, at MIT, two seemingly unrelated projects would converge toward a major transformation in graphical display for UNIX systems. On one side, the Argus system at the Laboratory for Computer Science needed a debugging environment for its distributed applications. On the other, Project Athena had to manage thousands of workstations equipped with bitmap displays. Faced with the diversity of hardware – Digital VS100 machines, IBM workstations, and other heterogeneous architectures – engineers realized they needed to completely rethink their approach to graphical display.

The inspiration came from Stanford, where Paul Asente and Brian Reid had created the W system as an alternative to VGTS. W already allowed transparent network access for display, thanks to the synchronous communication mechanism of the V system. This system provided text windows for ASCII terminal emulation and graphics windows built on a simple display list mechanism. Asente and Chris Kent adapted W for VS100 machines at Digital’s research laboratories, but a few days of experimentation revealed the limitations of this approach: a hierarchical window system accessible via network would be far more efficient, provided it wasn’t limited to specific application modes.

X Window System was born from this reflection. Its client-server architecture reverses the usual logic: the server controls the physical display while client applications communicate with it via an asynchronous protocol carried over a reliable byte stream. This conceptual inversion changes everything. It becomes possible to run a program on any machine on the network and use any display, without worrying about the hardware architecture or underlying operating system. Performance remains remarkable: a VAXStation-II/GPX displays 19,500 characters per second and draws 3,500 short vectors per second, whether locally or via network.
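
A few lines of Xlib are enough to see this inversion at work. The sketch below is only an illustration: it assumes a system with libX11 installed and an X server reachable through the DISPLAY environment variable, and it issues exactly the same protocol requests whether that server is local or at the other end of the network.

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        /* Connect to the server named by DISPLAY; this program is the
           client side of the protocol, wherever the server runs. */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 320, 200, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));

        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);            /* ask the server to show the window */

        for (;;) {                       /* events come back asynchronously */
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == KeyPress)
                break;
        }

        XCloseDisplay(dpy);
        return 0;
    }

Nothing in the program depends on the display hardware; the server encapsulates those details, which is why the same client can drive a monochrome terminal or a color workstation.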

Among the innovations introduced by the system at the time, windows organize themselves in arbitrary hierarchies, resize and overlap freely. The server encapsulates all hardware dependencies, making the communication protocol completely device-independent. An application runs without modification on a monochrome terminal or a sophisticated color display. The protocol minimizes network latency through asynchronous requests and fine-grained event management.

Color management reveals X’s technical sophistication. The proposed abstract model adapts to monochrome screens as well as the most advanced pseudo-color displays. Multiple applications can share a common color table while maintaining their own colorimetric spaces. This approach allows the harmonious coexistence of multiple graphical applications without resource conflicts.

In 1987, DEC integrated X into its commercial products, validating the technical approach. MIT distributed the complete source code without restrictions, accelerating adoption by universities and companies. This open-source strategy before its time transformed X into a de facto standard. Academic researchers and commercial engineers collaborated on development, creating a rich and diverse ecosystem. The system was ported to more than seven different hardware architectures and sixteen display types.

The 1990s saw the birth of GNOME and KDE, desktop environments that demonstrated X11’s robustness. But limitations appeared: the font system proved insufficient for modern needs, 2D graphics capabilities required improvements, accessibility for the visually impaired remained difficult to implement.

Xft solved typographical problems by allowing direct access to font files and introducing antialiasing. Fontconfig separated character management from the windowing system, simplifying internationalization and installation of new fonts. Cairo brought advanced graphics capabilities: Bézier curves, translucent image composition, high-quality vector rendering. These developments integrated naturally thanks to X’s modular architecture.

The Composite protocol transformed visual management by allowing translucent windows and sophisticated effects. XEvIE improved accessibility by intercepting and transforming input events. These extensions kept X relevant in the face of growing demands from modern interfaces, without breaking compatibility with existing systems.

Security was never neglected. X implements sophisticated authentication and authorization mechanisms from its inception. Standalone terminals request login services from remote hosts securely with the X Display Manager Control Protocol. Administrators can centralize access management while preserving the user experience of traditional terminals.

X’s client-server model and modular design inspire contemporary software architectures. By establishing a standard for distributed graphical systems, X shapes the evolution of user interfaces on Linux and related systems.

In 2004, the X.Org Foundation took over project governance and continued development. Innovations continue: high-resolution display support, hardware acceleration via OpenGL, adaptation to emerging display technologies. Wayland appears as an alternative for certain use cases, but X Window System remains in 2024 the reference windowing system on many UNIX and Linux platforms.

X’s architectural principles – modularity, extensibility, hardware independence – retain their relevance in the development of contemporary interfaces. More than a technology, X illustrates the effectiveness of the collaborative approach where universities, companies, and communities coordinate their efforts to create sustainable infrastructure.

Intel 80386

When Intel launched the 80386 in October 1985, no one really expected this processor to revolutionize computing. In the company’s corridors, it was considered more of a transitional product, a stopgap solution until the arrival of the real flagship project: the iAPX 432. This ambitious processor kept accumulating delays, and the 386 had to fill the void. History would remember that it was ultimately this “temporary solution” that would permanently transform the computing landscape.

The project began in early 1982 under the direction of John Crawford, the chief architect. His team of about ten engineers faced a difficult dilemma: should they break with the past to create a cutting-edge processor or maintain compatibility with existing systems at the risk of limiting performance? Discussions followed one after another. Intel’s customers were clear: they did not want to rewrite their software. Total compatibility at the object code level was therefore naturally required.

This constraint did not prevent Crawford’s team from thinking big. The 386 definitively abandoned the 16-bit architecture of its predecessors to move to 32-bit. Registers, arithmetic and logic unit, internal buses: everything shifted to the new dimension. The processor could now directly address 4 gigabytes of physical memory, a capacity that seemed gigantic at the time. Modern operating systems like UNIX or OS/2 could now rely on hardware-supported paged virtual memory.
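
To give an idea of what that flat 4-gigabyte space meant in practice, the 386’s paging unit cuts a 32-bit linear address into three fields: a 10-bit page-directory index, a 10-bit page-table index, and a 12-bit offset inside a 4 KB page. The C fragment below only illustrates that arithmetic; it is not Intel or operating-system code, and the sample address is arbitrary.

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t linear = 0x00403abcu;              /* any 32-bit linear address */
    unsigned dir    = (linear >> 22) & 0x3ffu;  /* page-directory index      */
    unsigned table  = (linear >> 12) & 0x3ffu;  /* page-table index          */
    unsigned offset =  linear        & 0xfffu;  /* offset within a 4 KB page */

    printf("directory=%u table=%u offset=0x%03x\n", dir, table, offset);
    return 0;
}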

Manufacturing the 386 required leading-edge technical innovations. Intel abandoned NMOS technology in favor of CMOS, reducing power consumption and improving signal quality. But it was primarily the two-level metal fabrication process that posed problems. This new technology generated significant production defects. Engineers discovered that the “forbidden zone” between metal traces greatly complicated production. Yields dropped, costs skyrocketed, and the team had to constantly adjust parameters.

Development resources remained modest. About ten engineers worked with often makeshift tools. The team made a bold decision by adopting UNIX as the development operating system, despite the absence of official authorization. This transgression proved successful. Automation of component placement and routing, achieved through software developed by a Berkeley student, significantly accelerated the design.

The commercial launch held a surprise. Compaq, not IBM, released the first personal computer equipped with the 386. IBM no longer controlled the market’s technological evolution alone—this was a breakthrough in the industry. The Compaq Deskpro 386 became a reference that forced IBM to react. Intel discovered that demand exceeded its initial forecasts.

To meet this demand while controlling costs, Intel developed the 386SX. This simplified version retained the internal 32-bit architecture but used an external 16-bit bus. The compromise appealed to budget computer manufacturers who could offer modern architecture at a reduced price. The 386SX achieved significant commercial success, democratizing access to 32-bit capabilities.

The 386’s performance evolved steadily: the first version, clocked at 16 MHz, delivered 5 MIPS, and later versions reached 33 MHz and 9.9 MIPS. These figures represented a considerable leap compared to previous processors. The 386 transformed personal computing by enabling smooth execution of graphical applications and multitasking systems.

The processor’s success extended far beyond the PC market. UNIX workstations adopted it massively. Industry integrated it into automated control systems. Scientific research made it a reference component for its equipment. More surprisingly, the 386 found its place in space: the Hubble telescope and the SAMPEX satellite carried hardened versions of the processor.

The first sophisticated mobile devices adopted the 386. The BlackBerry 950 and Nokia 9000 Communicator integrated low-power versions of the processor. This exceptional longevity testified to the robustness of the architecture designed by Crawford and his team. The 386 remained in production for years, well after the arrival of its successors.

The 386’s influence exceeded its commercial success alone. Following this experience, Intel adopted a development strategy alternating architectural innovations and manufacturing optimizations. This approach, later called “tick-tock”, guided the company’s strategy for decades. At Intel, the principle of backward compatibility established with the 386 guarantees the sustainability of software investments—a true dogma.

The 386 perfectly illustrates how computing history sometimes proceeds through fortunate accidents. Designed as a temporary solution, this processor ended up establishing the foundations of modern computing. Its legacy endures in every personal computer, reminding us that in technology, pragmatic solutions often prevail over the most ambitious projects.

AppleTalk

In the early 1980s, Apple Computer embarked on an ambitious challenge: creating a network protocol in parallel with the development of the Macintosh. The idea seemed bold, but it addressed a concrete need. Users wanted to share their files and printers without wrestling with complex technical configurations. AppleTalk was born from this desire to simplify network computing.

The protocol concealed its sophistication behind a remarkably transparent interface. Computers communicated with each other without users needing to understand the underlying mechanisms. This philosophy stood in stark contrast to the systems of the era, often intimidating and reserved for experts. Apple bet on the invisibility of technical operations: only the result mattered.

Two main versions emerged. Phase 1 targeted small workgroups but suffered from constraining limitations. It was impossible to exceed 135 machines on a segment, and wide area networks remained out of reach. These restrictions hindered AppleTalk’s expansion in enterprises. Phase 2 corrected these flaws by allowing up to 253 machines per segment and managing wide area networks. This evolution paved the way for more ambitious deployments.

AppleTalk’s flexibility was expressed through its physical implementations. LocalTalk exploited Apple’s proprietary twisted pair cabling, while TokenTalk relied on Token Ring networks. EtherTalk leveraged Ethernet, and FDDITalk exploited FDDI optical fibers. This modularity enabled technological transitions without a complete system overhaul.

The protocol architecture was methodically organized. The DDP (Datagram Delivery Protocol) transported packets from one point to another. The AARP (AppleTalk Address Resolution Protocol) established the correspondence between logical and physical addresses. The RTMP (Routing Table Maintenance Protocol) kept routing information up to date. Each protocol fulfilled its function without encroaching on the others.

Name management relied on ingenious mechanisms. The NBP (Name Binding Protocol) associated understandable names with technical addresses, while the ZIP (Zone Information Protocol) organized networks into logical zones. The AEP (AppleTalk Echo Protocol) verified that machines remained accessible. The ATP (AppleTalk Transaction Protocol) guaranteed exchange reliability for critical applications such as file sharing or printing.

Dynamic addressing constituted one of AppleTalk’s innovations. An identifier was automatically assigned to each machine upon connection. This approach prevented address conflicts while simplifying administration. The address combined three elements: a 16-bit network number, an 8-bit node identifier, and an 8-bit socket number to designate the relevant service.
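
As a rough illustration of the address format just described, the three fields could be laid out as follows in C; the structure, field names, and sample values are invented for this sketch and are not taken from Apple’s headers.

#include <stdint.h>
#include <stdio.h>

/* Illustrative AppleTalk (DDP) address; field sizes match the text above. */
struct atalk_address {
    uint16_t network;   /* 16-bit network number of the cable segment   */
    uint8_t  node;      /* 8-bit node identifier, picked dynamically    */
    uint8_t  socket;    /* 8-bit socket number designating the service  */
};

int main(void)
{
    struct atalk_address printer = { 0x0010, 42, 0x80 };   /* made-up values */
    printf("net %d, node %d, socket %d\n",
           printer.network, printer.node, printer.socket);
    return 0;
}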

The zone concept revolutionized network organization. Machines could be grouped according to functional rather than geographical criteria. The accounting department could thus gather its resources in a single zone, even if the computers were scattered across different buildings. This flexibility would appeal to many network administrators.

Transmission occurred without prior connection establishment. Packets traveled independently toward their destination. Higher-level protocols then added the necessary reliability mechanisms according to needs. This modular architecture facilitated system evolution and maintenance.

Cisco progressively enriched its AppleTalk integration. It added support for different EtherTalk versions, compatibility with VLANs, integration with WAN protocols like Frame Relay. These extensions pushed the protocol’s limits far beyond Apple’s initial intentions. The encapsulation of RTMP packets over IP also enabled AppleTalk traffic over the Internet.

Security remained AppleTalk’s poor relation, with the protocol delegating it to applications—a questionable choice by current standards but consistent with the philosophy of the 1980s. Cisco attempted to fill this gap with its distribution lists, a rudimentary but useful mechanism for controlling the propagation of routing information.

Cisco’s implementation presented a surprising peculiarity: it refused to transmit certain packets with identical local addresses. This restriction, contrary to Apple’s specifications, aimed to preserve the integrity of address tables. The manufacturer prioritized network stability over strict standards compliance.

AppleTalk’s decline began in the 1990s. TCP/IP progressively established itself as the universal standard for network communications. Its open nature and adoption by the Internet sealed the fate of proprietary protocols. In 2009, Mac OS X v10.6 Snow Leopard definitively abandoned AppleTalk. Apple thus came full circle by turning toward the open standards it had initially shunned.

With the AppleTalk protocol, a sophisticated network remained simple to use. Auto-configuration and automatic service discovery, central concepts of AppleTalk, still inspire current developments. Bonjour, AppleTalk’s spiritual successor at Apple, perpetuates this philosophy of simplicity in a TCP/IP world.

NTP

In the world of distributed systems, synchronizing time between distant machines represents an ongoing technical challenge. The Network Time Protocol solved this equation with elegance that spans decades. Its origin dates back to 1979, when a public demonstration revealed the Internet to the world via a transatlantic satellite link. This first appearance already highlighted the importance that temporal coordination would assume in network architecture.

Two years later, in 1981, the Internet Engineering Notes dedicated their reference IEN-173 to this synchronization technology. RFC-778 followed with the first public specification of the protocol. The initial design took an unexpected form: it was directly integrated into the Hello routing protocol, documented under RFC-891. This hybrid approach persisted for several years within Fuzzball, an experimental operating system that served as a testbed.

The year 1985 marked the official birth of NTP version 0. David L. Mills developed his version in Fuzzball while Louis Mamakos and Michael Petry simultaneously worked on a UNIX implementation at the University of Maryland. Their respective codes left lasting traces in current software, testament to the soundness of their architectural choices. RFC-958 documented this first formal version, focusing on the NTP packet header and the offset and delay calculations that remain unchanged today.
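
Those calculations can be sketched from the four timestamps carried by an NTP exchange: t1 when the client sends its request, t2 and t3 when the server receives it and replies, and t4 when the client gets the answer back. The sample values in this C sketch are invented; only the two formulas matter.

#include <stdio.h>

/* Classic NTP estimate from the four exchange timestamps (in seconds). */
static void ntp_estimate(double t1, double t2, double t3, double t4,
                         double *offset, double *delay)
{
    *offset = ((t2 - t1) + (t3 - t4)) / 2.0;   /* local clock offset vs. server */
    *delay  = (t4 - t1) - (t3 - t2);           /* round-trip network delay      */
}

int main(void)
{
    double offset, delay;
    ntp_estimate(100.000, 100.120, 100.121, 100.250, &offset, &delay);
    printf("offset = %+.3f s, delay = %.3f s\n", offset, delay);
    return 0;
}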

At that time, hardware constraints dictated performance. Modest networks and computers limited precision to around ten milliseconds on Ethernet. By comparison, transatlantic links generally maintained precision under 100 milliseconds.

Version 1 emerged three years later with RFC-1059. This document constituted the first complete exposition of the protocol and its algorithms, including the first versions of the filtering, selection, and clock discipline mechanisms. These developments built upon a series of experiments documented in RFC-956, where the theory of the filtering algorithm took shape. This version introduced client/server and symmetric modes, as well as the use of the version field in packet headers.

In 1991, the first refereed journal article devoted to NTP Version 1 appeared. This document presented the complete model to the technical community for the first time: architecture, protocol, and algorithms formed a coherent whole whose principles remain current. Scientific recognition of the protocol came with this publication.

NTP Version 2 arrived in 1989 with RFC-1119. Dennis Ferguson at the University of Toronto completely rebuilt it to faithfully respect the specification. This version brought two major innovations: the NTP control message protocol for administering servers and clients, and the cryptographic authentication system based on symmetric cryptography.

The specification for NTP Version 3 appeared in 1992. Its 113 pages included a remarkable appendix proposing a formal error analysis accompanied by a detailed budget covering all error contributions from the primary source to the final client. These works established the foundations of maximum and estimated error statistics, rigorously characterizing synchronization quality.

Lars Mathiesen at the University of Copenhagen then undertook a meticulous revision of version 2 to conform it to the version 3 specification. More than a year of intensive exchanges between specification and implementation resulted in a coherent formal model, an exemplary process of convergence between theory and practice.

The following years saw NTP continue to evolve. New features and algorithmic revisions accumulated while preserving interoperability with previous versions. This period saw the birth of Simple Network Time Protocol (SNTP) version 4, a simplified protocol compatible with NTP on IPv4, IPv6, and OSI, but lacking the sophisticated mitigation and discipline algorithms of its predecessor.

The worldwide deployment of NTP evoked the ham radio spirit. Each new country discovered using NTP represented a symbolic victory for the developers. Satisfaction reached its peak when a country’s national standards laboratory established a primary NTP server directly connected to its national time and frequency ensemble.

Maintaining configuration scripts and patch libraries constituted a thankless but necessary task. Harlan Stenn, a dedicated volunteer, currently manages this process using modern autoconfiguration tools. New versions first undergo testing on the DCnet research network, then in broader environments like CAIRN before publication on www.ntp.org.

Technological progress radically transformed NTP performance. Precision improved from hundreds of milliseconds in the turbulent Internet of the 1980s to tens of nanoseconds in the Internet of the new millennium. This evolution blended historical lessons and technological revivals, preserving those ham radio resonances with each new country appearing on the Internet with active NTP.

The arrival of affordable atomic clocks revolutionized the field. Rubidium models, with their drift of one microsecond per day, democratized access to a high-quality temporal reference. OCXO oscillators offered the best alternative for obtaining precise clocks in compact packaging. With temperature exerting the primary influence on the crystal’s oscillation rate, the oven enclosure of an OCXO controlled and compensated for these variations, limiting daily clock drift.

NTP retains the title of the oldest distributed application in continuous operation on the Internet. Its development continues in the face of new temporal synchronization needs. Data centers and cloud computing demand ever finer temporal precision, where every nanosecond counts for modern applications. This perpetual quest for temporal precision perfectly illustrates the evolution of the Internet, from an experimental network to the critical infrastructure of our digital society.

Microsoft Excel

In 1978, Dan Bricklin and Bob Frankston developed VisiCalc, which sold nearly a million copies and revolutionized microcomputing. Microsoft sensed the market’s potential, and in 1982 the Redmond firm launched MultiPlan, which adopted the R1C1 addressing system rather than VisiCalc’s A1 format. The software found its audience on CP/M systems but failed against Lotus 1-2-3 on MS-DOS. This defeat pushed Microsoft to rethink its strategy.

Excel was born in 1985, but exclusively on Macintosh. The choice seemed bold: Microsoft bet on the graphical interface, pull-down menus, and the mouse when most programs still operated in text mode. This vision proved successful. The Windows version arrived in 1987, bundled with a run-time version of the operating system for machines that did not yet have Windows installed.

The following years transformed Excel into a true Swiss Army knife of calculation. In 1990, version 3.0 added toolbars, drawing capabilities, 3D charts, and outline functions. Excel 4.0, in 1992, improved usability and introduced the fill handle, that small square that duplicates formulas into other cells with a simple drag. A gesture that became so natural that users struggle to imagine doing without it.

Multi-sheet workbooks changed the game with Excel 5.0 in 1993, but it was especially the arrival of VBA that revolutionized usage. Users discovered they could record their repetitive actions as macros. Overnight, Excel moved beyond simple calculation to become an accessible programming tool.

The transition to 32-bit architecture with Excel 95 brought stability and speed. Excel 97 blessed us with the Office Assistant, that animated paperclip that annoyed as much as it helped, and perfected data validation. A more sophisticated VBA environment attracted a growing community of developers.

Excel 2000 improved the multiple clipboard and added the auto-repair function. Excel 2002 introduced file recovery after crashes, a lifesaving feature for anyone who has lost hours of work. Excel 2003 focused on XML support and list ranges.

2007 disrupted habits. The Ribbon replaced traditional menus, disconcerting seasoned users but winning over novices. The XLSX and XLSM formats, based on XML, modernized data storage. Excel 2010 added sparklines, small charts embedded directly in cells, and more powerful filters.

This technical evolution masked a more complex reality. While Excel certainly established itself in businesses for financial analysis, project management, and statistical modeling, its ease of use concealed formidable pitfalls. Errors easily crept into complex formulas, and their consequences could be dramatic.

JPMorgan Chase learned this the hard way in 2012. An error in an Excel model calculating risks cost the bank $6.2 billion. Fidelity Magellan experienced a similar incident in 1995: an accountant forgot a minus sign in front of a $1.3 billion loss, completely distorting dividend estimates. These accidents served as reminders that Excel, despite its apparent simplicity, often handles enormous sums.

This dual nature constitutes both Excel’s strength and weakness. On one hand, it democratizes data analysis and makes numerical computation accessible to millions of non-specialist users. On the other, it generates a sometimes dangerous dependency in fields where precision matters. The structured finance and quantitative analysis sectors owe their rise to it, but also some of their most bitter disappointments.

Recent versions attempt to reconcile these contradictory requirements. Artificial intelligence enters the formulas, visualization tools become more sophisticated, processing capabilities expand to handle ever-larger data volumes. Excel adapts to cloud computing and collaborative work, areas where it had fallen considerably behind.

Microsoft isn’t letting go. Recent developments show a determination to keep Excel at the heart of modern professional practices. The integration of advanced programming languages like Python, improved analytics, and adaptation to new formats demonstrate this ambition.

More than forty years after VisiCalc, Excel embodies the evolution of professional computing tools. Its trajectory reflects the transformations of office work and the progressive computerization of society. Its persistence in a perpetually changing technological world demonstrates its unique ability to meet fundamental information processing needs, despite its imperfections and the risks it generates.

Microsoft Windows

In 1983, Bill Gates announced that Microsoft was working on a graphical interface. The idea was nothing truly novel—Xerox had already explored this path at PARC, Apple had just released Lisa and was preparing the Macintosh. But Microsoft wanted its piece of the pie and fully intended to transform the user experience on PC.

November 1985 saw the birth of the first version of Windows. The result disappointed. This graphical layer placed on top of MS-DOS offered windows, pull-down menus, and a few icons. Users, accustomed to typing their commands, struggled to adopt this novelty. Sales remained minimal.

Two years later, Windows 2.0 corrected several flaws. Windows could now overlap instead of neatly aligning in a mosaic pattern. Microsoft was then collaborating with IBM on OS/2 and seeking visual consistency between its different projects. This version went largely unnoticed.

The real takeoff occurred in 1990 with Windows 3.0. This time, Microsoft had a product that worked. The interface displayed 16 colors, exploited the capabilities of 386 processors, and offered three components that would make history: Program Manager, File Manager, and Print Manager. These tools simplified users’ lives and made the computer accessible to the general public. Sales soared and caught up with Apple’s.

Windows 3.1 arrived in 1992 with enhanced stability. OLE technology made its debut by enabling the integration of objects from one application into another. The first multimedia elements appeared on PC. Windows for Workgroups 3.11 added network functions essential to companies that were beginning to connect their machines.

Microsoft developed Windows NT in parallel, a system designed for professionals. Released in 1993, NT broke with MS-DOS and offered an entirely new architecture. The company targeted the server market, UNIX’s stronghold. NT’s robustness earned it C2 certification from the U.S. government, a mark of credibility in the corporate world.

August 1995 marked a milestone in personal computing history. Windows 95 arrived with its Start menu and taskbar, two elements that would become familiar to millions of users. “Plug and Play” simplified hardware installation. The 32-bit architecture improved performance. Microsoft spent $300 million on advertising and used “Start Me Up” by the Rolling Stones to accompany the launch. Computing definitively entered households.

Windows 98 refined the formula three years later. The Internet was gaining importance, and Microsoft facilitated connections. USB made its debut, the FAT32 format increased hard drive capacity. The system gained stability and offered automatic maintenance tools.

In February 2000, Windows 2000 finally unified the consumer and professional branches. Based on NT, this system brought the long-awaited reliability. Active Directory revolutionized enterprise network management. Multiprocessor support became a reality on workstations.

Windows XP, released in 2001, completed this evolution. The “Luna” graphical interface modernized the system’s appearance. Hardware compatibility improved, multimedia found its place. XP combined the user-friendliness of home versions with the robustness of professional editions. This version would remain popular for a decade.

The 2000s brought new concerns. The Internet was becoming widespread but brought its share of viruses, Trojan horses, and other malware. Microsoft responded with regular patches and strengthened security. XP’s Service Pack 2 introduced an integrated firewall and security center. The war against computer threats began.

This technical history was accompanied by commercial controversies. The integration of Internet Explorer into Windows prompted antitrust lawsuits in the United States and Europe. Competitors denounced Microsoft’s practices and dominant position. These legal battles marked the 1990s and 2000s.

Windows’s evolution reflected that of personal computing as a whole. From the austere text mode of MS-DOS to modern colorful interfaces, Microsoft managed to maintain compatibility with existing applications while innovating. This continuity partly explains the system’s persistent dominance.

Windows runs on 72% of personal computers worldwide in 2025. Four decades after its creation, Microsoft’s system withstands assaults from its competitors. Smartphones and tablets have disrupted the market, but Windows maintains its place on desktops and in enterprises.

Amstrad PCW

In 1985, European personal computing took an unexpected turn with the arrival of the Amstrad PCW. Far from the colourful gaming machines that dominated the home market at the time, this British computer adopted a radically different philosophy: making word processing its sole purpose.

Alan Sugar, head of Amstrad, envisioned an integrated system comprising keyboard, screen, processing unit, and dot-matrix printer. His specifications could be summed up in a few words: simple, affordable, effective. The Z80 clocked at 4 MHz formed the technical heart of the system. This 8-bit processor allowed developers to build upon their existing expertise rather than starting from scratch with the fashionable 6502. The 256 KB memory of the PCW 8256 could be doubled in the higher model, a respectable capacity for the time. But it was the choice of the 3-inch floppy disk format that most marked this machine’s history: proprietary, it temporarily protected it from competition whilst mortgaging its future against the 3.5-inch format that was becoming standard elsewhere.

The monochrome screen of 720×256 pixels, with its characteristic green tint, displayed text exclusively. This deliberate limitation corresponded perfectly to the intended use: LocoScript, the integrated word processing software, transformed each PCW into a sophisticated electronic typewriter. The CP/M operating system opened access to an existing software library, inherited from previous professional machines.

Commercial success exceeded all predictions. At £400 with the dot-matrix printer included, the PCW appealed to an audience that nobody had really targeted before: independent secretaries, small businesses, writers, local government offices. In France particularly, these white and green machines colonised the offices of independent professionals and public services. Over one million units found buyers between 1985 and 1990, a remarkable figure for such a specialised product.

Successive iterations refined the initial concept without betraying it. The 1987 PCW 9512 replaced the dot-matrix printer with a daisy-wheel printer, finally producing letter-quality documents. Its entirely white casing modernised the aesthetics whilst preserving the proven ergonomics. The PCW 9256, the final version from 1991, brought some minor technical improvements without revolutionising the usage.

LocoScript itself evolved, progressively integrating French accents, basic spell-checking, and simple table management. Third-party software enriched the ecosystem: accounting, client file management, and a few games for moments of relaxation. The LocoLink interface enabled document exchange with MS-DOS PCs, a valuable opening when these began to become more affordable.

Yet the concept’s weaknesses appeared over time. No colour, no proportional fonts on screen, noticeable slowness compared to 16-bit machines. Compatible PCs, which became affordable in the early 1990s, offered more versatility for a comparable price. The PCW, prisoner of its specialisation, gradually lost its commercial relevance.

This machine proved that a personal computer could choose efficiency over sophistication, simplicity over raw power. Today, many PCWs still function perfectly, testament to their exceptional build quality.

The Amstrad PCW embodies a particular vision of computerisation: rather than imposing a complex and general-purpose system, it’s better to design a tool suited to a specific need. This approach allowed thousands of users to discover digital word processing without technical intimidation. Secretaries, craftspeople, associations: all found in this white machine their first gateway to professional computing.

This European success, facing American PC standardisation, shows that several possible paths existed to democratise computing. Amstrad explored one of them successfully, proving that demand existed for simple and affordable solutions.

Atari 520 ST

After leading Commodore International to the success of the Commodore 64, Jack Tramiel left the company in January 1984, worn out by a brutal price war against Texas Instruments. This break led him to found Tramel Technology Ltd with his sons and a few loyalists, notably Shiraz Shivji, one of the brains behind the C64.

The small team embarked on the adventure of an affordable 16-bit personal computer. The choice of processor proved tricky: the National Semiconductor NS32000 failed to deliver on its performance promises. It was finally the Motorola 68000 that won the decision, despite its higher cost. Meanwhile, Atari Inc. was sinking. The 1983 video game crash was costing Warner Communications, its parent company, around a million dollars a day. Jack Tramiel smelled an opportunity and bought Atari’s consumer division in July 1984 for 240 million dollars in stock.

The technical team accomplished the development of the 520ST in just five months, a small miracle that seems impossible today. The name tells the machine’s story: 512 KB of RAM behind the “520”, and “ST” for the “Sixteen/Thirty-two” architecture that blended the 16-bit and 32-bit worlds. Microsoft did offer to adapt Windows, but the schedules didn’t align. Digital Research saved the day with a GEM license, which would give the Atari its characteristic graphical interface.

At the January 1985 Consumer Electronics Show, the 520ST’s specifications caused a sensation. The 8 MHz Motorola 68000 hid a 32-bit internal architecture behind a 16-bit external bus. Three graphics modes coexisted: 320 x 200 in 16 colors for games, 640 x 200 in 4 colors for a balanced compromise, and especially that 640 x 400 high-resolution monochrome mode that would win over professionals.

Atari’s engineers slipped a few clever finds into their machine. Those integrated MIDI ports for the modest sum of 75 cents, thanks to downgraded Motorola chips, would transform the 520ST into an electronic music instrument. The 3.5-inch floppy drive stored 360 KB, a respectable capacity when most competitors still stuck with the 5.25-inch. The TOS operating system initially loaded from a floppy before permanently migrating to ROM.

May 1985 saw the first 520STs arrive in stores, narrowly beating Commodore’s Amiga 1000. Jack Tramiel played the aggressive pricing card: 799 dollars with a monochrome monitor, 999 dollars in color. Against professional machines that easily exceeded 2,000 dollars, the argument carried weight. More than 50,000 units found buyers in six months, mainly in Europe where the reception was warm.

The machine seduced unexpected audiences. Musicians discovered the joys of MIDI with software like Cubase and Notator, Logic’s ancestor, which would go on to great careers. Amateur programmers played with GFA BASIC and STOS. As for desktop publishing professionals, they appreciated that high-resolution mode, especially since Macintosh emulation ran faster on the ST than on a real Mac!

European success was no accident. This compact, integrated design matched Old World tastes. The slogan “Power Without the Price” hit the mark in the United Kingdom, won over by affordable home computers. In the United States, things were more complicated: Jack Tramiel maintained execrable relations with retailers, who returned the favor.

The 520ST’s influence exceeded its mere technical performance. Its ability to read MS-DOS floppies facilitated exchanges with the PC universe. Those standardized MIDI ports created an entire ecosystem that still survives today. Jean-Michel Jarre, Fatboy Slim, and Madonna used it in the studio and on stage, creating a legend that stuck to the machine.

The software catalog expanded. Dungeon Master arrived in 1987 and revolutionized the first-person role-playing game. Publishers like Batteries Included explored the graphical possibilities with programs like DEGAS. The nascent “demo scene” pushed the machine to its limits, creating works of digital art that defied all technical logic.

The range evolved through versions. The 520STM added an RF modulator to connect to a television. The 520STFM integrated the power supply and drive in a redesigned, more family-friendly case. TOS gradually migrated from floppy to ROM, simplifying use.

In 1993, production of the 520ST stopped, swept away by the wave of IBM-compatible PCs. Yet the machine kept its faithful followers, especially among musicians who appreciated its low MIDI latency. It embodied that era when technical innovation rhymed with democratization of technology, when a handful of engineers could still shake up the computer industry.

Toshiba T1100

In April 1985, Toshiba revolutionized personal computing with the T1100. For the first time, a laptop computer offered complete compatibility with the IBM PC. This 4 kg machine marked the birth of modern mobile computing.

The era seemed hardly conducive to such innovation. Personal computers, the Apple II since 1977 and the IBM PC since 1981, remained confined to desktops. A few manufacturers did attempt to create transportable machines: Compaq’s Portable I weighed 12.5 kg in 1982, while Epson released its 1.6 kg HX-20 (sold as the HC-20 in Japan) the same year, followed by NEC’s PC8201A in 1983. But these attempts faced a technical dilemma: either maximize miniaturization with incompatible proprietary architectures, or aim for IBM PC compatibility while accepting bulk and limitations.

Toshiba’s teams refused this compromise. Their motto "anywhere, anytime, anyone" expressed a clear ambition: to transform the desktop computer into a mobile companion. The company theorized this approach as a "revolution on the desk," a shift from machine-centered computing toward human-centered use. Three objectives guided development: abandoning closed architectures, making the computer truly portable, and enabling the use of standard software without specific programming.

The technical achievement was an engineering marvel. CMOS integrated circuits replaced power-hungry NMOS technology. Where the IBM PC required nearly 100 circuits for display, the T1100 made do with five. This radical optimization produced a compact device with unprecedented battery life. Energy was a field of innovation. Toshiba created from scratch a charge-discharge system and power management software. These developments would lay the groundwork for the resume function introduced in 1989 on the J3100SS001 model, then the ACPI standard developed with Microsoft and Intel.

The high-contrast LCD screen, mounted on a tilting stand, resulted from extensive studies on visual ergonomics. But the real commercial challenge emerged elsewhere: the choice of 3.5-inch floppy disks. While the IBM PC operated with the 5.25-inch format, Toshiba bet on the compact format. Software publishers, however, were reluctant to distribute their products on this still-marginal medium, and Toshiba’s sales teams had to mount an intensive persuasion campaign to win them over to 3.5-inch.

The marketing strategy broke with industry habits. No more technical arguments! Toshiba concretely demonstrated how to work mobile with the same software as at the fixed office. This pragmatic approach won over non-technical professionals and considerably expanded the market.

Commercial success confirmed this intuition. A study revealed two distinct expectations: portability on one side, more power and capacity on the other. Toshiba responded in January 1986 with two new models. The T2100 featured an i8086 processor and two 3.5-inch drives. The T3100 moved upmarket with an i80286 and a 10-megabyte hard drive. These machines required mains power, but PC Week validated this choice by recognizing the existence of a market for plugged-in laptops. Byte magazine called the T3100 the "King of Laptops." In 1987, Toshiba reached approximately 40% market share of the 130,000 laptops sold in Europe.

The economic impact exceeded all predictions. In 2008, the market for LCD screens under 17 inches represented $15 billion. That of 2.5-inch and smaller hard drives reached $19 billion. Lithium-ion batteries for mobile equipment generated $12 billion in revenue. A year later, laptop sales surpassed desktop machines in a market estimated at $150 billion.

The T1100 demonstrated that a user-centered approach transforms technology sectors. By prioritizing software compatibility, portability, and simplicity over raw performance, Toshiba took the computer out of IT departments and placed it in executives’ briefcases. This democratization of mobile computing heralded the current ubiquity of portable devices in our lives.

Aldus PageMaker

In 1984 in Seattle, Paul Brainerd left his position at Atex to found Aldus Corporation. He was no stranger to the publishing world. A journalist by training, he had overseen the modernization of the Minneapolis Star and Tribune’s systems before joining this manufacturer of computer systems for publishing professionals. This experience gave him firsthand insight into a problem: composition equipment cost a fortune, often several hundred thousand dollars, and remained inaccessible to smaller organizations.

With its graphical interface, Apple’s Macintosh opened unprecedented possibilities. Brainerd sensed that democratizing page layout tools was possible. He assembled a small team of engineers and developed a prototype in six months. Finding funding proved far more complicated: of fifty venture capital firms contacted, only one, Vanguard, agreed to bet on this project. At the time, investors were wary of software companies, deemed too risky in a sector where Microsoft had not yet gone public.

On July 13, 1985, PageMaker 1.0 was born. This date was no accident: it came seven months after Apple’s LaserWriter printer launch. Brainerd understood that his software needed to leverage Adobe’s PostScript language capabilities, integrated into this printer, to produce professional-quality documents. He spoke of the “three-legged stool” of desktop publishing, a term he actually coined during a meeting with his investors in late 1984.

This term “desktop publishing” hit the mark. Simple and effective, it perfectly captured the project’s ambition: to bring professional page layout tools within everyone’s reach. PageMaker’s interface embodied this philosophy. The software simulated a real layout table where users could place and move text and images at will. The floating tool palette was a notable innovation, providing access to key functions. The WYSIWYG principle ensured that what appeared on screen exactly matched the printed result.

The success exceeded all expectations. Users flocked from unexpected sectors. Churches began producing parish bulletins sometimes printed in runs exceeding 600,000 copies. Steve Jobs had pushed to keep the price under $200, but the $495 ultimately chosen proved judicious: this pricing generated the margins necessary for ongoing development and technical support.

Aldus also distributed FreeHand, a vector drawing software created by Altsys, then acquired Silicon Beach Software with its range of consumer applications. These acquisitions gradually built a complete suite of graphic creation tools.

The competition responded. Ventura Publisher arrived in 1986 on PC with a different approach, based on style sheets and suited for long documents. QuarkXPress gained ground in the professional sector. Microsoft tried its luck but had to withdraw its product due to critical technical flaws.

Beyond the software, Aldus shaped industry standards. Steve Carlson developed the TIFF (Tagged Image File Format) at Aldus, which became the reference for digital image exchange. The company later created OPI (Open Prepress Interface) to streamline integration with professional printing systems.

By 1990, margins eroded under competitive pressure and the evolution of distribution channels. PageMaker grew more complex, torn between the needs of occasional users and professional requirements. This growing complexity complicated development and increased costs.

In 1994, Aldus merged with Adobe Systems. This operation, prepared with meticulous care, united both companies’ technologies. The absorption proceeded remarkably well, creating a digital graphic creation giant.

PageMaker proved that an accessible computer tool could supplant prohibitively expensive specialized equipment. It established user interface conventions, but above all, it opened access to professional-quality publishing to individuals and organizations that had never been able to afford such luxury before.

Subsequently, PageMaker’s code nourished Adobe InDesign, its spiritual heir that now dominates the desktop publishing market. Paul Brainerd turned to philanthropy after the Adobe merger. His Brainerd Foundation works to protect the environment in the American Northwest.

This story shows how a clear vision, supported by rigorous technical execution and favorable timing, can transform an entire sector. It also underscores the importance of open standards and cooperation between companies in creating new markets.

IBM AIX

During the 1970s, UNIX spread through universities, but without a standardized commercial version. Each installation required adapting the source code to the local hardware. This technical constraint led to the creation of multiple variants, each reflecting the specificities of its environment.

IBM took interest in the UNIX phenomenon and launched AIX (Advanced Interactive eXecutive) in 1986, developed with the help of Interactive Systems Corporation. The name actually came from ISC, which had already created the PC-IX system for IBM personal computers.

IBM soon found itself juggling two distinct proprietary lines. AIX established itself on the IBM 6150 RT workstation in 1986, while the AS/400 and its OS/400 operating system followed in 1988.

AIX’s real rise to prominence came in 1990 with Version 3, renamed AIX/6000. It accompanied the RS/6000 platform which definitively buried the IBM RT. This RS/6000 generation inaugurated the era of POWER and PowerPC microprocessors at IBM. AIX finally found its machine of choice.

1994 marked an important technical milestone with AIX 4, which introduced symmetric multiprocessing on the first RS/6000 SMP servers. The system matured through version 4.3.3 in 1999, while IBM rethought its product strategy.

The year 2000 transformed IBM’s organization. The company unified its server lines under the eServer banner: AS/400 became iSeries, RS/6000 transformed into pSeries. Beyond the marketing, this restructuring concealed a technical reality. The arrival of the POWER4 processor in 2001-2002 brought the hardware platforms closer together. The differences between “p” and “i” systems were now limited to the software layers.

POWER5 took an additional step in 2004 by making the System i5 570 strictly identical to the System p5 570. AIX 5.3 arrived in August, taking advantage of this standardization. POWER6 followed in May 2007, accompanied by AIX 6.1 in November. IBM pushed the logic to its conclusion in April 2008 by merging its two lines under the Power Systems designation. A single hardware, three system choices: IBM i, AIX or Linux.

The 2010s followed at a sustained pace. POWER7 arrived in February 2010, AIX 7.1 in September. POWER7+ landed in August 2012, then POWER8 in June 2014. This latest processor broke with tradition: each core handled eight hardware threads in parallel, a massively multithreaded architecture that exploited large caches to maximize memory and I/O throughput.

AIX 7.2, announced in October 2015, brought a remarkable feature: Live Kernel Update. Kernel patches could now be applied without disrupting running applications. This version also cleaned up obsolete components and restructured the network with bos.net.tcp.client, but it absolutely required POWER7 or a later generation.

POWER9 arrived in 2016 with its 14 nm fabrication and 3.9 GHz frequency. Performance gains reached 50% compared to POWER8, and doubled versus POWER7+. On AIX 7.2 TL 3, SMT8 mode activated automatically, fully exploiting the processor’s capabilities.

Today, AIX powers the most critical infrastructures. Finance, industry, retail, telecommunications, healthcare, government and public sector: wherever reliability is non-negotiable. The system has proven its ability to evolve with new hardware architectures without sacrificing compatibility with existing systems.

This technical longevity perfectly illustrates IBM’s strategy: converging its server lines toward a unified platform while preserving customers’ software investments. AIX embodies this continuity in innovation, heir to UNIX but resolutely oriented toward the future of enterprise computing.

Eiffel

September 23, 1985. In the offices of Interactive Software Engineering, a small Californian company in Santa Barbara, Bertrand Meyer works on an internal project that will disrupt many established practices in the world of object-oriented programming. On that day, Eiffel is born, though it will not be until mid-1986 that the first operational version runs.

Meyer does not start from scratch. Since 1973, he has been working with object-oriented concepts using Simula 67. Jean-Raymond Abrial’s work on formal specification fascinates him, particularly his original version of the Z language. Liskov, Zilles, and Guttag have broken ground in abstract data types, a field Meyer has explored. The prestigious lineage of Algol 60, Algol W, Pascal, and Ada shapes the general architecture of the new language, while Floyd, Hoare, and Dijkstra contribute their expertise in program proof and axiomatic semantics.

This particular alchemy produces something unusual. In September 1986, at the first “Object-Oriented Programming, Systems, Languages & Applications” conference in Portland, the presentation of Eiffel causes a sensation. The concepts developed surpass what exists at the time, both in industry and research laboratories. Commercialization follows with the internal compiler becoming available in December 1986. Companies and universities can finally get their hands on it.

Version 2 arrives in 1988, the year that sees the publication of “Object-Oriented Software Construction”, the book that will propel Eiffel into the spotlight. The 1997 edition, considerably expanded, confirms the language’s growing influence. Version 2.3 from 1990 marks the peak of ISE’s original technology.

Meyer then makes a bold decision: in 1990, ISE releases the language definition into the public domain. This openness triggers a flourishing of projects—compilers and libraries spring up everywhere. The language undergoes a general overhaul, simplifying in some areas, enriching in others, but retaining its original soul. This remarkable stability makes Eiffel a language that has hardly changed since its 1985 design.

Between 1990 and 1993, the ISE team completely rethinks its technology. Version 2.3 serves as the foundation for this complete reconstruction. ISE Eiffel 3 arrives in 1993 with its graphical development environment, followed by ISE Eiffel 4 in 1997 and Eiffel 5 in 2001. These versions bring their share of innovations: agents borrow from functional languages, introspection examines programs, the Precursor mechanism simplifies repeated inheritance, object creation gains flexibility, and conversion mechanisms are refined.

The BON notation emerges in 1995 with the publication of “Seamless Object-Oriented Software Construction” by Waldén and Nerson. This Business Object Notation extends Eiffel toward analysis and design, in a language understood by managers, analysts, and system architects.

The ecosystem grows. Object Tools in Germany develops Visual Eiffel, successor to Eiffel/S. The University of Nancy offers SmallEiffel as a free version, Halstenbach GmbH markets its own German solution. Libraries proliferate: 3D graphics, lexical analysis with Gobo, DirectX interfaces, variable precision calculation. Every domain finds its solution.

Industries discover Eiffel’s virtues. Banking adopts it, finance takes interest, accounting experiments with it. Telecommunications, healthcare, CAD-CAM, simulation, and scientific computing place their trust in it. The Rainbow system at CALFP bank tells this success story: initially designed for derivatives trading, it ends up supervising the majority of banking operations.

Universities embrace the language with enthusiasm. Many faculties make it the first language taught to students. Others integrate it at different levels of the curriculum, encouraged by attractive packages from publishers.

Why this name, Eiffel? Meyer pays tribute to Gustave Eiffel, the engineer who did not merely erect the Parisian tower. The metal framework of the Statue of Liberty in New York, a railway station in Budapest—so many enduring constructions emerged from his workshops. The Eiffel Tower, begun in 1887 for the 1889 Universal Exhibition, respects deadlines and budget. It survives political hostilities, resists attempts at destruction, finds new uses with radio and television. Built from robust and elegant patterns, combined and varied to produce a powerful result, it perfectly embodies what Eiffel accomplishes in the software world.

“Design by Contract” forges Eiffel’s identity. This approach advances software construction by founding it on contracts between clients and suppliers. Mutual obligations and benefits are made explicit through assertions, clearly establishing the responsibilities of each component. This philosophy produces more reliable and maintainable software, where each element knows its role and its limits.
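
As a loose analogy only (Eiffel states contracts natively with require and ensure clauses, checked by its runtime), the same discipline can be mimicked in C with plain assertions: the caller answers for the precondition, the routine answers for the postcondition. The example below is a sketch, not Eiffel code.

#include <assert.h>
#include <math.h>
#include <stdio.h>

/* Contract-style square root: the client must supply a non-negative
   argument (precondition); the supplier promises a result whose square
   is close to the argument (postcondition). Compile with -lm. */
static double checked_sqrt(double x)
{
    assert(x >= 0.0);                            /* require: client's obligation */
    double r = sqrt(x);
    assert(fabs(r * r - x) <= 1e-9 * (x + 1.0)); /* ensure: supplier's guarantee */
    return r;
}

int main(void)
{
    printf("%f\n", checked_sqrt(2.0));
    return 0;
}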

Erlang

In 1986, in the offices of the Ericsson Telecom AB computer laboratory, Joe Armstrong faced a technical puzzle: how to program telephony applications that defied all the conventions of the time? Telephone switches handled thousands of simultaneous calls, these operations were naturally distributed across multiple machines, and the slightest failure made headlines. Nothing like a desktop application that crashes without consequence.

Armstrong first explored different approaches, developed prototypes in Smalltalk, and invented a graphical notation to describe telephony operations. It was Roger Skagervall who made the decisive observation: this notation resembled Prolog. Armstrong then switched to this language, and thus Erlang was truly born.

The technical context at Ericsson influenced the design of the new language. PLEX, created by Göran Hemdahl to program AXE switches, imposed its constraints: modifying code without stopping the system, avoiding at all costs the pointer errors that had handicapped previous generations. Armstrong and his team integrated these lessons into Erlang’s DNA.

Robert Virding joined the project, and together they developed the first versions of the language, still in Prolog. The moment of truth came in 1989 with the ACS/Dunder project. The team developed 25 telephony features in Erlang, roughly one-tenth of those in the MD110. The results exceeded their expectations: depending on the features, development accelerated from 3 to 25 times compared to PLEX.

These successes attracted attention, and Erlang gained maturity. In 1993, Bogumil Hausman developed the Turbo Erlang system, renamed BEAM (Bogdan’s Erlang Abstract Machine), which drastically improved performance. Claes Wikström added distribution support, enabling execution across multiple machines.

But it was an unexpected event that would propel Erlang beyond Ericsson’s walls. In 1998, Ericsson Radio Systems banned the use of the language for new developments. This decision, which could have spelled the end of Erlang, produced the opposite effect. In December, Ericsson released the source code, and part of the original team left the company to found Bluetail AB, a company that would use Erlang as its core technology.

Erlang’s technical architecture reflects its original mission. The language structures programs around concurrent processes that share no memory and communicate through asynchronous messages. These processes belong to the language, not the operating system, which makes them remarkably lightweight. If one of them fails, the others continue their work as if nothing had happened.

The message reception system constitutes one of Erlang’s most remarkable innovations. A process can selectively wait for certain types of messages while leaving others pending. This approach greatly simplifies the programming of complex communication protocols, a considerable asset in the telecommunications world.

The AXD301, an ATM switch developed by Ericsson, is Erlang’s technological showcase. In 2001, this system comprised 1.13 million lines of code distributed across 2,248 modules. It achieved the coveted availability of 99.9999999%, demonstrating that the language was capable of handling large-scale industrial projects with exceptional reliability.

The OTP (Open Telecom Platform) system, developed from 1996 onward, enriched the Erlang ecosystem. This collection of libraries and design patterns offered "behaviours," abstractions that encapsulated common patterns such as the client-server model or event handling. OTP transformed Erlang from an experimental language into an industrial platform.

The arrival of multicore processors retrospectively confirmed the relevance of the initial choices. The concurrency model without data sharing naturally adapted to hardware parallelism. Erlang programs exploited multicore architectures without modification, a considerable advantage at a time when other languages struggled to take advantage of this evolution.

Beyond telecommunications, Erlang found its place in various domains. The ejabberd messaging server (Erlang Jabber Daemon) handled millions of simultaneous connections, CouchDB stored and distributed data at large scale, and RabbitMQ routed messages between distributed applications. These successes validated the original approach: the constraints of telephone systems were found in many contemporary applications.

The language initially addressed the particular requirements of Ericsson, but its solutions proved relevant for many current problems. The initial architectural choices, particularly the message-passing concurrency model, seemed radical but today enable natural exploitation of modern parallel architectures.


IMAP

In 1970, who could have imagined that electronic messaging would become one of the most universal applications on the Internet? At that time, messaging systems resembled internal address books, confined to users sharing a single machine.

The 1980s radically transformed the computing landscape. The arrival of individual workstations disrupted established practices. Users now demanded nomadic access to their messages from any workstation while maintaining a unified view of their mail. The POP protocol, launched in 1984, partially met this expectation by downloading messages locally. But this solution quickly showed its limitations when it came to managing emails stored on the server.

Mark Crispin observed these developments from Stanford University. In 1986, he developed IMAP, a radically different approach that placed the server at the heart of message management. The first public version, IMAP2, emerged in 1988. The major innovation lay in the ability to directly manipulate messages on the server without prior downloading. This architecture enabled the creation of multiple folders and offered sophisticated search functions within email content.
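
A brief sketch with Python’s standard imaplib module illustrates this server-centric model; the server name and credentials are of course hypothetical. Folders, searches, and flags all live on the server, and nothing has to be downloaded first.

    import imaplib

    # Hypothetical server and credentials, purely to sketch the model that set
    # IMAP apart from POP: the work happens on the server.
    with imaplib.IMAP4_SSL("imap.example.org") as imap:
        imap.login("user", "secret")
        imap.select("INBOX")                            # a server-side folder

        # The search runs on the server; only matching message numbers come back.
        status, data = imap.search(None, '(UNSEEN SUBJECT "invoice")')
        for num in data[0].split():
            # Flag the message in place, without fetching its body.
            imap.store(num.decode(), "+FLAGS", r"\Seen")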

IMAP’s technical architecture rested on a clear separation between two components: the message handling system (MHS) that orchestrated transfers, and the user agent (UA) that handled interaction with the user. This modularity opened an immense field for email client developers, free to innovate on user experience without worrying about transport mechanisms.

The 1990s marked IMAP’s gradual adoption by the Internet community. The protocol was enriched through successive versions: support for attachments via MIME, stronger authentication mechanisms, and disk-quota management. IMAP4 was published as an IETF standards-track specification in 1994, confirming its technical maturity.

The University of Pittsburgh offers a striking example of this transition. In 1997, the institution launched a massive project: migrating 25,000 users to IMAP. The operation, completed in 2000, required developing custom tools to convert the old messaging systems, training teams, and deploying a server infrastructure equal to the task. This migration perfectly illustrates the organizational challenges that adopting IMAP at scale represented.

IMAP’s technical sophistication shows through in its connection state management. The protocol accommodates three modes of use: online, offline, and disconnected operation with resynchronization. This flexibility proved prophetic with the explosion of mobile usage. IMAP already knew how to handle intermittent connections, optimize data transfers, and maintain consistency between client and server—concerns that would become central with the advent of smartphones.

In retrospect, IMAP anticipated cloud computing technologies: storage centralization, multi-device access, transparent synchronization. These characteristics explain its remarkable longevity and adoption by virtually all email providers, from web giants to local hosting companies.

Email clients contributed significantly to IMAP’s success. Pine with its spartan but effective interface, Mulberry with its advanced features, and later Thunderbird, which brought the protocol to a mass audience, each popularized it in turn. Native integration into operating systems and web browsers completed its universal distribution.

IMAP’s impact on professional practices goes beyond the purely technical realm. Shared mailboxes, simultaneous access to messages by several collaborators, and email consultation on the move transformed the organization of work. Long before the smartphone era, IMAP made mobile messaging possible, an essential condition for today’s professional flexibility.

The technical innovations carried by IMAP spread to other domains. Fine-grained management of network connections, robust client-server synchronization, and optimization of data flows inspired many other protocols. This fertility testifies to the quality of IMAP’s design.

Despite competition from webmail and the explosion of instant messaging, IMAP retains a prominent place in the Internet ecosystem. Its ability to evolve while preserving compatibility, its proven robustness, and its open standardization make it an enduring technology. Current developments focus on performance improvement, security reinforcement, and adaptation to new uses such as the Internet of Things.


NSFNet

By the late 1970s, growing frustration gripped the American scientific community. ARPANET connected a dozen privileged universities, but the Department of Defense refused to extend this network to other institutions. This situation created a technological divide between connected establishments and others, deprived of the benefits of digital communication.

Larry Landweber grasped the magnitude of the problem. In May 1979, he convened a meeting at the University of Wisconsin to seek solutions. The assembly discovered that email and file transfer services were transforming the way researchers worked. Already, a few specialized networks were showing the way: THEORYNET brought together 200 theoretical computer scientists, SAMNET united 50 performance analysis specialists. These limited experiments revealed the potential of electronic exchanges for research.

Six months later, a consortium of universities knocked on the National Science Foundation’s door. The project seemed attractive: create a national network accessible to all computer science departments, with reasonable costs and usage-based billing. The infrastructure would rely on commercial networks like Telenet. But the evaluators remained skeptical. Wasn’t there overlap with ARPANET? How would such a project be managed? The NSF preferred to commission a detailed study.

The summer of 1980 saw the birth of a planning committee bringing together nineteen network experts. Two discoveries changed everything. The MMDF software, designed at the University of Delaware, transported messages across different media, including ARPANET and telephone lines. In parallel, DARPA adopted internet protocols that allowed communication between distinct networks. These technical innovations opened unprecedented perspectives: multiple physical networks could form a single logical organization.

A new proposal emerged in the fall of 1980. The consortium expanded to include the universities of Wisconsin, Purdue, Utah, Delaware, and the Rand Corporation. The National Science Board approved the project in early 1981, but imposed one condition: the NSF would direct the operation for two years before handing it over to an independent structure. Contracts were signed in the spring.

In 1985, the NSF launched a more ambitious challenge: connecting its supercomputing centers scattered across the country. NSFNET was born in 1986, with 56 kbps links connecting six strategic sites. The technical innovation relied on LSI-11/73 minicomputers equipped with Fuzzball software. This system integrated internet protocols with sophisticated routing and congestion management mechanisms.

The success exceeded expectations. As early as 1988, link saturation required an upgrade to 1.5 Mbps. The network expanded to regional academic networks, creating a complex web of interconnections. The Fuzzball nodes orchestrated adaptive routing, synchronized clocks with precision, and fairly distributed resources. Software agents filtered routing information between the network core and its periphery.

The rise of the internet transformed the landscape. Between 1993 and 1998, NSFNET evolved into a commercial architecture. Private providers multiplied, pushing the NSF to rethink the network organization in 1993. This new structure endures today. Contracts signed in 1995 established interconnection points between commercial networks. In April 1995, the public NSFNET service closed its doors, replaced by a mesh of private networks.

This transition was accompanied by an unexpected transfer of responsibilities. Since the beginning, the Department of Defense had managed domain names for its military users. In the early 1990s, academic institutions represented the majority of new registrations. The Federal Networking Council then entrusted this mission to the NSF. Faced with exploding demand, registration became fee-based in September 1995. The figures were staggering: 7,500 domains in 1993, 2 million in 1998.

The year 1998 marked the complete privatization of internet’s critical functions. ICANN (Internet Corporation for Assigned Names and Numbers) was created to supervise the domain name system. The NSF refocused on its core mission: supporting research on network protocols and applications. It continues to help isolated institutions connect.

NSFNET transformed the internet. From a confidential military and university network, it became the backbone of a global infrastructure. Its distributed architecture and public-private collaboration mechanisms shaped the current organization of the worldwide network. The technical innovations developed for this project—adaptive routing, precise time synchronization—remain pillars of the internet today.

Beyond technology, NSFNET invented a unique governance model. Universities, companies, and public agencies collaborate to preserve the network’s openness while ensuring its growth. This success demonstrates how a public initiative can trigger the development of infrastructure that has become indispensable to the modern digital economy.


SCSI

In the 1960s, IBM was working on its mainframe computer, the 360, and developed an innovative I/O bus for its time: it could communicate simultaneously with multiple peripherals. This bus evolved to become the OEM Channel, but when IBM submitted it to ANSI for standardization, the institute refused. The reason? Its overly proprietary nature. ANSI preferred to see the emergence of a parallel I/O bus that would meet commercial needs without depending on a single manufacturer.

The story took an unexpected turn in the early 1980s. At Shugart Associates, a hard disk drive manufacturer, a few engineers who would later found Adaptec developed a parallel interface they named SASI (Shugart Associates System Interface). This specification circulated among manufacturers and met with great success. In 1982, building on this adoption, it was presented to ANSI as the basis for a new standard. The institute seized the opportunity, formalized and extended the SASI specifications, but changed its name to SCSI to avoid any reference to a particular manufacturer. On June 18, 1986, SCSI officially became an ANSI standard.

SCSI stood out for its versatility: it could control a multitude of peripherals, from hard drives to printers to CD-ROM drives and scanners. Its architecture interconnected small computers with their intelligent peripherals, particularly storage systems. Performance was remarkable: up to 4 MB/s depending on implementation, with a range that reached 25 meters thanks to differential drivers and receivers.

What made SCSI clever was its logical rather than physical addressing protocol for all data blocks. Each logical unit could be queried to determine its capacity in blocks. This abstraction considerably simplified the management and replacement of peripherals. The protocol provided for the connection of multiple initiators and targets, with a distributed arbitration system built directly into its architecture.
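
A toy model in Python, with invented names and no real hardware behind it, captures the spirit of this abstraction: the host only ever sees fixed-size numbered blocks, never cylinders, heads, or sectors.

    class LogicalUnit:
        """Toy model (not a real driver) of SCSI-style logical addressing."""

        BLOCK_SIZE = 512

        def __init__(self, image: bytes):
            self._image = image

        def read_capacity(self) -> int:
            # In the spirit of the READ CAPACITY command: how many blocks?
            return len(self._image) // self.BLOCK_SIZE

        def read(self, lba: int, count: int) -> bytes:
            # Address by logical block number, independent of physical layout.
            start = lba * self.BLOCK_SIZE
            return self._image[start:start + count * self.BLOCK_SIZE]

    disk = LogicalUnit(bytes(4 * 512))      # a tiny four-block "disk"
    print(disk.read_capacity())             # 4
    print(len(disk.read(lba=2, count=1)))   # 512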

The 1990s revealed an interesting divide in the computer world. Macintosh computers massively adopted SCSI to connect their peripherals, while PCs remained faithful to the IDE/ATA interface. This difference was not trivial: almost all Mac components connected via SCSI, whereas IDE only handled hard drives. On PCs, users had to juggle specific controllers for CD-ROM drives (often integrated into sound cards) and connect tape drives to the floppy disk controller. SCSI avoided this proliferation of interfaces through its single bus.

The race for performance constantly pushed the technology to new heights. The original SCSI operated with an 8-bit bus clocked at 5 MHz, which yielded approximately 5 MB/s. To go faster, two paths were available: widen the bus or speed up the clock. These successive improvements gave birth to SCSI-2, which introduced variants such as Wide SCSI (16-bit bus) and Fast SCSI (10 MHz clock).

HVD (High Voltage Differential) signaling appeared in the SCSI-2 standard to push distance limits. This technique carried each signal over a pair of complementary wires, making transmission far more resistant to electrical noise and allowing cable runs of up to 25 meters. Ultra SCSI, which fell under the SCSI-3 standard, refined the concepts of SCSI-2. Throughput continued its ascent with Wide Ultra SCSI, Ultra2 SCSI, and Wide Ultra2 SCSI. Ultra3 SCSI doubled the effective transfer rate once more while remaining compatible with its predecessors.

SCSI’s speed owed much to its underlying protocol, SCSI Block Commands (SBC). This system authorized continuous data transmission between peripherals with a single command. Take the example of an audio CD: the SCSI controller sent an SBC signal to the drive, which then streamed data to the host controller until the end of the disc or until receiving a stop signal. The drive could redirect its data directly to a hard disk, bypassing the controller and system bus to conserve bandwidth.

Yet, despite its undeniable qualities, SCSI ultimately lost the consumer market battle to IDE/ATA. The arrival of ATAPI (AT Attachment Packet Interface) changed everything: IDE peripherals could now handle devices other than simple hard drives. This extension of the IDE standard, coupled with the high cost of SCSI controllers, precipitated the gradual abandonment of this interface in personal computers. But SCSI did not disappear: it retained its place in professional systems and servers, where its technical characteristics remained essential.

Innovation did not stop there. iSCSI enabled SCSI commands to be sent over local, wide area, or Internet networks, revolutionizing remote access and management of SCSI peripherals. The influence of this interface was also found in other technologies: IEEE-1394 (FireWire) relied on a subset of SCSI-3 specifications to control its peripherals, and FireWire hubs bore a striking resemblance to simplified SCSI adapters.

Today, SCSI’s legacy still permeates modern computer storage. Its logical addressing approach and command architecture have inspired the interfaces that succeeded it. Its ability to efficiently manage peripherals and its robustness in professional environments have made it a foundational technology.


Bash

In the UNIX shell universe, few stories rival that of the Bourne-Again Shell. Bash owes its name to a pun on the name of Steve Bourne, the man behind the original UNIX shell (/bin/sh) of Bell Labs’ Seventh Edition. Brian Fox, working for the Free Software Foundation, launched this project with a clear ambition: to create a free command interpreter that would become the heart of the GNU system.

The exercise proved more challenging than it appeared. Recreating a Bourne-compatible shell wasn’t limited to developing documented features. Fox and his team discovered that Unix users exploited every corner of the original shell, including its most obscure behaviors and undocumented subtleties. The Bourne shell’s grammar lacked precision, turning every detail into a technical challenge.

Despite these difficulties, Bash grew richer. The developers drew from the best of other shells: the advanced features of the Korn shell (ksh) and the innovations of the C shell (csh) came to complement the POSIX-compatible base. This hybrid approach gave birth to a tool that respected standards while offering a modern user experience.

When Chet Ramey from Case Western Reserve University took the reins of the project, Bash entered a maturation phase. Ramey brought a long-term vision and a rigorous development method. Under his direction, the shell gained stability and features. Version 1.13 adopted eight-bit clean operation. Until then, Bash used the eighth bit of characters to mark their status during expansions, which limited the handling of files containing accented characters or special symbols. This correction paved the way for better internationalization.

Bash’s interface revealed the attention paid to user experience. The readline library transformed command input into a true integrated text editor. Users of emacs or vi found their familiar keyboard shortcuts directly in the shell. Automatic completion guessed file names, available commands, and environment variables. Command history, inherited from the C shell, transformed each session into a personal database of reusable instructions.

One of Bash’s strengths lay in customization. Each user could shape their environment according to their preferences through the .bash_profile and .bashrc files. The command prompt adorned itself with colors and contextual information thanks to escape sequences. Some displayed the time, others the current directory or the system load level. This flexibility transformed the terminal from a simple interface into a true personalized dashboard.

Job control inherited from the Korn shell brought a multitasking dimension to the shell. Users juggled between multiple processes, suspended them, resumed them, or relaunched them in the background. Aliases shortened frequent commands, while brace expansion multiplied the possibilities for character string generation. Typing “file{1,2,3}.txt” automatically expanded to the names file1.txt, file2.txt, and file3.txt.

Bash’s distribution illustrated the free software dissemination methods of the 1990s. Source code circulated via anonymous FTP from MIT servers and their mirrors scattered around the world. The Free Software Foundation sold magnetic tapes and CD-ROMs for those who preferred physical media. Early Linux distributions adopted a dual strategy: a full-featured version coexisted with a minimal version intended to directly replace /bin/sh.

Technical evolution never stopped. Bash acquired support for one-dimensional arrays, copying this functionality from the Korn shell. Programmable completion grew more sophisticated, allowing developers to specify custom behaviors according to context. Dynamic loading of built-in commands extended the shell’s capabilities during its execution, transforming the interpreter into an extension platform.

Thanks to its remarkable portability, Bash ran on OS/2, DOS, and Windows NT. QNX and Minix integrated it into their standard distributions. This ubiquity reinforced its position as the reference shell and familiarized a generation of users with its specifics.

The user community shaped Bash as much as its official developers. Thousands of experience reports fed the development cycles. Bug fixes arrived from around the world, transforming each version into a collective effort. This collaborative dynamic gave birth to an exceptionally robust tool adapted to real needs.

Bash ultimately became much more than a simple shell: it embodied the UNIX spirit in its free and democratized version. Its comprehensive documentation, detailed man pages, and active maintenance made it a reliable companion for generations of programmers and system administrators. In web servers, backup scripts, or continuous integration pipelines, Bash continues to silently process millions of daily tasks, a direct heir to a tradition several decades old.


GIF

In 1987, CompuServe launched the GIF format (Graphics Interchange Format) with a simple ambition: to create an image exchange system that would work across all platforms. At that time, sharing a graphic file between different systems was often an uphill battle. CompuServe bet on openness: the format specifications were freely published, which accelerated its adoption within the computing community.

The first version, dubbed GIF87a, incorporated the LZW compression algorithm developed by Terry Welch three years earlier. This process compressed images without losing information. The format supported up to 256 indexed colors, a limitation that seemed constraining but would later reveal its unexpected creative potential. Two years later, CompuServe refined its format with version GIF89a, which added transparency and, most importantly, the ability to create animations through successive images.
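
The core of LZW fits in a few lines. The sketch below, in Python, shows only the general scheme on which GIF builds; a real GIF stream adds variable-width codes, clear and end-of-information codes, and sub-block packing on top of this idea.

    def lzw_compress(data: bytes) -> list[int]:
        """Minimal LZW encoder sketch: the dictionary grows as repeated
        byte sequences are seen, so nothing is lost, only shortened."""
        table = {bytes([i]): i for i in range(256)}
        next_code = 256
        current = b""
        out = []
        for value in data:
            candidate = current + bytes([value])
            if candidate in table:
                current = candidate
            else:
                out.append(table[current])
                table[candidate] = next_code
                next_code += 1
                current = bytes([value])
        if current:
            out.append(table[current])
        return out

    print(lzw_compress(b"ABABABAB"))   # repeated pairs collapse into new codes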

The honeymoon ended abruptly in 1994. Unisys, owner of the patent on the LZW algorithm, suddenly decided to demand royalties for any commercial use. The news hit the software industry like a bombshell. Many developers found themselves trapped, forced to rethink their business models or simply abandon support for the format altogether. This crisis paradoxically stimulated innovation: it led to the creation of the PNG format in 1996, designed as a free alternative to GIF.

But GIF resisted and transformed itself. With the explosion of the Web, its supposed weaknesses became strengths. This 256-color palette, deemed archaic by purists, inspired artists who found in it a minimalist and expressive aesthetic. Loop animation, initially conceived as a simple technical feature, has since become the format’s signature.

During the 2000s, creation tools became democratized while the first social platforms encouraged sharing of visual content. GIF found its calling: to become a universal language of instant emotion. "Reaction GIFs," these film or series excerpts transformed into expressive loops, revolutionized online communication. A few images were now enough to express what paragraphs sometimes struggled to convey.

Tumblr crystallized this phenomenon by encouraging content creation and reappropriation. On this platform, GIF transcended its status as a tool to become an artistic medium in its own right. Creators explored its possibilities, producing works that oscillated between kinetic art and social commentary. A digital generation grew up with this format, favoring immediate visual impact over narrative complexity.

The expiration of LZW patents between 2003 and 2004 definitively freed the format from its legal constraints. This emancipation coincided with the advent of smartphones and mobile social networks. GIF proved perfectly suited to the constraints of mobile communication: compact, universal, instantaneous. Its compatibility with all devices made it a common denominator in a fragmented technological ecosystem.

The art world eventually recognized the cultural importance of the format. MoMA exhibited GIFs in its collections, legitimizing this long-considered minor medium. This institutional recognition accompanied the emergence of new artistic practices such as the cinemagraph, which blends static photography and subtle animation to create unsettling images of suspended reality.

Despite the emergence of technically superior formats like WebM or MP4, GIF maintained its position. Its simplicity paradoxically constitutes its strength: no need for complex codecs, no compatibility issues, no restrictive licenses. This technical accessibility is coupled with cultural accessibility that makes it a democratic expression tool.

The history of GIF illustrates a rare phenomenon in the technological universe: the social appropriation of a technical format beyond its original intentions. What was meant solely to exchange static images has become a central element of contemporary digital culture. The format embodies specific creative practices, a recognizable aesthetic, an entire shared visual vocabulary.

This exceptional longevity in a perpetually changing environment underscores the importance of open and simple standards. GIF demonstrates that lasting innovations are not necessarily the most sophisticated, but those that leave users the freedom to appropriate and repurpose them according to their needs.

The format occupies a singular position at the crossroads of art, communication, and popular culture. Its influence extends far beyond the technical framework to permeate contemporary social and media practices. Thirty-five years after its creation, GIF continues to evolve, driven by the creative uses of a global community that finds in it a means of expression that is both accessible and sophisticated.


Perl

In 1987, Larry Wall was working as a programmer at Unisys. A linguist by training, he spent his days facing the UNIX screens of the era, juggling between sed, awk, and shell scripts to process data. One day, tired of this daily gymnastics between disparate tools, he decided to create his own language. Perl was born from this practical frustration: making UNIX programs communicate without going through countless contortions.

He invented nothing new but drew from what already existed, such as sed’s regular expressions, awk’s processing power, and C’s familiar syntax. But he assembled these elements with a fresh perspective and made Perl into that “glue” language that transformed the drudgery of data conversion into child’s play. Wall had shown his talents as an ingenious tinkerer with rn, one of the first Usenet newsreaders, and patch, that utility which revolutionized the distribution of software fixes.

Perl’s philosophy boils down to a few words: efficiency trumps elegance. Where other languages impose their rigid vision, Perl adapts to the programmer. Its credo, “There’s more than one way to do it,” is a flexibility that sometimes bewilders but seduces a growing community of system administrators and tinkerers of all kinds.

In the early 1990s, while Netscape popularized graphical browsing, Perl established itself as the indispensable tool for processing HTML forms, well before PHP. Its ability to parse text, its decent performance on small scripts, its availability on all platforms made it the ideal companion for early web developers. Amazon, Yahoo!, Slashdot: all these sites shaping the internet relied on Perl.

The language evolved at a slow but steady pace. Version 2 improved regular expressions, version 3 handled binary data. Then came 1994 and version 5. Wall rethought the syntax, added proper object support, and introduced modules. This reference version is the one developers would use for decades to come.

But Perl wasn’t limited to its creator’s whims. A community formed around the language, organized in an original way. The “porters” shared maintenance duties, coordinated by the perl5-porters mailing list. For each version, one of them inherited the “patch pumpkin,” overseeing modifications and releases. This informal governance worked remarkably well.

Perl’s real treasure is CPAN. This archive of reusable modules transformed the language into a digital Swiss Army knife. Need to process images? Connect to a database? Parse XML? There’s definitely a CPAN module for that. This wealth more than compensates for the base language’s flaws.

Wall, with his linguistic training, infused Perl with a dose of humanity. The language tolerates ambiguity, adapts to context, forgives approximations. A Perl program sometimes resembles poorly punctuated English prose. This expressiveness seduces, but it also repels: reading Perl written by another developer sometimes borders on cryptographic deciphering.

The question of licenses then stirred the free software community. Wall settled the matter by proposing a dual license: GPL on one side, Artistic License on the other, everyone chooses according to their needs. This legal flexibility heralded the open source approach that emerged in the late 1990s.

Companies smelled opportunity. O’Reilly & Associates published manuals that would become references. ActiveState adapted Perl for Windows, breaking the UNIX monopoly. These commercial initiatives didn’t undermine the project’s community spirit. On the contrary, they strengthened it by bringing resources and visibility.

The new millennium brought its share of competitors. Python seduced with its clarity, Ruby with its object elegance. These languages drew inspiration from Perl while correcting its supposed flaws. The Perl community didn’t sit idly by. In 2000, it launched the Perl 6 project, an ambitious overhaul of the language. Too ambitious perhaps. The project got bogged down, changed its name to become Raku, while Perl 5 continued its course, unperturbed.

Because Perl 5 had found its niche. Biologists use it to analyze DNA, administrators for their daily scripts, some websites continue to rely on it. The language has lost its star status, but it retains a loyal and competent community. Its conferences still gather hundreds of enthusiasts, its modules continue to enrich CPAN.

Perl showed that a free project could rival commercial solutions, that a dispersed community could maintain complex software, that an imperfect language could render immense services. In the servers running today’s internet, somewhere, Perl code surely continues to process your data, discreetly, efficiently.

Self

David Ungar and Randall Smith launched Self in 1987, a project that would challenge conventional wisdom. This new language deliberately rejected everything that seemed established at the time. Classes were gone, variables forgotten. Only objects and their conversations through messages remained.

This radical approach wasn’t gratuitous. It arose from frustration with the growing complexity of object-oriented systems. Smalltalk-80, though admired for its elegance, carried the burden of its meta-classes and their dizzying interconnections. Self proposed a different vision: why bother with molds when you can sculpt directly in the material?

In this streamlined universe, each object is a container of named “slots.” These slots store data and behaviors indiscriminately. Inheritance relies on simple “parent” pointers that create delegation chains between objects. A system so simple it’s bewildering.

Object creation follows natural logic. Instead of consulting a class blueprint, Self directly clones an existing prototype. This approach resolves at once the famous infinite regression of meta-classes that gave Smalltalk designers cold sweats. One object serves as a model for other objects, period.

Uniformity constitutes the other pillar of this architecture. Whether an object requests stored information or triggers a complex computation, the syntax is identical: sending a message. This consistency blurs the traditional distinction between properties and methods. The programmer navigates a homogeneous world where every interaction follows the same rules.
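
A loose sketch in Python, with invented names, gives the flavor of this model: objects are bags of slots, a parent slot provides delegation, and new objects are obtained by cloning rather than by instantiating a class.

    class Proto:
        """Loose Python sketch of Self-style objects: named slots, a parent
        slot for delegation, and cloning instead of class instantiation."""

        def __init__(self, slots=None, parent=None):
            self.slots = dict(slots or {})
            self.parent = parent

        def lookup(self, name):
            # Every access is the same "message send": walk the parent chain.
            obj = self
            while obj is not None:
                if name in obj.slots:
                    return obj.slots[name]
                obj = obj.parent
            raise AttributeError(name)

        def clone(self):
            # New objects are copies of an existing prototype, not class instances.
            return Proto(self.slots, self.parent)

    point = Proto({"x": 0, "y": 0})          # a prototype, not a class
    p = point.clone()                         # objects beget objects
    p.slots["x"] = 3
    child = Proto({"colour": "red"}, parent=p)
    print(child.lookup("x"), child.lookup("colour"))   # 3 red (x found by delegation)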

Self’s early implementations surprised with their performance. By 1989, the language caught up with and then surpassed Smalltalk in execution speed. This success concealed a discreet but decisive technical innovation. The team invented “maps,” internal structures that optimize how objects are organized in memory. These hidden maps are what would later be called “hidden classes.”

Customized compilation pushed optimization even further. Instead of producing generic code, Self generates machine code tailored specifically for each type of recipient. This dynamic specialization transformed the language’s performance.

Polymorphic inline caches emerged in 1991. These PICs memorize multiple method resolutions for each call site. Concretely, when a message can lead to different processing depending on context, the system keeps previously found solutions in reserve. Polymorphic calls, traditionally expensive, suddenly became affordable.

The 1996 version reached a new milestone with “type feedback” and adaptive optimization. The first mechanism integrates dynamic calls directly into compiled code, reducing method invocations by 70%. The second observes the program in action and reworks on the fly the most heavily used passages.

These technical advances caught Sun Microsystems’ attention. In 1991, the company recruited Ungar and Smith to continue their research. But the story took an unexpected turn when two key project figures, Urs Hölzle and Lars Bak, left Sun in 1994 to create Animorphic Systems. Their mission: develop Strongtalk, a high-performance Smalltalk.

Sun’s acquisition of Animorphic in 1997 sealed the fate of Self’s innovations. They migrated to HotSpot, the virtual machine that would propel Java to the heights. Lars Bak repeated the feat in 2008 with V8, Google’s JavaScript engine, which also drew from Self’s heritage.

This technical lineage constitutes only part of the story. Self also influenced language design. JavaScript adopted prototypes as its primary inheritance mechanism. Lua, Io, and other modern languages drew inspiration from this direct approach where objects beget other objects without intermediaries.

Self’s philosophy of concreteness is reflected in its development environment. Objects appear there as tangible entities, visible and modifiable in real time. This innovative graphical interface transformed programming into a tangible activity. You touch, you move, you transform. Abstraction gives way to direct manipulation.

Simplicity, uniformity, and concreteness are the three principles guiding this vision. Reduce the number of concepts, treat all operations identically, use comprehensible metaphors. These simple ideas produce an environment of surprising richness.

Self never conquered the programming masses. Too radical for some, too experimental for others, the language remained confined to research laboratories. Yet its influence spans decades. The optimization techniques it introduced now allow dynamic languages to rival their compiled cousins. Python, Ruby, PHP benefit from its innovations.

This singular history shows how a quest for simplicity can trigger a cascade of lasting innovations. Self demonstrated that sometimes everything must be torn down to move forward. Its minimalist design anticipated certain current trends where conceptual streamlining takes precedence over feature accumulation.

Self’s technical legacy powers the most performant virtual machines on the market. Its philosophy inspires designers seeking elegance. Nearly forty years after its birth, Self remains a source of inspiration for those seeking to rethink object-oriented programming.


Tcl

John Ousterhout and his students at the University of California, Berkeley, faced a recurring problem. They were designing interactive tools for integrated circuit design—Magic, Crystal, and others—and each required its own command language. The team devoted little effort to these languages. The result was predictable: limited syntaxes, inconsistent with each other, that eventually became cumbersome.

The solution germinated during the fall of 1987, while Ousterhout spent his sabbatical at the DEC research laboratory. Why not create a single interpreter, designed from the start as a library that any application could integrate? This language would provide the basics—variables, loops, procedures—and each program would add its own specialized commands. This extensibility would become the hallmark of Tcl, the “Tool Command Language.”
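
Python happens to ship an embedded Tcl interpreter through its tkinter module, which makes it easy to sketch the model Ousterhout had in mind: a host application embeds the interpreter as a library and registers its own specialized commands. The example assumes a Tcl/Tk installation is available, and the greet command is invented for the occasion.

    import tkinter

    # The host application embeds a full Tcl interpreter as a library.
    tcl = tkinter.Tcl()

    # The basics come with the language...
    tcl.eval('set width 640')
    print(tcl.eval('expr {$width / 2}'))     # -> 320

    # ...and the host extends the command set with its own specialized verbs.
    def greet(name):
        return f"hello, {name}"

    tcl.createcommand('greet', greet)
    print(tcl.eval('greet world'))           # -> hello, world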

The initial development began in early 1988. A few months later, a graphical text editor was already using Tcl. Ousterhout didn’t expect other developers to take an interest in it—a misjudgment that would amuse him later.

Meanwhile, graphical interfaces were drastically complicating programmers’ lives. The 1980s saw increasingly sophisticated visual environments emerge, but their development required massive investments. Ousterhout worried: would small research teams still be able to innovate? The only way out was to assemble complex systems from reusable building blocks.

This reflection led to Tk, the graphical extension of Tcl. The idea was simple: build interfaces by combining ready-made components, with Tcl as the glue between elements. By the end of 1988, Tk development was beginning. It would take two years to obtain something truly usable.

In 1989, Tcl emerged from obscurity. Ousterhout presented it in January 1990 at the USENIX conference, where the reception was enthusiastic. He then decided to put the source code on Berkeley’s FTP server. The Internet did the rest: word of mouth spread Tcl at a surprising speed.

Don Libes of the National Institute of Standards and Technology created the first Tcl application to achieve resounding success. Expect automated interactive UNIX programs—a blessing for system administrators who spent their days juggling between terminals and scripts. Expect’s success brought Tcl to attention far beyond academic circles.

Growth became exponential in the early 1990s. From a handful of users in 1989, the number grew to several tens of thousands in 1993. Two reasons explain this enthusiasm. Tk represented the most direct way to create graphical interfaces under UNIX—five to ten times faster than with Motif. And Tcl met the needs of developers who wanted to make their applications scriptable without the burden of creating their own language.

In 1994, Ousterhout left Berkeley for Sun Microsystems. The company gave him the means to pursue his ambitions: a dedicated team that reached twelve people in three years. This period was prolific. Tcl broke free from UNIX to conquer Windows and Macintosh. The I/O system was redesigned, network sockets integrated. A compiler now transformed scripts into bytecode. Safe-Tcl provided a security model, while a plugin allowed scripts to be executed directly in web browsers.

1998 marked official recognition. Tcl received the ACM award for best system software, a distinction reserved for programs that have had a lasting impact on computing. The same year, the USENIX STUG award came to honor the tool’s technical excellence.

At the end of 1997, Ousterhout had created Scriptics to devote himself entirely to Tcl. The company developed TclPro, a suite of commercial tools, while continuing to distribute the core of Tcl for free. The April 1999 version 8.1 introduced Unicode, thread management, and a new regular expression engine.

The name Scriptics disappeared in May 2000, replaced by Ajuba Solutions to reflect the orientation toward XML. Interwoven acquired the company in October. To guarantee the independence of development, an autonomous core team—the Tcl Core Team—took over during the summer of 2000.

The history of Tcl testifies to the unpredictable trajectory of technical innovations. A modest university project can become a major tool if its design meets real needs and if a community embraces it. Tcl proved the importance of making applications scriptable and demonstrated the value of reusable components for graphical interfaces.


Acorn Archimedes

In the fall of 1987, Acorn unveiled the Archimedes, a computer that would shake up the world of personal computing. Behind this name borrowed from the famous Greek mathematician lay a machine equipped with an ARM processor that ushered in the era of consumer RISC computing.

The story began a few years earlier, when Acorn’s engineers were searching for a 32-bit successor to the venerable 6502. After examining the processors available on the market, they concluded that none met their requirements. The decision was made: they would design their own chip. This technical audacity would give birth to the ARM, a processor of deceptive simplicity but formidable efficiency.

The ARM’s RISC architecture broke away from the growing complexity of processors of the era. Its 44 instructions mostly executed in a single clock cycle. This economy of means produced spectacular results: clocked at 8 MHz, the ARM delivered approximately four MIPS, outperforming many faster-clocked but less efficient processors. The 27 32-bit registers completed this elegant architecture, designed to work with standard dynamic memory.

Acorn developed the Archimedes in two distinct ranges. The A300 series targeted the education market, the natural successor to the BBC Micro that already equipped many British schools. These machines prioritized simplicity: no integrated hard disk controller, but the essentials for learning and discovery. The A400 series addressed more demanding users with its hard disk controller and expanded capabilities.

The A305 opened the lineup with 512 KB of RAM, while the A310 doubled that with 1 MB. At the top, the A440 impressed with its 4 MB of memory and 20 MB hard drive, a configuration that would put many competing machines to shame. These figures reflected Acorn’s ambition: to create a personal computer capable of rivaling professional workstations.

The Archimedes’ genius lay in its integration. Four specialized circuits orchestrated the electronic ballet: the ARM for calculations, the VIDC for display, the MEMC for memory, and the IOC for input-output. This approach drastically reduced the component count while optimizing performance. The results were visible on screen: 256 simultaneous colors, resolutions up to 640x512 pixels in color or 1152x976 in monochrome, and an eight-channel stereo sound system that made the Archimedes a multimedia machine ahead of its time.

In 1989, Acorn renewed its strategy with the A3000, a compact machine that integrated the keyboard into the main unit. This format recalled the Amiga 500 or Atari 520ST, but the ARM’s performance made all the difference. The A400/1 series corrected the early models’ teething problems and improved memory management. The A540, launched the following year, crossed a new threshold with its faster ARM 3 and expanded capabilities up to 16 MB of RAM.

The operating system followed this ambitious hardware evolution. Arthur, the Archimedes’ first OS, inherited the BBC Micro’s spartan approach. Its command-line interface put off novices but won over programmers with its power. Acorn understood that the future belonged to graphical interfaces and launched RISC OS in 1988.

RISC OS transformed the user experience. Its cooperative multitasking interface favored fluidity over absolute robustness. Applications civilly shared the processor, an approach that worked remarkably well thanks to the ARM’s performance. The context menu activated by the middle mouse button became an ergonomic signature, later imitated by other systems.

RISC OS’s modular architecture enabled remarkable extensibility. Modules, whether residing in ROM or loading into memory, enriched functionality without touching the kernel. This flexibility facilitated the integration of new peripherals and system evolution. The unified file system erased the differences between floppies, hard drives, CD-ROMs, and the Econet network, a modern approach that anticipated future needs.

BBC BASIC V, considerably enriched compared to its predecessors, took advantage of the 32-bit architecture for unprecedented performance. Developers discovered a responsive programming environment that transformed learning to code into a pleasure.

The Archimedes conquered British schools, perpetuating the BBC Micro’s educational tradition. Its computing power also attracted scientists and professionals who found a credible alternative to UNIX workstations. Acorn exploited this versatility by extending the ARM architecture toward the professional world with the R140 and R260 machines.

The ARM, born in Acorn’s laboratories to equip the Archimedes, would conquer the world. ARM Ltd., born from this adventure, has since become one of the semiconductor industry’s major players. Today, ARM processors equip virtually all our smartphones and tablets, retrospectively validating the British engineers’ bold choices.

The Archimedes’ legacy exceeds its modest sales. This machine demonstrated that an original approach to design could produce exceptional results. Its RISC architecture durably influenced the industry, its specialized circuits anticipated modern integration, and its operating system prefigured contemporary graphical interfaces.

The Archimedes remains a model of technical coherence. Every element—processor, memory, graphics, sound, operating system—participated in a harmonious overall vision. This global approach, rare in the computer industry, produced a machine with remarkable performance and refined ergonomics.


Apple Macintosh II

March 1987: Apple upends its established practices with the Macintosh II. Three years after the first Mac, this machine breaks away from the conventions set by Steve Jobs. Gone is the compact, closed case; in its place stands an architecture open to expansion. This about-face addresses a necessity: IBM is gaining ground with its PCs, and Apple must attract professional users.

The machine’s heart beats with a Motorola 68020, clocked at 16 MHz. Apple pairs it with the 68881 math coprocessor, a combination that quadruples performance compared to previous Macs equipped with the 68000. Base RAM is 1 megabyte, expandable to 8 MB, and up to 20 MB with a specialized kit. For storage, there’s an 800 KB floppy drive and, optionally, a 40 to 80 MB hard disk.

But the true revolution of the Macintosh II lies in its graphics capabilities. Color display makes its debut in the Mac lineup with a theoretical palette of 16.7 million shades. Only 256 display simultaneously, admittedly, but this already represents a considerable leap. More boldly, the machine accepts multiple monitors. Users choose between the 12-inch monochrome screen and the 13-inch color model, the latter exploiting Sony’s Trinitron technology and its reputation for excellence.

The exterior appearance breaks with Mac tradition. The horizontal case evokes IBM PCs rather than the vertical elegance of early Macintoshes. This form houses a motherboard equipped with six NuBus expansion slots. These slots accommodate graphics cards, SCSI controllers, or CP/M and Pascal emulators. Apple maintains its legendary simplicity: no jumpers to position, no DIP switches to set. Cards configure themselves automatically.

Connectivity doesn’t skimp on possibilities. Two serial ports, two Apple Desktop Bus connectors for peripherals, a SCSI interface, and a stereo audio output equip the machine. Sound deserves particular attention thanks to its custom chip. Four stereo voices with 44.1 kHz sampling: remarkable quality for the era.

MultiFinder accompanies this hardware power. This system software extension allows simultaneous execution of multiple applications. Printing while downloading, calculating in the background during writing—all these possibilities represented innovation in 1987. The graphical interface retains Mac clarity while exploiting color.

The price reflects the product’s ambition: $5,498 with hard drive, equivalent to over $14,000 today. This pricing clearly targets professionals and affluent users. Results follow: desktop publishing, graphics, and engineering sectors readily adopt this machine.

Beyond sales figures, the Macintosh II establishes lasting standards. The importance of display quality, controlled modularity, configuration automation: these principles still guide contemporary computer design. Proof that a computer can combine power with ease of use, a philosophy that spans decades at Apple.

Production ends in January 1990, replaced by the more powerful Macintosh IIx and IIfx. The legacy endures in the professional workstations that follow. This attention to display and sound heralds the multimedia era of personal computing.

The Macintosh II crystallizes a particular moment in Apple’s history. The company demonstrates its ability to adapt to professional market constraints without abandoning its convictions. This search for balance between commercial openness and distinct identity still characterizes Apple’s development strategy.


Commodore Amiga 500

Jay Miner had only one idea in mind: to build his dream computer around the Motorola 68000 processor. This Atari engineer kept hitting walls with his management, who found his project too expensive. A breakup was inevitable. In 1982, Miner walked out and founded Hi-Toro, soon renamed Amiga. The bet? Design a machine with unprecedented graphics and sound performance, featuring real multitasking, without breaking the bank for buyers.

The Amiga team worked in a bohemian spirit that contrasted sharply with the rigidity of large firms. Engineers would gather armed with foam bats to “vote” against bad ideas during their wild brainstorming sessions. This relaxed atmosphere unleashed creativity and pushed everyone to rethink personal computing. Prototypes piled up, made of gigantic printed circuit boards simulating future custom components. Innovation was boiling, but money was desperately scarce.

1984 sounded the alarm: Amiga was on the brink of bankruptcy. Atari offered a $500,000 loan, but the trap lay in the conditions. If the sum wasn’t repaid by the end of June, Atari would recover all the technology. A nightmare scenario for Miner and his team. At the last moment, Commodore emerged from the shadows with a $24 million buyout offer. The rescue came at a price: Amiga’s independence evaporated, but the technology survived.

Commodore brutally accelerated development. The Amiga 1000 arrived in 1985, a technically dazzling machine but a commercial failure. Its price discouraged families, while its positioning confused professionals. Thomas Rattigan took the helm at Commodore in 1986 and got straight to the point: the range would split into two distinct branches. On one side, the Amiga 500 to appeal to the general public. On the other, the Amiga 2000 for demanding users.

The Amiga 500 was born in 1987 from this differentiation strategy. Engineers took the 1000 model’s architecture and cut costs mercilessly. The keyboard integrated directly into the case, the power supply moved outside, direct TV output gave way to a separately sold adapter. But the technological core remained intact: three specialized chips named Agnus, Denise, and Paula orchestrated an unprecedented multimedia spectacle.

Agnus managed memory and visual effects, Denise handled display, Paula supervised sound and input-output. This electronic trinity delivered 4,096 simultaneous colors when competitors struggled to display a handful. Hardware sprites animated game characters with striking fluidity. Stereo sound across four sampled channels made other machines’ electronic beeps forgettable. The Motorola 68000 processor, clocked at 7.14 MHz, ran a preemptive multitasking operating system worthy of professional workstations.

AmigaOS completely rethought human-machine interaction. The Workbench interface impressed with its modernity, while the Exec kernel juggled concurrent tasks effortlessly. With the Intuition API, RJ Mical created a masterpiece of programming that radically simplified interface creation. The system handled multiple screens of different resolutions, a unique technical feat at the time.

Game developers rushed to this unprecedented power. With its twelve parallax scrolling planes gliding at different speeds, Shadow of the Beast was mesmerizing. Defender of the Crown dazzled with its digitized illustrations of stunning realism. Musicians invented the MOD format, transforming the Amiga into a home recording studio. Deluxe Paint opened the doors of digital graphic creation to amateur artists.

A passionate community crystallized around the machine. Specialized magazines like Amiga World testified to this wild enthusiasm. Programmers exchanged tips and tricks to extract every last drop of performance from the hardware. Some completely disabled the operating system for total control, using the graphics processor as a debugger by changing the background color at strategic points in the code.

Europe massively adopted the Amiga 500. In the United Kingdom, it was the absolute reference for family gaming. Its affordable price and multimedia capabilities attracted budding graphic designers and musicians. A few years earlier, Andy Warhol had already seized on the Amiga as an electronic canvas to explore the possibilities of digital art.

Yet Commodore squandered this success through disastrous management. The company accumulated strategic errors in the face of rising IBM-compatible PCs. Thomas Rattigan’s dismissal in 1987 deprived the company of any clear vision. The Amiga 600 that followed disappointed with its lack of ambition, merely a cosmetic update of an aging architecture.

The Amiga 500 remains in history as the symbol of technical innovation that took precedence over marketing considerations. Preemptive multitasking, hardware graphics acceleration, multi-channel sampled sound: concepts that wouldn’t become mainstream until a decade later. Its legacy endures in the video game industry and multimedia creation tools.

The loyalty of its community defies time. Teams like the Iraqi developers of Babylonian Twins pulled out their old Amiga projects years later to adapt them for smartphones. This determination testifies to the mark left by a machine that pushed the boundaries of personal computing, to the point of making certain professional computers obsolete.


Sharp X68000

In 1987, while the Japanese computer landscape teemed with diverse machines – MSX, NEC PC-88, PC-98, Fujitsu computers – Sharp took a different direction. The company decided to create an uncompromising computer, even if it meant selling it at a premium price.

The Sharp X68000 displayed its ambitions from first glance. Its dual-tower case, equipped with a retractable handle, evoked high-end Hi-Fi equipment more than a traditional computer. This futuristic aesthetic drew inspiration from the Sharp X1 Twin, which already combined an X1 computer and a PC Engine console. The industrial design reached a milestone here: the computer became an object of desire.

Under this distinctive hood lay precision machinery. The Motorola 68000 processor ran at 10 MHz, supported by 1 MB of RAM expandable to 12 MB and 1 MB of dedicated video memory. These specifications placed the machine in a class of its own. The X68000 displayed 65,536 simultaneous colors and handled up to 128 hardware sprites – figures that made video game developers’ mouths water.

The price reflected these capabilities: 369,000 yen, approximately $6,000 in today’s dollars. Sharp wasn’t targeting the general public but an audience of knowledgeable enthusiasts and professionals. This strategy materialized through the development of tools and libraries specifically designed for game creators. The gamble paid off: the X68000 hosted versions often superior to the arcade originals of Final Fight, Ghouls’n Ghosts, and Street Fighter II.

The Human68k operating system, designed by Hudson Soft, borrowed from MS-DOS while adding its own innovations. Early machines used the VS interface, later replaced by SX-Window on subsequent models. A third interface, Ko-Window, resembled Motif and was developed by third parties.

The X68000’s story unfolded over the years through a succession of models. In 1988 came the ACE, followed the next year by the EXPERT which included 2 MB of RAM as standard. The PRO model adopted a conventional PC case to accommodate more expansion slots. 1990 marked the arrival of the SUPER versions, which replaced the SASI interface with SCSI. The XVI of 1991 pushed the processor to 16 MHz, while the Compact XVI of 1992 offered a more compact format with 3.5-inch drives. The series peaked in 1993 with the X68030, powered by a 68030 processor at 25 MHz.

The richness of the X68000 ecosystem remains impressive. MIDI expansion cards allowed connection to synthesizers like the Roland MT-32. The TS-6BGA graphics card combined acceleration and PCM sound. Some accessories now reach heights in terms of rarity, such as the POLYPHON card which integrated FPU, MIDI, PCM, and 8 MB of RAM in a single module. For controls, joysticks included the Cyber Stick for simulations and the surprising XE-1 with its unconventional design.

The creativity of the homebrew community produced remarkable works. Cho Ren Sha 68k, developed by two people in 1995, remains a striking example. This shoot-’em-up displayed 512 simultaneous sprites through advanced multiplexing techniques – a technical feat all the more remarkable as it was programmed primarily in C rather than assembly. This achievement testifies to the quality of the development tools made available.

Yet the X68000 adventure ended in 1993. Sharp had not significantly evolved the video and audio hardware for six years. Facing the rise of PCs equipped with powerful graphics and sound cards, the machine lost its appeal despite its intrinsic qualities. A PowerPC version project was also abandoned. But an unexpected gesture changed everything. In 2000, Sharp, Hudson, and other involved companies released the Human68k operating system, BIOS, and various tools into the public domain. This exceptional decision keeps this system accessible through legal and faithful emulation of the machine.

The X68000 embodies a particular Japanese vision of 1980s personal computing. Where others favored compromise between price and performance, Sharp chose technical excellence and bold design. Its ability to reproduce the arcade experience, exceptional graphics performance, and advanced architecture make it a machine apart. However, power supply reliability issues and its prohibitive price limited its distribution.


OS/2

In 1987, IBM and Microsoft forged an unlikely alliance to create OS/2, the operating system designed to run on IBM’s new PS/2 computers. This collaboration between two giants stemmed from a simple observation: DOS was showing its limitations against Intel 80286 processors. They needed to do better, to be more modern. The idea looked attractive on paper, but it already concealed latent tensions between two diametrically opposed visions of computing’s future.

OS/2 delivered what users of the time were missing: a proper graphical interface and features previously reserved for mainframe systems. It shared certain traits with Windows, UNIX, or Xenix, but its ambitions exceeded those of its competitors. This first iteration gave no hint of the resounding divorce that would shake the industry.

The two partners were looking in opposite directions. Microsoft was eyeing Windows, which it considered the future. IBM wanted to continue with OS/2. This growing opposition finally exploded in 1992, causing what observers called a "divorce." Microsoft walked away with part of the code, which it recycled into Windows NT and Windows 95. IBM found itself alone with its system.

To bounce back, IBM turned to Commodore Business Machines and drew inspiration from AmigaOS. OS/2 version 2.0, launched in March 1992, introduced the Workplace Shell, a transformation in the user interface. This innovation marked the entry of object-oriented programming into the mainstream world of operating systems. A first that didn’t go unnoticed.

The Workplace Shell shook up established habits. Dragging and dropping icons into folders, accessing menus through right-clicks, triggering printing by simply moving an icon: so many gestures that changed human-machine interaction. The catch came from hardware limitations, as computers lacked RAM, forcing users into costly upgrades to fully exploit these innovations.

1994 saw the birth of OS/2 Warp, the version that would mark the system’s peak. This release appealed through its ease of use and 32-bit architecture, while maintaining DOS compatibility. A significant technical asset against the Windows versions of the time, mired in their 16-bit DOS base. Yet despite these undeniable qualities, OS/2 couldn’t break through in the consumer market.

The system found refuge in specialized niches, particularly bank ATMs where its stability and reliability worked wonders. Microsoft, meanwhile, deployed an aggressive commercial strategy to impose Windows, relegating OS/2 to these technical niches. A frustrating situation for IBM, which saw its system confined to specialized uses.

IBM finally threw in the towel, halting OS/2 sales in 2005 and ending support in 2006. Die-hard fans of the system demanded its release under a free license, but this approach ran up against Microsoft’s intellectual property rights on certain parts of the code.

The story of OS/2 doesn’t end there. In 2001, Serenity Systems obtained a license from IBM and marketed the system under the name eComStation. Later, in 2015, Arca Noae picked up the torch with ArcaOS.

OS/2 introduced innovations that are part of our digital daily life. Its rigorous design and pursuit of stability established new benchmarks. Yet its commercial failure reminds us that technical qualities aren’t enough: commercial strategies, industrial alliances, and the ability to build an ecosystem largely determine a technology’s success.


Unicode

In 1968, ASCII became a standard in the United States. The principle was to assign a numerical value between 0 and 127 to each character. The letter “a” takes code 97, “Z” takes code 90. Simple, effective... but terribly limited. It was impossible to write “naïve” or “café” correctly with this system designed for English.

During the 1980s, personal computing on 8-bit machines took off. The possible values extended up to 255. Manufacturers and publishers rushed to fill these 128 additional positions with accented characters. Each came up with their own solution. Commodore placed French accented letters where it saw fit; IBM did the same with its code page 437. A file created on one machine was unreadable on another. What looked like French on an Amstrad CPC turned into hieroglyphics on a PC compatible.
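
A short Python sketch (purely illustrative; "latin-1" and "cp437" are the standard codec names for ISO Latin-1 and IBM code page 437) makes the mismatch concrete: the same bytes read back differently depending on the code page assumed.

```python
# The same bytes, interpreted under two different 8-bit code pages.
data = "café".encode("latin-1")   # b'caf\xe9' as written on a Latin-1 machine
print(data.decode("latin-1"))     # café   - correct on the machine that wrote it
print(data.decode("cp437"))       # cafΘ   - 0xE9 is the Greek letter theta in CP437
print(ord("a"), ord("Z"))         # 97 90  - the ASCII codes mentioned above
```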

Conventions gradually emerged. The International Organization for Standardization standardized certain encodings, others gained dominance through commercial use. But the fundamental problem remained: 256 characters was laughable compared to actual needs. It was impossible to integrate French accents and the Cyrillic alphabet simultaneously. Users juggled between encodings depending on the language of the moment: KOI8 for typing in Russian, Latin-1 for French. Writing a text quoting Tolstoy in the original version was a technical feat.

In the late 1980s, a few visionaries embarked on an ambitious project: creating a system capable of representing all characters from all human languages. Unicode was born from this ambition, with the initial idea of moving to 16-bit characters. Instead of 256 positions, 65,536 slots became available. Enough room for Chinese, Arabic, Hebrew, Cyrillic, and Latin alphabets within the same system.

ISO worked in parallel on its 10646 standard. Two projects, one objective. Fortunately, the teams eventually merged their efforts with Unicode version 1.1. The 65,536 positions proved insufficient, and the modern specification extends to 1,114,112 possible code points, the highest being 0x10FFFF in hexadecimal notation.

Unicode goes far beyond a simple character catalog. The standard defines complex properties and usage rules. The responsible consortium publishes a manual of over 1,000 pages. Added to this are 14 technical annexes, 7 standards, 6 technical reports, and 4 stabilized reports. Hyphenation, bidirectional rendering, vertical layout: everything is covered. The scope of this documentation reveals the true complexity of multilingual text processing.

But in practice, how do you store these characters in memory? The brute-force solution reserves 32 bits per character, but that quadruples disk space and bandwidth compared to ASCII. Not to mention portability problems between processors that organize bytes differently.

UTF-8 provides the elegant answer. This format uses a variable number of bytes depending on the character. A single byte for ASCII (values below 128), two bytes for values up to 2,047, three or four bytes beyond. ASCII compatibility preserved, no troublesome null bytes, possibility of resynchronization in case of corruption: the advantages pile up.
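
A minimal Python sketch illustrates this variable length by encoding characters of increasing code point value:

```python
# How many bytes UTF-8 spends on characters of increasing code point value.
for ch in ("A", "é", "€", "𝄞"):                      # U+0041, U+00E9, U+20AC, U+1D11E
    encoded = ch.encode("utf-8")
    print(f"U+{ord(ch):04X} -> {len(encoded)} byte(s): {encoded.hex(' ')}")

# U+0041 -> 1 byte(s): 41
# U+00E9 -> 2 byte(s): c3 a9
# U+20AC -> 3 byte(s): e2 82 ac
# U+1D11E -> 4 byte(s): f0 9d 84 9e
```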

UTF-16 also exists but remains less popular. It encodes characters on two or four bytes with a special mechanism called “surrogate pairs” for characters outside the basic multilingual plane. A byte order mark indicates the organization of the data in memory.
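
The same treble clef character from the previous sketch shows both mechanisms at work (Python again, illustrative only): outside the basic multilingual plane it becomes a surrogate pair, and the byte order changes with the chosen variant.

```python
ch = "𝄞"                                   # U+1D11E, outside the basic multilingual plane
print(ch.encode("utf-16-be").hex(" "))      # d8 34 dd 1e -> surrogate pair D834 / DD1E
print(ch.encode("utf-16-le").hex(" "))      # 34 d8 1e dd -> same units, opposite byte order
print("é".encode("utf-16-be").hex(" "))     # 00 e9       -> a BMP character fits in one unit
```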

Standard updates sometimes create turbulence. Amendment 5 to ISO 10646 moved and extended the Korean Hangul block, invalidating existing data. This episode, dubbed “the Korean mess”, pushed standardization committees to commit against this type of incompatible modification.

Unicode also handles sophisticated mechanisms. Characters can combine, as when an accent modifies its base letter. The standard defines bidirectional rendering rules for mixing Arabic text (right to left) and Western text (left to right) in the same document.
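
The combining mechanism can be seen with Python’s standard unicodedata module: “é” may be stored as one precomposed code point or as a base letter followed by a combining accent, and normalization reconciles the two forms.

```python
import unicodedata

precomposed = "\u00e9"      # é as a single code point
combined = "e\u0301"        # 'e' followed by COMBINING ACUTE ACCENT
print(precomposed == combined)                                # False: different code points
print(unicodedata.normalize("NFC", combined) == precomposed)  # True: same canonical form
```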

The adoption of Unicode transformed computing. Operating systems, Internet protocols, file formats: everything shifted toward this unified representation. Software internationalization became greatly facilitated. Exchanging a multilingual document finally became natural.

The Unicode Consortium continues its enrichment work. New characters to cover historical scripts, support for emerging needs: the standard constantly evolves. The arrival of emojis illustrates this capacity to adapt to new uses of digital communication. Each addition requires meticulous care to preserve overall coherence.

Unicode marks a breakthrough in the history of textual computing. This standard establishes a solid technical foundation for multilingual processing. Its universal adoption proves that it is possible to create complex international standards that effectively serve all users on the planet.


FrameMaker

In 1986, an astrophysics student at Columbia University dropped out to embark on the adventure of technical publishing. Charles Corfield had just developed a prototype called /etc/publisher and was convinced he could do better than what existed at the time. Dissatisfied with the available tools and eager to secure a stable financial future, he left university to devote himself entirely to programming.

This decision marked the birth of Frame Technology, a company founded by four individuals with complementary profiles. Alongside Corfield and his expertise in mathematics and development, Steven Kirsch brought entrepreneurial experience forged at Mouse Systems. David Murray mastered the intricacies of user interfaces and publishing, while Vickie Blakeslee excelled in organization and operational management. This combination of diverse expertise created fertile ground for innovation.

The technical publishing market was then dominated by Interleaf, which ruled the roost on UNIX workstations. Its system, sold for around $30,000, required complex installation and imposed its own interface standards. Facing this monopoly, Frame Technology chose a radically different approach. FrameMaker was offered at $2,500 with a native interface to operating systems and remarkably simple installation.

This disruptive strategy was built on remarkable technical innovations. FrameMaker succeeded in combining the functionality of word processors and desktop publishing software. The program handled automatic numbering of chapters and sections, real-time cross-references, offered sophisticated table management and advanced multilingual support. Its ability to handle large documents spread across multiple files appealed to technical writers. The architecture based on the concept of frames offered unprecedented flexibility in integrating text and graphics.

Success came quickly. From beta version 0.6, marketed in 1986, the company sold several hundred licenses. The first customer, an engineering group at John Deere, paved the way for rapid adoption. U.S. federal agencies were attracted by this approach not tied to specific hardware. Partnerships with UNIX workstation manufacturers like Sun Microsystems accelerated this expansion.

Initially developed for UNIX, FrameMaker was adapted for Macintosh and then Windows. This portability was based on an innovative software architecture, the Device Independent Maker, which separated the core code from interfaces specific to each system. The MIF (Maker Interchange Format) exchange format ensured interoperability between platforms, a growing concern in an increasingly heterogeneous computing world.

Frame Technology also innovated commercially. The shared licensing system allowed multiple users to share a limited number of simultaneous access points, significantly reducing costs for organizations. The company developed strategic partnerships, notably with Toshiba, which generated over $5 million in royalties. This flexible approach to the business model greatly contributed to the software’s adoption.

The company’s growth was spectacular. From 5 employees in 1986, Frame Technology grew to 297 people in 1991. Revenue followed a similar trajectory: $3.4 million in 1987, $41.7 million in 1991. The February 1992 IPO valued the company at $146 million. Adobe Systems acquired Frame Technology in 1995 for approximately $500 million, thus cementing the success of this entrepreneurial venture.

The corporate culture was distinguished by its openness and diversity. The teams included a remarkable proportion of women, minorities, and LGBT people, including in management positions. This diversity nurtured creativity and stimulated innovation. Project management was inspired by the approach described by Richard Feynman in Surely You’re Joking, Mr. Feynman!, favoring open discussion followed by clear and decisive decisions.

FrameMaker’s impact on the industry far exceeded its creators’ expectations. The software democratized access to professional technical publishing, previously reserved for large organizations with substantial budgets. It established new standards for user interface and operating system integration. Its modular architecture and advanced structured document management had a lasting influence on the development of future publishing tools.

Three decades later, FrameMaker remains a reference tool in technical publishing, used by over 700,000 people in 30,000 companies. Adobe continues to evolve the software, adapting it to contemporary needs with support for formats like XML and DITA, as well as multichannel publishing to HTML5 and digital formats.


IBM AS/400

While enterprise computing was fragmenting among a multitude of incompatible systems, IBM was secretly preparing a machine that would shake up the established conventions. The AS/400 would be born in 1988 in the Rochester, Minnesota laboratories, far from the major research centers of the American East Coast. This particular gestation partly explains why this architecture developed such unique characteristics.

Frank Soltis’s team had been working since 1978 on the System/38, a computer with pioneering concepts but modest commercial success. The Rochester engineers had chosen a radically different path from that explored by the creators of UNIX, VMS, or the future designers of Windows NT. Their approach remained so confidential that IBM did not share all the secrets of this architecture with its other divisions.

When the AS/400 was unveiled on June 21, 1988, the scope of the announcement came as a surprise. Six processor models were offered simultaneously, accompanied by more than 1,000 ready-to-use applications. This immediate software availability constituted a record in the computer industry. The promised performance was impressive: compared to previous generations, memory capacity was multiplied by 24, storage by 48, and computing power by 10.

The real innovation lay in the internal architecture. The AS/400 was based on a technology-independent machine interface called TIMI. Programs no longer communicated directly with the processor but with a virtual machine. This abstraction enabled a technical tour de force in 1995: IBM replaced the 48-bit CISC processors with 64-bit RISC PowerPC chips without any existing application requiring the slightest modification. All programs continued to function as if nothing had happened.

The system organized everything as objects. Files, programs, printers, users: each element had a precise description of its authorized uses and function. This object-oriented design, unusual for the time, strengthened security and simplified management. An administrator could manipulate any system component with the same standardized commands.

Integration constituted the other pillar of this architecture. Where systems laboriously assembled databases, security, communications, and backup tools from different vendors, the AS/400 natively incorporated all these functions into its operating system. DB2/400 managed data, security controlled every access, communication protocols were integrated. This unity drastically simplified daily administration.

Memory management revolutionized usual practices. The “single-level store” created a unique 64-bit address space, approximately 18 quintillion bytes. Programs and their data received permanent addresses in this gigantic space. No more juggling between main memory and secondary storage: the system automatically handled placing objects in the right location according to their usage.

The 1990s saw the AS/400 evolve to adapt to new needs. In 1994, the Advanced Series integrated Lotus Notes and Internet access. Portable models appeared, as well as versions intended for small organizations. Reliability improved by a factor of 20 between 1988 and 1992, while performance progressed by 30% each year.

Commercial success accompanied technical excellence. 250,000 systems from previous lines (System/34, /36, and /38) were already installed at launch. In 1992, IBM delivered the 200,000th AS/400 to Heineken brewery. These figures testified to massive adoption in companies worldwide.

The machine’s identity evolved through IBM’s acquisitions and marketing strategies. AS/400e, then eServer iSeries, eServer i5, System i5, System i, and finally IBM i in 2008. This last name emphasized the system’s new capabilities, now able to simultaneously execute IBM i, AIX, and Linux on the same POWER hardware.

The development environment progressively modernized. The Integrated Language Environment replaced the old programming model by improving modular program performance. RPG and COBOL, the platform’s historical languages, coexisted with C/400, Pascal, and Java. The command-line interface, recognizable by its mnemonics like WRKOBJ (“work with object”), survived all changes while enriching itself with modern graphical interfaces.

This exceptional longevity reveals the relevance of a different vision of enterprise computing. While most manufacturers developed modular systems assembling heterogeneous components, IBM Rochester prioritized integration, consistency, and ease of use. The AS/400 proves that an alternative approach to dominant standards can survive and prosper for more than three decades.


Internet Relay Chat

In 1988, at the University of Oulu, a Finnish student named Jarkko Oikarinen was working on a problem that frustrated him: how to enable multiple people to converse simultaneously on BBS systems, those electronic bulletin boards that preceded the Web? Existing tools like talk or rmsg under UNIX were limited to two-person conversations. Oikarinen then imagined something different: discussion rooms where anyone could enter and participate in the general conversation.

This idea gave birth to Internet Relay Chat, better known by the acronym IRC. For the first time, dozens of people could exchange messages in real time in a virtual space. Oikarinen called these spaces “channels,” identified by the hash symbol that would later become famous on other platforms.

IRC’s first months remained confidential. The network was confined to Scandinavia, constrained by the fledgling international connections of the era. The situation changed when Mike Jacobs, from MIT, managed to connect to Oikarinen’s system. This transatlantic contact opened the floodgates: the code circulated, new servers were added, the network grew.

In January 1991, during the Gulf War, IRC experienced its first massive use during a global event. Users gathered to relay war information in real time, reaching over 300 simultaneous users for the first time. The logs of these exchanges are preserved in the ibiblio archives.

In August 1991, during the attempted coup against Mikhail Gorbachev, IRC played its part. During the media blackout imposed by the putschists (August 19-21), users in Moscow used IRC to transmit information in real time to the rest of the world, thus circumventing censorship. The logs of these exchanges are also preserved in the archives. Sources sometimes mention IRC use during the Russian constitutional crisis of September-October 1993, but this information remains to be confirmed by primary sources.

Success attracted ambitions and disagreements. Quarrels erupted over the management of IRC’s single network. Who could create new servers? According to what criteria? Tensions mounted until the breaking point. Some users broke away to create Undernet, leaving Oikarinen’s original server to form the basis of EFnet. These two rival networks continued to exist, joined by hundreds of others born from the same divisions.

IRC had indeed fragmented. Each network developed its own rules, its own servers, its own code. Programmers experimented, modified, adapted. From this ferment emerged three major technical families: Undernet’s P10 protocol, ircd-hybrid for EFnet, bahamut for DALnet. The technical diversity reflected IRC’s decentralized philosophy, where no one controlled the whole.

This decentralization came at a price. IRC networks regularly suffered from netsplits, those temporary disruptions that divided a network into several pieces. When two servers lost contact, their users found themselves separated until the connection was reestablished. These incidents, initially perceived as annoying failures, gradually became an accepted characteristic of the system.

IRC’s architecture remained strikingly simple. Messages traveled in plain text, without encryption or persistent storage. This technical lightness partly explained the protocol’s robustness, which also worked on sluggish connections with servers of limited resources. Where other systems demanded powerful servers and fast connections, IRC made do with little.
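
A minimal sketch in Python (the server name and channel are placeholders, not real endpoints) shows how little is needed to speak the protocol: a TCP socket and a handful of plain-text commands such as NICK, USER, JOIN, and PRIVMSG, each line ending in CR LF.

```python
import socket

SERVER = "irc.example.net"   # placeholder server
CHANNEL = "#retro"           # channels carry the famous '#' prefix

with socket.create_connection((SERVER, 6667)) as sock:   # 6667: the traditional IRC port
    def send(line: str) -> None:
        sock.sendall((line + "\r\n").encode("utf-8"))     # every command ends with CR LF

    send("NICK epoch_reader")                             # pick a nickname
    send("USER epoch 0 * :Epoch Reader")                  # identify the client
    send(f"JOIN {CHANNEL}")                               # enter a discussion channel
    send(f"PRIVMSG {CHANNEL} :Hello from 1988!")          # talk to everyone in the channel

    while data := sock.recv(4096):                        # server replies are plain text too
        for line in data.decode("utf-8", errors="replace").splitlines():
            if line.startswith("PING"):                   # answer keepalives or be disconnected
                send("PONG " + line.split(" ", 1)[1])
            print(line)
```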

Users compensated for the protocol’s limitations through their creativity. They developed chat bots, conversation logging systems, sophisticated graphical interfaces. The software ecosystem around IRC flourished. Everyone programmed their own tools, shared their scripts, improved those of others.

This culture of tinkering marked IRC. The protocol influenced the platforms that would succeed it. The @ of channel operators, the # of room names—these conventions born in the 1980s would reappear much later on Twitter and other social networks. IRC invented a symbolic language that the mainstream Web would eventually adopt.

Designed for a few hundred users at most, IRC exceeded its theoretical limits. At the height of its success, the four largest networks brought together over 500,000 simultaneously connected people. The servers held firm, the technology adapted, communities prospered.

Far more than a simple discussion tool, entire communities developed with IRC, with their codes, their habits, their rituals. Free software developers found a place for technical exchange. Computing enthusiasts gathered there to debate the latest innovations. Friendships were formed, projects were launched, collaborations were organized.

IRC survives discreetly in the shadow of the Web social giants. Younger generations often ignore it, preferring more colorful interfaces and richer features. But technical communities remain faithful to it. On thousands of channels, conversations continue, projects are developed, knowledge is transmitted. IRC embodies a certain idea of the Internet: decentralized, open, controllable by its users.


SNMP

By the late 1980s, computer networks were experiencing unprecedented growth. This expansion revealed a thorny problem: each manufacturer was developing its own management tools, creating a real headache for administrators. IBM had its solutions, Cisco had its own, and none spoke the same language. Juggling between these multiple interfaces bordered on acrobatics, especially since some tools were limited to simple cryptic commands.

The Internet Architecture Board grasped the magnitude of the challenge. In 1988, the organization published recommendation RFC 1052, which outlined the contours of a unified standard for Internet network management. The mission was assigned to the IETF with a tight deadline: 90 days to design what would become SNMP. The constraints were clear: simplicity of implementation and inspiration from ISO’s CMIP protocol.

Three RFC documents emerged simultaneously that year. The first introduced the Structure of Management Information, an abstract language based on ASN.1 for formally describing management data. The second defined the Management Information Base, that tree-structured database where all monitored objects would reside. The third detailed the operational mechanisms of the protocol itself.

The architecture relied on manager-agent logic of disarming simplicity. On one side, agents dispersed across each device tirelessly collected local information. On the other, a central manager queried these agents on demand or received their spontaneous alerts, called traps. This asymmetry worked remarkably well in early installations.
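
The following deliberately simplified Python sketch models only this manager-agent split, not the wire encoding: a dictionary stands in for the agent’s MIB, keyed by OIDs (sysDescr 1.3.6.1.2.1.1.1.0 and sysUpTime 1.3.6.1.2.1.1.3.0 are standard MIB-II objects; the device and its values are invented for illustration).

```python
# Toy model of the SNMP manager/agent relationship (no BER encoding, no UDP transport).

class Agent:
    """Answers GET requests for the OIDs it knows, guarded by a community string."""
    def __init__(self, community, mib):
        self.community = community
        self.mib = mib

    def get(self, community, oid):
        if community != self.community:    # SNMPv1 "security": a shared secret sent in clear text
            return None
        return self.mib.get(oid)

router = Agent("public", {
    "1.3.6.1.2.1.1.1.0": "Acme Router 9000",   # sysDescr
    "1.3.6.1.2.1.1.3.0": 4242000,              # sysUpTime, in hundredths of a second
})

# The manager polls agents on demand; traps would travel in the opposite direction.
for oid in ("1.3.6.1.2.1.1.1.0", "1.3.6.1.2.1.1.3.0"):
    print(oid, "=", router.get("public", oid))
```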

SNMPv1’s rapid success revealed, however, its congenital weaknesses. Security was merely a facade: community strings traveled in clear text over the network, accessible to anyone intercepting packets. This vulnerability was not an oversight but a deliberate choice. The designers had bet on the benign nature of exchanged information and counted on closed networks where trust prevailed.

This naivety pushed the IETF to develop SNMPsec in the early 1990s. This extension promised authentication, encryption, and sophisticated access control. But its complexity frightened developers, and it was overshadowed by SNMPv2 in 1993.

The protocol’s second version corrected irritants from the first draft. The MIB gained grouping capabilities, making the description of complete equipment more natural. The GetBulk operation allowed massive data retrieval, saving precious network round trips. Notifications gained reliability through the Inform mechanism, which required acknowledgment.

Unfortunately, internal feuds within the development community shattered the standard’s unity. SNMPv2 fragmented into incompatible variants: SNMPv2, SNMPv2c, SNMPv2u, SNMPv2*. This technical cacophony discouraged adoption and maintained SNMPv1 in a paradoxically dominant position.

The IETF learned from these failures in designing SNMPv3, completed in 2002. This third version preserved SNMPv2’s gains while offering a modular security architecture. The User-based Security Model authorized different encryption algorithms like DES or AES. The View-based Access Control Model offered remarkable granularity in defining access permissions.

Meanwhile, specialized extensions enriched the ecosystem. In 1991, RMON arrived with advanced network monitoring functions adapted to Ethernet networks. RMON2 extended this monitoring six years later to application layers. SMON completed the arsenal in 1999 with tools dedicated to switches and virtual networks.

Yet, defying all logic, SNMPv1 still accounts for the majority of deployments today. Its bare-bones design explains this surprising longevity. More recent versions, technically superior, have never managed to convince practitioners en masse, attached as they are to the proven simplicity of the original version.

The SNMP protocol validated the pragmatic approach against sophisticated but heavy solutions. It established enduring architectural principles such as separation between data format and content, and the manager-agent model that became a universal reference.

SNMP also testifies to the extensibility of Internet standards. Initially limited to network equipment, it now monitors servers, UPS systems, air conditioning systems, and a multitude of other devices. This versatility has consolidated its position in the IT ecosystem.

The protocol’s limits nevertheless surface with infrastructure evolution. SNMP collects elementary data but delegates their intelligent processing to the manager. This philosophy shows its weaknesses against contemporary networks, generators of torrential information flows. Technologies like NETCONF are emerging to address these new constraints, without challenging SNMP’s omnipresence in existing infrastructure.


ZIP

In 1989, could Phil Katz and Gary Conway have imagined that their compression format would durably revolutionize computing? The history of ZIP begins in controversy. PKWARE, their company, had just been sued by Systems Enhancement Associates for allegedly copying the ARC archiving system. Rather than defending themselves, Katz and Conway decided to create something radically new.

The name ZIP reflects this ambition. Robert Mahoney, a friend of Katz, suggested this term to evoke the speed of their new solution, far superior to competing formats. On February 14, 1989, PKWARE and Infinity Design Concepts announced the release of their format into the public domain. This strategic decision, a true stroke of genius, instantly transformed ZIP into an open standard. The technical specification, documented in the APPNOTE.TXT file, was accessible to everyone. Developers seized this opportunity, multiplying compatible tools and libraries.

The early versions of the format reflected the hardware constraints of the era. PKZIP 2.04g could contain only 16,383 files per archive, with a maximum size of 2 GB. These limitations corresponded to the capabilities of available machines at the time. The rapid evolution of computer hardware nevertheless pushed designers to constantly rethink their format.

In 1993, version 2.0 appeared, introducing DEFLATE compression and traditional PKWARE encryption. This iteration laid the foundations for the modern architecture of ZIP files. Three years later, Deflate64 improved compression performance. But it was in 2001 that a major transformation occurred with version 4.5 and the introduction of the ZIP64 format. Gone was the 4 GB limit, finished was the restriction to 65,535 files per archive.

The year 2002 marked an opening toward new encryption methods: DES, Triple DES, RC2, and RC4 joined the available arsenal. In 2003, AES made its appearance. Version 6.3.0 of 2006 crossed another threshold by integrating Unicode for file names and new algorithms like LZMA and PPMd. Each evolution testified to a remarkable capacity for adaptation to users’ changing needs.

The internal structure of the ZIP format reveals clever design. The central directory, placed at the end of the file, catalogs all elements of the archive. This organization allows adding or removing files without reprocessing the entire archive. In the era of floppy disks, when each write operation took considerable time, this characteristic represented a decisive advantage.
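
Python’s standard zipfile module, sketched below with arbitrary file names, shows the benefit in practice: an archive can be reopened in append mode and extended, while the listing is read straight from the central directory.

```python
import zipfile

# Create an archive using DEFLATE compression.
with zipfile.ZipFile("demo.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("readme.txt", "hello " * 1000)

# Reopen in append mode and add an entry without recompressing the existing one.
with zipfile.ZipFile("demo.zip", "a", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("notes/extra.txt", "added later")

# The listing comes from the central directory stored at the end of the file.
with zipfile.ZipFile("demo.zip") as zf:
    for info in zf.infolist():
        print(info.filename, info.file_size, "->", info.compress_size, "bytes")
```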

The format’s success pushed operating system vendors to adopt it natively. Microsoft integrated ZIP support into Windows in 1998 via the Windows Plus! pack for Windows 98, naming this functionality "compressed folders." Apple followed in 2003 with Mac OS X 10.3. Free software systems were quick to follow suit.

Since 2006, Microsoft has used the compressed ZIP format as the basis for its Office Open XML. The .docx, .xlsx, and .pptx files are nothing more than ZIP archives containing XML files and various resources. This repurposed use illustrates the versatility of a format that has become invisible infrastructure of modern computing.

Official recognition came in 2015 with the publication of ISO/IEC 21320-1 standard. This standardization defines a minimal compressed archive format, compatible with different standards such as OpenDocument, Office Open XML, and EPUB. The standard imposes restrictions, however: only DEFLATE compression is authorized, encryption is prohibited.

Security issues have long troubled the ZIP format. The original encryption system, ZipCrypto, proved vulnerable to known-plaintext attacks. These weaknesses motivated the introduction of more robust methods. A controversy erupted in 2003 when WinZip integrated its own AES encryption system while PKWARE maintained its competing specification. The agreement reached in 2004 between the two companies facilitated mutual support of their respective formats.

The legacy of ZIP can be read in the adoption of the DEFLATE algorithm by other technologies such as gzip and zlib. The format continues to evolve: version 6.3.8 of 2020 integrated modern compression methods such as Zstandard, MP3, and XZ. This capacity for evolution, made possible by a modular design accepting new fields, ensures the format’s longevity.

Thirty-five years after its creation, ZIP remains ubiquitous. Contemporary implementations support ZIP64 and its theoretical archives of 16 exabytes. This exceptional longevity is explained by a rare combination of ease of use, backward compatibility, and technical adaptability. Phil Katz, who died in 2000, did not live to see his format cross into the new millennium and establish itself as one of the silent pillars of modern computing.


Intel 80486

The year 1989 witnessed the birth of the Intel 80486 processor, a chip that marked a breakthrough in the world of microprocessors. For the first time in the history of the x86 family, one million transistors were etched onto a single component. This technical feat was accompanied by innovations that radically transformed the performance of personal computers.

The true breakthrough of the 80486 lay in its redesigned architecture. Intel directly integrated a floating-point calculation unit onto the chip, previously confined to a separate coprocessor, the 80387. This merger simplified motherboard design and considerably accelerated complex mathematical calculations. Added to this innovation was a unified 8 KB cache memory, positioned at the heart of the processor. Gone were the slow external cache accesses that penalized the 80386.

The introduction of an instruction pipeline gave the processor the ability to simultaneously process different instruction stages. Simple arithmetic operations now executed in a single clock cycle, halving the time required compared to the 80386. This new efficiency propelled performance well beyond the gains brought by simply increasing frequency.

The first versions of the 80486 ran between 16 and 33 MHz. In 1991, Intel attempted to push the frequency up to 50 MHz, but encountered heat dissipation problems. The solution involved moving to a finer 0.8-micrometer manufacturing process. Despite these adjustments, this fast model struggled to attract buyers, notably due to its limited compatibility with the local buses essential to graphics cards of the era.

The 80486 range diversified. The 486SX, a stripped-down version without a floating-point unit, targeted the consumer market concerned with savings. More cleverly, the 486DX2 inaugurated a technique that would become standard practice: doubling the internal frequency relative to the system bus. This increased performance without requiring a complete overhaul of existing motherboards. The DX4, despite its misleading name, didn’t quadruple but tripled the bus frequency.

A legal element disrupted Intel’s strategy. The company no longer had the right to trademark purely numerical designations beginning with "80". This constraint pushed Intel to rethink its product communication and heralded the era of brand names like Pentium.

The commercial success of the 80486 attracted competition. AMD, IBM, Texas Instruments, Cyrix, UMC, and STMicroelectronics developed their own versions. AMD distinguished itself by offering frequencies absent from Intel’s catalog, such as the 40 MHz bus. The firm marketed original models: 486DX-40, 486DX/2-80, and 486DX/4-120. In 1995, AMD’s Am5x86 was the fastest 486 ever designed, clocked at 133 MHz, while experimental versions reached 150 and 160 MHz.

Cyrix adopted a different strategy by developing its chips through complete reverse engineering, without relying on Intel’s plans. The first models, 486DLC and 486SLC, constituted hybrid solutions compatible with 386 sockets. Handicapped by their mere 1 KB cache, they struggled against Intel and AMD’s 8 KB models. Later Cyrix versions, equipped with more generous cache, unfortunately arrived too late to upset market shares.

The golden age of MS-DOS games coincided with the 80486’s peak. The DX2-66 MHz model was the reference for video game enthusiasts in the early 1990s. This dominance faltered with the emergence of real-time 3D. These new graphics intensely taxed the floating-point unit and demanded considerable memory bandwidth. Developers began optimizing their creations for the Pentium’s P5 architecture, progressively condemning 486 processors.

In the personal computer world, the 80486 survived into the 2000s in budget configurations. The withdrawal of Windows 95 support and the growing demands of subsequent operating systems accelerated its obsolescence. Paradoxically, Intel maintained production until September 2007 to supply the embedded systems market.

This chip embodied a period when computing performance progressed at a breakneck pace, enabling new applications and accelerating the democratization of computing tools.


Intel i960

Intel delivered a major blow in September 1989 with the launch of the i960CA, its first 32-bit superscalar processor dedicated to embedded systems. This chip marked a breakthrough in the embedded computing world by executing several instructions per clock cycle. The result came quickly: 66 MIPS, a figure that caused a sensation at the time.

The philosophy behind the i960 surprised with its boldness. Intel married the traditional RISC core with its 32 registers to an enriched instruction set that borrowed from the CISC world. This genre-mixing, far from being a shaky compromise, responded precisely to the particular constraints of embedded applications. Intel’s engineers balanced this hybridization to create a processor that drew the best from both worlds.

The family expanded: the i960CF succeeded the CA and doubled its performance thanks to a redesigned cache memory. Intel then derived a full range from its creation according to a well-established commercial logic: the SA/SB for tight budgets, the KA/KB for the mid-range, and the hardened MC versions for military applications. This product-line strategy testified to a mature industrial approach.

Under the hood, innovations abounded. The interrupt controller didn’t just manage priorities, it did so autonomously. Local registers were saved automatically during subroutine calls, eliminating a traditional source of latency. Instruction caches equipped all models, while certain versions benefited from data caches that accelerated repetitive processing.

Data transfers reached 160 MB per second on high-performance buses, a remarkable speed. This capability transformed information flow management in demanding applications. Object code compatibility between different models constituted a major asset: developers could migrate their applications without starting from scratch, a guarantee of sustainability that was appreciated.

Integration pushed innovation further. The VH model brought together on a single chip a 32-bit PCI v2.1 interface at 33 MHz, a memory controller, and various peripheral functionalities. This concentration reduced costs, footprint, and consumption, three critical parameters in embedded systems.

The i960’s application domains drew an impressive map. Networks adopted it for their communication controllers, bridges, and routers. Medical imaging, particularly ultrasound, exploited its processing capabilities. Industrial automation, robotics, and computer vision found an excellent ally in this processor. The aerospace sector integrated it into its flight control equipment and satellite navigation systems.

Intel didn’t limit itself to silicon. The Solutions960 program federated more than 200 tools developed by 70 partner companies. Optimized compilers, operating systems, debugging tools, evaluation boards: the ecosystem took shape around the processor. This software richness transformed i960 adoption into a smooth experience for developers.

The integrated DMA controller impressed with its performance: 59 MB/s in fly-by transfers, 32 MB/s in two-cycle transfers. The interrupt controller managed up to 248 external sources with 32 programmable priority levels. Lockable cache memory guaranteed optimal execution of critical algorithms, a valuable feature in real-time systems.

Specialization guided the development of different versions. Network models processed packets without saturation, while those dedicated to imaging adapted to different bus widths and data conventions. This targeted approach reinforced processor attractiveness according to their application domain.

The i960 left lasting marks: superscalar execution became commonplace, integrating peripherals on the main chip became the norm, and the importance of a complete development ecosystem remains relevant today. The late 1990s saw the birth of new competing architectures that disrupted the landscape, but the i960 had paved the way toward increasingly integrated and high-performance solutions.


World Wide Web

At the end of 1989, Tim Berners-Lee was working at CERN in Geneva. The British computer scientist faced the same problem daily: how to find information in what had become a Tower of Babel at the European nuclear research organization? Researchers were accumulating documents about their experiments, equipment, and discoveries, but everything remained scattered and inaccessible. Berners-Lee then conceived a system to connect this dispersed information.

For decades, visionaries had dreamed of machines to organize human knowledge. Paul Otlet had sketched his Mundaneum as early as 1934, a connected universal library. Vannevar Bush theorized the Memex in 1945 in “As We May Think”, a machine capable of navigating between documents linked by associations of ideas. Ted Nelson coined the term “hypertext” for this non-linear navigation in 1965 with his Xanadu project, which never came to fruition despite decades of development. Apple brought part of these dreams to life in 1987 with HyperCard, the first commercial success of hypertext.

Berners-Lee took decisive action. Rather than designing a complex system, he focused on simplicity. In 1990, with Robert Cailliau’s help, he laid the foundations of his architecture: the URL uniquely identifies each resource, the HTTP protocol organizes exchanges between machines, the HTML language structures documents and their links. Shortly after, he developed the first Web browser, which he called WorldWideWeb, before renaming it Nexus.
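
To appreciate how lightweight those building blocks are, here is a minimal Python sketch (the host is a placeholder) that performs an exchange by hand: an HTTP request is just a few lines of plain text sent over a TCP connection, and the HTML document comes back the same way.

```python
import socket

host = "example.com"                     # illustrative host
request = (
    "GET / HTTP/1.0\r\n"                 # method, resource path, protocol version
    f"Host: {host}\r\n"
    "\r\n"                               # a blank line ends the request headers
)

with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):      # HTTP/1.0: the server closes when it has finished
        response += chunk

print(response.decode("utf-8", errors="replace")[:300])   # status line, headers, start of HTML
```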

The first website went live on August 6, 1991 at the address info.cern.ch. Tim Berners-Lee explained what the World Wide Web was, how to create pages, and how to install a browser. The information spread through Usenet forums. The following year, Les Cernettes, a music group formed by CERN employees, saw their photo become the first non-scientific image published on the Web. A nod to the democratization to come.

The turning point came on April 30, 1993. On that day, CERN relinquished its copyright on Web technologies and placed them in the public domain. This decision, seemingly trivial at the time, changed everything. Anyone could now use, modify, and distribute the code without constraints. Universities seized the technology, and businesses would follow.

Marc Andreessen seized the opportunity. In 1993, his Mosaic browser revolutionized the user experience with its intuitive graphical interface. Gone were command lines and austere interfaces: the Web became accessible to the general public. The following year saw the birth of the World Wide Web Consortium under Berners-Lee’s direction. This organization established the technical standards of the Web.

1995 marked the explosive entry of commerce onto the Web. Yahoo! launched its directory, meticulously classifying sites by categories. Jeff Bezos opened his Amazon bookstore from his Seattle garage. Microsoft integrated Internet Explorer into Windows 95, triggering the first browser war against Netscape. The Web was no longer just a research tool; it had become a marketplace.

Three years later, two Stanford students disrupted the game. Larry Page and Sergey Brin created Google with their PageRank algorithm, which ranked pages according to their popularity measured by incoming links. This mathematical approach to relevance radically transformed information retrieval. The Web became the universal gateway to digital knowledge.

Evolution continued in successive waves. The early Web, from 1989 to 2005, resembled an immense library. Sites remained static; users were content to read. HTML, HTTP, and basic protocols were sufficient to operate this still-simple universe.

Between 2004 and 2016, everything shifted toward interaction with what would later be called Web 2.0. Users no longer simply consulted; they created, shared, and commented. Facebook opened its doors in 2004, YouTube followed in 2005. JavaScript, XML, and Ajax enriched interfaces, making them dynamic and responsive.

From 2015 onward, a third generation emerged: the Semantic Web. The ambition changed scale: the goal was to make information understandable by machines themselves. RDF, RDFS, and OWL structured data so that computers could grasp the meaning, not just display it.

The HTTP protocol accompanied these transformations. Its version 1.1, standardized in 1997, introduced persistent connections that avoided renegotiating each exchange. HTTP/2 arrived in 2015 with stream multiplexing, which allowed multiple simultaneous requests on a single connection. HTTP/3, finalized in 2022, abandoned TCP in favor of QUIC based on UDP, gaining speed.

The appearance of Apple’s AppStore in 2008 redefined Web access. Mobile applications complemented the traditional browser, sometimes replacing it. The Web moved out of computers and into our pockets, cars, and watches. This ubiquity was unthinkable in 1989.

The architecture conceived by Berners-Lee has withstood all these transformations. The separation between content and presentation, unique identifiers for each resource, the standardization of protocols: these technical choices enabled continuous expansion without major disruption. The client-server model and REST architecture provided a framework flexible enough to accommodate all uses.

Of course, today’s Web bears little resemblance to the documentary system envisioned at CERN. It has transformed how we communicate, inform ourselves, conduct commerce, and work. While certain developments are concerning (concentration of services among a few players, exploitation of personal data, disinformation), Berners-Lee’s vision of a universal and decentralized information space has largely been realized.


Microsoft Office

On August 1st, 1988, in the bustling atmosphere of Las Vegas, Bill Gates took the stage to announce an office productivity suite. At the time, no one truly grasped the magnitude of what was unfolding. Yet this announcement launched a venture that would revolutionize working habits worldwide.

Two years later, in 1990, the first version of Microsoft Office was released. It brought together three programs that can now be considered basic: Word for writing, Excel for calculations, PowerPoint for presentations. This combination was far from trivial. Microsoft had identified the need to have all the necessary tools for daily professional work at hand, without juggling between different publishers and their sometimes contradictory approaches.

The history of Word actually begins much earlier, in 1981, when Microsoft hired Charles Simonyi to develop a word processor. The first version came out in 1983, but it was confusing. WordPerfect reigned supreme and users didn’t understand this very different interface. Microsoft didn’t give up. A Macintosh version arrived in 1985, then in 1987 the RTF format appeared, finally simplifying document exchange between different systems.

Excel, for its part, established itself as the spreadsheet reference. The software could handle complex calculations, draw graphs, and analyze data through pivot tables. With the arrival of Visual Basic for Applications, users could program automations and free themselves from repetitive tasks.

PowerPoint has a unique genesis. Initially developed by Forethought under the name Presenter, it was acquired by Microsoft in 1987 for 14 million dollars. Robert Gaskins then suggested the new name. This software radically transformed the way presentations were conceived and delivered. In its first 1990 version, it was impossible to go back through slides, and customization options remained limited.

Microsoft released Windows 95 and Office 95 simultaneously... in 1995. This synchronization was no accident: the deep integration between the operating system and the office suite created a unified work environment that massively appealed to businesses. Sales skyrocketed and Microsoft established its dominance.

The suite gradually expanded. Access arrived to manage databases, Publisher to create marketing materials, OneNote for digital note-taking. Outlook completed the ensemble by managing emails and calendars. Each addition strengthened the ecosystem’s appeal.

Microsoft had to adapt to technological developments. Versions were released for Windows and macOS. The company developed mobile applications like Office Lens and Office Remote to keep pace with new usage patterns. SharePoint and Skype for Business Server extended collaborative capabilities to enterprise servers.

The advent of cloud computing changed the game. Office Web Apps offered a lightweight version accessible through browsers. Office 365 then revolutionized the business model by switching to subscriptions. This transformation addressed contemporary users’ expectations for mobility and real-time collaboration.

The suite’s consistency strengthened with each version. The spell checker was shared, data flowed between applications, the interface became harmonized. Learning one application in the suite made mastering the others easier. The .docx format, introduced with Office 2007, improved document compatibility and security.

Education became a strategic conquest ground. Generations of pupils and students learned on Office, creating a user base familiar with these tools. Availability in 35 languages accelerated worldwide adoption. The ribbon interface introduced in 2007 initially confused longtime users, but it modernized access to features.

Office shapes contemporary professional practices. Word establishes its standards for document formatting. Excel revolutionizes data analysis in businesses and financial management. PowerPoint redefines corporate communication codes, and these practices persist in the digital universe.

This story reflects the transformations of office work since the 1990s. From a collection of three programs, Office evolved into a complete ecosystem blending cloud services, mobile applications, and collaborative tools. This capacity for continuous adaptation explains its longevity in an otherwise volatile sector.

NeXTSTEP

When Steve Jobs left Apple in 1985, he took with him an obsession: to create the perfect computer. This obsession would give birth to one of the most fascinating adventures in computing history, that of NeXTSTEP. Here was an operating system that would break the mold.

In the computing world of the mid-1980s, graphical interfaces were still finding their way, systems crashed at the slightest misstep, and programming was an uphill battle. Jobs and his team at NeXT Computer decided to start from scratch. Their gamble was to marry UNIX’s robustness with a new interface. They built on the Mach kernel, layered a BSD UNIX environment on top of it, and there was their war machine ready to challenge the industry.

But NeXTSTEP’s true boldness lay in its entirely object-oriented architecture. Where its competitors were tinkering with procedural code, NeXT bet everything on Objective-C. At the time, this decision seemed crazy to many observers. Would developers follow? The team persisted and transformed every system component into an object: windows, buttons, files, processes. This architectural consistency transformed the user experience.

NeXTSTEP’s interface was striking in its elegance. Display PostScript ensured surgical precision in graphics rendering. What you saw on screen matched exactly what the printer would produce. The famous Dock made its first appearance, that icon bar that simplified application access. A more discreet but equally remarkable innovation: the Workspace Manager offered column navigation that revolutionized file management.

NeXTMail pushed multimedia integration further. Send an image in an email? A sound file? A formatted document? Nothing could be simpler. This modern vision of messaging was years ahead of its time. Other systems struggled to display simple text properly when NeXT already offered the equivalent of the rich emails we know today.

Interface Builder revolutionized software development. Gone was the laborious coding of interfaces: programmers now assembled their windows visually, through drag-and-drop. This approach still inspires current tools. Services introduced unprecedented interoperability between applications. Retouching an image from within a word processor became child’s play.

NeXTSTEP’s file system anticipated our contemporary needs. Extensible metadata allowed enriching each file with contextual information. The Digital Librarian indexed all documents and offered sophisticated searches across the entire machine. Google Desktop and Spotlight would adopt this idea some fifteen years later.

Tim Berners-Lee chose a NeXT workstation to invent the Web at CERN. This choice was no accident, as NeXTSTEP’s development tools made it possible to prototype complex applications. The first web browser and the first web server were born on this platform. Wall Street also took an interest in the system. Investment banks discovered its stability and adopted NeXT workstations for their mission-critical applications.

But technical excellence alone was not enough. The prohibitive price of NeXT machines hindered their adoption. Only wealthy professionals and a few universities could afford them. The general public remained out of reach. In 1994, NeXT abandoned hardware and refocused on software. OPENSTEP, the portable version of NeXTSTEP, ran on different architectures. This strategy broadened the audience but perhaps came too late.

The acquisition by Apple in 1996 revived everything. NeXTSTEP was reborn as Mac OS X. Jobs’s system DNA shaped Apple’s new operating system: object-oriented architecture, Cocoa frameworks, user interface philosophy. iOS directly inherited from this lineage. The iPhone and iPad owe much to NeXTSTEP’s innovations.

This story shows that in computing, the best ideas can outlive commercial failure and eventually find their way to success.