
The History of Digital Notes Apps: From Mainframes to AI

Every day, billions of people open an app on their phone or computer to jot down a thought. A grocery list. A meeting note. A flash of inspiration at 2 AM. It feels so natural now that we rarely pause to consider the extraordinary journey that brought us here—a six-decade evolution from room-sized computers to the AI-powered knowledge systems in our pockets.

This is the story of how humanity learned to think in pixels: how notes apps evolved from punch cards into "second brains."


The Visionaries: 1960s–1970s

The history of digital notes doesn’t begin with an app. It begins with a dream.

In 1945, Vannevar Bush, an American engineer who had coordinated scientific research during World War II, published an essay in The Atlantic titled “As We May Think.” In it, he described a hypothetical device called the “memex”—a mechanized desk that could store and retrieve documents, creating trails of associated ideas. Bush envisioned a future where humans could augment their memory with machines.

Two decades later, Douglas Engelbart decided to build it.

Working at the Stanford Research Institute, Engelbart assembled a team to create what he called the “oN-Line System,” or NLS. On December 9, 1968, he gave what would later be called “The Mother of All Demos”—a 90-minute presentation in San Francisco that introduced the world to the computer mouse, hypertext links, video conferencing, and collaborative real-time editing. The audience of 1,000 computer professionals watched, stunned, as Engelbart demonstrated concepts that wouldn’t become mainstream for another 30 years.

NLS allowed users to create, organize, and link text documents in ways that feel remarkably modern. You could outline ideas hierarchically, jump between related concepts, and work on documents simultaneously with colleagues miles away. In 1968.

But NLS required expensive mainframe computers and specialized training. It was a glimpse of the future, not a product for the masses. The dream would need to wait for the hardware to catch up.

Meanwhile, the first true text editors were emerging. In 1971, the "roff" text formatting program arrived on Unix at Bell Labs. In 1974, Butler Lampson and Charles Simonyi developed Bravo at Xerox PARC, the first WYSIWYG (What You See Is What You Get) text editor. These tools weren't designed for note-taking per se, but they established the fundamental paradigm: text as digital data that could be created, edited, saved, and retrieved.


The Personal Computer Revolution: 1980s

The 1980s changed everything by putting computers on desks.

When IBM launched the Personal Computer in 1981, it created a platform. When Apple released the Macintosh in 1984, it created a paradigm—the graphical user interface that would define how we interact with computers to this day.

And tucked inside Microsoft Windows 1.0, released on November 20, 1985, was a humble application called Notepad.

Notepad was almost comically simple: a plain text editor with no formatting, no features, no frills. You could type, save, and open files. That was it. But in its simplicity lay its genius. Notepad became the digital equivalent of a blank sheet of paper—instantly accessible, zero learning curve, universal.

Thirty-eight years later, Notepad still ships with every copy of Windows. Over a billion people have access to it. It remains one of the most-used text applications in history, a testament to the enduring power of simplicity.

The 1980s also saw the emergence of personal information managers (PIMs)—software designed to help individuals organize their digital lives. Programs like Lotus Agenda (1988) and GrandView (1987) introduced outlining and information organization concepts that would influence note-taking apps for decades.

But perhaps the most prophetic development was hypertext.

In 1987, Apple released HyperCard, created by Bill Atkinson. HyperCard let users create “stacks” of virtual cards containing text, images, and buttons that could link to other cards. It was, in essence, a personal wiki before the web existed. Teachers used it to create interactive lessons. Writers used it to organize research. Hobbyists used it to build everything from games to recipe collections.

HyperCard planted a seed: the idea that notes didn’t have to be linear. They could be networked, interconnected, alive with relationships. That seed would take 30 years to fully bloom.


The Mobile Dawn: 1990s

The 1990s asked a new question: What if your notes could travel with you?

Apple’s Newton MessagePad, launched in 1993, was ahead of its time—perhaps too far ahead. It promised handwriting recognition and portable computing but delivered frustrating inaccuracy and a $700 price tag. The Newton became a punchline, famously mocked in The Simpsons when it misread “Beat up Martin” as “Eat up Martha.”

But three years later, Palm got it right.

The Palm Pilot, released in March 1996 at $299, understood something crucial: people would accept limitations in exchange for reliability. Instead of trying to recognize natural handwriting, Palm created Graffiti—a simplified alphabet users could learn in an hour. Instead of trying to replace a computer, the Palm Pilot synced with one.

Within 18 months, Palm had sold over a million units. By 2000, the company controlled 70% of the handheld market.

The Palm Pilot’s Memo Pad application established conventions that persist today: a list view of notes, tap to open, automatic saving, simple sync. For millions of people, it was their first experience with digital note-taking that actually worked.

Microsoft entered the game with Windows CE devices and Pocket PC, while a small Canadian company called Research In Motion launched the BlackBerry in 1999, introducing the world to mobile email and the addictive red notification light.

The infrastructure for ubiquitous digital notes was being built, one device at a time.


The Cloud Awakens: 2000s

The 2000s brought a revolution hiding in plain sight: the browser became a platform.

Google launched Gmail in 2004 with an unheard-of 1 gigabyte of free storage, signaling that the cloud could hold our data. The following year, a startup called Writely launched a web-based word processor that Google would acquire and transform into Google Docs.

But the defining note-taking app of the decade came from an unlikely source: a serial entrepreneur named Stepan Pachikov who had spent years working on handwriting recognition in the Soviet Union.

Evernote launched in June 2008 with a radical promise: “Remember everything.”

The pitch was simple but profound. Take notes anywhere—phone, computer, web. Evernote syncs them everywhere. Search finds them instantly, even text within images thanks to optical character recognition. Your notes become a searchable external memory, a “second brain.”

The timing was perfect. Apple had launched the iPhone in 2007, creating a new category of always-connected pocket computers. Evernote was there with an app on day one of the App Store in July 2008.

By 2011, Evernote had 11 million users. By 2014, it had 100 million and a $1 billion valuation. CEO Phil Libin declared that Evernote would be a “hundred-year company.”

Microsoft, meanwhile, had been quietly building its own vision. OneNote launched in 2003 as part of Microsoft Office, offering a free-form canvas where users could place text, images, and drawings anywhere on the page—like a digital whiteboard. It was powerful but initially overlooked, overshadowed by Word and Excel.

The 2000s established cloud sync as the expected default. Notes that existed only on one device began to feel broken, incomplete. The expectation had shifted: our thoughts should follow us everywhere.


The Smartphone Era: 2010s

The 2010s saw note-taking apps multiply like digital rabbits.

Apple Notes, which had existed as a simple yellow-lined notepad since the original iPhone, received major upgrades—rich text formatting in 2013, sketching and checklists in 2015, document scanning in 2017. By virtue of being pre-installed on every iPhone and Mac, Apple Notes became one of the most-used note apps in the world, despite rarely being anyone’s favorite.

Google Keep arrived in 2013, emphasizing quick capture and visual organization with color-coded cards. Dropbox Paper launched in 2015, betting that notes and documents were converging. Microsoft made OneNote free in 2014 and began pushing it aggressively across platforms.

But the most significant development was philosophical, not technical.

In 2013, a small startup called Notion was founded around an audacious idea: what if notes, documents, databases, and project management were all the same thing? What if everything were made of "blocks" that could be rearranged, embedded, and transformed?

Notion’s founders, Ivan Zhao and Simon Last, had grown frustrated with the fragmentation of productivity tools. Why use one app for notes, another for tasks, another for wikis, another for databases? Notion proposed a unified workspace where a single page could contain paragraphs of text, a Kanban board, a calendar, and a spreadsheet—all working together.

The initial launch sputtered. Notion nearly died in 2015, down to its last months of funding. But the team retreated to Kyoto, Japan, rebuilt the product from scratch, and relaunched in 2018 to explosive growth. By 2021, Notion was valued at $10 billion with 20 million users.

Notion proved that people wanted more than a place to store text. They wanted a tool for thinking.


The Networked Thought Revolution: Late 2010s–2020s

In 2019, a small app called Roam Research ignited a movement.

Roam’s creator, Conor White-Sullivan, had been obsessed with a question: Why do our note-taking tools force us to think in hierarchies when our brains think in networks?

Roam introduced bidirectional linking. When you linked to a page, that page automatically linked back to you. Every note became a node in a growing web of ideas. The double-bracket syntax [[like this]] became a symbol of a new way of thinking.
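The mechanism behind bidirectional links is simple to sketch: scan each note for `[[wiki-link]]` targets, then invert that map so every page knows which pages point at it. A minimal illustration in Python, using an invented mini-corpus of notes (all titles and bodies here are hypothetical, not from any real app's data format):

```python
import re
from collections import defaultdict

# Hypothetical mini-corpus: note title -> note body.
notes = {
    "Memex": "Bush's memex inspired [[Hypertext]] and [[NLS]].",
    "Hypertext": "Linked text, realized in [[NLS]] and HyperCard.",
    "NLS": "Engelbart's oN-Line System demonstrated [[Hypertext]].",
}

LINK = re.compile(r"\[\[([^\]]+)\]\]")

# Forward links: every [[target]] mentioned in each note's body.
forward = {title: LINK.findall(body) for title, body in notes.items()}

# Backlinks: invert the forward map so each page lists its referrers.
backlinks = defaultdict(list)
for source, targets in forward.items():
    for target in targets:
        backlinks[target].append(source)

print(backlinks["NLS"])  # every page that links to "NLS"
```

The inversion step is the whole trick: the author only ever writes forward links, and the tool derives the reverse direction automatically, which is why every note becomes a node in a web rather than a leaf in a hierarchy.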

The app attracted a devoted following who called themselves “Roamans” and paid $165 per year—unheard of for a note-taking app. They created courses, YouTube channels, and Twitter communities dedicated to building “second brains” and practicing “networked thought.”

Roam’s success spawned a generation of alternatives. Obsidian, launched in 2020, offered similar linking features but stored notes as plain Markdown files on your own computer—appealing to users wary of cloud lock-in. Logseq provided an open-source alternative. Craft offered Apple-native design polish. Mem promised AI-powered organization.

The “tools for thought” movement had arrived, drawing explicit inspiration from Engelbart’s 1960s vision. After 50 years, the dream of augmenting human intellect was finally reaching mainstream users.


The AI Era: 2022–Present

The release of ChatGPT in November 2022 sent shockwaves through every software category, and note-taking was no exception.

Within months, Notion launched Notion AI, allowing users to summarize notes, generate content, extract action items, and translate text—all within their existing workspace. Mem reimagined itself as an “AI-native” note app that could answer questions about your own notes. Reflect promised to surface relevant past notes automatically as you write.

The implications run deep. For sixty years, digital notes were passive—they stored what you put in them. AI makes notes active. They can remind you of forgotten ideas, connect disparate concepts, and even generate new content based on your existing thinking.

We’re witnessing the emergence of true “second brains”—not just storage systems, but thinking partners.


What the Journey Teaches Us

The history of digital notes is really a history of how we think about thinking.

In the 1960s, visionaries imagined augmenting human intellect. In the 1980s, we got simple text files. In the 1990s, we learned to carry notes with us. In the 2000s, we synced them everywhere. In the 2010s, we connected them together. In the 2020s, we’re teaching them to think alongside us.

Each era solved the previous era’s limitation while revealing new possibilities. Notepad was liberating until you needed sync. Evernote was revolutionary until you needed structure. Notion was powerful until you needed intelligence.

The tools keep evolving because human thought keeps demanding more. We don’t just want to record ideas—we want to develop them, connect them, build upon them. We want our tools to be partners in cognition, not just filing cabinets.

Douglas Engelbart, giving his demo in 1968, called his work “augmenting human intellect.” He believed computers could make us smarter, more capable, more creative—not by thinking for us, but by expanding what we could think about.

More than half a century later, every time you open a note app on your phone, you're living in the future he imagined.
