Welcome to History of Data Science. Discover the stories of heroes who transformed our daily lives!

BROUGHT TO YOU BY Dataiku

IBM: The Founder of Big Data

Founded in 1911 as the Computing-Tabulating-Recording Company, IBM was a data processing pioneer, creating technology that revolutionized the way large organizations, from major corporations to the U.S. government, collect and interpret massive amounts of data.

While IBM may look stodgy and old-fashioned compared to the Silicon Valley firms that have stolen the spotlight in tech over the past 20 years, Big Blue is still on the cutting edge of data science and artificial intelligence.

Digital information

Strange as it may sound today, the digital information revolution that IBM brought to the world in the early 20th century was based on paper.

The story actually begins in the late 19th century, when Herman Hollerith invented the “Hollerith card,” a paper card that stored information with holes that could be read by a machine. In 1896, Hollerith founded the Tabulating Machine Company, one of the four firms that later merged to form International Business Machines.
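The idea behind the Hollerith card can be illustrated with a toy sketch. This is a deliberately simplified model, not the historical card code: real cards eventually had 12 rows and 80 columns, and the exact encoding (including zone punches for letters) varied by era. Here, each column holds one decimal digit, punched in the row matching its value:

```python
# Toy model of digit storage on a punched card (simplified; real
# Hollerith/IBM card codes varied by era and also encoded letters).
ROWS = 10  # rows 0-9, one row per possible digit value

def punch(number: str) -> list[list[bool]]:
    """Return a ROWS x len(number) grid; True marks a hole."""
    card = [[False] * len(number) for _ in range(ROWS)]
    for col, ch in enumerate(number):
        card[int(ch)][col] = True  # punch the row matching the digit
    return card

def read(card: list[list[bool]]) -> str:
    """A 'tabulating machine': find the punched row in each column."""
    cols = len(card[0])
    return "".join(
        str(row) for col in range(cols)
        for row in range(ROWS) if card[row][col]
    )

assert read(punch("1911")) == "1911"
```

The point of the sketch is that the card is pure mechanical storage: a machine only needs to sense which row has a hole in each column to recover the data, which is exactly what made high-volume automated tabulation possible.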

IBM steadily enhanced the capabilities of punched cards, allowing them to store more information. The technology became a mainstay of record-keeping in business and government. The U.S. government adopted punched cards to store the Social Security numbers of U.S. citizens. In a dark moment in IBM’s history, the technology was used by Nazi Germany to track Jews, Romani, political dissidents, and other victims of the Holocaust.

King of computers

Eventually, IBM and the rest of the world moved beyond punched cards and into more sophisticated data processing technology.

A key turning point came in February 1944, when IBM unveiled the Automatic Sequence Controlled Calculator, later dubbed the Harvard Mark I. The electromechanical computer could perform three additions or subtractions per second, while it took six seconds to multiply and 15 seconds to divide. That may not sound fast, but when it came to complex calculations, it was a heck of an improvement over pencil and paper.
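Those figures invite a quick back-of-the-envelope calculation. The sketch below assumes the per-operation times quoted above are constant (real timings varied with operand size) and estimates how long the Mark I would need to evaluate a degree-10 polynomial at one point by Horner's rule, which costs 10 multiplications and 10 additions:

```python
# Back-of-the-envelope timing from the per-operation figures quoted
# above (assumed constant; actual timings depended on operand size).
ADD_S, MUL_S, DIV_S = 1 / 3, 6.0, 15.0  # seconds per operation

def mark1_seconds(adds: int = 0, muls: int = 0, divs: int = 0) -> float:
    """Estimated Mark I running time for a mix of operations."""
    return adds * ADD_S + muls * MUL_S + divs * DIV_S

# Degree-10 polynomial via Horner's rule: 10 multiplies + 10 adds.
per_point = mark1_seconds(adds=10, muls=10)
print(f"{per_point:.1f} s per point")  # prints "63.3 s per point"
```

A minute per point sounds glacial now, but the machine ran unattended around the clock without arithmetic slips, which is what made long war-time calculations practical.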

“IBM existed a good 50 years before mainframes — we started with scales.”

Ginni Rometty, IBM CEO

The Harvard Mark I was put to work almost immediately on behalf of the U.S. war effort. Famed mathematician John von Neumann, who was working to develop the first atomic bombs as part of the Manhattan Project, used the computer to determine that the “implosion design” would lead to a faster and more efficient detonation.

IBM continued to drive innovation in computing after the war, becoming a dominant player in supplying large computer systems to businesses. In 1981, it released the IBM Personal Computer (IBM PC), which transformed the then-nascent personal computing market.

Turning human intelligence into a machine

One of IBM’s most notable achievements was seen by millions during a broadcast of Jeopardy!, the popular American trivia show.

The show typically features three contestants answering questions about science, art, pop culture, current events, and history, but on one February day in 2011, viewers were treated to an unusual spectacle: two of the greatest Jeopardy! players of all time getting trounced by a robot named Watson.

Watson, named for the company’s legendary former CEO, Thomas J. Watson, was a multibillion-dollar artificial intelligence project. In contrast to a traditional search engine, which simply spits back a variety of content that corresponds to the search terms, Watson’s task was to interpret questions, often phrased in ways that are not Google-friendly, and find not a number of possible answers, but the one correct answer. Instantly.

Responding to the machine’s stellar performance, the New York Times commented that IBM “has taken a big step toward a world in which intelligent machines will understand and respond to humans, and perhaps inevitably, replace some of them.”