January 1, 1970: The Day the Software Industry was Created
The Unix epoch is an arbitrary date no more.
When I first learned about the Unix epoch, I assumed it meant that something grand and inspiring happened in Unix land on January 1, 1970. Like it was the date that Dennis Ritchie or Ken Thompson finished their prototype for Unix on their PDP-7. Maybe they just felt so close to getting something working that they skipped celebrating the New Year and stayed up programming late into the night.
It didn't happen like that though. Farhad Manjoo found out the true origins and wrote about them in Wired on September 8, 2001, right before the Unix timestamp hit 1,000,000,000 seconds (as I write this, the current Unix time is 1,674,540,529).
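If you want to check that number for yourself, here's a minimal C sketch (C being the language Unix itself is written in) that prints the current Unix time on any Unix-like system:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* On a Unix-like system, time() returns the number of seconds that
       have elapsed since the epoch: 1970-01-01 00:00:00 UTC. */
    time_t now = time(NULL);
    printf("Seconds since the Unix epoch: %lld\n", (long long)now);
    return 0;
}
```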
Manjoo wrote:
According to Dennis Ritchie, one [of] the engineers who worked on Unix at Bell Labs at its inception.
"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any. "
If there's anything to take away from this, it's that at least the choice was practical and made sense. Funnily enough, that's also how I'd describe Unix and all its derived operating systems: practical and sensible.
To make the date even more arbitrary, it wasn't always January 1, 1970. It started off as January 1, 1971, and instead of counting seconds it counted 60ths of a second, according to an early man page from 1971. At that rate, a 32-bit value would have overflowed in about 2.26 years, which isn't ideal.
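A quick back-of-the-envelope calculation shows why that was a problem; this little C sketch just does the arithmetic:

```c
#include <stdio.h>

int main(void) {
    /* The early Unix clock stored 60ths of a second in a 32-bit value.
       Rough estimate of how long that lasts before wrapping around. */
    const double max_ticks = 4294967296.0;          /* 2^32 */
    const double ticks_per_second = 60.0;
    const double seconds_per_year = 365.25 * 24.0 * 60.0 * 60.0;

    double years = max_ticks / ticks_per_second / seconds_per_year;
    /* Prints roughly 2.27, in the same ballpark as the figure above. */
    printf("A 32-bit counter of 60ths of a second wraps after about %.2f years\n",
           years);
    return 0;
}
```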
The epoch was moved to January 1, 1972 a bit later, and then in 1973 it was changed to the date we still use to this day.
I really wanted this date to have some sort of meaning but unfortunately there just wasn't one.
Then one day, while reading about computer history, I came across an event that corresponds exactly with the date of the Unix epoch and that had a huge impact on the software industry.
The best book I've found on the history of software is Martin Campbell-Kelly's incredibly thorough and excellent book: From Airline Reservations to Sonic the Hedgehog.
Software before the Unix epoch
The 1950s
In the early 1950s, the computer industry was entirely different from what it is today. One example is simply the scarcity of computer hardware: a census of the computers in the US, taken in the 1960s, counted only about 14,000 machines.
Computers were very expensive to rent; the IBM 701, as an example, cost $15,000 per month. There was also a scarcity of computer software. The companies that rented these behemoths had to write their own software and manage all the complexity that entailed. The small number of computers, combined with the difficulty of programming them, made it hard to establish any sort of market for software – the economics just weren't feasible. When a company rented a computer, the expectation was that the manufacturer would help with customer support and some training, but that was it.
There were some interesting precursors to the very early software market, but they didn't involve money. One was a group called SHARE, which was very similar to what we'd recognize as the open source ecosystem today, just way smaller and far more primitive. According to the group, the name "is not an acronym; it's what we do." It was all based on goodwill and camaraderie.
There weren't many programmers at this time either. The well-known nonprofit research and development organization RAND had some estimates on the number of programmers out there. From Campbell-Kelly's book:
In December 1955, the RAND Corporation created an autonomous Systems Development Division to undertake the programming work. At that time, the corporation reckoned that it employed about 10 percent of the top programmers in the United States—but this amounted to only 25 people. It was estimated that there were no more than 200 "journey-man" programmers—those capable of the highest class of development work—although there were probably 6 times that number of professional programmers working on relatively simple commercial applications.
By 1959, a few years later, the largest group of programmers worked for a software company called System Development Corporation (SDC), which had been spun out of RAND. It employed 700 programmers who all worked on the Semi-Automatic Ground Environment (SAGE), a defense system built to protect the country from the Soviets. That was estimated to be about half the programmers in the US at the time.
Another interesting aspect was that SDC was a defense contractor whose client was the United States government. Much of the early funding of the software industry (as well as the hardware industry) was bankrolled by the Department of Defense, and SAGE was the first and largest example of this at the time. That government support funded many projects and continued through the 1950s and beyond. It was a distinctive trait of the United States: the amount of money the US put into computing helped establish it as the world leader in the field.
Outside of the defense projects the government funded, software in the private market was usually built for very specific use cases. The largest private software project at this time was SABRE, an airline reservation system created by American Airlines. Bank of America also had a large financial system that it had programmed. The pattern was pretty common: large companies were the ones best equipped for custom programming that fit their particular market.
The 1960s
The term "software industry" wasn't even a thing during this period. It wasn't until the later part of the decade that the term started to take on the meaning we understand today.
The software industry was so small that little analysis was done on it at the time. Much of what we know about this era is somewhat incomplete simply because of how young the industry was.
This was not true for hardware, though. The hardware industry started to take off during the early 1960s, and it was dominated by IBM. IBM had such a commanding lead over the rest of the competitors that a cute nickname was coined to describe it: "IBM and the Seven Dwarfs," since seven competitors made up most of the remaining market share.
The companies that rented these mainframes from IBM and the Seven Dwarfs had to do much of the programming themselves. If a particular task could be automated, companies tried to write software for the mainframes they rented. They usually didn't keep programmers on the payroll because the economics just didn't make sense; instead, they hired software contractors for their needs. The resulting program was a one-off thing for a specific task, and it wasn't going to be marketed or sold by that company – that just wasn't feasible at the time.
The economics of this arrangement meant that the software industry grew slowly, because there wasn't much demand for programmers. As an example of how slowly: SDC (the company that built that air defense system) remained the largest programming contractor until 1969, when it was finally surpassed by Computer Sciences Corporation – nearly 15 years after SDC's founding.
The early 1960s was a time when companies started to understand what software was and what it could do for them. Things really started to change once we hit the later 1960s.
Later 1960s
IBM and the Seven Dwarfs continued to innovate and improve the hardware, and the hardware was improving at a far faster pace than the software. The dynamic started to resemble supply creating its own demand: faster machines gave software room to grow and do more computation. It was no longer a requirement for a programmer to spend so much time optimizing and tweaking things for underpowered hardware.
This was also the time that the first few software products became breakout successes, the first real demonstrations of what could be achieved in this new industry. One was ADR's Autoflow, a flowcharting program and the first big commercial success of the software industry. Another was Informatics' Mark IV, a file management product. They aren't remembered as killer apps today, but for the fledgling industry that software was, they were the closest thing to it.
IBM continued its commanding lead through the '60s and was still far and away the behemoth of the computer industry. The scope and scale of its dominance was incredible – so much so that competitors did everything they could to slow IBM down, including lawsuits. In my favorite book of all time, The Soul of a New Machine, Tracy Kidder explains it well:
Firms fought over patents, marketing practices and employees, and once in a while someone would get caught stealing blueprints or other documents, and for these and other reasons computer companies often went to court. IBM virtually resided there. Everyone sued IBM, it seemed.
IBM could tell that software was a growing business, and by owning the hardware it could make that hardware more appealing with good software. When a company bought IBM products (nobody was ever fired for buying IBM), it didn't just get hardware; it also got access to IBM's entire catalog of software at no additional cost. The price of the hardware and the software was bundled. IBM's software was very good and very expensive to produce, and it was viewed as a major reason for going with IBM in the first place. A lot of this software was more generic than the earlier custom projects that handled a single, very specific use case.
This made it hard for IBM's competitors to compete on either the software or the hardware front. Other hardware manufacturers didn't have the catalog of software that IBM had, so even if a customer didn't buy IBM, it would still have to solve its software problem. Software makers had a hard time selling their products because companies preferred to rely on IBM, especially if they were already spending all that money on hardware. In the end, companies just preferred to buy the bundled package of hardware and software from IBM.
The Great Unbundling
IBM's hardware/software advantage was becoming very apparent. That power over the market was just one of the things that CDC, one of the Seven Dwarfs, took issue with. According to Martin Campbell-Kelly, IBM was also known for premature product announcements and predatory pricing. All of this was enough that CDC filed a lawsuit against IBM in December 1968.
Days before this lawsuit was filed, IBM miraculously announced that it would unbundle its hardware and software. This was just a teaser, though, because IBM didn't have all the details yet; it set June 1969 as the date it would release all the information. It's believed that IBM made this half-baked announcement in an attempt to show it was proactively governing itself and possibly to thwart further antitrust action.
IBM organized a task force to determine the logistics and details of the unbundling, giving it six months to come up with the plan.
The next bomb to drop was the Department of Justice's formal antitrust investigation, announced a bit over a month later in January 1969. The walls were closing in on IBM in both federal court and the court of public opinion.
When July rolled around, IBM was true to its word and formalized the unbundling and how the pricing was going to work.
IBM announced the date the new unbundling and pricing would go into effect: January 1, 1970.
The Creation of the Software Industry
This date, the Unix epoch, marked a pivotal moment for the software industry. From this point on, IBM put a price on the software it was providing to customers along with the hardware.
Martin A. Goetz was a product manager at ADR who worked on Autoflow (that first big commercial success) and was interviewed as part of the Computer History Museum's oral history series. In the interview he said:
IBM was a significant part of the story because they had conditioned [the] market and one of the problems was reconditioning the market to buy. It was one thing to have products to sell, but the biggest problem was the mental attitude of the people buying. "So what is this thing I'm buying?"
It would take years for companies to start realizing that software was something they could purchase. Lee Keets was an engineer at IBM through much of the '60s and left in 1967 to start his own company. He argued that the unbundling was a legitimizing force for the market:
Legitimatized is the right word. What IBM did is it gave the customer the message that buying software is not only okay, it's required. And it also put a flag in front of the customer saying if you're going to spend your money you're going to have to justify the expense. And this didn't happen over night. That's the other thing.
The few software companies that were out there could suddenly see the value IBM placed on its software, and now they could compete with the software alone rather than with the hardware too. The programmers who had been working as contractors could now start selling software of their own. This led to a flourishing of innovation in the 1970s.
The 1970s was the decade the software industry became a thing of its own. Throughout the decade, countless software companies were formed and started selling their products. This brought in new programmers and greatly increased what the industry was capable of.
Martin Campbell-Kelly argued that this was the date of the creation of the software industry. JoAnne Yates, another historian of computing, argued that it was an "inflection point."
The Unix epoch is a date that is hardcoded into computers everywhere. It's possibly the most well-known date in computing and will be around for as long as we have Unix-derived systems.
An epoch is, by definition, the beginning of a distinctive period in the history of something. Unfortunately, as it stands, the Unix epoch is just an arbitrary date. Sure, it's close to the time Unix was first being developed, but as we learned, that's not how it was chosen.
While the Unix epoch is historically arbitrary, it just so happens to coincide with a momentous date for the computing industry. It's an incredible coincidence given the importance the date had for the software industry, and it's especially meaningful for software engineers everywhere because it was the date the market and the industry were legitimized.
We should remember the Unix epoch as the date that the software industry was created.