Innovation Over Time: A History of Software Development
The history of software development spans centuries, not decades. Since the creation of the first adding machine hundreds of years ago, our civilization has been improving technology in ways once deemed impossible. Along with those hardware upgrades have come software improvements that make our lives easier, from calculators to calendars to retail software that lets customers use their debit cards to buy a sweater. The evolution of software from the first computer up until today is a fascinating look into how we manipulate the world around us, including the digital world that we’ve created.
While some may think computers didn’t exist in any form until the 20th century, the first real computer was designed in the early 19th century. Charles Babbage conceived the Difference Engine, a mechanical calculator intended to compute mathematical tables automatically. He then designed the Analytical Engine, which was much closer to the computer we know today: using punch cards, a human could program the engine to work through several different calculations, and its output could be printed or graphed. Though neither machine was fully built in Babbage’s lifetime, this was the original custom software development, and it set the stage for the 20th century.
Around 1940, the world began to see computers as we know them today, more or less. One of the earliest models was the Atanasoff-Berry computer. It was too large to fit through a doorway but could be used to solve linear equations. The first program ever run from a computer’s electronic memory was written by Tom Kilburn for the Manchester Baby in 1948, and it calculated the greatest proper divisor of 2 to the power of 18. It took 52 minutes to complete the calculation! This achievement led to a boom in software development, including the creation of the first high-level programming language to see wide use, FORTRAN. It was a language like those we might see today, with English-like statements standing in for lower-level machine instructions, making programs easier to read and write. It was released in 1957, and FORTRAN II followed in 1958, adding subroutines so that code could be reused. The reason this language in particular was so important was how widely it was applied. Up until this time, every program was essentially written for one machine in that machine’s own instruction set. FORTRAN was implemented on many different machines made by many different manufacturers, allowing people to transfer programs from one machine to another. This was further standardized by FORTRAN 66, the very first standard for a programming language.
- The Difference Engine and Analytical Engine
- The History of FORTRAN
- A History of Computer Programming Languages
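To put that 52-minute milestone in perspective, here is a small Python sketch of the same task: finding the greatest proper divisor of 2 to the power of 18. This is purely illustrative (the original ran as raw machine instructions by repeated subtraction, not anything like this), and on modern hardware it finishes in a fraction of a second.

```python
def greatest_proper_divisor(n: int) -> int:
    """Return the largest divisor of n that is smaller than n itself."""
    # The largest proper divisor can never exceed n // 2, so start there
    # and count down until something divides n evenly.
    for candidate in range(n // 2, 0, -1):
        if n % candidate == 0:
            return candidate
    return 1  # n is prime (or 1): its only proper divisor is 1

print(greatest_proper_divisor(2**18))  # 131072, i.e. 2**17
```

Since 2**18 is a power of two, the very first candidate checked (2**17) divides it, so the loop exits immediately.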
The development of the personal computer changed software drastically. Now, any person could have a computer in their living room; there was no need for a server room just for one computer terminal. The Apple II revolutionized computing, and the retail sale of software quickly followed. Programs could be loaded onto floppy disks and given to users to run on their own machines. These computers did not have large hard drives to begin with; all of the software needed to fit on a floppy disk, so programs had to be kept simple.
During this time, programming languages were becoming more and more familiar to the modern developer. This is the era in which C, Pascal, and SQL came into widespread use. Other influential languages, such as Simula and Prolog, never reached the same mainstream popularity, but they helped build the foundation for the languages of today, with Simula pioneering object-oriented programming and Prolog pioneering logic programming.
In the 1980s, while personal computer programs existed, it was business programming that really pushed the industry forward. Large-scale systems were created, and rapidly advancing hardware became available to run bigger and more complex systems. Mainstays of today such as Microsoft Office were released at the end of this period, and the first Web browsers followed in the early 1990s. The advent of the Internet made finding programs easier, and it also began the earliest custom Web app development. While creating websites isn’t always seen as “programming” today, it was very technical in the mid-1990s, when the Internet was still in its infancy.
- A Very Brief History of Object-Oriented Programming
- Learning BASIC Like it’s 1983
- In the 1980s, it Was All About the Software
Mobile Computing Devices
The first publicly available predecessor of the modern smartphone came in 1994: IBM’s Simon. It featured a touchscreen, a calendar, and an email client. It did not let you surf the Internet, as Web browsers had only recently been invented. Soon after, though, there was a boom in the development and sale of these PDAs, or personal digital assistants.
For mobile phones, the operating system had to be completely different from that of a standard computer, and the programming languages were often different as well. On the first smartphones, it was impossible to add new programs; a phone came with what it came with and had no room for new software, even if it could have been loaded on.
Soon, however, development tools for mobile phones became accessible enough for far more programmers to use them. By the 2000s, developers were creating apps for smartphones, and these apps and devices have only grown more sophisticated from then to the present day.
Software development today stands on the shoulders of what came before. C is still a popular programming language, although it’s been edged out by its younger brother C++. Mobile operating systems are alive and well, as are the programming languages used to code them. And we continue to have more digital space in the same physical space, pushing the limits of technology farther and farther.
One of the biggest movements in modern software development has been including chips in previously “dumb” devices, such as a slow cooker. In the past, a slow cooker would have a dial with three options: off, low, and high. A modern slow cooker, however, might contain a chip with unique programming on it that lets you set the amount of time it should cook for and the temperature. Then, it will switch automatically to a warming setting to prevent the food from overcooking. This is just one example of a way in which custom software development has changed how we interact with technology.
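The logic running on such a chip can be sketched in a few lines. The following is a hypothetical Python example: the names (CookerController, WARM_TEMP_C) and the specific temperatures are invented for illustration, not drawn from any real appliance’s firmware.

```python
WARM_TEMP_C = 60  # assumed holding temperature once cooking finishes

class CookerController:
    """Hypothetical sketch of a smart slow cooker's timer logic."""

    def __init__(self, cook_temp_c: int, cook_seconds: int):
        self.cook_temp_c = cook_temp_c    # user-selected cooking temperature
        self.cook_seconds = cook_seconds  # user-selected cooking duration
        self.start_time = None

    def target_temp(self, now_seconds: float) -> int:
        """Return the temperature the heating element should aim for now."""
        if self.start_time is None:
            self.start_time = now_seconds  # first call marks the start of cooking
        elapsed = now_seconds - self.start_time
        if elapsed >= self.cook_seconds:
            return WARM_TEMP_C   # cooking done: hold food warm, don't overcook
        return self.cook_temp_c  # still within the cooking window

# Example: cook at 90 degrees C for 4 hours, then hold warm.
cooker = CookerController(cook_temp_c=90, cook_seconds=4 * 3600)
print(cooker.target_temp(0))         # 90 (cooking)
print(cooker.target_temp(5 * 3600))  # 60 (switched to warming)
```

A real device would run a loop like this on a timer interrupt and feed the target temperature to a thermostat, but the core idea, a timed switch from cooking to warming, is exactly this simple.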
In addition, custom Web app development has exploded with the Internet, and a lot of the resulting software has moved back into the server room, now that we don’t need to be tethered to servers that are in the same room as we are. Nowadays, it’s much more convenient to have an entire program running on a server somewhere else and served to you online, either on a desktop computer or a wireless device.