Though machines built to help with complex mathematical functions had been around for some time, the last quarter of the 20th century saw computers become one of the most common tools of our modern world. Let's go over some of the events that brought us to where we are today.

What's The Difference Between A Mac And A PC?

The term "PC" stands for "personal computer," but it is generally used to describe any computer made by a company other than Apple. "Mac" refers to the Apple Macintosh, and is often used loosely for any computer Apple makes. PCs typically run the Windows operating system, and desktop models traditionally come in two parts: a tower and a separate monitor. Apple became recognizable for computers like the iMac, which housed everything in a single unit. Because of the PC's open architecture, many companies produce PCs, and they aren't all equal, since each uses different hardware. Apple is known for cultivating the user experience, which is why its computers are popular with people who want their laptops to work much like their tablets and phones. PCs offer more variety and are more widely used, so it's a good idea to learn to use a PC, since that's what you'll most likely find in the workplace. As for which is better, that's an argument that will seemingly never end.

What Is Binary?

Binary is the most basic language of computing: every instruction and piece of data is ultimately reduced to a sequence of 0s and 1s. Alan Turing's pioneering work showed how machines could use such encodings to perform computations far faster than humans, and he introduced the idea of stored-program machines that keep their own instructions in memory. Modern computers still run on binary, but modern programs are far too complex for developers to write them out as strings of digits. Programming languages exist so that developers can create and edit programs that are then translated into the simple commands a machine can understand. Thankfully, you won't have to punch cards or type out ones and zeros in order to build a website.
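
To make the idea concrete, here is a small Python sketch (an illustration added for this article, not part of the history itself) that prints the binary patterns behind two ordinary characters:

    # A minimal sketch: every character a computer stores is ultimately a pattern of bits.
    for ch in "Hi":
        code = ord(ch)              # the character's numeric code, e.g. 'H' -> 72
        bits = format(code, "08b")  # that number written as eight binary digits
        print(ch, code, bits)       # prints: H 72 01001000  /  i 105 01101001

A compiler or interpreter performs the same kind of translation on a much larger scale, turning human-readable source code into the bit patterns a processor actually executes.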

More Information

From room-sized electronic machines such as ENIAC, developed during the Second World War, computers have come a long way, going from performing simple calculations to being a necessary part of nearly every business and household. Let's take a look at the evolution of personal computers.

One of the biggest innovations that made personal computing possible was Intel's single-chip CPU, or microprocessor, which allowed a single circuit board to run an entire computer. In 1974, Intel released its 8-bit "8080" microprocessor, the chip that would spark the PC revolution. The first computer kit based on this chip was the "Altair 8800," which ultimately became the first commercially successful personal computer. The Altair was a build-it-yourself kit created by Ed Roberts, the founder of the electronics company MITS.

The Altair was programmed by flipping toggle switches on its front panel and reading results from a row of lights, a process too tedious even for computer enthusiasts. In 1975, Bill Gates and Paul Allen contacted MITS to propose an implementation of the BASIC programming language for the machine, and developed the interpreter needed to run it. The result was "Altair BASIC," the first product of Microsoft.

Hobbyist groups multiplied after the launch of the Altair, including the Homebrew Computer Club. One of its members, Steve Wozniak, was inspired to design his very own microcomputer, and he introduced his prototype to the group in 1976. It was the first low-cost personal computer that could be connected to a monitor and operated through a text interface. It was Steve Jobs, his friend and fellow club member, who suggested they start selling it. Apple's first product, the "Apple I," was released shortly thereafter.

In 1977, three home computers known as the "1977 Trinity" pushed the PC into the mainstream market. Each could be bought as a complete, ready-to-use set, and all three were bundled with BASIC interpreters to make them easy for non-computer wizards to use. The first was the "Apple II," which boasted rudimentary color graphics and sound output. The next was the "TRS-80 Model I" from the Tandy Corporation; it had fewer advanced features, but sold for about half the cost of the Apple II. The third was the "Commodore PET 2001," which combined all the components of the computer, including the keyboard and monitor, into one device.

In 1979, the first spreadsheet program, "VisiCalc," was developed for the Apple II by Dan Bricklin and Bob Frankston. This application turned the personal computer from a hobbyist gadget into a serious business tool, and demand grew in both businesses and households.

It was in 1980 that IBM, the largest computer company of the time, realized it had to adapt to these advances. A group of engineers led by PC development head Bill Lowe came up with an "open architecture" strategy: buying and assembling non-IBM components instead of building everything from scratch, including Intel chips for the CPU and a license for Microsoft's DOS as the operating system. This kept production costs low and created partnerships with other firms. IBM's "5150," launched within a year, became an instant hit.

Although the open architecture helped hardware and software businesses thrive, it also meant more competition. Eventually, competitors like Compaq and Dell were able to legally create their own IBM-compatible PC clones, which also licensed MS-DOS. Soon, Microsoft became the dominant maker of operating systems. Apple, on the other hand, maintained its "closed architecture" approach, building its own OS, displays, keyboards, printers, and everything else. This allowed it to control the user experience and the quality of its computers.

In 1983, the "Apple Lisa" was launched, the first personal computer on the market with a graphical user interface and a mouse. The GUI let users interact with the machine through icons and other visual elements, using the mouse to navigate the screen and select commands. However, the Lisa carried a hefty price tag, and few consumers were willing to pay that much for its advanced features. Apple then released the "Macintosh" in 1984, sold for roughly a quarter of the cost of the Lisa; over 70,000 units were purchased in its first hundred days.

Although Apple brought GUI-based PCs to the market, the idea had been around since the early seventies. The first computer with a graphical user interface was the "Xerox Alto," developed by researchers at Xerox's Palo Alto Research Center in 1973. However, it was only meant for internal use and was never sold commercially. It was when Apple staff were invited to visit PARC as part of a deal between the two companies that Steve Jobs first saw the machine. He immediately recognized it as the future of computing and seized the opportunity to build on it.

In 1985, a year after the Macintosh, Microsoft launched its own GUI-based product, "Windows 1.0," a graphical environment that ran on top of MS-DOS. Although its graphics were much cruder than the Macintosh's, the company's broad partnerships gave it a far larger customer base. By the late eighties, laptops started gaining mass appeal, especially among busy, on-the-go users. While the "Osborne 1" from 1981 is considered the first portable microcomputer, it was the "Compaq SLT 286" in 1988 that was billed as the first battery-powered laptop with desktop-like performance, built-in hard disk and floppy drives, and an LCD screen.

In 1989, the World Wide Web was invented, a catalyst that would push the boundaries of PC technology. Tim Berners-Lee, a researcher at CERN, originally proposed the Web as a way for scientists to share information. A year later, he created the "Hypertext Markup Language," or HTML, which made linking documents together and retrieving information far more efficient.
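
To show what "hypertext" linking means in practice, here is a small, self-contained Python sketch (purely illustrative; the file names and page text are invented for this example) that writes two tiny pages, where the first links to the second using HTML's anchor tag:

    # Illustrative only: file names and page contents are made up for this example.
    from pathlib import Path

    # A page that links to another page via an <a href="..."> anchor tag.
    Path("index.html").write_text(
        '<html><body><h1>PC History</h1>'
        '<p>Read about the <a href="altair.html">Altair 8800</a>.</p>'
        '</body></html>'
    )

    # The page being linked to.
    Path("altair.html").write_text(
        '<html><body><h1>Altair 8800</h1>'
        '<p>The 1975 build-it-yourself kit.</p></body></html>'
    )

Opening index.html in any browser and clicking the link jumps straight to altair.html; chains of links like this are what turn isolated documents into a web of information.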

Although the Internet began in 1969 as a packet-switching network called the ARPANET, built for the US Department of Defense to link research institutions, it was the invention of the World Wide Web that made accessing and sharing information practical for the general public. This collection of web pages grew into an entire network of resources that anyone connected to the net could reach. Soon there was an explosion of web browsers like Internet Explorer and search engines like Google, which made surfing the net much easier. By the end of the decade, users could also connect to the Internet without wires, through Wi-Fi.

At the start of the 21st century, the rise of 64-bit processors combined with the evolution of the Internet to allow PCs to run more demanding software, such as massively multiplayer online games. By the mid-2000s, social media sites such as Facebook and YouTube had been born, and advancing graphics hardware fueled growing interest in virtual reality.

By 2007, advances in smartphones, and later tablets, were transforming mobile devices into handheld personal computers. This evolution took a major step with the arrival of the iPhone, which brought many computer functions into the palms of consumers and further simplified how users interact with applications and with each other. While there are plenty of innovations we didn't cover, these are some of the most revolutionary. As technology continues to evolve, PCs are becoming more of an extension of ourselves, letting us carry computers wherever we go and interact with the world through our fingertips.

