How “Personal” Could Personal Computing Become?
The Evolution of Personal Computing
Personal computers have evolved steadily over the past 50 years, from BASIC (Beginner’s All-purpose Symbolic Instruction Code) programmable machines to traditional, fully assembled desktops to modern laptops. What will the personal computer look like in the years to come?
Personal computers have evolved steadily over the past 50 years, starting with rudimentary electronic kits of interest only to hobbyists and technicians, followed by the first successfully mass-marketed desktop computers in the 1980s. Now, half a century later, with mobile devices in common use, we’re contemplating the possible death of the PC. The decline of the personal computer has been debated for years, but irrespective of the fate of traditional desktops and laptops, one thing is certain: personal computing will change in the years to come.
The Early Days: Desktop Computers
The mid-1960s heralded the arrival of the world’s first desktop computer, the Programma 101. The machines that followed in the early 1970s, generally called “microcomputers” at the time, often came as electronic kits with no power supply, case or keyboard and sold primarily to the few hobbyists and technicians who understood how to use them. The mid-1970s, however, finally brought mass-market microcomputers to everyday people: inexpensive microprocessors made computers smaller, more affordable and more interesting to the average user.
By the late 1970s, the first successfully mass-marketed personal computers had arrived: the Commodore PET, the Apple II (successor to the hand-built Apple I) and Radio Shack’s TRS-80, followed in 1981 by the IBM 5150. These machines delivered far better personal productivity, programming and gaming experiences than anything before them. The era of the personal computer had officially arrived: mass-market, ready-assembled systems allowed people outside the research and technical communities to incorporate computers into their daily lives.
By the time desktop computers acquired a graphical user interface with desktop icons, a mouse, Ethernet networking and built-in programming environments, a new era in computing had begun. The average nontechnical person could now use a computer—it had truly become personal. But much richer experiences were still ahead.
From Personal Computers to Portable Computers
Bill Gates famously envisioned “a computer on every desk and in every home.” Portable computers went a step further, and even Microsoft had to adjust. With the portability to follow users throughout their days, laptops and notebooks added a more personal touch and greater freedom to computing. Users could now escape the stationary desk and carry their devices anywhere. They could still sit at a desk, which may remain the most efficient way to type and move a cursor, but now they had a choice. Even more fluid computing experiences, however, were on the horizon.
Hand-held Form Factors and Mobile Computing
Smartphones and tablets made computing more readily accessible than any devices before them. Arguably, though, smartphones cannot yet replace traditional desktop or laptop computers—certainly not for the majority of use cases. In fact, market analysts generally do not count smartphones and tablets as PCs at all; they are classified as mobile devices, descendants of the personal digital assistants (PDAs) of the 1990s.
But is it fair to see personal computing as an experience strictly limited to spreadsheets, email, documents and games? Perhaps not. Mobile devices and, increasingly, wearables may yet revolutionize personal computing. The first wave of wearable technologies may not have taken the market by storm, but in the future wearables may blend into our attire so subtly that we hardly notice them, making our relationship with technology even more intimate.
One might argue that smartphones, tablets and wearables truly represent the future of personal computing. Yet despite the major blows the PC market has suffered from smartphone and tablet sales, it’s still hard to imagine PCs disappearing entirely from our homes or workplaces.
Either way, the increasingly intimate nature of our interactions with these devices will make privacy and security even more serious concerns. That is a challenge we must overcome: we are already giving away a great deal of our privacy, and if we received more meaningful experiences in return for sharing our information, it might seem like a fair deal.
Machine Learning Is Key
Carlos Guestrin, a long-time researcher, academic and founder of a company that provides machine learning software and training, believes that “in the next five years, every successful breakthrough app is going to use machine learning at its core. Machine learning is what’s going to make an app truly useful and different to other things out there.”
Guestrin adds that the key barriers to widespread industry adoption of machine learning are skills and resources: few companies can assemble the large teams that the likes of Google use to build innovative intelligence applications. As the technology matures and spreads, however, we can expect machine learning tools and APIs to emerge that address this gap.
Transcending Traditional Computing Devices
In the future, it will probably be difficult, and perhaps irrelevant, to think of personal computing as an experience restricted to what we have traditionally deemed “computing devices,” making the question of whether or not smartphones, tablets and wearables can be considered “PCs” inconsequential. Instead, the future of personal computing may very well emerge in the form of connected devices throughout our surroundings that deliver pervasive computing experiences.
Released over six years ago, this video from MIT’s Media Lab offers one vision of what personal computing might look like in 2019. For personal computing to reach that level of intuitiveness, artificial intelligence, robotics and machine learning will have to mature rapidly. We haven’t quite gotten there yet, but with four years to go we can remain optimistic as these technologies continue to evolve.