Today I will be examining the history and future of computers
and relating this topic to the lessons I have learned throughout INT100. A
survey of history will reveal several precursors to what we now call a
computer, from the Pascaline mechanical calculator to the Jacquard weaving
machine, which used punched cards as a form of mechanical memory containing
pattern instructions for a loom. Perhaps most closely associated with early computing is the Difference Engine, developed by Charles Babbage in the early 1800s. The Difference Engine was an assembly of mechanical components designed to produce mathematical tables automatically through digital computation (Copeland, 2000). Computing machines continued to rely on mechanical movement until researchers in the 1940s developed a device that used electronic vacuum tubes instead to decrypt secret messages (Copeland, 2000). However, these machines were limited to specialized functions and were
not readily adaptable to more general-purpose tasks. The development of more
sophisticated memory arrangements involving magnetism further propelled
computing into the modern age.
Vacuum tubes require relatively high voltages to operate and occupy considerable physical space, so it is natural that a smaller, more efficient mechanism would be preferred if it could perform the same functions. That mechanism arrived in the form of the transistor, which revolutionized the development of computing machines and allowed dramatically reduced form factors. We can observe the result today: computers have become increasingly small, and the density of transistors that can occupy a silicon substrate has increased exponentially. For much of their early history, computers were so large
that they occupied entire rooms. The 1970s saw the introduction of personal
computers such as the Altair 8800 and the Apple II, which sparked the
industries we see today that have made computers accessible to an unprecedented
number of people (Ceruzzi, 2010).
Computer development certainly did not stop in the 1970s; since then, machines have continued to become smaller, more powerful, and cheaper at an accelerating pace. Partly due to the ever-increasing
density of transistors that can occupy a silicon wafer, processors have
continued to become faster and more capable of performing increasingly complex
tasks every year. The development of other hardware components, such as
solid-state drives, has recently increased the speed at which a computer can
access data.
Although the future of computer technology is unpredictable, recent developments in computer networking and hardware hint at some trends we might see grow in the coming years. Cloud computing uses the advanced
state of computer networks to transfer data storage from local hardware to
massive offsite servers. Cloud computing provides convenient access and
portability to users working across multiple devices, often at the cost of a subscription
fee. We also see a growing philosophy of software as a service rather than as a
product. In the past, most software products were purchased through a one-time
transaction for an unlimited license to use the software. More recently,
software companies have instead charged recurring subscription fees for a
time-limited license to use their software. When the time limit expires, the
user will need to renew their subscription or forfeit their license to use the
software. This trend shows no sign of slowing, and I would expect it to continue to expand in the future.
Recently, the term “quantum computing” has increasingly come up in speculation about the future of computing. Quantum computing stems from the
well-known but difficult-to-understand field of quantum physics (Bova et al.,
2021). Currently, computer algorithms are rooted in classical physics, whereby
an object can occupy only one point in space and time. In terms of binary logic, a switch can be either on or off, which is the fundamental property on
which all computing is currently based. The advantage of quantum computing would be that information does not need to exist in either an on or an off state; it could exist in both at once, or somewhere in between. This is an exciting prospect because it would mean that a computer would not have to cycle through many iterations of ons and offs to perform its functions as it currently does, and quantum computing promises much faster computing speeds as a result. Functions presently limited by the serial nature of digital logic could potentially be made possible, or made much easier, with the advent of quantum computing.
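To make that contrast concrete, here is a minimal, purely illustrative Python sketch. The names and the toy measure function are my own assumptions for illustration, not how a real quantum computer is programmed: it simply compares a classical bit, which is always definitely 0 or 1, with a qubit described by two amplitudes that only collapse to 0 or 1 when measured.

import random

# A classical bit is definitely 0 or 1 at any moment.
classical_bit = 1

# A qubit can be described by two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; until measured, it is not simply "on" or "off".
alpha, beta = 2 ** -0.5, 2 ** -0.5  # an equal mix of 0 and 1

def measure(a, b):
    # Collapse the state: return 0 with probability |a|^2, otherwise 1.
    return 0 if random.random() < abs(a) ** 2 else 1

# Preparing and measuring the same state many times gives roughly half 0s and half 1s.
results = [measure(alpha, beta) for _ in range(1000)]
print("fraction of zeros:", results.count(0) / 1000)

Of course, this little simulation still runs on ordinary digital logic; the promise of real quantum hardware is that the amplitudes are physical rather than simulated, which is where the speed advantages described above would come from.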
Below, I will more directly relate some aspects of the history of computers to the lessons I learned in INT100.
Fundamentals and History
The history and future of computers relate directly to the fundamentals of information technology. By observing the historical developments in hardware, software, and philosophy that have brought us to the current state of information technology, we can identify patterns that inform our predictions for the future and inspire us to actively seek the future we would like to see. The history and future of computers show us how the concepts of information technology and computer science have developed over the years, and how the development of major hardware components has directly influenced the usage and availability of computers. Since their introduction, computers have operated on digital logic, which has taken us from simple punch cards to the incredible computing power we see today. Quantum computing is the first prospect to challenge this fundamental means by which computers work, and it offers exciting possibilities for the future of computing.
Hardware, Programming Languages, and Software
Throughout the history of computing, programming languages have developed alongside advances in hardware components to take advantage of the new applications computers make possible. The machine language of the most primitive early computer interfaces has evolved into complex and adaptable high-level languages that abstract away the granular instructions needed to direct a machine, giving humans a coherent and readable way to express what a computer should do.
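As a small illustration of that abstraction, the sketch below uses Python only as an example of a modern high-level language; the variable names are hypothetical. A single readable statement stands in for the many numerically coded instructions an early programmer would have had to write by hand.

# A single high-level statement: total a short list of numbers.
numbers = [2, 4, 6, 8, 10]
total = sum(numbers)   # one readable line of Python
print(total)           # prints 30

# In early machine language, the same idea would have required many explicit,
# numerically coded steps: load a value from memory into a register, add it to
# a running total, advance a counter, and jump back until the loop finishes.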
Databases, Networks, and Security
Application software has historically been developed to provide user-friendly interfaces through which people can control specific aspects of a computer’s functions. Individual computers and computer networks continue to develop, allowing information in databases to be accessed and processed across multiple platforms. An application like Microsoft Excel might allow a user to process information in a database, while growing access to the internet and computer networks has made that information easier to share with others.
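As a hedged sketch of that idea, the short Python example below builds a tiny in-memory database with the standard sqlite3 module (the sales table and its values are invented for illustration) and runs the kind of summary a spreadsheet user might otherwise build with a pivot table.

import sqlite3

# Create a small, temporary database entirely in memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("North", 120.00), ("South", 80.50), ("North", 64.25)],
)

# Total the sales per region -- the sort of summary an application such as a
# spreadsheet presents through a friendlier interface.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)

conn.close()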
As
computer networks have expanded throughout history, they have become more
vulnerable to security breaches. Indeed,
computers have a long history with security, as some of the earliest computers
were developed to breach enemy security by decrypting secret codes. Today,
security threats come in the form of malware, DDoS attacks, and various other
malicious activities designed to compromise computing systems. As security
measures become more robust, so do the methods that malicious actors use in their
attacks. Therefore, network security has had to evolve throughout history, and it will need to continue evolving in the future.
References
Bova, F., Goldfarb, A., & Melko, R. (2021, July 16). Quantum computing is coming. What can it do? Harvard Business Review. https://hbr.org/2021/07/quantum-computing-is-coming-what-can-it-do
Ceruzzi, P. (2010, July). "Ready or not, computers are coming to the people": Inventing the PC. OAH Magazine of History, 24(3), 25–28. https://www-jstor-org.proxy-library.ashford.edu/stable/25701418
Copeland, B. J. (2000, December 18). The modern history of computing. The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/archives/win2020/entries/computing-history/