The Philosophical Origins of the Computer
Throughout most of human history, the closest thing to a computer was the abacus, though it is really considered a calculator, since it requires a human operator. Computers, by contrast, perform calculations automatically by following a series of built-in commands called software.

In the twentieth century, breakthroughs in technology allowed for the ever-evolving computing machines we see today. But even before the advent of microprocessors and supercomputers, there were certain notable scientists and inventors who helped lay the groundwork for a technology that has since drastically reshaped our lives.

The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numeral system. Developed by the German philosopher and mathematician Gottfried Wilhelm Leibniz, the system is a way of representing numbers using only two digits: 0 and 1. Leibniz's system was partly inspired by philosophical interpretations of the classical Chinese text the I Ching (Book of Changes), which explained the universe in terms of dualities such as light and darkness and male and female. While there was no practical use for his newly codified system at the time, Leibniz believed that machines might someday make use of these long strings of binary numbers.
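
To make the idea concrete, here is a small modern sketch in Python (an anachronism, of course, and no part of Leibniz's own work) showing how an ordinary number can be written as, and recovered from, a string of 0s and 1s:

    def to_binary(n):
        # Repeatedly divide by 2; the remainders are the binary digits.
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits
            n //= 2
        return bits or "0"

    def from_binary(bits):
        # Each digit doubles the running value, then adds 0 or 1.
        value = 0
        for bit in bits:
            value = value * 2 + int(bit)
        return value

    print(to_binary(13))        # prints 1101
    print(from_binary("1101"))  # prints 13

Thirteen, for example, becomes 1101: one eight, one four, no twos, and one unit.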

In 1847, the British mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His "Boolean algebra" was in fact a system of logic, with mathematical formulas used to express statements in logic. Just as importantly, it employed a binary approach in which the relationship between different mathematical quantities would be either true or false, 1 or 0. Although there was no obvious application for Boolean algebra at the time, another mathematician, Charles Sanders Peirce, spent decades expanding the system, and in 1886 he determined that the calculations could be carried out with electrical switching circuits. In time, Boolean logic would become instrumental in the design of computers.

The British mathematician Charles Babbage is credited with having assembled the first mechanical computers, at least technically speaking. His early-19th-century machines featured a way to input numbers, memory, a processor, and a way to output the results. His initial attempt to build the world's first computer, which he called the "Difference Engine," was a costly endeavor that was all but abandoned after more than £17,000 had been spent on its development. The design called for a machine that calculated values and printed the results automatically onto a table. It was to be hand-cranked and would have weighed four tons. The project was ultimately axed after the British government cut off Babbage's funding.
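
As a modern illustration, and not anything Boole himself wrote, the correspondence between his logic and the two binary digits can be sketched in a few lines of Python:

    # True and false behave like the digits 1 and 0: AND acts like
    # multiplication, OR like a sum capped at 1, and NOT like 1 - x.
    def AND(a, b): return a * b
    def OR(a, b):  return min(a + b, 1)
    def NOT(a):    return 1 - a

    # Truth table for the statement "a AND (NOT b)":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, AND(a, NOT(b)))

Peirce's later insight was that exactly these operations can be realized physically by switching circuits, with an open or closed switch standing in for 0 or 1.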

This forced the inventor to move on to another idea, which he called the Analytical Engine: a more ambitious machine for general-purpose computing rather than just arithmetic. Though Babbage was unable to follow through and build a working device, his design featured essentially the same logical structure as the electronic computers that would come into use in the 20th century. The Analytical Engine had integrated memory, a form of information storage found in all computers. It also allowed for branching, a computer's ability to execute a set of instructions that deviate from the default sequence order, as well as loops, which are sequences of instructions carried out repeatedly in succession (a modern sketch of both ideas follows this passage).

Despite his failure to build a fully functional computing machine, Babbage remained steadfast in pursuing his ideas. Between 1847 and 1849, he drew up designs for a new and improved second version of his Difference Engine. This time it calculated decimal numbers up to 30 digits long, performed calculations more quickly, and was simplified to require fewer parts. Still, the British government did not find it worth the investment. In the end, the most progress Babbage ever made on a prototype was completing one-seventh of his first Difference Engine.
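
As promised above, here is a toy Python rendering of branching and looping (purely illustrative; Babbage worked with punched cards and gears, not anything like this notation), showing a machine that normally steps through numbered instructions in order but can be sent backwards:

    # JUMP_IF_NONZERO deviates from the default top-to-bottom order,
    # and sending control backwards is all a loop is at this level.
    program = [
        ("SET", "count", 3),              # 0: count = 3
        ("SET", "total", 0),              # 1: total = 0
        ("ADD", "total", 5),              # 2: total += 5
        ("ADD", "count", -1),             # 3: count -= 1
        ("JUMP_IF_NONZERO", "count", 2),  # 4: if count != 0, go back to 2
    ]

    registers = {}
    pc = 0  # program counter: which instruction runs next
    while pc < len(program):
        op, reg, arg = program[pc]
        if op == "SET":
            registers[reg] = arg
        elif op == "ADD":
            registers[reg] += arg
        elif op == "JUMP_IF_NONZERO" and registers[reg] != 0:
            pc = arg
            continue  # skip the normal step to the next instruction
        pc += 1

    print(registers)  # {'count': 0, 'total': 15}

The jump instruction sends control backwards whenever the counter is still nonzero, so the two middle instructions are executed three times over.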

There were some notable achievements during this early era of computing. In 1872, Sir William Thomson, a Scottish-Irish mathematician, physicist, and engineer, invented the tide-predicting machine, considered the first modern analog computer. Four years later, his brother, James Thomson, came up with a concept for a computer that solved mathematical problems known as differential equations. He called his device an "integrating machine," and in later years it would serve as the foundation for systems known as differential analyzers. In 1927, the American scientist Vannevar Bush started development on the first machine to bear that name and published a description of his new invention in a scientific journal in 1931.

Up to that point, the evolution of computing had been little more than scientists dabbling in the design of machines capable of efficiently performing various kinds of calculations for various purposes. It was not until 1936 that a unified theory of what constitutes a general-purpose computer and how it should function was finally put forth. That year, the British mathematician Alan Turing published a paper entitled "On Computable Numbers, with an Application to the Entscheidungsproblem," which outlined a theoretical device called a "Turing machine" capable of carrying out any conceivable mathematical computation. In theory, the machine would have limitless memory, read data, write results, and store a program of instructions.
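
The essentials of the scheme are compact enough to sketch in Python. The rule table below is invented for illustration (a trivial machine that flips every bit on its tape) and is not drawn from Turing's paper:

    # A minimal Turing machine: a tape, a head position, a state, and a
    # rule table mapping (state, symbol) -> (new symbol, move, new state).
    rules = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", " "): (" ", 0, "halt"),  # blank cell: stop
    }

    tape = list("1011 ")
    head, state = 0, "scan"
    while state != "halt":
        new_symbol, move, state = rules[(state, tape[head])]
        tape[head] = new_symbol  # write the result back onto the tape
        head += move             # move the head left or right

    print("".join(tape).strip())  # prints 0100

The point is the architecture, not the example: with a suitable rule table and an unbounded tape, such a device can in principle carry out any computation at all.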

While Turing's computer was an abstract concept, it was a German engineer named Konrad Zuse who would go on to build the world's first programmable computer. His first attempt at developing an electronic computer, the Z1, was a binary-driven calculator that read instructions from punched 35-mm film. The problem was that the technology was unreliable, so he followed it up with the Z2, a similar device that used electromechanical relay circuits. It was while assembling his third model, however, that everything came together. Unveiled in 1941, the Z3 was faster, more reliable, and better able to perform complicated calculations. The biggest difference, though, was that its instructions were stored on external tape, allowing it to function as a fully operational program-controlled system. What is perhaps most remarkable is that Zuse did much of his work in isolation. He had been unaware that the Z3 was "Turing complete," or, in other words, capable of solving any computable mathematical problem, at least in theory. Nor did he know about similar projects under way around the same time in other parts of the world, most notably the IBM-funded Harvard Mark I, which debuted in 1944. More promising, however, was the development of electronic systems such as Great Britain's 1943 computing prototype Colossus and the ENIAC, the first fully operational electronic general-purpose computer, which was put into service at the University of Pennsylvania in 1946.

The next big leap in the evolution of computing came with the concept of the stored program, the groundwork for which was laid by the Hungarian mathematician John von Neumann, who had consulted on the ENIAC project. Up to that point, computers operated on fixed programs, and altering their function, say from performing calculations to word processing, required manually rewiring and restructuring them. The ENIAC, for example, took several days to reprogram. Ideally, Turing had proposed, the program should be stored in memory, where the computer itself could modify it. Von Neumann was intrigued by the concept, and in 1945 he drafted a report that laid out in detail a feasible architecture for stored-program computing. His published paper would be widely circulated among competing teams of researchers working on various computer designs. In 1948, a team in Britain introduced the Manchester Small-Scale Experimental Machine, the first computer to run a stored program based on the von Neumann architecture. Nicknamed "Baby," the Manchester machine was an experimental computer that served as the predecessor to the Manchester Mark I. The EDVAC, the computer design for which von Neumann's report was originally intended, was not completed until 1949.
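
A toy Python sketch (illustrative only, and much simplified from von Neumann's report) makes plain what "stored program" means: the instructions live in the same memory as the data, so changing what the machine does means loading new contents into memory rather than rewiring hardware:

    # Instructions and data share one memory. Cells 0-3 hold the
    # program; cells 6-8 hold the data it operates on.
    memory = [
        ("LOAD", 6),     # 0: acc = memory[6]
        ("ADD", 7),      # 1: acc += memory[7]
        ("STORE", 8),    # 2: memory[8] = acc
        ("HALT", None),  # 3: stop
        None, None,      # 4-5: unused
        40, 2, 0,        # 6-8: plain numbers, in the same memory
    ]

    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]  # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break

    print(memory[8])  # prints 42

To make this machine do something else entirely, one would simply overwrite cells 0 through 3 with different instructions, no rewiring required.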

The first modern computers bore little resemblance to the commercial products used by consumers today. They were unwieldy, hulking contraptions that often took up the space of an entire room. They also consumed enormous amounts of energy and were notoriously buggy. And since these early computers ran on bulky vacuum tubes, scientists hoping to improve processing speeds would either have to find bigger rooms or come up with an alternative. Fortunately, that much-needed breakthrough was already in the works. In 1947, a group of scientists at Bell Telephone Laboratories developed a new technology called point-contact transistors. Like vacuum tubes, transistors amplify electrical current and can be used as switches. More importantly, though, they were much smaller (about the size of a pill), more reliable, and used much less power overall. The co-inventors, John Bardeen, Walter Brattain, and William Shockley, would eventually be awarded the Nobel Prize in Physics in 1956.

While Bardeen and Brattain continued their research work, Shockley moved on to further develop and commercialize transistor technology. One of the first hires at his newly founded company was an electrical engineer named Robert Noyce, who eventually split off and formed his own firm, Fairchild Semiconductor, a division of Fairchild Camera and Instrument. At the time, Noyce was looking into ways to seamlessly combine the transistor and other components into one integrated circuit, eliminating the process in which they had to be pieced together by hand. Thinking along similar lines, Jack Kilby, an engineer at Texas Instruments, ended up filing a patent first. It was Noyce's design, however, that would be widely adopted.

Among the most significant impacts of the integrated circuit was that it paved the way for the new era of personal computing. Over time, it opened up the possibility of running processes powered by millions of circuits, all on a chip the size of a postage stamp. In essence, it is what has made the handheld devices we now carry everywhere so much more powerful than the earliest computers.