OBORO PRECIOUS ISIOMA
MEDIA AND COMMUNICATION, 100 LEVEL

Throughout human history, the closest thing to a computer was the abacus, which is actually considered a calculator, since it required a human operator. Computers, on the other hand, perform calculations automatically by following a series of built-in commands called software. In the 20th century, breakthroughs in technology allowed for the ever-evolving computing machines we see today. But even before the advent of microprocessors and supercomputers, certain notable scientists and inventors helped lay the groundwork for a technology that has since drastically reshaped our lives.

The Language Before the Hardware

The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numeral system. Developed by the German philosopher and mathematician Gottfried Wilhelm Leibniz, the system came about as a way to represent decimal numbers using only two digits: zero and one. His system was partly inspired by philosophical explanations in the classical Chinese text the "I Ching," which understood the universe in terms of dualities such as light and darkness and male and female. While there was no practical use for his newly codified system at the time, Leibniz believed that a machine could someday make use of these long strings of binary numbers.

In 1847, the English mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His "Boolean algebra" was actually a system of logic, with mathematical equations used to represent statements in logic. Just as important, it employed a binary approach in which the relationship between different mathematical quantities would be either true or false, 0 or 1. Though there was no obvious application for Boole's algebra at the time, another mathematician, Charles Sanders Peirce, spent decades expanding the system and eventually found, in 1886, that the calculations could be carried out with electrical switching circuits. In time, Boolean logic would become instrumental in the design of electronic computers (a brief modern illustration of both ideas appears a little later).

The Earliest Processors

The English mathematician Charles Babbage is credited with having assembled the first mechanical computers, at least technically speaking. His early 19th-century machines featured a way to input numbers, memory, a processor and a way to output the results. His initial attempt to build the world's first computer, which he called the "difference engine," was a costly endeavor that was all but abandoned after more than 17,000 pounds sterling had been spent on its development. The design called for a machine that calculated values and printed the results automatically onto a table. It was to be hand-cranked and would have weighed four tons. The project was eventually axed after the British government cut off Babbage's funding in 1842.

This forced the inventor to move on to another idea of his, called the analytical engine, a more ambitious machine for general-purpose computing rather than just arithmetic. Though he was never able to follow through and build a working device, Babbage's design featured essentially the same logical structure as the electronic computers that would come into use in the 20th century. The analytical engine had, for instance, integrated memory, a form of information storage found in all computers.
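Stepping back for a moment to the language before the hardware: Leibniz's binary representation and Boole's two-valued logic can both be illustrated with a short Python sketch. This is purely a modern illustration; nothing like it existed in their time, and the sample values are arbitrary.

    # Leibniz's idea: any decimal number can be written using only the
    # digits 0 and 1 (binary). The sample values here are arbitrary.
    for n in [0, 1, 2, 13]:
        print(n, "->", format(n, "b"))   # e.g. 13 -> 1101

    # Boole's idea: a statement is either true (1) or false (0), and
    # statements combine through logical operations such as AND, OR, NOT.
    p, q = True, False
    print(p and q)   # False: AND holds only when both inputs are true
    print(p or q)    # True:  OR holds when at least one input is true
    print(not p)     # False: NOT inverts its input

It is exactly this two-valued scheme that Peirce showed could be carried out by electrical switching circuits.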
The analytical engine also allowed for branching, the ability of a computer to execute a set of instructions that deviate from the default sequence, as well as loops, sequences of instructions carried out repeatedly in succession.

Despite his failure to produce a fully functional computing machine, Babbage remained steadfastly undeterred in pursuing his ideas. Between 1847 and 1849, he drew up designs for a new and improved second version of his difference engine. This time it calculated decimal numbers up to thirty digits long, performed calculations more quickly and was meant to be simpler, as it required fewer parts. Still, the British government did not find it worth their investment. In the end, the most progress Babbage ever made on a prototype was completing one-seventh of his first difference engine.

During this early era of computing, there were a few other notable achievements. A tide-predicting machine, invented by the Scotch-Irish mathematician, physicist and engineer Sir William Thomson in 1872, was considered the first modern analog computer. Four years later, his older brother James Thomson came up with a concept for a computer that solved mathematical problems known as differential equations. He called his device an "integrating machine," and in later years it would serve as the foundation for systems known as differential analyzers. In 1927, the American scientist Vannevar Bush started development on the first machine to be so named, and published a description of his new invention in a scientific journal in 1931.

Dawn of Modern Computers

Up until the early 20th century, the evolution of computing was little more than scientists dabbling in the design of machines capable of efficiently performing various kinds of calculations for various purposes. It wasn't until 1936 that a unified theory of what constitutes a general-purpose computer, and how it should function, was finally put forth. That year, the English mathematician Alan Turing published a paper titled "On Computable Numbers, with an Application to the Entscheidungsproblem," which outlined how a theoretical device called a "Turing machine" could be used to carry out any conceivable mathematical computation by executing instructions. In theory, the machine would have limitless memory, read data, write results and store a program of instructions.

While Turing's computer was an abstract concept, it was a German engineer named Konrad Zuse who would go on to build the world's first programmable computer. His first attempt at developing an electronic computer, the Z1, was a binary-driven calculator that read instructions from punched 35-millimeter film. The technology was unreliable, however, so he followed it up with the Z2, a similar device that used electromechanical relay circuits. It was in assembling his third model that everything came together. Unveiled in 1941, the Z3 was faster, more reliable and better able to perform complicated calculations. But the big difference was that the instructions were stored on external tape, allowing it to function as a fully operational program-controlled system.

What is perhaps most remarkable is that Zuse did much of his work in isolation. He had been unaware that the Z3 was "Turing complete," or in other words capable of solving any computable mathematical problem, at least in theory. Nor did he have any knowledge of other similar projects taking place around the same time in other parts of the world.
Among the most notable of these was the IBM-funded Harvard Mark I, which debuted in 1944. More promising still was the development of electronic systems such as Great Britain's 1943 computing prototype Colossus and ENIAC, the first fully operational electronic general-purpose computer, which was put into service at the University of Pennsylvania in 1946.

Out of the ENIAC project came the next big leap in computing technology. John von Neumann, a Hungarian mathematician who had consulted on the ENIAC project, would lay the groundwork for the stored-program computer. Up to this point, computers operated on fixed programs, and altering their function, say from performing calculations to word processing, required manually rewiring and restructuring them. The ENIAC, for example, took several days to reprogram. Turing had proposed ideally having the program stored in memory, where it could be modified by the computer itself. Von Neumann was intrigued by the concept, and in 1945 he drafted a report that laid out in detail a feasible architecture for stored-program computing. His published paper would be widely circulated among competing teams of researchers working on various computer designs.

In 1948, a group in England introduced the Manchester Small-Scale Experimental Machine, the first computer to run a stored program based on the von Neumann architecture. Nicknamed "Baby," the Manchester machine was an experimental computer and served as the predecessor to the Manchester Mark I. The EDVAC, the computer design for which von Neumann's report was originally intended, wasn't completed until 1949.

Transitioning Toward Transistors

The first modern computers resembled nothing like the commercial products used by consumers today. They were elaborate, hulking contraptions that often took up the space of an entire room. They also consumed enormous amounts of energy and were notoriously buggy. And since these early computers ran on bulky vacuum tubes, scientists hoping to improve processing speeds would either have to find bigger rooms or come up with an alternative.

Fortunately, that much-needed breakthrough was already in the works. In 1947, a group of scientists at Bell Telephone Laboratories developed a new technology called point-contact transistors. Like vacuum tubes, transistors amplify electrical current and can be used as switches. More importantly, they were much smaller (about the size of a pill), more reliable and used much less power overall. The co-inventors, John Bardeen, Walter Brattain and William Shockley, would eventually be awarded the Nobel Prize in Physics in 1956.

While Bardeen and Brattain continued doing research work, Shockley moved on to further develop and commercialize transistor technology. One of the first hires at his newly founded company was an electrical engineer named Robert Noyce, who eventually split off to form his own firm, Fairchild Semiconductor, a division of Fairchild Camera and Instrument. At the time, Noyce was looking into ways to combine the transistor and other components into one integrated circuit, to eliminate the process of piecing them together by hand. Jack Kilby, an engineer at Texas Instruments, had the same idea and ended up filing a patent first. It was Noyce's design, however, that would be widely adopted.

Where integrated circuits had their most significant impact was in paving the way for the new era of personal computing.
Over time, integrated circuits opened up the possibility of running processes powered by millions of circuits, all on a microchip the size of a postage stamp. In essence, that is what has given our ubiquitous handheld gadgets millions of times more power than the earliest computers.

DEFINITION

For most people, a computer is a machine used for calculation or computation, but it is actually much more than that. Precisely, "a computer is an electronic device for performing arithmetic and logical operations," or "a computer is a device or flexible machine that processes data and converts it into information." To understand the complete process of how a computer works, we must first understand the terms data, processing and information.

DATA

"Data" is a mere collection of basic facts and figures without any sequence. When data is collected as facts and figures, it has no meaning at that point; examples are names of students, names of employees, and so on.

PROCESSING

"Processing" is the set of instructions, given by the user, that act on the related data to output meaningful information that the user can apply. The work of processing may be calculations, comparisons or decisions taken by the computer.

INFORMATION

"Information" is the end point, the final output of any processed work. When the output data is meaningful, it is called information. (A short sketch tying these three terms together appears below.)

DEVELOPMENT OF COMPUTER

Strictly speaking, electronic data processing does not go back more than about half a century; such machines have existed only since the early 1940s. In the early days, when our ancestors lived in caves, counting was a problem. When they started using stones to count their animals or possessions, they never knew that this practice would lead to the computer of today. People gradually began following set procedures to perform calculations with these stones, which later led to the creation of a digital counting device, the predecessor of the first calculating device ever invented: the abacus.

THE ABACUS

The abacus is known to be the first mechanical calculating device, used to perform addition and subtraction easily and speedily. The device was first developed by the Egyptians in the 10th century B.C. but was given its final shape in the 12th century A.D. by Chinese educationists. The abacus is made up of a wooden frame in which rods are fitted across, with round beads sliding on the rods. It is divided into two parts called "Heaven" and "Earth": Heaven is the upper part and Earth the lower one. Any number can thus be represented by placing the beads in the proper positions.

NAPIER'S BONES

As necessity demanded, scientists started inventing better calculating devices. In this process, John Napier of Scotland invented a calculating device in the year 1617 called Napier's Bones. In this device, Napier used rods of bone for counting, with numbers printed on them. With these rods, one can do addition, subtraction, multiplication and division easily.

PASCAL'S CALCULATOR

In the year 1642, Blaise Pascal, a French scientist, invented an adding machine called Pascal's calculator, which represented the positions of digits with the help of gears in it.
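Before moving on to the remaining calculating devices, here is the promised sketch tying together the terms defined earlier. It is a minimal Python illustration of the data-to-information pipeline: raw data goes in, processing applies the instructions, and meaningful information comes out. The student names, marks and the pass mark of 50 are all invented purely for illustration.

    # Data: a mere collection of facts with no meaning on its own
    # (these names and marks are invented for illustration).
    data = [("Ada", 68), ("Tunde", 45), ("Chioma", 81)]

    # Processing: instructions that calculate, compare and decide
    # (here, comparing each mark against an assumed pass mark of 50).
    def process(records):
        return [(name, mark, "PASS" if mark >= 50 else "FAIL")
                for name, mark in records]

    # Information: the meaningful final output of the processed work.
    for name, mark, result in process(data):
        print(name, mark, result)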
LEIBNIZ CALCULATOR

In the year 1671, a German mathematician, Gottfried Leibniz, modified Pascal's calculator and developed a machine which could perform various calculations based on multiplication and division as well.

ANALYTICAL ENGINE

In the year 1833, a scientist from England known as Charles Babbage invented a machine that could keep data safely. This device was called the analytical engine, and it is deemed the first mechanical computer. It included features that are used in today's computer languages. For this great invention, Charles Babbage is also known as the father of the computer.

GENERATION OF COMPUTER

As time passed, a more suitable and reliable machine was needed that could perform our work more quickly. During this time, in the year 1946, the first successful electronic computer, called ENIAC, was developed, and it was the starting point of the current generations of computers.

FIRST GENERATION

ENIAC was the world's first successful electronic computer, developed by two scientists, J. P. Eckert and J. W. Mauchly. It was the beginning of the first generation of computers. The full form of ENIAC is "Electronic Numerical Integrator And Computer." ENIAC was a very huge computer, weighing about 30 tons, and it could store only a small amount of information. First-generation computers were built around vacuum tubes. A vacuum tube was an electronic component with very low working efficiency; it could not work properly and required a large cooling system.

SECOND GENERATION

As development moved further, the second generation of computers knocked at the door. In this generation, transistors were used as the electronic component instead of vacuum tubes. A transistor is much smaller in size than a vacuum tube. As the size of the electronic component decreased from vacuum tube to transistor, the size of the computer also decreased, and it became much smaller than the earlier computers.

THIRD GENERATION

The third generation of computers arrived in the year 1964. In this generation, ICs (integrated circuits) were used as the electronic component. The development of the IC gave birth to the new field of microelectronics. The main advantage of the IC is not only its small size but also its superior performance and reliability compared with previous circuits. The IC was first developed by Jack Kilby. This generation of computers had huge storage capacity and higher calculating speed.

FOURTH GENERATION

This is the generation in which we are working today. The computers we see around us belong to the fourth generation. The "microprocessor" is the main concept behind this generation. A microprocessor is a single chip (an L.S.I. circuit) used in a computer for any arithmetical or logical function to be performed in any program. The honour of developing the microprocessor goes to Ted Hoff of the U.S.A., who developed the first microprocessor, the Intel 4004, while working for Intel Corporation. With the use of the microprocessor, fourth generation computers became small, fast and efficient.

It is evident that the next generation of computers, the fifth generation, will be developed soon. In that generation, computers will possess artificial intelligence and will be able to take decisions on their own, like a human being.
SCOPE OF COMPUTER

Certain characteristics of computer interaction can make computers well suited for distance learning. The features listed below make the prospect of computer use look more promising:

1. Access to experts and respected peers.
2. One-to-one and many-to-many communication.
3. Active learner participation.
4. Linking of new learning to concrete on-the-job problems.
5. Follow-up, feedback and implementation support from peers or experts.
6. Self-direction: control over stopping and starting, and over the time, pace and place of learning or communication activity.

LANGUAGES OF COMPUTER

A language is defined as a medium for expressing thoughts. All human beings in this world communicate with one another in a language. Similarly, a computer needs some medium of expression to communicate with others. A computer follows the instructions given by the programmer to perform a specific job. To perform a particular task, the programmer prepares a sequence of instructions known as a program. A program written for a computer is known as software. The program is stored in RAM. The CPU takes one instruction of the program at a time from RAM and executes it. The instructions are executed one by one, in sequence, and finally produce the desired result.

The journey of computer software from machine language to high-level languages to modern 4GL/5GL languages is an interesting one. Let us discuss it in detail.

FIRST GENERATION LANGUAGES: 1GLs (Machine Language)

When human beings started programming computers, the instructions were given in a language the machine could easily understand, and that language was machine language. The binary language, a language of 1s and 0s, is known as machine language. Any instruction in this language is given in the form of a string of 1s and 0s, where the symbol 1 stands for the presence of an electrical pulse and 0 stands for the absence of an electrical pulse. A set of 1s and 0s such as 11101101 has a specific meaning to a computer, even though it appears as a mere binary number to us. Writing programs in machine language is very cumbersome and complicated, and it was accomplished by experts only. All instructions and input data are fed to the computer in numeric form, specifically binary form.

SECOND GENERATION LANGUAGES: 2GLs (Assembly Language)

Much effort has been made over the last 50 years to overcome the difficulties of using machine language. The first language similar to English was developed in 1950; it was known as Assembly Language, or a Symbolic Programming Language. After 1960, High Level Languages were developed, which brought the common man much closer to the computer, and this was the main reason for the tremendous growth of the computer industry. High level languages are also known as Procedure Oriented Languages.

THIRD GENERATION LANGUAGES: 3GLs (High Level Languages)

Assembly language was easier to use than machine language, as it relieved the programmer of the burden of remembering operation codes and the addresses of memory locations. Even though assembly languages proved to be a great help to programmers, the search continued for still better languages, nearer to conventional English. The languages developed nearer to the English language for writing programs in the 1960s were known as High Level Languages. The different high level languages which can be used by the common user include FORTRAN, COBOL, BASIC, PASCAL, PL-1 and many others.
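To make the contrast between these generations of languages concrete, here is a minimal sketch of the same small task, adding two numbers, expressed at the two extremes described above. The machine-language bit pattern is a made-up example, not the opcode of any real processor, and the high level example uses Python purely for illustration.

    # 1GL (machine language): an instruction is a raw string of 1s and 0s.
    # This bit pattern is invented for illustration; it is not a real opcode.
    machine_instruction = "11101101"
    print("machine form:", machine_instruction)

    # 3GL (high level language): the same intent in near-English terms,
    # with no opcodes or memory addresses for the programmer to remember.
    def add(a, b):
        return a + b

    print(add(2, 3))   # prints 5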
Each high level language was developed to fulfill some basic requirement for a particular type of problem, but further developments were made in each language to widen its utility for different purposes.

FOURTH GENERATION LANGUAGES (4GLs)

The 3GLs are procedural in nature, i.e., the HOW of the problem gets coded: the procedures require knowledge of how the problem will be solved. Contrary to them, 4GLs are non-procedural: only the WHAT of the problem is coded, i.e., only "what is required" is specified, and the rest gets done on its own. Thus a big program in a 3GL may be replaced by a single statement in a 4GL. The main aim of 4GLs is to cut down on development and maintenance time and to make programming easier for users.

MY VISIT TO THE LEARNING RESOURCE LAB (C45-C49)

I visited the e-library, which is located at College One, C45-C49. The library is very spacious and well ventilated, with more than 10 air conditioners. The place is very quiet because it is a place to study and carry out research. There are different kinds of I.C.T. devices there, such as:

1. 75-100 systems
2. A router for wireless systems
3. A server (Abba) used for registration
4. A desk jet wireless printer
5. A backup system and inverter for power
6. UPS units
7. CPUs
8. Mice
9. Extension sockets/wires