INB 201: Development of the Personal Computer
The Origins of the Personal Computer and Software Industry

Development of the Computer Industry

Historical Epochs in Computers
1) Wartime development, 1940-1947 – Alan Turing and John von Neumann; 1947 invention of the transistor at Bell Labs
2) Development of early commercial computers, 1950-1958
3) 1958 – Invention of the integrated circuit
4) Mainframe era, 1960-1975
5) 1971 – Invention of the microprocessor at Intel
6) Early PCs, 1975-1980
7) IBM PC, 1981
8) Rise of the clones and Wintel, 1983-1990
9) The decline of Apple and IBM and the rise of Microsoft, 1986-1995
10) Early mobile phones, 1992-2000
11) The rise of the Internet and e-commerce, 1994-2000
12) Apple comes back from near-dead, 1997-2003
13) Miniaturization of computers – pagers 1996; iPod 2001; BlackBerry 2003; iPhone 2007
14) Emergence of AI – Deep Blue 1997; Watson 2011; Barbie 2015
15) Computers in products – 1990 Lexus LS 400 antilock brakes; 2015

World War II and aftermath
- The US military needed large computations for codebreaking and for calculating the firing trajectories of large guns.
- A partnership between the US military and major US universities built the first electronic computers.
- Research support went toward designing the most efficient and effective electronics for computers.
- This work led in 1947 to the invention of the transistor at AT&T Bell Labs.
- Antitrust actions against AT&T transferred the patents to many firms.

Alan Turing and the computer as Enigma code breaker: the Bombe

World War II's Greatest Hero: The True Story of Alan Turing
A computer pioneer who helped defeat the Nazis, Turing was a war hero working in secret, a gay man in an era of extraordinary prejudice and a genius before his time.

[Photo: Alan Turing in 1928. Sherborne School, via Agence France-Presse/Getty Images]

WHEN ALAN TURING DIED in 1954, a modest obituary in the Manchester Guardian spoke of an academic who pioneered the creation of the new electronic calculating machines. He liked long-distance running, chess and gardening, and entertained the idea that "electrical computators" would one day "do something akin to thinking." No writer of the time could fathom the true nature of Turing's life and genius, which are still being deciphered decades later.

Unknown to the general public at the time of his death, Turing was a World War II code-breaking hero who, as Winston Churchill would later recall, made the single biggest contribution to the Allied victory. Turing's brilliant work in mathematics and logic laid out the blueprint for modern computers and, in turn, the digital age. To honor his trailblazing contribution, the Turing Award, considered the Nobel Prize of computing, was established in 1966 as the field's highest distinction.

[Photo: A three-rotor Enigma machine in a wooden case that carries its serial number. Enigma Museum]

And, as Prime Minister Gordon Brown noted in a national apology a half century after Turing's presumed suicide, the war hero faced cruel and inhuman persecution for being gay, a fate shared by tens of thousands of others. Innovative, forward thinking and brave in the face of prejudice, Alan Turing was an enigma in his own time, one that we are only just beginning to figure out.
World War II
Turning the Key: Turing's Wartime Code-Breaking Heroics

THE MACHINE THAT SANK THE U-BOATS
Working in secret as part of the massive Bletchley Park cryptology operation, Alan Turing helped crack the Nazi military's vexing Enigma machine. Because its rotors constantly rescrambled the letter substitutions, the encryption device had quintillions of possible settings, and the German military assumed there simply wasn't enough time to break it. Turing's breakthrough came from observing that morning U-boat communications included a weather report, a predictable pattern that could be exploited as known plaintext. He then built a gargantuan machine, the Bombe, that could quickly sort through millions of possibilities to divine the code. The first of many, the device set the stage for a massive computing operation that would eventually crack up to two messages a minute.

[Photo: Alan Turing's office at Bletchley Park. Tom Jamieson for The New York Times]

HACKING THE FUHRER
Making the Atlantic safe for British shipping was a massive accomplishment; no less steely a Brit than Winston Churchill admitted the U-boat menace was his biggest fear. Turing's next wartime advance was cracking the Tunny codes, a high-powered cipher the Führer used to communicate with commanders in the field. The system allowed the Allies to read "what Hitler and his generals were saying to each other over breakfast," according to one of Turing's contemporaries. Open access to various levels of encrypted German communication allowed British intelligence to plant false rumors and run a stable of successful double agents, undermining the Nazi war machine.

An R.A.F. officer in charge of distributing "Ultra Secret" information intercepted from the German Enigma machine during World War II recounts just how valuable this information was to British and American war efforts.
Published by The New York Times, December 29, 1974: The Ultra Secret [1974]

THE ENIGMA MACHINE
The Enigma machine encodes each character by completing an electrical circuit through a plugboard, a series of rotors, and a reflector.
Rotors: The signal is relayed through a series of mechanical rotors; each rotation further scrambles the letter.
Reflector: A special final rotor sends the coded letter back through the rotors, to be re-encrypted one last time.
(Interactive demonstration: http://paidpost.nytimes.com/the-weinstein-company/world-war-iis-greatest-hero-the-true-story-of-alan-turing.html)
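The rotor-and-reflector signal path described above is small enough to sketch in code. Below is a minimal Python simulation, assuming the widely documented wirings for rotors I-III and reflector B; ring settings, the plugboard wiring itself, and the historical double-stepping quirk are left out, so this is an illustration of the mechanism rather than a faithful replica. The final lines work out where the "quintillions of possible settings" figure comes from.

```python
import string
from math import factorial

ALPHABET = string.ascii_uppercase

# Widely documented historical wirings (rotors I-III, reflector B).
ROTORS = {
    "I":   "EKMFLGDQVZNTOWYHXUSPAIBRCJ",
    "II":  "AJDKSIRUXBLHWTMCQGZNPYFVOE",
    "III": "BDFHJLCPRTXVZNYEIWGAKMUSQO",
}
REFLECTOR_B = "YRUHQSLDPXNGOKMIEBFZCWVJAT"

def encode_char(c, order, pos):
    """Trace one letter (as 0-25) through the rotors and back via the reflector."""
    for name, p in zip(reversed(order), reversed(pos)):   # right-to-left pass
        c = (ALPHABET.index(ROTORS[name][(c + p) % 26]) - p) % 26
    c = ALPHABET.index(REFLECTOR_B[c])                    # reflector turns the signal around
    for name, p in zip(order, pos):                       # left-to-right pass, inverted wiring
        c = (ROTORS[name].index(ALPHABET[(c + p) % 26]) - p) % 26
    return c

def enigma(text, order=("I", "II", "III"), start=(0, 0, 0)):
    pos, out = list(start), []
    for ch in text.upper():
        if ch not in ALPHABET:
            continue
        pos[2] = (pos[2] + 1) % 26          # rightmost rotor steps before every letter
        if pos[2] == 0:
            pos[1] = (pos[1] + 1) % 26      # simplified carry; real machines double-step
            if pos[1] == 0:
                pos[0] = (pos[0] + 1) % 26
        out.append(ALPHABET[encode_char(ALPHABET.index(ch), order, pos)])
    return "".join(out)

# The same settings decrypt what they encrypt, e.g. the daily weather-report crib.
cipher = enigma("WETTERBERICHT")
assert enigma(cipher) == "WETTERBERICHT"

# Keyspace of the 3-rotor Army Enigma: 60 rotor orders (3 chosen from 5 in order),
# 26^3 start positions, and 10 plugboard cables -- about 1.59e20 settings.
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2**10)
print(f"{60 * 26**3 * plugboard:.2e}")      # ~1.59e+20, the "quintillions" in the text
```

Because the reflector makes every per-letter substitution self-inverse, one machine both encrypts and decrypts; it also means no letter can ever encrypt to itself, which is exactly the weakness that made cribs like the morning weather report so powerful.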
Persecution
Gay Persecution in Mid-Century England: "Evil Men" Committing "Gross Indecency"

THE CAMBRIDGE FIVE AND COLD WAR PARANOIA
Arrested in 1952, Alan Turing was charged with "gross indecency" under the Victorian-era Criminal Law Amendment Act, once used to imprison Oscar Wilde. Turing's punishment came during a backlash against homosexuals, a "drive against male vice" the Home Secretary enacted to "rid England of this plague." The persecution intensified after two of the Cambridge Five spy ring defected to the Soviet Union in 1951. Deeply embedded in the British government and Secret Service, the saboteurs had been passing classified information for decades. Turing's trial occurred months after the British Secret Service asked him to help crack Soviet codes.

[Photo: Alan Mathison Turing by Elliott & Fry, 1951. © National Portrait Gallery, London]

ONE OF 49,000
During sentencing, a judge offered Turing the choice of prison or "organo-therapy," a type of chemical castration via estrogen injection that killed a man's sex drive. Turing's choice, hormone therapy, caused him to grow breasts and become depressed, triggers for his suicide. He was far from alone. More than 49,000 men, including politicians and celebrities, were arrested or experienced similar punishments during a dragnet compared to McCarthyism. In 1962, an allegedly gay Army captain was killed during a course of doctor-supervised aversion therapy: injected with a vomit-inducing drug while being shown pictures of naked men, he died of dehydration. His death certificate said natural causes.

A QUIET DISSIDENT
This widespread persecution created a climate in which newspapers ran stories about "how to spot a homo," a gay slang ("polari") evolved to evade surveillance, and many gay men, marginalized and blackmailed, committed suicide. Turing refused to cower. He was arrested only after self-reporting a burglary and telling the police about the relationship he had with a man he suspected was involved. In both 1952 and 1953, he traveled to Norway and Greece to get out from under English law and pursue relationships as he saw fit. It wasn't until 1967, after thousands of its gay citizens had been persecuted, that gay relationships began to be decriminalized in the United Kingdom.

Pardon all 49,000: a petition sought pardons for the 49,000 men who, like Alan Turing, were persecuted under British law for being gay.

Computers
How Turing Imagined the Modern Computer

THE THEORY OF THE ULTIMATE TOOL AND THE UNIVERSAL TURING MACHINE
When the Cambridge scholar first lectured on his landmark 1936 paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," few attended, and only two people asked for reprints. Few then could appreciate the radical nature of his ideas. His Universal Turing Machine concept, an abstract calculator that performs different tasks simply by changing its software, stands as the evolutionary forebear of modern computing, progenitor of the first crude arrays of cathode tubes and ancestor of today's sleek laptops (a minimal simulator sketch appears at the end of this section). Nearly every early machine, from the Army's ENIAC to the University of Pennsylvania's EDVAC to the ACE (built from Turing's designs at the National Physical Laboratory in London, and the first real example of utilizing software and programming), owes a debt to Turing's paper.

[Photo: The "Baby," designed and built at The University of Manchester in 1948, was the first machine with all the components now classically regarded as characteristic of the basic computer. The University of Manchester]

THE MYSTERY OF ARTIFICIAL INTELLIGENCE AND THE TURING TEST
While Turing all but established the field of computer science with a clever tool that utilized both software and hardware, he constantly turned over the question of how to define this new intelligence he had helped birth. His 1950 paper "Computing Machinery and Intelligence" grapples with the concept of "artificial intelligence" and advances the "imitation game," or Turing Test, which posits that for a computer truly to think, it must fool a human interrogator into believing it is another human during conversation. Rather than suggest there is "no mystery about consciousness," as he put it, Turing's concept has become the benchmark used to measure our pursuit of intelligent machines.

A 1967 article traces that era's understanding of electronic digital computing back to foundational principles that George Boole, Norbert Wiener, Alan Turing, John von Neumann and Claude E. Shannon had laid down decades earlier.
Published by The New York Times, January 9, 1967: The Electronic Digital Computer: How It Started, How It Works and What It Does [1967]
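The universal-machine idea described earlier in this section is concrete enough to sketch. Below is a minimal Turing machine interpreter in Python; the table format and the unary-increment example are illustrative choices, not anything from the 1936 paper. The loop plays the role of fixed hardware, and the transition table is the interchangeable "software."

```python
def run_turing_machine(table, tape, state="start", head=0, max_steps=10_000):
    """Run a one-tape Turing machine.

    `table` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right); the machine halts in state "halt".
    """
    cells = dict(enumerate(tape))                 # sparse tape; blank is " "
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, " ")
        cells[head], move, state = table[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip()

# One "program": append a 1 to a unary number, computing n + 1.
increment = {
    ("start", "1"): ("1", +1, "start"),   # scan right across the 1s
    ("start", " "): ("1", +1, "halt"),    # write a 1 on the first blank, stop
}
print(run_turing_machine(increment, "111"))   # -> "1111"
```

Swapping in a different table makes the same loop compute something else; a universal machine goes one step further and reads the table itself off the tape, which is the sense in which one device can do anything.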
MODERN ADVANCES OF THE TURING LEGACY
By developing algorithms and the philosophical and practical underpinnings of the digital age, Turing's influence continues to be felt. His ACE designs, used as blueprints by the British to create a device to track Soviet aircraft, served as the DNA of the first home computer, the Bendix G-15. These days, Turing's legacy lives on in Silicon Valley, where the founders of high-tech's most iconic companies have praised his elemental contributions to the field of computer science.

IBM
- Remington Rand built the first corporate computer, UNIVAC, used in the 1950 census and to predict the 1952 election.
- IBM helped build an early computer at Harvard, then leveraged that knowledge to build a better, fully digital computer in 1952-1954.
- Mainframes were leased, at $3,000-$10,000 per month in the 1950s and 1960s.
- Customers bought proprietary software from IBM.
- IBM dominated the industry, with roughly 70% global market share by 1960.

Computer industry, 1960-1975
- IBM was a vertically integrated firm, with proprietary systems from top to bottom. What does this mean? How else could firms be organized?
- Its main competitors were Sperry Rand, AT&T, GE and RCA: "IBM and the Seven Dwarfs."
- Barriers to entry were huge, especially economies of scale.
- European firms supported by national governments were weak; Japanese firms supported by the Japanese government were much stronger.

[Photos: a 1960s mainframe computer; a 1970s VAX minicomputer]

The History of the Microprocessor and the Personal Computer
By Graham Singer, September 17, 2014

The PC is the story of the creation of a new industry around an entirely new product: a computer for anyone who wanted one. It led to:
- A new way of organizing production that drove globalization and the growth of China and many emerging economies.
- An industry created by small startup firms and individuals.
- A product that revolutionized the way business is conducted across most industries.
- Many very smart people and many very good companies did not see the opportunities in PCs and blew the chance to make hundreds of billions of dollars; the people who did see the opportunities were very often not business people at all.
- New PC firms that disrupted existing firms, bringing down the largest mainframe and minicomputer makers.
- The main patterns for technology- and knowledge-related firms.

Intel and the invention of the microprocessor
Intel's founders (Robert Noyce, Gordon Moore and Andrew Grove) worked for Fairchild Semiconductor but left to start a new firm. Intel's first great product was a DRAM memory chip. When a Japanese firm needed chips for a calculator, Intel decided to make one chip do all the processing duties, creating the 4004. The Intel 4004, the first commercial microprocessor, had 2,300 transistors and ran at a clock speed of 740 kHz.

In the late 1960s and early 1970s, computing was the province of mainframes and minicomputers. Fewer than 20,000 mainframes were sold worldwide each year, and IBM dominated this relatively small market (with UNIVAC, GE, NCR, CDC, RCA, Burroughs and Honeywell, the "Seven Dwarfs" to IBM's "Snow White," holding smaller shares). Meanwhile, Digital Equipment Corporation (DEC) effectively owned the minicomputer market. Intel's management, like other microprocessor makers, couldn't see their chips usurping the mainframe and minicomputer, whereas new memory chips could serve those sectors in vast quantities.
Thus Intel did not see the microprocessor it had created as related to small computers.

The History of the Microprocessor and the Personal Computer, Part 2
By Graham Singer, September 24, 2014

One of the most remarkable features of the PC industry is how slowly the idea of a personal computer developed. Virtually no one worked to use the Intel microprocessor to build a PC for almost four years. Computers were still seen as expensive business and research tools, and the markets for a new generation of relatively inexpensive personal machines and industrial controllers didn't exist; in many cases they weren't even imagined. The imagination needed to see the future came mostly from a small number of amateur technologists who wanted computers of their own.

Intel finally figured out what could be done and hired Gary Kildall to write an operating system for a microcomputer, CP/M, but the effort fizzled. Intel saw the microprocessor as little more than a component of a package that could be leveraged to sell more memory products.

A ham-radio equipment company in New Mexico owned by Ed Roberts built the first PC using the Intel microprocessor: the Altair. News of this development prompted Bill Gates, a 19-year-old Harvard sophomore, and Paul Allen, 22 and working in Boston, to develop a version of BASIC for the Altair. This happened in 1975-1976, and the Gates-Allen BASIC was used on many personal computers besides the Altair.

The years 1975-1977 saw the birth of the personal computer:
- Gates-Allen software
- Steve Jobs and Steve Wozniak: Apple
- Radio Shack: TRS-80

[Photo: Steve Jobs and Steve Wozniak work on the original Apple I, powered by the MOS 6502 processor. While still technically a kit, since the buyer had to source an enclosure and peripherals, the mainboard was sold fully assembled.]

The market for PCs became clear in 1977-1980, with many new PCs developed and sold.

The Apple II
Video: Triumph of the Nerds, Part 1 (https://www.youtube.com/watch?v=zlnGh7WZkMs), from 31:00.

A large part of Apple's success in the business market, a market not originally foreseen as the Apple II's primary focus, stemmed from its close association with the hugely influential VisiCalc spreadsheet software, which was initially compatible only with the Apple machine. VisiCalc's appeal across the entire business spectrum was such that it alone justified the purchase of the computers needed to run it. (A toy sketch of the spreadsheet idea appears at the end of this part.)

The Xerox Alto
Many aspects of modern personal computing that we now take for granted were first developed by SRI International's Augmentation Research Center (ARC) under the stewardship of Douglas Engelbart, and later by Xerox's Palo Alto Research Center (PARC).
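As a toy illustration of why a spreadsheet could justify a computer purchase by itself, here is a short Python sketch of the idea VisiCalc popularized; the cell names and the recalculation scheme are invented for the example, not VisiCalc's actual design. Cells hold either values or formulas, and changing one assumption reprices everything that depends on it.

```python
def evaluate(sheet):
    """Resolve every cell; formula cells are callables that pull other cells."""
    def value(name, seen=()):
        if name in seen:                     # guard against circular references
            raise ValueError(f"cycle at {name}")
        cell = sheet[name]
        if callable(cell):
            return cell(lambda n: value(n, seen + (name,)))
        return cell
    return {name: value(name) for name in sheet}

# Hypothetical cell names, chosen for the example.
sheet = {
    "units":   120,
    "price":   25.0,
    "revenue": lambda get: get("units") * get("price"),
    "cost":    lambda get: get("units") * 14.0,
    "profit":  lambda get: get("revenue") - get("cost"),
}
print(evaluate(sheet)["profit"])   # 1320.0
sheet["units"] = 200               # change one assumption...
print(evaluate(sheet)["profit"])   # ...and the bottom line updates: 2200.0
```

Before software like this, re-running a what-if scenario meant recomputing a paper ledger by hand; automatic recalculation is what made businesses buy the computer to get the program.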
The primary focus of semiconductor companies (almost entirely U.S.-based) remained on high-profit DRAM circuits. The microprocessor was mostly seen as part of a range of chips that could be sold as a multi-chip package. Intel, and more recently Mostek, had been built on the profits of dynamic memory.

That changed as Japanese semiconductor companies, with little regard for U.S. patents and copyrights, received generous tax breaks, low-interest loans, and institutionalized protectionism from a government desperately trying to keep the Japanese computer industry from falling into the abyss. Demand for new integrated circuits, particularly in the U.S., had been growing at an average of 16% a year through the mid-to-late 1970s, and Japan's government, along with its electronics companies, saw ICs, and particularly the lucrative DRAM market, as an ideal opportunity to build their industry. Backed by $1.6 billion in government subsidies, tax credits, and low-interest loans, as well as large private investments, Japanese companies embarked on building state-of-the-art foundries for IC manufacturing. While their own plants were being built, these same companies also needed increased imports of U.S.-made DRAM for their consumer and business products.