Download notes - Gonzaga University Student Web Server

Transcript
Workstations:
A workstation is a high-end microcomputer designed for technical or scientific applications. Intended
primarily to be used by one person at a time, they are commonly connected to a Local Area Network
(LAN) and run multi-user operating systems.
In the 1980s, these workstations brought power to the individual desktop by using an inexpensive
microprocessor, typically the Motorola 68000.
Their architecture and physical design were similar to the PC's; however, they cost more.
The difference was their use of the UNIX operating system, and their extensive networking abilities that
allowed sharing data and expensive peripherals like plotters (A graphics printer that draws images with
ink pens).
Workstations Cont.:
The first computer that might qualify as a “workstation” was the IBM 1620, introduced in 1959, which
was a small scientific computer designed to be used interactively by a single person sitting at the
console.
In the early 1980s, with the advent of 32-bit microprocessors such as the Motorola 68000, a number of
new participants in this field appeared, including Apollo Computer and Sun Microsystems.
Apollo, founded by Bill Poduska and located in Chelmsford, MA, was the first company out of the gate when, in 1981, it delivered a product that used the Motorola microprocessor and its own operating and networking systems, called Domain.
The price for a single workstation was about $40,000; however, despite the cost, many agreed that having a computer at each worker's desk, networked to other machines, was more efficient than having a centralized time-shared computer accessed through "dumb" terminals.
The workstation sold well; however, competition arose, and in 1989 Apollo was acquired by HP.
However, competition also came in the form of another new company located in Silicon Valley, just
down the road from Apple…Sun Microsystems.
SUN Microsystems:
Sun Microsystems was founded in 1982 by Vinod Khosla. The company grew out of a transfer of technology: hardware in the form of the Stanford University Network (SUN) workstation, and software in the form of Berkeley's UNIX operating system, brought by Bill Joy's move to SUN.
UNIX (1 & 2):
Berkeley UNIX was a key to SUN’s success and helped push the Internet out of its ARPA roots in the
1990s.
UNIX was created at Bell Laboratories, which was a part of AT&T. However, AT&T had agreed not to engage in commercial computing activities (a restriction that held until 1981), and UNIX was allowed to spread because AT&T gave it away.
UNIX was created by Ken Thompson and Dennis Ritchie, whose goal was to create an easier way of
sharing files. Ironically, although written for researchers like Thompson and Ritchie, UNIX would
eventually find its way into general use.
Universities could obtain a UNIX license for a nominal cost. Because it was not a complete operating system but rather a set of basic tools that allowed users to manipulate files in a simple manner, UNIX was a godsend for university computer science departments.
UNIX was first developed in assembly language, but because it was rewritten in C, any machine with a C compiler could run it. No one minded if a university modified UNIX to enhance its capabilities, and thus students, like Bill Joy, were allowed to make modifications.
Change slide:
One of UNIX's tenets was that the output of any UNIX process be usable as input for another, and this gave UNIX enormous power and flexibility; however, it was not useful for the general public during its early stages.
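This composability can be sketched in ordinary Python. The example below is only an analogy: real UNIX pipelines connect separate processes, while here each stage is a function whose output feeds the next, in the spirit of `sort | uniq` (the stage names are hypothetical stand-ins for those classic utilities).

```python
# A sketch of the pipe idea: each stage consumes the previous stage's
# output, the way a UNIX pipeline chains utilities with `|`.

lines = ["pear", "apple", "pear", "apple", "banana"]

def sort_stage(items):
    # Like the `sort` utility: emit the input in sorted order.
    return sorted(items)

def uniq_stage(items):
    # Like `uniq`: drop adjacent duplicate lines.
    out = []
    for item in items:
        if not out or out[-1] != item:
            out.append(item)
    return out

result = uniq_stage(sort_stage(lines))
print(result)  # ['apple', 'banana', 'pear']
```

Because every stage speaks the same simple format (a stream of lines), stages written independently can be combined freely, which is exactly what made the pipe convention so powerful.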
Bill Joy and his fellow students at Berkeley set out to make UNIX more accessible and among the many
enhancements added was support for networking by a protocol known as TCP/IP, which ARPA promoted
as a new way to interconnect networks. This protocol, and its bundling with Berkeley UNIX, forever
linked UNIX with the Internet.
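Among the Berkeley enhancements was a programming interface for that networking support, the Berkeley sockets API (a detail beyond the text above), which is still how programs speak TCP/IP today. A minimal sketch in Python: a server and client exchange bytes over a reliable TCP connection on the loopback interface.

```python
import socket
import threading

# An echo server: accept one connection and send back whatever arrives.
def serve(listener):
    conn, _ = listener.accept()
    with conn:
        conn.sendall(conn.recv(1024))

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
listener.listen(1)
threading.Thread(target=serve, args=(listener,)).start()

# The client side: open a TCP connection and exchange bytes over it.
with socket.create_connection(listener.getsockname()) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)
listener.close()
print(reply)  # b'hello'
```

TCP handles the reliable delivery and IP the addressing and routing; the program only sees a byte stream, which is why the same interface works across very different underlying networks.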
Video #1
DEC and the VAX:
Just as the personal computer field was divided into the DOS and Macintosh camps, there was also a
battle going on in the scientific and engineering field.
Workstation companies competed by selling networks of machines, whose collective power they alleged
was greater than the sum of the parts.
For example, SUN’s famous slogan was that “The Network is the computer.”
Throughout the 1980s, DEC had a powerful strategy of its own that combined SUN’s emphasis on
networking with IBM’s concept of a unified family architecture.
DEC’s plan was to offer the consumer a single architecture, the VAX, on a single operating system, VMS,
in solitary or networked configurations.
Change slide:
VAX/VMS is a computer server operating system with a graphical user interface (GUI) and complete graphics support; it is a multi-user, multiprocessing, virtual memory-based operating system designed for use in time-sharing, batch processing, real-time, and transaction processing.
The only part of the strategy that wasn't DEC's was the networking: Ethernet, which DEC obtained in an agreement with Intel and Xerox.
The plan had risks, as DEC had to convince the customer that the VAX could supply everything while not conveying a sense that DEC was charging high prices.
DEC had to design and build products with good performance across the entire line, and because the PDP-10 series wasn't compatible with the VAX, DEC decided to cut the line, which caused considerable customer outcry.
The VAX strategy worked well through the 1980s; however, after the market crash of 1987, and by 1990, the drawbacks of the strategy were apparent, along with DEC's inability to bring new VAX products to market, and DEC suffered immense losses.
Now known as OpenVMS, it is used by customers such as banks and hospitals, as it is often deployed in environments where system uptime and data access are critical.
RISC and SPARC:
The VAX architecture had a lot in common with the IBM System/360, and its instruction set was contained in a microprogram stored in read-only memory. And like the 360, the VAX presented its programmers with a rich set of instructions, the VAX-11/780 offering over 250.
However, researchers like John Cocke of IBM, looking at the rapid advances in compilers during the mid-1970s, concluded that a smaller set of instructions, using more frequent commands to load and store data to and from memory, could operate faster than the System/360 and 370.
His ideas led to an experimental machine called the IBM 801. Although IBM held back on introducing a commercial version, word got out, and in 1980 a group at Berkeley led by David Patterson, after hearing rumors of the 801, started a similar project called RISC, for "Reduced Instruction Set Computer."
Change Slide:
Although their work was met with skepticism when introduced, SUN, beginning in 1987 and probably owing to Bill Joy's influence, introduced a workstation with a RISC chip based on Patterson's research, called the SPARC (Scalable Processor Architecture).
Data showed that RISC offered a way of improving microprocessor speeds more rapidly than mini and mainframe speeds were improving, or could improve.
The SPARC design did more than anything else to overcome the skepticism about RISC, and SUN went a step further in promoting it by licensing the SPARC design so that other companies might adopt it and make it a standard, which many went on to do; out of this culture the PowerPC was later born.
Video #2
Ethernet: A Way to Network and Ethernet and PCs:
A RISC architecture, UNIX, and scientific or engineering applications differentiated workstations from
PCs.
Another distinction was that workstations were designed from the start to be networked, especially at a local level. This was done using Ethernet, one of the most significant of all the inventions that came from the Xerox Palo Alto Research Center.
Ethernet enabled small clusters of workstations, and later, PCs to work together effectively.
Ethernet was invented at Xerox-PARC in 1973 by Robert Metcalfe and David Boggs.
Metcalfe connected Xerox's MAXC to the ARPANET, but the focus at Xerox was on local networking: to connect a single-user computer to others like it, and to a shared, high-quality printer within the same building.
When Metcalfe and Boggs arrived, there was already a local network established; however, Metcalfe believed that it was too expensive and not flexible enough for what Xerox was looking for.
Metcalfe recalled a network he had seen in Hawaii, called ALOHAnet, that used radio signals to link computers among the islands. With this system, files were broken up into packets with the address of the recipient attached. Other computers tuned into the UHF frequency, listened for the packets, accepted the ones addressed to them, and ignored the others.
Metcalfe took this idea and proposed using cheap coaxial cable as the "ether" that carried those signals; a new computer could simply be added to the "Ethernet" by tapping into the cable. If two computers transmitted at the same time, they would back off and try again, and if collisions kept happening, they would transmit less often.
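The retry rule (try again, and transmit less after repeated collisions) is known as binary exponential backoff. A minimal sketch, assuming the classic Ethernet convention in which the range of random wait slots doubles with each collision, capped after ten doublings:

```python
import random

def backoff_slots(collisions, max_exponent=10):
    # After n collisions, wait a random number of slot times drawn
    # from [0, 2**n - 1]; the range doubles with every collision,
    # so stations involved in repeated collisions transmit less often.
    exponent = min(collisions, max_exponent)  # cap the doubling
    return random.randint(0, 2 ** exponent - 1)

# After 1 collision a station waits 0-1 slots; after 3, 0-7 slots.
```

Randomizing the wait is what breaks the tie: two stations that collided are unlikely to pick the same slot again, so the cable quickly becomes usable without any central coordinator.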
Clusters of small computers now provided an alternative to the classic model of a large central system
that was time-shared and accessed through dumb terminals.
Ethernet would have its biggest impact on the workstation, and later, the PC market, but its first success
came in 1979, when DEC, Intel, and Xerox joined to establish it as a standard, with DEC using it for the
VAX.
Change Slide:
Workstations and "VAXen" found a market among engineers and scientists, but with only a few exceptions the commercial office environment continued to use office automation systems from IBM and others. Meanwhile, the PC was gaining in popularity, and by the mid-1980s it was clear that no amount of corporate policy directives could keep the PC out of the office.
The solution was to network the PCs to one another, in a local area network (LAN).
And although the networking of PCs lagged behind the networking that UNIX workstations enjoyed from the start, the personal computer's lower cost and better software drove the market.
Local networking took the "personal" out of personal computing in the office environment; however, most office workers hardly noticed how much this represented a shift away from the forces that drove the invention of the PC in the first place.
The Internet:
Though most benefits of connecting office workers to a LAN went to administrators and managers, one very important thing that users connected to a LAN got in return was access to the Internet.
Descended from ARPANET, the Internet uses “packet switching” like other networks; however, there are
several major differences.
The Internet is not a single network but rather a collection of many different networks across the globe. Some of those networks are open to the public, while others are not.
Also, the Internet allows communication across these different networks by its use of a common protocol, TCP/IP (Transmission Control Protocol/Internet Protocol).
The internet made its way into general use by a combination of social and technical factors.
Among the former was a shift of financial and administrative support from ARPA, eventually to entities that allowed Internet access to anyone in the 1990s.
In 1992, Internet users were distributed about evenly among the governmental, educational, military, and other domains; by 1995, however, commercial users constituted the biggest share.
The technical factors behind the emergence of the internet include:
ARPA's support for the development of TCP/IP, and its decision in 1980 to adopt the protocol.
Another factor was the rise of local networks which made it possible for large numbers of people to gain
access to the internet.
However, what hadn't been anticipated was how advances in PCs, driven by more powerful processors, brought the capabilities formerly associated with UNIX workstations and the academic and research worlds into the office. By 1995, those with a PC on a LAN all had access to the Internet, without each machine requiring a direct connection to the Internet's high-speed lines.
The Internet Cont:
As the internet emerged from its roots in ARPA, it began to change.
The initial activities on the Internet were ARPANET derived, so users could log on to a remote computer,
transfer large files, and send mail.
The first serious extension to that triad gave a hint of what the popular press calls a “virtual community”
based on the internet.
Although these groups are associated with the Internet, for years only those who had access to UNIX systems could reach them.
For the general public, they were anticipated in the personal computer arena by so-called bulletin board systems (BBSs), which acted like bulletin boards on which anyone could post a note for all to read.
Another extension was UNIX-based newsgroups, which first appeared after 1979, somewhat independently of the mainstream ARPANET-Internet activities, under the general name of Usenet.
These were arranged into topics, and while most of the stories were hardly accurate, they did kindle a general interest in the Internet.
After a time, the internet began to feel like a large library without a card catalog.
So in 1990, programmers at the University of Minnesota responded by creating Gopher, which allowed students and faculty, including those with little computer experience, to query campus computers for information, which Gopher would fetch and deliver to the person seated at the terminal.
Another system, created by Brewster Kahle at the Cambridge supercomputer company Thinking Machines, was WAIS (Wide Area Information Server), which allowed users to search the contents of files directly.
However, like the early news groups, Gopher and WAIS were eventually rendered obsolete by the World
Wide Web and its system of information retrieval.
The World Wide Web:
The development of the WWW has elements of both randomness and planning. It was invented at CERN, the high-energy physics laboratory on the Swiss-French border, by the British computer scientist Tim Berners-Lee.
It was originally part of an effort to foster a computer network that would enable scientists to share
their research more easily.
The Web's fundamental concept of structuring information as "hypertext" goes back to an essay by Vannevar Bush in 1945 about the coming increase of information and how technology might be able to handle it, via something called the Memex.
In the midst of all this sprouted the Internet, and with it a sudden and unexpected need for a way to navigate through its rich and ever-increasing resources.
Change Slide:
For Berners-Lee
“the web’s major goal was to be a shared information space through which people and machines could
communicate. This space was to be inclusive, rather than exclusive”
He was concerned with allowing communication across computers and software of different types. He also wanted to avoid the structure of most databases, and thus he devised a Universal Resource Identifier (later renamed the Uniform Resource Locator, or URL) that could point to any document in the "universe of information." He also created the Hypertext Transfer Protocol (HTTP), which was faster and had more features than the then-existing File Transfer Protocol.
And finally, he defined a Hypertext Markup Language (HTML) for the movement of hypertext across the
network.
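The three inventions fit together: a URL names a document, HTTP fetches it, and HTML describes it. The uniform structure of a URL can be seen with Python's standard library (the address below is a made-up example, not one from the text):

```python
from urllib.parse import urlparse

# One uniform string names the protocol, the machine, and the document,
# so a URL can point to anything in the "universe of information."
url = "http://www.example.org/physics/papers/index.html"
parts = urlparse(url)

print(parts.scheme)  # 'http': which protocol to speak (here, HTTP)
print(parts.netloc)  # 'www.example.org': which machine to contact
print(parts.path)    # '/physics/papers/index.html': which document
```

Because every part of the address lives in one string, a hypertext link in an HTML page can point to a document on any machine, under any protocol, without prior arrangement between the two sites.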
The World Wide Web Cont.:
The WWW got off to a slow start.
To view Web materials, one used a program called a "browser," similar in design to Gopher.
However, in the fall of 1992, Marc Andreessen and Eric Bina began discussing ways of making it easier to navigate the Web, and by 1993 they had written an early version of a browser they would later call Mosaic.
Mosaic married the ease of use of HyperCard (used on the Macintosh) with the full capabilities of the WWW. The user could select items with a mouse, hyperlinks were identifiable by different colors, and the feature that most impressed first-time users was its seamless integration of text and images.
Eventually Mosaic was rewritten to run on Windows-based machines and Macs as well as workstations.
Eventually, Andreessen and Bina wished to commercialize their product; however, the University of Illinois objected, so the two left, founded Netscape, and marketed a more popular version of the software in 1994.
Conclusion:
In conclusion, developments in the workstation, UNIX, the VAX, and the PC helped propel the use of computers in the scientific, academic, commercial, and personal realms. As computing spread, the ability to network machines and to share and search the growing sea of information became necessary, and as a result, innovations such as Ethernet, the Internet, and the WWW emerged, which helped shape and continue to shape the world we know today.
Sources used:
Ceruzzi, Paul E. A History of Modern Computing. 2nd ed. Cambridge: The MIT Press, 2003. Print.
OpenVMS. Wikipedia. 8 April 2012. Web.
VAX/VMS. Wikipedia. 9 April 2012. Web.