
3610-lecture1-history-of-computing


History of Computing
Chapter 1
Introduction to Social and Ethical Computing
- Historical Development of Computing
- Development of the Internet
- Development of the World Wide Web
- The Emergence of Social and Ethical Problems in Computing
- The Case for Computer Ethics Education
Historical Development: Before 1900 AD
- Man sought to improve life through the invention of gadgets.
- The first recorded utility tools dealt with numbers:
  - First recorded on bones: 20,000 to 30,000 B.C.
  - First place-value number system in place: 1800 B.C.
  - The abacus, the "Mother of Computers": between 1000 B.C. and 500 B.C.
  - Zero and negative numbers: between 300 B.C. and 500 A.D.
- 1500 AD to 1900 AD: a lot of activity in the development of computing devices, driven by commerce
  - 1500: Leonardo da Vinci invented a mechanical calculator
  - 1621: invention of the slide rule
  - 1625: Wilhelm Schickard's mechanical calculator
  - 1640: Blaise Pascal's Arithmetic Machine
- A major breakthrough in speed came around 1800 AD with the invention of the punched card by Joseph-Marie Jacquard
  - Revolutionized computing and quickly spread to other fields
  - Sped up computation and the storage of information
Historical Development: Before 1900 AD
- 1830 AD: an exciting period
  - 1830: Charles Babbage's Analytical Engine
  - George and Edward Scheutz's Difference Engine
- Within a decade, another major milestone: George Boole's invention of Boolean Algebra
  - Opened fields of mathematics, engineering, and computing
  - Led to new frontiers in logic
Historical Development: Before 1900 AD
- Mid-1850s through the turn of the century:
  - 1857: Sir Charles Wheatstone's invention of paper tape to store information
    - Created new excitement in the computing community of the time
    - Huge amounts of data could be entered and stored
  - 1869 AD: the Logic Machine by William Stanley Jevons
  - ~1874: the first keyboard by Sholes
  - 1881: Rectangular Logic Diagrams by Allan Marquand
Historical Development: Before 1900 AD
- Mid-1850s through the turn of the century (continued):
  - 1886: Charles Peirce first linked Boolean Algebra to circuits based on switches (see the sketch after this list)
    - A major breakthrough in mathematics, engineering, and computing science
  - 1890: John Venn invented Venn diagrams
    - Used extensively in switching algebras in both hardware and software development
  - 1890: Herman Hollerith invented the Tabulating Machine
    - Utilized Jacquard's punched card to read the presence or absence of holes
    - The data read was collated using an automatic electrical tabulating machine: a large number of clock-like counters that summed and accumulated the results in a number of selected categories
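Peirce's link between Boolean algebra and switch circuits is easy to see in modern terms. Below is a minimal, illustrative sketch (not from the lecture): switches wired in series behave like a Boolean AND, and switches wired in parallel behave like a Boolean OR.

```python
# Illustrative sketch (an assumption, not lecture material): a switch is
# either closed (True) or open (False); series wiring is AND, parallel is OR.
from itertools import product

def series(a, b):
    # Current flows only if both switches are closed: Boolean AND.
    return a and b

def parallel(a, b):
    # Current flows if at least one switch is closed: Boolean OR.
    return a or b

# A lamp wired through s1 in series with the parallel pair (s2, s3)
# computes the Boolean expression: s1 AND (s2 OR s3).
for s1, s2, s3 in product([False, True], repeat=3):
    lamp = series(s1, parallel(s2, s3))
    print(s1, s2, s3, "->", lamp)
```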
After 1900 AD
- Computing was in its infancy
- The century began with a major milestone: the vacuum tube, by John Ambrose Fleming
  - Played a major role in computing for the next half century
  - All digital computers in the first half of the century ran on vacuum tubes
- 1906: the triode, by Lee de Forest
- 1926: the first semiconductor transistor
  - Not used for several years, but came to dominate the computing industry in later years
- 1937: the Turing Machine, by Alan Turing
  - The invention of an abstract computer (a minimal simulation is sketched after this list)
  - Showed that some problems do not lend themselves to algorithmic representation; they are not computable
- 1942: COLOSSUS, one of the first working programmable digital computers
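To make the abstract-computer idea concrete, here is a minimal, hypothetical sketch of a Turing machine simulator. The rule table (a binary incrementer) and all names are illustrative assumptions, not material from the lecture.

```python
# A Turing machine: a tape, a read/write head, and a finite rule table.
from collections import defaultdict

def run_turing_machine(rules, tape, state="start"):
    """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # blank cells read "_"
    head = 0
    while state != "halt":
        new_symbol, move, state = rules[(state, cells[head])]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in range(min(cells), max(cells) + 1)).strip("_")

# Example machine: increment a binary number. Walk right to the end of the
# input, then carry leftward until the carry is absorbed.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),   # 0 + carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),   # overflow: write a new leading 1
}
print(run_turing_machine(rules, "1011"))  # binary 11 + 1 -> "1100" (12)
```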
After 1900 AD
- 1942: Turing designed COLOSSUS
  - One of the first working programmable digital computers
- 1939: John Vincent Atanasoff's 1st digital computer model
  - Utilized capacitors storing electric charge to represent the Boolean numbers 0 and 1 used by the machine in calculations
  - Input and output data were on punched cards
  - Some doubt it ever worked
After 1900 AD
- Howard Aiken developed the Harvard Mark I
  - The 1st large-scale automatic digital computer
  - Also known as the IBM Automatic Sequence Controlled Calculator (ASCC)
- 1943: Alan Turing's COLOSSUS
  - Considered the 1st programmable computer
  - Designed to break the German ENIGMA code
  - Used about 1,800 vacuum tubes
  - Could execute a variety of routines
After 1900 AD
- John William Mauchly & J. Presper Eckert Jr.: ENIAC
  - Vacuum tube-based, general purpose
  - 10 feet high, weighed 30 tons, occupied 1,000 square feet
  - 70,000 resistors, 10,000 capacitors, 6,000 switches, and 18,000 vacuum tubes
  - No internal memory; hard-wired
  - Programmed by setting switches and diodes
After 1900 AD
- 1944-1952: John William Mauchly & J. Presper Eckert Jr.: EDVAC
  - Electronic Discrete Variable Automatic Computer
  - 1st truly general-purpose digital computer
  - Introduced the stored-program instruction concept
  - Completed in 1956
  - 4,000 vacuum tubes and 10,000 crystal diodes
- 1948: UNIVAC I
  - 1st commercially available computer
After 1900 AD
- Many companies became involved
  - International Business Machines (IBM), Honeywell, and Control Data Corporation (CDC) in the USA, and International Computers Limited (ICL) in the UK
- They built mainframes
  - Huge: took up entire rooms
  - Expensive: use limited to big corporations
- Mid to late sixties: less expensive but smaller computers were developed
  - The minicomputer
  - The timesharing concept
  - Led to the idea of networking
After 1900 AD
- Between 1971 and 1976: the first microprocessors
  - Built as integrated circuits with many transistors on a single chip
  - Vacuum tubes and diodes no longer used
- Ted Hoff: the 4004
  - 4-bit data path
- 1972: Intel's 8008
  - An 8-bit microprocessor based on the 4004
  - The first microprocessor to use a compiler
- These were application-specific microprocessors
Microprocessor
- 1974: a truly general-purpose microprocessor, the 8080
  - An 8-bit device with 4,500 transistors and an astonishing 200,000 operations per second
- After 1974, development exploded
Computer Software and Personal Computer (PC)
- Until the mid-1970s, development was led by hardware
  - Computers were designed first, and software was designed to fit the hardware
- The personal computing industry began
  - 1976: the Apple I and Apple II microcomputers were unveiled
  - 1981: IBM joined the PC wars
- 3 major players:
  - IBM
  - Gary Kildall, who developed the first PC operating system
  - Bill Gates, who developed the Disk Operating System (DOS)
The Development of the Internet
- The Internet is based on 4 technologies:
  - Telegraph
  - Telephone
  - Radio
  - Computers
- Originated from the early work of J.C.R. Licklider
  - Conceptualized a globally interconnected set of computers
  - Concept for communication between network nodes: packets instead of circuits (see the sketch after this list)
  - Enabled computers to talk to each other
- 1961: Kleinrock published the first work on packet-switching theory
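The packet idea can be illustrated with a short sketch (an assumption for illustration, not from the lecture): rather than holding a dedicated circuit open, a message is cut into small numbered packets that may travel independently, arrive out of order, and be reassembled at the destination.

```python
# Illustrative packet-switching sketch; the packet format is hypothetical.

def to_packets(message: str, size: int = 8):
    """Split a message into numbered packets of at most `size` characters."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"seq": n, "total": len(chunks), "data": c}
            for n, c in enumerate(chunks)]

def reassemble(packets):
    """Rebuild the message even if packets arrived out of order."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = to_packets("Packets instead of circuits.")
packets.reverse()  # simulate out-of-order arrival over different routes
print(reassemble(packets))  # -> Packets instead of circuits.
```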
The Development of the Internet
- Two additional important projects:
  - Donald Davies and Roger Scantlebury, coining the term "packet"
  - A computer in Boston connected with one in Los Angeles over a low-speed dial-up telephone line, creating the first working Wide Area Network
- 1967: Roberts published the first plan for ARPANET
- 1968: a team led by Frank Heart, which included Bob Kahn, developed the IMP (Interface Message Processor)
ARPANET
- Began as a tool for defense contractors
- Universities were added
- Government agencies joined
- Other countries joined
- ARPANET ceased to exist in 1989
  - The Internet had become an entity unto itself
Development of the World Wide Web
- Beginning concepts: Tim Berners-Lee's 1989 proposal, called "HyperText and CERN"
  - Aimed to enable collaboration between physicists and researchers in high-energy physics research
- Three new technologies were incorporated (a sketch of them working together follows this list):
  - HyperText Markup Language (HTML): hypertext concepts used to write web documents
  - HyperText Transfer Protocol (HTTP): a protocol used to transmit web pages between hosts
  - The web browser: a client software program to receive and interpret data and display results
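As a rough illustration of how the three pieces fit together, the sketch below issues an HTTP GET request and receives an HTML document that a browser would then render. It uses only Python's standard library; the host example.com is a placeholder assumption, not anything from the lecture.

```python
# Hypothetical illustration: HTTP carries the request, HTML is the document,
# and a browser would interpret and display the result.
import http.client

conn = http.client.HTTPConnection("example.com", 80)
conn.request("GET", "/")                 # HTTP: ask the host for a page
response = conn.getresponse()
html = response.read().decode("utf-8")   # HTML: the document itself
conn.close()

print(response.status, response.reason)  # e.g. "200 OK"
print(html[:80])                         # a browser would render this markup
```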
Development of the World Wide Web
- The proposal included a very important concept for the user interface:
  - Consistent across all types of computer platforms
  - Enables users to access information from any computer
- A line-mode interface was developed and named at CERN in late 1989
Development of the World Wide Web
- Growth:
  - A central computer at CERN with a few web pages in 1991
  - 50 worldwide by 1992
  - 720,000 by 1999
  - Over 24 million by 2001
- 1993: Mosaic, the graphical user interface browser
  - Popularized and fueled the growth of the Internet
The Emergence of Social and Ethical Problems in Computing
- The emergence of computer crimes
  - Perhaps started with the invention of the computer virus
  - The term "virus" is derived from the Latin word virus, which means poison
- A computer virus is a self-propagating computer program
  - Designed to alter or destroy a computer system resource
  - Spreads in its new environment, attacks major systems, and weakens the capacity of resources to perform
- 1972: "virus" first used to describe a piece of unwanted computer code
Growth of Computer Vulnerabilities
The Case for Computer Ethics Education
- What is computer ethics?
  - James H. Moor first coined the phrase "computer ethics"
  - "Computer ethics is the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology."
- The definition focuses on human actions
  - Computer ethics is a study, an analysis of the values of human actions influenced by computer technology
  - Computer influence on human actions is widespread throughout the decision-making process preceding an action
  - Through education we study the factors that influence the decision-making process
Why You Should Study Computer Ethics
- The central task of computer ethics is to determine what should be done
  - Especially whenever there is a policy vacuum
  - Vacuums are caused by the 'confusion' between known policies and what is presented
- Professionals are often unprepared to deal effectively with the ethical issues
- Computer ethics education:
  - Can close the policy vacuums
  - Can prepare the professionals
Schools of Thought
- One school studies computer ethics as remedial moral education
- The other treats computer ethics education not as moral education but as a field worthy of study in its own right
Justification for First Thought
- We should study computer ethics because doing so will make us behave like responsible professionals.
- We should study computer ethics because doing so will teach us how to avoid computer abuse and catastrophes.
Material taken from Walter Maner, "Is Computer Ethics Unique?"
Justification for Second Thought
- We should study computer ethics because the advance of computing technology will continue to create temporary policy vacuums.
- We should study computer ethics because the use of computing permanently transforms certain ethical issues to the degree that their alterations require independent study.
- We should study computer ethics because the use of computing technology creates, and will continue to create, novel ethical issues that require special study.
- We should study computer ethics because the set of novel and transformed issues is large enough and coherent enough to define a new field.
Material taken from Walter Maner, "Is Computer Ethics Unique?"
