Information technology

Information technology (IT) is the application of computers to store, retrieve, transmit and manipulate data,[1] or information, often in the context of a business or other enterprise.[2] IT is considered to be a subset of information and communications technology (ICT). In 2012, Zuppo proposed an ICT hierarchy where each hierarchy level "contain[s] some degree of commonality in that they are related to technologies that facilitate the transfer of information and various types of electronically mediated communications".[3]
The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several industries are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, and e-commerce.[4][a]
Humans have been storing, retrieving, manipulating, and communicating information since the Sumerians in Mesopotamia developed writing in about 3000 BC,[6] but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)." Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.[7]
Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940–present).[6] This article focuses on the most recent period (electronic), which began in about 1940.

History of computer technology


Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick.[8] The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered to be the earliest known mechanical analog computer, and the earliest known geared mechanism.[9]:279 Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed.[10]:3
Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring.[11] The first recognisably modern electronic digital stored-program computer was the Manchester Small-Scale Experimental Machine (SSEM), which ran its first program on 21 June 1948.[12]
The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorised computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.[13]