From Wikipedia, the free encyclopedia
For other uses, see Computer (disambiguation).
"Computer technology" redirects here. For the company, see Computer Technology Limited.

Computer
A computer is a programmable machine designed to sequentially and automatically carry out a sequence of arithmetic or logical operations. The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem.
Conventionally, a computer consists of some form of memory for data storage, at least one element that carries out arithmetic and logic operations, and a sequencing and control element that can change the order of operations based on the information that is stored. Peripheral devices allow information to be entered from an external source, and allow the results of operations to be sent out.
A computer's processing unit executes a series of instructions that make it read, manipulate and then store data. Conditional instructions change the sequence of instructions as a function of the current state of the machine or its environment.
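To make the idea of a sequenced, conditional instruction stream concrete, the following is a minimal sketch (not part of the original article) of a toy fetch-execute loop in Python. The instruction names (SET, ADD, ADDI, JNZ, HALT), the register names, and the program layout are invented purely for illustration; they are not modelled on any real machine.

# Minimal illustrative fetch-execute loop. The instruction set and register
# names here are invented for this example and do not describe a real machine.

def run(program):
    registers = {"A": 0, "B": 0}   # storage the arithmetic/logic step operates on
    pc = 0                         # program counter: index of the next instruction

    while True:
        op, *args = program[pc]    # fetch and decode the current instruction
        pc += 1                    # by default, continue with the next instruction

        if op == "SET":            # SET reg, value: load a constant
            registers[args[0]] = args[1]
        elif op == "ADD":          # ADD dst, src: dst = dst + src (register add)
            registers[args[0]] += registers[args[1]]
        elif op == "ADDI":         # ADDI reg, imm: add an immediate constant
            registers[args[0]] += args[1]
        elif op == "JNZ":          # JNZ reg, target: conditional change of sequence
            if registers[args[0]] != 0:
                pc = args[1]
        elif op == "HALT":         # stop and report the machine's final state
            return registers

# Example program: sum 5 + 4 + 3 + 2 + 1 by looping until B reaches zero.
program = [
    ("SET", "A", 0),    # 0: accumulator
    ("SET", "B", 5),    # 1: loop counter
    ("ADD", "A", "B"),  # 2: A += B
    ("ADDI", "B", -1),  # 3: B -= 1
    ("JNZ", "B", 2),    # 4: if B is not zero, jump back to instruction 2
    ("HALT",),          # 5: done; A now holds 15
]

print(run(program))     # {'A': 15, 'B': 0}

The conditional jump (JNZ) is what lets the same three instructions execute repeatedly and stop when a condition is met; this dependence of the instruction sequence on stored data is the property the paragraph above describes.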
The first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).[1]
Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space.[2] Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices, from MP3 players to fighter aircraft and from toys to industrial robots, are the most numerous.
Contents
* 1 History of computing
  * 1.1 Limited-function early computers
  * 1.2 First general-purpose computers
  * 1.3 Stored-program architecture
  * 1.4 Semiconductors and microprocessors
* 2 Programs
  * 2.1 Stored program architecture
  * 2.2 Bugs
  * 2.3 Machine code
  * 2.4 Higher-level languages and program design
* 3 Function
  * 3.1 Control unit
  * 3.2 Arithmetic/logic unit (ALU)
  * 3.3 Memory
  * 3.4 Input/output (I/O)
  * 3.5 Multitasking
  * 3.6 Multiprocessing
  * 3.7 Networking and the Internet
* 4 Misconceptions
  * 4.1 Required technology
  * 4.2 Computer architecture paradigms
  * 4.3 Limited-function computers
  * 4.4 Virtual computers
* 5 Further topics
  * 5.1 Artificial intelligence
  * 5.2 Hardware
  * 5.3 Software
  * 5.4 Programming languages
  * 5.5 Professions and organizations
* 6 See also
* 7 Notes
* 8 References
* 9 External links
History of computing
Main article: History of computing hardware
The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century onwards, the word began to take on its more familiar meaning, describing a machine that carries out computations.[3]
Limited-function early computers

The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.
The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are nonetheless worth mentioning: mechanical aids to computing that were very successful and survived for centuries until the advent of the electronic calculator, such as the Sumerian abacus, designed around 2500 BC,[4] of which a descendant won a speed...