Q. Explain broadly the developments in computer technology starting from a simple calculating machine to the first computer.
A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically.
The history of computing hardware covers the developments from early simple devices that aided calculation to modern-day computers.
Before the 20th century, most calculations were done by humans.
The first electronic digital computers, however, were developed between 1940 and 1945 in the United States and the United Kingdom. They were gigantic, originally the size of a large room, and consumed as much power as several hundred modern personal computers. This history thus spans simple aids to calculation, mechanical calculators, punched-card data processing, and finally modern stored-program computers.
Early mechanical tools to help humans with digital calculations, like the abacus, were referred to as calculating machines or calculators. The machine operator was called the computer.
The transistor had a major influence on the development of electronics, as it was far more reliable than the vacuum tubes of the first generation. It made computers smaller, faster, cheaper, and more efficient to run. The first symbolic (assembly) languages appeared, and the foundations of high-level programming languages were laid.
The current generation has brought many gains in parallel computing, both in hardware and in an improved understanding of how to develop algorithms that exploit diverse, massively parallel architectures. Parallel systems now compete with vector processors in terms of total computing power, and most expect parallel systems to dominate in the future.