
Saturday, December 8, 2007

Microprocessor

CPU: The Microprocessor

A microprocessor is a programmable digital electronic component that incorporates the functions of a central processing unit (CPU) on a single semiconducting integrated circuit (IC). The microprocessor was born by reducing the word size of the CPU from 32 bits to 4 bits, so that the transistors of its logic circuits would fit onto a single part. One or more microprocessors typically serve as the CPU in a computer system, embedded system, or handheld device. Microprocessors made possible the advent of the microcomputer in the mid-1970s. Before this period, electronic CPUs were typically made from bulky discrete switching devices (and later small-scale integrated circuits) containing the equivalent of only a few transistors. By integrating the processor onto one or a very few large-scale integrated circuit packages (containing the equivalent of thousands or millions of discrete transistors), the cost of processing capacity was greatly reduced. Since the mid-1970s, the microprocessor has become the most prevalent implementation of the CPU, nearly completely replacing all other forms. See History of computing hardware for pre-electronic and early electronic computers.
Since the early 1970s, the increase in processing capacity of evolving microprocessors has been known to generally follow Moore's Law, which suggests that the complexity of an integrated circuit, with respect to minimum component cost, doubles every 18 months. In the early 1990s, microprocessor heat generation (TDP), due to current leakage, emerged as a leading developmental constraint[1]. From their humble beginnings as the drivers of calculators, the continued increase in processing capacity has led to the dominance of microprocessors over every other form of computer; every system from the largest mainframes to the smallest handheld computers now uses a microprocessor at its core.
Source: Wikipedia
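
To see what "doubles every 18 months" means in practice, here is a small Python sketch (my own illustrative addition, not part of the Wikipedia text) that projects a transistor count forward in time. The starting figure of 2,300 transistors (the Intel 4004 of 1971, discussed in the generations post below) and the fixed 18-month doubling period are only assumptions for the calculation.

# Moore's Law illustration: complexity doubles roughly every 18 months.
# The starting values (Intel 4004, 2,300 transistors, 1971) are illustrative.
def projected_transistors(start_count, start_year, target_year, doubling_months=18):
    months = (target_year - start_year) * 12      # time elapsed in months
    doublings = months / doubling_months          # how many doubling periods fit
    return start_count * 2 ** doublings

for year in (1971, 1980, 1990, 2000, 2007):
    count = projected_transistors(2300, 1971, year)
    print(year, "-> about", format(count, ",.0f"), "transistors")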

Friday, December 7, 2007

Input/Output

I/O (input/output), pronounced "eye-oh," describes any operation, program, or device that transfers data to or from a computer. Typical I/O devices are printers, hard disks, keyboards, and mice. In fact, some devices are basically input-only devices (keyboards and mice); others are primarily output-only devices (printers); and others provide both input and output of data (hard disks, diskettes, writable CD-ROMs).


Source: TechTarget
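
To make the idea of an I/O operation concrete, here is a small Python sketch of my own (not part of the TechTarget text). It takes input from the keyboard, sends output to the monitor, and then writes to and reads back from a file on the hard disk - a device that handles both input and output. The file name notes.txt is just a placeholder.

# Input: the keyboard is basically an input-only device.
name = input("Enter your name: ")

# Output: the monitor (via print) is basically an output-only device.
greeting = "Hello, " + name + "!"
print(greeting)

# Input and output: the hard disk can both store and return data.
with open("notes.txt", "w") as f:   # output - writing data to disk
    f.write(greeting + "\n")

with open("notes.txt") as f:        # input - reading the data back
    print("Read back from disk:", f.read().strip())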

Organization of a Computing System

1- Computer Hardware (H/W)
2- Computer Software (S/W)
3- Operating System (O.S)
4- Computer Networks (N/W)
---------------------------------------------------------------------------------------
1- Computer Hardware: All the physical components of a computer are called its hardware. In other words, every physical device is computer hardware: keyboard, mouse, printer, scanner, hard disk, CD-ROM, etc.
There are three main units of Computer Hardware.
i- Input Unit
ii- Processing Unit
iii- Output Unit
i- Input Unit: The devices used to give data and/or instructions to the computer system belong to the input unit, e.g. keyboard and mouse.
ii- Processing Unit: This unit is responsible for all data processing in the computer system. The device for this purpose is the CPU (Central Processing Unit), or microprocessor.
iii- Output Unit: The devices used to show the results of processed data belong to the output unit, e.g. monitor and printer.
We will look at these topics in more detail later on.
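
As a simple illustration of how these three units work together, the short Python sketch below (an illustrative example of mine, not a required part of the course) takes two numbers through the input unit (keyboard), processes them in the CPU (a simple addition), and presents the result through the output unit (monitor).

# Input Unit: the keyboard supplies data to the computer system.
a = float(input("Enter the first number: "))
b = float(input("Enter the second number: "))

# Processing Unit: the CPU performs the actual computation.
total = a + b

# Output Unit: the monitor displays the processed result.
print("The sum of", a, "and", b, "is", total)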

Wednesday, December 5, 2007

Generations of Computer Development

Generations of Computers

The Five Generations of Computers: The history of computer development is often described in terms of the different generations of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient, and more reliable devices. Read about each generation and the developments that led to the devices we use today.
First Generation - 1940-1956: Vacuum Tubes. The first computers used vacuum tubes for circuitry. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts. The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

Second Generation - 1956-1963: Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
Third Generation - 1964-1971: The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation - 1971: The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fifth Generation - Present and Beyond: Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Source: http://www.webopedia.com/

Introduction to Computer

Development History
The Abacus
The abacus is a calculator. Its first recorded use was in 500 B.C. The Chinese used it to add, subtract, multiply, and divide.

Analytical Engine (A Pre-Electronic Computer)
The first mechanical computer was the analytical engine, conceived and partially constructed by Charles Babbage in London, England, between 1822 and 1871. It was designed to receive instructions from punched cards, make calculations with the aid of a memory bank, and print out solutions to math problems. Although Babbage lavished the equivalent of $6,000 of his own money—and $17,000 of the British government's money—on this extraordinarily advanced machine, the precise work needed to engineer its thousands of moving parts was beyond the ability of the technology of the day to produce in large volume. It is doubtful whether Babbage's brilliant concept could have been realized using the available resources of his own century. If it had been, however, it seems likely that the analytical engine could have performed the same functions as many early electronic computers.
The First Electrically Driven Computer
The first computer designed expressly for data processing was patented on January 8, 1889, by Dr. Herman Hollerith of New York. The prototype model of this electrically operated tabulator was built for the U.S. Census Bureau to compute results of the 1890 census.
Using punched cards containing information submitted by respondents to the census questionnaire, the Hollerith machine made instant tabulations from electrical impulses actuated by each hole. It then printed out the processed data on tape. Dr. Hollerith left the Census Bureau in 1896 to establish the Tabulating Machine Company to manufacture and sell his equipment. The company eventually became IBM, and the 80-column punched card used by the company, shown in Figure 1.2, is still known as the Hollerith card.

The Digital Electronic Computer
The first modern digital computer, the ABC (Atanasoff–Berry Computer), was built in a basement on the Iowa State University campus in Ames, Iowa, between 1939 and 1942. The development team was led by John Atanasoff, a professor of physics and mathematics, and Clifford Berry, a graduate student. This machine utilized concepts still in use today: binary arithmetic, parallel processing, regenerative memory, and the separation of memory and computing functions. When completed, it weighed 750 pounds and could store 3000 bits (about 0.4 KB) of data.
The technology developed for the ABC machine was passed from Atanasoff to John W. Mauchly, who, together with engineer John Presper Eckert, developed the first large-scale digital computer, ENIAC (Electronic Numerical Integrator and Computer). It was built at the University of Pennsylvania's Moore School of Electrical Engineering. Begun as a classified military project, ENIAC was designed to prepare firing and bombing tables for the U.S. Army and Navy. When finally assembled in 1945, ENIAC consisted of 30 separate units, plus a power supply and forced-air cooling. It weighed 30 tons, and used 19,000 vacuum tubes, 1500 relays, and hundreds of thousands of resistors, capacitors, and inductors. It required 200 kilowatts of electrical power to operate.
Another computer history milestone is the Colossus I, an early digital computer built at a secret British government research establishment at Bletchley Park, Buckinghamshire, England, under the direction of Professor Max Newman. Colossus I was designed for a single purpose: cryptanalysis, or code breaking. Using punched paper tape input, it scanned and analyzed 5000 characters per second. Colossus became operational in December 1943 and proved to be an important technological aid to the Allied victory in World War II, enabling the British to read high-level German messages enciphered with the Lorenz cipher.
The 1960s and 1970s marked the golden era of the mainframe computer. Using the technology pioneered with ABC, ENIAC, and Colossus, large computers that served many users (with accompanying large-scale support) came to dominate the industry.
As these highlights show, the concept of the computer has indeed been with us for quite a while. The following table provides an overview of the evolution of modern computers—it is a timeline of important events.

Professional Course Outline

Main Modules

1- Introduction to Computer and Components
2- Operating System (MS Windows XP)
3- Microsoft Office
4- Internet and Electronic Mail
5- Typing Tutor
6- Hardware Maintenance and Troubleshooting
7- Electronic Commerce
8- Web Development and Maintenance
9- CorelDRAW
10- Photoshop
11- Basics of Computer Networks
12- Impact of IT on Jobs and Organizations

Courses for this Site

1- Professional Course
2- Elementary Course
You can learn a lot about these courses from this site without any fee. I will describe all the key concepts related to the above-mentioned courses completely free of charge. Your comments and recommendations will be treated as important.

Information Technology Institute Bhimber

The AJK Information Technology Board offers several programmes in cooperation with the private sector, including computer literacy, electronic governance, and ultimately software development programmes for unemployed IT graduates.
