Wednesday, August 8, 2012

Microprocessor characterization, language, instruction set and working


What is a microprocessor?

A microprocessor can be defined as follows:
"A multipurpose, programmable logic device which is capable of reading binary instructions from a storage device referred to as memory, accepts data as input, processes it according to those instructions and provides the result as output."
A microprocessor can be considered to be an electronic integrated chip, or a set of chips, that implements the central processor of a computer system. The microprocessor in a microcomputer resembles the brain in the human body, which controls all the peripherals of the body. The main parts of a microprocessor are:
1) Arithmetic and logic unit
2) Control unit

The properties by which a microprocessor can be characterized are:
• Speed
• Word length
• Architecture
• Instruction set

Digital computers can be classified into different groups on the basis of:
• Size
• Processing power
• Cost and
• Complexity

Depending upon these factors, the three main classes of digital computers are:
1. Microcomputers
2. Minicomputers
3. Mainframes

Minicomputers and mainframes are more complex and powerful than microcomputers, which, unlike the other two, use only a single-IC processor. A minicomputer has a word length ranging from 16 bits to 32 bits and generally uses two or more CPUs.

What is a microcomputer?

This category of computers is characterized by its low speed and limited data storage. It uses only one CPU, which is just a single chip. It has low processing power and is generally used in personal computing and control applications.
Any programmable device can be represented by three fundamental components, as in the figure:
• Microprocessor
• Memory
• Input/output devices

When these three parts work and interact together to perform a given task, the whole is called a system. The physical components, i.e. the electronic circuits for the input/output devices, the memory unit and the microprocessor chip itself, are called hardware, whereas a set of instructions written to perform a given task is called a program, and a collection of programs that performs some pre-specified tasks is referred to as software.

The microprocessor receives the encoded instructions, then decodes and executes them. The word length of a microprocessor depends upon the width of its data bus, but it is independent of the length of the instructions and operands that the microprocessor handles. The microprocessor can handle arithmetic, logical (Boolean) and alphanumeric data. The time taken to execute the basic instructions decides the speed of the microprocessor.
Also, as the microprocessor is a synchronous sequential electronic circuit, clock signals are required for the execution of each of its instructions, so the speed of a microprocessor also depends upon the clock frequency of the crystal used in the system. Each basic instruction takes a fixed number of clock pulses generated by the crystal.
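
As a rough illustration of this relationship, the short C sketch below computes how long one instruction would take for an assumed clock frequency and an assumed number of clock pulses per instruction; the 3 MHz figure and the cycle count are illustrative values, not taken from any particular chip.

#include <stdio.h>

int main(void) {
    double clock_hz = 3.0e6;   /* assumed crystal/clock frequency: 3 MHz         */
    int    cycles   = 4;       /* assumed clock pulses needed by one instruction */

    double seconds = cycles / clock_hz;                 /* time for one instruction */
    printf("one instruction takes %.2f microseconds\n", seconds * 1e6);
    printf("about %.0f such instructions per second\n", clock_hz / cycles);
    return 0;
}
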
Representation and organization of data in memory are described in the next section.

How is memory organized in microcomputers?

Just as data is written on the pages of a notebook, data storage in an electronic memory chip can be likened to writing binary digits on semiconductor "pages": a break occurs after a fixed number of binary digits, and each page holds a fixed number of such lines.

Generally, each line in the memory is called a register, i.e. a collection of 8 bits, given the name of one byte. Such bytes or registers are arranged sequentially in the memory and occur in groups that are powers of 2; for example, a memory of 1K byte groups together 2^10 registers (bytes). The set of instructions written by the programmer makes the microcomputer retrieve data from some external storage device, process it, and either store it in memory or give it to some external device to display, such as an LCD display, a seven-segment display, a computer monitor, or a speaker in the case of a sound signal, etc.

What is a memory module?

In our further discussion of memory organization we will use the term memory module, which refers to a device that stores data. A memory module contains a fixed number of memory locations, where each location holds a fixed number of bits; the number of bits in each location is called the word length. To understand memory organization, take the example of a memory module of size (n x m) bits: its word length is m and its memory locations are numbered from 0 to n-1. The number of data lines equals the word length m, while the number of address lines depends on the number of locations n, so for a fixed total capacity of (n x m) bits, a larger word length means fewer locations and hence fewer address lines.
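
The C sketch below models such an (n x m) module as a simple byte array, assuming n = 1024 locations and a word length of m = 8 bits; the sizes are only an example, and the address-line count is derived as log2(n).

#include <stdio.h>
#include <stdint.h>

#define LOCATIONS 1024              /* n: number of memory locations (1K)       */
                                    /* word length m = 8 bits (one byte)        */
static uint8_t memory[LOCATIONS];   /* locations are addressed 0 .. n-1         */

int main(void) {
    memory[0]             = 0x3C;   /* write to the first location (address 0)  */
    memory[LOCATIONS - 1] = 0xA5;   /* write to the last location (address n-1) */

    /* Data lines equal the word length; address lines equal log2(n). */
    int address_lines = 0;
    for (unsigned n = LOCATIONS; n > 1; n >>= 1)
        address_lines++;

    printf("total capacity : %d bits\n", LOCATIONS * 8);
    printf("data lines     : 8\n");
    printf("address lines  : %d\n", address_lines);
    return 0;
}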

The microprocessor processes data taken from outside and returns the result to external peripherals. Connecting the microprocessor to the outside world is done with the help of hardware called input/output (I/O) devices or I/O modules, which are discussed next.

What is an input/output module/device?

As the name suggests, it comprises two parts: input modules/devices and output modules/devices.

Input devices
Input devices provide data to the microprocessor in binary form, performing the function of the system's senses to connect it with the outer world. These include the keyboard, teletype and analogue-to-digital converter.

Output devices
Output devices reproduce the result obtained from the processing section: they decode the binary result and produce it in a form understandable to the outer world. These devices include the monitor, loudspeaker, printer, LCD display, etc.

Any system requires some important basic parts to make I/O devices usable. These can be listed as:
1. Input/output ports
2. Interfacing circuitry

An input/output port provides the points in the system through which it can have a physical connection with I/O devices for the bidirectional flow of data between the system and the outer world. The system may have more than one I/O port, but only one is accessed at a time; its selection is done by the microprocessor itself, while the direction of data flow is determined by control signals generated by the microprocessor, i.e. the instruction decides whether data is to be transferred from the system to an outer device or is to be received by a particular port.

Data dealt with by the microprocessor has to be in binary form for any processing to be carried out on it; similarly, the data produced by the microprocessor is also in binary form, but that is generally not understandable by the outer world. Here the interface circuitry converts outside signals into binary form and converts the binary results produced by the microprocessor into some other form understandable to outer devices. In this way the interface circuitry connects the system to the outside devices. Some of the output devices are LEDs, cathode ray tubes, printers, etc.

What are the instruction set and language of a microprocessor?

Any microprocessor understands only machine language, i.e. 0s and 1s, but each one differs in its instruction set. Every microprocessor has its own words, their meanings and the language that it understands. These words are formed by different combinations of the basic binary digits (0 and 1), called bits. In this way we can define a word as:

"The number of bits the microprocessor recognizes and processes at a time is termed as a word"

The word length of a microprocessor may vary from 4 bits to 64 bits: 4-bit words are used in small microcomputers, and large words of up to 64 bits are used in high-speed, large computers. Generally a word is expressed in bytes, i.e. groups of 8 bits, so a 32-bit microprocessor can be said to have a word length of 4 bytes.

For the sake of convenience, a group of 4 bits is termed a nibble, and thus a byte is made up of 2 nibbles, a lower nibble and an upper nibble.
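
A short C sketch of this byte/nibble split is given below; the byte value 0x5A is an arbitrary example.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t byte  = 0x5A;               /* 0101 1010 in binary       */
    uint8_t upper = (byte >> 4) & 0x0F; /* upper nibble: 0101 = 0x5  */
    uint8_t lower = byte & 0x0F;        /* lower nibble: 1010 = 0xA  */

    printf("byte %02X -> upper nibble %X, lower nibble %X\n", byte, upper, lower);
    return 0;
}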

How does a microprocessor work?

A microprocessor can be made to perform a specified task by the programmer with the help of a set of instructions it understands. The instructions have to be in binary form, i.e. 0s and 1s, but since it is difficult for the programmer to learn instructions in binary form, they are given English-like words to make them easier to learn. The collection of these English-like words, each equivalent to a particular instruction, constitutes a language that the programmer uses to communicate with the microprocessor. This language is referred to as assembly language, and programs written in this language are called assembly programs. Assembly programs are written for a specific microprocessor, and as each microprocessor understands its own unique instruction set, a program written for one machine may or may not be transferable to another.

This is a big limitation of assembly language. To overcome it, many general-purpose languages have been developed; they provide transferability of the programs written in them because they are machine independent. Some of those are:
• BASIC
• FORTRAN
• PASCAL
• C and C++

These languages are closer to human understandable languages like English and are termed as high-level languages.

Different languages and their working

Machine language
When a microprocessor is designed, the designers select specific combinations of bits and give them certain meanings, based on their function, with the help of electronic logic circuitry; these are called the basic instructions of the concerned microprocessor. The set of such instructions designed into the machine is called its machine language, and it is constructed with only 0s and 1s. Taking the example of the 8085 microprocessor, whose word length is 8 bits, its instruction set is nothing but different combinations of these 8 bits: the 256 possible bit patterns are used to construct 74 different instructions that perform specific operations.

Assembly language

Although the instructions in machine language may be written in hexadecimal form, it is still a difficult task to memorize these codes in order to make the microprocessor perform different functions, while alphanumeric codes are relatively easier to memorize. That is why these hexadecimal codes are given English-like alphanumeric codes in place of the hex or binary codes. The language that results is assembly language; it works in the same way that machine language does, and the alphanumeric codes equivalent to the instructions are called mnemonics.

Low-level languages

Both machine language and assembly language fall in this category of programming languages, which are not transferable to any other microprocessor, i.e. these programs are microprocessor-specific and are written for a specific chip. A program written in a low-level language must first be converted into its equivalent binary form to be read by the microprocessor. This conversion is done either manually or with the help of a program, referred to as an assembler, that reads each mnemonic instruction, converts it into its equivalent binary form, and then moves on to read the next instruction, and so on till the end of the program. The program thus obtained can be fed to the microprocessor for execution.
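
The C sketch below shows the core idea of such an assembler as a simple mnemonic-to-opcode lookup, using a hand-picked handful of one-byte 8085 instructions; a real assembler also handles operands, labels and multi-byte instructions.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

struct entry { const char *mnemonic; uint8_t opcode; };

/* A few well-known one-byte 8085 opcodes, used here only for illustration. */
static const struct entry table[] = {
    { "NOP",     0x00 },
    { "MOV A,B", 0x78 },
    { "ADD B",   0x80 },
    { "HLT",     0x76 },
};

/* Translate one mnemonic into its machine-code byte; return -1 if unknown. */
static int assemble(const char *mnemonic) {
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].mnemonic, mnemonic) == 0)
            return table[i].opcode;
    return -1;
}

int main(void) {
    const char *program[] = { "MOV A,B", "ADD B", "HLT" };
    for (size_t i = 0; i < 3; i++)
        printf("%-8s -> %02X\n", program[i], (unsigned)assemble(program[i]));
    return 0;
}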

High-level languages


These languages are closer to spoken languages and are more understandable to the programmer. Here a single high-level instruction may correspond to more than one machine-level instruction. Like a low-level language program, a high-level language program also has to be converted first into an equivalent binary code for execution by the microprocessor, which is done by programs just as the assembler does in the case of a low-level language. There are two types of such conversion programs:
1. Interpreter
2. Compiler

The program written in a high-level language is called source code, while its equivalent conversion into machine language is called object code, which is what is finally executed. Both the interpreter and the compiler convert source code into object code.
The interpreter reads, converts and executes each instruction of the source code before heading to the next one, i.e. it first reads one instruction, converts it into binary form, executes it and then moves on to the next one.
The compiler first reads and converts the whole source code into object code, and this object code is then executed at once, as a whole.
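
The difference can be sketched in C with a toy two-instruction "language" (ADD n and SUB n acting on an accumulator); the language and the function names are hypothetical and only meant to contrast the two approaches.

#include <stdio.h>

enum op { OP_ADD, OP_SUB };
struct insn { enum op op; int n; };   /* one "object code" instruction */

/* Interpreter: read, translate and execute one source line at a time. */
int interpret(const char *src[], int count) {
    int acc = 0, n;
    for (int i = 0; i < count; i++) {
        if (sscanf(src[i], "ADD %d", &n) == 1) acc += n;
        else if (sscanf(src[i], "SUB %d", &n) == 1) acc -= n;
    }
    return acc;
}

/* Compiler: first translate the whole source into object code ... */
int compile(const char *src[], int count, struct insn out[]) {
    for (int i = 0; i < count; i++) {
        int n;
        if (sscanf(src[i], "ADD %d", &n) == 1) out[i] = (struct insn){ OP_ADD, n };
        else if (sscanf(src[i], "SUB %d", &n) == 1) out[i] = (struct insn){ OP_SUB, n };
    }
    return count;
}

/* ... then execute the object code as a whole, without re-reading the source. */
int run(const struct insn code[], int count) {
    int acc = 0;
    for (int i = 0; i < count; i++)
        acc += (code[i].op == OP_ADD) ? code[i].n : -code[i].n;
    return acc;
}

int main(void) {
    const char *source[] = { "ADD 5", "ADD 3", "SUB 2" };
    struct insn object[3];
    printf("interpreted result: %d\n", interpret(source, 3));
    printf("compiled result:    %d\n", run(object, compile(source, 3, object)));
    return 0;
}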

Memory allocation and addressing

In any system all the peripherals are identified by an address, which occurs in binary form. The number of lines in the address bus of a chip determines the maximum number of memory locations that can be accessed by the chip (microprocessor). Taking the example of the 8085 microprocessor, it has 16 address lines, i.e. each address has 16 bits, so the maximum possible number of address locations is 2^16 = 65,536. This means 64K bytes of memory can be handled by it.
Now, in order to retrieve data stored at a memory location, say A201H, the microprocessor first puts the address A201H on the address bus, which is sent to the memory module. In memory, data is stored like in a notebook: the first two hex digits (A2) of the location address give a particular page number, while the remaining two digits (01) give the line number on that page. The microprocessor sends an RD (read) signal to the memory module, which puts the data stored at that location on the data bus, and the data is received by the microprocessor.
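
The page/line split of the example address can be sketched in C as taking the high and low bytes of a 16-bit address; only the address value from the example above is used.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint16_t address = 0xA201;          /* 16-bit address from the example       */
    uint8_t  page    = address >> 8;    /* high byte: page number (0xA2)         */
    uint8_t  line    = address & 0xFF;  /* low byte: line on that page (0x01)    */

    printf("address %04X -> page %02X, line %02X\n", address, page, line);
    printf("16 address lines can select %u locations (64K)\n", 1u << 16);
    return 0;
}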

History of microprocessors

The Intel 4004, introduced in 1971, was the first microprocessor; it was a 4-bit PMOS microprocessor used for small applications. The 4004 and its improved 4-bit version, the 4040, were used for the following applications:
• Industrial control applications
• Calculators and instrumentation
• Commercial applications
• Video games
• Toys, etc.

The first 8-bit microprocessor, the Intel 8008, was introduced in 1972; it was also based on PMOS technology. It had some disadvantages, such as slow speed and incompatibility with TTL logic due to the PMOS technology used in it. These limitations led to the introduction of a new, more powerful and faster microprocessor, the Intel 8080, but it too had a drawback: it operated on three power supplies. To eliminate this, another chip was introduced, the 8085, in 1976, which operated on only a single 5 V power supply and used NMOS technology. The 8080 is still used in some places, such as laboratories, for the purpose of teaching the functioning of a basic microprocessor. Some other 8-bit microprocessors developed by other companies are:
Zilog's
• Z800
• Z80
Motorola's
• MC 6800
• MC 6809

These 8-bit microprocessors were very useful in industrial and other control applications; some of their uses are:
• Instrumentation industry applications
• Small general-purpose computers

These have a memory addressing capability of 64K and clock frequencies in the range of 1-6 MHz, using LSI technology. Now many 16-bit and 32-bit microprocessors from many different companies are also present in the market; they are capable of addressing a larger memory and can work at higher clock frequencies.
