One of the important questions to ask when planning a microcontroller project is what oscillator frequency to choose. Usually it depends on the desired level of performance. In general, application speed is directly determined by the oscillator frequency: if you double the oscillator frequency, the application will run at double the speed.
But we do not compare different processors by frequency alone; we use another quantity – MIPS (Million Instructions Per Second). Why is that? Because different MCUs require a different number of clock cycles to perform one operation. For instance, AVR microcontrollers need 1 clock cycle per instruction (2 for some instructions), while Intel 8051 microcontrollers require a minimum of 12 oscillator cycles. So if the clock frequency is 12 MHz, an 8051 microcontroller will deliver about 1 MIPS, while an AVR microcontroller running from a 12 MHz crystal will deliver about 12 MIPS.
But do we always need maximum performance? Many developers like to select the maximum oscillator frequency supported by the chosen MCU. For instance, if the ATmega128 supports 16 MHz, many people automatically pick a value near it. This is sometimes bad practice because:
Many applications do not require the high level of performance that the microcontroller can provide;
In many CMOS-based MCUs there is an almost linear relation between oscillator frequency and power consumption. Logically, by reducing the frequency you can reduce power requirements – this becomes important in battery-powered applications;
At lower frequencies it is simpler to interface with low-speed peripherals;
The higher the clock frequency – the higher the electromagnetic interference (EMI).
So you should operate at the lowest oscillator frequency that is sufficient for your application.