The speed at which the internal clock in an electronic device oscillates. In computers, each tick (oscillation) of the clock is called a cycle, and the clock rate is measured in megahertz (millions of cycles per second) or gigahertz (billions of cycles per second). Also called clock speed, the clock rate determines how quickly the CPU can execute basic instructions, such as adding two numbers, and it is used to synchronize the activities of the various components in the system. Between 1981, when the IBM PC was released, and early 2002, typical clock rates for personal computers increased about 1000-fold, from 4.77 MHz to 2 GHz and faster. See also clock (definition 1).
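Because the clock rate counts cycles per second, the duration of one cycle is simply its reciprocal. A minimal sketch in Python, using the two historical rates quoted above:

```python
def cycle_period_seconds(clock_rate_hz: float) -> float:
    """Duration of one clock cycle: the reciprocal of the clock rate in hertz."""
    return 1.0 / clock_rate_hz

# The original IBM PC ran at 4.77 MHz, so each cycle lasted roughly 210 nanoseconds;
# at 2 GHz, a cycle takes just 0.5 nanoseconds.
ibm_pc_period = cycle_period_seconds(4.77e6)  # about 2.1e-07 s
modern_period = cycle_period_seconds(2e9)     # 5e-10 s
```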
The frequency of a computer's internal electronic clock. Every computer contains an electronic clock, which produces a sequence of regular electrical pulses used by the control unit to synchronize the components of the computer and regulate the fetch-execute cycle by which program instructions are processed.
A fixed number of clock pulses is required to execute each particular instruction. The speed at which a computer can process instructions therefore depends on the clock rate: increasing the clock rate decreases the time required to complete each instruction.
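The relationship described above can be sketched as a one-line calculation: the time to complete an instruction is its fixed cycle count divided by the clock rate. The specific figures below (an instruction needing 4 pulses, rates of 8 MHz and 50 MHz from the range cited in this entry) are illustrative assumptions:

```python
def instruction_time_seconds(cycles_per_instruction: int, clock_rate_hz: float) -> float:
    """Time to execute one instruction: a fixed pulse count divided by the clock rate."""
    return cycles_per_instruction / clock_rate_hz

# A hypothetical 4-cycle instruction takes 0.5 microseconds at 8 MHz
# but only 0.08 microseconds at 50 MHz: a faster clock, shorter time.
t_slow = instruction_time_seconds(4, 8e6)   # 5e-07 s
t_fast = instruction_time_seconds(4, 50e6)  # 8e-08 s
```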
Clock rates are measured in megahertz (MHz), or millions of pulses a second. Microcomputers commonly have a clock rate of 8–50 MHz.
(Technology) The measurement in megahertz of the speed with which a central processor can perform a series of calculations.