Clock rate

The clock rate typically refers to the frequency at which a chip, such as a central processing unit (CPU) or one core of a multi-core processor, is running, and is used as an indicator of the processor's speed. It is measured in clock cycles per second or its equivalent, the SI unit hertz (Hz). The clock rate of the first generation of computers was measured in hertz or kilohertz (kHz); the first personal computers (PCs), which arrived throughout the 1970s and 1980s, had clock rates measured in megahertz (MHz); and in the 21st century the speed of modern CPUs is commonly advertised in gigahertz (GHz). This metric is most useful when comparing processors within the same family, holding constant other features that may affect performance.

Video card and CPU manufacturers commonly select the highest-performing units from a manufacturing batch, rate them at a higher maximum clock rate, and sell them at a premium price, a practice called binning. For a given CPU design, the clock rates are determined at the end of the manufacturing process through actual testing of each processor. Chip manufacturers publish a 'maximum clock rate' specification, and they test chips before selling them to make sure they meet that specification, even when executing the most complicated instructions with the data patterns that take the longest to settle, testing at the temperature and voltage combination that yields the worst-case performance. Processors that pass the tests for a given set of standards may be labeled with a higher clock rate, e.g., 3.5 GHz, while those that fail the standards of the higher clock rate yet pass the standards of a lesser clock rate may be labeled with the lesser clock rate, e.g., 3.3 GHz, and sold at a lower price.

The clock rate of a CPU is normally determined by the frequency of an oscillator crystal. Typically, a crystal oscillator produces a fixed sine wave, the frequency reference signal. Electronic circuitry translates that into a square wave at the same frequency for digital electronics applications (or, when a CPU multiplier is used, at some fixed multiple of the crystal reference frequency). The clock distribution network inside the CPU carries that clock signal to all the parts that need it. An A/D converter has a 'clock' pin driven by a similar system to set the sampling rate. With any particular CPU, replacing the crystal with another crystal that oscillates at half the frequency ('underclocking') will generally make the CPU run at half the performance and reduce the waste heat produced.
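To make the multiplier relationship concrete, the minimal sketch below computes an effective core clock from a crystal reference frequency and a multiplier, along with the resulting cycle time (period = 1/f). The 100 MHz reference and the multiplier settings are illustrative assumptions, not figures from any particular processor.

```python
def core_clock_hz(crystal_hz: float, multiplier: float) -> float:
    """Effective core frequency = crystal reference frequency * multiplier."""
    return crystal_hz * multiplier


def cycle_time_ns(clock_hz: float) -> float:
    """Duration of one clock cycle in nanoseconds (period = 1 / f)."""
    return 1e9 / clock_hz


if __name__ == "__main__":
    crystal = 100e6  # hypothetical 100 MHz reference signal
    for mult in (33, 35):  # hypothetical multiplier settings
        f = core_clock_hz(crystal, mult)
        print(f"x{mult}: {f / 1e9:.2f} GHz, cycle time {cycle_time_ns(f):.3f} ns")
```

With these assumed values, a x33 multiplier yields 3.30 GHz (about a 0.303 ns cycle) and x35 yields 3.50 GHz (about 0.286 ns), matching the binned speed grades mentioned above; halving the reference frequency, as in the underclocking example, halves the effective clock and doubles the cycle time.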
Conversely, some people try to increase the performance of a CPU by replacing the oscillator crystal with a higher-frequency crystal ('overclocking'). However, the amount of overclocking is limited by the time needed for the CPU's circuits to settle after each clock pulse and by the extra heat created.
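As a rough first-order illustration of this trade-off, the sketch below models a purely compute-bound workload, whose execution time scales as 1/f, together with the standard CMOS dynamic-power approximation P ≈ C·V²·f. The cycle count, switched capacitance, and voltage are hypothetical; real workloads are rarely perfectly compute-bound, and voltage usually must rise with frequency, which worsens the heat problem further.

```python
def exec_time_s(cycles: int, clock_hz: float) -> float:
    """Time to execute a fixed number of clock cycles at a given clock rate."""
    return cycles / clock_hz


def dynamic_power_w(switched_cap_f: float, volts: float, clock_hz: float) -> float:
    """First-order CMOS dynamic power estimate: P ~ C * V^2 * f."""
    return switched_cap_f * volts ** 2 * clock_hz


cycles = 7_000_000_000  # hypothetical workload: 7 billion clock cycles
base = 3.5e9            # nominal 3.5 GHz clock
for label, f in [("underclocked (f/2)", base / 2),
                 ("nominal", base),
                 ("overclocked (+10%)", base * 1.1)]:
    t = exec_time_s(cycles, f)
    p = dynamic_power_w(1e-9, 1.2, f)  # hypothetical C = 1 nF, V = 1.2 V
    print(f"{label:>20}: {t:.2f} s, ~{p:.2f} W dynamic power")
```

Under these assumptions, halving the clock doubles the run time while halving dynamic power, and a 10% overclock trims run time by about 9% while raising dynamic power at least proportionally, which is why heat, not just signal settling time, bounds practical overclocking.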

[ "Computer hardware", "Electronic engineering", "Real-time computing", "Electrical engineering", "Parallel computing", "Overclocking" ]
Parent Topic
Child Topic
    No Parent Topic