How Did the Commodore 64 Generate Its System Clock Signal?
The Commodore 64 generated its system clock signal using a dedicated crystal oscillator whose output ultimately fed the VIC-II video chip, which divided the frequency down to drive the central processor. This architecture kept the video output and CPU operation synchronized, preventing visual artifacts while maintaining system stability. Understanding this process reveals how the hardware balanced video timing requirements against processing speed in both NTSC and PAL regions.
At the heart of the clock generation circuit lies a quartz crystal oscillator. In NTSC models, this crystal oscillates at 14.31818 MHz, exactly four times the NTSC color burst frequency of 3.579545 MHz. This relationship allowed the VIC-II chip to generate accurate color signals for the television display. PAL models used a 17.734475 MHz crystal, likewise four times the PAL color subcarrier, to align with the PAL broadcast standard.
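The 4x relationship can be verified with simple arithmetic. A minimal sketch; the PAL subcarrier figure of 4.43361875 MHz is the standard value implied by the crystal frequency above, not stated explicitly in the text:

```python
# Sanity-check the "crystal = 4 x color subcarrier" relationship.
NTSC_COLOR_BURST_HZ = 3_579_545      # NTSC color subcarrier, 3.579545 MHz
PAL_COLOR_BURST_HZ = 4_433_618.75    # PAL color subcarrier, 4.43361875 MHz

ntsc_crystal = 4 * NTSC_COLOR_BURST_HZ   # 14_318_180 Hz = 14.31818 MHz
pal_crystal = 4 * PAL_COLOR_BURST_HZ     # 17_734_475 Hz = 17.734475 MHz

print(f"NTSC crystal: {ntsc_crystal / 1e6} MHz")  # 14.31818 MHz
print(f"PAL crystal:  {pal_crystal / 1e6} MHz")   # 17.734475 MHz
```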
The raw signal from the crystal did not go directly to the CPU. Instead, it first passed through a clock-conditioning circuit (discrete logic on early boards, the 8701 clock generator IC on later ones) that derived the dot clock, which fed the VIC-II video display controller. The VIC-II acted as the master timing device for the system: it divided the dot clock by eight, one clock cycle per eight pixels, to create the two-phase clock required by the MOS Technology 6510 microprocessor. The resulting CPU clock speed was approximately 1.0227 MHz on NTSC machines and roughly 0.9852 MHz on PAL machines.
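Combining the crystal-to-dot-clock ratio with the VIC-II's divide-by-eight gives an overall division of 14 (NTSC) or 18 (PAL) from crystal to CPU clock. A small sketch under that assumption:

```python
# Derive the CPU clock from the crystal frequency.
# Overall chain: crystal -> dot clock -> /8 -> CPU clock,
# which works out to /14 for NTSC and /18 for PAL.
def cpu_clock_hz(crystal_hz: float, divisor: int) -> float:
    return crystal_hz / divisor

ntsc_cpu = cpu_clock_hz(14_318_180, 14)  # ~1_022_727 Hz
pal_cpu = cpu_clock_hz(17_734_475, 18)   # ~985_249 Hz

print(f"NTSC CPU clock: {ntsc_cpu / 1e6:.4f} MHz")  # ~1.0227 MHz
print(f"PAL CPU clock:  {pal_cpu / 1e6:.4f} MHz")   # ~0.9852 MHz
```

These results match the approximate figures quoted above.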
This design choice prioritized video stability over raw processing power. By deriving the CPU clock from the video clock, the system let the two chips share the memory bus cleanly: the VIC-II fetched video data during the first half of each clock cycle (phase 1), while the 6510 accessed memory during the second half (phase 2). Only on so-called bad lines, when the VIC-II needed extra fetches for character pointers, did it assert the BA line and stall the CPU for additional cycles. This interleaving prevented bus conflicts without requiring complex memory arbitration logic. The clock signal generation was therefore a fundamental component of the Commodore 64's integrated architecture.
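The bus-sharing scheme can be summarized in a few lines of code. This is an illustrative sketch of the phase interleaving, not a cycle-accurate emulation; the function name and string labels are hypothetical:

```python
# Simplified model of C64 bus sharing: the VIC-II owns the first half
# of every clock cycle (phase 1), the CPU the second half (phase 2).
# On a "bad line" the VIC-II asserts BA and takes phase 2 as well,
# stalling the CPU while it performs its extra fetches.
def bus_owners(bad_line: bool) -> tuple[str, str]:
    """Return which chip drives the bus in each half of one cycle."""
    phase1_owner = "VIC-II"                           # video always fetches on phase 1
    phase2_owner = "VIC-II" if bad_line else "CPU"    # BA stalls the CPU on bad lines
    return phase1_owner, phase2_owner

print(bus_owners(bad_line=False))  # ('VIC-II', 'CPU')
print(bus_owners(bad_line=True))   # ('VIC-II', 'VIC-II')
```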