If you have video and then lose it, could it be that the previously-attached monitor was set to a resolution the Cinema Display doesn't support? Or that the native resolution of that HUGE display simply isn't supported by the ATI card?
There are preferences you can reset or change in both Mac OS 9 and OS X to set the default resolution. The preferences are the likely culprit because the video works (at the "native" resolution?) right up until the OS starts loading.
Attach a regular VGA monitor and then...
In Mac OS 9 delete:
Display Preferences
...and in Mac OS X delete (if present — a scripted way to do this is sketched after the list):
com.apple.systempreferences.plist
ATIDisplays3.plist
ATIMonitor.plist
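
If you'd rather not hunt the OS X files down by hand, here's a minimal Python sketch of that clean-up step. It assumes the files live in your user's ~/Library/Preferences folder; the ATI ones may instead sit in /Library/Preferences, so check there too if nothing turns up.

#!/usr/bin/env python
# Minimal sketch of the Mac OS X preference clean-up above.
# Assumption: the files live in ~/Library/Preferences for the current user;
# the ATI ones may instead be in /Library/Preferences.
import os

PLIST_NAMES = [
    "com.apple.systempreferences.plist",
    "ATIDisplays3.plist",
    "ATIMonitor.plist",
]

prefs_dir = os.path.expanduser("~/Library/Preferences")

for name in PLIST_NAMES:
    path = os.path.join(prefs_dir, name)
    if os.path.exists(path):
        os.remove(path)
        print("Deleted: " + path)
    else:
        print("Not found (nothing to delete): " + path)
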
Reboot the computer and reset the PRAM by holding Command-Option-P-R at startup.
Boot into Mac OS 9 first and set the monitor resolution to something the Cinema Display will support. Then download and install the latest ATI drivers and firmware, and afterward make sure the resolution is still set to one the Cinema Display supports.
Reboot into Mac OS X, install the latest ATI drivers there as well, and again set the resolution to something the Cinema Display will support.
Shut down the computer and reattach the Cinema Display. See if it works.
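
Once the Cinema Display is back on, a quick way to confirm OS X actually sees it, and at what resolution, is a small sketch like the one below. It assumes your version of OS X ships the system_profiler command-line tool.

#!/usr/bin/env python
# Sketch: ask OS X which displays it sees and at what resolution.
# Assumes the system_profiler command-line tool is available.
import os

report = os.popen("system_profiler SPDisplaysDataType").read()

for line in report.splitlines():
    stripped = line.strip()
    # Print only the interesting lines from the graphics/displays report.
    if stripped.startswith("Resolution:") or stripped.startswith("Displays:"):
        print(stripped)
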
No guarantee it'll work, but these are the basic steps I'd try first.