dual video display with radeon 8500

crash and burn
hey all, i posted in a thread a while back about flashing the ROM on a PC ATi AGP radeon 8500 (128 MB DDR) for the mac. i have been using it since then with no trouble.
HOWEVER (isn't there always a "however"?), i thought that if i used a DVI-to-VGA adapter on the card's DVI port, i would be able to use both the VGA and DVI plugs and have two monitors running from the same single card. whenever i try this, i always get one completely blue screen at boot, and the other screen says "out of range".

any ideas?

thanks guys
 
It should work. Of course, there are always problems. First, it could be as simple as the default resolution it's trying to use on the DVI port not working on your second monitor. Second, your DVI-to-VGA adaptor could be foobar'd. Third, if your card is not a genuine ATI retail boxed card, it may not have the proper circuitry to do analog output off the DVI port. I've seen many of the 64MB LE cards specifically listed as not supporting DVI-VGA adaptors.
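
On the first option, a quick way to see what the card reports for each attached display (and what resolution the DVI port comes up at) is a minimal sketch like the one below, assuming a Mac OS X release where the `system_profiler` command-line tool and its SPDisplaysDataType section are available; on older systems the Apple System Profiler application shows similar information.

```python
# Minimal sketch: dump what the OS reports for the graphics card(s) and
# attached displays, so the DVI port's default resolution can be compared
# against what the second monitor accepts.
# Assumes the `system_profiler` CLI exists (newer Mac OS X releases).
import subprocess

def report_displays():
    result = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)

if __name__ == "__main__":
    report_displays()
```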

I'd look at the first option first, since your monitor is saying it's out of range. What type of monitor is it? Are both of your monitors the same?

Brian
 
yep, both monitors are identical: ViewSonic A90s... 19"


here's the REALLY weird part: if i plug the VGA adapter into the DVI port and run the monitor from that, it's fine. i have another card (a PCI ATi Rage 128 Pro) for the second monitor. so, one plug on the Radeon and one on the Rage Pro always works. AND, if i plug a monitor into just the standard VGA plug on the Radeon and one on the Rage Pro, then it ALSO works fine. the only time a problem arises is when i try to use both plugs on the Radeon at the same time.

does that help you diagnose this? i know it only makes me more confused.
 
Actually, that does make things a bit clearer. It sounds like a RAMDAC issue: the card probably only has the ability to do a D/A conversion on one interface at a time. So, when you have a second monitor attached, it's not able to send a proper analog signal out the DVI port, since it can only create one analog signal at a time. I could be wrong, but that sounds like the most likely culprit.
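
To make that hypothesis concrete, here is a toy sketch of the constraint being described; the VideoCard class and the RAMDAC count are hypothetical, for illustration only, not verified specs of the 8500 or real driver code. A passive DVI-to-VGA adapter still needs an analog signal, so it competes with the VGA port for the same D/A converter.

```python
# Toy illustration of the single-RAMDAC hypothesis; numbers and names are
# hypothetical, not a verified spec of this card.

ANALOG_OUTPUTS = {"VGA", "DVI via VGA adapter"}  # both require a D/A conversion

class VideoCard:
    def __init__(self, ramdacs):
        self.ramdacs = ramdacs  # digital-to-analog converters available

    def can_drive(self, outputs):
        # DVI through a passive VGA adapter still needs an analog signal,
        # so it counts against the RAMDAC budget just like the VGA plug.
        analog_needed = sum(1 for o in outputs if o in ANALOG_OUTPUTS)
        return analog_needed <= self.ramdacs

card = VideoCard(ramdacs=1)
print(card.can_drive(["VGA", "DVI"]))                  # True: DVI stays digital
print(card.can_drive(["VGA", "DVI via VGA adapter"]))  # False: two analog signals, one RAMDAC
```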

Have you done any research on the model you have? Is it ATI retail, ATI OEM, or a third party using the ATI chip? That's probably your best bet for tracking down the exact problem; if it is an issue with the hardware design, I'm sure someone with a PC has run into it.

Brian
 
Flashing a "winblows" BIOS to your Apple card is not a good idea. This is probably the reason you are not getting dual display.
And if memory serves correctly, you cannot have DVI and VGA output at the same time. If you want to do dual monitors, your card should have come with a VGA or DVI splitter...
 
Actually, you can have both working, even on flashed cards. Many have done it, and many will continue to do it. I had a flashed 64MB 8500 LE and used both the VGA and DVI ports, with an adaptor, to power two monitors. But I had issues with the card overheating (that's what you get for buying refurb cards), so I got tired of playing, got a normal 8500 Mac Edition, and am using it with two VGA monitors. The problem comes in if you're not using an ATI-made card, or one of the cheaper ATI-made ones.

Brian
 
it's an ATi Radeon 8500 with 128 MB DDR. store-bought, and it came in ATi packaging with the cellophane and everything. i'm a little out of the 'graphics card' loop... what's a "DVI or VGA splitter"? if the card has two ports, why should i need anything more than a DVI-to-VGA adapter?

i'm pretty sure i had read about others on this board who had successfully flashed their radeon cards and were using them with dual displays with no problems. am i an exception?

anyway, thanks for all the responses. hopefully i can figure this out.
 