Thousands or Millions?

Are you running in Thousands or Millions of Colors?

  • Thousands

  • Millions


It's crucial for me to find out whether Mac OS X users are running in Thousands or Millions of colors on a daily basis. Please at least vote! Thanks in advance.

-whitesaint
 
for a moment I thought you were talking about how many versions of Windows MS will have to develop and maintain. :D
 
Originally posted by Javintosh
for a moment I thought you were talking about how many versions of Windows MS will have to develop and maintain. :D

I thought it was the # of bugs in WinXP. ;) :D
 
I voted 1,000's, but for some goofy reason my left monitor ended up at millions while my main monitor is 1,000's...
 
I keep it on Millions all the time now, after some snooping around "under the hood" showed me that the mobile-version graphics accelerator in my iBook accelerates all 32-bit modes but only a couple of 16-bit modes (640x480 and 800x600) and, oddly, all 8-bit modes. That means it's actually faster to run graphics in 32-bit mode.
 
I've been hearing for years how thousands mode makes things faster and such, but I have tried it on every Mac I've owned and never noticed one shred of a difference. I'm quite sure on any G3 or G4 you cannot tell me there is a difference, certainly not a noticeable one. It's evidence of the ongoing futile attempts to squeeze one more ounce of performance.

I even tried running in grays on a slower machine to hopefully get a little jump. Nothing. It's a myth in my opinion, though I'd love to hear people's stories of how it helps them.

And for anyone considering displaying X in 16-bit, please don't disgrace the OS by doing that!
 
Originally posted by mindbend
I even tried running in grays on a slower machine to hopefully get a little jump. Nothing. It's a myth in my opinion, though I'd love to hear people's stories of how it helps them.

I too have tried this on many different systems, and it does make a difference. On any system with 8 MB of VRAM or less running at 1024x768, display redraw is noticeably faster. In fact, the difference between a Beige G3/300 and a Blue & White G3/300 has everything to do with the graphics card and the display settings. I watched one client's system redraw the desktop so slowly that one half of a window would disappear before the rest, and when I installed an ATI Rage 128 from a dead B&W, that system got a new lease on life. On cards with 16+ MB, thousands or millions doesn't make a difference at that point, but I can tell you that for older systems it is anything but a myth.
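
The framebuffer arithmetic backs this up. Here is a minimal sketch in C, assuming a plain linear framebuffer and ignoring the off-screen buffers the card also keeps in VRAM:

    #include <stdio.h>

    int main(void)
    {
        /* Bytes needed for one full screen at 1024x768. */
        long pixels = 1024L * 768L;
        printf("16-bit (thousands): %ld KB\n", pixels * 2 / 1024); /* 1536 KB */
        printf("32-bit (millions):  %ld KB\n", pixels * 4 / 1024); /* 3072 KB */
        return 0;
    }

On a card with 8 MB or less, doubling the framebuffer leaves that much less VRAM for off-screen caching, which would fit the slowdown described above.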
 
Originally posted by symphonix
I keep it on Millions all the time now, after some snooping around "under the hood" showed me that the mobile-version graphics accelerator in my iBook accelerates all 32-bit modes but only a couple of 16-bit modes (640x480 and 800x600) and, oddly, all 8-bit modes. That means it's actually faster to run graphics in 32-bit mode.
wow, thanks for that bit of info!

/me wishes he could switch vote to millions
 
I keep it on thousands just so everything is big enough to read. And considering there are cultures in the world with only 3 words for colors, I think thousands of them gives me enough. :)
 
Although the quantity of VRAM isn't technically the reason acceleration is better with more than 8 MB of VRAM, it's an appropriate guideline. The old chips were optimized for 16-bit, and 32-bit required some doubled-up work and software patching.

I remember HAM (Hold-And-Modify) mode on a Commodore Amiga ...

But later, as things moved to 32 bit, graphics cards started thinking in 32 bit and would actually downsample to 16 bit for display. So on the Rage 128, 32-bit draws may sometimes be faster than 16-bit draws, because that's more natural for the card.

On Radeons, GeForce3s, and above, 32 bit is probably preferred, although there will be little perceivable difference in most tasks anyway.

Actually, on some of the early builds of Mac OS X (beta and such) on my G3 233 laptop, thousands was noticeably faster than millions, and that makes sense considering the hardware. ... but now I can't tell the difference by speed, only by whether or not my desktop background dithers. Millions there now too.

The biggest kick is to slop it into 8 bit, and watch OS X fake colors for pulsing buttons. It cracks me up. Suffice it to say that OS X thinks in 32 bit color and downsamples when asked to. Millions everywhere as far as I'm concerned.
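
For anyone curious what that downsampling actually amounts to, here is a minimal sketch in C, assuming the common ARGB8888 and RGB565 pixel layouts (real cards do this in hardware, and the exact channel order varies):

    #include <stdint.h>
    #include <stdio.h>

    /* Squeeze a 32-bit ARGB pixel into 16-bit RGB565: keep the top
       5 bits of red and blue, the top 6 bits of green, drop alpha. */
    uint16_t argb8888_to_rgb565(uint32_t argb)
    {
        uint32_t r = (argb >> 16) & 0xFF;
        uint32_t g = (argb >>  8) & 0xFF;
        uint32_t b =  argb        & 0xFF;
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main(void)
    {
        uint32_t pixel = 0xFF336699;  /* an opaque blue-gray */
        printf("0x%08X -> 0x%04X\n", (unsigned)pixel,
               (unsigned)argb8888_to_rgb565(pixel));
        return 0;
    }

The low bits of each channel are simply thrown away, which is why a smooth 32-bit gradient bands visibly at thousands.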
 
I've also noticed that on my iBook things seem to go faster with millions of colors. I guess it has more calculations to do if it is constantly downsampling to thousands.
 
I tried switching to thousands on my slow-ass 233 iMac (6 MB VRAM), and it seemed not to make one iota of difference. d'oh!
 
Hmm... My TiBook 500's monitor is set to millions, but my external 17" LCD at 1280x1024 is at thousands, because the TiBook can't drive it at millions. It's a pity, but at least I can work with two very good monitors. So I'm glad to use both, but I prefer millions all the time.
 
so with 46 votes in - a huge response for a poll in 2 days, btw - might I ask why it is "crucial" you know this? :D
 
I was wondering the same thing - what ya doing WS?

BTW, I'm at millions. I was utterly surprised to see so many people using thousands. Personally, millions seem more 'true to life' and thousands feels kinda fake. But then again, it *is* just a desktop :)

While we're on the subject, I have a bit --> color question.
Everyone knows 8-bit is 256 colors, because 2^8 = 256.
Now, let's see, 16-bit is 65,536 colors (2^16).
And 24-bit must be 16,777,216 colors (millions).
So what's up with 32-bit? 4,294,967,296.
Does any video card or monitor support 4 billion colors?
Which also begs the question: could we tell the difference? And 64-bit color? hmmm
 
Originally posted by kilowatt
While we're on the subject, I have a bit --> color question.
Everyone knows 8-bit is 256 colors, because 2^8 = 256.
Now, let's see, 16-bit is 65,536 colors (2^16).
And 24-bit must be 16,777,216 colors (millions).
So what's up with 32-bit? 4,294,967,296.
Does any video card or monitor support 4 billion colors?
Which also begs the question: could we tell the difference? And 64-bit color? hmmm

According to one of my teachers, the only difference between 24-bit and 32-bit is that 32-bit images (Photoshop?) use those extra 8 bits for alpha channels and transparency; other than that, there is no difference...

But I don't know how true his statement is... :p
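
The teacher has it roughly right, as far as I know: in a 32-bit pixel, the extra byte is alpha, not extra color. A minimal sketch of the usual 8/8/8/8 split (the ARGB channel order here is an assumption; it varies by platform):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t pixel = 0x80FF8000;       /* 50% alpha, orange */
        unsigned a = (pixel >> 24) & 0xFF; /* alpha: opacity, not color */
        unsigned r = (pixel >> 16) & 0xFF;
        unsigned g = (pixel >>  8) & 0xFF;
        unsigned b =  pixel        & 0xFF;
        printf("A=%u R=%u G=%u B=%u\n", a, r, g, b);
        /* Only R, G, B describe a color, so distinct colors = 2^24. */
        printf("distinct colors: %lu\n", 1UL << 24); /* 16777216 */
        return 0;
    }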
 
Kilowatt, you're right for the most part, except 32-bit is still millions of colors.

Kilowatt and Ed - I'm sorry, I can't say. I'm just very excited to see 2/3 of Mac OS X users running in millions! :D
 
wait, so if 2^32 = 4,294,967,296, how is that only millions of colors?

I guess it's millions of colors with ~3 billion extra pieces of data?
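
That's about right, and the counting works out cleanly: 2^32 = 2^8 x 2^24, so a 32-bit pixel holds 256 possible alpha values for each of the 16,777,216 possible colors. All 4,294,967,296 bit patterns exist as pixel values, but only the 24 color bits ever reach the screen; the extra byte is blend information (or just padding on hardware that ignores it).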
 