Quote
Not being saucy, but 32-bit RGB isn't millions of colors. It's three 8-bit color channels, plus an 8-bit alpha channel. IMPORTANT difference
Ummm. Not quite right.
8-bit = 256 shades
8 bit + 8 bit + 8 bit = 24-bit (no alpha channel, obviously), BUT
256 + 256 + 256 = 768... and 24-bit does not equal 768 different shades.
What 24-bit actually is:
2 raised to the 24th power, or 16,777,216 shades
(you multiply the channels together, 256 x 256 x 256, rather than add them).
To find that, you can simply double the count for every bit you add:
1-bit x 2 = 2 colors
2-bit x 2 = 4 colors
3-bit (4 colors) x 2 = 8 colors
4-bit (8 colors) x 2 = 16 colors
5-bit (16 colors) x 2 = 32 colors
6-bit (32 colors) x 2 = 64 colors
7-bit (64 colors) x 2 = 128 colors
8-bit (128 colors) x 2 = 256 colors, and so on (the quick sketch below runs the same numbers).
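A quick sketch in Python of that doubling, just raising 2 to the number of bits (this is only a check on the arithmetic, nothing Mac-specific):

```python
# The number of distinct values an n-bit channel or pixel can hold is 2**n.
for bits in (1, 2, 3, 4, 5, 6, 7, 8, 16, 24):
    print(f"{bits}-bit = {2 ** bits:,} colors")

# 8-bit  = 256 colors (one channel)
# 16-bit = 65,536 colors
# 24-bit = 16,777,216 colors ("millions")
```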
I believe when the Mac hits 16-bit ("Thousands of Colors") it splits those 16 bits across the R, G and B channels, about 5 bits per channel (with a bit left over), which works out to 32,768-65,536 total colors, not 65,536 per channel.
32-bit works out to 8 bits per channel across R, G, B and an alpha channel (8 + 8 + 8 + 8 = 32), so the color information is still the 24-bit figure of 16,777,216; the extra 8 bits go to the alpha channel.
That is why 32-bit is MUCH smoother than 16-bit, basically it has a few hundred times the number of colors to work with.
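To make the 16-bit vs. 24-bit comparison concrete, here's a small Python sketch; the 5-5-5 and 5-6-5 channel splits are my assumptions about typical 16-bit modes, not something anyone stated above:

```python
def total_colors(bits_per_channel):
    """Multiply the shades available in each channel to get total colors."""
    total = 1
    for bits in bits_per_channel:
        total *= 2 ** bits
    return total

thousands_555 = total_colors((5, 5, 5))   # 32,768  (assumed Mac "Thousands", 1 bit unused)
thousands_565 = total_colors((5, 6, 5))   # 65,536  (assumed common 16-bit split)
millions_888  = total_colors((8, 8, 8))   # 16,777,216 (24-bit color inside a 32-bit pixel)

print(millions_888 // thousands_555)  # 512x more colors than 5-5-5
print(millions_888 // thousands_565)  # 256x more colors than 5-6-5
```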
I believe Photoshop is the best-known program that works with 24-bit color plus an 8-bit alpha channel, and that combination is exactly what "32-bit" means in practice: the color itself is still 24-bit, and the extra 8 bits carry the alpha (mask/transparency) information rather than more color.
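And a rough illustration of how 24 bits of color plus 8 bits of alpha fill exactly 32 bits; the ARGB byte order here is just an assumption for the example:

```python
def pack_argb(alpha, red, green, blue):
    """Pack four 8-bit values into one 32-bit pixel (ARGB order assumed)."""
    return (alpha << 24) | (red << 16) | (green << 8) | blue

pixel = pack_argb(255, 200, 120, 40)
print(hex(pixel))            # 0xffc87828 -- fits in 32 bits
print(pixel.bit_length())    # 32: 24 bits of color + 8 bits of alpha
```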
If your machine can do 24/32-bit, your monitor setting will show it as Millions of Colors, while 16-bit shows up as Thousands of Colors.
Of course, the higher the setting, the slower your computer will run, but on a modern-day G3 (400 MHz+) the difference between millions and thousands of colors is something only the computer will notice, since it processes the colors in nanoseconds.
Trentent Tye
Please correct me if I'm wrong anywhere in this, so I may learn from my mistakes; I'm only attempting to regurgitate what I've come to understand over the years.
------------------
I never thought I'd go as far as I am right now.
Now that I'm here, I have to start over or else I've got nothing to do.