Iritscen
Hi guys. I have a really weird history in programming, where I am fairly skilled in some ways but blasted ignorant in others. I am still getting the hang of drawing in OS X. I've made some great progress in things like loading and displaying images with color masks, and yet some of the simpler things stymie me.
All I'm trying to do at the moment is get reliable color in OS X. Apparently I'm missing something. I'm using the function CGContextSetStrokeColor() (and related functions like CGContextSetFillColor()), passing it the current CGContext and a float array with four elements. Then I use something like CGContextFillRect() to draw onscreen.
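For reference, the relevant bit of my code looks roughly like this (typed from memory, so the context variable and the exact values are stand-ins, not the real thing):

    CGContextRef ctx = myContext;  /* whatever the current context is; name made up here */
    float color[4] = { 0.0f, 0.0f, 1.0f, 1.0f };  /* R, G, B, alpha */

    CGContextSetStrokeColor(ctx, color);
    CGContextSetFillColor(ctx, color);
    CGContextFillRect(ctx, CGRectMake(10.0f, 10.0f, 100.0f, 100.0f));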
According to the documentation I'm reading, colors in OS X are handled initially as floats from 0.0 to 1.0, with a value for R, G, B, and alpha, so black would be {1, 1, 1, 1}. Right?
I am getting very inconsistent results. Most sets of values that I'm using yield nothing, just invisible lines, despite the fact that the alpha value is always 1.0. When {1, 1, 1, 1} failed to yield black, I found through trial and error that {.25, .25, .25, 1} yields a somewhat acceptable gray.
{0, 0, 1, 1} should yield blue, but it's also invisible or gray. {.25, .25, 1, 1}, however, yields a mid-bright blue, exactly as you'd expect. What is going on? Am I in the wrong drawing mode or something? I'm not setting any drawing mode in particular, so presumably Quartz is using the default mode, which should be fine. Incidentally, I always use 1.0 for 1 and 0.0 for 0 in the actual code; I just didn't type out the decimal places in this post because I'm lazy.
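To put the specific values in one place, here is roughly what I've tried and what shows up on screen (again from memory, and the variable names are made up):

    float a[4] = { 1.0f,  1.0f,  1.0f,  1.0f };   /* expected black; nothing visible */
    float b[4] = { 0.25f, 0.25f, 0.25f, 1.0f };   /* gives a somewhat acceptable gray */
    float c[4] = { 0.0f,  0.0f,  1.0f,  1.0f };   /* expected blue; invisible or gray */
    float d[4] = { 0.25f, 0.25f, 1.0f,  1.0f };   /* gives a mid-bright blue, as expected */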
If no one understands the problem, I will bring in the actual code snippet to post tomorrow.