Tiger on the Mini

Captain Code

Has anyone run it on the Mac mini yet? Does it run Core Image at all? Not that I have a mini or am going to buy one, but maybe we could finally put to rest the question of whether Core Image will work well on the mini.
 
The mini has the same graphics card as the current iBooks. That is a clear "no" on the CoreVideo/CoreImage things AFAIK. On my iBook, for example, you don't see anything fancy when enabling a Dashboard widget.
 
So, Apple is selling new machines like hotcakes that aren't going to work properly with the OS it's releasing in the next few months?

Don't think so.
 
I've just spoken to Apple UK/Eire about this; I had to hold for a long time while they spoke to the Tiger team in the US. They say Apple is not going to speculate on the capabilities of an OS before it's even slated for release, that any copies in circulation are unfinished works, and that nothing should be read into what will be in the final products.

CoreImage/Video appear to be scalable to the hardware they're running on. This means that people like myself and my partner, who are waiting for ours, are not going to end up with lame turkeys in a few months, along with all the eMac and PowerBook users. That would destroy them.

I have to say, though, their attitude to answering this question was DISGUSTING... I almost cancelled our orders on principle.
 
While CoreImage may be scalable, the whizz-bang effects (ripple effect when launching a Dashboard widget etc.) seem more tied to Quartz 2D Extreme which requires a video card capable of pixel-level programming. The mini does not currently ship with such a card...
 
Personally I think Apple might have stuffed up. I could be wrong, but this is what I think. Obviously Apple is way ahead of MS and Windows, but why are they creating an OS that is far more advanced than the actual hardware? There are four models at the moment that are G4s; they have just updated their PowerBooks to a new line, as well as the iBooks and the new mini. If Tiger will not run on G4s with all the eye-candy features that will appear on G5s, why should people even bother buying Tiger if they have a G4?

There are a lot of great features appearing in Tiger, and I am really looking forward to it. Really, my question is: if Apple knows Tiger won't run well on G4s, why have they brought out new G4s that can run Tiger, but not as well? It's not like when they bring Tiger out they will bring out a whole new range of G5s... personally this doesn't make sense to me.
 
Tiger runs great on G4s. To get all you can out of it you need a good video card. I have an upgraded G4 and finally upgraded the video card from a Rage 128 Pro to a Radeon 9800. Yes, three weeks ago that is still what I had, and Panther ran pretty well on it.
 
If the sole enhancement of Tiger over Panther was the eye candy, I could see some legitimacy in griping about how well it runs on currently-shipping models... but Tiger's eye candy enhancements are only the tip of the iceberg concerning new features -- and, as stated earlier in the thread, CoreImage is purported to scale well.

And it goes without saying that some machines, even currently-shipping ones, will not run Tiger as well as other machines. It's the same with any piece of software. I can't expect Final Cut Pro to run as well on an iBook as on a G5. That goes for the OS as well: the higher-end machines will run the software faster/better than lower-end machines. It's even the same on the "other" platform: buy a new Dell with Windows XP, and if you get a Celeron or lower-end processor, it's gonna be slower than getting a better, faster processor like a P4.
 
Yeah, "not working properly" is certainly not the right term for Tiger on a Mac mini or iBook G4. It works GREAT on my iBook, but it doesn't have the ripple effect. When Jaguar (10.2) came out, it had a shadow behind the mouse cursor (Quartz Extreme) and that didn't work on my PowerBook G4/500. Didn't mean Jaguar wouldn't work properly, did it? ;)
If you want Tiger to "work properly" on your Mac, get a Dual 2.5 GHz G5 with a graphics card that's going to be released AFTER Tiger.
 
And you have to keep in mind that Apple won't release OSs as often in the future. That means they are going to aim at higher-end G4 and G5 processors, since most people will have them in a year or two. For the first time since OS 9, the hardware will probably "evolve" faster than the OS, and they will want to release an OS that looks modern for more than two years.
 
Quartz Extreme effects in Jaguar do not work on G3s... nevertheless, Jaguar is a very good OS for my old B&W G3.
 
Decado said:
And you have to keep in mind that Apple won't release OSs as often in the future. That means they are going to aim at higher-end G4 and G5 processors, since most people will have them in a year or two. For the first time since OS 9, the hardware will probably "evolve" faster than the OS, and they will want to release an OS that looks modern for more than two years.

True: Apple has stated that they intend to slow down the release of new OSs -- but it has also been said that Moore's Law is becoming inaccurate, meaning that "speed doubles every 18 months" will probably no longer hold.

I have to disagree with your statement that hardware will be evolving faster -- look at the Macintosh over the last 5 or 6 years: we went from a bondi-blue iMac at 233 MHz to a G4 running at 1 GHz in a couple of years. That's more than a fourfold increase in clock speed. Since then, we've only hit 2.5 GHz -- a mere 2.5x. It's been said that the way processors are manufactured now is approaching a brick wall and that the speed increases we've been experiencing over the last few years won't hold their current rate.

That's not to say that other areas of performance won't increase dramatically, like the GPU or FSB or some new-fangled peripheral connectivity (like ultra-FireWire!), but raw speed is most definitely slowing down. Is Apple slowing the release of their OS because of this, or is it coincidence? I dunno...
 
Why would you see something like a hardware-development slowdown as the reason for Apple taking a slower road with OS X development? Doesn't make any sense. Also: I couldn't care less about Moore's Law. Computers are still going to get faster (and that's _not_ just about frequency, and not just about the CPU). The "brick wall": I don't believe in it. There have been such 'brick walls' before, and the industry has always found a way around them, or it just went in a different direction altogether.
 
ElDiabloConCaca said:
Is Apple slowing the release of their OS because of this, or is it coincidence? I dunno...

Fryke: I don't see where I said that Apple is slowing OS release because of the slowdown of hardware advances. I simply posed a question: are these two things related or not? Then I specifically said "I dunno..." at the end. I neither speculated nor proposed that the two were related.

Computers will, no doubt, get faster. Will they get faster at the same rate they have in the past? Well, the evidence suggests not. But that's not the end-all, be-all rule -- they may or may not keep getting faster as quickly.

I simply said that I don't think hardware advances are progressing along like they used to in the days of OS 9. Whether this has anything to do with Apple's intentions to slow down OS releases is anyone's guess, and my guess is that they don't have anything to do with each other.
 
Just to make it clear, my point is this:
In the days of OS 7.5 to OS 9, hardware "evolved" from about 33 MHz to 1 GHz. I think anyone can agree that the difference in hardware -- the introduction of PowerPC, the G3 and the G4 -- was more impressive than the difference between OS 7.5 and OS 9.

In the days of OS X, hardware has "evolved" from about a 1 GHz G4 to a 2.5 GHz G5, but in the meantime there have been OS X 10.0, 10.1, 10.2 and 10.3.

Therefore one could argue that the OS has made greater strides than the hardware (though 64-bit is impressive) since the introduction of OS X, compared to the OS 7.5-OS 9 era. But now that they are slowing down OS releases, they make Tiger as cool as they can on top-notch hardware (much like Doom 3, for you gamers) so it will last longer.
I.e., if you buy a Mac mini now it will be fine, and in two years when you buy a new Mac, another level of Tiger eye candy will reveal itself, and thus the OS will feel fresh.
 
OK, based on how people are reacting, it sounds to me like I must have an incorrect understanding of CoreImage (or those people do).

To my understanding, CoreImage is to be a software layer that "knows" the instruction sets of a number of video cards and can offload common operations directly to the GPU in the most efficient way, saving the CPU from having to run the algorithms itself before handing the results to the GPU (Gaussian blur, for example). To me, this means that at the very minimum, CoreImage will require a card whose instruction set it knows contains a particular function, and how to call that function from its internal algorithms.

If the card is NOT supported, CoreImage would have to use the CPU to perform the same calculations, providing no speed increase over current solutions, but no slowdown either. What's more, it should still provide ALL the same flashy features you get at the high end, just less efficiently on slower hardware.

It would, of course, adapt to hardware changes so you always are getting the best performance out of YOUR computer. Software updates would include the instruction sets for more video cards and thus keep it up to date.
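That understanding can be sketched roughly like this (all names here are hypothetical illustrations, not actual CoreImage API): a capability table decides whether a filter request goes to the GPU or falls back to a CPU implementation that produces the same result.

```python
# Hypothetical sketch of the dispatch model described above: an operation
# is handed to the GPU when the card advertises the needed capability,
# and falls back to a CPU implementation otherwise. None of these names
# are real CoreImage API.

SUPPORTED_GPU_OPS = {
    "radeon9800": {"gaussian_blur", "ripple"},
    "rage128pro": set(),  # older card: no programmable pixel pipeline
}

def gaussian_blur_cpu(pixels, radius):
    """Naive CPU fallback: average a sliding window (a stand-in for the real math)."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def apply_filter(card, op, pixels, radius=1):
    if op in SUPPORTED_GPU_OPS.get(card, set()):
        # In the real system this would hand the op straight to the GPU.
        return ("gpu", pixels)
    # Unsupported card: same visual result, computed on the CPU instead.
    return ("cpu", gaussian_blur_cpu(pixels, radius))
```

The point of the sketch is that the caller never changes: only the dispatch path (and the speed) differs between supported and unsupported cards.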


Is my understanding WAY off?
 
fryke said:
The "brick wall": I don't believe in it. There had been such 'brick walls' before, and the industry has always found a way around things or they'd just go into a different direction altogether.

The brick wall being talked about by enthusiast sites and industry analysts has to do with CPU frequency. Based on the popular reading of Moore's Law, speeds would roughly double every 18 months. For many years, this held. Lately, it hasn't. Take the Pentium 4, for example: the 3.06 GHz model was released in 2002. It's 2005 now and they're still barely reaching 4 GHz -- an increase of roughly 30% in three years. What happened to the 4x-faster CPUs? Didn't happen.

This is caused by the CPU manufacturing process. Nothing can really be done about it; with current technologies, CPUs don't look like they'll scale much past 4.0 GHz. Hence all the interest in dual-core/multi-core technologies as a way to improve performance.
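For what it's worth, the arithmetic behind those numbers is easy to check: actual Pentium 4 clock growth from 2002 to 2005 versus what the "doubles every 18 months" extrapolation would have predicted.

```python
# Sanity-check the Pentium 4 numbers: 3.06 GHz (2002) to ~4 GHz (2005),
# versus a naive "doubles every 18 months" extrapolation.
start_ghz, end_ghz, years = 3.06, 4.0, 3

actual_pct = (end_ghz / start_ghz - 1) * 100        # actual growth, percent
predicted_ghz = start_ghz * 2 ** (years * 12 / 18)  # 18-month doubling

print(f"actual increase: {actual_pct:.0f}%")            # ~31%
print(f"extrapolated clock: {predicted_ghz:.1f} GHz")   # ~12.2 GHz
```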
 
"The observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future. In subsequent years, the pace slowed down a bit, but data density has doubled approximately every 18 months, and this is the current definition of Moore's Law, which Moore himself has blessed."

Moore's Law is simply not directly connected to the frequency of a processor. And the "brick wall" they're talking about isn't, either. From what I've read over the past few years, it's rather about how they're finding it more and more difficult to shrink the structures further. 90nm posed more of a problem than anticipated. Still: here we are, using 90nm processors. The Cell processor is on the horizon, too. I don't think it'll hold all the promises made by enthusiasts around the world, but the hardware (although not necessarily expressible in "MHz" or "GHz") is still delivering more and more performance for the money.

And about CoreImage/CoreVideo: I thought those effects were to be 'live' - or not there at all. I.e.: You either have a graphics card capable of doing it, or you don't. I don't think this'll be supported on the current iBooks or the Mac minis.
 
Well that's really going to screw with things.

Developers won't want to write software around a feature that not all of their users can take advantage of.

Who would write CoreImage software if they can't be assured that (at some speed) it would work on all Tiger machines?
 