Sadly, the Finder segfaults when the total size of all desktops exceeds somewhere around ten million pixels.
I ran into this when I put a third display on my machine, all three running at 1600x1200; the Finder died almost immediately, and kept segfaulting on every further attempt to run it.
I played around with it for a while (it's amazing how usable the machine still is, even without the Finder) and determined that it really is the total screen real estate that trips it up: the Finder apparently tries to scale the desktop image to all the displays, and overruns a preallocated buffer somewhere.
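Obviously I can't see the Finder's source, so here's only a minimal sketch of the failure mode I'm imagining; the buffer size, the one-byte-per-pixel layout, and every name in it are my invention:

```c
#include <stdio.h>

/* Hypothetical reconstruction of the bug: a buffer sized at compile
 * time for what someone decided was "enough" pixels. */
#define MAX_PIXELS 10000000   /* invented figure, one byte per pixel */

static unsigned char desktop[MAX_PIXELS];

static void scale_pattern_to_desktop(unsigned long total_pixels)
{
    /* The real code presumably has no such check and just keeps
     * writing past the end of the buffer; this one only reports
     * where it would die. */
    if (total_pixels > MAX_PIXELS) {
        printf("overflow: %lu pixels won't fit in %d -- segfault here\n",
               total_pixels, MAX_PIXELS);
        return;
    }
    for (unsigned long i = 0; i < total_pixels; i++)
        desktop[i] = 0xAA;    /* stand-in for the scaled pattern */
}

int main(void)
{
    scale_pattern_to_desktop(8000000UL);   /* fits in the buffer  */
    scale_pattern_to_desktop(12000000UL);  /* crosses the limit   */
    return 0;
}
```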
This is mostly bothersome because it reveals some very bad coding practices:
Firstly, fixed, preallocated buffers are incredibly bad form. They waste space needlessly; they open endless doors to security problems; and, as evidenced by this situation, what some developer thinks at one point is "enough for anyone" frequently turns out not to be.
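And the fix is hardly rocket science. A sketch of the obvious alternative, with an API I'm making up on the spot: size the buffer from the displays actually attached, at runtime, and fail gracefully when the allocation is refused:

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical replacement: allocate for the configuration you
 * actually have, instead of guessing a maximum at compile time. */
static unsigned char *alloc_desktop(size_t width, size_t height,
                                    size_t bytes_per_pixel)
{
    /* refuse sizes whose product would overflow size_t */
    if (width && height > SIZE_MAX / width)
        return NULL;
    size_t pixels = width * height;
    if (bytes_per_pixel && pixels > SIZE_MAX / bytes_per_pixel)
        return NULL;
    return malloc(pixels * bytes_per_pixel);
}

int main(void)
{
    /* three side-by-side 1600x1200 displays, 32-bit pixels */
    unsigned char *buf = alloc_desktop(3 * 1600, 1200, 4);
    if (buf == NULL)
        return 1;   /* fail gracefully instead of segfaulting */
    free(buf);
    return 0;
}
```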
More generally, the Finder shouldn't be what's responsible for the desktop pattern. What if I don't want to run the Finder, or want to run some third-party Finder replacement? What about the philosophy of modularity, and separating functions into easily updateable, replaceable, configurable parts?
And, only tangentially related to this problem, I'll ask again: what's with this only-one-desktop-pattern-mirrored-on-every-display nonsense?
We all know that the Finder is the hackiest part of this system, clearly thrown together at the last minute. But this is truly alarming.