Apple is on the verge of announcing a new, totally redesigned family of three flat-panel displays, with the addition of a 30-inch HD-ready model (2560×1600 resolution), claims ThinkSecret. In what appears to be a major change of direction for Apple, the displays will exclusively use a Digital Visual Interface (DVI) connector and will not come with Apple’s proprietary Apple Display Connector (ADC).

Our Take: On the second mockup, I totally love how the buttons and inputs are laid out. On my 21″ Sony E540 I literally have to twist my arm into an unnatural and uncomfortable position to reach its controls. This mockup of the Apple displays – putting the controls on the side – feels so natural for the arm and hand to easily reach those controls. Ingenious simplicity.
And the FireWire and USB ports are in a nice position too, hiding the ugly connectors on the back of the monitor while not being too difficult to reach.
/newpants
I just hope one day I can Google this and laugh my 640K off at it. This one’s big, but I want higher density to become more mainstream.
Wow, now I only hope that they’ll start making graphics cards compatible/interchangeable with the PC ones, so Macs will be able to get much cheaper!
Oh yeah, Apple, and please make a headless Mac with DVI for less than $1000.
They’re far more conservative and measured than most rumor sites, so I think this is likely true. They’re no macosrumors – which, by the way, what happened to them? They’ve been offline for a long time now.
Excellent news for both Mac and Windows users – before, you’d need a third-party adapter to use an Apple display on a PC.
Thank God, no more ADC…
macosrumors has some kind of DNS problem but you can still get there (but do you really want to?)
http://199.105.116.92/
“Excellent news for both Mac and Windows users – before, you’d need a third-party adapter to use an Apple display on a PC”
No, you had to buy Apple’s DVI-to-ADC adapter/power adapter.
If they go over to DVI, they will basically just include the adapter.
Hmm – iPod, iTunes, Airport Express, flat panel displays… It seems the iPod has made Apple realize there’s a lot of revenue to go after in the “PC” world as well (which is good of course; the Mac needs more standards support for its HW components). However, with a lot of the company revenue coming from non-Mac products, I wonder what kind of impact that would have on our beloved computer… or do they (Apple, and Steve Jobs in particular) still believe that these products are a way to increase visibility for their HW and MacOS X = increased sales?
BR//Karl -> qwillk
As an owner of a 23″ cinema display, I must admit, these things are bad ass.
The only problem was finding a reasonably priced video card to drive the thing.
I ended up buying a GeForce4 Ti 4200 128MB with ADC+DVI ports for about four times the price of an equivalent GeForce4 Ti 4800 SE 128MB PC card.
Anyone know much about the flashing process or the instruction set architecture differences? I remember xlr8yourmac (http://www.xlr8yourmac.com/) had some flashing techniques for PC cards that would let you plug them into your Mac, but they seem to have been strongly encouraged by those companies to stop.
If the 23″ costs only $1000, then I’m going to get two instead of going 30″!
“I ended up buying a GeForce4 Ti 4200 128MB with ADC+DVI ports for about four times the price of an equivalent GeForce4 Ti 4800 SE 128MB PC card.”
That just says it all…
The dumb-as-nails ADC connector is gone. It is a dead-end design that sacrifices performance and compatibility for Apple vendor lock-in.
Hurrah, the darn thing is finally gone! Apple got a clue.
Maybe this bodes well for the new(er) G5, though rumor has it that Jobs is going to eat crow over missing his 3GHz target.
being that power was drawn from the CPU via the ADC cable
Not from the CPU, it’s from the Main Board. They’re in the computer industry and they still don’t know the proper tech speak.
Too bad about ADC going away; something like it is needed on the Windows side as well. Why do good ideas always get cut down? I remember plugging my monitor’s power cord into the case’s power supply. I still can’t find a power supply that can do that. ADC should be an industry standard for everyone (Mac AND Windows).
Just checked the Apple store.
The new 2.5GHz G5 is now available. They have 21 days to bring out the 3GHz model.
I want four or five 23-inch displays in my house, but I don’t want to buy four or five computers. Did Steve think ahead enough to let me just plug these various new displays into the Gb LAN, so I can attach them to a single host as X terminals?
No? Then I won’t even bother to hope for integral wireless network display connectivity, or for minimal upgradeable firmware to host X server software and a GPU in the display, because I realise I’m already pushing it.
I do realise that it is probably too much to ask of someone (SJ) who envisions himself as having helped invent the personal computer to also help uninvent it, by smashing the myth that every single display must be driven by a dedicated personal computer attached directly to it.
Looks like Steve’s 3GHz promise was a little bit too optimistic.
I’m guessing IBM told Apple, “Sure, we’ll come up with 3GHz in a year, no problem!”, and Steve ran with it. Or perhaps, IBM said, “Sure, we’ll come up with 3.6GHz in a year!”, and Steve cut the number down to be safe. Let it be a lesson, never promise to deliver anything that you rely on someone else for.
“Never promise to deliver anything that you rely on someone else for” – ain’t that the truth. Still, I was going to wait until January to upgrade anyhow.
http://www.apple.com/powermac/design.html
Just looked at the new G5 and it’s water cooled. Pretty cool design!
Just had a look at the specs of the new G5, and they still sport one DVI and one ADC connector. This may or may not indicate that ADC won’t be phased out right away, but I can hear some geeks crying for want of a dual-headed, DVI-only dual G5…
Does DVI only work up to a certain resolution, like 1920×1200? In a computer lab at school we had some IBM T221 monitors, and they needed two DVI inputs simultaneously to reach 3840×2400, or so I thought.
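From what I can tell, the answer is roughly yes: a single DVI link tops out at a 165MHz pixel clock, which with reduced blanking is just enough for 1920×1200 at 60Hz. 2560×1600 needs dual-link DVI, and 3840×2400 at a full 60Hz exceeds even that, which would explain the T221 spreading the picture across multiple inputs (and running at lower refresh rates). A rough back-of-the-envelope sketch in Python, assuming ~10% blanking overhead rather than exact CVT timing numbers:

    # Rough pixel-clock estimate vs. DVI link limits.
    # Assumes ~10% blanking overhead (reduced blanking); real CVT timings differ.
    SINGLE_LINK_MHZ = 165  # max TMDS pixel clock per DVI link

    def pixel_clock_mhz(width, height, refresh_hz, overhead=1.10):
        """Approximate pixel clock in MHz, inflated by blanking overhead."""
        return width * height * refresh_hz * overhead / 1e6

    for w, h in [(1920, 1200), (2560, 1600), (3840, 2400)]:
        clk = pixel_clock_mhz(w, h, 60)
        if clk <= SINGLE_LINK_MHZ:
            verdict = "single-link DVI is enough"
        elif clk <= 2 * SINGLE_LINK_MHZ:
            verdict = "needs dual-link DVI"
        else:
            verdict = "needs multiple connections or a lower refresh rate"
        print(f"{w}x{h} @ 60Hz: ~{clk:.0f} MHz -> {verdict}")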
I don’t think Steve was being too optimistic. I could be wrong, though. He said 3GHz this summer, and there are still 2.5 months of summer left. Honestly, I think it’s a bit silly to expect a jump from 2GHz straight to 3GHz. 2.5GHz is a great speed bump, not to mention a more than fair speed to launch a new processor on a heavily revised product line. If all goes well with the new G5, I wouldn’t be surprised if we see a dual 3GHz version pop up by summer’s end.
Yeah, 2.5 is a good speed bump, even though it came out this summer. From what I read it was a big challenge to get to 3GHz, but hey, it’s better than Moto.
All in all, it was a good challenge for Apple to “try” to get to 3GHz.
But why blame Steve? He was merely relaying what IBM had promised him. IBM promised Apple 3GHz CPUs by summer; Steve simply relayed that promise to the Mac fans. If the people here want someone to blame, blame IBM for failing to deliver 3GHz parts in volume.
Firstly, I don’t really care that they didn’t make 3GHz. It’s no big deal; however, passing the buck seems to be so typical of Mac users.
In the past it was “blame Motorola” and now it’s “blame IBM”. Christ, it’s Apple promoting and selling the products, and it’s Apple who should be accountable when things don’t work out as advertised.
With that said, I still consider it no big deal. I doubt it’s the end of the world, and 2.5GHz is a very respectable speed to have reached since the launch of the G5.
“Just looked at the new G5 and it’s water cooled.”
I’m guessing that’s the only way they were able to get to 2.5GHz. They just took two of their 2GHz CPUs and overclocked the #$%! out of them, necessitating a liquid cooling solution.
Getting back on subject, the thought of a 30″ Apple display is truly drool-worthy, even though I know I’ll never be able to afford one. BUT, maybe it will allow me to purchase a smaller previous-generation display at a great price!
It is kind of sad about the death of ADC, if true. The idea of consolidating all the necessary connections into one cable was great. It won’t be nearly as elegant now that you’ll be required to run power, video, USB and FireWire cables to your monitor.
OK, so they didn’t reach 3GHz. Big deal; PowerPC processors do a lot more work per cycle than Intel processors, so they still blow away anything out there. I know a lot of PC users look blindly at the hertz, but that’s not an indicator of actual processing power. RISC processors and AltiVec are really sweet architectures.
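To put toy numbers on the hertz point: effective throughput is roughly work-per-cycle (IPC) times clock speed, so a slower-clocked chip with a higher IPC can come out ahead. A sketch with invented IPC figures, purely for illustration and not measured benchmarks:

    # Toy model: relative throughput = instructions per cycle (IPC) x clock (GHz).
    # The IPC values below are invented for illustration, not real measurements.
    chips = {
        "hypothetical 3.6GHz x86 chip": (3.6, 1.0),
        "hypothetical 2.5GHz PowerPC chip": (2.5, 1.5),
    }

    for name, (ghz, ipc) in chips.items():
        print(f"{name}: relative throughput {ghz * ipc:.2f}")

By this made-up accounting the 2.5GHz part edges out the 3.6GHz one; real results, of course, depend entirely on the workload.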
Sadly, I bought an AMD64 instead of an Apple – so yes, I’m drooling over the new boxes from my Linux machine. Someday, Apple, someday.
“being that power was drawn from the CPU via the ADC cable”
“Not from the CPU, it’s from the Main Board. They’re in the computer industry and they still don’t know the proper tech speak.”
The CPU (Central Processing Unit) has two equally valid meanings: the processor, and the box as a whole. Just because the second meaning has become less popular lately does not make it any less “proper tech speak”. You are correct, though, that the terms processor and motherboard/mainboard are more precise.
“Too bad about ADC going away; something like it is needed on the Windows side as well. Why do good ideas always get cut down? I remember plugging my monitor’s power cord into the case’s power supply. I still can’t find a power supply that can do that. ADC should be an industry standard for everyone (Mac AND Windows).”
That is a scary thought. I wonder how much more we can boost the power requirements before either the box burns a hole through the floor, or the entire system has to be factory-sealed as a safety precaution. There are sound reasons for spreading out the load a little.
That said, yeah, it would be cool to dump even more of the cables, so from that perspective this seems like a step back. Personally, I think DVI is worth the extra cabling in the long run, though. As technology evolves, maybe the single cable for power and data will be revisited and solved with a more elegant and scalable solution (FireWire-driven monitors, anyone?)
Those buttons are useless for me (and the rest of the 10% of people who are left-handed). If anyone designs a monitor like this, they’d better duplicate the buttons on the left-hand side.
“FireWire-driven monitors, anyone?”
Wireless FireWire was also recently approved. Maybe send the signal wirelessly and just have a power cable for the display. A wireless display, with the keyboard and mouse already wireless, could end up being very cool.
Hey, I’m a leftie, but unlike most I actually use my right hand for a great many things, including my mouse. Why is it that almost all left-handed people have a completely useless right hand? You live in a right-handed world, learn to use it and stop demanding that every product on the planet be made ambidextrous. Especially monitor controls! I mean seriously, how often do you need to use them anyway? Most times, you just set it up once and leave it alone.
Sorry for the rant, but this just happens to be one of my pet peeves. I, for one, don’t mind if Apple puts the controls for its new monitors on the right, left, bottom or even the top(!), as long as they are intuitive and easy to use (which Apple products usually are).
“I’m guessing that’s the only way they were able to get to 2.5GHz. They just took two of their 2GHz CPUs and overclocked the #$%! out of them, necessitating a liquid cooling solution.”
If you remember, the new chips are supposed to be the 90nm chips. IBM was supposed to be having the same trouble with heat as Intel at 90nm. Rather than using a HSF the size of a Buick that sounds like approaching aircraft (like Prescott systems use), Apple decided that if they were going to keep sound down, they’d need something different… like liquid cooling.
I hope they do get the 20″ displays under $1000. I’ll certainly get one at that price.
I’m a right-hander, and my Dell monitor has controls on the front-left. I amazingly manage to use my left hand to adjust the screen.
Did you guys notice the actual controls? They aren’t like the manual controls on older monitors. It looked like three buttons: brighter, darker, and the same prefs button as on the current displays, which opens System Prefs to the Displays pane so you can use your mouse. Left- or right-handed, hitting one button (not a wheel or knob, not even an actual button, but a touch-sensitive one like the iPods use) isn’t a problem. If you can’t do that, how do you type?
Did anyone notice that the new dual 2.5GHz is still on a Radeon 9600 XT? Not much of a “power” machine for that price.
I actually liked the ADC connector. It clips right on, with no power lead to the monitor – it makes for a tidy desktop. The only thing I use my cinema monitor with is an ADC-ready G5.
But bully for Apple heading out into the general consumer market with monitors even PC users can love. In ten years, we’ll probably think of Apple as an entertainment brand.
A 17″ budget monitor that looks like the big boys would be a great match for the headless iMac I fear Apple will never build.
There is an option… but still it should be standard.
This is probably one of the best things Apple could have done. It should open the Mac up to more cutting-edge video cards, as ATI/NVIDIA will not have to dedicate separate engineering efforts to integrating ADC into their cards.
Bert
“If you remember, the new chips are supposed to be the 90nm chips. IBM was supposed to be having the same trouble with heat as Intel at 90nm.”
True, true. You’re right; I implied too much when I said that they used two of their existing 2GHz CPUs. I don’t know if they are or not. They could very well be the new 90nm chips. But my point still stands. I’m guessing that they are having problems reaching 2.5GHz, even at 90nm, without some serious cooling (just like Intel, AMD and everyone else making chips at 90nm). And liquid cooling is just as viable an option as any other, so kudos to Apple for using a quiet solution instead of just throwing more air at the processor.
As an aside, I don’t think I would call Apple’s setup “liquid cooling”. It looks more to me like a heat pipe, as there is no pump or reservoir typical in a liquid cooling system. The cooling liquid appears to be moved through the system by way of currents generated by the heating and cooling of the liquid. Granted, the system does use a liquid to cool the processors, but semantically, I would call it a heat pipe. But then again, maybe I’m just being picky.
“Rather than using a HSF the size of a Buick that sounds like approaching aircraft…”
Hehe, I LOL’d at that. You should see the size of the heatsink on my Athlon XP.
@Jeff
“Did anyone notice that the new dual 2.5GHz is still on a Radeon 9600 XT? Not much of a ‘power’ machine for that price.”
In case you didn’t notice, the Radeon 9800 XT 256MB is also available as an option.
“Granted, the system does use a liquid to cool the processors, but semantically, I would call it a heat pipe. But then again, maybe I’m just being picky.”
I call it a radiator. Anyhow, they still have about two and a half months to announce 3GHz G5s, keeping in mind they said “summer” and not “next WWDC”, and keeping in mind how fast the 1.42GHz G4s came out after the 1.25s.
Haha, fair enough, I’ll quieten down then. Yeah, I can use my right hand, of course; I’d just rather that designers took account of both points of reference rather than making life harder for one group. It’s not that monitor controls themselves are a big problem, it’s all of the little problems heaped on top of each other.
I think you’re right. It is more like a heat pipe.
Monster HSFs are okay as long as they are running on “low”. My Opteron has a MONSTER HSF on it, but the fan barely turns, so it’s really quiet. However, the new Prescotts start at medium and quickly move to high speed under load. You have to put the things in a closet to muffle the noise.
As far as Apple video cards go, remember that they have to make Mac specific cards, so they are going to come out several months later than the PC cards. Since they are custom and sold in low volume, you will pay more for them.
I’d like the person who reported my post for abuse to step forward like a man and take responsibility for such a foolish thing. When the world is beginning to see network-driven wireless displays all over the place, is it too much to ask Steve Jobs’s newest, most powerful computer, already running X, to serve up remote desktops to several Apple displays at once? I mean, really, 20-year-old X software will do it, without any help from Steve except for the insightful placement in the bezel of a single-chip wireless broadband solution and an ATI Mobility Radeon 9700, for instance. Whatsamatter? Do you feel dirty at the idea that a whole bunch of displays on a LAN should be able to do just fine without a Macintosh attached to each and every one of them?
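To be fair, the plumbing he’s describing really is that old: any X client can draw on a remote X server just by pointing DISPLAY at it, assuming the display host allows the connection (via xhost or xauth). A minimal Python sketch, where “displayhost” is a hypothetical remote display on the LAN and xclock stands in for any X client:

    # Minimal sketch: run an X client against a remote X display.
    # "displayhost" is a hypothetical machine running an X server that has
    # granted us access (e.g. via xhost/xauth); xclock is any X client.
    import os
    import subprocess

    env = dict(os.environ, DISPLAY="displayhost:0")  # remote X server, screen 0
    subprocess.run(["xclock"], env=env)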
“Sorry for the rant, but this just happens to be one of my pet peeves. ”
So your “rant” also holds for blind, deaf and handicapped people, who are in the minority? It is, after all, a sighted, hearing, and fully capable world.
And that’s one of MY pet peeves.