“Intel’s next generation laptop platform, codenamed Montevina, has a nice feature that remains quite unheralded: DisplayPort. Not only does it allow you to drive an external DP monitor, it uses it internally.” My take: So let me get this straight. We are finally leaving behind the days where TVs were TVs and computer monitors were computer monitors, entering a brave new world where a TV can be a computer monitor and vice versa, all thanks to DVI/HDMI – and now we’re getting DisplayPort on computers, recreating the wretched OR situation of yore? If I had any hair, I’d be pulling it out right now.
According to the Wikipedia article you linked to, it’s backward compatible with HDMI/DVI…
It’s also royalty free, which means if it takes off, TVs will have DP ports on them rather quickly.
What is a TV, if not a monitor with a radio tuner built in?
Put said tuner in a box, hook it to the monitor via some kind of connector, and the difference is what?
Well, one thing a TV is increasingly often is a computer running Linux ;-). Seriously though, you’re absolutely right.
I would agree that there are too many different types of cables, and that makes it difficult for average consumers to buy products, but I imagine that’s something hardware vendors will eventually address – and it’s completely separate from the original complaint.
To me, it sounds like a better standard than HDMI, especially if you can get a DP-to-HDMI converter. It’s just one more connection you’ll see increasingly often in monitors _and_ TVs.
The only problem I see is that none of these connectors can carry control signals of any kind. As in, why can’t we hook up X devices and use a single remote for them all?
Mostly display resolution (in terms of cheap TVs compared to cheap monitors).
Most of my TVs at home only support interlaced mode as well (which, quite frankly, is sh*t).
That’s true for now, but HDTVs are catching on in a big way. The price point is insanely low. It’ll be another couple of years, but 1080p will become standard, and it’s a higher resolution than most computer monitors.
What is a TV, if not a monitor with a radio tuner built in?
It is not. The TV viewer usually sits far from the screen, so TV panels mean large dimensions plus crappy colors and performance.
With PC monitors, the viewer sits near the display and spends a lot of time there, so a compact, high-quality, high-resolution panel is required.
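To put rough numbers on the density gap (a back-of-the-envelope sketch; the two panel sizes below are just hypothetical examples), pixel density follows directly from resolution and diagonal size:

```python
import math

def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from native resolution and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical example panels: a 42" 1080p TV vs. a 24" 1920x1200 monitor.
print(f'42" 1080p TV: {dpi(1920, 1080, 42):.0f} DPI')  # ~52 DPI
print(f'24" monitor:  {dpi(1920, 1200, 24):.0f} DPI')  # ~94 DPI
```

At roughly half the pixel density, even a good 1080p TV cannot render desktop-sized text as crisply as a monitor viewed from arm’s length.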
Interestingly, that just about sums up my complaint about HDTV.
Still, who sits very close to a 30″ monitor (like the ones Dell has for sale)?
If you sit at desktop distance you’re moving your head to pick up the corners, and you could just as well have gone with a couple of smaller ones side by side.
But then, I have this crazy-ass dream about displays standardized around ISO standard sizes (you know, A4 and friends) with a standardized pixel size.
Why the hell would you want standardized pixel sizes? I want them to keep increasing pixel density for as long as they can, at least until they hit around 600 DPI – not stick to some lame low-res standard that locks display technology in the past, unable to fix all the crap our ~100 DPI monitors force us to deal with, like having to choose between incorrectly placed glyphs and overly fuzzy glyphs.
Known dimensions, pure and simple…
When one has standardized sizes for both displays and pixel density, one does not get silliness like webpages designed for 17″ at some resolution or other (because that’s what the designer used)…
Hell, lately I have been running into pages designed to be viewed on widescreen displays, most likely designed by some iMac user…
One shouldn’t design webpages for a specific resolution or display at all, or expect the design to look exactly the way one sees it. HTML is crap for that, and the information is much more important than the design. Once you learn that and design the page accordingly, it’s no problem.
And no, we don’t want shitty monitors just so people can have an easier time designing web pages; that’s absurd.
It’s funny how once someone airs the word “standardized”, someone else automatically thinks it’s bad. What kind of meme is this?
Monitors report their physical dimensions (and therefore DPI) via EDID; the information you seek is already made available by the display.
The problem isn’t a lack of information or a lack of standardization, it’s the lack of software that does anything with that information.
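For the curious, here is a minimal sketch of reading that information yourself on Linux, assuming the kernel exposes the raw EDID block under sysfs (the connector name in the path is an assumption and varies per machine):

```python
import math
from pathlib import Path

# Path is an assumption; the connector name differs per machine (eDP-1, HDMI-A-1, ...).
edid = Path("/sys/class/drm/card0-eDP-1/edid").read_bytes()

# Bytes 21/22 of the base EDID block hold the maximum image size in centimetres.
width_cm, height_cm = edid[21], edid[22]

# The first detailed timing descriptor (offset 54) carries the preferred mode:
# active pixel counts are split across low bytes and upper nibbles.
d = edid[54:72]
width_px = d[2] | ((d[4] & 0xF0) << 4)
height_px = d[5] | ((d[7] & 0xF0) << 4)

dpi = math.hypot(width_px, height_px) / (math.hypot(width_cm, height_cm) / 2.54)
print(f"{width_px}x{height_px} on {width_cm}x{height_cm} cm -> {dpi:.0f} DPI")
```

So the display already tells the OS everything a DPI-aware toolkit would need; the software just has to bother using it.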
Yeah, it’s amazing how nothing has improved; with TFT panels things rather got much worse. I guess it mostly depends on how badly Windows works at higher DPIs. Too bad.
And that’s why almost everyone uses TN-panels! ;D
DisplayPort carries USB as well as audio, doesn’t it? IMHO it’s about time; it’s annoying having four cables. Video/audio/USB/power should be integrated into one. I’d like to see power delivered to the monitor via the PSU as well, but I guess that won’t happen – a pity, but possibly good design. Bring on less cable clutter, I say.
The problem is that USB alone does not help – that is, unless there is a control standard of some sort to go along with it, like Bluetooth has in the AVRCP profile.
And, if I remember correctly, USB is limited to a shorter cable length than what’s possible with the other ones (video/audio/power) – the USB 2.0 spec caps a single segment at 5 metres. Given the concept of, for example, using the “TV” as a “PC monitor”, short USB cable runs would force the user to have the PC next to the “TV”…
If you’re carrying power along the cable anyway, it doesn’t take too much thought to integrate a USB repeater into the cable.
There is a DDC/CI interface that media sources can use to control displays, but I’ve never used it, and I don’t think it works across digital interfaces.
The more advanced automation stuff should be controlled via UPnP.
Earlier it was, I don’t know why this switched.
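Going back to DDC/CI for a moment: it is just a small checksummed byte protocol riding on the display’s I²C channel, so a media source could in principle use it. Here is a hedged sketch of a “Set VCP Feature” write on Linux (the /dev/i2c-3 bus number is an assumption; VCP code 0x10 is brightness per the MCCS spec):

```python
import fcntl
import os

I2C_SLAVE = 0x0703  # ioctl request from linux/i2c-dev.h
DDC_ADDR = 0x37     # DDC/CI devices answer at 7-bit I2C address 0x37

def set_vcp(bus: str, vcp_code: int, value: int) -> None:
    """Send a DDC/CI 'Set VCP Feature' command to the display."""
    # Payload: source address, length, Set VCP opcode, feature, value hi/lo.
    msg = bytes([0x51, 0x84, 0x03, vcp_code, value >> 8, value & 0xFF])
    # The checksum XORs the display's write address (0x6E) with every payload byte.
    chk = 0x6E
    for b in msg:
        chk ^= b
    fd = os.open(bus, os.O_RDWR)
    try:
        fcntl.ioctl(fd, I2C_SLAVE, DDC_ADDR)
        os.write(fd, msg + bytes([chk]))
    finally:
        os.close(fd)

set_vcp("/dev/i2c-3", 0x10, 75)  # e.g. set brightness to 75
```

Tools like ddcutil wrap exactly this kind of transaction, so nobody has to hand-roll the checksum in practice.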
Not sure what the Inquirer article is talking about. They make it sound like driving an external display is a brave new feature, when it is not.
Laptops already have DVI plugs on the side for driving external displays. And the alternatives to DVI (including DisplayPort) deliver next to no benefit to consumers.
HDMI/DVI has a crap encoding scheme and is little more than a digital version of VGA, which necessitates that the display (which is LVDS internally) buffer each frame rather than letting the video card drive the display directly.
DisplayPort is a lot closer to that ideal, but what I really want is a display standard that can run over cheap UTP cables. Hell, I want everything to run over UTP.
And I want everything running on one single wire (between grounded devices) so everyone around can benefit from the HF generated by the data transmission. 🙂
If you’re as old as I am, you remember when personal computers used televisions as monitors all the time. “Monitor? what’s that?”
My C64 was hooked up to just a television; my Tandy 1000 was as well…
Ah, those were simple days.
Exactly. My first “real” computer was an Atari 400 (ahhh … the old bubble “keyboard”) … I had to fight my Dad for time to use the family T.V. in the living room. It was a black & white set, which made things interesting, to say the least, when trying to program colors — I got pretty good at detecting subtle shades of gray. The connection was through an R.F. connector — I had to screw the prongs onto the terminals of the external antenna at the back of the T.V. each time I wanted to take it out for a spin. Throw in the fact that I couldn’t afford any kind of mass storage at the time (the tape cassette recorder cost around $100, which wasn’t exactly spare change for me), and I got really, REALLY good at memorizing program code.
Personally, I think using DisplayPort to drive both the external and the internal panel is great: ever tried to replace or reuse the monitor of a laptop, only to discover it uses custom connections and chips soldered onto the laptop motherboard, so you can’t just swap in any other panel, or use it on another PC?
Finally, a standard for laptop panels too!
What? The TV has always been the monitor, either through an RF modulator or that Amiga video-to-SCART adapter for superior picture quality.
But yeah, if it doesn’t add anything more than an “OR”, I get your point.
“entering a brave new world where a TV can be a computer monitor and vice versa,”
You’re apparently too young to be an editor.
TV == monitor was how things were in the eighties.
And in 30000 BC, cave walls == monitor. Irrelevant. Your remark doesn’t nullify my complaint.
My first computer experience was on an MSX, so I think I know what you’re talking about. I’ve been there.
“And in 30000 BC, cave walls == monitor.”
Cavemen didn’t plug PCs into cave walls, now did they…
DisplayPort is intended to be used for TVs, too. It supports 1080p with 15-metre cables (more for fiber-optics cables). It’s being seen in computers and monitors first, but that was the same with DVI/HDMI, too. Yes, many of the DisplayPort benefits are only really useful to PCs and monitors (I don’t expect TV resolutions to jump beyond 1080p any time soon – there’s no point to even having 1080p for most consumers), but those benefits are stuff we actually can use – HDMI is never going to be able to drive high-DPI monitors, for example, and I’m personally looking forward to getting my hands on something closer to 200 DPI than 100.
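The bandwidth arithmetic backs that up. As a rough sketch (ignoring blanking and link overhead; the comparison figures are the usual nominal effective rates), consider a ~200 DPI panel like the IBM T221, whose 3840×2400 mode famously needed multiple DVI links:

```python
# Back-of-the-envelope pixel-data rate for a ~200 DPI panel
# (3840x2400 is the IBM T221's native mode), ignoring blanking.
width, height, hz, bpp = 3840, 2400, 60, 24
need_gbps = width * height * hz * bpp / 1e9
print(f"{width}x{height} @ {hz} Hz needs ~{need_gbps:.1f} Gbit/s of pixel data")

# Nominal effective video bandwidth of the common links of the era:
links = {
    "single-link DVI": 3.96,
    "dual-link DVI": 7.92,
    "HDMI 1.3": 8.16,
    "DisplayPort 1.1": 8.64,
}
for name, gbps in links.items():
    print(f"{name:>16}: {gbps:5.2f} Gbit/s -> {'enough' if gbps >= need_gbps else 'too slow'}")
```

None of the single links of the era could drive such a panel at 60 Hz, which is exactly why bandwidth headroom, not yet another connector shape, is what high-DPI desktops are waiting on.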
DisplayPort has built-in DRM, right?
Anything like this will fail to replace DVI/VGA so long as it contains a built-in DRM mechanism.
Royalty-free or not, it’s still DRM, and that means it’s total crap.