I'd like to gather your opinions on the concept of high-pixel-density displays in ordinary computers. Professionals often have external displays to work with, and screens with enormous resolutions are primarily useful for multimedia work. For example, I think comic artists could benefit from resolutions this high, and having that available without being tethered to an external display would be nice. However, I don't feel the world is ready for such a device. While text and the like will be greatly improved, the majority of graphics are not ready for such displays. Websites aren't currently taking advantage of the high resolutions in Apple's current-generation iOS devices, and every application will require graphical reworking to take advantage of these screens.

Personally, I don't feel that we're prepared to take full advantage of the technology, and as such, it's currently useful primarily to multimedia professionals. I think Apple should have waited, giving developers and website administrators time to take advantage of high resolutions, during which an even higher pixel density could have been achieved. What are your thoughts?

P.S. If the trend of doubling the resolution arrives for the 13" MacBook Pros, they'll have a higher pixel density than the 15" models, though the difference will be barely significant.
I agree that higher-density displays aren't all they're cracked up to be. For example, look at games for iOS. If a game wants to take advantage of a Retina display, the textures used in the game must be recreated at much larger sizes. That increases memory usage significantly (plus money spent on data, since the apps themselves will be larger downloads).
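To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The texture sizes are hypothetical, not from any particular game, and real games often use compressed formats, but the scaling factor is the same:

```python
# Back-of-the-envelope texture memory cost of doubling resolution.
# Assumes uncompressed RGBA8 textures (4 bytes per pixel) -- an
# illustrative simplification, not any specific game's asset format.

BYTES_PER_PIXEL = 4  # RGBA, 8 bits per channel

def texture_bytes(width, height):
    return width * height * BYTES_PER_PIXEL

standard = texture_bytes(1024, 1024)   # non-Retina asset
retina   = texture_bytes(2048, 2048)   # same asset at 2x in each dimension

print(f"1024x1024: {standard / 2**20:.0f} MiB")  # 4 MiB
print(f"2048x2048: {retina / 2**20:.0f} MiB")    # 16 MiB -- 4x the memory
```

Doubling the resolution in each dimension quadruples the pixel count, so every 2x asset costs roughly four times the memory and storage of the original.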

Yes, websites do have issues on larger displays. I just transitioned from a small netbook display and a decent-sized laptop screen to a large 1080p LCD monitor. A lot of stuff is made for smaller screens, and it's easily visible.

Multimedia, yes, I agree with you; it does give quite a big advantage to multimedia quality.

Isn't there a maximum level of detail that human eyes can resolve, and aren't these displays getting close to passing it? If that's true (wherever I heard or read it), then ever-higher pixel densities will be wasteful, since you cannot perceive pixels that fine. This makes high density more beneficial for mobile platforms and more wasteful in desktop/laptop use, due to the distance from the display. (This isn't to say that current monitors are bad; I'm just saying that future monitor resolutions, like 3840×2160 (currently supported by DisplayPort), will get too wasteful for up-close viewing.)
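That intuition can be put into rough numbers. A quick sketch, assuming the common rule of thumb that 20/20 vision resolves about one arcminute (the DPI figures are approximate):

```python
import math

# Distance beyond which a display's pixels become indistinguishable,
# assuming an eye that resolves ~1 arcminute (the common 20/20 figure).

ARCMINUTE = math.radians(1 / 60)

def retina_distance_inches(dpi):
    pixel_pitch = 1 / dpi              # inches between pixel centers
    return pixel_pitch / math.tan(ARCMINUTE)

for name, dpi in [("15\" Retina MBP (220 dpi)", 220),
                  ("24\" 3840x2160 (~184 dpi)", 184),
                  ("27\" 1920x1080 (~82 dpi)", 82)]:
    print(f"{name}: pixels vanish beyond ~{retina_distance_inches(dpi):.0f} in")
```

By this estimate, a 220 DPI panel already looks "pixelless" from about 16 inches, which is why extra density pays off most on devices held close to the face.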
Also, these displays are much, much more resource-heavy (in terms of graphics card processing power) and thus indirectly use more power.
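A quick sanity check on the pixel counts involved (just arithmetic; the bandwidth figure assumes a hypothetical 60 Hz, 32-bit framebuffer and ignores overdraw, textures, and compositing):

```python
# Pixels the GPU must fill each frame, and the raw framebuffer traffic
# at 60 Hz with 4 bytes per pixel -- a simplification of real workloads.

def per_second_bytes(w, h, hz=60, bpp=4):
    return w * h * hz * bpp

common = per_second_bytes(1366, 768)    # typical laptop panel
retina = per_second_bytes(2880, 1800)   # 15" Retina MacBook Pro

print(f"1366x768  @60Hz: {common / 1e9:.1f} GB/s")  # ~0.3 GB/s
print(f"2880x1800 @60Hz: {retina / 1e9:.1f} GB/s")  # ~1.2 GB/s, ~5x the pixels
```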
Pretty much. For Apple, it's just, "Hey, we make new stuff and you just buy it. Don't worry about the fact that the cost of the hardware, the cost of apps, data usage costs, heat output, graphics and processing requirements, and power usage are all going up." ;)
Actually, I'm totally fine with my 1024x600 display. I think the pixels are small enough.
If you're working close to your screen or doing detail-oriented work (photography, graphic design, 3D modeling), I see the benefits of the added clarity and sharpness. On my iPad, the Retina display is amazing; everything is sharp and crisp. Using it as a second monitor with "HiDPI" settings enabled on my Mac mini lets me see the added clarity in the two or so third-party programs I have that support it. Coda 2 is one app I have that supports Retina monitors (it actually launched with that support about three weeks ago), but nothing has made me go "wow" just yet. Then again, 1024x768 at 220 DPI doesn't leave a lot of room for amazing impressions.

The 400-dollar premium over the two 15" MacBook Pros is certainly semi-steep, but twice the RAM, four times the pixels, and the same seven-hour battery life is quite impressive. The only downside is that the SSD is sliced in size. I won't be joining the Retina Mac line just yet, but in the next two years I certainly plan to. When the iMac has a Retina display and the premium is low enough that there's effectively no cost difference, I'll definitely join the party.

Now, this topic isn't about Apple; it's about the use of Retina displays. I certainly use my available Retina display (iPad 3) as a second monitor when I edit photos. I don't need anything else, as my 37" 1080p TV provides a pixelless experience at the distance I sit from it (so, by that measure, Retina displays already existed?). Sitting about six feet away, it's almost impossible for me to see the pixelated circles and jagged diagonal lines in back and forward arrows.

There's also the amazing text clarity. I can't say how much this improves things like programming, but when I read, it's noticeable: small text becomes legible instead of pixelated (but still readable) words. If I were staring at text all day on a 27" monitor just two feet away, I'd probably favor a Retina display.

No one complained about monitor quality before, and it's not for every application, physically or programmatically. Certainly, as mentioned above, it's great for detail-oriented work. If you need it, great; otherwise, move on.
I (and others) have long lamented the stagnation of the LCD market. Before LCDs took over, you could easily run most CRTs at 1600x1200 (or higher); then everything fell off when LCDs became mainstream, to the point where everybody is happy with lame resolutions like 1366x768, or finds it exciting to have a gigantic 27" LCD at only 1920x1080.

I have no intention of buying an Apple computer anytime in the foreseeable future, but I'm glad to see them increasing DPI and resolution, because other OEMs are sure to begin adopting similar panels in response. Whatever the other drawbacks, having lots of pixels is nearly always better than having fewer.
Tari wrote:
I (and others) have long lamented the stagnation of the LCD market. Before LCDs took over, you could easily run most CRTs at 1600x1200 (or higher); then everything fell off when LCDs became mainstream, to the point where everybody is happy with lame resolutions like 1366x768, or finds it exciting to have a gigantic 27" LCD at only 1920x1080.

I have no intention of buying an Apple computer anytime in the foreseeable future, but I'm glad to see them increasing DPI and resolution, because other OEMs are sure to begin adopting similar panels in response. Whatever the other drawbacks, having lots of pixels is nearly always better than having fewer.

LCDs didn't cause that. IBM made a 3840×2400 22.2-inch IPS LCD monitor in 2001 (actually produced, as in you could go buy one, not just a one-off prototype). No, monitors becoming more and more widely used caused it: a decent 20"+ CRT cost a fortune, whereas a 27" 1920x1080 monitor is $250.

High-density displays are *awesome*, and I can't wait to have them as standalone displays that are somewhat affordable. Modern PCs can easily handle these super-high resolutions; the power is sitting there. And more than ever, content is now procedural or vector rather than bitmap (not just text: gradients, borders, and shapes on the web are most often generated by the browser, not stored as bitmaps).
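As a toy illustration of why procedural content scales essentially for free (this is just a sketch and has nothing to do with any browser's actual implementation):

```python
import numpy as np

# A gradient is a function of position, not a stored bitmap, so it can
# be rendered at any pixel density without shipping pre-made 2x assets.

def horizontal_gradient(width, height, start=0, end=255):
    row = np.linspace(start, end, width)
    return np.tile(row, (height, 1)).astype(np.uint8)

standard = horizontal_gradient(800, 600)    # 1x rendering
hidpi    = horizontal_gradient(1600, 1200)  # same gradient at 2x, just as smooth

print(standard.shape, hidpi.shape)
```

The same function renders the gradient at whatever size you ask for; only bitmap assets have to be redrawn and re-shipped at 2x.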

Apple's approach is pretty shitty because their GUI APIs were never designed with high density in mind, but I see it more as a stepping stone.
I certainly agree with Tari's point that the pixel density of things like huge 27" displays is despicable, and I appreciate Apple pushing LCD manufacturers to do better with their current technology. I think my gripes with the so-called "Retina" displays are more about my general gripes with Apple than about this technology specifically. First, every Joe Apple Fanboy Shmoe out there is going to start crowing about how Apple manufactures such amazing displays, and why can't Dell and Acer and Samsung and all the rest make such wonderful displays. Then I'll have to pull out my standard speech about how Apple doesn't actually make any hardware, and how Macs come out of the exact same factories churning out PCs and PC components. Without getting too far off-topic, I also despise Apple for continuing down a path of machines where literally nothing (not even the SSD, because it uses a ridiculous proprietary connector instead of mSATA or mPCIe) is user-serviceable or user-upgradeable.
I've never had a problem with 1024x768 displays, and I'll even go as low as 800x600 or 640x480 in a game if it gives me a significant performance boost (which, on my hardware, it usually does). I have no problem with people who can afford larger displays having access to them; they just aren't useful to me personally.
Operating systems can scale up the user interface to retain legibility on lower-dot-pitch displays, though results are mixed. Windows usually uses raster graphics rather than vector graphics, so this tends to result in ugly icons (and, as most software is not designed with this in mind, it can cause layout bugs). There's a site that demonstrates Vista's scaling taken to the extreme.
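To see why stretched raster assets look bad, here's a toy model in Python using Pillow. It's purely illustrative; Windows' actual scaling pipeline is more sophisticated than this:

```python
from PIL import Image, ImageDraw

# Enlarging an existing bitmap just stretches its pixels, while vector
# art can simply be re-rendered at the new size. Pillow stands in for
# the OS here; this is a toy model, not how Windows actually scales.

def draw_icon(size):
    """Render a simple circular 'icon' at the requested pixel size."""
    img = Image.new("RGB", (size, size), "white")
    ImageDraw.Draw(img).ellipse((2, 2, size - 3, size - 3), fill="black")
    return img

icon_16 = draw_icon(16)

# Raster path: upscale the 16px bitmap 2x -- edges come out blocky.
scaled = icon_16.resize((32, 32), Image.NEAREST)

# "Vector" path: re-render the shape at the target size -- stays crisp.
rerendered = draw_icon(32)

scaled.save("scaled_blocky.png")
rerendered.save("rerendered_crisp.png")
```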

I prefer a single high-resolution display over multiple medium-resolution displays, so I look forward to being able to buy a monitor with a decent number of pixels at a reasonable price. I just hope Windows 8 doesn't swing people the other way, as it's currently pushing the idea of running a single application at a time, full-screen, rather than multiple windows on a large desktop.
I love the high-density graphics, but I'm definitely upset about the lack of user serviceability in this new line of MacBook Pros specifically. Older models that I've used for years have always been perfectly reasonable to service (compared to other laptops), and the desktop workstations they make for people savvy enough to do self-maintenance are also quite easy to service. Even the Mac mini and iMac have upgradeable RAM and hard drives.

Quote:
Isn't there a maximum level of detail that human eyes can resolve, and aren't these displays getting close to passing it?

That's the point: the pixelation in your graphics should be so fine that everything looks smooth (at a certain minimum distance; eye resolution is angle-dependent).
Smooth is one way to put it. I always say it looks like it's been printed on high-quality photo paper, but however you put it, the pixels in the screen are too small to perceive any edge pixelation at and beyond the recommended viewing distance.