Size, Type, and Brightness of Display for Healthy Development [closed] - monitor

If you stare at a monitor for 8-12 hours a day looking at code, I have a couple of questions for those who may have researched the health factors of this or have tried a few options.
To be easy on the eyes, can a monitor be "too big"?
Is there a particular type of display technology over another that reduces eye fatigue?
How bright should your display be in relation to your environment? Is it less fatiguing to have a bright environment and a bright monitor than a darker environment?

If you're worried about eye-strain, don't forget the low-tech solution: every 30 minutes, lean back, close your eyes, and rest them for 10 seconds. Or, if you don't want to look like you're napping, gaze out a window or across the room. You should do this regardless of whether you're staring at a monitor, a book, or a sheet of music. Staring at anything for hours at a time is going to strain your eyes.
I use a free timer program to tell me when 30 minutes is up. Whenever I forget to do this, my eyes always feel itchy and tired by the end of the day.
I know this doesn't answer the precise question you asked, but I think you're looking in the wrong place for a solution. Rather than investing in a new monitor, just rest your eyes on a regular basis. There. I just saved you a few hundred bucks.
EDIT: References have been requested, so here they are. There's a decent scientific article on the value of microbreaks here and a review of the literature here.

I've always used the analogy between monitor size (resolution) and desktop size - larger screen, more space to spread out and work.
More important than the physical size is how you set it up - most people have their monitors set way way too bright.
I typically start with maximum contrast and minimum brightness, and work from there. The black on your screen should be real black, not dark gray; the white on your screen should be no brighter than a piece of paper held up next to it.
That said, I do have good screens. At work, dual 22" 1680x1050 LCDs; at home, dual 19" 1200x1024 CRTs; and my laptop is a 17" at 1920x1200. I've trialled a single 24" LCD - it was really nice, but not as wide as either dual-monitor setup.
Updated 1 Mar: The suggestion from rtpearson to look away from the monitor regularly is good advice.
I was told (years ago) that it is important for your eyes to change focal length regularly.
If you have a seat next to a window, glancing outside while you think is a good way to achieve this. "Walking an email" to a colleague on the same floor can help as well. Using a timer (such as this one I wrote) to remind you to take breaks and rest your eyes is also useful.
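For what it's worth, here is a minimal sketch of such a reminder timer, assuming Swift on macOS; it is not the program either answer refers to, just an illustration of the idea:

```swift
// A minimal break reminder: rings the terminal bell and prints a message
// every 30 minutes. Run it in a terminal and leave it going all day.
import Foundation

let breakInterval: TimeInterval = 30 * 60  // 30 minutes

_ = Timer.scheduledTimer(withTimeInterval: breakInterval, repeats: true) { _ in
    // \u{07} is the ASCII bell, so the terminal beeps as well as printing.
    print("\u{07}Time to look away from the screen for ten seconds.")
}

print("Break reminder running; press Ctrl+C to quit.")
RunLoop.main.run()  // keep the process alive so the timer keeps firing
```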

I'm not sure it matters. I've worked in investment banks where multiple high-res screens were the norm and am currently doing development work at home on a 9-year-old Sony laptop with a 1024 x 768 screen. I haven't noticed any difference in my productivity or my eyestrain in those very different environments.

In terms of brightness, what works for me is to adjust the brightness of the display to match the ambient light in the room. At the moment I am running a 24" Samsung Syncmaster and I have to say that I consider leaving it on the brightest setting to be a health hazard.

There are lots of websites to help you calibrate your monitor's brightness and contrast. This is just one: http://www.displaycalibration.com/brightness_contrast.html

I have a 24" Dell at home, but I doubt many companies would consider that for a development machine.
22" Wide with a resolution of 1680 x 1050 is good, and the price of those monitors are relatively cheap now.
Currently I am working on a 17" 1280 x 1024, as the laptop I got to dev on only got a meager 1280 x 800 screen, which is pretty much useless for coding.
IMO 2 x 17" or 19", or 1 x 22" or larger.
Note: most cheap LCDs have terrible color; for example, the orange on SO looks a pale yellow, and that's the best I can get it. The Dell, at five times the price of a cheap 24", does not have these issues, but you pay for it :( (I still think it was a damn good investment!)

24 inch is the minimum for me, and 1680x1050 is too few dots for effective coding. I prefer dual monitors at 1920 x 1200 or better. I'd really like a pair of 30 inch Samsungs, but I need to get richer. Brightness and all that other stuff has never much affected me; since I'm always coding at night anyway, it's not much of an issue.

If you use a CRT screen, make sure you set the refresh rate nice and high. 85Hz is a good value. The default rate on Windows of 60Hz is too low. The flickering makes me feel nauseous. The refresh rate on LCD screens doesn't matter due to high "persistence".
Most people don't know this and leave their screens at 60Hz. Strangely, however, from personal experience, if you tell them directly, "Your refresh rate is wrong", many of them will get defensive—about their refresh rate!—even though they probably don't even know what it is. People are strange. I'm glad LCDs are replacing CRTs.

Firstly, yes, there is a limit to useful screen size. I think a monitor should be no bigger than 30 inches. That's also the reason most brands only release monitors with screen sizes from 19 to 27 inches. Although you can also find a monitor with a 100-inch screen, it's not common. I guess the manufacturers did research and found the most acceptable range of screen sizes.
Secondly, there are already some technologies for this. For instance, BenQ has a technology called Flicker-free: "The Flicker-free technology eliminates flickering at all brightness levels and effectively reduces eye fatigue." There are other such features, too. I have also heard that some labs are working on e-ink monitors.
Thirdly, it's difficult to give an exact brightness number. It depends on the ambient light. Also, some people are sensitive to brightness and others are not. It's better to try different values and find what works best for you.

The app f.lux can really help at night if you're coding on a bright background. It reduces the blue in the screen and thus eye strain. Change the setting to 1hr instead of the default 20 seconds and you won't even notice it.
https://justgetflux.com/

Related

SceneKit scenes lag when resuming app

In my app, I have several simple scenes (a single 80 segment sphere with a 500px by 1000px texture, rotating once a minute) displaying at once. When I open the app, everything goes smoothly. I get constant 120fps with less than 50mb of memory usage and around 30% cpu usage.
However, if I minimize the app and come back to it a minute later, or just stop interacting with the app for a while, the scenes all lag terribly and get around 4 fps, despite Xcode reporting 30fps, normal memory usage, and super low (~3%) cpu usage.
I get this behavior when testing on a real iPhone 7 running iOS 10.3.1, and I'm not sure if this behavior exists on other devices or the simulator.
Here is a sample project I pulled together to demonstrate this issue. (link here) Am I doing something wrong here? How can I make the scenes wake up and resume using as much cpu as they need to maintain good fps?
I probably won't answer the question you've asked directly, but I can give you some points to think about.
I launched your demo app on my iPod 6th gen (64-bit), iOS 10.3.1, and it lags from the very beginning for about a minute at 2-3 FPS. Then after some time it starts to spin smoothly. The same happens after going background-foreground. It can be explained by some caching of textures.
I resized one of the SCNViews so that it fits the screen; the other views stayed behind. I set v4.showsStatistics = true.
And here is what I got:
As you can see, Metal flush takes about 18.3 ms for one frame, and that's for only one SCNView.
According to this answer on Stack Overflow: "So, if my interpretation is correct, that would mean that 'Metal flush' measures the time the CPU spends waiting on video memory to free up so it can push more data and request operations to the GPU."
So we might suspect that the problem is the 4 different SCNViews working with the GPU simultaneously.
Let's check it. Compared to the second point, I deleted the 3 SCNViews behind and put the 3 planets from those views into the front one, so that one SCNView has 4 planets at once. Here is the screenshot:
As you can see, Metal flush now takes up to 5 ms, right from the beginning, and everything runs smoothly. Also, you may notice that the number of triangles (top right) is four times what we see in the first screenshot.
To sum up, just try to combine all SCNNodes on one SCNView and possibly you'll get a speed up.
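If it helps, here is a rough sketch of what that combination might look like, assuming Swift; the texture names and positions are made up, not taken from the sample project:

```swift
import SceneKit
import UIKit

// One scene and one SCNView hosting all four planets, instead of four
// separate SCNViews each doing its own Metal flush.
let scene = SCNScene()

for index in 0..<4 {
    let sphere = SCNSphere(radius: 1.0)
    sphere.segmentCount = 80
    // Hypothetical texture names; substitute the app's real 500x1000 textures.
    sphere.firstMaterial?.diffuse.contents = UIImage(named: "planet\(index)")

    let planet = SCNNode(geometry: sphere)
    planet.position = SCNVector3(x: Float(index) * 2.5, y: 0, z: 0)  // spread them out side by side

    // One full rotation per minute, as in the original scenes.
    let spin = SCNAction.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 60)
    planet.runAction(SCNAction.repeatForever(spin))

    scene.rootNode.addChildNode(planet)
}

let sceneView = SCNView(frame: UIScreen.main.bounds)
sceneView.scene = scene
sceneView.showsStatistics = true  // keep an eye on the Metal flush time
```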
So, I finally figured out a partially functional solution, even though it's not what I thought it would be.
The first thing I tried was to keep all the nodes in a single global scene as suggested by Sander's answer and set the delegate on one of the SCNViews as suggested in the second answer to this question. Maybe this used to work or it worked in a different context, but it didn't work for me.
How Sander ended up helping me was the use of the performance statistics, which I didn't know existed. I enabled them for one of my scenes, and something stood out to me about performance:
In the first few seconds of running, before the app gets dramatic frame drops, the performance display read 240fps. "Why was this?", I thought. Who would need 240 fps on a mobile phone with a 60hz display, especially when the SceneKit default is 60. Then it hit me: 60 * 4 = 240.
What I guess was happening is that each update in a single scene triggered a "Metal flush", meaning that each scene was being flushed 240 times per second. I would guess that this slowly fills the GPU buffer (or memory? I have no idea), and eventually SceneKit needs to start clearing it out, and 240 fps across 4 views is simply too much for it to keep up with (which explains why it initially gets good performance before dropping off completely).
My solution (and this is why I said "partial solution") was to set the preferredFramesPerSecond for each SCNView to 15, for a total of 60 (I can also get away with 30 on my phone, but I'm not sure if this holds up on weaker devices). Unfortunately, 15 fps is noticeably choppy, but it's way better than the terrible performance I was getting originally.
Maybe in the future Apple will enable unique refresh rates per SCNView.
TL;DR: set preferredFramesPerSecond so it sums to 60 over all of your SCNViews.
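For reference, a minimal sketch of that workaround, assuming Swift; v1-v4 are stand-ins for the four SCNViews in the sample project:

```swift
import SceneKit

// Hypothetical placeholders for the four existing SCNViews.
let v1 = SCNView(), v2 = SCNView(), v3 = SCNView(), v4 = SCNView()

for sceneView in [v1, v2, v3, v4] {
    // 4 views x 15 fps = 60 fps total; 30 per view may also be fine on faster devices.
    sceneView.preferredFramesPerSecond = 15  // SCNView property, available from iOS 10
}
```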

app burns numbers into iPad screens, how can I prevent this?

EDIT: My code for this is actually open source, if anyone would be able to look and comment.
Things I can think of that might be an issue: using a custom font, using bright green, updating the label too fast?
The repo is: https://github.com/andrewljohnson/StopWatch-of-Gaia
The class for the time label: https://github.com/andrewljohnson/StopWatch-of-Gaia/blob/master/src/SWPTimeLabel.m
The class that runs the timer to update the label: https://github.com/andrewljohnson/StopWatch-of-Gaia/blob/master/src/SWPViewController.m
=============
My StopWatch app reportedly burns numbers into the screens of a number of iPads, for temporary periods. Does anyone have a suggestion about how I might prevent this screen persistence? Some known workaround to blank the pixels occasionally?
I get emails all the time about it, and you can see numerous reviews here: http://itunes.apple.com/us/app/stopwatch+-timer-for-gym-kitchen/id518178439?mt=8
Apple could not advise me. I sent an email to appreview, and I was told to file a technical support request (DTS). When I filed the DTS, they told me it was not a code issue, and when I further asked for help from DTS, a "senior manager" told me that this was not an issue Apple knew about. He further advised me to file a bug with the Apple Radar bug tracker if I considered it to be a real issue.
I filed the Radar bug a few weeks ago, but it has not been acknowledged. Updated Radar link for Apple employees, per a commenter's note: rdar://12173447
It's not really a "burn in" on a non-CRT display, but there can be an image persistance/retention issue on some LCD display panel types.
One way to avoid both is to very slowly drift your image around, much more slowly than a screen saver. If you move your clock face around a small amount and very slowly (say several minutes to make a full circuit of only a few dozen pixels), the user may not even notice this happening. But this motion will blur all fine lines and sharp edges over time, so even if there is a persistance, the lack of sharp edges will make it harder to see.
Added:
There is also one (unconfirmed) report that flashing pixels at the full frame rate may increase the possibility of this problem. So any in-place text/numeric updates should happen at a more humanly readable pace (say 5 to 10 fps instead of 30 to 60 fps), if repeated for very long periods of time. The app can always update the ending number to a more accurate count if necessary.
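A minimal sketch of both ideas (slow drift plus a throttled update rate), assuming Swift/UIKit; the class and label names are hypothetical and not taken from the StopWatch app:

```swift
import UIKit

// Hypothetical names; not the classes from the StopWatch app.
final class DriftingClockViewController: UIViewController {
    private let timeLabel = UILabel()
    private var updateTimer: Timer?
    private let startDate = Date()

    override func viewDidLoad() {
        super.viewDidLoad()
        timeLabel.frame = CGRect(x: 40, y: 100, width: 240, height: 60)
        view.addSubview(timeLabel)

        // Drift the label around a 20-point circle, taking five minutes per circuit,
        // slowly enough that the user shouldn't notice the motion.
        let drift = CAKeyframeAnimation(keyPath: "position")
        drift.path = UIBezierPath(arcCenter: timeLabel.layer.position, radius: 20,
                                  startAngle: 0, endAngle: 2 * .pi, clockwise: true).cgPath
        drift.duration = 300
        drift.repeatCount = .infinity
        timeLabel.layer.add(drift, forKey: "slowDrift")

        // Update the digits about 5 times a second instead of every frame.
        updateTimer = Timer.scheduledTimer(withTimeInterval: 0.2, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.timeLabel.text = String(format: "%.1f s", Date().timeIntervalSince(self.startDate))
        }
    }
}
```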
"Burn in" is due to phosphor wearing in CRTs. LCDs cant have burn in since they dont use phosphor.
More likely it is image retention/Image Persistence. An image can remain 'stuck' on the screen for up to 48 hours. Usually it shouldnt last that long so it may be a defect in their hardware too. MacRumors has a thread about iPad image retention, it discusses this very issue. As for a solution, there is nothing you can do about the actual screen because its a just how LCD's work. What I would try if you are still concerned is using more subtle colors. Unless something is actively changing the pixels (think screen saver) you arent going to be able to completely eliminate the problem.

Is it reasonable to consider a future Retina / HD iPad when starting a new project?

A few days ago a client asked me if the transition to the iPhone 4's Retina display was a difficult one, development-wise.
This made me ask myself whether I should have considered iPhones with high-resolution displays even before the iPhone 4 had been announced - creating artwork at higher resolution, preparing code paths... (while, of course, creating high-resolution artwork is never a bad idea, considering its use for marketing, porting to other platforms, etc.)
Now, with the iPad having been around for some months, the first rumors of a future iPad with a Retina display emerge from the depths of the web. And I start wondering - would it make sense to prepare new projects for such an iPad? I'm pretty sure that Apple will in fact release a Retina iPad at some point in the future, because it would be quite a logical step. So, I guess the important question is "how soon can we expect such a device?". There is much to consider when thinking about that, most of all production difficulties and the impact of a resolution of 2048 x 1536 (if Apple sticks to simply doubling the "old" specs) on a mobile device's performance...
So, what do you think? Will it pay off to prepare new projects for a Retina iPad, starting now? Or do you think the overhead is not worth it yet?
Maybe some of you are already developing with the retina iPad in mind..?
I'd be glad to hear some of your thoughts! Thanks a lot, guys!
Edit:
Well, Apple just answered my question. Yes, it was in fact reasonable to consider a Retina iPad..!
I wouldn't spend too much time making your app work on a theoretical device. But that doesn't mean you can't be prepared. Ever since they started changing things around I've been considering the following:
Use vector art wherever practical. That way resizing should be simple
Don't assume that the screen is 768x1024 or 320x480. Try to make your views gracefully resize (see the sketch below)
Don't assume that there will be an on-screen keyboard
So far Apple have allowed time between announcing products and making them available, and even there un-optimised apps have still worked.
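For the second point, a minimal sketch of sizing from the actual bounds rather than hard-coded dimensions, assuming Swift/UIKit (the class name is made up; today you would likely reach for Auto Layout instead):

```swift
import UIKit

final class ResizableViewController: UIViewController {
    private let contentView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Derive the frame from whatever screen this actually runs on,
        // instead of assuming 768x1024 or 320x480.
        contentView.frame = view.bounds.insetBy(dx: 20, dy: 20)
        // Let the view track future size changes (rotation, bigger screens).
        contentView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(contentView)
    }
}
```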
Most of my work is for a client who has their own designer, who provides me with layered Photoshop files to pick image elements out of. I now have a policy with them that ALL images will be provided to me at double resolution. I don't care if it's just text, if it's only going to be on the iPad, I want it at 2x no matter what.
That takes a lot of thinking and judgement out of the hands of the designer (who's a good designer but not a particularly good technician or strategist), and allows me maximum flexibility in what I'm building.
Right now, I don't think I'd build @2x support into an iPad app just yet (although presumably 4.2 will allow you to do it and have it downgrade nicely, just like 4.1 does), but I have the graphics here ready to install when needed.
A few of Apple's apps (such as iBooks) have already been seen in the wild with @2x iPad graphical elements (mistakenly?) left in, so it is clear that a Retina iPad is coming as soon as it is practical for Apple to affordably include such an incredibly hi-res panel.
It might be later this year, it might be a year from now, or it might be two years from now.
It doesn't hurt at all to prepare now though. It is easy to downres graphics, but it is often impossible to upres graphical elements without redoing them from scratch.
So, short answer - do everything at @2x resolution now, but wait to include it in your app until the time is right. When Apple issues the call for Retina iPad apps, you'll be ready to go and able to be featured on day 1.
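As a small illustration of why having the @2x files ready pays off (a sketch assuming Swift/UIKit and a hypothetical asset name): UIImage(named:) picks up the @2x file by itself on a 2x screen, so shipping both files is usually all that's needed.

```swift
import UIKit

let scale = UIScreen.main.scale   // 1.0 on the original iPad, 2.0 on a Retina one
print("Running at \(scale)x")

// With clock.png and clock@2x.png both in the bundle, UIKit chooses
// the @2x version automatically when scale == 2.0.
let clockFace = UIImage(named: "clock")
print("Loaded image scale: \(clockFace?.scale ?? 0)")  // 2.0 when the @2x asset was chosen
```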
I'm going to agree with the others. I'll go out on a limb and say I think it is highly likely that a Retina iPad will have 2x the horizontal and vertical resolution of the current iPad screen, just like they did with the iPhone, because it is such a freaking clever idea in terms of the relative ease of supporting the new resolution for developers, the backwards compatibility with apps that have not been updated, and it also gives Apple a mechanism for preventing developers from making an I'll-cram-in-more-UI-on-the-high-resolution-version interface...
So absolutely, planning ahead for this is a good idea. That said, the ideal would be to plan for full resolution independence where possible, using vector artwork and so on so you can re-export at new resolutions with minimum hassle.

Is using a widescreen monitor in portrait orientation more effective for coding? [closed]

In the very near future my development setup will be upgraded and part of the deal will be dual monitors (yay!)
At least one of the monitors, possibly both, will be widescreen.
I've heard of developers using a second monitor, especially a widescreen monitor, in portrait mode. It allows for many more lines on the screen (albeit narrower) and runs a bit like having a long page of code.
Does anyone out there use this and think it's more effective?
I actually have 3 widescreen monitors in portrait mode and yes, it's a fantastic way to work. There's so much less scrolling around and you can fit all your debug / output / reference windows on screen at once.
The problem with using two monitors is that you'll generally be working on one main one and have output (or whatever on another). If you do have two, set it up so that your primary monitor is directly in front of you and the other (less frequently used) one is off to one side. I find that to be the best way to use a dual-monitor set up as it reduces RSI from being permanently twisted to look at a particular screen.
Additionally, there are some programs available to provide virtual screen splits, which I've found very useful for large/widescreen monitor setups.
[edit] ..and yes, you should write functions short enough to fit on a single page, but being able to see more functions at any one time can often make development easier in my experience :-)
[edit2] Running Visual-Studio-esque IDEs in portrait on a widescreen monitor is fantastic when it comes to debugging compile errors as you have more useable space to see code and errors at the same time. I suppose you could argue that if you compile regularly enough though, you shouldn't see that many errors at one time? ...but who codes like that? ;-)
Since you shouldn't write functions that are longer than a screen, making the screen much longer is a little bit of cheating, isn't it? ;)
Anyway, I found portrait mode not really better for coding; only with my old 17" widescreen in portrait mode was viewing/editing documents better. With two large screens in landscape mode, you can put two pages on a screen when viewing documents, and have many tool windows open at both sides of the IDE's text editor. So no, portrait mode is not better, unless you have four of them to make up a really large screen (there was a photo of such a setup on a Microsoft blog, but I don't remember where).
There are some applications where portrait is still better, though, e.g. if you have to show a document in large resolution, or if you have some monitor (as in network monitor) running and want to see more lines at once.
I can't imagine how that would speed up productivity. In my opinion, it is always easier to scroll up/down than left/right.
It depends on which IDE you use, if any.
Microsoft Visual Studio likes to take up a lot of the width of the monitor with its “Toolbox” and “Solution Explorer”, so I find it works better on a landscape monitor. As it will not let you undock an editor window, you could not even drag a code editor to a second monitor that was in portrait mode.
Also consider how your customers are most likely to have their monitors set up. You may wish to write any UI code with the same setup, so you get a feel for what the application will be like to use.
Depends how big your monitor is. We have 1 28" monitor in landscape and 2 24" monitors in portrait which flank the big monitor.
Works great for pair-programming!
At work, I run my primary monitor (the secondary is the laptop screen) in portrait mode. I really like it. I've become spoiled by seeing more code at once. I don't find that it encourages longer methods at all. Occasionally, I run across code that is a bit too wide since the IDE sidebars cramp it a bit, but I largely use Eclipse (Rational Application Developer, but Eclipse-based), so a quick double-click maximizes the code window, which is very useful. Another double-click and I have my sidebars back.
I also find it a very useful orientation for my email.
I recommend it highly.
Portrait-mode widescreen monitors work very nicely for editing code, thank you. However, some monitors have poor viewing angles in one dimension, which would usually be vertical but becomes horizontal in portrait mode. This can make the colours bad or unusable if everything isn't aligned correctly.
I have never given it a try but I would imagine it would work pretty well. I personally like to keep my lines fairly short, and wide screens tend to give me fewer lines of code, so I would give it a try.
It all comes down to personal preference however, what ever allows you to be the most productive and works best for you is the way to go.
For me it's not effective at all. I use IDEs, so in landscape mode I have sidebars to navigate code, navigate project etc.
It's not silly but a matter of opinion. A widescreen in portrait is very nice for writing code; code width has never really been an issue, and being able to see more lines of code on the screen is always nice.
The other reason to put a widescreen in portrait is so it matches the height of your other monitor; for example, a 30" widescreen next to a 22" widescreen in portrait are close to the same height.
It all comes down to your preference.
I just have one big monitor at my home office.
I tried it once. I didn't like it. I usually have an IDE and IDEs are perfect for widescreen. It's faster to jump around if you can see your function list on the right, file list on the left, etc.
Also, I try to keep my functions small, so this usually isn't a problem (I have dual 24" monitors). If your functions are reasonably small and you have widescreen, you can show two files side by side, which is often more useful. Some editors allow you to split the window and scroll to two different parts of the same file. This is also very useful and far better than having 100+ lines on the screen. With my settings, I have 60 lines per screen in an editor. If I split the editor, I can see 120. If I do it again on the other monitor, I can see 240. That's quite a bit of code, and generally only useful for very different parts of it.
If you're working mostly with text (as most programmers or other technical folks do), or even documents, then portrait mode is much more valuable. In fact, the general trend in displays is in exactly the wrong direction: aspect ratios are squishing landscape displays into a mail slot to better fit the format of movies. Personally, I have never watched a movie on my computers (laptop or desktop), and I'm not about to start now - that's what I have a TV for!
In reality, vertical pixels are the most valuable asset in computing - do whatever you can to get more of them - you won't be sorry you spent the money! I won't even buy a laptop with fewer than 1024-1080 vertical pixels, since that's the minimum required to display a full-page PDF at a readable resolution, and (much) more is better. (Since PDFs make up a large portion of today's online documentation/manuals, that's a very big concern.) You should only think about width after you've got enough vertical pixels.
What I really want is a 15.4" or 16" laptop with a portrait screen - these should still be wide enough to package a full-size keyboard into the base - a FlyBook-style pivot arm would be nice, but isn't required.
I find that understanding the intent of related functions is easier when you print them out on paper first than when reading them directly from the screen. It never fails. Why? Because you can easily review many lines of code at a glance, with no need for incessant scrolling.
The same goes for a monitor oriented in portrait mode: you can easily understand the intent of multiple related functions, refactored or otherwise. But don't let having a portrait screen be an excuse to write functions with many lines.
Writing this on stackoverflow using portrait screen :-)
I can easily see many posts at one glance :-)
If you are working with print material, yes. As for source code, why not full-screen your IDE and close the task panes you do not need?
I find portrait is only useful to me if I'm working on a web site, being able to see the entire page at once helps.
I would say if the monitor is large enough you don't need portrait mode (24" and higher) for writing code.
If the monitor is smaller than that, then portrait mode is preferable.
Ideally what you would have is a single 30" (2560 x 1600) as widescreen to work on your code along with utilities comfortably open nearby and a second smaller monitor nearby to preview the results (I am speaking about web coding specifically here but it would probably apply to most other coding as well - a screen the size of your target audience's screens).
The 30" screens have really come down in price now so it's probably worth the jump up. A 24" screen does have the advantage of significantly larger text at default font sizes. The text on 30" monitors can get to be a bit of grind unless you move up to 14pt.
Good luck.
I have 2 19" monitors currently. One I keep in landscape and one I keep in portrait mode. I find that working on documentation or reading long web pages is easier on the portrait screen. I have used this setup for coding also and find that it does help, however it was a learned habit. lol

Is a glossy or matte LCD screen better for long coding sessions? [closed]

I'm looking at getting a new LCD monitor, but I'm concerned that a glossy monitor might cause more eye strain after a long day of work. I typically spend a lot of time in front of my monitor, so eye strain is definitely something I have thought about. Do you prefer the matte or glossy LCD screens and why?
Matte, because you get fewer reflections on it, which is good if your workplace is bright. I've worked with both, but especially if you have bright objects around your monitor, or windows at the side, you'll really want to have a matte one.
Constantly having reflections in it is really annoying, and hurts the readability in the long run.
I have a glossy screen on my laptop and I have an LCD standalone monitor that I hook up. I like them both for different reasons.
Reasons I like my glossy monitor:
Graphics look great
Colors look vivid
Presentations look awesome
Graphics are really easy on the eye and just seem to flow like on TV
Great for games, which all look phenomenal
It's widescreen, great for movies
I can see all my code without having to scroll right [because of the widescreen]
Reasons I like my Standalone Matte Monitor:
My code is easier to read [Consequently this is what I use for programming]
Graphics I design on my glossy screen don't always look great on my matte monitor, but graphics I design on my matte screen always look fantastic on both.
It's bigger (may not be relevant to you)
It has a higher contrast ratio and better backlight
If you're somewhere bright or have a light source behind you, i.e. you're sitting with your back to the sun, the glare can be intolerable on both screens... whichever has the highest "Bright" setting will win out here.
What I find a lot of people say about "You should use matte because..." or "You should use glossy because..." are just repeating what the guy in FutureShop or CompUSA spewed out trying to sell them what they ultimately bought.
I have one of each, I love having one of each and love them both for different reasons. Pick the one that's best at whatever you're going to be using it for.
My suggestion is this: Find somewhere you can try them both out side by side for what you're going to be using it for, or if you can, try each of them out for a few days to decide.
I prefer glossy, as long as the light isn't shining on it. Check where your computer is, and where your lights are and where your windows/skylights are. Otherwise, I would always use matte.
I think this is pretty much a personal choice. I used to think that glossy is unbearable until I got a laptop with a glossy screen and was forced to work with it for some time. Now I don't even care too much and don't feel that it's much worse. If I had a choice, I'd still choose a matte one, however.
I went through the same dilemma when I bought my current laptop. I'm an old timer and I didn't want the glossy screen. I almost bought a different one because I wanted the matte screen. I would go into Circuit City and Best Buy and I would hate the glare. I then used one at a friends house for a few hours in more real world conditions and I liked it. I bought a glossy one but I was still torn about my decision.
Now I'm glad I got the glossy one and I wouldn't buy a laptop without it. Not only do most things look better on it, but it has a great viewing angle. I tend to use my laptop a lot when demoing stuff or working with a user. You can't beat the viewing angle of the glossy displays.
After two years of using it I rarely run into situations where I run into glare from bright lights. The few times I do, just a slight repositioning is all that is required to fix it.
When I replace my current monitors they will be glossy. The issues with a laptop, because it's mobile, just don't really exist with a desktop monitor. Neither of my two desktop environments present a glare issue. I currently have one glossy and one matte at work and I don't really see a difference between one and the other as far as eye strain. For me it's all viewing angle and how great stuff looks on the glossy one.
I think glossy gets too much stick. The reflections can be a nuisance in the wrong environment, but matte screens don't give the pure, unmolested picture that some people seem to think - incoming light gets diffused over the screen's surface by the anti-reflective coating hence the sharpness and contrast that the underlying panel is able to offer are degraded somewhat.
