Stereo 3D causes strabismus in adults!

Noticed this bit of news on /. today and it should temper the recent enthusiasm for stereo 3D promulgated by 3D cinema.

  1. SEGA discovered how serious the danger was for children 15 years ago, after asking the Stanford Research Institute in Palo Alto to study the safety of VR, and buried the results: http://www.audioholics.com/news/editorials/warning-3d-video-hazardous-to-your-health

  2. While that article only discusses kids, it is now known that the eye-muscle training setups used to treat strabismus (and which, conversely, can induce it), which rely on the same effect as stereo 3D (forcing a convergence different from the default for the subject’s eyes), are also effective in adults: http://www.strabismus.org/all_about_strabismus.html#latetreatment

1 + 2 = if you use stereo 3D regularly and intensively, you will develop strabismus, period.

What this means is that shutterglasses, VR helmets, polarized projection with passive glasses, and all autostereo like Cubicvue’s stuff, parallax barrier, and lenticular arrays, are ALL DANGEROUS TO YOUR HEALTH unless used sparingly.

I would be surprised if we don’t see class action lawsuits start popping up within years. Only displays that provide consistency between stereopsis (convergence) and accommodation (focus) are medically safe for significant usage. Since holographic and volumetric displays are not really practical, this means that only tunable focus microlens arrays and tunable focus direct retinal projection are left…

And you know what ? Looking at a book or computer screen for hours a day will cause myopia, period.

Seriously, while I can understand point 1), your 1+2 conclusion is really exaggerated.

It is true that bad 3D content may harm your eyes more than good content, which has to be carefully set up so that the main subject of the scene is at screen depth and the maximum eye separation is not too extreme.
But staring for hours at heavy strobing or strong visual illusions in 2D TV content will also hurt a lot.

This smells like someone has an interest in undermining Nintendo’s and Sony’s 3D efforts :wink:

Maybe…

But presuming you aren’t just an ambulance-chasing lawyer, note that that’s not the most serious potential side effect. I’ve heard that bad 3D where the views fed to each eye are displaced “vertically” relative to one another (which is totally foreign to any human’s binocular vision) can cause seizures in some people (though I haven’t read any studies on it). Then again, the mere flickering off/on of polarized stereo viewing from a common video source can cause seizures in some people (especially at low refresh rates), and there are disclaimers all over the place about that.

So yeah, life comes with some risk. News at 11. If you aren’t comfortable with the risk, don’t buy it.

This is a pretty serious and insulting accusation. Maybe you should note that one can insert any of the other major TV makers besides Sony, as they all either have 3D models or upcoming ones.

Both of the links I posted were in a Slashdot headline yesterday. Maybe you should email their respective webmasters and level the same accusation.

I’m a 3D enthusiast and started using shutterglasses for gaming over a decade ago. I did not know about this issue until I saw the Slashdot post, and I am quite disappointed. But now I’m glad that I didn’t use these glasses too much, as there’s a serious chance it could have damaged my eyesight.

I am not surprised to hear lots of denials, of course, and there might be official ones, especially if the first article spreads. But that’s how it’s always been… http://www.youtube.com/watch?v=VpwcF3Malj8

This was posted elsewhere; highlights are mine:

[quote=8:13;18832593]Problems of stress on the visual system have been most obvious in HMDs. While poor engineering design or incorrect calibration for the user can be a source of visual stress, a problem less easy to avoid is the challenge to the accommodation-vergence cross-links. Current stereoscopic VR displays provide an illusion of depth by providing each eye with a separate 2D image on a fixed focal plane. The mechanisms of binocular vision fuse the images to give the 3D illusion. Because there is no image blur, [b]the eyes must make a constant accommodative effort. But at the same time the images stimulate a changing vergence angle with changes in apparent depth, so that the normal cross-linked relationship between the systems is disrupted[/b] [Mon-Williams & Wann 1998]. The problem is not limited to HMDs as ANY stereoscopic display, from a stereoscopic desktop to immersive systems such as the CAVE, uses the same display method [Wann & Mon-Williams 1997]. Within certain limits the visual system can adapt, as shown by results of orthoptic exercises and of adaptation to different prisms placed in front of each eye. However, whether the changes are long term or whether there can be dual adaptation to both the real and virtual environments has not been established [Rushton & Riddell 1999].

[b]What has been shown in several studies is that short-term exposure to VEs with stereoscopic displays has produced changes in heterophoria (latent squint), where the visual axes of the eyes deviate from their usual position. The resting vergence angle of the eyes may be altered either in the direction of exophoria (turning outwards of the eyes) or esophoria (turning inwards of the eyes). Some decrements in visual acuity have also been reported.[/b] These objective changes, which must be assessed using orthoptic instruments, are associated with reports of subjective symptoms such as blurred vision, headaches, eyestrain or momentary diplopia (double vision). The degree of objective change and the symptomatology also depended upon the VR system or VE being evaluated [Costello & Howarth 1996; Mon-Williams & Wann 1998; Mon-Williams et al. 1993]. The reported changes in heterophoria could account at least in part for the subjective symptoms as well as reduced visual acuity and reduced perception of depth when relying on stereopsis. These changes are similar to those reported with the use of NVGs and thought responsible for the reduced depth perception [Sheehy & Wilkinson 1989]…Certainly longer exposures in flight simulators result in greater severity of symptoms overall [Kennedy, Stanney, & Dunlap 2000].

http://dspace.dsto.defence.gov.au/dspace/bitstream/1947/4079/1/DSTO-TR-1419%20PR.pdf[/QUOTE]

About the vertical displacement:

Unlikely with computer graphics, but small vertical displacement is handled well by (some?) humans. I learned that to my surprise at the end of a short 3D film (polarized glasses, big silver screen, dual 35mm projectors). When I removed the glasses, I noticed that the two images of the credits showed a very significant vertical shift (maybe half a letter or so). Yet for more than an hour it had been unnoticeable and did not prevent good stereo perception.

“Hack your brain” section, to try in front of a bathroom mirror (bla bla you may be hurt and/or learn new things doing this bla bla no responsibility if you follow these instructions bla bla):

  • stare precisely at one of your eyes, close enough to see fine details in your iris
  • tilt your head by maybe 5° to 10°, while still staring at your iris
  • note that even though the head is tilted, the eye keeps the same angle relative to the horizon! I checked later; indeed, the human eye has three muscular axes.
  • you can even tilt pretty quickly from one side to the other, and the eyes will still keep the same angle
  • that also means that the images seen by the two eyes are now vertically displaced relative to each other

Sorry, I have to leave you now and get back to playing TrackMania Nations Forever for hours in red/cyan stereo, with great pleasure.

Anyone who wears glasses will get a slight horizontal and vertical displacement compared to not wearing them (especially when they are a few years old, as they can become quite bent over time).
If one lens is much more powerful than the other then the two images will have different relative sizes, which has got to be harder for the brain to compensate for than a simple displacement.

The focusing position of the muscles attached to the lens of the eye is completely different with glasses on and off, so I can’t see how a slight mismatch between stereopsis and focus for a 3D image could be any worse.
If anything, the extra exercise of the eye muscles should delay the natural loss of focus accommodation as you age.

You are completely wrong, as a basic review of optometry will tell you: Primary Care Optometry, Theodore Grosvenor, p. 310 (Google Books, pg=PA310).
And in there note the following:

Fortunately, the effects due to increased (or decreased) accommodation often combine with the effects due to the lack of induced prism by decentration to cause the fusional vergence demand to change little.

The eye is close enough to the lens of the glasses that the varying thickness of the lens across the horizontal axis gives a prism effect, and that compensates for the change in vergence due to different accommodation as a result of the sphere/cylinder effect of the lens.
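To make the decentration argument concrete, here is a rough sketch using Prentice's rule (prism in prism dioptres = decentration in cm times lens power in dioptres); the numbers are my own, not from the book:

```python
def induced_prism(decentration_mm, lens_power_D):
    """Prentice's rule: prism (prism dioptres) = decentration (cm) x lens power (D).
    One prism dioptre deviates light by 1 cm at 1 m (about 0.57 degrees)."""
    return (decentration_mm / 10.0) * lens_power_D

# Hypothetical example: a -4 D myope converging on a near object, with each
# visual axis passing about 2 mm nasal of its lens's optical centre.
prism = induced_prism(decentration_mm=2.0, lens_power_D=-4.0)
print(f"induced prism per eye: {prism:+.1f} prism dioptres")
# Negative here = base-in for a minus lens: the prism reduces the convergence
# the eyes must supply, moving in step with the change in accommodative demand
# through the same lens, which is the "change little" compensation the quoted
# passage describes.
```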

The page you quoted is comparing glasses with contact lenses of the same magnification, not glasses with uncorrected eyesight.
It also makes no mention of glasses that are bent so that the two lenses are no longer aligned.

If I tip up one side of my glasses 5mm I can see double vertically.
I can pull the images back together with my eye muscles, but I can feel the strain and could not do it for long.
If I move my glasses to the tip of my nose then I can focus on much closer objects; pushing them back moves the focus further away.
Glasses move around on the face as you move, so the eye’s lens and convergence muscles are continually having to correct for random errors that don’t occur with natural eyesight.

If you view a 3D system that is set up for a different eye separation then the muscles that turn the eyes inward to look at very close objects will correct the alignment.
A small error will be no problem; a large error will make those muscles ache after a while.

The effect that you seem to be concerned about, however, is that the eye is continuously focused at a fixed distance which does not always correspond to the depth that the eyes are converged on.
It is the convergence that is interpreted as depth, and as long as the separation between the images is not excessive, reversed, or distorted in such a way that the two images can’t be aligned, how can it cause strabismus?
The focusing inconsistency should not cause any worse eyestrain than looking at anything close up for extended periods would anyway.
With a large-screen 3D TV the viewer is going to be several metres away, so the eyes’ focus will be almost fully relaxed anyway, making the focusing inconsistency undetectable.
A screen about 6m away would create a binocular image that is indistinguishable from reality (as long as you don’t tilt your head to one side).
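To put rough numbers on “almost fully relaxed” (a back-of-the-envelope sketch of my own, not from any of the linked articles):

```python
# Accommodation demand is the reciprocal of the viewing distance in metres
# (ignoring spectacle correction and depth of field).
for distance_m in (0.3, 0.6, 2.0, 6.0, 1e9):
    label = "infinity" if distance_m >= 1e6 else f"{distance_m:g} m"
    print(f"{label:>8}: {1.0 / distance_m:.2f} D of accommodation")
# A screen 6 m away asks for ~0.17 D versus 0 D for true infinity, while a
# desktop monitor at 60 cm asks for ~1.7 D, so the accommodation side of the
# story really is close to fully relaxed for a living-room screen.
```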

“Random” is key here. It’s analogous to the way a tourbillon mechanism keeps a high-end mechanical watch more accurate: the variation in the direction of the biasing averages out much of the bias.

[quote]A small error will be no problem; a large error will make those muscles ache after a while.[/quote]

The human visual system detects blurriness as defocus and adjusts accommodation to minimize it; it uses binocular disparity to adjust vergence. But the two feedback mechanisms are not independent. Through proprioception, the brain has information about the eye’s accommodation at any given time, and this information also feeds into the neuromuscular mechanism that sets vergence. This is standard neurophysiology. With flat stereo displays, the accommodation-driven vergence is different from the binocular disparity-driven vergence. The latter takes precedence, and stereopsis works despite the conflict because binocular disparity is the highest-priority depth cue in the human visual system (followed by monocular cues such as motion parallax, texture gradient, relative size, overlap and known object sizes, then proprioception from vergence, then proprioception from accommodation). However, the tension still exists, and you get neuromuscular learning that makes the eyes tend to a different vergence for a given accommodation than when viewing real scenes. The precedence listed above limits the effect but doesn’t eliminate it, and it is the same mechanism exploited in strabismus treatment with prism glasses.
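For concreteness, a minimal sketch of that conflict in numbers (my own toy figures, assuming a 6.3 cm interpupillary distance): accommodation is locked to the physical screen while disparity drives vergence to the simulated depth.

```python
import math

IPD = 0.063  # assumed interpupillary distance, metres

def vergence_deg(distance_m):
    """Total convergence angle between the two visual axes for a point
    fixated straight ahead at the given distance."""
    return math.degrees(2.0 * math.atan((IPD / 2.0) / distance_m))

screen_m = 0.6      # physical distance of a desktop 3D monitor (assumed)
simulated_m = 0.25  # depth the on-screen disparity places the object at (assumed)

print(f"vergence demanded by disparity : {vergence_deg(simulated_m):.2f} deg")
print(f"vergence matching accommodation: {vergence_deg(screen_m):.2f} deg")
# Accommodation stays locked at 0.6 m (the physical screen) while disparity
# drives vergence to 0.25 m; that gap is the accommodation-vergence conflict
# described above, and what the prism-glasses analogy refers to.
```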

[quote]With a large-screen 3D TV the viewer is going to be several metres away, so the eyes’ focus will be almost fully relaxed anyway, making the focusing inconsistency undetectable.
A screen about 6m away would create a binocular image that is indistinguishable from reality (as long as you don’t tilt your head to one side).[/quote]

??
Vergence is an angular measure. A screen distant from you would have a correspondingly larger horizontal stereo separation to give the same 3D effect as a handheld screen with a much smaller separation.
The visual system expects a given vergence for a given accommodation. The eye being relaxed corresponds to distant focus, and the visual system expects nearly zero vergence; instead the screen’s stereo forces a different one.
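To put the “angular measure” point in numbers (a toy sketch assuming a 6.3 cm interpupillary distance; none of these figures come from the thread): the on-screen separation needed for a given simulated depth changes with screen distance, but the vergence angle the eyes must adopt does not.

```python
import math

IPD_mm = 63.0  # assumed interpupillary distance, millimetres

def screen_separation_mm(screen_m, object_m):
    """Left/right image separation on the screen for a virtual object straight
    ahead at object_m. Negative = crossed disparity (object in front of the screen)."""
    return IPD_mm * (1.0 - screen_m / object_m)

def vergence_deg(object_m):
    """Convergence angle the eyes must adopt to fuse that object."""
    return math.degrees(2.0 * math.atan((IPD_mm / 2000.0) / object_m))

object_m = 1.0  # simulated depth of the object (assumed)
for screen_m in (0.5, 2.0, 6.0):
    sep = screen_separation_mm(screen_m, object_m)
    print(f"screen at {screen_m} m: separation {sep:+.0f} mm, vergence {vergence_deg(object_m):.2f} deg")
# The on-screen separation varies wildly with screen distance, but the vergence
# the eyes must adopt (an angle) is the same in all three cases; only the
# accommodation distance has moved.
```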

To try to be constructive:
What we need are ultra-high-resolution displays so that we can cover them with tunable microlens arrays. An 8x8 pixel patch per lens should be sufficient to create a 3D lightfield with a decent viewing angle that recreates proper focus variation across the image. Now combine this with real HDR and a full visible-range color gamut, and you have a true window into a virtual world.
Tunable microlenses exist and can be made with LCD or MEMS tech. Microlens arrays have been used in computational photography for a few years now. It would be expensive to build this right now, but so was BrightSide’s prototype HDR display of a few years ago. Things like this get cheaper over time.
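To give a feel for what an 8x8 pixel patch per lens buys, here is a rough sketch with made-up panel and lenslet numbers (none of these figures come from the post): the pixels behind each lenslet become angular samples, at the cost of spatial resolution.

```python
import math

# Hypothetical panel and lens-array parameters, purely for illustration.
panel_px_per_mm = 20.0  # a very dense panel, roughly 500 ppi (assumed)
patch_px        = 8     # 8x8 pixels behind each lenslet
lens_focal_mm   = 3.0   # lenslet focal length (assumed)

lens_pitch_mm = patch_px / panel_px_per_mm  # one lenslet covers one patch
view_cone_deg = 2 * math.degrees(math.atan((lens_pitch_mm / 2) / lens_focal_mm))

print(f"lenslet pitch            : {lens_pitch_mm:.2f} mm")
print(f"lightfield 'pixels'      : {1 / lens_pitch_mm:.1f} per mm")
print(f"viewing cone for 8 views : {view_cone_deg:.1f} degrees")
# The trade-off: an 8x8 patch per lens divides the panel's native resolution
# by 8 in each direction, which is why ultra-high-resolution panels are the
# prerequisite mentioned above.
```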
The spinning-mirror and similar volumetric displays also reproduce matching focus, but they have the see-through problem; it’s like being able to render in 3D space only with additive blending and no opacity.

A high vergence implies an object close to the viewer.
Ignoring the occasional arrow flying towards the viewer’s head, most 3D is close to the screen (zero image displacement) or behind it, for the simple reason that anything closer will be clipped by the edges of the screen unless you have the field of view of an IMAX cinema.
For a big-screen TV this would give a 3D image in the range of roughly 3m to infinity.
Hence the eyes will expect to be focused close to infinity (which anything over a few metres away effectively is), the separation of the binocular images will be almost exactly what is expected for a real scene, and hardly any physical toe-in of the two eyes is required.
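Putting rough numbers on that (my own sketch, again assuming a 6.3 cm interpupillary distance), the convergence angles involved over that range are indeed tiny:

```python
import math

IPD = 0.063  # assumed interpupillary distance, metres

def vergence_deg(d):
    """Convergence angle for a point fixated straight ahead at distance d."""
    return math.degrees(2 * math.atan(IPD / (2 * d)))

for d in (0.6, 3.0, 6.0, 30.0, 1e9):
    name = "infinity" if d >= 1e6 else f"{d:g} m"
    print(f"{name:>8}: {vergence_deg(d):5.2f} deg of convergence")
# From 3 m out to infinity the demanded convergence spans only about 1.2 deg,
# versus about 6 deg for a desktop screen at 0.6 m, so for content kept at or
# behind a living-room screen both the toe-in and the mismatch with
# accommodation stay small.
```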

The effects you describe only occur for close-up viewing such as a computer screen, handheld game, or wearable dual-screen VR.

As a cheaper option, how about an RGB LED that passes through a focusing lens attached to a piezoelectric actuator (as used to move hard-disk heads), scanned directly into the eye by a tiltable rotating mirror in a head-mounted display?
Much better than a fixed screen, as you can look in any direction in a virtual world, and it lets you tilt your head, something a fixed screen will never be able to do.

It also provides the 3D cue that hasn’t been mentioned yet, that of being able to move your head to change the viewing angle and see around objects.
This can be done to a limited extent with fixed screens by using a head-tracking device to move the camera in a 3D world.
This is only possible with full 3D worlds, of course; the so-called 3D movies are not really 3D, they are only binocular.
A real 3D movie would digitise the whole set and let you view the action from any angle or position you choose (maybe in another 20 years?).

You mean something like this? http://en.wikipedia.org/wiki/Virtual_retinal_display

Yes, but with an eye-tracking device that detects what you are looking at in the scene, adjusts the focus of the optics so the eye has to focus at the same depth as that object’s binocular depth, and feeds back to the graphics system to make objects at that depth sharp while blurring objects closer or further away.
And a head-tracking device so you can look around the virtual world.
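A minimal sketch of the per-frame loop such a system would run; every name and interface here is hypothetical, it just spells out the control flow described above:

```python
from dataclasses import dataclass

@dataclass
class Gaze:
    x: float  # normalised gaze position on the image plane
    y: float  # (a real tracker would also report confidence, blinks, etc.)

def render_frame(tracker, depth_buffer, varifocal_optics, renderer):
    """One frame of the eye-tracked, focus-matched HMD loop described above.
    All four objects are hypothetical interfaces, used only to show the flow."""
    gaze = tracker.read()                          # 1. where is the user looking?
    depth_m = depth_buffer.sample(gaze.x, gaze.y)  # 2. scene depth at that point
    varifocal_optics.set_focus_metres(depth_m)     # 3. make the eye accommodate at
                                                   #    the same depth as the binocular cue
    renderer.draw(focus_distance_m=depth_m)        # 4. render with depth-of-field blur:
                                                   #    sharp at depth_m, blurred nearer
                                                   #    and farther, like a real scene
```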