Wide-eyed Wonder: an artist's musings on three-dimensional vision

Some are color blind. I am stereo blind.

Archive for the ‘motion parallax’ Category

Stereo Vision Survey

with 2 comments

Exciting news! Bruce Bridgeman, the gentleman who gained stereo vision after watching Hugo, has teamed up with Sue Barry of Fixing My Gaze to create a long-term, crowd-sourced research project in search of those who have experienced increased stereo vision after watching 3D movies.

Although my stereo experiences are limited and have not yet been scientifically verified, there seems to be room for even me to take this survey, as there is a comment section at the end of three different sections where I can plug in additional information. (In my case, how BRAO has affected my vision.)

I encourage all strabismic adults to at least read the survey, which is instructive in itself. If you have had a stereoscopic experience after watching a 3D movie, share your experience in the survey.

The survey also takes into account whether you have had any vision therapy or have had your stereo-awareness measured by a professional.

The VisionHelp Blog

If either you, a family member, or any patients you encounter have developed stereo vision as an adult – even intermittent or weak stereo vision – please complete this survey developed by Sue Barry and Bruce Bridgeman:

http://bit.ly/1vThYaM


The survey and its background were just published on page 13 of the new journal, Vision Development & Rehabilitation. Through crowdsourcing of this nature, Drs. Barry and Bridgeman may be able to provide evidence that viewing stereoscopic 3D movies and similar modalities can be therapeutic for certain individuals. We blogged about that possibility here last year, and this survey is an important step in that direction.

Completing the survey is entirely voluntary. You do not need to answer every question before submitting it. Your answers are sent to a spreadsheet which simply tabulates your answers with no other identifying information.  Thank you in advance!



Life of 3.1415926535 8979323846 2643383279 etc.

with 2 comments

… that would be Pi

I’d read so much about the artistic use of 3D technology in Ang Lee’s film Life of Pi, I decided it was worth a strabismic test. If I did not see palpable space, with things jumping off the screen towards me, at least I would see art: imagery that would move me and delight my eyes and heart.

I learned something even before seeing the movie: I’m probably the last person on the planet to arrive at a 3D movie only to discover the theater is playing a 2D version! I’d read so much about the artful use of 3D by Ang Lee that I completely ruled out anyone wanting to see the movie any other way. And so our first trip to the nearest theater in the next town was self-defeating. We went grocery shopping instead.

But, almost two weeks later, Pi resurfaced (with the necessary “3D” listed in the title) in the next town over. (I had given up on Pi and was looking for the Hobbit. But Middle Earth can wait.)

It was good timing for taking in a matinee today, as I’d had exactly one week to adjust to my new bifocal lenses with base-right prism (but that’s another story).

As the film began, the hummingbirds in flight brought an audible thrill to the folks on my left. Ah, but they were too quick for me! And then a short conversation began between my husband and me: “Did you see that?” says he. “No,” says I. After a few more similar exchanges I said, “If I see something, I’ll squeeze your hand.” I believe he got one hesitant squeeze as the monkeys rushed through the trees.

Then I forgot all about how I was seeing as I became immersed in the story.

It was a delightful story, with visible layers of foreground, middle ground and background all moving on their own planes, though, for me, only within the screen. Even more delightful were the many layers of meaning. Naturally it is easier to take the layers of meaning with me, and enjoy them in my mind long after the visual effects fade away. Ang Lee was masterful in using the concrete layers of the story to enhance the abstract and philosophical.

I had popped sinus medication and ginger pills before the trip, and the seas did get rough! But my stomach did not once drop out from under me. (Suppression has its advantages.) I also did not turn green around the gills when the seas turned calm with endless random bobbing. No ginger pills needed, unlike the many times I’ve been becalmed on Lake Erie in my father’s sailboat!

But the “float” stayed with me after the movie.

When everything is floating for two and a half hours, my guess is it does open one’s brain to recognize “float” in the real world.
Dr. Susan Barry describes her experience with “float,” saying, “Knowing that objects are separated by volumes of space and perceiving those empty volumes are very different experiences.”

The first thing to pop out toward me was not the whale in the movie, but my coat, hanging in front of me on a hook in the bathroom stall. It billowed towards me. (My first “sighting” since getting the new lenses.) The bathroom sink faucet took on the familiar forward projection, and doorways and all things moving in my periphery as I walked through the lobby swam in kinetic motion parallax. A trip to Lowe’s afterwards revealed noticeable depth in the layers of paint chip racks. Empty racks of all sorts reached out toward me, and a small stand of Ohio State banners swam towards me as I walked by, like a school of fish.

I’m floating still.

Links to layers of meaning
Life of Pi: A Novel by Yann Martel, reviewed by Phoebe Kate Foster http://www.popmatters.com/pm/review/life-of-pi/

‘Life of Pi’ Ending Explained by Ben Kendrick http://screenrant.com/life-of-pi-movie-ending-spoilers/

Links to layers of 3D
Life Of Pi’s Visual Effects Are Extraordinary. Here’s Why by Brendon Connelly
http://www.bleedingcool.com/2012/12/19/life-of-pis-cg-secrets-fx-supervisor-bill-westenhofer-on-tigers-magical-skies-and-more/

How did they bring the ‘unfilmable’ Life of Pi to our screens? by Nick Clark
http://www.independent.co.uk/arts-entertainment/films/features/how-did-they-bring-the-unfilmable-life-of-pi-to-our-screens-8393738.html

Ang Lee On The Filmmaking Journey Of “Life of Pi” By: Scott Pierce
http://www.fastcocreate.com/1682021/ang-lee-on-the-filmmaking-journey-of-life-of-pi

Cat’s Ear and Coffee Cup

leave a comment »

I have commenced to sketch, as best I can, the various scenarios my brain morphs together. Today’s initial sketch for “The Physiological Diplopia Series” is called “Cat’s Ear and Coffee Cup.”

One thing that has come of my leaving-flatland goal plus BRAO is Wonderland. Thanks to the three months of Vision Therapy I did have, plus lots of vision research and blogging, I am familiar with aspects of vision that I previously ignored: expanded peripheral vision, heightened motion parallax and physiological diplopia.

Of these three beautiful vision aspects, physiological diplopia is confirmation that my BRAO is not preventing both eyes from working together to look at the same thing at the same time in the same space. In my case, I experience it most at 3-13″ or so from my nose, the same distance at which I was able to create diplopia with the Brock String before my BRAO.

Here is a diagram of this morning’s scenario. Instead of a bead on a string, I was staring at the tip of my cat’s ear just through the handle of my coffee cup, which I was holding next to my face at about one o’clock from the tip of my nose.

 

[Diagram: the cat’s ear sighted through the coffee cup handle]

Next to the diagram is my sketch of what I saw from each eye, and of both eyes combined into a brain morph of right and left aspects:

 

The vertical hatching above the right-eye cup is my BRAO. Note that when I am using both eyes, I cannot experience physiological diplopia where I have no right-eye vision (this is also true for stereo vision). In this case, in my third sketch of both eyes looking, the top of the cup assumed the left-eye aspect.

The Wonderland experience was seeing my coffee securely held by an open shell spiral that my brain created when both eyes pointed at the tip of my cat’s ear! If I attempted to study the mirage too closely, it vaporized and the scene defaulted to the left-eye image. This is because my left eye has the superior central vision and therefore bears the “what” function of my vision.

I don’t see the brain morph most of the time. Normal people with stereo vision also do not see physiological diplopia unless they allow themselves to, by turning off their own brain suppression. I can’t vouch for how that happens; ask a Developmental Vision Therapist!

* “In an alternating esotropia the patient is able to alternate fixation between their right and left eye so that at one moment the right eye fixates and the left eye turns inward, and at the next the left eye fixates and the right turns inward. Where a patient tends to consistently fix with one eye and squint with the other, the eye that squints is likely to develop some amblyopia. Someone whose squint alternates is very unlikely to develop amblyopia because both eyes will receive equal visual stimulation.” [3]

 

Less than half full

with 4 comments

I lost the moon the other day. When I bent to see it out the passenger car window, the car roof blocked my left eye, but not my right. The moon hid itself in the blind half of my right eye. It was quite a surprise! Thankfully, I don’t normally see things disappear in this way.

It’s been over 3 months since my vision loss from the branch retinal artery occlusion (BRAO), and I am pretty resigned to not regaining my central vision. The blind area covers a bit more than the upper half of that eye’s field, making reading with the affected eye impossible, and eye-teaming (pointing both eyes at the same thing at the same time) next to impossible. My tests with the Brock string reveal a partial string in front of the bead that my right eye cannot see without a conscious effort to look above the bead (photo illustration here).

My decision to pursue more vision therapy to gain stereopsis is pretty much settled: if I could read with the right eye and see the Brock bead easily, I would go for it. But alas, I cannot. When it comes to seeing 3D, my glass is less than half full.

I do have one friend who has urged me not to fully resign myself to permanent loss until six months have passed. He also had BRAO and regained more vision in months 4-6. However, the retinologist said the ischemic tissue would resolve in about 3 months, so I’m mostly resigned at this point.

And so I have begun to grieve a bit. When watching the best documentary yet of Dr. Susan Barry’s (aka “Stereo Sue”) story (Imagine: The Man Who Forgot How to Read and Other Stories, Part 3 beginning at minute 11), I felt that I would never be able to see the front end of a Toyota (minute 14:14) with anything close to her jubilation over its roundness … I got misty-eyed when the film ended with Oliver Sacks’s sadness over his stereo loss (Sue’s gain is compared to Dr. Sacks’s loss in Part 4). That night the family cat breathed her last, and the next morning the torrents flowed over the dual losses. My husband was relieved to see me cry, finally.

My self-portrait is another step in the grief process. Today, I took advantage of the cracked side of the mirror and left the top of my head in a vignette to illustrate not so much what I see, but how my impairment can feel at times. I’ve felt the need to do this portrait before pursuing my art again. (I still can’t make a nose pop out like it should!)

The good news is that my brain has smoothly pieced together a complete visual field. I actually do see a near-complete picture without wrinkles or cracks, but sadly, this is because my right eye is almost fully suppressed. I am a master suppressor, having suppressed my left eye for no real reason all my life right up to March 26th. I was beginning to overcome this rogue suppression when the BRAO hit. Now, ironically, suppression is helpful.

I only see my blind veil when the left eye is occluded by the bridge of my nose, most often when I turn to look behind my right shoulder to back up the car. At those times, I rely on my recently learned ability to look above what I need to see. Nothing is clear, but movement and large objects can be detected out of the corner of my right eye. Needless to say, I avoid backing up the car as much as possible, and do so very slowly when I absolutely have no choice.

I frightened myself passing a box truck last week. I felt way too close to the truck when I quickly got back into my lane after realizing the oncoming pickup truck was much closer than I had first determined. I felt it was a close call, and I’m sure the other drivers thought I was out of my head!

When not encumbered by driving, my summer hours in the outdoors have been delightful. I attribute this to a ramped-up sense of motion parallax. This week, picking blueberries and pruning are challenging my brain and eyes to orient myself in space. These are visually demanding situations where “where” is more important than “what.” When I make a move, the branches of bushes and trees diverge and converge just like a 2D video game. What fun! I also routinely search for and destroy the random leaves of returning poison ivy with carefully aimed squirts of herbicide, first-person shooter style.

Inside the Northern Laurel Oak

I really sensed space inside my magnificent Laurel Oak, but alas, a photo doesn't capture volumes of air.

Occasionally, I thrill over my sense of what Susan Barry calls “palpable space” as well as the heightened textures of grass, weeds, and even asphalt. This is probably because I am seeing the world through my “other” eye and the viewpoint and perspective are new. While hanging laundry, I truly sense the space between the moving clothes-lines and pins. Sometimes, I am enchanted by the hollow spaces inside trees, and the “float” of the lily pads on my pond. I can see “under” the wire mesh deck table when I bob up and down in my deck chair in the evening cool. I see the space inside my coffee cup (this I consider to be true stereo). It is all a delight to my inner child.

So much of the world in my new, 5-acre homestead (photos here) is a rediscovery of childhood delights: stars at night and glorious moonshadow; weeds I haven’t seen since childhood blooming in delicate flower at the edges of the pond. We even have bats at sunset that swoop over the pond in amazing aerobatics as they scoop up their insect meals: another childhood memory from my Nana’s summer cottage in New Jersey. When a cold front comes through, the clouds dance over the house and fields …

I can still be amazed at everything I see. I still SEE, and so my half-empty stereo-vision cup overflows.

Fun with Orthoptics

with 2 comments

I started orthoptics, or vision therapy exercises, just after posting about the matrix in my head about a month ago, when I felt my brain was overriding the new, wider way of seeing I had been experiencing with the bi-nasal occlusion glasses.

I am happy to report the eye exercises are contributing to a wider peripheral field again. I am definitely not bumping into doorways, furniture and counter-tops on a daily basis, as I was before wearing the glasses. So the glasses and the exercises are re-programming my peripheral vision to a wider field that seems to stay with me throughout the day.

In fact, when I drew what I saw through my glasses back in November for my post on “The Frame Game”, I was straining to see beyond the lens area to sketch what was in the edges of the frame while looking straight ahead. No more. I am easily seeing my glasses frame and stem pieces as I type!


After touching the bridge of my nose during the "eye control" circuit, I pull Mr. Bird out to my fusional area

I credit this to daily eye control, smooth pursuit and peripheral vision building:

Eye control is simply following a finger or my feathered friend Mr. Bird around the bony perimeter of my eye sockets: brow, outside edge, center cheekbone and bridge of nose. I look pretty funny doing it, but this is the yoga stretch of eye exercises. My left eye now tracks as fluidly and smoothly as my right. Well worth 3-5 minutes a day!

Thumb Pursuits are an extension of the same. This time my unpatched eye tracks either my thumb or Mr. Bird with my arm fully extended. Mr. Rimke checks that my head is centered on my body and remains stationary. Then he asks me random questions while I maintain focus on Mr. Bird and keep the things in the room behind him flowing in motion parallax. I should work each eye for 3-5 minutes.

The Nielson Chart is a new favorite. When I first started this exercise, I couldn’t see any of the circle I was supposed to stay inside. Now I am seeing more and more of the circle from almost all of the plus-sign fixation points. I also noticed my circles are near perfect in dim light; this is because peripheral vision increases when the pupils are more dilated. I use both right and left hands 2-3 times, for 10-15 minutes.


The Nielson Chart

More smooth pursuits are needed to perform Straw in the Target. I patch one eye and hold a tube out in front of me at various heights and depths with the hand on my patched side, then use my other hand to smoothly direct a straw toward the tube from the far edge of my peripheral vision to a smooth insertion into the tube. Five minutes with each eye, using tubes of different diameters, constitutes a total workout.

I have also been given two motor control exercises which I am going to pursue more aggressively after another week or two of trigger point shoulder therapy to unlock and heal the muscles surrounding my rotator cuff. (Note: this is my idea, not my vision therapist’s.) Right now the Randolph Shuffle and Angels in the Snow are a bit too painful. Last week, I did master the shuffle sequence and can change it on demand.

I will have one more week of additional in-office tests and orthoptic exercises, and then we will evaluate at my first 8-week mark. I have less to show for my first 8 weeks because it took us four weeks to find any semblance of a fusional area. I have not begun any fusion exercises, and may not for a few more weeks. I do try to check that hard-won fusional area every now and then, and seem to see lo-ong, “E.T.” fingers in 3D, but little else at this point: not even Mr. Bird!

I must be patient! But, oh— to be a 4-month-old learning fusion and convergence naturally, during feeding time!

“The illuminated jet bib feeding system” can be found at http://www.thinkgeek.com/geek-kids/1-3-years/c682/ Check out those flashing lights! Maybe I could keep my nose centered if I wore one of these!

Non-stereo Notes from the Flatlander Hairdresser

with 3 comments

I cut my own hair, and I don’t see 3D. Describing my process will be a great introduction to non-binocular depth cues.

First off: I avoid using electric clippers like the little guy on the left! Scissors work best for longer hair, and are more tactile.

Half of hair cutting is tactile, involving the sense of touch. We stereo blind are very gifted at judging distance by touch. So the act of pulling small sections of hair perpendicular to my head to the same distance as the section before is almost intuitive. It involves a sense of timing, too. There’s a sort of steady rhythmic slowness as I run my hand along my scalp, feel a section and slide my fingers up and out and hold, maybe on a subconscious count of three, then snip using the tops of my fingers as a guide.

Secondly, I can judge distance by 2D sight because I have trained myself to do so as an artist and graphic designer. The act of drawing translates the 2D world I see to a 2D canvas or monitor. So I can see 3″ long hair horizontally, vertically and even at an angle with foreshortening, and still intuitively know it is 3″ no matter the angle. (This is mostly known as linear perspective.)

Hairdressers have other tricks, like cutting off the same amount of hair from the ends. “How much do you want me to take off?” is a common question. Once they have taken that amount off one section, they pull up a part of that section into the next, and use it as a guide.

I start my haircut by pulling my hair forward around my face and cutting my bangs and sides where I want the edges to end just under my eyebrows and along my cheekbones. Then I start cutting sections from front to back, using the cut hair as a guide for the amount I wish to take off the rest.

Just like any binocular do-it-yourself hair dresser, I use a mirror on the wall to cut the front and sides. The image of myself that I see is flat, but I still can measure how much to take off and how long the remaining hair is from “3D to 2D conversion” practice. I’m constantly checking by pulling up sections from both sides at the same time at points where the hair is in perfect profile. I’m also studying how the cut hair lays on my head, turning to see if any missed part is sticking out.

I have to move my head to pull each section to a place where I can see it clearly, in perfect profile, usually to the outside edges of the flat image I see of myself. It is also along these edges that I can judge if anything is sticking out.

Since reading about motion parallax, I am moving my head after judging a section length to look at it from more than one angle, as Wikipedia says pigeons do to judge depth [1]. The motion parallax is a new non-binocular cue I am using, and I make fewer mistakes. (I have very forgiving hair!)

Just like the binocular hairdresser, I need a third hand to do the back. I turn my back to the wall mirror and use a hand mirror to see. I no longer need to move my head to gain motion parallax perspective; I can move the hand mirror. I have to put the mirror down to use the scissors, which means I cut by feel, cutting along the tops of my fingers and trying not to take some skin!

I have no clue if this whole experience would be easier in 3D. To you, the reader, I may sound like a square describing how he sees a triangle in the book, Flatland. Or perhaps my way of “seeing” isn’t so radically different after all.

Written by Lynda Rimke

January 24, 2011 at 12:47 pm

3D Movies and Stereoblindness

leave a comment »

Hi, I’m just curious … I don’t see 3D. Would it be possible for me to get a glimpse of your 3D movie and find out what I can see?

I asked this only after purchasing tickets for the 2D version of Narnia: The Voyage of the Dawn Treader. Afterward, my friends and I shared a pair of glasses to briefly check out the 3D difference.

Even though I didn’t get the “Wow!” of an object jumping off the screen or moving behind the screen that my friends saw, I would still pay the extra bit to see the richness of detail the new polarized glasses create. Everything was clear, and not a blur (which happened often in the 2D version; thoughts on why later).

With polarization, which cuts out some glare, I saw deeper colors. [1] I studied the golden shell Edmund pulled from the pool of gold, and it appeared richer to me, even though it was not 3D. Instead, it was similar to what I experience when I see a painting by that rare artist that can translate 3D to a canvas, such as my friend and portrait artist Judith Carducci.

Unlike legacy 3D movies that required the red-cyan glasses, the new technology uses a different kind of fusion. Legacy film 3D required two projectors running film perfectly in sync, shot with cameras that had to be aligned according to a precise geometric formula during filming. [2] One camera’s image was presented through a cyan filter, and the other’s through a red filter. The red-cyan glasses worn by viewers would cancel out the conflicting image, and the binocular function in the brain’s visual cortex would fuse the two images from each eye to create 3D. [3] It was an imprecise method and never fully caught on, as every imperfection would create literal headaches.

Furthermore, the stereoblind, who could not fuse the two images and were effectively viewing with one eye, experienced a movie that was tinted either completely red or completely cyan!

The new technologies are digital. No film is used. The two cameras that do the “filming” to digital files are precisely aligned with a computer. [4] Digital post processing of CGI effects is also rendered with 3D formulas. (In the case of Dawn Treader, only the post processing was 3D. Two cameras were not used to shoot the live action. [5]) The left and right integrated “film” file is shown through only one projector, as the input from both angles is digitally fused into one movie. [4]

Both RealD and Dolby 3D projection systems rapidly alternate left and right images at the projector: Dolby 3D with a rotating color-filter wheel, and RealD with a high-speed electro-optical polarizing modulator. Left and right images alternate so quickly that the brain takes them in through the Dolby dichroic-coated or RealD polarized glasses as one continuous image. The coating or polarization in each lens of the glasses cancels out the conflicting alternate image. [6] [7]

What this means for the stereoblind is that only every other image is seen, through either the right lens or the left. There is no haloing or ghosting, because the opposite image is blocked by the lens over the dominant eye.

This is good news! No ghosting! No red or cyan viewing!

Eye problems, even with the 2D version

The newer 3D movies rely on extraordinary optical flow and motion parallax. The rapid CGI rotation and camera panning appeared blurred or jerky to me in the 2D version, annoyingly so, because all my depth cues were put on steroids. It was like watching a home video where the would-be videographer didn’t know how to pan his camera. If these scenes had lasted any length of time, I would have needed to avert my eyes to avoid nausea, a common problem I experience due to lack of stereoscopic ability.

All of the film’s CGI sequences appeared jerky, like watching a video with a slow internet connection. At the end of the film, during the credits, the background had a very annoying flicker. I wonder if this is because the producers did not bother to create alternate 2D CGI. The film underwent 8 months of 3D post processing. [5] Making alternate 2D CGI sequences may not have been in the budget, especially if the producers thought the 3D ghosting or haloing and alternating flicker would not be noticeable to most viewers.

Secondly, much of the film was shot with a shallow depth of field, especially during the layered action scenes on board the ship. Because I am stereoblind, the relative size relationships and perspective behind and in front of the action were out of focus for me. I felt somewhat lost and annoyed without these depth cues, as I was constantly, subconsciously attempting to bring the whole scene into focus.


In this image, Lucy’s face is in focus, but the painting and Edmund are not. This is due to the way the scene was shot, with a shallow depth of field.

All this points to a dim future for the stereoblind movie-goer, as 2D versions of 3D movies are not going to be as easy to watch as the simple 2D movies of old.

RealD polarized or Dolby coated glasses would cancel out any 3D flicker that may be embedded in a 2D version, but how to find out if 3D sequences are in the movie? Would the ticket person be able to hand out the glasses for 2D movies if asked? Probably not. Better to spend the extra 3 bucks and just watch the 3D version with the same flatness I see in real life.

At least I am improving my peripheral awareness to enjoy the pumped-up optic flow and motion parallax that these new films are dishing out.

Links to the technology of 3D movies for further exploration

Popular Mechanics “The Tech Behind 3D’s Big Revival” April 2009 http://popularmechanics.com

“JDSU Shares Science Behind 3D” http://www.youtube.com JDSU is the company that developed the coatings for Dolby 3D. The commentator offers some lame misinformation, as his competitor’s polarized glasses are being recycled at movie theaters and not thrown into landfills.

“To 3D Or Not To 3D: Buy The Right Chronicles Of Narnia Ticket” http://www.cinemablend.com offers a sub-par 3D analysis of the film, as the reviewer says at one point that it didn’t matter if she saw the film with the polarized glasses or not. I’m sorry, but in any 3D film, the glasses cut out the bothersome ghosting. This reviewer must have grown up with poor TV reception in West Virginia.

Tech info on RealD from Wikipedia

RealD 3D cinema technology uses circularly polarized light to produce stereoscopic image projection. Circular polarization technology has the advantage over linear polarization methods in that viewers are able to tilt their head and look about the theater naturally without a disturbing loss of 3D perception, whereas linear polarization projection requires viewers to keep their head orientation aligned within a narrow range of tilt for effective 3D perception; otherwise they may see double or darkened images.[2]

The high-resolution, digital cinema grade video projector alternately projects right-eye frames and left-eye frames 144 times per second.[2] The projector is either a Texas Instruments’ Digital Light Processing device or Sony’s reflective liquid crystal display. A push-pull electro-optical liquid crystal modulator called a ZScreen is placed immediately in front of the projector lens to alternately polarize each frame. It circularly polarizes the frames clockwise for the right eye and counterclockwise for the left eye. The audience wears spectacles with oppositely circularly polarized lenses to ensure each eye sees only its designated frame, even if the head is tilted. In RealD Cinema, each frame is projected three times to reduce flicker, a system called triple flash. The source video is usually produced at 24 frames per second per eye (total 48 frames/s), which may result in subtle ghosting and stuttering on horizontal camera movements. A silver screen is used to maintain the light polarization upon reflection and to reduce reflection loss to counter the inherent losses by the polarization filters. The result is a 3D picture that seems to extend behind and in front of the screen itself.[3]
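A quick bit of arithmetic (my own, not from the article above) helps make sense of that 144: each eye’s view is produced at 24 frames per second, so the two views together make 24 × 2 = 48 distinct frames per second, and triple flash shows each of those frames three times, giving 48 × 3 = 144 projected images per second. A stereoblind viewer locked onto one lens would catch 72 of those flashes per second, but still only 24 unique pictures.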

Tech info on Dolby 3D from Wikipedia

Dolby 3D uses a Dolby Digital Cinema projector that can show both 2D and 3D films. For 3D presentations, an alternate color wheel is placed in the projector. This color wheel contains one more set of red, green, and blue filters in addition to the red, green, and blue filters found on a typical color wheel. The additional set of three filters are able to produce the same color gamut as the original three filters but transmit light at different wavelengths. Glasses with complementary dichroic filters in the lenses are worn which filter out either one or the other set of three light wavelengths. In this way, one projector can display the left and right stereoscopic images simultaneously. This method of stereoscopic projection is called wavelength multiplex visualization. The dichroic filters in the Dolby 3D glasses are more expensive and fragile than the glasses technology used in circular polarization systems like RealD Cinema and are not considered disposable. However, an important benefit of Dolby 3D as compared to RealD is that no special silver screen is needed for it to work.