Photochromic Filters

Here’s another idea I came up with while reading a book on photographic exposure:

We all know that camera sensors cannot handle high-contrast scenes. Our eyes, though, are very complex sensors that adjust automatically and simultaneously even when there are extreme differences in light intensity within a single “frame”.

Here is my proposal: We use photochromic filters between the lens and the sensor. 

Photochromic glass is very commonly used in prescription eyeglasses. A popular brand is called Transitions; I wear them. In direct sunlight, photochromic lenses turn dark, and in the shade they automatically return to their fully transparent state.

We can probably use this technology in photography. When the contrast is very high, bright areas of the frame are automatically darkened in proportion to the intensity of the light, while dark areas remain fully transparent.

This photochromic filter should be optional, because there are times when you do want a high-contrast shot, such as silhouettes. I would propose increasing levels of dynamic range control: maybe several layers of these thin filters that can be lowered into place, just like the mirror in an SLR camera. I don’t know yet how to implement this.
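
To make the idea more concrete, here is a rough sketch of the effect I have in mind, written as a small Python simulation. The per-pixel attenuation model, the `strength` parameter (standing in for the number of stacked layers), and the `threshold` are all just my assumptions for illustration; a real photochromic element would not behave exactly like this.

```python
import numpy as np

def photochromic_filter(image, strength=0.5, threshold=0.6):
    """Simulate a hypothetical photochromic filter as per-pixel attenuation.

    image:     linear-light values in [0, 1] (e.g. a demosaiced RAW frame)
    strength:  how strongly bright areas are darkened (stands in for the
               number of stacked filter layers; 0 = filter fully raised)
    threshold: brightness above which the filter starts to darken
    """
    # Darkening is assumed proportional to how far a pixel exceeds the threshold.
    excess = np.clip(image - threshold, 0.0, None)
    attenuation = 1.0 - strength * excess / (1.0 - threshold)
    return image * np.clip(attenuation, 0.0, 1.0)

# Example: a frame with deep shadow (0.05) and a nearly blown-out sky (0.95).
frame = np.array([[0.05, 0.30],
                  [0.70, 0.95]])
print(photochromic_filter(frame, strength=0.5))
# Shadows pass through untouched; the brightest areas are pulled down,
# compressing the scene contrast before it ever reaches the sensor.
```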

Yes, it’s a crazy idea. 


3 thoughts on “Photochromic Filters”

  1. 1. Putting stuff between the lens and the sensor changes the optical path and thus causes aberrations in the image.
    2. Photochromic lenses react to UV, not to artificial light.
    3. The effect is time dependent and slow.
    4. The effect is temperature dependent.

    Why not instead just use processing to flatten the contrast?
    Or, if the camera is not up to the dynamic range of the scene, then HDR might be an option, or buying a better camera. Or waiting for image sensor improvements (DR should go up a lot over the next 5 years due to stacked sensors).
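
    To illustrate what “flattening the contrast” in processing could look like, here is a minimal sketch using a simple global log tone curve. The function name and the curve choice are assumptions for illustration only, not any camera’s actual processing pipeline.

    ```python
    import numpy as np

    def flatten_contrast(image, amount=8.0):
        """Compress scene contrast with a simple global log tone curve.

        image:  linear-light values in [0, 1]
        amount: larger values lift the shadows more aggressively
        """
        return np.log1p(amount * image) / np.log1p(amount)

    frame = np.array([0.01, 0.10, 0.50, 1.00])
    print(flatten_contrast(frame))
    # Shadows are lifted far more than highlights, reducing overall contrast
    # without putting anything extra in the optical path.
    ```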

    1. 1. Yeah, that’s why Sony uses translucent mirrors. Duh?!
      2. Photochromic lenses now react to the visible spectrum. Duh?!
      3. When you are shooting landscape, it is much slower to compose a shot than to wait for the filter to react to the light. Unless, of course, you are the machine-gun type of shooter.
      4. HDR has its own uses but it is not good enough when you have very tiny gaps.

      Anyway, it is just my own idea. Not sure if it will work. I don’t like your uninformed negativity though.

  2. Sony’s Alpha lenses have been developed with this in mind. The older lenses do perform slightly worse than they would without the semi-transparent mirror.

    A good example of the effect of putting material in the optical path between the lens and the sensor is sensor toppings (i.e. the infrared filter, protective filter, and AA filter) – lenses not designed for them will perform worse than lenses that are. This is especially true for lenses with an exit pupil close to the image plane – for example, using Leica M lenses on Sony A7-series cameras causes excessive field curvature and smearing. I am sure you can do the mathematics yourself – it’s just geometry.

    Air has a refractive index of almost exactly 1; glass typically has about 1.5. Putting material with a refractive index different from 1 into the path of the light changes the path the light travels.
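
    To make the geometry concrete: a plane-parallel plate of thickness t and refractive index n placed in a converging beam shifts the paraxial focus back by roughly t(1 − 1/n). The sketch below just plugs in illustrative numbers – the 2 mm thickness and n = 1.5 are assumptions, not the spec of any real filter stack.

    ```python
    def focus_shift(thickness_mm, n=1.5):
        """Paraxial focus shift caused by a plane-parallel plate in a converging beam."""
        return thickness_mm * (1.0 - 1.0 / n)

    # Hypothetical 2 mm filter stack made of n = 1.5 glass:
    print(focus_shift(2.0))  # ~0.67 mm: the focal plane moves back by this much
    # For steeply angled rays (exit pupil close to the image plane) the effective
    # shift varies across the frame, which is where the field curvature and
    # smearing mentioned above come from.
    ```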
