
Re: [Phys-l] Camera to capture "light field"



On 3/1/2012 11:36 AM, Bob Sciamanda wrote:
By placing an array of microlenses in front of a camera sensor==>

https://www.lytro.com/science_inside

Scroll down to the bottom for a link to the Stanford Phd thesis behind
this.

Bob Sciamanda
Physics, Edinboro Univ of PA (Em)
treborsci@verizon.net
http://mysite.verizon.net/res12merh/
_______________________________________________
Forum for Physics Educators
Phys-l@carnot.physics.buffalo.edu
https://carnot.physics.buffalo.edu/mailman/listinfo/phys-l

The thesis seems to me to be remarkably lucid. Though I have had only a few minutes to scan it, I sense that a microlens array near the sensor plane captures light-ray directionality (which of course requires several sensor pixels behind each microlens), amounting to four dimensions of ray data in each color (two of position, two of direction), and this provides the wherewithal to compute an image for any desired focal plane after the fact. The cost is reduced resolution at the selected focus, but with sensor arrays passing well beyond 15 Mpixels, this seems not to have been a stumbling block.
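To make the refocusing idea concrete, here is a minimal numpy sketch of the shift-and-add approach: treat the 4D light field as a grid of sub-aperture images, shift each one in proportion to its angular offset, and average. This is only an illustration of the principle, not the algorithm from the Stanford thesis (which uses sub-pixel interpolation and Fourier-domain methods); the integer-pixel shifts and the toy data here are my own simplification.

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add refocusing of a 4D light field.

    light_field has shape (U, V, X, Y): sub-aperture images indexed by
    angular coordinates (u, v) and spatial coordinates (x, y).
    alpha selects the synthetic focal plane; alpha = 0 reproduces a
    simple average (focus at the original plane).
    """
    U, V, X, Y = light_field.shape
    out = np.zeros((X, Y))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its angular offset from
            # the lens center (crude integer shifts for illustration).
            du = int(round(alpha * (u - (U - 1) / 2)))
            dv = int(round(alpha * (v - (V - 1) / 2)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Toy example: a 3x3 grid of angular samples over a 32x32 image.
lf = np.random.rand(3, 3, 32, 32)
img = refocus(lf, alpha=1.0)
print(img.shape)  # (32, 32)
```

Note the trade-off the post describes: with a 3x3 angular grid, nine sensor pixels are spent per output pixel, so the refocused image has only one ninth of the raw sensor's pixel count.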

I would like to hope that this US development sees its reward in US-made products, in view of the sad demise of the major Kodak business. But I am aware that Asian engineers, often trained in fast-moving US businesses, are now exceptionally fleet of foot.

Brian Whatcott