ED. Links have been updated from 02JUL12 to point to New Lytro Gallery (at link).
It’s the 21st Century’s Pinhole Camera
There is no focus, there is no aperture setting, and there are no moving parts save one button for power and one to shoot. Just as a pinhole camera had everything in focus, so too does Lytro’s Light Field Camera. It’s the beginning of everything new again optically; it’s that important. I’m sure of it.
Poor light. So misunderstood. It’s recorded and seen in X and Y on flat surfaces like our retinas and camera sensors, but what’s before us is actually a field of light with depth: the Z of XYZ. Here’s the end result as brought to you by Lytro – creator of the camera that takes 15 years of science and makes one device that captures and shares “living pictures.”
Note: Clicking the image below allows you to refocus the image! You’re traveling through Megarays.
Megapixels – easy concept: a million pixels in various ratios (standard vs. HD, for example). Megarays – a harder concept, since we’ve been goofing with light as a 2D medium where cranking up megapixels is considered a huge leap.
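The arithmetic behind the two terms, with made-up numbers (neither figure is a Lytro spec): a megapixel is a million pixels, a megaray is a million recorded rays, and the difference is how many directions each sensor site samples.

```python
# Mega = one million, for pixels and rays alike.
sensor_megapixels = 10                  # hypothetical conventional sensor
pixels = sensor_megapixels * 1_000_000

# A light field sensor counts RAYS: each pixel-sized site records light
# arriving from many directions. Assuming 100 directional samples per
# site (an illustrative number, not a real spec):
directions_per_site = 100
megarays = pixels * directions_per_site / 1_000_000
print(megarays)  # 1000.0 – the same sensor area now holds 1000 megarays
```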
Who can see farther through fog?
I’ve been giving this concept a lot of thought lately so that I can make 3D animation more accurate. Who can see farther through particles in the air, like fog or swarms or even atmospheric perspective (haze)? A bug, a cat, a human or Godzilla?
Yes. And in that order. The bigger the aperture letting light COME IN, the more you can see around obstructions to objects that would otherwise be occluded. If I put a pack of cigarettes directly in front of a credit card and you look with one eye, you cannot see the credit card. But if you open both eyes and line the pack up between them, you can effectively see AROUND the edges of the pack of cigarettes.
The concept scales: the distance between our two eyes is roughly the diameter of ONE of Godzilla’s pupils – he could see around the pack of cigarettes because the aperture of his pupil is much greater; more “megarays” can get in.
The other way around – a bug can barely see through fog or smoke because the aperture of its eyes is so small that each particle stands a better chance of blocking its vision. A tiny bug can be blinded by a few particles of dust, where a human wouldn’t even notice one hundred of them. That’s seeing around something, and it’s directly related to photography’s F-stop, which is part of what Megarays are about.
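The cigarette-pack experiment reduces to similar-triangle arithmetic. Here’s a sketch under idealized assumptions – a thin, centered occluder and an on-axis target – where the widths and distances are made-up example numbers:

```python
def min_aperture_to_see_past(occluder_mm, d_occluder_mm, d_target_mm):
    """Smallest aperture (or eye-to-eye baseline) diameter that lets
    some ray reach an on-axis point behind a centered occluder.
    Derived from similar triangles: a ray from the aperture's edge
    must clear the occluder's half-width on the way to the target."""
    assert d_target_mm > d_occluder_mm, "target must sit behind occluder"
    return occluder_mm * d_target_mm / (d_target_mm - d_occluder_mm)

# Cigarette pack (~55 mm wide) at 300 mm, credit card at 400 mm:
needed = min_aperture_to_see_past(55, 300, 400)
print(needed)  # 220.0 mm – far wider than our ~65 mm eye spacing,
               # but trivial for a Godzilla-sized pupil
```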
Lower f-stops (wider apertures) see around things.
That’s why objects appear so far out of focus at a wide aperture. The width of the aperture opening is far greater than the width of an edge (the edge of a cup or wall or flower petal). The result is that we can see both it and what’s behind it. That’s the trademark BLUR of a low F-stop: seeing around objects. To make this happen, we have to focus behind that edge.
Focal plane’s relation to Megarays
As you focus in front of and behind that edge, you’re really selecting an XY slice from the XYZ Light Field entering the camera. If you imagine the scene entering the camera as a little 3D cube, the focal plane is a “2D” plane passing from the front of it to the back of it. “2D” is in quotes because the Focal Plane is not infinitely thin in effect – Depth of Field is how much IN FRONT of and BEHIND the Focal Plane remains in focus.
A HIGHER F-stop number (smaller aperture) increases the Depth of Field so that more is in focus. This is how Pinhole Cameras manage to capture everything – their aperture is merely a pinhole. Like a bug’s eye, it can’t see around anything, SO everything appears to be in focus and lacks the BLUR of a wide aperture.
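The aperture-to-depth-of-field relationship can be sketched with the standard thin-lens approximation. This is an illustration only: the focal length, f-numbers, distances, and circle-of-confusion value below are arbitrary example figures, not Lytro specs.

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near/far limits of acceptable focus (standard thin-lens
    approximation). coc_mm is the circle of confusion; 0.03 mm is a
    common full-frame value. All distances in millimetres."""
    # Hyperfocal distance: focus here and everything beyond is sharp.
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (H - focal_mm) / (H + subject_mm - 2 * focal_mm)
    far = (subject_mm * (H - focal_mm) / (H - subject_mm)
           if subject_mm < H else float("inf"))
    return near, far

# A 50 mm lens focused at 2 m: wide f/2 aperture vs tiny f/16 one.
wide = depth_of_field(50, 2.0, 2000)
tiny = depth_of_field(50, 16.0, 2000)
print(wide, tiny)  # the f/16 in-focus range is several times deeper
```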
In this sense, we’ve become accustomed to light being a 2D medium because, until Lytro and the Light Field experiments of the ’90s, we’d only recorded it that way.
Click where you want to focus. Fucking brilliant. Thanks to Lytro’s fixed F-stop of 2.0, the aperture is always pretty huge, so the Lytro can see around the spokes of the bicycle. Whatever sits behind them reflects its own light into the camera. So rather than recording ONE Focal Plane for you to look at, the Lytro gathers EVERYTHING in the scene like a little 3D cube, or a shoebox diorama, so that you can pick which slice you’d like to see.
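That “pick which slice” step can be sketched as shift-and-add refocusing, the classic way a recorded light field is refocused in software. This is a toy sketch, not Lytro’s actual pipeline: it assumes the light field is stored as a small grid of sub-aperture views, uses wrap-around shifts for brevity, and every name here is mine.

```python
def refocus(views, alpha):
    """Synthetically refocus a light field by shift-and-add.
    views: dict mapping (u, v) aperture offsets -> 2D grayscale image
           (row-major nested lists, all the same size).
    alpha: the chosen focal depth; 0 keeps the as-shot focus."""
    first = next(iter(views.values()))
    h, w = len(first), len(first[0])
    out = [[0.0] * w for _ in range(h)]
    for (u, v), img in views.items():
        # Each view is shifted in proportion to its aperture offset...
        du, dv = round(alpha * u), round(alpha * v)
        for y in range(h):
            for x in range(w):
                sy, sx = (y + dv) % h, (x + du) % w  # wrap at the edges
                out[y][x] += img[sy][sx]
    # ...then all views are averaged: objects at the chosen depth line
    # up and stay sharp, everything else smears into blur.
    n = len(views)
    return [[p / n for p in row] for row in out]
```

With `alpha = 0` the views average back to the captured focus; sweeping `alpha` moves the synthetic focal plane forward and backward through the scene after the shot is taken.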
Best example: Diorama in a memory card
The diorama is a perfect example of all of this – the bigger the viewing hole, the more you can see around the objects inside. The diorama is a 3D scene, not a 2D recording of a scene. Objects go in and out of focus as you look around them, depending on which one you choose to center your attention on.
Currently, the Lytro is Mac-only. Images – “living pictures,” they say – can be shared with Lytro’s software, as they’ve done here with the embed links they provided. The 16GB Lytro camera holds 750 light field files and is pre-ordering for $499; the 8GB model holds 350 files for $399.
This is sexy – this is as exciting a development as Tone Mapping and High Dynamic Range – this is as jarring to the industry as the morphing in Terminator 2: Judgment Day was to cinema effects. The data stored in a Light Field File can also be used to create 3D.
3D Images from one lens. ONE!
That’s NEVER happened and I’m not the only one who thought that would never happen, COULD never happen. Keep your eyes WIDE OPEN on this technology.
More questions for the Lytro
Shutter speed? A picture of running water has some slight motion blur – how would that affect the Light Field? Signal-to-noise ratio? S:N has rock-solid meaning and importance in digital recording, both audio and visual: the more noise in the signal, the more static in quiet recordings or specks of color in dark pictures – how would signal noise affect the Light Field? How sensitive to light are the sensors used in Light Field photography? Can they get bigger soon? And of course, although there’s no FOCUS to worry about, there must be limits: how close is TOO close? i.e., is Macro available for Light Field photography?