#optics

garryknight@diasp.org

Could contact lenses be the ultimate computer screen? | BBC News

Imagine you have to make a speech, but instead of looking down at your notes, the words scroll in front of your eyes, whichever direction you look in.
That's just one of many features the makers of smart contact lenses promise will be available in the future.

#technology #tech #security #privacy #optics
https://www.bbc.co.uk/news/business-61318460

olddog@diasp.org

New camera no larger than a grain of salt | Popular Photography

https://www.popphoto.com/news/researchers-demo-camera-size-of-salt-grain/

Shrink-ray engaged: Researchers demo a camera no larger than a grain of salt

This tiny metasurface camera promises new possibilities for medical imaging and robotics, and an end to unsightly smartphone camera bumps.

By Mike Tomkins | Published Dec 11, 2021 8:00 AM

Image: The developers of this tiny camera hope the technology will some day make its way into your smartphone. Princeton University / University of Washington

With the rise of smartphones and mirrorless cameras, photography gear has gotten more compact over the past decade. But for some uses, like medical imaging and miniature robotics, current camera tech still proves far too bulky. Now, researchers have pushed the boundaries of what’s possible with an experimental camera that’s similar in size to a grain of salt and yet offers image quality that’s an order of magnitude ahead of prior efforts on a similar scale.

What is a metasurface camera?

Video: Princeton Computational Imaging Lab

Designed by a team of researchers from Princeton University and the University of Washington, the new system is detailed in a paper published last week in the peer-reviewed journal Nature Communications. It replaces the complex and bulky compound lens found in most cameras with a metasurface just 0.5mm wide, studded with 1.6 million cylindrical “nanoposts” that shape the light rays passing through it.
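
For a sense of what those nanoposts must accomplish, here is a toy Python sketch of the textbook phase profile an ideal focusing metasurface imposes, so that every ray arrives at the focus in phase. The wavelength and focal length are illustrative stand-ins of our own; the team's machine-learned nanopost layout is far more sophisticated than this.

```python
# Toy sketch: the textbook phase profile (radians) an ideal focusing
# metasurface must impose so that all rays arrive at the focus in phase.
# Wavelength and focal length below are illustrative, not from the paper.
import numpy as np

wavelength_um = 0.55                 # green light, in micrometers
focal_um = 1000.0                    # assumed 1 mm focal length
r_um = np.linspace(0, 250, 6)        # sample radii across a 0.5 mm aperture

phase = -2 * np.pi / wavelength_um * (np.sqrt(r_um**2 + focal_um**2) - focal_um)
print(np.round(np.mod(phase, 2 * np.pi), 2))  # phase each point must add, mod 2*pi
```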

The metasurface camera is said to provide image quality on par with a conventional camera and lens that is 500,000 times larger in volume. After comparing the sample images in the paper and the YouTube video above, we’d say that’s perhaps a little too generous, but we don’t want to take away from the team’s achievements in the least, as they’re certainly impressive.

Compared to other cameras

Image: Neural nano-optics remove the need for bulky conventional compound optics. Princeton University / University of Washington

The combination of a 2/3-inch sensor and an Edmund Optics 50mm f/2.0 lens used for the conventional camera comparisons still has noticeably better image quality, especially in the corners. But at the same time, the metasurface camera’s results are deeply impressive when bearing in mind its spectacular size advantage. And they are far ahead of what the previous state-of-the-art metasurface camera achieved just a few years ago (see below).

Compared to earlier metasurface cameras, the new version differs in the design of its individual nanoposts as well as in its subsequent image processing. The nanoposts’ structure was optimized using machine-learning algorithms that prioritized image quality and field of view. The image processing algorithms, meanwhile, adopted neural feature-based deconvolution techniques. Finally, the results of the new image processing were fed back to allow further improvements to the nanopost structure.
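
To make that feedback loop concrete, here is a minimal PyTorch-style sketch of end-to-end optimization, where gradients from the restored image flow back into the optical parameters. The convolutional forward model, learnable PSF, and small network below are stand-ins of our own invention, not the team's wave-optics simulator or deconvolution architecture.

```python
# Minimal end-to-end sketch (PyTorch): jointly optimize a stand-in optical
# element (a learnable point spread function) and a small deconvolution
# network against image quality, with gradients flowing back into the optics.
import torch
import torch.nn as nn
import torch.nn.functional as F

psf_params = torch.randn(1, 1, 15, 15, requires_grad=True)  # proxy for the nanopost layout
deconv_net = nn.Sequential(                                 # proxy for feature-based deconvolution
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
opt = torch.optim.Adam([psf_params, *deconv_net.parameters()], lr=1e-3)

def capture(scene, params):
    """Stand-in forward model: blur the scene with the current PSF, add noise."""
    psf = F.softmax(params.flatten(), dim=0).view_as(params)  # non-negative, sums to 1
    return F.conv2d(scene, psf, padding=7) + 0.01 * torch.randn_like(scene)

for step in range(200):
    scene = torch.rand(8, 1, 64, 64)             # random training scenes
    restored = deconv_net(capture(scene, psf_params))
    loss = F.mse_loss(restored, scene)           # image-quality objective
    opt.zero_grad()
    loss.backward()                              # updates both optics and network
    opt.step()
```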

These strategies have clearly worked well, yielding a huge step forward from the results possible with past efforts. While the compound optic still has a pretty obvious advantage in terms of fine detail, color, contrast, vignetting and corner sharpness, the gap between technologies is certainly shrinking.

Image: As you can see from this comparison, the latest generation of neural nano-optics is leaps and bounds more capable, from an image quality perspective, than the previous generation. Princeton University / University of Washington

What’s next?

Next up, the research team is planning to increase the metasurface camera’s computational abilities. This should allow not only another step forward in terms of image quality but also other capabilities such as object detection.

In the longer term, the study’s senior author, Felix Heide, suggests that the goal is to break into the smartphone market. Heide predicts that one day, you could see the multiple cameras in your smartphone replaced by a single metasurface that turns its entire rear panel into a camera. Is the era of awkward camera bumps soon to meet its end? We can only hope!

#Photography #Camera #Nanophotonics #Optics

hackaday@xn--y9azesw6bu.xn--y9a3aq

Math, Optics, and CNC Combine to Hide Secret Images in Acrylic

Magic mirrors, with an LCD panel hidden behind a partially reflective mirror, are popular for a reason -- they're a good-looking way to display useful information. A "Magic Window," however, is an entirely different thing -- and from the look of it, a far cooler one.

If you've never seen a Magic Window before, don't worry -- it's partially because you're not supposed to see it. A Magic Window appears to be a clear piece of glass or plastic, one with a bit of a wave in it that causes some distortion when looking through it. But as [Matt Ferraro] explains, the distortion encodes a hidden image, visible only when light passes through the window. It looks a bit like a lithophane, but it's projected rather than reflected, and it relies on an optical phenomenon known as caustics. If you've ever seen the bright and dark patches cast on the bottom of a swimming pool when sunlight hits the surface, you've seen caustics.

As for how to hide an image in a clear window, let's just say it takes some doing. And some math: Snell's law, Fermat's principle, Poisson's equation -- all these and more are mentioned by [Matt] by way of explanation. The short story is that an image is morphed in software, normalized, and converted into a heightmap that's used to generate a toolpath for a CNC router. The design is carved into a sheet of acrylic by the router and polished back to clarity with a succession of sandpaper grits. The wavy window is then ready to cast its hidden shadow.
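
To give a flavor of the Poisson step, here's a toy Python relaxation that solves for a height map whose curvature redistributes uniform light toward a target brightness pattern. It's our own illustration, loosely inspired by the write-up -- the target image, grid size, and iteration count are placeholders, not [Matt]'s actual pipeline.

```python
# Toy sketch: Jacobi relaxation for laplacian(h) = source, where source is the
# per-cell brightness surplus/deficit of the desired caustic image. The real
# pipeline involves more steps (morphing, normalization, toolpath generation).
import numpy as np

target = np.random.rand(64, 64)            # stand-in for the desired caustic image
source = target - target.mean()            # brightness surplus/deficit per cell

h = np.zeros_like(target)                  # height map to solve for
for _ in range(5000):                      # Jacobi iterations toward the solution
    h[1:-1, 1:-1] = 0.25 * (
        h[:-2, 1:-1] + h[2:, 1:-1] + h[1:-1, :-2] + h[1:-1, 2:]
        - source[1:-1, 1:-1]
    )

# h can now be scaled and exported as a heightmap for CNC toolpath generation.
np.save("heightmap.npy", h)
```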

Honestly, the results are amazing, and we marvel at the skills needed to pull this off. Or more correctly, that [Matt] was able to make the process simple enough for anyone to try.

<https://mattferraro.dev/images/caustics/3dcat.mp4>

#art #mischacks #acrylic #caustics #lithophane #morphic #optics

petapixel@xn--y9azesw6bu.xn--y9a3aq

Lenses Don’t Cause Perspective Distortion and ‘Lens Compression’

The focal length of lenses doesn't cause perspective distortion, which is often explained as "lens compression." Period. Now before you call me crazy and dismiss whatever I have to say, I’d like to invite you to take a dive into how lenses work and what actually causes perspective distortion.

Definition

There are two types of distortion in photography: optical and perspective. It is quite easy to confuse the two, as we commonly associate wide-angle lenses with distortion. In simple terms, distortion is an effect where the photograph appears to deviate from the scene as we perceive it with our own eyes.

Optical Distortion

Optical distortion has to do with the lens only. It is when a lens takes straight lines and makes them appear skewed. Lens elements are often spherical, producing the least optical distortion at the center of the frame and the most at the edges. This can be seen with a simple test against a radiator, which has straight vertical lines:

Image: the same radiator shot at 50mm and at 24mm

As we can see, at 50mm the lens produces straight lines; at 24mm, captured from the same distance, it is already starting to distort the photograph.

There are three types of optical distortion: pincushion, barrel, and mustache. Each has its own unique characteristics; however, discussing them is beyond the scope of this article.
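
For the curious, here's a small Python sketch of the standard radial model behind barrel and pincushion distortion: points are pushed toward or away from the image center as a polynomial in their radius, with k1 < 0 giving barrel and k1 > 0 pincushion. The coefficients are made up for illustration, not measured from any real lens.

```python
# Radial distortion sketch: remap normalized image coordinates (center at 0,0)
# by a polynomial in the squared radius. Illustrative coefficients only.
import numpy as np

def apply_radial_distortion(x, y, k1=-0.2, k2=0.05):
    """Map points through a simple radial (barrel/pincushion) model."""
    r2 = x**2 + y**2
    factor = 1 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# A straight vertical line at x = 0.8 bows under barrel distortion:
y = np.linspace(-1, 1, 5)
x = np.full_like(y, 0.8)
xd, yd = apply_radial_distortion(x, y)
print(np.round(xd, 3))  # the line's ends are pulled toward the center the most
```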

A Word on Lens Types

Before I prove that lenses don’t affect perspective distortion, I need to explain the two lens types that exist in photography: curvilinear and rectilinear. The difference is in how these optical designs render images.

A curvilinear lens will curve the photo very unnaturally, giving it a fisheye look. Canon’s 8-15mm f/4 fisheye lens is a perfect example of that.

A rectilinear lens, on the other hand, will render lines straight and will be close to human vision. This distinction matters most for wide-angle lenses, which have the most optical distortion. Canon’s 16-35mm f/2.8 lens is a great rectilinear lens. Here is how they look:
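
For a rough numerical feel of the difference, here's a toy Python comparison of the two projections. The rectilinear mapping is the standard f·tan(θ); the f·θ (equidistant) mapping is just one common fisheye design, and the focal length is illustrative.

```python
# Projection sketch: where a ray entering at angle theta lands on the sensor.
# Rectilinear keeps straight lines straight but blows up toward 90 degrees;
# an equidistant fisheye compresses the edges but stays finite.
import numpy as np

f_mm = 15.0
theta = np.deg2rad([0, 20, 40, 60, 80])

r_rectilinear = f_mm * np.tan(theta)   # grows without bound near 90 degrees
r_fisheye = f_mm * theta               # equidistant fisheye mapping

print(np.round(r_rectilinear, 1))  # [ 0.   5.5 12.6 26.  85.1]
print(np.round(r_fisheye, 1))      # [ 0.   5.2 10.5 15.7 20.9]
```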

Perspective Distortion

This is the juicy part of the article. Often optical and perspective distortions are confused. Let me clear up what perspective distortion is.

What a camera does is simply create an image of a 3D space on a 2D sensor. This, of course, has its drawbacks, because an object placed closer to the camera will appear much larger than objects in the background.

Here is what I mean:

Can you see how, when I move the 24-70mm closer to the camera, it starts to appear larger and ends up the same size as the 70-200mm?

Our brains are trained to see this and recognize that one thing is simply closer than the other. (It is suspected that babies haven’t yet developed this ability, and therefore see whatever is closest to them as the biggest.)
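
Here's a quick pinhole-model sketch in Python of that effect; the numbers are invented for illustration, but the math is the standard pinhole approximation.

```python
# Toy pinhole-camera arithmetic: the size an object projects onto the sensor
# is (focal length * object size) / distance. All numbers are illustrative.
def projected_size_mm(focal_mm, object_mm, distance_mm):
    """Image-plane size of an object under the ideal pinhole model."""
    return focal_mm * object_mm / distance_mm

# A 200 mm-long lens 0.5 m away projects exactly as large as a 400 mm-long
# lens 1 m away: halving the distance doubles the apparent size.
print(projected_size_mm(50, 200, 500))    # 20.0
print(projected_size_mm(50, 400, 1000))   # 20.0
```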

So, how would an image taken with a wide-angle lens differ from that of a mid-range and telephoto lens? To find out I took three photos. One at 16mm, the other at 35mm, and one more at 70mm. The image plane remained the same for all images, and the camera was not moved.

Here is how they look:

Afterward, I cropped each picture to show the same part of the image and overlaid them on top of each other.

Then I moved the camera to a position where the subject at the center of the frame would appear approximately the same size as in the 16mm image. In fact, it was rather difficult to do, as the lens couldn’t focus that close. Still, there is clear perspective distortion.

Closing Thoughts

As you can see, there are different types of distortion. When it comes to perspective distortion, only the image plane (camera) position affects it. Hence, you can take the exact same picture with a 16mm lens and a 200mm lens simply by cropping the wide-angle photo to match the telephoto one.

Different focal lengths will cause a photographer to choose different distances to their subjects, and this distance is what causes perspective distortion, not the focal length. Hence, it could more accurately be called "distance compression" rather than "lens compression."
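
As a sanity check of that claim, here's a quick pinhole-model calculation showing that the near-to-far size ratio from a fixed camera position is the same at any focal length; the distances are invented for illustration.

```python
# Pinhole-model sanity check: from a fixed position, the ratio of projected
# sizes of a near and a far subject is far/near, independent of focal length.
def size_ratio(near_mm, far_mm, focal_mm):
    """How much larger the near subject renders than an equally tall far one."""
    return (focal_mm / near_mm) / (focal_mm / far_mm)  # focal length cancels

print(size_ratio(2000, 10000, focal_mm=16))   # 5.0 with a wide angle
print(size_ratio(2000, 10000, focal_mm=200))  # 5.0 with a telephoto: same perspective
print(size_ratio(8000, 16000, focal_mm=16))   # 2.0 -- moving the camera is what changed it
```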

#educational #distortion #explainer #illyaovchar #learn #lenscompression #lenses #misconception #myth #optics #perspectivedistortion #photography