Standard image sensors, like the billion or so already installed in practically every smartphone in use today, capture light intensity and color. Relying on common, off-the-shelf sensor technology—known as CMOS—these cameras have grown smaller and more powerful by the year and now offer tens-of-megapixels resolution. But they still see in only two dimensions, capturing images that are flat, like a drawing—until now.
Researchers at Stanford University have created a new approach that allows standard image sensors to see light in three dimensions. That is, these common cameras could soon be used to measure the distance to objects.
The engineering possibilities are dramatic. Measuring distance between objects with light is currently possible only with specialized and expensive lidar—short for “light detection and ranging”—systems. If you’ve seen a self-driving car tooling around, you can spot it right off by the hunchback of technology mounted to the roof. Most of that gear is the car’s lidar crash-avoidance system, which uses lasers to determine distances between objects.
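For a rough sense of the arithmetic behind that kind of laser ranging (time-of-flight measurement, the principle lidar relies on), here is a minimal sketch; the round-trip times used are made-up illustrative values, not data from any real system.

```python
# Minimal sketch of time-of-flight ranging, the principle lidar relies on.
# The round-trip times below are illustrative values, not measured data.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target, given a laser pulse's out-and-back travel time.

    The pulse covers the distance twice (out and back), hence the divide by 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

if __name__ == "__main__":
    # A pulse returning after about 200 nanoseconds corresponds to roughly 30 meters.
    for t in (66.7e-9, 200e-9, 667e-9):
        print(f"round trip {t * 1e9:6.1f} ns  ->  {distance_from_round_trip(t):6.1f} m")
```

Because light travels about 30 centimeters per nanosecond, resolving distances at this scale demands extremely precise timing hardware, which is part of why dedicated lidar systems are so expensive compared with ordinary image sensors.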
…