Apr. 29, 2020

Choosing the Right Sensor Type


  • Sensors can have different aspect ratios but share the same optical format. (Source: FLIR)
  • Red, green and blue pixels are rearranged into 2x2 “super-pixels.” Each super-pixel has one polarizing filter of each orientation. (Source: FLIR)
  • The near-infrared region of the spectrum detectable by the Sony IMX248 CMOS image sensor, highlighted in red. (Source: FLIR)

To ensure you get the right camera for your application, FLIR designs and manufactures machine vision cameras with a wide range of sensors. Understanding the differences in optical format, readout, and pixel structure of these sensors, and how they affect performance, can help you choose the camera that is best for you. For example, inspection of parts on a moving conveyor belt benefits from global shutter readout, while traffic systems for detecting mobile phone use by drivers will find on-sensor polarizing filters useful for seeing through the glare of car windshields.

Resolution, Pixel Size, and Optical Format

Resolution, pixel size and optical format are closely linked. The optical format of a sensor is a measurement of the physical size of the image sensor. It is measured diagonally across the sensor and represents the diameter of the image circle the lens must produce to completely illuminate the sensor. Sensors can have different aspect ratios but share the same optical format.

Increasing the resolution while maintaining the optical format results in a decrease in pixel size. Smaller pixels of the same pixel architecture will generally have a reduced quantum efficiency and saturation capacity. Reducing the pixel size while maintaining resolution results in a decrease in sensor size. Lenses for smaller sensors are generally more compact, lighter and less expensive than lenses designed for larger optical formats.
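The relationship above can be sketched numerically. This is a hypothetical illustration: the diagonal value is an assumption based on the convention that a 2/3" optical format corresponds to roughly an 11 mm image circle (optical-format names are historical, not literal inch measurements), and the resolution is an example 5 MP sensor.

```python
import math

# Assumed values for illustration only, not specs for any particular camera:
diagonal_mm = 11.0                  # approximate diagonal of a 2/3" format sensor
width_px, height_px = 2448, 2048    # example 5 MP resolution

# The pixel pitch follows from dividing the physical diagonal by the
# diagonal length in pixels.
diagonal_px = math.hypot(width_px, height_px)
pixel_pitch_um = diagonal_mm / diagonal_px * 1000.0

print(f"Pixel pitch: {pixel_pitch_um:.2f} um")
```

Doubling the resolution within the same optical format would shrink this pitch accordingly, which is exactly the QE and saturation-capacity trade-off described above.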

CMOS Compared to CCD

CMOS is the dominant technology for image sensors. Compared to the CCD sensors they have replaced, CMOS image sensors deliver superior imaging performance across a wide range of metrics including Quantum Efficiency, Absolute Sensitivity, Dynamic Range and Temporal Dark Noise. CMOS image sensors can read pixels much faster than CCDs, yielding large increases in speed for sensors of the same resolution.

Global Shutter Compared to Rolling Shutter

Traditionally, global shutter was preferred for imaging fast-moving objects, while rolling shutter was preferred for its lower cost and better low-light performance. With the success of Sony’s Pregius global shutter CMOS technology, the practical differences between the two have narrowed.

Global shutter sensors have read-out circuitry on each pixel.

This enables them to read every pixel across the sensor plane simultaneously. Rolling shutter sensors read each row out sequentially. Global shutter sensors are preferable for imaging moving objects: by reading out all pixels at the same time, they capture motion without distortion. When a rolling shutter sensor captures a moving object, the object continues to move as the line-by-line readout takes place, so it appears in a different position from one line to the next. Depending on the speed of the object being imaged, this can result in significant distortion.
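The magnitude of that rolling-shutter distortion can be estimated with a back-of-the-envelope calculation. All numbers below are assumptions for illustration (a 20 ms full-frame readout and an object moving at 5000 px/s in the image plane), not specs of any particular sensor.

```python
# Assumed illustrative values, not real sensor specifications:
rows = 2048
frame_readout_s = 20e-3             # time to read the whole frame, row by row
row_readout_s = frame_readout_s / rows
object_speed_px_per_s = 5000.0      # object's horizontal speed on the sensor

# Horizontal displacement accumulated between the first and last row read:
skew_px = object_speed_px_per_s * row_readout_s * (rows - 1)
print(f"Skew across frame: {skew_px:.0f} px")
```

A global shutter sensor would show zero skew here, since every row samples the scene at the same instant.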

Back-Side Illuminated (BSI) Sensors Compared to Front Illuminated Sensors

On most CMOS image sensors, the light-sensitive photodiode is located at the back of the pixel: it sits behind the readout circuitry, which is sandwiched between the photodiode and the microlenses used to direct light into the pixel. Back-side illuminated (BSI) sensors invert this typical pixel structure. By placing the photodiodes directly under the microlenses, photons can enter the photodiodes more easily, yielding a higher QE.

On-Sensor Polarizing Filters

On-sensor polarizing filters enable new applications by making it possible to detect not only the intensity of light hitting a given point on the image sensor, but also its polarization angle. Sony’s IMX250MZR and IMX250MYR sensors are based on their popular five-megapixel IMX250 Pregius global shutter CMOS sensor, with the addition of polarizing filters below the microlens of each pixel. These filters are oriented at 0°, 45°, 90° and 135°.
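In software, the four orientations are typically recovered by splitting the raw mosaic into four quarter-resolution channels. The sketch below assumes a 2x2 super-pixel layout of [[90°, 45°], [135°, 0°]]; the actual layout should be confirmed against your camera's documentation, as it can vary.

```python
import numpy as np

def split_polarization_channels(raw):
    """Split a raw polarization mosaic into its four oriented channels.

    Assumes the 2x2 super-pixel layout [[90, 45], [135, 0]] (an assumption
    for this sketch; verify against your sensor's datasheet).
    """
    return {
        90:  raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        0:   raw[1::2, 1::2],
    }

# Tiny synthetic 4x4 "raw frame" to show the slicing:
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
channels = split_polarization_channels(raw)
```

Each returned channel has half the width and height of the raw frame; interpolation schemes can recover fuller resolution, at the cost of more processing.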

Glare Elimination

Polarizing filters can eliminate unwanted glare on reflective and transparent parts. On-sensor polarization enables these systems to be installed quickly and adjusted dynamically. In addition to simplifying lighting requirements for industrial imaging systems, glare reduction is useful for managing the challenging lighting encountered in outdoor applications.

Degree of Linear Polarization

Degree of Linear Polarization (DoLP) is the proportion of light that is polarized at a given pixel. A perfectly polarized light source would have a DoLP of 100%, while unpolarized light would have a DoLP of 0%. DoLP can be useful for differentiating materials which would otherwise appear identical.
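DoLP is conventionally computed from the four intensity channels via the linear Stokes parameters. The function below is a minimal sketch of that standard formula, not FLIR's implementation.

```python
import numpy as np

def dolp(i0, i45, i90, i135):
    """Degree of Linear Polarization from the four oriented intensities."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total intensity
    s1 = i0 - i90                        # 0/90 degree component
    s2 = i45 - i135                      # 45/135 degree component
    # Guard against division by zero at dark pixels:
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)

# Fully polarized light at 0 degrees: I0 = 1, I90 = 0, I45 = I135 = 0.5
print(dolp(1.0, 0.5, 0.0, 0.5))   # -> 1.0, i.e. a DoLP of 100%
```

Passing four equal intensities (unpolarized light) returns 0, matching the 0% case described above.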

Angle of Linear Polarization

The Angle of Linear Polarization (AoLP) is the average polarization angle of the light at a given pixel. When used in conjunction with a polarized light source, AoLP can be used to greatly enhance the contrast of the fibers in composite materials.
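AoLP falls out of the same Stokes parameters used for DoLP. The sketch below shows the standard formula; as before, it is an illustration rather than a vendor implementation.

```python
import numpy as np

def aolp_deg(i0, i45, i90, i135):
    """Angle of Linear Polarization, in degrees, from the four intensities."""
    s1 = i0 - i90
    s2 = i45 - i135
    # arctan2 resolves the quadrant; the half-angle maps the result into
    # the 180-degree range of linear polarization angles.
    return 0.5 * np.degrees(np.arctan2(s2, s1))

# Light polarized at 45 degrees: I45 = 1, I135 = 0, I0 = I90 = 0.5
print(aolp_deg(0.5, 1.0, 0.5, 0.0))   # -> 45.0
```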

Combining Polarization and Color

The IMX250MYR sensor adds a color filter array to the sensor below the polarizing filters. This sensor uses a unique quad-Bayer pattern which prioritizes spatial resolution of the polarization domain over spatial resolution of color information.

Selectable Conversion Gain

Sony’s newest additions to their Pregius family of global shutter CMOS sensors come equipped with a unique new selectable conversion gain feature. This provides users with control over the gain applied during the analog to digital conversion.

By selecting between high and low conversion gain, the performance of the sensor can be optimized for high sensitivity or high saturation capacity. Enabling conversion gain is equivalent to adding an additional 7.23 dB of analog gain.
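Interpreting the 7.23 dB figure as a voltage (signal-amplitude) gain, the equivalent linear factor is straightforward to compute:

```python
# Convert the quoted 7.23 dB of analog gain to a linear amplitude factor.
linear_gain = 10 ** (7.23 / 20)
print(f"{linear_gain:.2f}x")
```

In other words, enabling high conversion gain amplifies the signal by roughly 2.3x before digitization, which is why it favors sensitivity over saturation capacity.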

Near-Infrared Imaging Performance

The silicon used by CMOS image sensors to detect incoming photons has a relatively low sensitivity to light of wavelengths greater than 900 nm. The average QE for Sony Pregius and Starvis sensors at 850 nm is 18%, while at 950 nm this falls to 7%.

For applications which benefit from sensitivity in the Near-Infrared (NIR) wavelengths, Pregius and Starvis sensors are generally recommended. While their QE at 950 nm may be lower than that of sensors optimized for higher QE at this wavelength, the far lower Temporal Dark Noise (read noise) of Pregius sensors easily compensates for this. The low read noise gives Pregius and Starvis sensors a much better NIR Absolute Sensitivity Threshold (AST). This allows higher gain to be applied, delivering a brighter, clearer image than sensors with higher NIR QE but lower NIR AST.
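The trade-off between QE and read noise can be illustrated with a simple shot-noise plus read-noise SNR model. All numbers below are assumptions chosen to mirror the argument above, not measured specs for any FLIR camera or Sony sensor.

```python
import math

def snr(photons, qe, read_noise_e):
    """Signal-to-noise ratio under a basic shot-noise + read-noise model."""
    signal_e = photons * qe                        # collected photoelectrons
    noise_e = math.sqrt(signal_e + read_noise_e**2)  # shot noise + read noise
    return signal_e / noise_e

photons = 200   # assumed photons per pixel during the exposure, at 950 nm

# Assumed illustrative sensors: one low-read-noise with modest NIR QE,
# one higher-QE with much noisier readout.
low_noise_snr = snr(photons, qe=0.07, read_noise_e=2.5)
high_qe_snr = snr(photons, qe=0.12, read_noise_e=13.0)

print(f"low-read-noise sensor SNR: {low_noise_snr:.2f}")
print(f"high-QE, noisy sensor SNR: {high_qe_snr:.2f}")
```

Under these assumed numbers the low-read-noise sensor wins despite its lower QE, which is the mechanism the paragraph above describes: at low light levels, read noise dominates the denominator.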

Get the Review

If you would like to compare camera sensor performance for quantum efficiency, dynamic range, temporal dark noise (read noise), and more, FLIR’s 2019 mono and color camera sensor review can be downloaded. (Link)


FLIR Systems, Inc.
27700 SW Parkway Ave
Wilsonville, OR 97070
Phone: +1 503 498 3547
