4 Jan 2021
The ubiquity of camera phones has made photography part of our everyday lives. Pictures can be taken almost anytime, anywhere, because people carry their mobile phones with them everywhere. The quality of the images is also improving all the time, and it is possible to get an almost professional-level photo without any prior education or experience, just by passing an interesting object or scene.
The insides of a mobile phone may remain a total mystery to the average consumer, and yet a top-notch image is now expected from these little gadgets. When it comes to fitting optical systems into thin mobile devices, we are already pushing the limits. Due to size constraints, mobile manufacturers are forced to invest more and more in improving camera capabilities with software. The most impressive recent advancements in mobile photography already come from software rather than from more expensive and more advanced sensors and lenses. Today, a smartphone can capture better photos than some dedicated cameras.
The principle of capturing digital images was borrowed from the way the human eye works. An image sensor senses the intensity of incoming photons and converts it into an electric signal. The response of each pixel in the sensor is closely related to the number of photons (light) that hit the pixel. A raw image file contains almost unprocessed data from the camera's image sensor; this is the version of the picture that the user of a mobile phone never sees before it has been enhanced with different algorithms.
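The pixel response described above can be sketched numerically. Below is a minimal simulation (the names and numbers are illustrative, not from any particular sensor) of Poisson-distributed photon capture, showing that relative noise shrinks as more light reaches a pixel:

```python
import numpy as np

rng = np.random.default_rng(42)

# Photon arrival is a Poisson process: a pixel that receives `flux`
# photons on average records a Poisson-distributed count each exposure.
flux = np.array([10.0, 100.0, 1000.0])        # dim, medium, bright pixels
samples = rng.poisson(flux[:, None], size=(3, 100_000))

# Signal-to-noise ratio grows roughly as sqrt(flux): brighter pixels
# are relatively less noisy, which is why dark areas look grainy.
snr = samples.mean(axis=1) / samples.std(axis=1)
print(snr)  # roughly [3.2, 10.0, 31.6]
```

This square-root behavior of photon noise is also why the dark parts of an image are where noise shows up first, as the next section discusses.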
What does image quality mean? This might be one of the most difficult questions in image processing and imaging in general. Quality can be divided roughly into two categories: objective, such as the amount of noise or sharpness, and subjective, such as theme and composition. And even though we can measure the objective qualities, they can still be very much a subjective matter: even if one image is objectively measured to be better than another, some people may still prefer the latter. In the end, it all comes down to personal taste.
In general, there are a few common criteria which usually determine the image quality: noise, sharpness, color, contrast, dynamic range, composition, and subject.
Noise usually appears as random red, green, and blue dots, most prominent in the darker parts of an image. It is caused by noise in the electronic circuits and by the stochastic nature of photon capture. Increasing the amount of light that hits the sensor will hide these "errors", but that is not always possible.
Image 1: Noise level and averaging. Left: input, middle: average of 5 images, right: average of 100 images.
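The averaging shown in Image 1 can be demonstrated in a few lines. This sketch (a flat synthetic patch with additive Gaussian sensor noise, purely illustrative) shows the residual noise falling roughly as the square root of the number of averaged frames:

```python
import numpy as np

rng = np.random.default_rng(0)
clean = np.full((64, 64), 0.5)          # hypothetical noise-free scene

def average_of(n_frames, sigma=0.1):
    """Average n_frames simulated captures with additive Gaussian noise."""
    frames = clean + rng.normal(0.0, sigma, size=(n_frames, 64, 64))
    return frames.mean(axis=0)

for n in (1, 5, 100):
    residual = np.std(average_of(n) - clean)
    print(f"{n:3d} frames -> residual noise std {residual:.4f}")
```

Averaging 100 frames cuts the noise level to about a tenth of a single capture, which is the same effect visible from left to right in Image 1.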
Sharpness tells us, literally, how sharp the edges in the image are; in other words, how spread out an edge transition is in the image. A sharp image usually looks better than a blurry one. An out-of-focus image is harder to look at, like reading without reading glasses.
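One common way to quantify this "spread" of edges is the variance of the image Laplacian, a standard focus measure. The sketch below uses plain NumPy and a toy step edge; the function name and test data are ours, not from the article:

```python
import numpy as np

def sharpness(img):
    """Variance of the discrete Laplacian: higher values mean
    more concentrated (sharper) edges. img is a 2-D grayscale array."""
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

# A hard step edge versus the same edge smeared by a 5-pixel box blur.
edge = np.zeros((32, 32))
edge[:, 16:] = 1.0
blurred = sum(np.roll(edge, k, axis=1) for k in range(-2, 3)) / 5

print(sharpness(edge) > sharpness(blurred))  # the sharp edge scores higher
```

Metrics like this are what an autofocus routine maximizes when it hunts for the lens position where edges are least spread out.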
Contrast is the difference in intensity between different areas of a photo. High contrast makes objects more distinguishable from each other, while low-contrast images usually look dull and gray.
Image 2: Contrast. Left: input, right: histogram equalization
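The histogram equalization used for Image 2 can be written in a few lines of NumPy. This is a minimal global variant for 8-bit grayscale; real pipelines typically use local, contrast-limited versions such as CLAHE:

```python
import numpy as np

def equalize(img):
    """Global histogram equalization: map intensities through the
    normalized cumulative histogram so they spread over 0..255."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

# A dull, low-contrast image whose values sit in a narrow band.
rng = np.random.default_rng(1)
dull = rng.integers(100, 121, size=(64, 64)).astype(np.uint8)
out = equalize(dull)
print(f"range {dull.min()}-{dull.max()} -> {out.min()}-{out.max()}")
```

The narrow 100-120 band is stretched across nearly the full 0-255 range, which is exactly the dull-to-punchy change between the left and right halves of Image 2.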
Dynamic range: A camera sensor can only capture a limited range of light, and this range is its dynamic range. It is basically the difference between the lightest light and the darkest dark in the image. Especially in bright scenes with a lot of contrast, details can be lost: either the brightest areas turn into an overexposed white mass, or the darkest areas turn into a black mass without any detail.
Image 3: High Dynamic Range (HDR) with very high range of brightness
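One way software works around a sensor's limited dynamic range is to capture several exposures and fuse them, keeping each pixel from the exposure where it is best exposed. Below is a toy, single-channel sketch of that idea; the weighting scheme and numbers are illustrative, and production HDR pipelines are far more elaborate (alignment, ghost removal, tone mapping):

```python
import numpy as np

def fuse(short, long_, mid=0.5, sigma=0.2):
    """Blend two exposures with per-pixel weights that favor
    well-exposed (near mid-gray) pixels. Inputs are in [0, 1]."""
    def weight(img):
        return np.exp(-((img - mid) ** 2) / (2 * sigma ** 2))
    ws, wl = weight(short), weight(long_)
    return (ws * short + wl * long_) / (ws + wl + 1e-8)

# Simulated scene radiance: deep shadow next to a bright window.
scene = np.concatenate([np.full(8, 0.02), np.full(8, 5.0)])
short = np.clip(scene * 0.15, 0, 1)   # short exposure keeps the highlights
long_ = np.clip(scene * 12.0, 0, 1)   # long exposure keeps the shadows
fused = fuse(short, long_)            # detail survives at both ends
```

In the fused result the shadow pixels are lifted well above the short exposure while the window pixels stay below the clipped long exposure, so neither end of the range is lost.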
Composition refers to how the objects and subjects are positioned in the image. Composition is usually arranged so that the viewer's eye is automatically drawn to the most interesting or significant area of the image. If it is impossible to rearrange the elements, as in landscape photography, another option is to change the photographer's position.
Color affects image quality in multiple ways. Usually, an image looks good when its colors are realistic or close to it. This can be achieved, for example, by increasing or decreasing the saturation and by correcting the colors with a white balance adjustment.
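White balance correction can be illustrated with the classic gray-world assumption: on average the scene is neutral, so each channel is rescaled until the channel means agree. This is only a sketch of the simplest possible algorithm; real cameras use much more robust illuminant estimation:

```python
import numpy as np

def gray_world(img):
    """Gray-world white balance for an H x W x 3 RGB float image:
    scale each channel so all channel means equal the global mean."""
    means = img.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(img * gains, 0.0, 1.0)

# A neutral gray card shot under a warm (red-shifted) illuminant.
card = np.full((4, 4, 3), 0.5)
warm = card * np.array([1.2, 1.0, 0.7])   # color cast
balanced = gray_world(warm)               # cast removed: channels equalize
```

After correction the three channels are equal again, so the gray card actually looks gray; saturation adjustments are then applied on top of a neutral starting point like this.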
Subject is the focus and the theme of the image. If the object is what the photograph is "of", then the subject is what the photograph is "about". For example, in a picture of a king, the crown could be an object in the picture, while the king is the subject.
The story continues
In the next article, Image Enhancement Techniques, we will discuss different techniques for improving the quality of a blurry or unsharp picture. There are many different algorithms for image enhancement, and several can even be applied simultaneously; which techniques are suitable depends on the use case.
Visidon specializes in the development of fast and energy-efficient image and video processing technologies. Our software products include, for example, automatic image quality enhancement, computational imaging, and face recognition.
Our technologies utilize the latest microprocessor architectures and machine learning algorithms, especially for the mobile and embedded industries. Our solutions can be found in over 1 billion mobile phones.