Open access peer-reviewed chapter

Thoughts for Foods: Imaging Technology Opportunities for Monitoring and Measuring Food Quality

Written By

Ayman Eissa, Lajos Helyes, Elio Romano, Ahmed Albandary and Ayman Ibrahim

Submitted: 07 July 2021 Reviewed: 17 July 2021 Published: 22 November 2021

DOI: 10.5772/intechopen.99532

From the Edited Volume

A Glance at Food Processing Applications

Edited by Işıl Var and Sinan Uzunlu



Abstract

In recent decades, the quality and safety of fruits, vegetables, cereals, meat, milk, and their processed derivatives have become a serious issue for consumers in developed and developing countries alike. Traditional inspection methods that depend on the human factor, together with some mechanical and chemical methods, have proven unable to guarantee food quality and safety, and thus fail to achieve food security. With growing attention to human health, the standards of food safety and quality are continuously being improved through advanced technology applications that rely on artificial intelligence tools to monitor the quality and safety of food. One of the most important of these applications is imaging technology. This chapter briefly discusses the use of multiple imaging systems based on the different bands of the electromagnetic spectrum, the principal source of the various imaging systems, as well as methods of analyzing and reading images to build intelligent, non-destructive systems for monitoring and measuring food quality.


Keywords

  • food
  • quality
  • imaging technology
  • electromagnetic spectrum
  • image analysis

1. Introduction

The quality and safety of human food concern all the actors in the food system: consumers, producers and manufacturers of food, food control and food safety organizations, and the requirements of market niches. When agricultural and food products do not meet quality standards and safety criteria, consumers lose faith in producers, these products lose their competitiveness in the market, and significant economic losses follow. Some systems have been proposed to achieve food safety and quality through a set of conditions known as Good Manufacturing Practices (GMP) and Hazard Analysis and Critical Control Point (HACCP), which represent the best way to pursue food security across all production steps. Unfortunately, even with all the requirements of GMP, HACCP, and similar systems, they are not sufficient to ensure the production of safe food free of contaminants and defects, so it has become necessary to introduce modern technologies to inspect quality and detect blemishes and contamination. The focus has therefore been on the development of non-destructive, modern, fast, reliable, and applicable methods that meet the needs of both food manufacturers and producers as well as the desires of the consumer. In the present scientific climate, image processing has been among the most rapidly growing technologies. Imaging technology also forms a core research area not only in the engineering and computer science disciplines but also in the agricultural and food sectors. Image analysis is a study technique that aims to quantify the characteristics of each part of an image with respect to both their location and their content: intensities are analyzed as a function of position relative to a reference point. The image is observed in its most basic units, the pixels (PICture ELements).
Pixels can have a square or rectangular shape and can take one or more values depending on the type of acquisition, whether mono- or multispectral. Even images taken with common cameras carry more than one piece of information per pixel: each pixel holds three numerical values ranging from 0 to 255, where 0 means the absence of color and 255 the maximum intensity for that band of the electromagnetic spectrum. Image analysis is of interest in many areas of study, from medicine to criminology and from building to agriculture. This diffusion is due to the fact that it is a highly objective type of analysis: it starts from a defined source (the image) and performs a series of calculations on it with a rigorous statistical approach. In the agri-food sector, image analysis is very successful because the appearance of a food product carries a whole series of qualitative information that is difficult to parameterize by classical methods. Sensory approaches remain invaluable tools of judgment, especially if conducted on a representative number of subjects and carried out with an adequate data-collection plan and appropriate statistical treatment. As far as the visual component is concerned, human vision, as already indicated, is limited to wavelengths between 390 and 700 nm, with greater sensitivity around 550 nm. The human eye is also influenced by the brightness of the background (simultaneous contrast effect) and tends to overestimate or underestimate information at the boundary between objects of different intensities (Mach band effect) [1]. According to the widely accepted Retinex theory [2], there are three independent cone systems whose receptors respond in three different wavelength regions of the visible spectrum.

Image analysis techniques applied in the food field offer the following main advantages: objectivity, continuity over time, and rapid decision-making. An image is a representation of a two-dimensional or three-dimensional reality according to the independent spatial coordinates of an object. It is a one-plane transposition of the object’s descriptive information, placed in exact positions relative to a reference point. In this context, [3] defined the image as a two-dimensional function f(x, y), where x and y are the spatial coordinates and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point, as depicted in Figure 1. Consequently, if x, y, and the amplitude values of f are all finite, discrete quantities, the image is defined as a digital image. A digital image is thus composed of a finite number of elements called pixels, each of which has a particular location and value. Images are generated by the combination of an energy source (electromagnetic, but also ultrasound) and the reflection of the emitted energy by the object.

Figure 1.

Illustrates gray and color images: arrays of pixel intensity and color values.
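The definition of a digital image as a discrete function f(x, y) can be illustrated with a minimal sketch: a grayscale image is just a finite grid of discrete gray levels, and evaluating f at a coordinate pair is an indexed lookup. The intensity values below are made up for illustration.

```python
import numpy as np

# A tiny 4x4 grayscale "digital image": a finite grid of discrete
# intensity values (0 = black, 255 = white). Values are illustrative.
f = np.array([[  0,  50, 100, 150],
              [ 50, 100, 150, 200],
              [100, 150, 200, 255],
              [150, 200, 255, 255]], dtype=np.uint8)

x, y = 2, 1                 # spatial coordinates of one pixel
intensity = int(f[y, x])    # gray level of the image at (x, y)
```

Each pixel therefore has a particular location (its row/column indices) and a value (its gray level), exactly as described above.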

A digital image is an ordered set of numbers, each representing the intensity of reflection in the electromagnetic band to which the camera is sensitive. Most digital cameras are equipped with sensors that read individual pixels arranged in a pattern called the Bayer mosaic [4], shown in Figure 2A, consisting of blocks of four pixels: two green, one red, and one blue. Because each pixel is sensitive only to its own color, the raw result is an image with scattered red, green, and blue dots. To obtain gradual tones and a smooth photograph, the processor or editing software must subsequently debayer the image (Figure 2B). Because each pixel is filtered to record only one of the three colors, the data for a single pixel cannot fully specify the red, green, and blue values on its own. To obtain a color image, various demosaicing algorithms can be used to interpolate a complete set of red, green, and blue values for each pixel. These algorithms use the surrounding pixels of the corresponding colors to estimate the values for a particular pixel. Different algorithms, requiring various amounts of computing power, result in final images of variable quality. Demosaicing can be done inside the camera, producing a JPEG or TIFF image, or outside the camera using raw data directly from the sensor. Multispectral cameras are equipped with as many sensors as there are spectral bands from which reflected information is to be captured.

Figure 2.

Depicts the arrangement of individual pixels: A) Bayer mosaic, and B) the debayering process.
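The debayering step described above can be sketched as follows. This is a simplified bilinear-style interpolation for an RGGB layout, written for illustration only; real camera processors use more sophisticated demosaicing algorithms.

```python
import numpy as np

def box3(a):
    """Sum of each pixel's 3x3 neighborhood (zero-padded borders)."""
    h, w = a.shape
    p = np.pad(a, 1)
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_bilinear(bayer):
    """Rebuild RGB from an RGGB Bayer mosaic by averaging, for each
    channel, the known same-color values in each 3x3 neighborhood."""
    h, w = bayer.shape
    r = np.zeros((h, w), bool); r[0::2, 0::2] = True   # red sensor sites
    b = np.zeros((h, w), bool); b[1::2, 1::2] = True   # blue sensor sites
    g = ~(r | b)                                       # green checkerboard
    rgb = np.zeros((h, w, 3))
    for ch, mask in enumerate((r, g, b)):
        num = box3(np.where(mask, bayer, 0.0))  # sum of known values nearby
        den = box3(mask.astype(float))          # count of known values nearby
        rgb[..., ch] = num / np.maximum(den, 1.0)
    return rgb
```

On a uniformly lit sensor this reconstruction returns a flat gray image, which is the expected behavior: the scattered single-color readings are blended into full red, green, and blue values per pixel.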

Therefore, an image obtained from a multispectral camera will be a three-dimensional array consisting of as many matrices as there are observed bands. Each pixel in the array will hold the intensity of the amount reflected by the photographed object for that band of the spectrum. The same principle of operation underlies spectrophotometers, which generally consist of a light source, a lamp whose type changes depending on whether the analysis is carried out in the visible spectrum, the UV, or, in some specific instruments, the infrared. While normal photographic instruments typically limit themselves to capturing the reflectance intensity of a scene or object in a limited number of spectral bands, corresponding to those needed to produce an image interpretable by the human eye, hyperspectral cameras can capture, for each pixel, the entire spectral response in a wide, almost continuous range, depending on the type of camera itself. As a result, many measurements are available for extracting useful information. Hyperspectral images therefore collect a considerable amount of information from the same subject or surface, but at the same time require a considerable amount of interpretative effort [5]. Common cameras, which represent the most widely used and historically earliest image capture tools, are likewise optimized to capture photons of light from the visible spectrum, in the wavebands needed to build an image interpretable by the human eye, providing very limited spectral information [6]. The peaks in the reflectance spectrum, detected by specific equipment, correspond to low absorption of the incident light and define the so-called “spectral signature”, unique to each material [7]. This makes it possible to exploit these differences in reflectance to characterize different materials through spectral response detection.
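Treating a hyperspectral image as a three-dimensional array, the spectral signature of a region can be extracted by averaging the spectra of its pixels. The sketch below uses a randomly generated cube as a stand-in for real data, and the region coordinates are arbitrary.

```python
import numpy as np

# Hypothetical hyperspectral cube: 20x20 pixels x 50 bands,
# with values standing in for reflectance in [0, 1].
rng = np.random.default_rng(0)
cube = rng.random((20, 20, 50))

def spectral_signature(cube, y0, y1, x0, x1):
    """Mean reflectance spectrum over a rectangular region of interest."""
    roi = cube[y0:y1, x0:x1, :]
    return roi.reshape(-1, cube.shape[2]).mean(axis=0)

sig = spectral_signature(cube, 5, 10, 5, 10)
peak_band = int(np.argmax(sig))  # band index of the reflectance peak
```

The resulting vector has one value per band; its peaks correspond to the low-absorption wavelengths that define the material's spectral signature.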


2. Image analysis procedures

The steps required in the analytical procedure when studying a phenomenon through image analysis do not differ much from those of classical analytical procedures. Both methodologies involve measurements, data processing, and consequent reporting. What fundamentally changes is the preparation of the support: in classical analysis it consists of preparing the physical sample, while in image analysis it involves acquiring an image from the suitably treated physical sample [8]. The elaboration phase involves the repetition of the measurements, their statistical evaluation, and the appropriate representation of the results. Image-analysis techniques allow multiple visual attributes to be quantified simultaneously and suggest criteria for classifying certain quality performances. The images obtained from a spectrograph therefore equal in number the observed, available bands: for each observed sample, as many photos will be available as there are spectrograph reading frequencies. Samples are hardly ever flat, and if they are to be observed in their entirety in a non-invasive reading, they must be placed in the reading compartment without making any cuts. With potatoes or apples, for example, there is a risk of obtaining readings that also depend on the variable distance from the light source. For this reason, images should be observed and sampled in areas where the distance from the sensor is equal for all samples [9]. The first step is therefore to crop a photographic sub-sample, or better several sub-samples, in which to read the values recorded by the sensor.
Processing in the R environment can be carried out with the EBImage package (Copyright © 2003–2021, Bioconductor), importing images through the readImage function as arrays of values in which pixels are identifiable through coordinates relative to a known point of origin [10]. This makes it possible to define shapes, usually regular (squares or rectangles), and crop sub-plots through coordinate extraction. From each sub-plot, the mean value and the standard deviation of the mean can be read. In R, each sample will therefore constitute a vector valued through the means and standard deviations read at all the wavelengths used, so the sample vectors will have as many values as there are wavelengths. If multiple sub-plots are used, they can be repeated for the same sample and verified through ANOVA tests to assess the variance between samples and between sub-samples. A similar image-analysis path is possible even when few reflectance bands are observed. In the case of RGB images, there are three bands (Red, Green, and Blue). In these cases, an analysis based on gray-level co-occurrence matrices (GLCM) can be applied [11, 12, 13, 14]. The images are first converted to grayscale, and the GLCM algorithm observes the relationship between each pixel and its neighbors (Figure 3). In this way, parameters (homogeneity, contrast, dissimilarity, entropy) are measured that show whether the surface of the observed object is homogeneous or has irregularities.
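A minimal sketch of the GLCM approach, written here in Python for a single horizontal neighbor offset and an arbitrarily chosen quantization level, might look like this:

```python
import numpy as np

def glcm_features(gray, levels=8):
    """Gray-level co-occurrence matrix for the horizontal neighbor
    offset (0, 1), with contrast and homogeneity derived from it."""
    # Quantize intensities (assumed in [0, 255]) into `levels` bins.
    q = (gray.astype(float) * levels / 256).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()                      # joint probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)  # high for irregular surfaces
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return contrast, homogeneity

# A perfectly uniform surface: zero contrast, maximal homogeneity.
flat = np.full((10, 10), 128)
c_flat, h_flat = glcm_features(flat)
```

Full implementations (such as those used in the cited studies) also consider multiple offsets and directions and further parameters such as dissimilarity and entropy; the idea of counting how often pairs of gray levels co-occur is the same.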

Figure 3.

GLCM on a sub-plot of a potato surface.


3. Software available

In recent decades, many software programs, or additional packages for generic statistical software, have been developed, for example for Matlab, R, or Python [15]. Most algorithms refer to multi-way analysis, such as PARAFAC, PARAFAC2, N-PLS, Tucker3, and DTLD [16]. There are also some other open-source software implementations for multi-way analysis from other communities. For example, the tensor toolbox [17] is powerful for analyzing a wide range of tensors, including dense, sparse, and symmetric tensors [18], and a Matlab tensor decomposition package called tensor box contains various algorithms optimized for the decomposition of a tensor, such as the fast damped Gauss–Newton algorithm for CP decomposition [19]. Recently, some multi-way analysis software packages running in the R environment have also been developed, such as the ThreeWay [20] and multiway [21] packages developed by social science statisticians. Meanwhile, some Python multi-way analysis packages are also available, such as Tensorly [22] and TensorD [23]. Other packages were born for other applications, such as EBImage or Image. Contour detectors [24, 25, 26] refer to Otsu’s algorithm [27] in order to discriminate bodies, backgrounds, and particles, and are currently used for image analysis in many sectors, including the agro-industrial one. An example of applying contour and shape identification is shown in Figure 4 [13], which shows the reading of an image of a potato tuber.
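Otsu's algorithm itself is simple enough to sketch directly. The version below assumes an 8-bit grayscale image and exhaustively searches for the threshold that maximizes the between-class variance of the background/foreground split:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: return the threshold that maximizes between-class
    variance for an 8-bit grayscale image (integer values in 0..255)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # gray-level probabilities
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Demo on a synthetic bimodal image: dark background (50), bright body (200).
img = np.concatenate([np.full(100, 50), np.full(100, 200)]).reshape(10, 20)
t = otsu_threshold(img)  # any value separating the two populations
```

Pixels below the returned threshold are labeled background and the rest foreground, which is exactly the body/background discrimination that the contour detectors cited above build upon.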

Figure 4.

Contour extraction on a potato RGB image.


4. Sources of digital images

Vision is the most important of the senses in human perception, so images play the single most important role in enhancing this perception. Human vision is limited, however: the eye can identify and see objects only in the visible-light band of the electromagnetic spectrum. In light of the evolution of consumers’ desire for high-quality, safe food products and the inability of traditional quality-measurement methods to meet those needs, there have been strong motives for developing imaging machines that cover almost the entire electromagnetic (EM) spectrum, ranging from gamma rays to radio waves. These make it possible to determine and measure food quality in general by identifying appearance quality attributes, different phytochemical elements, and internal structure, and by detecting external and internal injuries and defects. The principal source for images is the electromagnetic (EM) energy spectrum. Electromagnetic radiation is an electric and magnetic disturbance that propagates through space at the speed of light (2.998 × 10⁸ m s⁻¹). The electromagnetic spectrum is the set of all possible frequencies or wavelengths of electromagnetic waves, from the shortest to the longest wavelength that can be generated physically; depending on their frequency or wavelength, electromagnetic waves interact differently with what they encounter in their propagation. This range of wavelengths can be broadly divided into regions which include gamma rays, X-rays, ultraviolet, visible light, infrared, microwaves, and radio waves, as shown in Figure 5 [28, 29]. Electromagnetic radiation has found multiple applications ranging from communication to manufacturing. The following is a simplified explanation of the different types of imaging according to each wavelength region of the electromagnetic spectrum.

Figure 5.

Shows the different bands of electromagnetic (EM) spectrum.

4.1 Gamma rays imaging

Gamma rays have the smallest wavelengths (<0.01 nm) and the most energy of any wave in the electromagnetic spectrum, so much so that they can pass through most types of materials, as shown in Figure 6. This high penetration makes gamma-ray imaging one of the most important techniques for imaging the internal properties of extremely thick objects. Gamma waves may be generated by nuclear explosions, lightning, the acceleration of charged particles by strong magnetic fields, and the less dramatic activity of radioactive decay, as mentioned in [30]. Gamma decay occurs when a nucleus drops from a higher energy state to a lower one; unlike alpha and beta decay, the chemical element does not change, and the emitted gamma radiation carries no charge. Imaging gamma-ray photons, as with any band of the electromagnetic spectrum, provides the ability to determine the origin of the photons in space. Predominantly, gamma-ray imaging has been used in medical applications to trace specific radioactive markers and obtain information on transport, distribution, and metabolism or, more specifically, to detect cancer or to study certain dynamic behaviors, such as drug addiction; recently it has also been applied in astrophysics. Gamma-ray images are captured by a gamma camera (scintillation camera), an instrument developed for medical diagnostics that acquires gamma radiation emitted by internal radioisotopes to create images, a process called scintigraphy. The gamma camera consists of a detector, a collimator, photomultiplier (PM) tubes, a preamplifier, an amplifier, a pulse height analyzer (PHA), an X–Y positioning circuit, and a display or recording device [31].

Figure 6.

Describes the gamma-ray position on the EM spectrum and its penetration property.

The detector, PM tubes, and amplifiers are housed in a unit called the detector head. The growth and development of the agricultural and food sectors require major advances in modern technology to monitor agricultural operations in general, in addition to food production processes. Researchers have significantly improved the performance of gamma-ray imaging cameras, which capture radiation invisible to the human eye; the technology has potential applications in scientific research, medical treatment, and environmental monitoring. In the agricultural sector as well, many experiments have used imaging techniques with gamma rays and have reached promising results for practical application. In this regard, [32] mentioned that many studies have been carried out in the last two decades using the gamma-ray computed tomography (CT) technique in several areas of knowledge other than medicine. They used gamma-ray computed tomography to characterize soil surface sealing and found that gamma-ray CT was able to confirm the occurrence of soil surface sealing due to sewage sludge application and to determine the average densities and thickness of the sealed layers. From these results, they concluded that gamma-ray CT allows a detailed analysis of soil bulk density profiles and the detection of very thin compacted or sealed layers. Gamma-ray computed tomography can also be applied to wood density analysis, to field studies of water infiltration, and to providing information on the chemical composition of materials [33, 34]. Additionally, [30] described the structure of positron emission tomography (PET) scanners, depicted in Figure 7, in which gamma rays are detected by a ring of gamma-ray detectors placed around the subject after special tracer molecules are ingested by or injected into the living tissue.
The main idea is to prepare the tracers as special compounds containing one or more radioactive atoms that spontaneously emit positrons, positively charged antimatter counterparts of the electron, which rapidly collide with electrons in neighboring atoms. Each collision results in the annihilation of both the positron and the electron and the creation of two gamma rays, each carrying the rest energy of an electron (511 keV). Furthermore, [35] mentioned that the positron-emitting tracer imaging system is a powerful technique for researching the distribution and translocation of water, photoassimilates, mineral nutrients, and environmental pollutants in plants. The system detects the two gamma rays produced by positron-emitting nuclides with a scintillation camera and therefore enables the movement of elements in intact plants to be studied in real time.

Figure 7.

Illustrative diagram of a positron emission tomography (PET) scan.

Accordingly, they described the PET imaging system as more compact and more flexible in the way the environment is controlled. Likewise, [36] explained that the positron-emitting tracer imaging system (PETIS) was developed to apply the theory of PET to plants. It is equipped with a planar-type imaging apparatus and radioisotope tracers such as ¹¹C, ¹³N, ¹⁵O, ⁵²Fe, ⁵²Mn, ⁶⁴Cu, and ¹⁰⁷Cd that are produced by a cyclotron, and it provides 2-D images. As well, [37] concluded that 2-D and 3-D gamma-ray imaging techniques have been successfully used in agriculture for the quantification and visualization of various compounds and for studies of mechanisms within plants such as water uptake and transport, metal uptake and transport, photoassimilate translocation, etc. Modern technologies have become a necessary mechanism for growth and development in the fields of agricultural production and food quality and safety, and researchers look forward to achieving the best results toward a sustainable development strategy. From this standpoint, gamma-ray imaging has been used successfully in several fields, including soil analysis: [38] mentioned that the current methods of soil sampling and laboratory analysis for soil sensing are time-consuming and expensive, and therefore used hyperspectral gamma-ray energy spectra to predict various surface and subsurface soil properties. It was concluded that the developed model provided a powerful prediction of clay, coarse sand, and Fe contents in the 0–15 cm soil layer and of pH and coarse sand contents in the 15–50 cm soil layer. The positron emission tomography imaging system (PETIS) has also been used to characterize and measure mineral uptake and translocation within plants, such as Mn in barley [39], and to describe the effects of the reduced form of glutathione (GSH) on behavior in the roots of oilseed rape plants [40].
Also, [41] applied PETIS to describe the absorption, transport, and accumulation of cadmium from the culture medium to the spikelet in an intact rice plant. Furthermore, [42] studied ⁶⁴Cu as a tracer of transport from root to leaves in the soybean plant and concluded that ⁶⁴Cu could be a useful tracer for plant studies such as the distribution and translocation of copper in intact plants using PETIS, as shown in Figure 8. Subsequently, [43] investigated the ability of PETIS to visualize and quantitatively analyze, in real time, Cd dynamics from roots to grains in rice cultivars that differed in grain Cd concentration. Moreover, the use of PETIS for tracking water uptake and translocation within plants has been studied: [35] used PETIS to follow, in real time, the effect of aminolevulinic acid (ALA) on H₂¹⁵O translocation from the roots to the shoots of rice plants. As well, [44] applied PETIS to study the effect of light on H₂¹⁵O flow in rice plants and found that when the plants were exposed to low light, the H₂¹⁵O flow was activated more slowly. By the same token, [45] visualized ¹⁵O-water flow in tomato and rice plants in light and darkness using PETIS. Although gamma-ray imaging techniques are mainly used for research and development purposes, they have extremely great potential to serve as tools for the development of several operations in the agriculture and food sectors.

Figure 8.

Depicts the positron emission tomography imaging system setup for soybean.

4.2 X-rays imaging

X-rays are a kind of invisible electromagnetic energy with short wavelengths ranging from 0.01 to 10 nanometers, high frequencies from 3 × 10¹⁹ to 3 × 10¹⁶ Hz, and thus high energies in the range of 120 electron volts (eV) to 120 kilo-electron volts (keV); they fall in the range of the electromagnetic (EM) spectrum between ultraviolet radiation and gamma rays. X-rays are short electromagnetic waves that behave like particles while interacting with matter as discrete bundles of energy called photons or quanta. Broadly, X-rays are classified into soft X-rays and hard X-rays: soft X-rays have relatively long wavelengths of about 10 nanometers, while hard X-rays have wavelengths of about 100 picometers [46], as shown in Figure 9. In general, [3] mentioned that X-rays are among the oldest sources of EM radiation used for imaging. The best-known use of X-rays is in medical diagnostics, but they are also used extensively in industry and other areas, such as astronomy. Besides medical diagnostic imaging and astronomy, other applications of X-rays include checking luggage at airports, inspecting industrial components, and security.

Figure 9.

Determines the location of X-radiation on the electromagnetic spectrum.

As a result of their powerful penetration, X-rays have become the basis of one of the most important modern applications for the inspection of agricultural products and food in general. X-ray imaging is among the less-used non-destructive methods for internal quality evaluation, but it is gaining popularity nowadays in various fields of agriculture and food quality evaluation. X-ray techniques, so far predominantly used in medical applications, have also been explored for the non-destructive internal quality inspection of several agricultural products whose quality attributes are invisible on the surface. Given considerations of product safety, consumer health, and meeting market needs, the non-destructive nature of these techniques has great potential for wide application to agricultural and food products. In short, an X-ray imaging system is based on the principle of transmission imaging: the X-ray beam penetrates the object and is attenuated according to the density variation of the object. The attenuated energy that passes through the object is then recorded by a photodetector, a film, or an ionization chamber on the other side; the different attenuation coefficients of the object’s components thus lead to different contrast between those components [46, 47, 48, 49, 50]. Accordingly, [51] reported that the soft X-ray method was rapid and took only 3–5 s to produce an X-ray image. Undoubtedly, X-ray inspection systems are becoming one of the best solutions to ensure product quality and safety and to prevent risks in the food sector. Based on the combination of multispectral and X-ray imaging technologies, [52] presented a new method for the automatic characterization of seed quality. This new method included the application of a normalized canonical discriminant analysis (nCDA) algorithm to obtain spatial and spectral patterns on different seed lots.
Reflectance data and X-ray classes based on linear discriminant analysis (LDA) were used to develop the classification models. They concluded that multispectral and X-ray imaging have a strong relationship with seed physiological performance. Reflectance at 940 nm and X-ray data showed high accuracy (>0.96) in predicting quality traits such as normal seedlings, abnormal seedlings, and dead seeds, as shown in Figure 10. These techniques can become alternative methods for the rapid, efficient, sustainable, and non-destructive characterization of seed quality, overcoming the intrinsic subjectivity of conventional seed quality analysis. In a study of the non-destructive inspection and detection of foreign materials in food products, [53] demonstrated a method for novelty detection of foreign objects (Figure 11) such as wood chips, insects, and soft plastics in food products using grating-based multimodal X-ray imaging. The technique combines three X-ray modalities with pixel correspondence (absorption, phase contrast, and dark field) to enhance organic materials such as wood chips, insects, and soft plastics that are not detectable by conventional X-ray absorption radiography.
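The transmission principle behind these systems can be sketched numerically with the Beer–Lambert attenuation law: the intensity behind an object falls off exponentially with the product of its linear attenuation coefficient and thickness, and differences in that product are what the detector sees as contrast. The coefficients below are illustrative placeholders, not measured values for any real material.

```python
import math

def transmitted(i0, mu, x):
    """Beer-Lambert law: intensity after passing through x cm of a
    material with linear attenuation coefficient mu (1/cm)."""
    return i0 * math.exp(-mu * x)

i0 = 1000.0                          # incident intensity (arbitrary units)
flesh = transmitted(i0, 0.2, 5.0)    # 5 cm of low-density fruit flesh
defect = transmitted(i0, 0.5, 5.0)   # same path through a denser inclusion
contrast = (flesh - defect) / flesh  # relative contrast at the detector
```

A denser inclusion attenuates the beam more strongly along the same path, so it appears darker in the transmitted image; this is the density-based contrast exploited by the inspection systems described above.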

Figure 10.

Illustrates raw RGB images, reflectance images captured at 940 nm (grayscale and transformed images using nCDA algorithm), and X-ray images of ventral and dorsal surfaces of Jatropha curcas seeds.

Figure 11.

Describes eight different foreign objects in three size groups.

Figure 12 shows examples of the X-ray images obtained for the food products with foreign objects from size group 2 in the absorption, phase-contrast, and dark-field modalities, from top to bottom, respectively. It is clearly visible that the contrast differs between the three imaging modalities. The authors concluded that the results give a clear indication of superior detection from the grating-based method, and show especially promising detection of organic materials. At the same time, [54] used the X-ray imaging technique in a study conducted to detect infestation of stored date fruits by saw-toothed beetles, whose main goal was to investigate the capability of X-ray imaging in detecting internal infestations caused by the saw-toothed beetle in stored date fruits.

Figure 12.

Shows the X-ray images obtained for seven food products with foreign materials from size group 2, and the white bar represents 1 cm.

X-ray images of the dates were acquired at 40 kV potential and 1.6 mAs with a resolution of 512 × 512 pixels using an X-ray machine, as shown in Figure 13. In the final analysis, the X-ray imaging system yielded around 97% accuracy in detecting internal infestation of dates with adult beetles when using a pairwise classification method. Similarly, [55] presented an approach for the visual detection of organic foreign objects such as paper and insects in food products using X-ray dark-field imaging. The results proved that the dark-field modality gave larger contrast-to-noise ratios than absorption radiography for organic foreign objects. Additionally, [56] developed an adaptive X-ray image segmentation algorithm based on local pixel intensities and an unsupervised thresholding algorithm for determining infestation sites in several types of fruit such as citrus, peach, and guava.

Figure 13.

Shows X-ray images of two infested dates.

The X-ray images were acquired with an X-ray imaging system consisting of a microfocus X-ray source and a line-scan sensor camera, both controlled by a desktop computer, plus a frame grabber board to acquire and transfer the signal from the line-scan sensor to the host computer, as shown in Figure 14. The developed algorithm proved fast in computation time and was implemented in the X-ray scanner for real-time quarantine inspection at a scanning rate of 1.2 m/min.

Figure 14.

Schematic drawing of the X-ray imaging system.

Thus, suspected sites of infestation inside the fruit can be accurately marked on the acquired X-ray image to aid the quarantine officer during inspection, as shown in Figure 15 for guava and peach. In conclusion, the detection accuracies of the infestation detection experiments for guava and peach fruits were clearly affected by the choice of sub-image size, decreasing as the sub-image size increases: reducing the sub-image size to 12 × 12 in the adaptive segmentation procedure increased the detection accuracy to 95% (guava) and 98% (peach). Furthermore, many other studies on the application of X-ray imaging techniques for quality inspection and evaluation have been reported in the field of food and agriculture.

Figure 15.

Illustrates the effect of morphological filtering: (a) X-ray image of a sample, (b) segmented spots after adaptive thresholding of (a), and (c) morphological filtering of (b) with three iterations.
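The adaptive-thresholding-plus-morphological-filtering pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the local-mean thresholding via a uniform filter, the 12 × 12 block size, the intensity offset, and the three opening iterations are assumptions chosen to mirror the description in the text.

```python
import numpy as np
from scipy import ndimage

def segment_infestation(xray, block=12, offset=10, iterations=3):
    """Illustrative sketch: adaptive thresholding against the local mean
    of each sub-image region, followed by morphological opening to
    remove small spurious spots. All parameters are assumptions."""
    # Local mean via a uniform filter approximates per-sub-image thresholding.
    local_mean = ndimage.uniform_filter(xray.astype(float), size=block)
    # Infested sites absorb more X-rays, so they appear darker than surroundings.
    mask = xray < (local_mean - offset)
    # Morphological opening (erosion then dilation) removes isolated noise.
    mask = ndimage.binary_opening(mask, iterations=iterations)
    return mask

# Synthetic example: bright background with one dark "infestation" spot.
img = np.full((64, 64), 200, dtype=np.uint8)
img[20:30, 20:30] = 100  # simulated dark infestation site
mask = segment_infestation(img)
```

Opening with several iterations trades sensitivity to very small infestation sites for robustness against speckle noise, which matches the three-iteration filtering shown in the figure.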

For example, [57] used X-ray images to create a method of measuring the mass of wheat grains by calculating the total grey value. Detecting internal defects in grains or seeds by X-ray imaging has also shown promising results: [58, 59] proved that the X-ray imaging technique can identify wheat grains infested by weevils. Apple bruises were also detectable using X-ray imaging, and the extracted image features can be used to sort defective apples [60]. Likewise, [61] concluded that digital X-ray images can detect internal disorders that lead to tissue breakdown, such as watercore in apples.

4.3 Ultraviolet (UV) imaging

Ultraviolet (UV) radiation has a shorter wavelength and higher energy than visible light and covers the wavelength range 100–400 nm. UV radiation is divided into three bands: UVA (320–400 nm), UVB (290–320 nm), and UVC (100–290 nm), the last being the most damaging type of UV radiation; however, UVC is completely filtered by the atmosphere and does not reach the earth's surface. Indeed, [30] mentioned that imaging cannot be used in the region below 290 nm, while UVB is scattered more than UVA and visible light. In reflected-UV imaging, UV illumination reflects off a scene and is recorded by a UV-sensitive camera, while in UV-fluorescence imaging, UV illumination stimulates fluorescence at a wavelength longer than that of the UV excitation source; the resulting fluorescence is typically in the visible band. Applications of the ultraviolet band are varied, including lithography, industrial inspection, microscopy, lasers, biological imaging, and astronomical observations [3]. In addition, [30] mentioned that UV light tends to be absorbed strongly by many organic materials, which makes it possible to visualize the surface topology of an object without the light penetrating its interior. In the UV imaging field, comparatively little research has used a UV camera as the main part of an image-processing-based computer vision system. For example, [62] studied the reflected-UV imaging technique and its potential for detecting defects in mangoes, aiming to develop a computer vision system that could find the reflective area on an injured or defective mango's surface. They concluded that the distinction between RGB color and reflected-UV imaging is very clear, as shown in Figure 16.
A band-pass filter at 400 nm was found most suitable for detecting the defective or ruptured tissues of mangoes. This may be due to the high photographic value of the UV-A band, since reflected-UV photography performs well above 360 nm, as mentioned by [63]. Accordingly, an algorithm for defect segmentation can be developed, and a computer vision system could combine a UV camera with a software algorithm to detect injuries. In this context, [64] developed and tested a prototype UV-based imaging system for real-time detection and separation of dried figs contaminated with aflatoxins, as shown in Figure 17. The prototype system was tested using 400 dried figs.

Figure 16.

Shows different images of shriveled mangoes.

Figure 17.

Schematic of UV-based imaging system for aflatoxin contamination detection.

In the final analysis, the prototype system achieved a 98% success rate in the detection and separation of dried figs contaminated with aflatoxins. Also, [65] built a simple computer vision system to detect anthracnose infection and latex stain using a low-cost webcam under UV-A illumination. The UV-fluorescence imaging technique has also been selected for detecting areas on dried figs that are contaminated with aflatoxin [66]. Similarly, freeze-damaged oranges were detected using the UV-fluorescence method by [67] at 365 nm. Furthermore, [68] found that a UV-based computer vision system was effective in identifying stem-end injuries in citrus fruits and used it for fruit sorting. Likewise, [69] introduced a method based on UV imaging and image processing under 365 nm UV light for separating pistachio nuts contaminated with aflatoxins. Accordingly, [70] indicated that various hidden defects inside fruits and vegetables cannot be recognized by conventional systems but can be identified by the reflected-UV imaging technique. In this regard, [71] identified an important band (365 nm) during UV band selection in the application of the UV-fluorescence imaging technique for inspecting aflatoxin contamination. More than 30 years ago, [72] used UV photographs to identify aflatoxin-producing molds, which appeared as gray or black colonies, whereas molds not producing aflatoxins appeared as white colonies.

4.4 Visible light imaging

The visible light spectrum is defined as the segment of the electromagnetic spectrum that the human eye can perceive. This visible band lies between the ultraviolet (UV) and infrared (IR) regions, with wavelengths ranging from 400 to 700 nm, as shown in Figure 18. Visible light is partly absorbed or diffused and partly reflected from the surfaces of objects, giving them the color we perceive. The visible region consists of red, orange, yellow, green, blue, and violet waves. Each color is defined by a specific wavelength range: violet-blue lies in the region from 400 to 475 nm, yellow-green around 550 nm, and red near 700 nm [28, 73]. Additionally, [74] mentioned that when light falls on an object, it is usually reflected, absorbed, or transmitted; the intensity of these phenomena depends on the nature of the material and on the specific wavelength region of the electromagnetic spectrum being used.

Figure 18.

Shows the characteristics of visible light on the electromagnetic spectrum.

The visual band of the electromagnetic spectrum is the most familiar in all our life activities, so it is not surprising that imaging systems based on visible light outweigh by far all the others in terms of scope of application [3]. More simply, the visible light emitted, transmitted, or reflected from an object carries information about that object, which helps quality inspectors obtain information concerning quality. Visible-light imaging systems thus play a significant role in seeing clearer, farther, and deeper and in gaining detailed information about different objects [75]. Imaging systems based on the visible light spectrum, also called color imaging systems, have become an extremely significant technique for non-destructively inspecting and assessing the quality of agricultural and food products, and color imaging is considered a promising technique currently applied for quality measurement of fresh and processed food. The operation of a visible light imaging system can be summarized as acquiring images under standard illumination conditions, processing the pixels, and analyzing the whole image to classify and quantify objects. Visible light machine vision systems can scan and sort millions of items per minute and provide fast, objective, robust measurement and detailed characterization of color uniformity at the pixel level [76, 77, 78]. The simplest machine vision system is mainly composed of a lighting system, a camera, and a computer equipped with an image acquisition board, as shown in Figure 19 [79, 80]. The accuracy, speed, and consistency of these technological developments in visible light imaging have greatly increased their applications in multiple fields in agriculture and food, such as pre- and post-harvest applications, the food industry, the baking industry, cereals, meat, fish, poultry, the fruit and vegetable industry, and liquids.
For instance, some agricultural research has focused on machine vision systems based on color imaging, developing algorithms to count agricultural elements, mainly vegetables and fruits, to determine full maturity, production, and harvest dates [81, 82]. In this regard, [83] presented a method for identifying and counting fruits from images acquired in cluttered greenhouses. The results showed a strong correlation, 94.6%, between the automatic and manual counting data. Likewise, [84] estimated mango crop yield using image analysis to count mangoes in orchard images, segmenting the image pixels into two groups, fruit and background, using color and texture information; the mangoes were then identified to count the number of fruits in the image. The automatic results achieved a strong correlation of 0.91.
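The segment-then-count approach described above can be sketched in a few lines. This is a simplified illustration of the general idea, not the cited studies' method: it uses only a color rule (red channel dominating green and blue) rather than combined color and texture features, and the `red_margin` threshold and synthetic scene are assumptions.

```python
import numpy as np
from scipy import ndimage

def count_fruit(rgb, red_margin=40):
    """Illustrative sketch: label pixels whose red channel dominates
    green and blue as 'fruit', then count connected components.
    `red_margin` is an assumed, illustrative threshold."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    fruit_mask = (r - g > red_margin) & (r - b > red_margin)
    # Each connected blob of fruit pixels is counted as one fruit.
    labels, n = ndimage.label(fruit_mask)
    return n, fruit_mask

# Synthetic scene: green foliage background with two red "fruits".
scene = np.zeros((50, 50, 3), dtype=np.uint8)
scene[..., 1] = 120                   # green background
scene[5:15, 5:15] = (200, 40, 30)     # fruit 1
scene[30:42, 28:40] = (210, 50, 35)   # fruit 2
n_fruit, mask = count_fruit(scene)
```

Real orchard images additionally need texture cues and handling of touching or occluded fruits, which is where the correlation figures reported above come from.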

Figure 19.

Illustrates the basic components of the machine vision system.

The number of green apples was determined using RGB color images under natural illumination [85]. Similarly, image analysis was used before harvesting to count the number of ripe and unripe fruits [86]. Also, [87] proposed a machine vision-based visible and NIR hyperspectral imaging method for automating yield estimation of Golden Delicious apples on trees at different growth stages. Subsequently, many successful attempts were recorded in using automatic vision systems based on image processing for quality control in post-harvest stages. In this context, [13] inspected potato tubers according to sensitive quality features such as color, size, mass, firmness, and the texture homogeneity of the potato surface (Figure 20) through a developed automated vision system, concluding that the vision system can be applied as a non-destructive, precise technique for in-line inspection. Additionally, [88] applied a computer vision system and machine learning algorithms to obtain a prediction model for cherry tomato volume and mass estimation, and the results achieved an accuracy of 0.97. Carrots were also graded using a machine vision system, and the results showed that the constructed image acquisition system succeeded in extracting the feature parameters of the carrot accurately [89]. As well, [6] sorted irregular potatoes using the RGB color imaging technique. Furthermore, ripeness determination of grape berries and seeds was performed using image analysis [90]. In a similar vein, the visual quality of agricultural grain is one of the most important issues in grain commercialization; it is assessed based on color, shape, and size, which generally affect the product's market price.

Figure 20.

Describes the extraction of distinctive texture features of potato.

The main problems associated with grain quality inspection are the high probability of error and the difficulty of standardizing the results. Therefore, many computer vision systems have been proposed to assist the visual quality inspection of several agricultural grains such as rice [91, 92, 93, 94]. Regarding bean grains, many studies show the need for and importance of computer vision systems based on image processing for bean inspection [95, 96, 97, 98, 99]. In this context, [100] presented a machine vision system (MVS) for visual quality inspection of beans, composed of hardware consisting of an image acquisition chamber, a conveyor belt controlled by a servo motor, and a feeding mechanism, together with software for segmentation, classification, and defect detection, as shown in Figure 21.

Figure 21.

Interface of the machine vision system software for bean grains.

The results of offline experiments for segmentation, classification, and defect detection achieved average success rates of 99.6%, 99.6%, and 90.0%, respectively, while the results obtained in online mode demonstrated the robustness and viability of this machine vision system, with average success rates of 98.5%, 97.8%, and 85.0%, respectively, for segmenting, classifying, and detecting defects in the grains contained in each analyzed image. In the field of meat quality inspection, imaging methods have recently been applied to visually assess meat and foodstuff quality on the processing line based on color, shape, size, and surface texture features [101, 102]. A machine vision system with a support vector machine (SVM) was utilized to grade beef fat color; the highest performance of the SVM classifier was 97.4% [103]. Moreover, [104] mentioned that RGB color imaging is a promising technique for predicting the color of meat, and the moisture content of cooked beef joints has been correlated with color using an RGB color imaging system [105]. A combination of machine vision with linear and nonlinear classifiers was employed for automatic sorting of chicken pieces such as breast, leg, fillet, wing, and drumstick; the total accuracy of online sorting (at a top speed of about 0.2 m/s) was 93% [106]. In the case of fish, visible-light imaging technology has successfully predicted the breed, species, quality, and gender of fish [102], and a machine vision method was used to evaluate the freshness of some fish: the best classification performance was achieved by support vector machine classifiers, with an 86.3% accuracy rate in assessing carp freshness [107].
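To make the SVM-grading idea concrete, here is a minimal Python sketch using scikit-learn. The data are entirely synthetic: the two "grades" of fat, their mean RGB values, and the spread are illustrative assumptions, not features or values from the cited studies, which extracted features from real carcass images.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic mean-color features (R, G, B) for two hypothetical fat grades.
rng = np.random.default_rng(1)
white = rng.normal(loc=[200, 195, 190], scale=8, size=(50, 3))   # "white" fat
yellow = rng.normal(loc=[210, 190, 120], scale=8, size=(50, 3))  # "yellow" fat
X = np.vstack([white, yellow])
y = np.array([0] * 50 + [1] * 50)

# Hold out a test set, train an RBF-kernel SVM, and score it.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In practice the feature vector would include texture and shape descriptors alongside color, and performance figures like the 97.4% above depend heavily on that feature engineering.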

4.5 Infrared (IR) imaging

Infrared (IR) radiation is a type of electromagnetic radiation, a continuum of frequencies produced when atoms absorb and then release energy; it is invisible to human eyes, but we can feel it as heat. IR radiation is emitted by any object with a temperature above absolute zero, and the most common sources of infrared radiation are the sun and fire. IR radiation lies in the electromagnetic spectrum at frequencies above those of microwaves and just below those of the red visible light band, hence the name "infrared", as shown in Figure 22. IR frequencies range from about 300 GHz up to about 400 THz. Waves of infrared radiation are longer than those of visible light, ranging from 0.75 to 1000 μm, and are divided into near-infrared (NIR, 0.78–3 μm), mid-infrared (MIR, 3–50 μm), and far-infrared (FIR, 50–1000 μm), as defined by the International Organization for Standardization (ISO 20473:2007, Optics and photonics - Spectral bands) [108, 109]. The infrared spectrum is invisible to the human eye but has a wide range of uses in modern technology.
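The frequency and wavelength limits quoted above are related by the usual vacuum relation wavelength = c / frequency, which can be checked in a couple of lines (using the rounded value c = 3.0 × 10⁸ m/s):

```python
# Check the IR band limits quoted above via wavelength = c / frequency.
C = 3.0e8  # speed of light in vacuum, m/s (rounded)

def wavelength_um(freq_hz):
    """Convert a frequency in Hz to a wavelength in micrometres."""
    return C / freq_hz * 1e6

low = wavelength_um(300e9)    # 300 GHz -> 1000 um (far-IR limit)
high = wavelength_um(400e12)  # 400 THz -> 0.75 um (near-IR limit)
```

The two results (1000 μm and 0.75 μm) reproduce the wavelength range of 0.75–1000 μm stated in the text.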

Figure 22.

An image of infrared wavelengths within the electromagnetic spectrum.

The different IR bands (NIR, MIR, and FIR) have many different applications. IR sources have enabled great technological advancements in imaging, thermal imaging, motion detection, gas analysis, monitoring, environmental health analysis, and more. IR imaging is widely used in the military, medical, scientific, and industrial fields, since it can create a visual image from a wavelength band otherwise invisible to the human eye [110, 111]. Recently, multispectral and hyperspectral imaging systems based on the IR spectrum have been used for developing and evaluating most agricultural and food processing operations, such as tracking and estimating the quality of agricultural and food products.

4.5.1 Near-infrared (NIR) imaging

NIR techniques are used for qualitative analysis of agricultural and food products such as grain, fruit, vegetables, meat, fish, chicken, beverages, and dairy products. Among the most important and widespread of these techniques are NIR imaging and spectroscopy, which offer rapid, non-destructive, and cost-effective methods. Developments in instrumentation and data analysis techniques for NIR imaging and spectroscopy have expanded the application range to chemical analysis, agricultural and food product analysis, and more, making NIR imaging one of the preferred quality monitoring methods in the food industry [112, 113]. Conventional methods of agricultural and food product monitoring are time-consuming, expensive, and require sample destruction, so the trend has been towards fast, accurate, and non-destructive methods; NIR spectroscopy was established as a non-destructive method for quality analysis of food materials, as mentioned by [114]. Ordinarily, when IR radiation interacts with matter, the energy can be absorbed and result in molecular vibrations, for example stretching, bending, rocking, wagging, and twisting. Hence, a change occurs in the electric dipole moment (a change in the positive-negative charge separation) of the molecule, and the molecule transitions to different vibrational levels, as shown in Figure 23. Transitions from the ground state to the second, third, or fourth excited vibrational state are known as overtones, and NIR spectroscopy measures these overtones [115]. NIR spectroscopy can therefore be used to study organic samples containing chemical bonds such as C-H, O-H, and N-H, because these functional groups absorb the energy from radiation in this region [116].

Figure 23.

Shows different vibrational levels for molecules and overtones transitions.

Therefore, [75, 117] mentioned that, instead of individual compounds, major functional groups are assigned to specific NIR regions: within a given wavelength range, a chemical bond will absorb energy at a specific frequency when that energy matches the energy required to induce a vibrational response (Figure 24).

Figure 24.

Shows major analytical bands and relative peak positions for major NIR absorptions.

Hyperspectral imaging based on the NIR band is the most widely used approach for quality determination of agricultural and food products. NIR spectroscopy assessments, however, do not contain spatial information, which is important for many food inspection applications; furthermore, the inability of NIR spectrometers to capture internal constituent gradients within food products may lead to discrepancies between predicted and measured composition. Conversely, conventional Vis/NIR imaging provides only spatial information and does not supply any spectral information, which may lead to deficiencies in monitoring and evaluating product quality [118, 119, 120]. To overcome this, multispectral and hyperspectral imaging systems have been developed to combine images that contain both spatial and spectral information, acquired at narrow wavebands sensitive to features of interest on the object.

4.5.2 Hyperspectral imaging (HSI)

Hyperspectral imaging (HSI), or spectroscopic imaging, is one of the most promising emerging technologies; it integrates conventional imaging and spectroscopy to acquire both spatial and spectral information from an object. Although HSI was originally developed for remote sensing, it has recently emerged as a powerful process analytical tool for automatic non-destructive analysis of agricultural and food products [6, 121, 122, 123, 124, 125]. The non-destructive and flexible nature of HSI makes it an attractive process analytical technology for identifying critical control parameters that impact finished product quality. As a result, [126, 127] expected that HSI will be increasingly adopted as a process analytical technology for quality monitoring of agricultural products and the food industry, as has already happened in the pharmaceutical industry. Equally significant, an HSI system records hundreds of neighboring wavebands for each spatial position (pixel) within the image; hence, the spectrum acts like a fingerprint that can be used to characterize the composition of that pixel. HSI images are three-dimensional blocks of data, comprising two spatial dimensions and one spectral dimension, and are therefore known as hypercubes, as clarified in Figure 25. Each hypercube consists of 50–300 images acquired at different wavelengths with a spectral resolution of 1–10 nm. Another significant factor is that hypercubes permit the visualization of the biochemical constituents of a sample, separated into distinct areas of the hyperspectral image [6, 122, 128, 129, 130, 131, 132].
In brief, the main idea of the HSI system is that when an electromagnetic beam is incident on the sample during analysis, the radiation is reflected, scattered, absorbed, or re-emitted in different patterns at specific wavelengths, owing to differences in the chemical composition and physical structure of the sample. As a consequence, each element has a spectral fingerprint declaring its chemical composition, and differences in the chemical concentration of the sample's constituents lead to different reflectance or absorbance values at certain key wavelengths [130, 131, 133].
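The hypercube structure described above (two spatial dimensions plus one spectral dimension) maps directly onto a 3-D array. The sketch below uses random synthetic data; the array sizes, the 900–1700 nm range, and the 1450 nm query band are illustrative assumptions, not values tied to a particular instrument.

```python
import numpy as np

# Synthetic hypercube: rows x cols spatial pixels, each with `bands` values.
rows, cols, bands = 100, 100, 120
wavelengths = np.linspace(900, 1700, bands)  # nm, illustrative NIR range
cube = np.random.rand(rows, cols, bands)

# The spectral "fingerprint" of a single pixel is a 1-D slice along bands.
pixel_spectrum = cube[40, 60, :]    # shape (bands,)

# A single-band grayscale image is a 2-D slice at one waveband,
# here the band nearest to 1450 nm.
band_idx = int(np.argmin(np.abs(wavelengths - 1450)))
band_image = cube[:, :, band_idx]   # shape (rows, cols)
```

These two slicing operations are the basic building blocks of HSI analysis: per-pixel spectra feed chemometric models, while single-band images support spatial segmentation.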

Figure 25.

Schematic of HSI hypercube, the spectral and spatial dimensions relationship.

Generally, the HSI system consists of several major components: a lens, a spectrograph, a camera, a translation stage, an illumination unit, and a computer system (Figure 26). When the sample is illuminated by a diffuse source such as a tungsten-halogen lamp or LED, it reflects light into the lens; the light is separated into its component wavelengths by diffraction optics contained in the spectrograph, and a two-dimensional image (one spatial and one spectral dimension) is formed on the camera and saved on the computer system [122, 134].

Figure 26.

Diagram of the hyperspectral imaging system components.

Consequently, these technological developments in NIR-based HSI, as an accurate, reliable, fast, and non-destructive measuring method for quality and safety analysis, have greatly increased its applications across a wide range of agricultural and food products. This section lists some applications that demonstrate the capability of HSI in the food field to perform classification, defect and disease detection, and assessment of chemical characteristics. [135] clarified that HSI systems can be used for discriminating different types of grains, including maize, wheat, barley, oat, soybean, and rice seed. For instance, [136] developed indices for screening Norway spruce (Picea abies) seeds by applying HSI at different wavelengths (1310, 1710, and 1985 nm); the results showed good classification, suggesting the possibility of building inexpensive devices. As well, [137] used HSI based on the NIR band to explore the influence of grain shape and texture on the spectral variation of three kinds of cereal (barley, wheat, and sorghum) using PCA and gradient classification, concluding that the classification results from gradient images and PC score plots were 91.18%, 89.43%, and 84.39%, respectively, and all were influenced by kernel topography. Equally significant is determining the viability of seeds by applying HSI in different spectral ranges (400–1000 and 1000–2500 nm): visualization of treated and non-treated corn seeds was achieved with HSI, and the results demonstrated that the 1000–2500 nm range performed better in exploring seed viability [138]. Also, [139] classified viable and non-viable kernels of different cultivars of barley, wheat, and sorghum using a NIR-HSI system; the results showed that NIR hyperspectral imaging is capable of identifying viable and non-viable kernels of different cultivars.
In a study of industrial baking of sponge cakes [140], the production process required various quality indicators to be measured continuously, such as moisture content and sponge hardness. With existing techniques, randomly selected sponges are removed from the production line and samples are manually cut from each sponge and tested destructively, as shown in Figure 27A. In contrast, the authors used a NIR-HSI system with a spectral range of 900–1700 nm as a non-destructive method to predict both moisture and hardness of the cake (Figure 27B). The moisture and hardness prediction models using PLS-R achieved values of 0.99 and 0.98, respectively. Accordingly, the authors concluded that HSI is a valid method for predicting the moisture content and hardness of sponge cakes; the study established a proof of concept for a new stand-off cake moisture and hardness monitoring system. Additionally, this HSI system offers the added advantage of recording every product in an HS image, enabling detection of variations in the production process. HSI systems have also been applied for ripeness monitoring of a large number of different fruit varieties [141, 142, 143, 144, 145, 146, 147], and for detecting defects or blemishes such as bruising in fruit [147, 148, 149, 150, 151, 152, 153, 154, 155]. Recent studies on the safety inspection of agricultural products and livestock use multispectral imaging and HSI technologies. HSI methods have been used to determine contamination by internal secretions on the surface of chickens, surface contamination in food processing, and fecal or foreign-matter contamination of apples and lettuce [118, 156, 157]. In studies of the potential application of HSI for defect identification, apple and cucumber are two of the most popular food products, studied for bruises and frost injury defects, respectively.

Figure 27.

Illustrates the traditional measuring technique (A) and the NIR-HSI method (B): (1) single band at 1450 nm, (2) binary image indicating the location of the cavities (black), (3) binary mask selecting the center of the cake (white) and air bubbles (black), and (4) cake image ready for spectral data extraction.

Moreover, [158] studied damage in three varieties of apples and noted that the NIR region (700–900 nm) was more efficient at detecting it. Subsequently, [159] implemented a NIR-HSI system operating from 900–1700 nm to examine its application in identifying bruises during various periods of storage after bruising. A spectral reflectance image analysis system has also been developed to assess defects on cut lettuce in the processing line; in particular, [160] developed algorithms to detect snails and worms. Another significant factor is that HSI systems have proved able not only to detect non-obvious bruises on fruits but also to assess internal quality parameters such as soluble solids content, firmness, pH value, antioxidants, etc. [161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171].

4.5.3 Mid and far infrared imaging

The mid- and far-infrared bands of EM radiation are an extremely useful part of the spectrum: they can provide imaging in the dark, trace heat signatures, and provide sensitive detection of many biomolecular and chemical signals. The mid-infrared (MIR) band in particular appears to contain valuable information about features needed to differentiate samples, for example samples with diseases or contamination, or for quality inspection. The recent development of light sources and imaging systems in the MIR allows the use of multi/hyperspectral MIR imaging in many new applications, as mentioned by [172, 173]. Signals across the IR range are known to be sensitive to leaf compounds such as water, lignin, and cellulose, which are essential to the functioning and structure of the leaf [173, 174, 175]. The thermal imaging technique is a non-destructive, contactless, and rapid method for capturing the IR radiation emitted from an object's surface, since the surfaces of hot objects emit electromagnetic waves in the IR region. Thermal imaging systems commonly capture radiation from 7.5 μm up to 14 μm [176, 177]; this range corresponds to a transmission window of the atmosphere characterized by minimal attenuation of radiation [178]. A thermal imaging system captures temperature and spatial information simultaneously, then delivers the MIR data to a computer unit for processing into matrices called thermograms. On this basis, there have been many successful attempts to apply thermal imaging systems as non-destructive, contactless methods to monitor the quality of many agricultural and food products. For example, [37] developed an infrared thermal imaging system to detect infestation by Cryptolestes ferrugineus under the seed coat on the germ of wheat kernels.
They found that the overall classification accuracy in pairwise discriminations for a quadratic function was 83.5% and 77.7% for infested and sound kernels, respectively, and for a linear function 77.6% and 83.0%, respectively. As well, [179] studied the feasibility of applying an IR thermal imaging system to classify fungal infections of stored wheat; the results proved that a thermal imaging system could be a useful tool for determining whether wheat grain is infected by fungi, with classification models giving a maximum accuracy of 100% for healthy samples and more than 97% and 96% for infected samples, respectively. Additionally, [180] developed a method for early detection of apple bruising based on pulsed-phase thermography; the results indicated the high potential of active thermography for detecting defects up to several millimeters deep. Also, [181] conducted a follow-up study using hyperspectral cameras equipped with sensors working in the visible and NIR (400–1000 nm) and short-wavelength infrared (1000–2500 nm) ranges, plus a thermal imaging camera in the MIR range (3500–5000 nm), to produce visualizations of bruises and provide information about bruise depth. The results confirmed that broad-spectrum (400–5000 nm) imaging of the fruit surface can improve the detection of early bruises of varying depths. Likewise, [157] adapted an infrared lock-in thermography technique for the detection of early bruises on pears, measuring the thermal emission signals from pears with a highly sensitive MIR thermal camera; the phase information of the thermal emission provided good metrics for quantifying both the size and the depth of damage. In the same context, [182] developed a pulsed thermographic imaging system and explored its feasibility for non-destructively detecting bruised blueberries.
The results demonstrated the feasibility of pulsed thermography for discriminating between bruised and healthy blueberries. Most recently, in the food processing sector, a study by [183] indicated the possibility of monitoring and evaluating oven systems through MIR imaging. The study aimed to demonstrate the applicability of thermal imaging with image processing for the real-time evaluation of oven systems. A thermal camera was adapted to two different oven systems: a standard electric deck oven and a novel gas-fired baking oven with integrated volumetric ceramic burners, as shown in Figure 28.

Figure 28.

Shows the MIR system for monitoring the baking process: A) MIR camera, B) electric deck oven, and C) a thermogram captured during baking.

MIR data combined with image processing were used to accomplish time-resolved, automated monitoring of the baking process for oven-system evaluation. The items being baked were captured by the thermal camera and detected, and feature extraction was performed to calculate relevant quality features such as texture homogeneity, temperature distribution, spatial dimensions (width and height), and the corresponding growth kinetics, as shown in Figure 29. The results of the proposed study proved its fundamental suitability for comparing, monitoring, and evaluating different oven systems. In the final analysis, the authors concluded that thermal imaging is an emerging and promising technique for the food industry and offers promising possibilities for inline process sensing and monitoring in the food sector.

Figure 29.

Image processing steps with major operations.
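The feature-extraction step just described can be sketched in a few lines. The snippet below is a minimal illustration, not the pipeline from [183]: it assumes a thermogram supplied as a NumPy array of temperatures in °C, a hypothetical segmentation threshold `t_min`, and a deliberately crude homogeneity proxy.

```python
import numpy as np

def thermogram_features(thermogram, t_min=60.0):
    """Extract simple quality features from a thermogram (2D temperature array, degC).

    t_min is a hypothetical threshold separating the hot baked item
    from the cooler oven background.
    """
    mask = thermogram > t_min                       # segment the baked item
    item = thermogram[mask]
    rows, cols = np.where(mask)
    return {
        "mean_temp": float(item.mean()),            # temperature distribution
        "std_temp": float(item.std()),
        "width": int(cols.max() - cols.min() + 1),  # spatial dimensions (pixels)
        "height": int(rows.max() - rows.min() + 1),
        # crude homogeneity proxy: high when temperatures vary little
        "homogeneity": float(1.0 / (1.0 + item.std())),
    }

# Synthetic 8x8 thermogram: a 4x4 item at 180 degC on a 30 degC background
frame = np.full((8, 8), 30.0)
frame[2:6, 2:6] = 180.0
feats = thermogram_features(frame)
```

In a real system the threshold and homogeneity measure would be replaced by proper segmentation and texture analysis (e.g., GLCM-based homogeneity) as in the study's processing chain.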

4.6 Microwave imaging (MWI)

Microwave radiation lies in the electromagnetic spectrum between IR and radio waves. Microwaves refer to alternating-current signals in the frequency range from 300 MHz to 300 GHz; since wavelength equals (3 × 10⁸ m/s)/frequency, this corresponds to wavelengths from 1 mm to 1 m. These dimensions allow microwaves to penetrate deep inside many optically opaque media such as biological tissues, concrete, soil, and wood. In this regard, [3] indicated that radar is the dominant application of microwave imaging techniques, because imaging radar in the microwave band can collect data over any region at any time, regardless of weather or ambient lighting conditions. Some radar waves can penetrate clouds and, under suitable conditions, can also see through vegetation, ice, and extremely dry sand. Imaging radar works like a flash camera that provides microwave pulses to illuminate the target area and takes a snapshot image, but it uses an antenna instead of a camera lens, attached to digital computer processing to record its images. In a radar image, one sees only the microwave energy that was reflected back toward the radar antenna. There are many similarities between optical imaging using a digital camera and microwave imaging using an antenna array, as highlighted in Figure 30. In the type of imaging known as microwave holography, one or more antennas in the array illuminate the scene with a radio-frequency (RF) signal. Part of this signal is reflected back to the other antennas, which record both the amplitude and phase of the reflected signal. These reflected RF signals are then processed to form an image of the scene [184, 185, 186, 187]. Microwave imaging techniques have shown excellent capabilities in various fields such as civil engineering, biomedical diagnostics, security, and industrial applications, and have in recent decades experienced strong growth as a research topic in the agricultural and food fields.

Figure 30.

Highlights similarities between visible light imaging and microwave imaging.
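The wavelength-frequency relation used above (wavelength = c/frequency) can be checked directly; the snippet below is only a trivial illustration of the quoted band edges.

```python
C = 3.0e8  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    """Wavelength in metres: lambda = c / f."""
    return C / freq_hz

# Band edges quoted in the text
low_edge = wavelength_m(300e6)   # 300 MHz -> 1 m
high_edge = wavelength_m(300e9)  # 300 GHz -> 1 mm
```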

Microwave imaging technology refers to the rapid initial screening of hidden objects in an object's internal structure by means of electromagnetic fields at microwave frequencies (300 MHz–30 GHz). Microwave images are maps of the electrical property distributions in dielectric samples [188, 189]. Microwave imaging for agricultural and food applications is therefore of great current interest, having the potential to provide information about the internal quality of agricultural and food products. There are three main reasons for the growing interest and rapid development of microwave-based methodologies: first, the microwave band can penetrate all materials (except ideal conductors), and the related scattered fields are representative of the overall volume of the object under test and not only of its surface; second, microwave imaging modalities are very sensitive to the water content of the specimen, which makes them particularly suitable for food-processing applications; and third, the measurement is contactless with respect to the specimen. Microwave imaging (radar tomography) has been used to evaluate the physical properties of food. In particular, the microwave imaging technique is able to identify the composition and shape of biological materials, perform quality control of packed foods, and identify the degree of ripeness of fruits [190, 191, 192]. Several investigations focused on microwave technologies also show that microwave imaging techniques are used to probe inaccessible domains and to reveal the dielectric properties of the media that they penetrate. This technique aims to fully characterize the area in terms of positions, shapes, and complex permittivity profiles of the dielectric discontinuities (i.e., the scatterers). This aim is achieved by using inverse scattering algorithms to analyze the scattered field reflected by the material under consideration.
Inverse scattering methods have therefore been applied in many applications such as medical diagnosis, subsurface monitoring or geophysical inspection, and non-destructive evaluation and testing in various fields [193, 194, 195, 196, 197, 198]. Accordingly, [199] focused on the application of microwave imaging technology to food contamination monitoring, where the mechanism is based on transmission across the food sample: a local variation of the dielectric properties signals the presence of a foreign object. Ordinarily, a microwave imaging system is composed of two main components, hardware and software (Figure 31): the hardware collects the data, and the software processes them to generate the output. The transmitter antenna generates EM waves directed toward the sample, which in a food environment can reasonably be considered a homogeneous material, and a receiver antenna collects them. After acquisition, dedicated software processes the data to generate the outputs, in which any detected intrusion is reported.

Figure 31.

Illustrates components of antenna microwave imaging system.
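A minimal sketch of the software side of such a system is given below. It is an assumption-laden toy, not the method of [199]: the reconstructed dielectric map is simulated, and the intrusion detector is a simple robust outlier test (median/MAD z-score) standing in for a real reconstruction and detection pipeline.

```python
import numpy as np

def detect_intrusion(dielectric_map, z_thresh=8.0):
    """Flag pixels whose reconstructed permittivity deviates strongly
    from the (assumed homogeneous) food background.

    Returns a boolean mask of suspected foreign-object pixels.
    """
    background = np.median(dielectric_map)                    # robust baseline
    spread = np.median(np.abs(dielectric_map - background))   # robust scale (MAD)
    z = np.abs(dielectric_map - background) / (spread + 1e-9)
    return z > z_thresh

# Simulated map: homogeneous cheese-like medium (eps_r ~ 40) with a
# small plastic inclusion (eps_r ~ 3) -- values are illustrative only
rng = np.random.default_rng(0)
sample = rng.normal(40.0, 0.5, (20, 20))
sample[8:10, 8:10] = 3.0
mask = detect_intrusion(sample)
```

The median/MAD baseline exploits exactly the property the text describes: in a nominally homogeneous food, any strong local deviation of the dielectric map is a candidate foreign object.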

By the same token, [200] investigated the dielectric properties of fresh eggs during storage over the frequency range 20–1800 MHz, using an open-ended coaxial probe on the thick albumen and yolk of eggs after 1–15 days of storage at room temperature. Also, [201] concluded that the dielectric properties of egg albumen and yolk could be distinguished over the frequency range of 10–1800 MHz. Furthermore, [202] presented a form of food-security sensing that uses a waveguide-antenna microwave imaging system to identify the health status of eggs. They proposed a waveguide antenna system with a frequency range of 7–13 GHz, a maximum gain of 17.37 dBi, and a scanning area of 30 × 30 cm². The results showed that the proposed waveguide-antenna microwave imaging sensing system could effectively identify the health status of many eggs very quickly; the authors concluded that it provides a simple, non-destructive, effective, and rapid method for food-security applications. Images are undoubtedly the optimum medium for representing concepts to the human brain. Regardless of whether the product is fresh fruit or prepared food, color and moisture content are important attributes that food and agricultural engineers regularly look for. Therefore, [203] carried out an investigation focused on image acquisition technologies that can reveal the information of interest in 2-D using the visible and non-visible (radar tomography) bands of radiation. The visible band was applied to color grading of oil palms, and computerized radar tomography was used to map the moisture content in grain. The vision system correctly classified 92% of oil palms into four color categories, and radar tomography at 1 GHz accurately mapped the homogeneity and heterogeneity in moisture content of grain over the moisture range 12–39%.
At the same time, the microwave imaging technique is particularly useful for monitoring foods after packaging, without the need to open the package, thanks to microwaves' ability to easily penetrate any type of non-metallic packaging. Furthermore, it can identify unwanted or extraneous objects (such as pieces of glass or plastic) embedded in food that cannot be detected with standard metal detectors. Microwave imaging in the 8–12 GHz frequency band has been used to assess the contents of a package of cookies; the main purpose was to verify that all the cookies were present within the package and that their shape was preserved after distribution. As Figure 32a shows, imaging in the microwave range is capable of reconstructing a package of cookies, and it can be noticed from the microwave image that one cookie is missing and another is broken. Also, in a study demonstrating the potential of microwave imaging techniques for food-processing applications, a sample of cheese was corrupted by placing a small piece of plastic material inside it, to verify the ability of microwave imaging to detect the intrusion non-destructively. Figure 32b clearly shows the reconstruction of the dielectric distribution of the cheese piece in a microwave image, which revealed the presence of the small piece of plastic and identified it as a yellow area [191].

Figure 32.

Highlights microwave images, a) reconstruction of a package of cookies and b) identification of an abnormal object inside a piece of cheese.

4.7 Radio waves imaging

Radio waves are a type of electromagnetic radiation best known for their use in communication technologies such as television, mobile phones, and radios. According to NASA, radio waves have the longest wavelengths in the EM spectrum (1 mm to more than 100 km) and the lowest frequencies, from about 3 kHz up to about 300 GHz. The National Telecommunications and Information Administration generally divides the radio spectrum into nine bands (Table 1). Low and medium frequencies, the lowest of all radio frequencies, have a long range and are useful for penetrating water and rock, while the high, very high, and ultra-high radio-frequency bands include FM radio, broadcast television sound, public-service radio, cellphones, and the global positioning system (GPS). The super high and extremely high frequencies are the highest frequencies in the radio band and are sometimes considered part of the microwave band. As with imaging at the other end of the electromagnetic spectrum (gamma rays), medicine and astronomy are the major applications of imaging in the radio band. Magnetic resonance imaging (MRI), also known as nuclear magnetic resonance (NMR) imaging, relies on a strong magnet that efficiently polarizes and then excites the protons contained in the water molecules present in the tissue. The MRI technique is based on a magnetic field and pulses of radio-frequency energy to evaluate the properties of objects, and is mostly applied to diagnose various ailments inside human and animal bodies. The main idea of MRI is to magnetize the atomic nuclei of the object using strong magnets; the nuclei precess about the magnetic field at variable rates, which can be detected by the scanner and converted into usable data through the Fourier transform.
In the MRI technique, the hydrogen atom is used as the base atom because water is plentiful in all biological systems [3, 191, 204, 205].

No. | Radio band | Frequency range | Wavelength range
1 | Extremely Low Frequency (ELF) | < 3 kHz | > 100 km
2 | Very Low Frequency (VLF) | 3 to 30 kHz | 10 to 100 km
3 | Low Frequency (LF) | 30 to 300 kHz | 1 to 10 km
4 | Medium Frequency (MF) | 300 kHz to 3 MHz | 100 m to 1 km
5 | High Frequency (HF) | 3 to 30 MHz | 10 to 100 m
6 | Very High Frequency (VHF) | 30 to 300 MHz | 1 to 10 m
7 | Ultra-High Frequency (UHF) | 300 MHz to 3 GHz | 10 cm to 1 m
8 | Super High Frequency (SHF) | 3 to 30 GHz | 1 to 10 cm
9 | Extremely High Frequency (EHF) | 30 to 300 GHz | 1 mm to 1 cm

Table 1.

Illustrates the nine classified bands of the radio spectrum.
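A small lookup that maps a frequency to its Table 1 band name might look as follows; the band edges are taken from the table, while the function itself is only a sketch.

```python
# Upper band edges (Hz) and names from Table 1
RADIO_BANDS = [
    (3e3, "ELF"), (30e3, "VLF"), (300e3, "LF"), (3e6, "MF"), (30e6, "HF"),
    (300e6, "VHF"), (3e9, "UHF"), (30e9, "SHF"), (300e9, "EHF"),
]

def radio_band(freq_hz):
    """Return the Table 1 band name for a frequency, or None above EHF."""
    for upper, name in RADIO_BANDS:
        if freq_hz <= upper:
            return name
    return None

fm_band = radio_band(100e6)     # FM radio, ~100 MHz
gps_band = radio_band(1.575e9)  # GPS L1, ~1.575 GHz
```

Consistent with the text, FM radio falls in the VHF band and GPS in the UHF band.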

Regarding the magnetic nature of MRI, [204, 206] noted the weak magnetism of hydrogen protons, which behave differently depending on the type of tissue (e.g., lipids and water). The inspected object is placed within the magnet, which usually has a field strength of 0.2–3.0 Tesla (T). Superimposed on this constant magnetic field, radio-frequency pulses at the appropriate resonant frequency, known as the Larmor frequency, bring the protons in the sample into an excited state through energy absorption. These protons in turn generate radio waves whose emission can be detected by the receiver coil, producing an NMR signal. The basis of MR imaging is measuring the intensity of the MR signal, accurately locating the signal intensities in space, and representing them in cross-section on a greyscale. Having high moisture content, agricultural products yield strong signals when an MRI technique is applied. Accordingly, [207] monitored the ripening of mangoes using MRI and found that the magnetic resonance signal intensity of the pericarp in MR images varied with the ripening stage. Also, [208] studied the prediction of sensory texture attributes of raw and cooked potatoes by NMR imaging; MRI analysis of the obtained data and subsequent sensory analysis of the cooked potatoes showed the high potential of applying advanced image analysis to MR images of raw potatoes to predict sensory attributes related to the texture of cooked potatoes. In short, they concluded that MR imaging, besides giving well-known information about water distribution, also provides information about anatomic structures within raw potatoes, which are considered important for the perceived textural properties of the cooked potatoes. Moreover, [209] devised an MRI apparatus that is small, lightweight, and usable in an ordinary research room, intended for developmental research and quality estimation of foods and agricultural products.
The proton-specific MRI was easy to operate and provided well-resolved images of internal structures, of the distribution and mobility of water and oils, and of susceptibility differences inside materials, demonstrating that the devised machine is useful for food and agricultural research. As well, [210] examined changes in kiwifruit tissue structure to evaluate the effect of storage conditions and found water migration toward the outer region of the pericarp during storage. Also, [211, 212] applied MRI techniques to inspect the physicochemical changes of cherry tomatoes and found that MRI has potential for tomato classification according to maturity. MRI is also a technique that permits watercore detection without destroying the sample. On this basis, [213] investigated the watercore distribution inside apple fruit (block or radial) and its incidence (% of tissue) using the non-invasive, non-destructive MRI technique, obtaining 20 inner tomography slices from each fruit and analyzing the damaged areas with an interactive 3D segmentation method, as shown in Figure 33. Apples with block watercore were grouped at Euler numbers between −400 and 400 with little evolution, whereas for apples with radial development the Euler number was highly negative, down to −1439. Significant differences were also found in sugar composition, with higher fructose and total sugar contents in apples from the upper canopy compared to those from the lower canopy. Significantly higher sorbitol and lower sucrose and fructose contents were also found in watercore-affected tissue compared to the healthy tissue of affected apples and to healthy apples. Additionally, [214] mentioned that with MRI, additional tests such as chemical analysis, oil and moisture distribution, sugar level, pH, and physical analysis of structure, voids, and the thickness of filling and coating can be completed within seconds on the production line.
Thus, the MRI technique and its applications in the food industry are set to improve and maintain quality in processing, testing, and parameter optimization.
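The Larmor relation mentioned above (resonant frequency proportional to field strength) can be illustrated with a short calculation. The proton gyromagnetic constant below is a standard physical value, not a figure from this chapter.

```python
GAMMA_H = 42.58e6  # proton gyromagnetic ratio / (2*pi), Hz per tesla

def larmor_hz(b_tesla):
    """Resonant (Larmor) frequency of hydrogen protons in a field of B tesla."""
    return GAMMA_H * b_tesla

# A 1.5 T magnet, typical of the 0.2-3.0 T range cited in the text
f_15t = larmor_hz(1.5)
```

At 1.5 T this gives roughly 63.9 MHz, which is why MRI excitation pulses fall in the radio band discussed here.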

Figure 33.

Shows the MR images of central apple slices belonging to the four watercore levels, classified by three experts: (1) sound apple; (2) light watercore; (3) medium watercore; (4) strong watercore.

However, the high cost of imaging facilities remains a barrier to the exploitation of MRI in the food industry. [215] presented a detailed discussion of the fundamentals of MRI in the study of food materials. Also, [216] pointed out that MRI is performed with an NMR instrument equipped with magnetic gradient coils; these coils make it possible to collect data spatially and create two-dimensional and three-dimensional images displaying diverse physicochemical characteristics. Likewise, [217] adopted the idea that fast, non-destructive solutions for sensing watercore would be readily accepted in the postharvest industry, and therefore conducted a comparative study of X-ray CT and MRI as potential imaging technologies for detecting watercore disorder in different apple cultivars. After acquisition, the 3D X-ray CT and MRI datasets were matched, and the images obtained on quantitatively identical fruit were compared. The results indicated that both MRI and CT were able to detect watercore disorder in different apple cultivars, although the contrast in the MRI images was superior, as shown in Figure 34. They concluded that the mean and variance of the frequency distribution of MRI and X-ray CT intensities appear to be parameters that allow healthy apples to be distinguished from affected fruit. A study by [218] provided a detailed description of all components of an MRI system for assessing maturity and quality parameters of fruits and vegetables. As well, [219] used the MRI technique for non-invasive imaging of plant roots in different soils, using barley as a model plant to investigate the achievable image quality and the suitability for root phenotyping of six natural soil substrates of commonly occurring soil textures.

Figure 34.

Shows the X-ray CT (left) and MR (right) cross-sections of sound Ascara and watercore-affected Verde Doncella fruit and their segmentation results.
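The mean-and-variance intensity features reported by [217] as discriminating parameters are straightforward to compute; the snippet below is a toy illustration on synthetic slices, with made-up intensity values.

```python
import numpy as np

def intensity_features(image):
    """Mean and variance of the intensity frequency distribution."""
    values = np.asarray(image, dtype=float).ravel()
    return values.mean(), values.var()

# Toy slices with made-up intensities: a uniform "sound" slice and an
# "affected" slice containing a brighter watercore-like region
sound = np.full((4, 4), 100.0)
affected = np.full((4, 4), 100.0)
affected[1:3, 1:3] = 160.0

m_sound, v_sound = intensity_features(sound)
m_aff, v_aff = intensity_features(affected)
```

The watercore-like region shifts the mean and inflates the variance, which is the separation the study exploits between healthy and affected fruit.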

The results were compared with two artificially composed substrates previously documented for MRI root imaging, as shown in Figure 35. Only one soil did not allow imaging of the roots with MRI. In the artificially composed substrates, soil moisture above 70% of the maximal water-holding capacity (WHCmax) impeded root imaging, whereas for the natural soil substrates, soil moisture did not affect MRI root-image quality in the investigated range of 50–80% WHCmax. The authors concluded that, with the characterization of different soils, investigations such as trait stability across substrates are now possible using non-invasive MRI. Subsequently, [220] presented a review on the use of NMR/MRI techniques for the inspection of agricultural fruits and vegetables, and explained the benefits of their implementation in the assessment of internal quality attributes such as internal defects, water content, nutritional content, maturity, fruit firmness, seed detection, and physicochemical and microbiological quality in both commercial and industrial applications. They concluded that low-field nuclear magnetic resonance (LF-NMR) and MRI are viable technologies for assessing water status, which can significantly impact the texture, tenderness, and microstructure of fruits and vegetables. Despite considerable developments in the quality measurement of fruits and vegetables and their products, however, the implementation of these techniques at an industrial level has been unsatisfactory. As well, [221] used MRI to study changes in the internal structure of tomato fruit during development as a function of maturity. The internal structure of intact cherry tomato fruit at six maturity stages (green, breaker, turning, pink, light red, and red) was measured using a series of two-dimensional (2D) MR images, as shown in Figure 36. Water content appeared evenly distributed in the pericarp region from the breaker to the light red maturity stages.

Figure 35.

Shows MR images of barley seedlings 3 days after sowing in eight different substrates at four different soil moisture levels.

Figure 36.

Demonstrates the MR images of three cherry tomato cultivars, Tiara, Tiara TY, and Unicorn, at different maturity stages.

MR signal intensity changes are observed between different maturity stages; in particular, the signal intensity varies between the pericarp and locule regions. Quantifying these variations using the ratio of signal between the pericarp and locule regions enables the maturity of cherry tomato to be assigned. The authors additionally concluded that, since MRI provides detailed information on internal structure, characterization of internal defects (e.g., bruises, voids, impact damage) and other quality factors is possible.
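The pericarp-to-locule signal ratio described above can be sketched as follows; the masks and intensity values are synthetic stand-ins, since the text gives no numeric values.

```python
import numpy as np

def maturity_ratio(mr_image, pericarp_mask, locule_mask):
    """Ratio of mean MR signal intensity, pericarp over locule."""
    img = np.asarray(mr_image, dtype=float)
    return img[pericarp_mask].mean() / img[locule_mask].mean()

# Toy 2D slice: bright outer ring stands in for the pericarp,
# darker centre for the locule (intensities are made up)
img = np.full((6, 6), 50.0)                               # locule signal
img[0, :] = img[-1, :] = img[:, 0] = img[:, -1] = 100.0   # pericarp ring

ratio = maturity_ratio(img, img == 100.0, img == 50.0)
```

Tracking this ratio across images taken at successive maturity stages is the index the study uses to assign maturity.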


5. Conclusions

Inspecting and measuring the external and internal quality of agricultural and food products, and assuring their safety from diseases and contamination, is one of the most important issues facing the food sector at present. This follows from multiple, repeated complaints against agricultural producers and food manufacturers for failing to meet quality requirements that satisfy consumers' desires. When agricultural and food products do not meet quality standards and safety criteria, consumers lose faith in producers, leading to the loss of these products' competitiveness in the market and thus to significant economic loss. With consumers' rapidly growing demand for safer and better-quality food, agricultural producers and food manufacturers are working hard to eliminate sources of food contamination and achieve better quality. Some systems are proposed to achieve food safety and quality through a set of conditions that fall under good manufacturing practices (GMP) and hazard analysis and critical control points (HACCP), which represent the best way to achieve food security through all production steps. Unfortunately, even with all the requirements of GMP, HACCP, and similar systems, they may not completely ensure the production of safe food free of contaminants and defects. It has therefore become necessary to introduce modern technologies to inspect quality, detect blemishes and contamination, and reject products that are not fit for human consumption. The focus has thus been on developing non-destructive, modern, fast, reliable, and applicable methods that meet the needs of both food manufacturers and producers, as well as the desires of the consumer. Accordingly, the majority of quality detection systems use electromagnetic wave measurements across all regions of the electromagnetic spectrum through imaging technologies.
Gamma-ray and X-ray imaging technologies have high frequency and energy and are often used for irradiation, plant-breeding applications, determining food quality, and food safety. Although applications of these techniques are mainly found in research and development work, they have great potential to serve as tools for the development of plant varieties, assessment and quality assurance, and management practices across a wide range of agro-food activities. UV and visible-light imaging has proved very reliable and efficient for tasks such as evaluating color, shape, and size and detecting external defects; these imaging systems can equip the agricultural and food industry with a new tool to detect defects and contamination and so ensure food safety and quality. Undoubtedly, with the evolution of a new generation of detectors and cameras, imaging in the UV and visible-light regions will have great potential in food defense and safety. Furthermore, IR and HSI systems can combine spectral and spatial data of a sample; for this reason, the HSI system has become a standalone unit for non-destructive analysis of the physical, textural, and chemical parameters of a sample. The IR-HSI system has therefore gained a good reputation and has been tested in detail to predict chemical composition, detect defects, and detect adulteration of agricultural and food products. Also, microwave and radio-wave imaging can improve the efficiency of real-time monitoring of the food production, storage, and quality-control chain.
The accelerating development of imaging-system hardware and software to overcome the current limitations of this technology will help the agricultural and food industry implement different imaging systems for rapid, in-line quality-monitoring applications such as foreign-material detection, discrimination of external and internal quality attributes of agricultural and food products, and detection of various defects and diseases. In conclusion, this chapter has shown that modern imaging technologies can provide unquestionable advantages for monitoring the quality and safety of agri-food production and processing.



Acknowledgments

The corresponding author thanks all members of this chapter for their scientific efforts. He also extends his thanks to the Agricultural Engineering Research Institute (AEnRI), Agricultural Research Center (ARC); Menoufia University, Faculty of Agriculture, Department of Agricultural Engineering; Szent István University, Faculty of Agricultural and Environmental Sciences, Horticultural Institute; CREA (Council for Agricultural Research and Economics) of Treviglio, Italy; and King Fahd University of Petroleum and Minerals, Saudi Arabia, for their full support of the work team.


Conflict of interest

The authors declare no conflict of interest.


References

1. Kingdom FAA. Mach bands explained by response normalization. Frontiers in Human Neuroscience. 2014; 8: 843. DOI: 10.3389/fnhum.2014.00843
2. Land EH, McCann JJ. Lightness and Retinex Theory. Journal of the Optical Society of America. 1971; 61: 1-11.
3. Gonzalez RC, Woods RE. Digital Image Processing. 2nd ed. Prentice Hall, Upper Saddle River; 2002. ISBN 10: 0201180758 / ISBN 13: 9780201180756
4. Cheremkhin PA, Lesnichii VV, Petrov NV. Use of spectral characteristics of DSLR cameras with Bayer filter sensors. Journal of Physics: Conference Series. 2014; 536: 012021. DOI: 10.1088/1742-6596/536/1/012021
5. Bellman R. Adaptive Control Processes: A Guided Tour. Princeton University Press; 1961.
6. ElMasry G, Cubero S, Molto E, Blasco J. In-line sorting of irregular potatoes by using automated computer-based machine vision system. Journal of Food Engineering. 2012; 112: 60-68. DOI: 10.1016/j.jfoodeng.2012.03.027
7. Ceamanos X, Valero S. Processing hyperspectral images. In: Nicolas B, Mehrez Z, editors. Handbook of Optical Remote Sensing of Land Surface. Elsevier; 2016. p. 163-200. DOI: 10.1016/C2015-0-01220-5
8. Yao-Ze F, Sun D-W. Application of Hyperspectral Imaging in Food Safety Inspection and Control: A Review. Critical Reviews in Food Science and Nutrition. 2012; 52: 1039-1058. DOI: 10.1080/10408398.2011.651542
9. Kumar S, Kapil P, Bhardwaj Y, Acharya US, Gupta C. Anomaly Detection in Fruits using Hyper Spectral Images. International Journal of Trend in Scientific Research and Development. 2019; 3(4): 394-397. DOI: 10.31142/ijtsrd23753
10. Pau G, Fuchs F, Sklyar O, Boutros M, Huber W. EBImage—an R package for image processing with applications to cellular phenotypes. Bioinformatics. 2010; 26(7): 979-981. DOI: 10.1093/bioinformatics/btq046
11. Haralick RM, Shanmugam K, Dinstein I. Textural Features for Image Classification. IEEE Transactions on Systems, Man and Cybernetics. 1973; 3(6): 610-621. DOI: 10.1109/TSMC.1973.4309314
12. Park Y, Guldmann JM. Measuring continuous landscape patterns with Gray-Level Co-Occurrence Matrix (GLCM) indices: An alternative to patch metrics? Ecological Indicators. 2020; 109: 1-18. DOI: 10.1016/j.ecolind.2019.105802
13. Ibrahima A, El-Bialeea N, Saad M, Romano E. Non-Destructive Quality Inspection of Potato Tubers Using Automated Vision System. International Journal on Advanced Science Engineering Information Technology. 2020; 10(6): 2419-2428. DOI: 10.18517/ijaseit.10.6.13079
14. Romano E, Bergonzoli S, Pecorella I, Bisaglia C, De Vita P. Methodology for the definition of durum wheat yield homogeneous zones by using satellite spectral indices. Remote Sensing. 2021; 13: 2036.
15. Yu H, Guo L, Kharbach M, Han W. Multi-Way Analysis Coupled with Near-Infrared Spectroscopy in Food Industry: Models and Applications. Foods. 2021; 10: 802.
16. Gourvénec S, Tomasi G, Durville C, Di Crescenzo E, Saby C, Massart D, Bro R, Oppenheim G. CuBatch, a MATLAB® interface for n-mode data analysis. Chemometrics and Intelligent Laboratory Systems. 2005; 77(1-2): 122-130. DOI: 10.1016/j.chemolab.2004.01.027
17. Kolda TG, Bader BW. Algorithm 862: MATLAB tensor classes for fast algorithm prototyping. ACM Transactions on Mathematical Software. 2006; 32(4): 635-653. DOI: 10.1145/1186785.1186794
18. Phan AH, Tichavský P, Cichocki A. Tensorbox: A Matlab Package for Tensor Decomposition. LABSP RIKEN: Saitama, Japan. 2012.∼phan/tensorbox.php
19. Phan AH, Tichavsky P, Cichocki A. Low complexity damped Gauss–Newton algorithms for CANDECOMP/PARAFAC. SIAM Journal on Matrix Analysis and Applications. 2013; 34: 126-147.
20. Giordani P, Kiers HA, Del Ferraro MA. Three-way component analysis using the R package ThreeWay. Journal of Statistical Software. 2014; 57: 1-23. DOI: 10.18637/jss.v057.i07
21. Helwig NE. Multiway: Component Models for Multi-Way Data. R Package Version 1.0-5. 2021; https://CRAN.R-project.org/package=multiway
22. Kossaifi J, Panagakis Y, Anandkumar A, Pantic M. TensorLy: Tensor learning in Python. Journal of Machine Learning Research. 2019; 20: 1-6.
23. Hao L, Liang S, Ye J, Xu Z. TensorD: A tensor decomposition library in TensorFlow. Neurocomputing. 2018; 318: 196-200. DOI: 10.1016/j.neucom.2018.08.055
24. Gregoire P, Florian F, Oleg S, Michael B, Wolfgang H. EBImage - an R package for image processing with applications to cellular phenotypes. Bioinformatics. 2010; 26(7): 979-981. DOI: 10.1093/bioinformatics/btq046
25. Grompone von Gioi R, Randall G. Unsupervised smooth contour detection. Image Processing On Line. 2016; 6: 233-267. DOI: 10.5201/ipol.2016.175
26. Grompone von Gioi R. A Contrario Line Segment Detection. Springer; 2014. DOI: 10.1007/978-1-4939-0575-1
27. Otsu N. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics. 1979; 9(1): 62-66. DOI: 10.1109/TSMC.1979.4310076
28. Starr C. Biology: Concepts and Applications. 6th ed. Thomson Brooks Cole; 2005. ISBN 0-534-46226-X
29. Williamson SJ, Cummins H. Light and Color in Nature and Art. American Journal of Physics. 1983; 1(2). DOI: 10.1119/1.13928
30. Richards A. X rays and gamma rays: Crookes tubes and nuclear light. In: Alien Vision: Exploring the Electromagnetic Spectrum with Imaging Technology. SPIE Press, Washington; 2001. ISBN-10: 0819441422
31. Saha GB. Physics and Radiobiology of Nuclear Medicine. Springer, New York; 2006. DOI: 10.1007/978-0-387-36281-6
32. Pires LF, de Macedo JR, de Souza MD, Bacchi OS, Reichardt K. Gamma-ray computed tomography to characterize soil surface sealing. Applied Radiation and Isotopes. 2002; 57: 375-380. DOI: 10.1016/S0969-8043(02)00098-2
33. Naime JM, Cruvinel PE, Silva AM, Crestana S, Vaz CMP. Applications of X and gamma-rays dedicated computerized tomography scanner in agriculture. In: Paulo EC, Luiz AC, editors. Agricultural Tomography. Embrapa-CNPDIA, Sao Carlos; 2000. p. 96-104
34. Macedo A, Vaz CMP, Naime JM, Jorge LAC, Crestana S, Cruvinel PE, Pereira JCD, Guimaraes MF, Ralisch R. Soil management impact and wood science recent contributions of Embrapa Agricultural Instrumentation Center using CT imaging. In: Paulo EC, Luiz AC, editors. Agricultural Tomography. Embrapa-CNPDIA, Sao Carlos; 2000. p. 44-54
35. Tsukamoto T, Uchida H, Nakanishi H, Nishiyama S, Tsukada H, Matsuhashi S, Nishizawa NK, Mori S. H215O translocation in rice was enhanced by 10 μm 5-aminolevulinic acid as monitored by positron-emitting tracer imaging system (PETIS). Soil Science and Plant Nutrition. 2004; 50(7): 1085-1088. DOI: 10.1080/00380768.2004.10408578
36. Kawachi N, Kikuchi K, Suzui N, Ishii S, Fujimaki S, Ishioka NS, Watabe H. Imaging of carbon translocation to fruit using carbon-11-labeled carbon dioxide and positron emission tomography. IEEE Transactions on Nuclear Science. 2011; 58(2): 395-399. DOI: 10.1109/TNS.2011.2113192
  36. 36. Kawachi N, Kikuchi K, Suzui N, Ishii S, Fujimaki S, Ishioka NS, Watabe H. Imaging of carbon translocation to fruit using carbon-11-labeled carbon dioxide and positron emission tomography. IEEE Transactions on Nuclear Science. 2011; 58(2): 395-399. DOI: 10.1109/TNS.2011.2113192
  37. 37. Manickavasagan A, Jayas DS, White, NDG. Thermal imaging to detect infestation by Cryptolestes ferrugineus inside wheat kernels. Journal of Stored Products Research. 2008; 44(2): 186-192. DOI: 10.1016/j.jspr.2007.10.006
  38. 38. Rossel RAV, Taylor JJ, Mcbratney AB. Multivariate calibration of hyperspectral g-ray energy spectra for proximal soil sensing. European Journal of Soil Science. 2007; 58: 343-353. DOI: 10.1111/j.1365-2389.2006.00859.x
  39. 39. Tsukamoto T, Nakanishi H, Kiyomiya S, Watanbe S, Matsuhashi S, Nishizawa NK, Mori S. 52Mn translocation in barley monitored using a positron-emitting tracer imaging system. Soil Science and Plant Nutrition. 2006; 52(6): 717-725.
  40. 40. Nakamura S, Suzui N, Nagasaka T, Komatsu F, Ishioka NS, Tanabata SI, Kawachi N, Rai H, Hattori H, Chino M, Fujimaki S. Application of glutathione to roots selectively inhibits cadmium transport from roots to shoots in oilseed rape. Journal of Experimental Botany. 2013; 64(4): 1073-1081. DOI: 10.1093/jxb/ers388
  41. 41. Fujimaki S, Suzui N, Ishioka NS, Kawachi N, Ito S, Chino M, Nakamura S. Tracing cadmium from culture to spikelet: noninvasive imaging and quantitative characterization of absorption, transport, and accumulation of cadmium in an intact rice plant. Plant Physiology. 2010; 152(4): 1796-1806. DOI: 10.1104/pp.109.151035
  42. 42. Watanabe S, Iida Y, Suzui N, Katabuchi T, Ishii S, Kawachi N, Hanaoka H, Watanabe S, Matsuhashi S, Endo K, Ishioka NS. Production of no-carrier-added 64Cu and applications to molecular imaging by PET and PETIS as a biomedical tracer. Journal of Radioanalytical and Nuclear Chemistry. 2009; 280: 199-205. DOI:10.1007/S10967-008-7443-9
  43. 43. Ishikawa S, Suzui N, Ito-Tanabata S, Ishii S, Igura M, Abe T, Kuramata M, Kawachi N, Fujimaki S. Real-time imaging and analysis of differences in cadmium dynamics in rice cultivars (Oryza sativa) using positron-emitting107Cd tracer. BMC Plant Biology. 2011; 11(1): 1-12. DOI: 10.1186/1471-2229-11-172
  44. 44. Kiyomiya S, Nakanishi H, Uchida H, Nishiyama S, Tsukada H, Ishioka NS, Watanabe S, Osa A, Mizuniwa C, Ito T, Matsuhashi S, Hashimoto S, Sekine T, Tsuji A, Mori S. Light activates H215O flow in rice: detailed monitoring using a positron-emitting tracer imaging system (PETIS). Physiol Plant. 2001; 113(3): 359-367. DOI: 10.1034/j.1399-3054.2001.1130309.x
  45. 45. Mori S, Kiyomiya S, Nakanishi H, Ishioka NS, Watanabe S, Osa A, Matsuhashi S, Hashimoto S, Sekine T, Uchida H, Nishiyama S, Tsukada H, Tsuji A. Visualization of 15O-water flow in tomato and rice in the light and dark using a positron-emitting tracer imaging system (PETIS). Soil Science and Plant Nutrition. 2000; 46(4): 975-979. DOI: 10.1080/00380768.2000.10409163
  46. 46. Kotwaliwale N, Singh K, Kalne A, Jha SN, Seth N, Kar A. X-ray imaging methods for internal quality evaluation of agricultural produce. Journal of Food Science and Technology. 2014; 51(1): 1-15. DOI: 10.1007/s13197-011-0485-y
  47. 47. Adedeji AA, Ngadi MO. Microstructural characterization of deep-fat fried breaded chicken nuggets using X-ray micro-computed tomography. Journal of Food Process Engineering. 2010; 34(6): 2205-2219. DOI: 10.1111/j.1745-4530.2009.00565.x
  48. 48. Vavrik D, Jakubek J. Radiogram enhancement and linearization using the beam hardening correction method. Nuclear Instruments and Methods in Physics Research. 2009; 607(1): 212-214. DOI: 10.1016/j.nima.2009.03.156
  49. 49. Ammari H. An Introduction to Mathematics of Emerging Biomedical Imaging. 1st ed; Springer-Verlag Berlin Heidelberg, Germany. 2008; X, 198p. DOI: 10.1007/978-3-540-79553-7
  50. 50. Patton JA, Turkington TG. SPECT/CT physical principles and attenuation correction. Journal of Nuclear Medicine Technology. 2008; 36(1): 1-10. DOI:
  51. 51. Neethirajan S, Jayas DS, White NDG. Detection of sprouted wheat kernel using soft X-ray image analysis. Journal of Food Engineering. 2007; 81(3): 509-513. DOI: 10.1016/j.jfoodeng.2006.11.020
  52. 52. Bianchini VJM, Mascarin GM, Silva LCAS, Arthur V, Carstensen JM, Boelt B, da Silva CB. Multispectral and X-ray images for characterization of Jatropha curcas L. seed quality. Plant Methods. 2021; 17(9): 1-13. DOI: 10.21203/
  53. 53. Einarsdottir H, Emerson MJ, Clemmensen LH, Scherer K, Willer K, Bech M, Larsen R, Ersbøll BK, Pfeiffer F. Novelty detection of foreign objects in food using multi-modal X-ray imaging. Food Control. 2016; 67: 39-47. DOI: 10.1016/j.foodcont.2016.02.023
  54. 54. Al-Mezeini N, Manickavasagan A, Al-Yahyai R, Al-Wahaibi A, Al-Raeesi A, Khriji L. X-ray Imaging of Stored Dates to Detect Infestation by Saw-Toothed Beetles. International journal of fruit science. 2016; 16(1): 42-56. DOI: 10.1080/15538362.2015.1044692
  55. 55. Nielsen MS, Lauridsen T, Christensen LB, Feidenhans'l R. X-ray darkfield imaging for detection of foreign bodies in food. Food Control. 2013; 30(2): 531-535. DOI: 10.1016/j.foodcont.2012.08.007
  56. 56. Jiang JA, Chang HY, Wu KH, Ouyang CS, Yang MM, Yang EC, Chen TW, Lin TT. An adaptive image segmentation algorithm for X-ray quarantine inspection of selected fruits. Computers and electronics in agriculture. 2008; 60(2): 190-200. DOI: 10.1016/j.compag.2007.08.006
  57. 57. Karunakaran C, Jayas DS, White, NDG. Mass determination of wheat kernels from X-ray images. American Society of Agricultural and Biological Engineers. Conference: 2004, DOI: 10.13031/2013.16742
  58. 58. Haff RP, Slaughter DC. Real-time X-ray inspection of wheat for infestation by the granary weevil, Sitophilus Granarius (L.). Transactions of the ASAE. 2004; 47(2): 531-537. DOI: 10.13031/2013.16022
  59. 59. Karunakaran C, Jayas DS, White, NDG. Soft X-ray inspection of wheat kernels infested by Sitophilus oryzae. Transactions of the ASAE. 2003; 46(3): 739-745. DOI: 10.13031/2013.13576
  60. 60. Shahin MA, Tollner EW, Gitaitis RD, Sumner DR, Maw BW. Apple classification based on surface bruises using image processing and neural networks. Transactions of the ASAE. 2002; 45 (5): 1619-1627. DOI: 10.13031/2013.11047
  61. 61. Kim S, Schatzki TF. Apple watercore sorting system using X-ray imagery. I. Algorithm development. Transactions of the ASAE. 2000; 43 (6): 1695-1702. DOI: 10.13031/2013.3070
  62. 62. Patel KK, Kar A, Khan MA. Potential of reflected UV imaging technique for detection of defects on the surface area of mango. Journal of Food Science and Technology. 2019; 56(2): 1295-1301. DOI: 10.1007/s13197-019-03597-w
  63. 63. Cosentino A. Practical notes on ultraviolet technical photography for art examination. Conservar Património. 2015; 21: 53-62. DOI: 10.14568/cp2015006
  64. 64. Özlüoymak ÖB. Development of a UV-based Imaging System for Real-Time Detection and Separation of Dried Figs Contaminated with Aflatoxins. Journal of Agricultural Sciences. 2014; 20: 302-316. DOI: 10.15832/tbd.87873
  65. 65. Nagle M, Intani K, Mahayothee B, Sardsud V, Muller J. Nondestructive mango quality assessment using image processing: inexpensive innovation for the fruit handling industry. In: Conference on international research on food security, natural resource management and rural development (Tropentag), Gottingen (Germany), 1-4. 2012
  66. 66. Özlüoymak ÖB. A Research on Separation System Design of Aflatoxin Contaminated Dried Fig [Ph.D.]. Adana, Turkey: Çukurova University; 2012
  67. 67. Slaughter DC, Obenland DM, Thompson JF, Arpaia ML, Margosan DA. Nondestructive freeze damage detection in oranges using machine vision and ultraviolet fluorescence. Postharvest Biology and Technology. 2008; 48(3): 341-346. DOI: 10.1016/j.postharvbio.2007.09.012
  68. 68. Blasco J, Aleixos N, Gomez J, Molto E. Citrus sorting by identification of the most common defects using multispectral computer vision. Journal of Food Engineering. 2007; 83(3): 384-393. DOI: 10.1016/j.jfoodeng.2007.03.027
  69. 69. Karami MA, Mirabolfathy M. Neural Network to Separate Aflatoxin Contaminated Pistachio Nuts. Acta Horticulturae. 2006; 726 (726): 605-610. DOI: 10.17660/ActaHortic.2006.726.103
  70. 70. Kleynen O, Destain MF. Detection of defects on fruits by machine vision and unsupervised segmentation. In: AgEng conference, September 12-16. Technological Institute, Leuven, Belgium, 2004; 1006-1007.
  71. 71. Steiner WE, Rieker RH, Battaglia R. Aflatoxin Contamination in Dried Figs: Distribution and Association with Fluorescence. Journal of Agricultural Food Chemistry. 1988; 36(1): 88-91. DOI: 10.1021/jf00079a022
  72. 72. Yabe K, Ando Y, Ito M, Terakado N. Simple Method for Screening Aflatoxin-Producing Molds by UV Photography. Applied and Environmental Microbiology. 1987; 53(2): 230-234. DOI: 10.1128/AEM.53.2.230-234.1987
  73. 73. Robertson GL. Food packaging principles and practice, 3rd ed. CRC Press, Taylor and Francis group, UK. 2012. ISBN 9781439862414
  74. 74. Jha SN. Non-destructive evaluation of food quality. Ist ed. Heidelberg, Springer; 2010. 18-22p. DOI: 10.1007/978-3-642-15796-7
  75. 75. ElMasry G, Sun DW. Principles of hyperspectral imaging technology. In: Da-Wen Sun, editors. Handbook of Hyperspectral imaging for food quality analysis and control. 1st ed. Academic Press, San Diego, California, USA, 2010. 3-43p. DOI: 10.1016/B978-0-12-374753-2.10001-2
  76. 76. Lou W, Nakai S. Application of artificial neural networks for predicting the thermal inactivation of bacteria: a combined effect of temperature, pH and water activity. Food Research International. 2001; 34(47): 573-579. DOI: 10.1016/S0963-9969(01)00074-6
  77. 77. Brosnan T, Sun DW. Inspection and grading of agricultural and food products by computer vision systems - a review. Computers and Electronics in Agriculture. 2002; 36 (2–3): 193-213.
  78. 78. Wu D, Sun DW. Colour measurements by computer vision for food quality control-a review. Trends in Food Science and Technology. 2013; 29(1): 5-20. DOI: 10.1016/j.tifs.2012.08.004
  79. 79. Chen YR, Chao K, Kim MS. Machine vision technology for agricultural applications. Computers and Electronics in Agriculture. 2002; 36(2): 173-191. DOI: 10.1016/S0168-1699(02)00100-X
  80. 80. Hong H, Yang X, You Z, Cheng F. Visual quality detection of aquatic products using machine vision. Aquacultural Engineering. 2014; 63: 62-71. DOI: 10.1016/j.aquaeng.2014.10.003
  81. 81. Payne A, Walsh K, Subedi P, Jarvis D. Estimating mango crop yield using image analysis using fruit at ‘stone hardening’ stage and night time imaging. Computers and Electronics in Agriculture. 2014; 100: 160-167. DOI: 10.1016/j.compag.2013.11.011
  82. 82. Wijethunga P, Samarasinghe S, Kulasiri D, Woodhead I. Digital Image Analysis Based Automated Kiwifruit Counting Technique. 23rd International Conference Image and Vision Computing, New Zealand, Christchurch, Publisher: IEEE, 2008. 1-6. DOI: 10.1109/IVCNZ.2008.4762149
  83. 83. Song Y, Glasbey CA, Horgan GW, Polder G, Dieleman JA, van der Heijdenc GWAM. Automatic fruit recognition and counting from multiple images. Biosystems Engineering. 2014; 118: 203-215. DOI: 10.1016/j.biosystemseng.2013.12.008
  84. 84. Payne AB, Walsh KB, Subedi PP, Jarvis D. Estimation of mango crop yield using image analysis – Segmentation method. Computers and Electronics in Agriculture. 2013; 91: 57-64. DOI: 10.1016/j.compag.2012.11.009
  85. 85. Linker R, Cohen O, Naor A. Determination of the number of green apples in RGB images recorded in orchards. Computers and Electronics in Agriculture. 2012; 81: 45-57. DOI: 10.1016/j.compag.2011.11.007
  86. 86. Narendra VG, Hareesh KS. Quality inspection and grading of agricultural and food products by computer vision—a review. International Journal of Computer Applications. 2010; 2(1): 975-8887. DOI: 10.5120/612-863
  87. 87. Safren O, Alchanatis V, Ostrovsky V, Levi O. Detection of green apples in hyperspectral images of apple-tree foliage using machine vision. Transactions of the ASABE. 2007; 50(6): 2303-2313. DOI: 10.13031/2013.24083
  88. 88. Nyalala I, Okinda C, Nyalala L, Makange N, Chao Q, Chao L, Yousaf K, Chen K. Tomato volume and mass estimation using computer vision and machine learning algorithms: Cherry tomato model. Journal of Food Engineering. 2019; 263: 288-298. DOI: 10.1016/j.jfoodeng.2019.07.012
  89. 89. Weijun X, Wang F, Yang D. Research on Carrot Grading Based on Machine Vision Feature Parameters. IFAC PapersOnLine. 2019; 52(30): 30-35. DOI: 10.1016/j.ifacol.2019.12.485
  90. 90. Rodriguez-Pulido FJ, Ferrer-Gallego R, Gonzalez-Miret ML, Rivas-Gonzalo JC, Song Y, Glasbey CA, Horgan GW, Polder G, Dieleman JA, van der Heijden GWAM. Automatic fruit recognition and counting from multiple images. Biosystems Engineering. 2014; 118(1), 203-215. DOI: 10.1016/j.biosystemseng.2013.12.008
  91. 91. Aggarwal AK, Mohan R. Aspect ratio analysis using image processing for rice grain quality. International Journal of Food Engineering. 2010; 6(5): 1-14. DOI: 10.2202/1556-3758.1788
  92. 92. Dubosclard P, Larnier S, Konik H, Herbulot A, Devy M. Automated visual grading of grain kernels by machine vision. In: Twelfth International Conference on Quality Control by Artificial Vision. 2015; 9534: 1-8. DOI: 10.1117/12.2182793
  93. 93. Zareiforoush H, Minaei S, Alizadeh MR, Banakar A, Samani BH. Design, development and performance evaluation of an automatic control system for rice whitening machine based on computer vision and fuzzy logic. Computers and Electronics in Agriculture. 2016; 124:14-22. DOI: 10.1016/j.compag.2016.01.024
  94. 94. Bhat S, Panat S, Arunachalam N. Classification of rice grain varieties arranged in scattered and heap fashion using image processing. In: Ninth International Conference on Machine Vision (ICMV 2016). 2017; 10341: 1-6. DOI: 10.1117/12.2268802
  95. 95. Kiliç K, Boyaci IH, Köksel H, Küsmenoglu I. A classification system for beans using computer vision system and artificial neural networks. Journal of Food Engineering. 2007; 78(3): 897-904. DOI: 10.1016/j.jfoodeng.2005.11.030
  96. 96. Venora G, Grillo O, Ravalli C, Cremonini R. Identification of Italian landraces of bean (Phaseolus vulgaris L.) using an image analysis system. Scientia Horticulturae. 2009; 121(4): 410-418. DOI: 10.1016/j.scienta.2009.03.014
  97. 97. Laurent B, Ousman B, Dzudie T, Carl MFM, Emmanuel T. Digital camera images processing of hard-to-cook beans. Journal of Engineering and Technology Research. 2010; 2(9): 177-188. DOI: 10.5897/JETR.9000027
  98. 98. Araújo SA, Alves WAL, Belan PA, Anselmo KP. A computer vision system for automatic classification of most consumed Brazilian beans. In: Bebis G. et al. (eds) Advances in Visual Computing. ISVC 2015. Lecture Notes in Computer Science, Springer, 2015. p. 45-53. DOI: 10.1007/978-3-319-27863-6_5
  99. 99. Belan PA, De Macedo RAG, Pereira MA, Alves WAL, Araújo SA. A fast and robust approach for touching grains segmentation. In: Campilho A., Karray F., ter Haar Romeny B. (eds) Image Analysis and Recognition. Lecture Notes in Computer Science, vol 10882. Springer; 2018. p. 482-489. DOI: 10.1007/978-3-319-93000-8_54
  100. 100. Belan PA, de Macedo RAG, Alves WAL, Santana JCC, Araújo SA. Machine vision system for quality inspection of beans. The International Journal of Advanced Manufacturing Technology. 2020; 111 (11-12), 1-15. DOI: 10.1007/s00170-020-06226-5
  101. 101. Girolami A, Napolitano F, Faraone D, Braghieri A. Measurement of meat color using a computer vision system. Meat science. 2013; 93(1): 111-118. DOI: 10.1016/j.meatsci.2012.08.010
  102. 102. Jackman P, Sun DW, Allen P. Recent advances in the use of computer vision technology in the quality assessment of fresh meats. Trends in Food Science and Technology. 2011; 22(4), 185-197. DOI: 10.1016/j.tifs.2011.01.008
  103. 103. Chen K, Sun X, Qin C, Tang X. Color grading of beef fat by using computer vision and support vector machine. Computers and Electronics in Agriculture. 2010; 70(1): 27-32. DOI: 10.1016/j.compag.2009.08.006
  104. 104. Mancini RA, Hunt MC. Current research in meat color. Meat Sci. 2005; 71(1):100-121. DOI: 10.1016/j.meatsci.2005.03.003
  105. 105. Zheng C, Sun DW, Zheng L. Correlating color to moisture content of large cooked beef joints by computer vision. Journal of Food Engineering. 2006; 77(4): 858-863. DOI: 10.1016/j.jfoodeng.2005.08.013
  106. 106. Teimouri N, Omid M, Mollazade K, Mousazadeh H, Alimardani R, Karstoft H. On-line separation and sorting of chicken portions using a robust vision-based intelligent modeling approach. Biosystems Engineering. 2018; 167(4): 8-20. DOI: 10.1016/j.biosystemseng.2017.12.009
  107. 107. Gu J, He N, Wu X. A New Detection Method for Fish Freshness. In Computational Intelligence and Design (ISCID), 2014 Seventh International Symposium 2. 2014; 555-558. DOI: 10.1109/ISCID.2014.153
  108. 108. Vatansever F, Hamblin MR. Far infrared radiation (FIR): its biological effects and medical applications, Photonics Lasers Med. 2012; 1(4): 255-266. DOI: 10.1515/plm-2012-0034
  109. 109. Jain RK, Hoffman AJ, Jepsen PU, LIU PQ, Turchinovich D, Vitiello MS. Mid-infrared, long-wave infrared, and terahertz photonics: introduction. Optics Express. 2020; 28(9): 14169-14175. DOI: 10.1364/OE.395165
  110. 110. Bae TW. Spatial and temporal bilateral filter for infrared small target enhancement. Infrared Physics and Technology. 2014; 63: 42-53.
  111. 111. Kim DK. Flame detection using region expansions and on-line variances in infrared image. Journal of Korea Multimedia Society. 2009; 12(11): 1547-1556
  112. 112. Huang H, Yu H, Xu H, Ying Y. Near infrared spectroscopy for on/in-line monitoring of quality in foods and beverages: A review. Journal of Food Engineering. 2008; 87(3): 303-313. DOI: 10.1016/j.jfoodeng.2007.12.022
  113. 113. Peng Y, Wenxiu W. Application of near-infrared spectroscopy for assessing meat quality and safety. In: Theophanides T, editors. Handbook of Infrared Spectroscopy - Anharmonicity of Biomolecules, Crosslinking of Biopolymers, Food Quality and Medical Applications. Ist ed. IntechOpen; 2015. p. 137-163. DOI: 10.5772/58912
  114. 114. Scotter C. Use of near infrared spectroscopy in the food industry with particular reference to its applications to on/in-line food processes. Food Control. 1990; 1(3): 142-149. DOI: 10.1016/0956-7135(90)90006-X
  115. 115. Workman J, Weyer L. Practical guide and spectral atlas for interpretive near-infrared spectroscopy. 2nd ed. CRC Press, Taylor & Francis Group; 2012. 326 p. DOI: 10.1201/b11894
  116. 116. Aenugu HPR, Kumar DS, Parthiban SN, Ghosh S, Banji D. Near Infra-Red Spectroscopy—An Overview. International Journal of ChemTech Research. 2011; 2011, 3: 825-836. ISSN: 0974-4290
  117. 117. Genot V, Bock L, Dardenne P, Colinet G. Use of near-infrared reflectance spectroscopy in soil analysis. A review. Biotechnology, Agronomy, Society and Environment. 2014; 18: 247-261
  118. 118. Liu Y, Chen YR, Kim MS, Chan DE, Lefcourt AM. Development of simple algorithms for the detection of fecal contaminants on apples from visible/near infrared hyperspectral reflectance imaging. Journal of Food Engineering. 2007; 81(2): 412-418. DOI: 10.1016/j.jfoodeng.2006.11.018
  119. 119. Park B, Lawrence KC, Windham WR, Smith D. Performance of hyperspectral imaging system for poultry surface fecal contaminant detection. Journal of Food Engineering. 2006; 75(3): 340-348. DOI: 10.1016/j.jfoodeng.2005.03.060
  120. 120. Ariana D, Lu R, Guyer D.E. Hyperspectral reflectance imaging for detection of bruises on pickling cucumbers. Computers and Electronics in Agriculture. 2006; 53(1): 60-70. DOI: 10.1016/j.compag.2006.04.001
  121. 121. Xiong Z, Sun DW, Xie A, Han Z, Wang L. Potential of hyperspectral imaging for rapid prediction of hydroxyproline content in chicken meat. Food Chemistry. 2015; 175: 417-422. DOI: 10.1016/j.foodchem.2014.11.161
  122. 122. Gowen AA, O’Donnell CP, Cullen PJ, Downey G, Frias JM. Hyperspectral imaging e an emerging process analytical tool for food quality and safety control. Trends in Food Science and Technology. 2007; 18(2): 590-598. DOI: 10.1016/j.tifs.2007.06.001
  123. 123. Monteiro S, Minekawa Y, Kosugi Y, Akazawa T, Oda K. Prediction of sweetness and amino acid content in soybean crops from hyperspectral imagery. ISPRS Journal of Photogrammetry and Remote Sensing. 2007; 62(1): 2-12. DOI: 10.1016/j.isprsjprs.2006.12.002
  124. 124. Smail V, Fritz A, Wetzel D. Chemical imaging of intact seeds with NIR focal plane array assists plant breeding. Vibrational Spectroscopy. 2006; 42(2): 215-221. DOI: 10.1016/j.vibspec.2006.02.004
  125. 125. Uno Y, Prasher S, Lacroix R, Goel P, Karimi Y, Viau A, Patel R.M. Artificial neural networks to predict corn yield from Compact Airborne Spectrographic Imager data. Computers and Electronics in Agriculture. 2005; 47(2): 149-161. DOI: 10.1016/j.compag.2004.11.014
  126. 126. Lewis E, Schoppelrei J, Lee E, Kidder L. Near-infrared chemical imaging as a process analytical tool. In: Bakeev K, editor, Handbook of Process analytical technology. 1st ed. Oxford: Blackwell Publishing; 2005. p. 187-225. DOI: 10.1002/9780470988459.ch7
  127. 127. Koehler F, Lee E, Kidder L, Lewis N. Near infrared spectroscopy: the practical chemical imaging solution. Spectroscopy Europe. 2002; 14(3): 12-19
  128. 128. Mahesh S, Manickavasagan A, Jayas DS, Paliwal J, White, NDG. Feasibility of near-infrared hyperspectral imaging to differentiate Canadian wheat classes. Biosystems Engineering. 2008; 101(1): 50-57. DOI: 10.1016/j.biosystemseng.2008.05.017
  129. 129. Singh CB, Jayas DS, Paliwal J, White NDG. Identification of insect-damaged wheat kernels using short-wave nearinfrared hyperspectral and digital colour imaging. Computers and Electronics in Agriculture. 2010; 73(2): 118-125. DOI: 10.1016/j.compag.2010.06.001
  130. 130. Ariana DP, Lu R. Quality evaluation of pickling cucumbers using hyperspectral reflectance and transmittance imaging: Part II. Performance of a prototype. Sensing and Instrumentation for Food Quality and Safety. 2008; 2(3): 152-160. DOI: 10.1007/s11694-008-9058-9
  131. 131. Shaw G, Manolakis D. Signal processing for hyperspectral image exploitation. IEEE Signal Processing Magazine. 2002; 19(1): 12-16. DOI: 10.1109/79.974715
  132. 132. Lu RF, Chen YR. Hyperspectral imaging for safety inspection of food and agricultural products. In: SPIE Conference on Pathogen Detection and Remediation for Safe Eating, Boston. 1998; DOI: 10.1117/12.335771
  133. 133. Siche R, Vejarano R, Aredo V, Velasquez L, Saldana E, Quevedo R. Evaluation of Food Quality and Safety with Hyperspectral Imaging (HSI). Food Engineering Reviews. 2016; 8(3): 306-322. DOI: 10.1007/s12393-015-9137-8
  134. 134. Cheng JH, Sun D.W. Hyperspectral imaging as an effective tool for quality analysis and control of fish and other seafoods: Current research and potential applications. Trends in Food Science and Technology. 2014; 37(2): 78-91. DOI: 10.1016/j.tifs.2014.03.006
  135. 135. Huang H, Liu L, Ngadi MO. Recent developments in hyperspectral imaging for assessment of food quality and safety. Sensors. 2014; 14(4): 7248-7276. DOI: 10.3390/s140407248
  136. 136. Dumont J, Hirvonen T, Heikkinen V, Mistretta M, Granlund L, Himanen K, Fauch L, Porali I, Hiltunen J, Keski-Saari S, Nygren M, Oksanen E, Hauta-Kasari M, Keinänen M. Thermal and hyperspectral imaging for Norway spruce (Picea abies) seeds screening. Computers and Electronics in Agriculture. 2015; 116: 118-24. DOI: 10.1016/j.compag.2015.06.010
  137. 137. Manley M, Mcgoverin CM, Engelbrecht P, Geladi P. Infuence of grain topography on near infrared hyperspectral images. Talanta. 2012; 89(2): 223-230. DOI: 10.1016/j.talanta.2011.11.086
  138. 138. Ambrose A, Kandpal LM, Kim MS, Lee WH, Cho BK. High speed measurement of corn seed viability using hyperspectral imaging. Infrared Physics and Technology. 2016; 75: 173-179. DOI: 10.1016/j.infrared.2015.12.008
  139. 139. McGoverin CM, Engelbrecht P, Geladi P, Manley M. Characterisation of non-viable whole barley, wheat and sorghum grains using nearinfrared hyperspectral data and chemometrics. Analytical and Bioanalytical Chemistry. 2011; 401(7): 2283-2289. DOI: 10.1007/s00216-011-5291-x
  140. 140. Polak A, Coutts FK, Murray P, Marshall S. Use of hyperspectral imaging for cake moisture and hardness prediction. IET Image Processing. 2019; 13(7): 1152-1160. DOI: 10.1049/iet-ipr.2018.5106
  141. 141. Chen H, Qiao H, Feng Q, Xu L, Lin Q, Cai K. Rapid Detection of Pomelo Fruit Quality Using Near-Infrared Hyperspectral Imaging Combined With Chemometric Methods. Frontiers in Bioengineering and Biotechnology. 2021; 8: 616943. DOI: 10.3389/fbioe.2020.616943
  142. 142. Gao Z, Shao Y, Xuan G, Wang Y, Liu Y, Han X. Real-time hyperspectral imaging for the in-field estimation of strawberry ripeness with deep learning. Artificial Intelligence in Agriculture. 2020; 4: 31-38. DOI: 10.1016/j.aiia.2020.04.003
  143. 143. Pu YY, Sun DW, Buccheri M, Grassi M, Cattaneo TMP, Gowen A. Ripeness classification of Bananito fruit (Musa acuminata, AA): a comparison study of visible spectroscopy and hyperspectral imaging. Food Anal. Methods. 2019; 12: 1693-1704.
  144. 144. González-Cabrera M, Domínguez-Vidal A, Ayora-Cañada M.J. Hyperspectral FTIR imaging of olive fruit for understanding ripening processes. Postharvest Biology and Technology. 2018; 145: 74-82. DOI: 10.1016/j.postharvbio.2018.06.008
  145. 145. Munera S, Amigo JM, Blasco J, Cubero S, Talens P, Aleixos N. Ripeness monitoring of two cultivars of nectarine using VIS-NIR hyperspectral reflectance imaging. Journal of Food Engineering. 2017; 214: 29-39. DOI: 10.1016/j.jfoodeng.2017.06.031
  146. 146. Zhang C, Guo C, Liu F, Kong W, He Y, Lou B. Hyperspectral imaging analysis for ripeness evaluation of strawberry with support vector machine. Journal of Food Engineering. 2016; 179: 11-18. DOI: 10.1016/j.jfoodeng.2016.01.002
  147. 147. Lu Y, Huang Y, Lu R. Innovative hyperspectral imaging-based techniques for quality evaluation of fruits and vegetables: a review. Applied Sciences 2017; 7(2): 1-36. doi:10.3390/app7020189
  148. 148. Zeng X, Miao Y, Ubaid S, Gao X, Zhuang S. Detection and classification of bruises of pears based on thermal images. Postharvest Biology and Technology. 2020; 161(1): 111090. DOI: 10.1016/j.postharvbio.2019.111090
  149. 149. Zhu X, Lia G. Rapid detection and visualization of slight bruise on apples using hyperspectral imaging. International journal of food properties. 2019; 22(1): 1709-1719. DOI: 10.1080/10942912.2019.1669638
  150. 150. Fan S, Li C, Huang W, Chen L. Detection of blueberry internal bruising over time using NIR hyperspectral reflectance imaging with optimum wavelengths. Postharvest Biology and Technology. 2017; 134: 55-66. DOI: 10.1016/j.postharvbio.2017.08.012
  151. 151. Zhang BH, Li JB, Fan SX, Huang WQ, Zhao CJ, Liu CL, Huang D.F. Hyperspectral Imaging Combined with Multivariate Analysis and Band Math for Detection of Common Defects on Peaches (Prunus Persica). Computers and Electronics in Agriculture. 2015; 114: 14-24. DOI: 10.1016/j.compag.2015.03.015
  152. 152. Qiang L, Mingjie T. Detection of Hidden Bruise on Kiwi fruit Using Hyperspectral Imaging and Parallelepiped Classification. Procedia Environmental Sciences. 2012; 12: 1172-1179. DOI: 10.1016/j.proenv.2012.01.404
  153. 153. Li J, Rao X, Ying Y. Detection of common defects on oranges using hyperspectral reflectance imaging. Computers and Electronics in Agriculture. 2011; 78(1): 38-48. DOI: 10.1016/j.compag.2011.05.010
  154. 154. Xing J, Baerdemaeker JD. Bruise detection on ‘Jonagold’ apples using hyperspectral imaging. Postharvest Biology and Technology. 2005; 37(2): 152-162. DOI: 10.1016/j.postharvbio.2005.02.015
  155. 155. Mehl PM, Chen YR, Kim MS, Chan DE. Development of hyperspectral imaging technique for the detection of apple surface defects and contaminations. Journal of Food Engineering. 2004; 61(1): 67-81. DOI: 10.1016/S0260-8774(03)00188-2
  156. 156. Kandpal LM, Cho BK. A review of the applications of spectroscopy for the detection of microbial contaminations and defects in agro foods. Journal of Biosystems Engineering. 2014; 39(3): 215-226. DOI: 10.5307/JBE.2014.39.3.215
157. Kim G, Kim GH, Park J, Kim DY, Cho BK. Application of infrared lock-in thermography for the quantitative evaluation of bruises on pears. Infrared Physics and Technology. 2014; 63: 133-139. DOI: 10.1016/j.infrared.2013.12.015
158. Lu R, Chen YR, Park B, Choi KH. Hyperspectral imaging for detecting bruises in apples. In: ASAE Annual International Meeting, Paper No. 993120, St. Joseph, Michigan. 1999
159. Lu R. Detection of bruises on apples using near-infrared hyperspectral imaging. Transactions of the ASAE. 2003; 46(2): 1-8. DOI: 10.13031/2013.12941
160. Changyeun M, Giyoung K, Moon SK, Jongguk L, Kangjin L, Wang-Hee L, Byoung-Kwan C. On-line fresh-cut lettuce quality measurement system using hyperspectral imaging. Biosystems Engineering. 2017; 156: 38-50. DOI: 10.1016/j.biosystemseng.2017.01.005
161. Yang B, Gao Y, Yan Q, Qi L, Zhu Y, Wang B. Estimation method of soluble solid content in peach based on deep features of hyperspectral imagery. Sensors. 2020; 20(18): 1-12. DOI: 10.3390/s20185021
162. Pu Y, Sun D, Riccioli C, Buccheri M, Grassi M, Cattaneo TM, Gowen A. Calibration transfer from micro NIR spectrometer to hyperspectral imaging: A case study on predicting soluble solids content of bananito fruit (Musa acuminata). Food Analytical Methods. 2018; 11(11): 1021-1033. DOI: 10.1007/s12161-017-1055-3
163. Ma T, Li X, Inagaki T, Yang H, Tsuchikawa S. Noncontact evaluation of soluble solids content in apples by near-infrared hyperspectral imaging. Journal of Food Engineering. 2017; 224: 53-61. DOI: 10.1016/j.jfoodeng.2017.12.028
164. Pu H, Liu D, Wang L, Sun DW. Soluble solids content and pH prediction and maturity discrimination of lychee fruits using visible and near-infrared hyperspectral imaging. Food Analytical Methods. 2016; 9(1): 235-244. DOI: 10.1007/s12161-015-0186-7
165. Li J, Peng YK, Chen LP, Huang WQ. Near-infrared hyperspectral imaging combined with CARS algorithm to quantitatively determine soluble solids content in “Ya” pear. Spectroscopy and Spectral Analysis. 2014; 34(5): 1264-1269. DOI: 10.3964/j.issn.1000-0593(2014)05-1264-06
166. Leiva-Valenzuela GA, Lu R, Aguilera JM. Prediction of firmness and soluble solids content of blueberries using hyperspectral reflectance imaging. Journal of Food Engineering. 2013; 115(1): 91-98. DOI: 10.1016/j.jfoodeng.2012.10.001
167. Baiano A, Terracone C, Peri G, Romaniello R. Application of hyperspectral imaging for prediction of physico-chemical and sensory characteristics of table grapes. Computers and Electronics in Agriculture. 2012; 87: 142-151. DOI: 10.1016/j.compag.2012.06.002
168. Rajkumar P, Wang N, ElMasry G, Raghavan GSV, Gariepy Y. Studies on banana fruit quality and maturity stages using hyperspectral imaging. Journal of Food Engineering. 2012; 108(1): 194-200. DOI: 10.1016/j.jfoodeng.2011.05.002
169. Mendoza F, Lu R, Ariana D, Cen H, Bailey B. Integrated spectral and image analysis of hyperspectral scattering data for prediction of apple fruit firmness and soluble solids content. Postharvest Biology and Technology. 2011; 62(2): 149-160. DOI: 10.1016/j.postharvbio.2011.05.009
170. Peng Y, Lu R. Analysis of spatially resolved hyperspectral scattering images for assessing apple fruit firmness and soluble solids content. Postharvest Biology and Technology. 2008; 48(1): 52-62. DOI: 10.1016/j.postharvbio.2007.09.019
171. ElMasry G, Wang N, ElSayed A, Ngadi M. Hyperspectral imaging for nondestructive determination of some quality attributes for strawberry. Journal of Food Engineering. 2007; 81(1): 98-107. DOI: 10.1016/j.jfoodeng.2006.10.016
172. El Fakir C, Poffo L, Billiot B, Besnard P, Goujon JM. Active hyperspectral mid-infrared imaging based on widely tunable QCL laser. In: 21st International Conference on Transparent Optical Networks (ICTON), Angers, France. 2019; 1-4. DOI: 10.1109/ICTON.2019.8840448
173. Buitrago MF, Groen TA, Hecker CA, Skidmore AK. Spectroscopic determination of leaf traits using infrared spectra. International Journal of Applied Earth Observation and Geoinformation. 2018; 69: 237-250. DOI: 10.1016/j.jag.2017.11.014
174. Ribeiro da Luz B, Crowley JK. Spectral reflectance and emissivity features of broad leaf plants: Prospects for remote sensing in the thermal infrared (8.0-14.0 μm). Remote Sensing of Environment. 2007; 109(4): 393-405. DOI: 10.1016/j.rse.2007.01.008
175. Ribeiro da Luz B. Attenuated total reflectance spectroscopy of plant leaves: a tool for ecological and botanical studies. New Phytologist. 2006; 172(2): 305-318. DOI: 10.1111/j.1469-8137.2006.01823.x
176. Chrzanowski K. Testing Thermal Imagers: Practical Guidebook. Warsaw: Military University of Technology; 2010. ISBN: 978-83-61486-81-7
177. Minkina W, Dudzik S. Infrared Thermography: Errors and Uncertainties. 1st ed. Chichester: Wiley; 2009. 212 p. ISBN-10: 0470747188
178. Ball M, Pinkerton H. Factors affecting the accuracy of thermal imaging cameras in volcanology. Journal of Geophysical Research Atmospheres. 2006; 111(B11): 1-14. DOI: 10.1029/2005JB003829
179. Chelladurai V, Jayas DS, White NDG. Thermal imaging for detecting fungal infection in stored wheat. Journal of Stored Products Research. 2010; 46(3): 174-179. DOI: 10.1016/j.jspr.2010.04.002
180. Baranowski P, Mazurek W, Witkowska-Walczak B, Sławiński C. Detection of early apple bruises using pulsed-phase thermography. Postharvest Biology and Technology. 2009; 53(3): 91-100. DOI: 10.1016/j.postharvbio.2009.04.006
181. Baranowski P, Mazurek W, Wozniak J, Majewska U. Detection of early bruises in apples using hyperspectral data and thermal imaging. Journal of Food Engineering. 2012; 110(3): 345-355. DOI: 10.1016/j.jfoodeng.2011.12.038
182. Kuzy J, Jiang Y, Li C. Blueberry bruise detection by pulsed thermographic imaging. Postharvest Biology and Technology. 2018; 136: 166-177. DOI: 10.1016/j.postharvbio.2017.10.011
183. Takacs R, Jovicic V, Rasic AZ, Geier D, Delgado A, Becker T. Evaluation of baking performance by means of mid-infrared imaging. Innovative Food Science and Emerging Technologies. 2020; 61: 102327. DOI: 10.1016/j.ifset.2020.102327
184. Lopez-Sanchez J, Fortuny-Guasch J. 3-D radar imaging using range migration techniques. IEEE Transactions on Antennas and Propagation. 2000; 48(5): 728-737. DOI: 10.1109/8.855491
185. Sheen D, McMakin D, Hall T. Three-dimensional millimeter-wave imaging for concealed weapon detection. IEEE Transactions on Microwave Theory and Techniques. 2001; 49(9): 1581-1592. DOI: 10.1109/22.942570
186. Cook BS, Tehrani B, Cooper JR, Tentzeris MM. Multilayer inkjet printing of millimeter-wave proximity-fed patch arrays on flexible substrates. IEEE Antennas and Wireless Propagation Letters. 2013; 12: 1351-1354. DOI: 10.1109/LAWP.2013.2286003
187. Scott S. Three-Dimensional Microwave Imaging for Indoor Environments [Ph.D. thesis]. University of California; 2017
188. Fear EC, Meaney PM, Stuchly MA. Microwaves for breast cancer detection? IEEE Potentials. 2003; 22(1): 12-18. DOI: 10.1109/MP.2003.1180933
189. Fear EC, Hagness SC, Meaney PM, Okonieweski M, Stuchluy MA. Enhancing breast tumor detection with near-field imaging. IEEE Microwave Magazine. 2002; 3(1): 48-56. DOI: 10.1109/6668.990683
190. Ambrosanio M, Scapaticci R, Crocco L. A simple quantitative inversion approach for microwave imaging in embedded systems. International Journal of Antennas and Propagation. 2015; 12: 1-18. DOI: 10.1155/2015/129823
191. Manickavasagan A, Jayasuriya H, editors. Imaging with Electromagnetic Spectrum. 1st ed. Berlin, Heidelberg: Springer; 2014. 204 p. DOI: 10.1007/978-3-642-54888-8
192. Pastorino M, editor. Microwave Imaging. 1st ed. John Wiley & Sons; 2010. 304 p. ISBN: 978-0-470-60247-8
193. Donelli M, Franceschini D, Rocca P, Massa A. Three-dimensional microwave imaging problems solved through an efficient multiscaling particle swarm optimization. IEEE Transactions on Geoscience and Remote Sensing. 2009; 47(5): 1467-1481. DOI: 10.1109/TGRS.2008.2005529
194. Benedetti M, Donelli M, Lesselier D, Massa A. A two-step inverse scattering procedure for the qualitative imaging of homogeneous cracks in known host media-preliminary results. IEEE Antennas and Wireless Propagation Letters. 2007; 6: 592-595. DOI: 10.1109/LAWP.2007.910954
195. Huang T, Mohan AS. A microparticle swarm optimizer for the reconstruction of microwave images. IEEE Transactions on Antennas and Propagation. 2007; 55(3 I): 568-576. DOI: 10.1109/TAP.2007.891545
196. Donelli M, Massa A. Computational approach based on a particle swarm optimizer for microwave imaging of two-dimensional dielectric scatterers. IEEE Transactions on Microwave Theory and Techniques. 2005; 53(5): 1761-1776. DOI: 10.1109/TMTT.2005.847068
197. Bort E, Donelli M, Martini A, Massa A. An adaptive weighting strategy for microwave imaging problems. IEEE Transactions on Antennas and Propagation. 2005; 53(5): 1858-1862. DOI: 10.1109/TAP.2005.846811
198. Caorsi S, Massa A, Pastorino M, Donelli M. Improved microwave imaging procedure for non-destructive evaluations of two-dimensional structures. IEEE Transactions on Antennas and Propagation. 2004; 52(6): 1386-1397. DOI: 10.1109/TAP.2004.830254
199. Giordano A. Microwave Imaging Technology for Food Contamination Monitoring [MSc thesis]. College of Electronic Engineering, Telecommunications and Physics, Politecnico di Torino; 2018
200. Ragni L, Al-Shami A, Mikhaylenko G, Tang J. Dielectric characterization of hen eggs during storage. Journal of Food Engineering. 2007; 82(4): 450-459. DOI: 10.1016/j.jfoodeng.2007.02.063
201. Guo W, Trabelsi S, Nelson SO, Jones DR. Storage effects on dielectric properties of eggs from 10 to 1800 MHz. Journal of Food Science. 2007; 72: 335-340. DOI: 10.1111/j.1750-3841.2007.00392.x
202. Tai TC, Wu HW, Hung CY, Wang YH. Food security sensing system using a waveguide antenna microwave imaging through an example of an egg. Sensors. 2020; 20(3): 699. DOI: 10.3390/s20030699
203. Abdullah MZ, Guan LC, Lim KC, Karim AA. The applications of computer vision system and tomographic radar imaging for assessing physical properties of food. Journal of Food Engineering. 2004; 61(1): 125-135. DOI: 10.1016/S0260-8774(03)00194-8
204. Sándor K, Zsolt P, Ádám C, György B, Tamás M, Tamás D. Non-destructive imaging and spectroscopic techniques to investigate the hidden-lifestyle arthropod pests: a review. Journal of Plant Diseases and Protection. 2020; 127: 283-295.
205. Singh V, Sharma N, Singh S. A review of imaging techniques for plant disease detection. Artificial Intelligence in Agriculture. 2020; 4: 229-242. DOI: 10.1016/j.aiia.2020.10.002
206. Hart AG, Bowtell RW, Köckenberger W, Wenseleers T, Ratnieks FLW. Magnetic resonance imaging in entomology: a critical review. Journal of Insect Science. 2003; 3(5): 1-9. DOI: 10.1673/031.003.0501
207. Joyce DC, Hockings PD, Mazucco RA, Shorter AJ. 1H-Nuclear magnetic resonance imaging of ripening ‘Kensington Pride’ mango fruit. Functional Plant Biology. 2002; 29(7): 873-879. DOI: 10.1071/PP01150
208. Thybo AK, Szczypinski PM, Karlsson AH, Dønstrup S, Stødkilde-Jørgensen HS, Andersen HJ. Prediction of sensory texture quality attributes of cooked potatoes by NMR-imaging (MRI) of raw potatoes in combination with different image analysis methods. Journal of Food Engineering. 2004; 61(1): 91-100. DOI: 10.1016/S0260-8774(03)00190-0
209. Koizumi M, Naito S, Ishida N, Haishi T, Kano H. A dedicated MRI for food science and agriculture. Food Science and Technology Research. 2008; 14(1): 74-82. DOI: 10.3136/fstr.14.74
210. Taglienti A, Massantini R, Botondi R, Mencarelli F, Valentini M. Postharvest structural changes of Hayward kiwifruit by means of magnetic resonance imaging spectroscopy. Food Chemistry. 2009; 114(4): 1583-1589. DOI: 10.1016/j.foodchem.2008.11.066
211. Ciampa A, Dell’Abate MT, Masetti O, Valentini M, Sequi P. Seasonal chemical-physical changes of PGI Pachino cherry tomatoes detected by magnetic resonance imaging. Food Chemistry. 2010; 122(4): 1253-1260. DOI: 10.1016/j.foodchem.2010.03.078
212. Zhang L, McCarthy MJ. Measurement and evaluation of tomato maturity using magnetic resonance imaging. Postharvest Biology and Technology. 2012; 67: 37-43. DOI: 10.1016/j.postharvbio.2011.12.004
213. Melado-Herreros A, Muñoz-García MA, Blanco A, Val J, Fernández-Valle M, Barreiro P. Assessment of watercore development in apples with MRI: Effect of fruit location in the canopy. Postharvest Biology and Technology. 2013; 86: 125-133. DOI: 10.1016/j.postharvbio.2013.06.030
214. Divya S, Thyagarajan D, Sujatha G. Magnetic resonance imaging technology for process control and quality maintenance in food quality operation. International Journal of Engineering and Technology (IJET). 2013; 4(6): 441-449. ISSN: 0975-4024
215. Van As H, Van Duynhoven J. MRI of plants and foods. Journal of Magnetic Resonance. 2013; 229: 25-34. DOI: 10.1016/j.jmr.2012.12.019
216. Chizhik VI, Chernyshev YS, Donets AV, Frolov VV, Komolkin AV, Shelyapina MG. Basic principles of detection of nuclear magnetic resonance. In: Chizhik VI, Chernyshev YS, Donets AV, Frolov VV, Komolkin AV, Shelyapina MG. Magnetic Resonance and Its Applications. 1st ed. Cham: Springer; 2014. p. 93-162. DOI: 10.1007/978-3-319-05299-1_2
217. Herremans E, Melado-Herreros A, Defraeye T, Verlinden B, Hertog M, Verboven P, Val J, Fernández-Valle ME, Bongaers E, Estrade P, Wevers M, Barreiro P, Nicolaï BM. Comparison of X-ray CT and MRI of watercore disorder of different apple cultivars. Postharvest Biology and Technology. 2014; 87: 42-50. DOI: 10.1016/j.postharvbio.2013.08.008
218. Patel KK, Khan MA, Kar A. Recent developments in applications of MRI techniques for foods and agricultural produce—An overview. Journal of Food Science and Technology. 2015; 52(1): 1-26. DOI: 10.1007/s13197-012-0917-3
219. Pflugfelder D, Metzner R, van Dusschoten D, Reichel R, Jahnke S, Koller R. Non-invasive imaging of plant roots in different soils using magnetic resonance imaging (MRI). Plant Methods. 2017; 13(102): 1-9. DOI: 10.1186/s13007-017-0252-9
220. Kamal T, Cheng S, Khan IA, Nawab K, Zhang T, Song Y, Wang S, Nadeem M, Riaz M, Khan MAU, Zhu BW, Tan M. Potential uses of LF-NMR and MRI in the study of water dynamics and quality measurement of fruits and vegetables. Journal of Food Processing and Preservation. 2019; 43(474): 1-21. DOI: 10.1111/jfpp.14202
221. Baek S, Lim J, Lee JG, McCarthy MJ, Kim SM. Investigation of the maturity changes of cherry tomato using magnetic resonance imaging. Applied Sciences. 2020; 10(15): 5188. DOI: 10.3390/app10155188
