Open access peer-reviewed chapter

Advanced Optical Technologies in Food Quality and Waste Management

Written By

John Chauvin, Ray Duran, Stanley Ng, Thomas Burke, Kenneth Barton, Nicholas MacKinnon, Kouhyar Tavakolian, Alireza Akhbardeh and Fartash Vasefi

Submitted: June 17th, 2020 Reviewed: April 7th, 2021 Published: June 29th, 2021

DOI: 10.5772/intechopen.97624



Food waste is a global problem caused in large part by premature food spoilage. Seafood is especially prone to food waste because it spoils easily. Of the annual 4.7 billion pounds of seafood destined for U.S. markets between 2009 and 2013, 40 to 47 percent ended up as waste. This problem is due in large part to a lack of available technologies to enable rapid, accurate, and reliable valorization of food products from boat or farm to table. Fortunately, recent advancements in spectral sensing technologies and spectroscopic analyses show promise for addressing this problem. Not only could these advancements help to solve hunger issues in impoverished regions of the globe, but they could also benefit the average consumer by enabling intelligent pricing of food products based on projected shelf life. Additional technologies that enforce trust and compliance (e.g., blockchain) could further serve to prevent food fraud by maintaining records of spoilage conditions and other quality validation at all points along the food supply chain, and could provide improved transparency regarding contract performance and attribution of liability. In this chapter, we discuss technologies that have enabled the development of hand-held spectroscopic devices for detecting food spoilage. We also discuss some of the analytical methods used to classify and quantify spoilage based on spectral measurements.


Keywords

  • spoilage
  • valorization
  • spectroscopy
  • hyperspectral imaging
  • artificial intelligence

1. Introduction

Food waste is a significant problem in both developed and developing economies [1]. The global seafood industry faces unprecedented challenges as demand increases, consumer preferences change, and expectations of quality increase. Consumers are demanding more transparency and a commitment to sustainability while access to at-capacity or overfished fishery resources is strained and food production and retail profit margins are thin. Improper food storage along global food supply chains leads to by-product waste. Whether through a lack of quality control or a delay in the distribution process, food spoilage can occur even before the product reaches markets. Approximately 35% of fish are lost to waste globally, with between 30% and 35% loss in most regions of the world [2]. Seafood’s perishability is largely to blame.

A major goal to improve the overall valorization of food and reduce agro-food waste or diversion to by-products is earlier and rapid detection of spoilage. Like medical evaluations for disease, early detection can lead to quicker response from manufacturers or consumers to increase the shelf life of food products. Microbiologists and food scientists have developed a variety of methods to detect surface microbes and pathogenic microorganisms including culturing and colony-counting methods, polymerase chain reaction (PCR)-based amplification for DNA analysis, immunoassay analysis, chromatography, and mass spectrometry [3, 4]. Unfortunately, these techniques have limited versatility and restrictive methodologies that are not practical for on-site and on-demand food quality and safety control [3, 4, 5]. However, spectroscopic technologies have shown great promise for enabling early detection of spoilage to help minimize food waste.

Another problem affecting consumers and contributing to global food waste is the lack of transparent pricing for food products as a function of shelf life. Alongside government and industry regulation, intelligent dynamic pricing based on projected shelf life at retail and other upstream points along the supply chain can encourage efforts to reduce waste. This requires new tools for tracking food products at all points along the supply chain. These tools must be easy to incorporate, objective, and verifiable, and they must provide data on quality, provenance, and freshness.

A pioneer in the development of food quality and traceability technologies, SafetySpect is developing a new handheld quality, adulteration, and traceability (QAT) scanner to address many of these issues. Utilizing hyperspectral multi-mode technology to provide species identification and direct measurements of freshness/spoilage in a handheld device can address challenges of waste and mislabeling. In seafood and meat processing, distribution, and storage supply chains in developed and developing economies, this is likely to meaningfully decrease food waste and increase sustainable access to safe, healthy, and nutritious foods. It will also decrease costs and increase profit within supply chains by providing better attribution of liability and verification of supply contract performance. This transparency will give upstream supply chain participants incentives to improve operational methods that would otherwise result in product degradation or unnecessary, accelerated spoilage.

1.1 Current trends for examining fish quality

The main approach to improving valorization and by-product management is early detection of spoilage. A common method for detecting spoilage in fish is the Torry Freshness Score [6]. This systematic scoring method was developed in the UK to provide an objective assessment of fish quality. It uses the human senses to examine specific parts of the fish. For example, an evaluator will observe gill odors, skin tension, opaqueness of the eyes, and overall smell of the fish and provide a freshness rating from 0 (lowest) to 10 (highest). However, this manual approach to evaluating fish samples is time-consuming and may be more susceptible to evaluator bias or human error. This motivates the development of technologies that enable rapid evaluation of fish quality with minimal human interpretation.

Spectroscopic approaches offer a robust, non-destructive means of detecting and evaluating the extent of food quality issues. In recent decades, advancements in micro-electro-mechanical systems (MEMS) and micro-electro-opto-mechanical systems (MEOMS) have enabled the development of miniaturized spectroscopic devices that can be used for analysis at all points along the food supply chain, from farm fields to distribution centers to retail markets. Hyperspectral imaging (HSI) combines spectroscopy and imaging to enable evaluation of an object’s spectroscopic composition at a high spatial resolution, thus providing a more comprehensive evaluation tool for any given sample [7, 8, 9]. As the scale and complexity of food supply networks continue to grow, there is an ever-increasing need for low-cost, portable analytical devices to combat the corresponding growth in vulnerability of food products to adulteration, contamination, and fraud [10]. In the next section, we discuss a variety of technologies that have enabled the recent development of portable and handheld spectroscopic devices that can and have been used for evaluating the quality of food products.


2. Spectroscopy and hyperspectral imaging technologies

2.1 Infrared spectroscopy

One of the most common approaches used for quality control of food products involves the analysis of vibrational spectra via infrared spectroscopy. The spectral peaks and valleys formed by the fundamental vibrational modes (and their harmonics) of key structures within organic molecules can be used to detect the presence of abnormalities or measure the abundances of specific chemical components. The near infrared (NIR) and mid-infrared (MIR) spectral regions are of high interest in food analysis applications.

2.1.1 Infrared detectors

The rapid proliferation of visible digital camera technology over the past few decades is due in large part to the use of inexpensive silicon-based detectors, which can sense wavelengths in the visible region and in the infrared region up to about 1050 nm. For longer wavelengths, however, detectors composed of different materials are required. Indium gallium arsenide (InGaAs) detectors have become the dominant infrared detector technology on the market, surpassing germanium (Ge), lead sulfide (PbS), and lead selenide (PbSe) detectors [11]. Unfortunately, these detectors are generally more costly than silicon-based detectors. Furthermore, for wavelengths beyond 1700 nm, the noise in these detectors becomes so high that cooling is required to keep it at a manageable level [12]. To minimize the cost of these more expensive detectors, developers of handheld infrared spectrometers have sought to simplify detector designs by reducing the number of elements required.

Near-infrared spectroscopy (NIRS)

NIRS covers the approximate wavelength range of 780 to 2500 nm. Within this range lie signals from the vibration of organic chemical bonds such as oxygen-hydrogen (O-H), carbon-hydrogen (C-H), nitrogen-hydrogen (N-H), and sulfur-hydrogen (S-H), as well as their overtones [13]. Instrument cost and robustness are generally better for NIR than for MIR [14]. However, NIR spectral peaks tend to be weak and broad, with significant overlapping of absorption peaks because of a combination of vibrational spectra from multiple chemical bonds, making straightforward interpretation difficult, if not impossible [15]. Spectral preprocessing techniques (e.g., smoothing, detrending, and taking derivatives) and multivariate statistical methods (e.g., nonlinear partial least squares, Fisher discriminant analysis, and artificial neural networks) are invoked to extract the information hidden in the spectra. Despite these disadvantages, the advantages in terms of lower cost, increased safety for the environment and operators, and superior chemical specificity and applicability to a broad range of sample types have made NIR spectroscopy a popular approach for food analysis [11, 15]. These advantages have in turn encouraged the development of numerous portable NIR spectrometers based on a variety of designs.

Dispersive NIR spectrometers

In conventional dispersive spectrometer designs, broadband light is passed through the sample and into an entrance slit to create a narrow line of light which is imaged onto a detector. A dispersive grating is inserted in the path causing the image of the narrow line to be spread out into a spectrum where the light is separated into its various wavelengths. These are then focused onto an array detector. Figure 1 shows a basic dispersive spectrometer based on the Czerny-Turner design which uses mirrors to minimize the overall size of the design.

Figure 1.

Dispersive spectrometer based on the Czerny-Turner design [16].

One major disadvantage of the design shown in Figure 1 is the need for an array InGaAs detector which can be expensive. Alternative designs requiring only a single-element detector have been developed to help mitigate this expense. For example, instead of collecting all wavelengths simultaneously, spectrometers based on the Fabry-Perot interferometer design shown in Figure 2 collect the wavelengths sequentially. This design features one fixed and one moveable half-silvered mirror aligned along the same optical axis. As the light bounces between the mirrors, constructive and destructive interference determines the spectrum of light that passes through the other side of the moveable mirror and onto the detector. When the spacing between the fixed and moveable mirrors equals an integer number of half wavelengths, maximum constructive interference occurs leading to a peak in the output spectrum at that wavelength. As with the Michelson interferometer, the spectral range of interest is thus examined by translating the moveable mirror over a specific spatial extent. An example of a compact instrument based on the Fabry-Perot design is Spectral Engines’ MEMS Fabry-Perot spectrometer, the NIRONE [18].
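The resonance condition described above can be sketched numerically with the ideal Airy transmission function. This is a toy illustration, not a model of any particular instrument; the mirror gap, finesse coefficient, and wavelength range below are arbitrary values chosen for demonstration:

```python
import numpy as np

def fp_transmission(gap_nm, wavelength_nm, finesse_coeff=200.0):
    """Airy transmission of an ideal lossless Fabry-Perot cavity at normal incidence."""
    delta = 4.0 * np.pi * gap_nm / wavelength_nm   # round-trip phase difference
    return 1.0 / (1.0 + finesse_coeff * np.sin(delta / 2.0) ** 2)

# Peak condition: gap = m * wavelength / 2 for integer order m, so a gap of
# 3 * 1550 / 2 = 2325 nm passes 1550 nm light in order m = 3.
gap = 3 * 1550.0 / 2.0
wavelengths = np.linspace(1500.0, 1600.0, 2001)
t = fp_transmission(gap, wavelengths)
peak_wl = wavelengths[np.argmax(t)]   # transmission maximum near 1550 nm
```

Sweeping `gap` over a range of spacings moves the transmission peak across the band, which is how translating the moveable mirror scans the spectrum of interest.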

Figure 2.

Fabry-Perot spectrometer design [17].

Linear variable filter (LVF) NIR spectrometers

One problem common to all dispersive designs like the Czerny-Turner is that the light must be allowed to disperse over a given spatial extent such that the wavelengths can separate before reaching the detector. This causes limitations in designing for compactness [19]. One way to surmount this problem is with the use of LVFs which are generally formed from wedge-shaped optics and behave much like a Fabry-Perot interferometer but scan by lateral position along the filter instead of by movement of a mirror along the optical axis [12]. The LVF can be applied directly to a detector array, leading to a simple and compact mechanical design with no moving parts (see Figure 3). Viavi’s MicroNIR OnSite features an LVF applied to a 128-pixel InGaAs array [21].

Figure 3.

Diagram showing the operating principle behind the LVF used in the MicroNIR OnSite [20].

Hadamard spectrometers

The Hadamard spectrometer design has a couple of key advantages over the conventional dispersive design. First, it overcomes the slow scanning process of dispersive techniques where individual wavelengths must be collected one after another. This is often referred to as the multiplexing or Fellget advantage. Additionally, Hadamard spectrometers tend to be more sensitive and the sensors themselves have a higher optical throughput, resulting in what is termed the Jacquinot advantage [22]. Figure 4 shows the basic layout for this type of spectrometer, the key component of which is the mask positioned just before the focusing lens. This mask blocks out a certain portion (usually ~50%) of the diffracted light at a time. The blocking elements are moved in discrete steps to form a binary matrix where the elements of each successive row are shifted by one position to the left. The detector readings are recorded with each shift and pieced together to form a data vector. A Hadamard transformation using the binary matrix is applied to the data vector to yield the measured spectrum [12].

Figure 4.

Basic design layout for a Hadamard spectrometer [23].
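The mask-and-reconstruct scheme can be sketched with a small cyclic S-matrix. This is a toy seven-channel example with arbitrary spectrum values, not the mask pattern of any actual instrument:

```python
import numpy as np

# Cyclic S-matrix of order 7: each row is the previous row shifted one position,
# with 1 = mask element open and 0 = mask element blocked.
row = np.array([1, 1, 1, 0, 1, 0, 0])
S = np.array([np.roll(row, -k) for k in range(7)])

true_spectrum = np.array([0.0, 2.0, 5.0, 1.0, 0.0, 3.0, 0.5])

# Each single-element detector reading sums roughly half of the spectral
# channels at once (the multiplex, or Fellget, advantage).
readings = S @ true_spectrum

# Inverting the known mask pattern recovers the full spectrum.
recovered = np.linalg.solve(S, readings)
```

Because every reading mixes several channels, each spectral value is effectively measured many times, which is where the signal-to-noise benefit over one-wavelength-at-a-time scanning comes from.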

The Hadamard mask itself can be implemented in a variety of ways. Early designs used a coil and magnet to move the mask linearly in front of a separate entrance slit (see Figure 5(a)). The microPHAZIR NIR spectrometer by Thermo Fisher Scientific uses a programmable MEMS diffraction grating as the Hadamard mask. Texas Instruments offers two NIR Hadamard spectrometer devices based on its Digital Light Projection (DLP®) digital micromirror device (DMD) technology, the DLP NIRscan and the DLP NIRscan Nano (see Figure 5(b)) [12]. The DMD contains an array of individually pivotable micromirrors which can be aligned and flipped in sequence to form the Hadamard mask [12].

Figure 5.

Hadamard spectrometer designs and devices. (a) a coil and magnet [23] mask design; (b) layout of Texas Instruments’ DLP® DMD-based NIRscan Nano optical engine [12].

Applications of handheld NIR spectrometers for food analysis

Advancement in MEMS technology and LVFs has led to the rapid miniaturization of NIR spectrometers, thus enabling the development of portable NIR spectrometers. Portable NIR spectrometers have been used to evaluate the quality of fruits and vegetables primarily during the pre-harvest stage while on the vine/tree. Such analyses focus on maturity parameters to enable determination of optimal harvest dates and include measurements of soluble solids content (SSC), titratable acidity, pH, weight, size, firmness, juice content, juice weight, pericarp thickness, and others [15].

NIR analyses of meat and fish are typically performed for shelf-life estimation and freshness evaluation. Examples include traceability analysis of pasture-fed lambs and stall-fed lambs, authenticity testing for pork and pork fat in veal sausages, moisture, protein, and fat analysis in Iberian pork muscles, fat characterization in Iberian ham, freshness evaluation in beef sirloin and beef eye of round, shelf life estimation of pork meat, and monitoring and control of the drying process in fermented sausages [13].

Portable NIR spectrometers have also been used to measure quality factors in milk and beverages. Components such as fat, protein, lactose, and moisture percentages have been measured to determine milk quality [15], and NIR spectral differences have been exploited to distinguish between organic and non-organic milk products [24]. Quality of rice wine, tea drinks, and beers have been evaluated via NIR measurement of alcohol, nitrogen, apparent extract, and non-sugar solids percentages, polyphenol and free amino acid concentrations, and bitterness and beer distinction factors [15].

2.2 Mid-infrared spectroscopy

The mid-infrared (MIR) spectrum covers a range of wavelengths from ~2500 nm to ~5000 nm and contains many of the fundamental absorption bands of organic components. Spectra in this range are very sensitive to chemical composition, leading to high specificity. Furthermore, organic functional groups produce well-delineated absorption bands in this region, a feature that can be exploited to individually separate different components present in a mixture by their unique fingerprints in the absorption spectrum [14]. Given the high cost of InGaAs detectors and the need for cooling to lower noise to a manageable level, MIR spectrometers generally feature single-element detector designs. Most exploit a technique based on the Fourier transform.

2.2.1 Fourier transform infrared spectrometer (FTIR)

One subset of FTIR spectrometers is based on the Michelson interferometer design made famous by the Michelson-Morley ether-drift experiments. This interferometer consists of two optical pathways oriented perpendicular to one another (see Figure 6). A collimated broadband light source enters from the left and strikes a half-silvered mirror (i.e., beam-splitter) oriented at 45°. Half of the beam then passes through this mirror and strikes a stationary mirror at the end of the pathway where it is reflected back toward the beam-splitter. The other half of the incident beam is directed toward a mirror that is allowed to move back and forth along this pathway. Upon reflection from this mirror and arrival at the beam-splitter, the two reflected beams produce an interference pattern which is focused on a single-element detector. The sample is typically inserted between the beam-splitter and the detector. After the moving mirror is swept through its full range of motion and the full interferogram recorded, these patterns are processed to produce the spectrum. Processing in this case is done with a Fourier transform which converts the sensor response as a function of spatial mirror position to a function of frequency. The Fourier transform accomplishes this by determining the optimal mixture of sine and cosine functions that can replicate the sensor response.

Figure 6.

Diagram of an FTIR spectrometer based on the Michelson interferometer [25].
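The interferogram-to-spectrum step can be sketched with a synthetic example. Here a source with two narrow lines (at 1000 and 1500 cm^-1, chosen arbitrarily for illustration) produces a simulated interferogram, and a fast Fourier transform recovers the two lines:

```python
import numpy as np

n = 4000
dx = 1.0e-4                      # mirror-position (optical path difference) step, in cm
x = np.arange(n) * dx
# Detector signal vs. mirror position: each spectral line contributes a cosine.
interferogram = np.cos(2 * np.pi * 1000.0 * x) + 0.5 * np.cos(2 * np.pi * 1500.0 * x)

# The Fourier transform converts position-domain data into wavenumber-domain data.
spectrum = np.abs(np.fft.rfft(interferogram))
wavenumbers = np.fft.rfftfreq(n, d=dx)           # in cm^-1

peaks = np.sort(wavenumbers[np.argsort(spectrum)[-2:]])   # the two strongest lines
```

The spectral resolution is set by the total mirror travel (here 1/(n·dx) = 2.5 cm^-1), which is why longer mirror sweeps yield finer spectra.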

Like Hadamard spectrometry, FTIR spectrometry has several advantages over dispersive spectrometry such as that used in most NIR spectrometers. It enjoys both the multiplexing (Fellget) advantage and the throughput (Jacquinot) advantage. This latter characteristic serves to significantly reduce the noise in the sensor output. As this design includes only one moving component, the mirror in the path with the variable length, FTIR instruments have a mechanical design that is highly robust to breakdown. Finally, many FTIR instruments include a HeNe laser that acts as an internal calibration standard, eliminating the need for calibration during operation (Connes advantage) [26]. SiWare Systems’ NeoSpectra-Scanner is an FTIR NIR spectrometer with a MEMS-based Michelson interferometer design [27].

Another popular design for FTIR spectrometers is based on the property of attenuated total reflection (ATR). As shown in Figure 7, broadband infrared light is directed into a high refractive index crystal typically made of germanium, silicon, zinc sulfide, or diamond [28]. The ends of this crystal are cut such that the angle of incidence for the light will result in total internal reflection through the crystal. Although the light wave does not propagate outside of the crystal, an evanescent wave can still pass through the top of the crystal where the sample is placed. This evanescent wave interacts with the sample, which absorbs portions of the infrared light, resulting in an attenuation of the light that reaches the detector. One of the primary advantages of this technique is that the light does not have to travel through the entire sample as it does for other designs, which often results in severe attenuation and loss of signal. An example of a handheld FTIR spectrometer based on the ATR design is the Ocean MZ5, a miniature ATR-FTIR spectrometer produced by Ocean Optics [29].

Figure 7.

Diagram illustrating the ATR concept [28].

2.2.2 Applications of handheld MIR spectrometers for food analysis

MIR spectrometers have long been used for food analysis, but most analyses have been conducted in a laboratory setting. Examples include detection of food spoilage bacteria in meat and dairy produce, brand authentication of a range of Trappist beers, and adulteration of milk and of beef burgers [10]. More recently, portable MIR devices have been used for the simultaneous analysis of sugar and amino acid concentrations in raw potato tubers, the measurement of quality factors in tomato juices, and the measurement of fatty acid content of marine oil dietary supplements [30].

2.3 Raman spectroscopy

Raman spectroscopy is often seen as complementary to infrared spectroscopy given the related nature of the phenomena involved. While infrared spectroscopy measures the absorption of energy, Raman spectroscopy measures the exchange of energy with radiation provided by a monochromatic light source (usually a laser with a wavelength in the ultraviolet to NIR range). This exchange causes a shift in the source’s wavelength. Molecules are infrared active only if the vibration induced by the source results in a change to the dipole moment, whereas the Raman shift is caused by changes in the molecules’ polarization [10]. Thus, these two methods provide mutually exclusive information. Raman peaks tend to be much sharper than infrared peaks and data collection tends to be faster, but the Raman effect is inherently weaker. Furthermore, Raman spectrometers tend to be more expensive to manufacture than their infrared counterparts.

2.3.1 Raman spectrometers

Figure 8 shows an example design for a Raman spectrometer. Light from the laser is directed to the sample, and the output is passed through a notch filter that rejects the laser line, leaving only the Raman-scattered light. A spectrograph grating then disperses this light into its constituent wavelengths and onto a detector. Metrohm’s Mira M-1 is an example of a portable Raman spectrograph with a 785 nm laser [32]. Laser wavelengths for other Raman spectrometers can range from the ultraviolet (UV) to the NIR bands. Since spectral sensitivity and resolution increase with decreasing laser wavelength, UV lasers tend to be optimal for applications featuring bio-molecules [33].

Figure 8.

Basic diagram of a Raman spectrometer [31].
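Raman shifts are conventionally reported in wavenumbers (cm^-1) relative to the excitation line, and the conversion between laser wavelength, scattered wavelength, and shift is a one-liner. The 785 nm laser and the 1000 cm^-1 band below are illustrative values only:

```python
def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman (Stokes) shift in cm^-1 from the laser and scattered wavelengths in nm."""
    return 1e7 / laser_nm - 1e7 / scattered_nm

def scattered_wavelength_nm(laser_nm, shift_cm1):
    """Wavelength at which a band with the given Raman shift appears."""
    return 1e7 / (1e7 / laser_nm - shift_cm1)

# With a 785 nm laser, a band shifted by 1000 cm^-1 appears near 852 nm.
band_nm = scattered_wavelength_nm(785.0, 1000.0)
```

This conversion also shows why shorter laser wavelengths pack the Raman bands into a narrower wavelength interval, improving spectral resolution for a given detector.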

2.3.2 Applications of handheld Raman spectrometers for food analysis

Recent applications of portable and handheld Raman spectrometers for food analysis include the detection of organophosphate and organothiophosphate pesticides on apple skins, the detection of fungicides and parasiticides on citrus fruits and bananas, authenticity and origin of vegetable and essential oils, detection of marker compounds for illegal alcoholic beverages, detection of adulteration in beef burgers, identification of rapid meat spoilage, and prediction of pork quality on a slaughterhouse line [10].


3. Artificial intelligence and machine learning techniques for spectral analysis

Once the spectroscopic data has been collected, sophisticated algorithms, and capable processors to host them, are needed to convert the data into useful information. The microchip revolution that started in the 1960s has continued unabated [34], and silicon vendors continue to innovate even with mature technologies such as field programmable gate arrays (FPGAs) [35], whose transistor counts now exceed one billion. This growth in computing power, coupled with advances in spectroscopy, has enabled the implementation of modern machine learning algorithms that can lead to significant positive changes in food safety and in the detection of adulteration and fraud.

In this section, we discuss key machine learning algorithms that have been applied to spectroscopy in general and to food valorization applications specifically. We first examine methods used to extract non-redundant, information-rich features from the data that can be used for accurate classification and quantification of food spoilage and food quality.

3.1 Feature extraction

Most spectral datasets contain subsets of features that are highly redundant or subject to high amounts of noise. The inclusion of such features in a classification or regression algorithm generally leads to suboptimal performance. Feature extraction is the process by which redundant or noisy features are removed from the dataset, leaving a smaller set of features with a high amount of signal content. Here, we discuss popular methods for feature extraction that have been used for spectroscopic applications.

3.1.1 Principal component analysis (PCA)

PCA is a common method of feature extraction enabled through dimensionality reduction. PCA provides dimensionality reduction by representing the variance in the data within the smallest number of components possible. Each principal component is a linear combination of the original components and is calculated in an iterative fashion by identifying the weight vector that, when applied to the original data components, captures the largest amount of the remaining variance while remaining orthogonal to the previously calculated principal components. As a result, the majority of the variance (typically ~99% or more) is contained within the first few (typically 3–5) principal components, meaning the others can be safely ignored with negligible loss of information. These principal components are also eigenvectors of the data’s covariance matrix and can be computed by eigendecomposition. The corresponding eigenvalues are proportional to the variance represented within each principal component and can be used to identify the principal components which are considered “significant.” According to the Kaiser criterion [36], eigenvectors with eigenvalues less than 1 can be considered insignificant.

In addition to its dimensionality-reduction benefit, PCA tends to yield principal components that provide good separability between data collected from different classes. It is this property that makes PCA such an effective tool for feature extraction. Principal components can also provide qualitative clues to key underlying molecular constituents and their relative abundances, since the spectral characteristics of those constituents often appear as the key features in the second, third, and higher principal components.

PCA is widely used in food chemistry studies [37] and specifically for analysis of food spoilage. For example, in 2020 Saleem et al. [38] presented a new method for predicting microbial spoilage and detecting its location in bakery goods using HSI. HSI cameras monitored baked goods over a period of time as they were allowed to spoil. PCA was applied to difference images created by subtracting images collected at the beginning of the monitoring period, when the goods were fresh, and images collected at later times. The researchers then used PCA to separate pixels representing spoiled portions of the goods from unspoiled portions.
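The eigendecomposition route to PCA and the Kaiser criterion can be sketched in a few lines of numpy. The "spectra" here are synthetic: six bands mixed from two latent factors, with arbitrary mixing weights and noise level chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two latent factors mixed into six observed "bands" plus a little noise,
# so we expect exactly two significant principal components.
mixing = np.array([[1.0, 0.866, 0.5, 0.0, -0.5, -0.866],
                   [0.0, 0.5, 0.866, 1.0, 0.866, 0.5]])
latent = rng.normal(size=(100, 2))
spectra = latent @ mixing + 0.05 * rng.normal(size=(100, 6))

# Standardize, then eigendecompose the covariance (correlation) matrix.
z = (spectra - spectra.mean(axis=0)) / spectra.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # sort descending

# Kaiser criterion: components with eigenvalue > 1 are "significant".
n_significant = int(np.sum(eigvals > 1.0))
scores = z @ eigvecs[:, :n_significant]              # PC scores (projected data)
```

In a spoilage study, class separation (e.g., fresh vs. spoiled pixels) would be inspected in the `scores` space rather than in the raw bands.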

3.1.2 Sparse representation

Similar in concept to PCA, sparse representation methods are mathematical processes applied to data with the goal of transforming the data to a new representation containing as few non-zero elements as possible. This is achieved by conducting a trade-off between goodness-of-fit and sparsity. The transformation attempts to produce an accurate reproduction of the original data but is regularized with a cost penalty that increases with the number of non-zero components. Example sparse representation algorithms include basis pursuit, sparse dictionary learning, L1-regularization, and non-negative matrix factorization (NMF) [39].

With respect to HSI, sparsity can also be enforced through wavelength selection processes that identify a small number of information-rich wavelengths and discard all other wavelengths. Lei and Sun [40] developed a sparse coefficients wavelength selection and regression (SCWR) method for NIR spectral calibration to select the wavelengths that contributed most to the determination of the spectral response. They applied this method to a dataset of NIR spectra from potatoes with dehydration loss as the response variable. A model based on 23 selected wavelengths (from an original set of 200) predicted dehydration loss with an accuracy that exceeded those yielded by common competing methods.
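The SCWR algorithm itself is beyond our scope, but the underlying idea of keeping only a few informative wavelengths can be sketched with plain L1 regularization (the Lasso), solved here by iterative soft-thresholding (ISTA). The data are synthetic and the dimensions arbitrary; only two of the thirty "wavelengths" actually drive the response:

```python
import numpy as np

def lasso_ista(X, y, lam=0.05, n_iter=1000):
    """Minimal ISTA solver for the Lasso: min ||y - Xw||^2 / (2n) + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = w - step * grad
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # soft-threshold
    return w

rng = np.random.default_rng(1)
# 100 samples x 30 "wavelengths"; only wavelengths 5 and 20 carry signal.
X = rng.normal(size=(100, 30))
y = 2.0 * X[:, 5] - 1.5 * X[:, 20] + 0.01 * rng.normal(size=100)

w = lasso_ista(X, y)
selected = np.flatnonzero(np.abs(w) > 0.1)   # surviving "informative" wavelengths
```

The soft-threshold step is what forces most coefficients exactly to zero; raising `lam` trades goodness-of-fit for a sparser wavelength set, mirroring the trade-off described above.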

3.1.3 Autoencoders

Benefitting from recent advancements in both algorithms and processing technology, neural networks and their derivatives have experienced rapid development over the past decade. Another method for dimensionality reduction and feature extraction is based on a particular type of neural network called an autoencoder. An autoencoder network contains input (e.g., spectral measurement) and output layers of the same size but includes hidden layers in between with gradually decreasing numbers of nodes (see Figure 9). During training, the network weights are updated until the output is the same as the input within an acceptable tolerance. The layers from the input to the bottleneck center thus effectively encode a compressed version of the input signal. This set of layers is referred to as the “encoder” section. The compressed signal can then be uncompressed in the later layers (called the “decoder”) to form a copy of the signal in the output layer. This process has the added benefit of removing noise in the input during encoding such that the decoded copy is more representative of the true response. In 2021, Vasafi et al. [41] made an initial application of an autoencoder in the field of food production process control by using it to detect anomalies such as changes in fat, temperature, added water, and cleaning solution during milk processing. Anomalies were found to result in significantly higher reconstruction error at the autoencoder output layer as compared with the control (i.e., “normal”) data.

Figure 9.

Autoencoder neural network showing bottleneck which separates the encoder and decoder portions.
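The anomaly-detection idea can be sketched with a deliberately tiny linear autoencoder in numpy (no biases, no nonlinearities). This does not reproduce the network of Vasafi et al.; the "sensor" data, layer sizes, and training settings are all arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# "Normal" process data: 8 sensor channels driven by 2 latent factors plus noise.
W_true = rng.normal(size=(2, 8))

def make_batch(n):
    return rng.normal(size=(n, 2)) @ W_true + 0.05 * rng.normal(size=(n, 8))

X = make_batch(500)

# Linear autoencoder, 8 -> 2 -> 8, trained by gradient descent on the
# mean squared reconstruction error.
W_enc = 0.1 * rng.normal(size=(8, 2))
W_dec = 0.1 * rng.normal(size=(2, 8))
lr = 0.03
for _ in range(3000):
    code = X @ W_enc                       # bottleneck representation
    err = code @ W_dec - X                 # reconstruction error
    grad_dec = (code.T @ err) / len(X)
    grad_enc = (X.T @ (err @ W_dec.T)) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

def reconstruction_error(batch):
    recon = (batch @ W_enc) @ W_dec
    return np.mean((recon - batch) ** 2, axis=1)

normal_err = reconstruction_error(make_batch(200)).mean()

# An "anomaly": one sensor shifts, breaking the learned correlation structure.
anomalous = make_batch(200)
anomalous[:, 3] += 2.0
anomaly_err = reconstruction_error(anomalous).mean()   # much larger than normal_err
```

The detector is simply a threshold on the per-sample reconstruction error: data matching the learned "normal" structure reconstructs well, while anomalies do not.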

3.1.4 Partial least squares regression (PLSR)

PLSR is a well-known and often used means of conducting regression in the presence of noise. Regression provides a function that predicts a response from a data input (as opposed to classification, which assigns the input to a class). While both PCA and PLSR are derived from experimental data, PCA is more qualitative by nature, is often used in an exploratory manner, and is an unsupervised learning method. PLSR, on the other hand, is more quantitative: a linear, multi-dimensional, supervised evaluation. Both methods rely on maximizing covariance: PCA within the original data, and PLSR between the data and the response.

PLSR works best when the observed variables are highly correlated and noisy [42], which is a benefit in hyperspectral analysis where data at nearby wavelengths can be highly correlated. Also, PLSR assumes that the data set is linear and that the projection holds in a new subspace. However, if linearity does not hold for the model, there are several ways to deal with this problem, including polynomials, splines, or small neural nets [43]. There is also a simple way to deal with non-linearity [44], the basic idea being to expand the data matrix with square, cubic, or cross-product terms.
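A bare-bones PLS1 (single-response, NIPALS-style) sketch shows the covariance-maximizing weight vectors and the deflation loop. The data below are synthetic and mimic the correlated, noisy regime described above; the sizes and noise levels are arbitrary:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1: NIPALS-style deflation; returns regression coefficients."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    Xk, W, P, q = X.copy(), [], [], []
    for _ in range(n_components):
        w = Xk.T @ y
        w /= np.linalg.norm(w)            # weight: direction of max covariance with y
        t = Xk @ w                        # scores
        tt = t @ t
        p = (Xk.T @ t) / tt               # X loadings
        qk = (y @ t) / tt                 # y loading
        Xk = Xk - np.outer(t, p)          # deflate X
        y = y - qk * t                    # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q   # coefficients for centered data

rng = np.random.default_rng(3)
# Ten highly correlated, noisy "wavelengths" all carrying the same latent signal.
base = rng.normal(size=(80, 1))
X = base + 0.1 * rng.normal(size=(80, 10))
y = 3.0 * base[:, 0] + 0.05 * rng.normal(size=80)

B = pls1_fit(X, y, n_components=2)
pred = (X - X.mean(axis=0)) @ B + y.mean()
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

Because the weight vectors are computed against the response, PLSR pools the correlated wavelengths into a few latent components, which is why it tolerates collinearity that would destabilize ordinary least squares.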

PLSR continues to be a popular tool for food analysis and quality control applications. Jiang et al. [45] invoked PLSR to model the relationship between NIR spectra from hyperspectral images of chicken and total Pseudomonas spp. and Enterobacteriaceae counts (PEC) in order to predict PEC rapidly. Cavaglia et al. [46] similarly applied PLSR to predict density and pH in ATR-MIR spectra from alcoholic fermentation samples.

3.1.5 Wavelets

Wavelets are mathematical functions that, like Fourier analysis, transform data into its constituent spectral components. However, unlike the standard Fourier transform, wavelet transforms can provide frequency information for specific locations in the temporal or spatial domain. Wavelets of different shapes (called mother wavelets) focus on different portions of the frequency spectrum and are typically used in combination to analyze the full spectral bandwidth of concern. Each mother wavelet can also be rescaled to form daughter wavelets to change the resolution in the temporal or spatial domain and thus examine higher or lower portions of the frequency spectrum in more detail. Some wavelet examples are shown in Figure 10.

Figure 10.

Wavelet examples from different wavelet families.
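The simplest of these families, the Haar wavelet, makes the scale decomposition easy to see: one level of the (unnormalized) Haar transform splits a signal into pairwise averages (the low-frequency approximation) and pairwise half-differences (the high-frequency detail). A minimal sketch with a toy signal (not from the cited studies):

```python
# Single-level Haar wavelet decomposition sketch (unnormalized):
# pairwise averages give the low-frequency approximation and pairwise
# half-differences give the high-frequency detail.

def haar_step(signal):
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

signal = [4.0, 6.0, 10.0, 12.0]
approx, detail = haar_step(signal)
print(approx, detail)   # [5.0, 11.0] [-1.0, -1.0]

# The signal is perfectly reconstructed from the two sub-bands:
reconstructed = []
for a, d in zip(approx, detail):
    reconstructed += [a + d, a - d]
print(reconstructed)    # [4.0, 6.0, 10.0, 12.0]
```

Repeating `haar_step` on the approximation yields the coarser scales; the detail coefficients at each level are the scale-specific features used in analyses like those of [47, 48].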

Wavelet analysis has been used in spectroscopic applications as a means of extracting useful features from specific regions of the spectra. For example, Qi et al. [47] applied a single wavelet form at seven different scales to extract features from shortwave infrared (SWIR) hyperspectral reflectance images from peanuts. Using these features, they were able to distinguish moldy portions of peanuts from healthy portions. Wavelet analysis can also be applied in the image domain to extract 2D features. Ji et al. [48] applied wavelet transforms to hyperspectral visible-NIR images of potatoes (after first applying PCA) to decompose the original images into sub-band images at different scales to extract textural features that would enable the identification of bruising in the potatoes.

3.1.6 Splines

Splines are piecewise linear or polynomial functions that are combined to approximate a given set of data. Splines are often used as smoothing functions to approximate data curves while eliminating the “roughness” caused by noise. Like the other techniques discussed above, splines benefit the feature extraction process by focusing on the true signal within the data.

One application of splines common to chemometric analysis is the regression analysis method of multivariate adaptive regression splines (MARS) [49]. MARS models nonlinearities in data by fitting splines to specific regions of the input variable range. The regions are separated by “hinge” functions that have a value of zero for all locations except within the region of applicability. The transition points that link consecutive splines are called “knots.” In a forward process knots or splines are added to yield a close fit to the data. In a backward process the least contributing terms are pruned to minimize overfitting. Garre et al. [50] compared a model developed using MARS to similar regression models developed to predict the amount of waste in food production and quantify model uncertainties. The MARS model achieved a precision comparable to that of more sophisticated machine learning models such as random forest methods developed to deal with the high variability in decision trees while maintaining low bias [51].
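The hinge-function construction can be made concrete with a hand-built toy model. The knot location and coefficients below are invented for illustration; a real MARS fit would select them automatically in the forward and backward passes:

```python
# MARS-style model sketch built from hinge functions max(0, x - knot).
# Coefficients and the knot at x = 3 are hand-picked for illustration.

def hinge(x, knot):
    return max(0.0, x - knot)

def mars_model(x):
    # flat at 1.0 until the knot, then rising with slope 2
    return 1.0 + 2.0 * hinge(x, 3.0)

print(mars_model(2.0))  # 1.0 (hinge inactive below the knot)
print(mars_model(5.0))  # 5.0, i.e. 1.0 + 2 * (5 - 3)
```

Because each hinge is zero outside its region of applicability, summing several such terms yields a piecewise-linear fit whose segments join at the knots.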

Closely related to spline regression is Savitzky–Golay filtering in which the data points are convolved with a set of filter weights, much like a weighted moving average. However, as the filter moves to each successive data point, a polynomial of degree p is fit to the data within the filter window, and the point in the center of the window is replaced by the polynomial value at that point [52]. One key benefit of Savitzky–Golay filtering is that it tends to preserve high frequency signal components while rejecting high frequency noise (often found in CCD or InGaAs arrays or photon-starved detection systems) whereas standard finite impulse response (FIR) filters tend to remove these signal components [53].
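A minimal sketch of the idea uses the classic 5-point convolution weights for a quadratic/cubic fit, (−3, 12, 17, 12, −3)/35 [52]. Because each window is fit with a quadratic, a quadratic signal passes through the filter unchanged at interior points, illustrating how genuine signal shape is preserved while high-frequency noise would be averaged away:

```python
# Savitzky-Golay sketch using the classic 5-point quadratic/cubic
# convolution weights (-3, 12, 17, 12, -3) / 35.

WEIGHTS = [-3, 12, 17, 12, -3]

def savgol5(signal):
    """Filter interior points; endpoints are left unfiltered here."""
    out = list(signal)
    for i in range(2, len(signal) - 2):
        out[i] = sum(w * signal[i + k - 2] for k, w in enumerate(WEIGHTS)) / 35.0
    return out

quadratic = [float(i * i) for i in range(7)]   # 0, 1, 4, 9, 16, 25, 36
smoothed = savgol5(quadratic)
print(smoothed)   # interior values unchanged: 4.0, 9.0, 16.0
```

A production implementation would also handle the window endpoints (e.g., by fitting one-sided polynomials) and allow the window length and polynomial degree to be chosen.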

Savitzky–Golay filtering is a popular pre-processing technique that has been used extensively for food spectral analysis. Examples since 2020 include the use of Savitzky–Golay filtering in pre-processing NIR spectra to improve classification performance in the identification of allergens in powdered food materials [54], filter noise from FTIR spectra of instant freeze-dried coffee and MIR spectra of fruit puree samples [53], and NIR reflectance spectra of Indonesia rice flour-based food to enable accurate classification and level estimation of added sweeteners [55].

3.2 Classification

Automatic detection of food spoilage requires an algorithm that can successfully classify a food product (or part of a food product) as spoiled or healthy, either by detecting the presence of contaminants or by classifying physical changes to the product. A variety of sophisticated machine learning algorithms have been developed over the past few decades to provide accurate classification, and many of these have been used in spectroscopic and food quality applications. Here, we discuss two of the most popular classification algorithms, the support vector machine (SVM) and the artificial neural network, both of which take as input a set of features that are typically generated using the methods described in the previous section. We also discuss deep learning methods, which have advanced rapidly since 2012 and are being used in a wide variety of applications including food analysis. Unlike more conventional machine learning methods, deep learning methods include their own automatic feature extraction process [56].

3.2.1 Support vector machines (SVM)

An SVM is a supervised learning algorithm that seeks the hyperplane separating data points of different classes with the maximum margin, thereby minimizing classification error. The position of the hyperplane is determined by the set of points (called “support vectors”) that are closest to it. The basic concept of the SVM is intuitive when the hyperplane is linear and the classification is binary (see Figure 11). However, SVMs can also be applied to data whose classes are not linearly separable by transforming the data from the original space into one in which they are linearly separable. This is often accomplished using the so-called “kernel trick” in which a kernel function compares vectors in the new space without performing the actual transformation, thus minimizing computation cost. Common kernels include linear, radial (i.e., Gaussian), and polynomial.

Figure 11.

Separating hyperplane determination for a Support Vector Machine. The hyperplane is positioned to maximize the margin between the support vectors.
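The kernel trick can be demonstrated directly. For a two-dimensional input, the degree-2 polynomial kernel K(x, z) = (x·z)² equals an ordinary dot product in the explicit feature space φ(x) = (x₁², √2·x₁x₂, x₂²), yet the kernel never constructs φ. A short illustrative sketch:

```python
import math

# Kernel-trick sketch: the degree-2 polynomial kernel K(x, z) = (x.z)^2
# equals a dot product after the explicit feature map
# phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2) -- but the kernel never builds phi.

def poly_kernel(x, z):
    return (x[0] * z[0] + x[1] * z[1]) ** 2

def phi(x):
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

x, z = (1.0, 2.0), (3.0, 4.0)
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))
print(poly_kernel(x, z), explicit)   # both ~121.0
```

For high-degree kernels or hyperspectral inputs with hundreds of bands, the explicit feature space becomes enormous, which is precisely why evaluating the kernel directly is the cheaper route.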

A reliable and robust machine learning classifier, the SVM has been used in many hyperspectral imaging food analysis applications. A few examples since 2020 include the detection of spoilage in visible-NIR imagery of baked goods [38], detection of bacterial foodborne pathogens in visible-NIR imagery [57], and detection of fish fillet substitution and mislabeling through accurate classification of fillet species from imagery collected from visible-NIR, fluorescence with UV excitation, SWIR, and Raman spectral bands [58].

3.2.2 Artificial neural networks

Artificial neural networks are another popular supervised classification method that has been surging in popularity with the advances in processing technology over the past few decades. Conventional artificial neural networks are based on the multilayer perceptron (MLP) architecture (see Figure 12) which was designed to resemble neurons in the brain. Such neurons accept some number of input values and remain dormant until the sum of inputs rises above a certain threshold value, at which point the neurons “fire.” This nonlinear thresholding effect is enabled in artificial neural network nodes by nonlinear activation functions that determine each node’s output value. Common activation functions include the sigmoid, hyperbolic tangent, and rectified linear unit functions.

Figure 12.

A simple illustration of a multi-layer perceptron neural network architecture. Each circle represents a neural network node and each arrow represents the weight that connects a node in one layer to a node in the subsequent layer.

Artificial neural networks are trained by initializing the network weights (usually with random values) and comparing the predicted results at the output layer to known target values. An error metric is calculated based on the difference between the prediction and target values, and the network weights are updated by calculating partial derivatives of the error with respect to each weight, starting with the output layers and moving backward toward the input layer in a process known as backpropagation. This entire process is repeated until the error is brought below an acceptable tolerance or other stopping criteria are met.
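The procedure can be illustrated with the smallest possible network: a single sigmoid neuron trained by gradient descent on the logical AND function. The initial weights, learning rate, and iteration count below are arbitrary choices for the sketch:

```python
import math

# Single-neuron training sketch: a sigmoid unit learns logical AND.
# The squared-error gradient with respect to each weight is computed
# and the weights are nudged downhill (backpropagation in miniature).

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.1, -0.2]   # weights, randomly-chosen starting values
b = 0.0           # bias
lr = 2.0          # learning rate

for _ in range(5000):
    for (x1, x2), target in samples:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # dE/dw = (y - t) * y * (1 - y) * x for squared error E
        delta = (y - target) * y * (1.0 - y)
        w[0] -= lr * delta * x1
        w[1] -= lr * delta * x2
        b -= lr * delta

outputs = {inp: sigmoid(w[0] * inp[0] + w[1] * inp[1] + b) for inp, _ in samples}
print(outputs)   # (1, 1) well above 0.5; the other inputs well below
```

A multi-layer network follows the same recipe, except that the error derivative at each hidden node is accumulated from the nodes it feeds, which is where the layer-by-layer backward sweep gets its name.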

Like SVMs, artificial neural networks have been widely used for spectral classification applications due to their ability to achieve high accuracy. In 2020, Balabanov et al. [59] developed a vision-based system with an artificial neural network to detect defects in apples passing on a conveyor belt by analyzing HSI in the visible and NIR spectral ranges. Although once popular, these MLP-based neural networks are rapidly being replaced by deep learning neural networks which not only offer superior performance, but also ease the data processing pipeline by eliminating the need for manual feature selection and extraction.

Prior to the advent of modern sophisticated processing technology, neural networks were limited in size due to their computational loads which grew with the number of layers and the number of nodes within each layer. As this processing technology advanced, more and more layers could be added to neural networks to improve their performance (although the theoretical reason for why this is the case is still poorly understood). Furthermore, as shown in Figure 13, layers could be added to perform different operations on the data, such as convolution and averaging (often called “pooling” in this context). The network then performs feature extraction by learning the weights in the convolutional layers which yield accurate classifications. In essence, the network learns which filters should be applied to the data to best extract the signal within. Pooling layers following the convolutional layers then apply averaging to help prevent overfitting. Following the successive convolutional and pooling (and possibly other) layers, the results are concatenated into a single-dimensional vector and fed into an MLP neural network to combine these features for classification.

Figure 13.

Basic CNN architecture. Data at the input layer is passed through a convolutional layer which generates feature sets. These are then reduced in size through averaging in the pooling layer and the resulting features are concatenated. The final layers form an MLP-based neural network to yield the final classifications.
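A toy one-dimensional version of the convolution and pooling stages might look as follows. The kernel here is fixed rather than learned, purely to show the mechanics; in a trained CNN its weights would be set by backpropagation:

```python
# 1-D convolutional layer + average-pooling sketch, mirroring the
# first two stages of Figure 13.

def conv1d(signal, kernel):
    n = len(kernel)
    return [sum(signal[i + k] * kernel[k] for k in range(n))
            for i in range(len(signal) - n + 1)]

def avg_pool(features, size=2):
    return [sum(features[i:i + size]) / size
            for i in range(0, len(features) - size + 1, size)]

signal = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
features = conv1d(signal, [1.0, 0.0, -1.0])   # kernel responds to local slope
pooled = avg_pool(features)
print(features, pooled)   # [-2.0, -2.0, -2.0, -2.0] [-2.0, -2.0]
```

In a full CNN, many such kernels run in parallel to produce a stack of feature maps, the pooled results are flattened, and the flattened vector feeds the MLP layers that produce the classification.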

In 2021 alone, deep learning neural networks have been used to classify beef freshness from visible-NIR reflectance spectra [60], to analyze NIR HSI to detect the presence of contamination during food packing [61], and to conduct a series of different food quality analyses from NIR spectra [62].


4. Food traceability and dynamic pricing

4.1 Inadequacies of existing traceability technologies

Traceability and real-time analysis of food products that can help minimize waste will require new tools for quality assurance, authentication, and digital supply chain management that can track products from harvest to market. Such tools must be objective and verifiable, provide data on quality, provenance, and freshness, and be easy to incorporate at multiple nodes in the supply chain. Current technologies address only some components of these challenges and are inadequate for the most difficult ones.

State-of-the-art seafood traceability platforms provide tools for establishing chain of custody but lack dynamic pricing features and verifiable, trusted freshness and authenticity data. These platforms rely on estimates of shelf life based on catch date and storage conditions. Such inputs are insufficiently verifiable and quantifiable for digital tools built on them to be broadly trusted and accepted for dynamic pricing, and they do not address authentication or quality metrics at all.

As of 2021, dynamic pricing software solutions likewise lack verifiable and trusted freshness and authenticity data and are primarily designed for final retail discounting, often integrated only into broader retailer systems. This makes them less effective upstream, where they could otherwise add value at each node in the supply chain.

Products for quantitative measurement of fish freshness rely primarily on destructive laboratory-based methods that are not capable of accurate spot checks of individual fish or fish portions and cannot be realistically and easily repeated at low cost at multiple points along the supply chain. Tools are available that measure tissue conductivity through fish skin, primarily to assess moisture, but they are not designed to address broader nutrient content, species identification, or traceability.

4.2 SafetySpect’s quality, adulteration, and traceability (QAT) technology

One approach to addressing the problem of traceability and rapid detection of spoilage is SafetySpect’s newly developed handheld QAT scanner that optically detects previously established chemical signatures of seafood freshness, quality, and species ID. This device integrates several types of spectroscopic data through its fusion-AI algorithm into simple, human readable reports. The handheld scanner will enable spot-checks of quality, species ID, and freshness all along the supply chain. Integration with a mobile device and app can couple the QAT output to blockchain-enabled, cloud-based, supply chain management platforms for tracking product quality, freshness, and species ID from harvest to market. When integrated with a digital platform using blockchain and dynamic pricing technology, these tools will support the modernization of the global fishing and seafood processing industries, supply chains and retail outlets, as well as provide accurate information to consumers about the sustainability, freshness, and quality of their seafood purchases. Specially designed apps can also integrate smallholder producers in developing economies into the broader, emerging digital supply chain platforms in these markets.

By providing trusted product and pricing data at any node of the supply chain, QAT fundamentally changes the business models of seafood processors, wholesalers and retailers, making it practical to (a) identify mislabeled product, and (b) dynamically price perishable seafood at multiple purchase decision points – beyond traditional final-discounting by retailers. The quantitative underlying data provides high confidence and additional visibility and trust in the freshness of such a highly perishable good, and makes intelligent pricing based on quality/freshness practical at all nodes along the supply chain before final retail sale.

With this capability, QAT will spur innovation in the execution of seafood supply chains by providing accountability at each node for maintaining quality. This will drive improvements in purchase decision making, traceability, authentication, and inventory planning. Retailers can use dynamic pricing in both their purchase and final sale decision making. Given the high proportion of seafood sales attributable to the largest retailers in both developed and developing markets, retailers can have significant economic incentive to adopt such technologies, and the power to encourage its adoption by upstream suppliers.

Technologies like SafetySpect QAT will have four major impacts on the global seafood industry: (1) reduced waste through fish freshness tracking, enabling vastly improved freshness-based dynamic pricing tools at multiple supply chain nodes; (2) increased visibility and trusted information; (3) increased value of seafood across markets and a positive impact on public health by providing assurance of quality, species, and freshness; and (4) a positive impact on a number of sustainability goals, including better management of at-risk wild fisheries, combatting illegal poaching of fish and wild animal species, improving food security, providing better economic engagement and access for marginalized smallholder producers to broader digital supply chain systems and platforms, reducing resource consumption, and combatting climate change.


5. Conclusion

Food waste is a global problem caused in large part by food spoilage that has gone undetected. This problem exacerbates world hunger issues and affects consumers by causing them to pay too high a price for food products whose shelf lives have been improperly or inadequately estimated. Several technologies have been applied to improve the detection of food spoilage and provide better valorization of food products at all points along the food supply chain, but many of these methods are either unreliable or damage the samples being evaluated. Spectroscopic techniques, on the other hand, offer a reliable and non-invasive means of detecting and quantifying spoilage. Recent developments in fundamental spectroscopic technologies have enabled the development and productization of portable and handheld devices for conducting analysis of food products in situ. Furthermore, algorithmic advancements have improved our ability to extract the most relevant features from spectroscopic data and yield highly accurate classifications and quantifications of spoilage. These technologies, combined with advancements such as blockchain and devices like SafetySpect’s QAT scanner, promise to reduce food waste and extend shelf lives by detecting spoilage earlier along the food supply chain, and will enable intelligent pricing and traceability for the benefit of consumers around the globe.


  1. Love DC, Fry JP, Milli MC, Neff RA. Wasted seafood in the United States: Quantifying loss from production to consumption and moving toward solutions. Glob Environ Change. 2015 Nov 1;35:116-24
  2. The State of World Fisheries and Aquaculture 2020 [Internet]. FAO; 2020 [cited 2020 Nov 14]. Available from:
  3. Wang K, Pu H, Sun D-W. Emerging Spectroscopic and Spectral Imaging Techniques for the Rapid Detection of Microorganisms: An Overview. Compr Rev Food Sci Food Saf. 2018;17(2):256-73
  4. Vasefi F, Booth N, Hafizi H, Farkas DL. Multimode Hyperspectral Imaging for Food Quality and Safety. Hyperspectral Imaging Agric Food Environ [Internet]. 2018 Nov 5 [cited 2020 Jun 14]; Available from:
  5. Feng Y-Z, Sun D-W. Application of hyperspectral imaging in food safety inspection and control: a review. Crit Rev Food Sci Nutr. 2012;52(11):1039-58
  6. Sensory assessment scoresheets for fish and shellfish - Torry & QIM [Internet]. Seafish. [cited 2020 Nov 7]. Available from:
  7. Liu D, Zeng X-A, Sun D-W. Recent developments and applications of hyperspectral imaging for quality evaluation of agricultural products: a review. Crit Rev Food Sci Nutr. 2015;55(12):1744-57
  8. Elmasry G, Barbin DF, Sun D-W, Allen P. Meat quality evaluation by hyperspectral imaging technique: an overview. Crit Rev Food Sci Nutr. 2012;52(8):689-711
  9. Foca G, Ferrari C, Ulrici A, Sciutto G, Prati S, Morandi S, et al. The potential of spectral and hyperspectral-imaging techniques for bacterial detection in food: A case study on lactic acid bacteria. Talanta. 2016 Jun 1;153:111-9
  10. Ellis DI, Muhamadali H, Haughey SA, Elliott CT, Goodacre R. Point-and-shoot: rapid quantitative detection methods for on-site food fraud analysis – moving out of the laboratory and into the food supply chain. Anal Methods. 2015;7(22):9401-14
  11. Yan H, Siesler HW. Hand-held near-infrared spectrometers: State-of-the-art instrumentation and practical applications. NIR News. 2018 Nov;29(7):8-12
  12. Crocombe RA. Portable Spectroscopy. Appl Spectrosc. 2018 Dec 1;72(12):1701-51
  13. Kademi HI, Ulusoy BH, Hecer C. Applications of miniaturized and portable near infrared spectroscopy (NIRS) for inspection and control of meat and meat products. Food Rev Int. 2019 Apr 3;35(3):201-20
  14. Masna N, Paul SD, Chen C, Mandal S, Bhunia S. Eat, but Verify: Low-Cost Portable Devices for Food Safety Analysis. IEEE Consum Electron Mag. 2018 Dec 11;8
  15. Santos CAT dos, Lopo M, Páscoa RNMJ, Lopes JA. A Review on the Applications of Portable Near-Infrared Spectrometers in the Agro-Food Industry. Appl Spectrosc. 2013 Nov 1;67(11):1215-33
  16. Part 4: The Optical Bench - B&W Tek [Internet]. [cited 2020 Aug 29]. Available from:
  17. Oliveres R. What is a Laser Spectrum Analyzer? [Internet]. My Laser Spectrum. 2016 [cited 2020 Aug 29]. Available from:
  18. NIRONE SENSOR | Spectral Engines | High-performance, compact and reliable NIR spectral sensor | Acal BFi BE [Internet]. [cited 2020 Aug 29]. Available from:
  19. Baltz E, Kirschner R (Research Electro-Optics, Inc.). For Compactness and Ruggedness, Linear Variable Filters Fit the Bill [Internet]. [cited 2020 Aug 29]. Available from:
  20. Pederson CG, Friedrich DM, Hsiung C, von Gunten M, O’Brien NA, Ramaker H-J, et al. Pocket-size near-infrared spectrometer for narcotic materials identification. In: Druy MA, Crocombe RA, editors. Baltimore, Maryland, USA; 2014 [cited 2020 Aug 30]. p. 91010O. Available from:
  21. MicroNIR OnSite-W [Internet]. 2017 [cited 2020 Aug 30]. Available from:
  22. Gat N, Pottebaum T, Scriven G, Brandt R, Systems O-K. Hadamard Transform Imaging Spectrometry (HTIS) and Compressive Detection Techniques. :19
  23. Du Y, Zhou G. A MEMS-driven Hadamard transform spectrometer. In: MOEMS and Miniaturized Systems XVII [Internet]. International Society for Optics and Photonics; 2018 [cited 2020 Aug 29]. p. 105450X. Available from:
  24. Liu N, Parra HA, Pustjens A, Hettinga K, Mongondry P, van Ruth SM. Evaluation of portable near-infrared spectroscopy for organic milk authentication. Talanta. 2018 Jul 1;184:128-35
  25. Fourier-Transform Spectrometers [Internet]. [cited 2020 Aug 28]. Available from:
  26. FTIR Spectroscopy Basics - US [Internet]. [cited 2020 Aug 28]. Available from:
  27. Spectral Sensors [Internet]. NeoSpectra Sensors. 2018 [cited 2020 Aug 30]. Available from:
  28. Ausili A, Sánchez M, Gómez-Fernández JC. Attenuated total reflectance infrared spectroscopy: A powerful method for the simultaneous study of structure and spatial orientation of lipids and membrane proteins. Biomed Spectrosc Imaging. 2015 Jan 1;4(2):159-70
  29. Ocean MZ5 - Mini ATR MIR Spectrometer Features | SpectrEcology [Internet]. Spectrecology - Spectroscopy & Optical Sensing Solutions. [cited 2020 Aug 30]. Available from:
  30. Bureau S, Cozzolino D, Clark CJ. Contributions of Fourier-transform mid infrared (FT-MIR) spectroscopy to the study of fruit and vegetables: A review. Postharvest Biol Technol. 2019 Feb;148:1-14
  31. The Raman Spectrophotometer [Internet]. [cited 2020 Aug 30]. Available from:
  32. Mira M-1 Advanced Package [Internet]. [cited 2020 Aug 30]. Available from:
  33. Raman FAQs - What laser wavelengths are used for Raman spectroscopy? - HORIBA [Internet]. [cited 2021 Mar 28]. Available from:
  34. Moore’s Law Turns 50- How Much Longer Can it Last? : vTools Events [Internet]. [cited 2020 Nov 8]. Available from:
  35. FPGAs & 3D ICs [Internet]. Xilinx. [cited 2020 Nov 8]. Available from:
  36. Kaiser HF. The application of electronic computers to factor analysis. Educ Psychol Meas. 1960;20:141-51
  37. Granato D, Santos JS, Escher GB, Ferreira BL, Maggio RM. Use of principal component analysis (PCA) and hierarchical cluster analysis (HCA) for multivariate association between bioactive compounds and functional properties in foods: A critical perspective. Trends Food Sci Technol. 2018 Feb 1;72:83-90
  38. Saleem Z, Khan MH, Ahmad M, Sohaib A, Ayaz H, Mazzara M. Prediction of Microbial Spoilage and Shelf-Life of Bakery Products Through Hyperspectral Imaging. IEEE Access. 2020;8:176986-96
  39. Zhang Z, Xu Y, Yang J, Li X, Zhang D. A survey of sparse representation: algorithms and applications. IEEE Access. 2015;3:490-530
  40. Lei T, Sun D-W. A novel NIR spectral calibration method: Sparse coefficients wavelength selection and regression (SCWR). Anal Chim Acta. 2020 May;1110:169-80
  41. Vasafi PS, Paquet-Durand O, Brettschneider K, Hinrichs J, Hitzmann B. Anomaly detection during milk processing by autoencoder neural network based on near-infrared spectroscopy. J Food Eng. 2021 Jun;299:110510
  42. Rosipal R. Nonlinear Partial Least Squares: An Overview. Chemoinformatics Adv Mach Learn Perspect Complex Comput Methods Collab Tech. 2010 Jan 1
  43. Björk A. Chemometric and signal processing methods for real time monitoring and modeling: applications in the pulp and paper industry. 2007
  44. Berglund A, Wold S. INLR, implicit non-linear latent variable regression. J Chemom. 1997;11(2):141-56
  45. Jiang S, He H, Ma H, Chen F, Xu B, Liu H, et al. Quick assessment of chicken spoilage based on hyperspectral NIR spectra combined with partial least squares regression [Internet]. 2021 [cited 2021 Mar 26]. Available from: /paper/Quick-assessment-of-chicken-spoilage-based-on-NIR-Jiang-He/94d06b673cd5e0eaf0452144c9ee30d2e9e25b44
  46. Cavaglia J, Schorn-García D, Giussani B, Ferré J, Busto O, Aceña L, et al. ATR-MIR spectroscopy and multivariate analysis in alcoholic fermentation monitoring and lactic acid bacteria spoilage detection. Food Control. 2020 Mar;109:106947
  47. Qi X, Jiang J, Cui X, Yuan D. Moldy Peanut Kernel Identification Using Wavelet Spectral Features Extracted from Hyperspectral Images. Food Anal Methods. 2020 Feb;13(2):445-56
  48. Ji Y, Sun L, Li Y, Ye D. Detection of bruised potatoes using hyperspectral imaging technique based on discrete wavelet transform. Infrared Phys Technol. 2019 Dec;103:103054
  49. Friedman JH. Multivariate Adaptive Regression Splines. Ann Stat. 1991 Mar;19(1):1-67
  50. Garre A, Ruiz MC, Hontoria E. Application of Machine Learning to support production planning of a food industry in the context of waste generation under uncertainty. Oper Res Perspect. 2020;7:100147
  51. Breiman L. Random Forests. Mach Learn. 2001 Oct 1;45(1):5-32
  52. Press WH, Teukolsky SA. Savitzky-Golay Smoothing Filters. Comput Phys. 1990;4(6):669
  53. Kew H. A model for spectroscopic food sample analysis using data sonification. Int J Speech Technol [Internet]. 2021 Jan 13 [cited 2021 Mar 27]; Available from:
  54. Rady A, Fischer J, Reeves S, Logan B, James Watson N. The Effect of Light Intensity, Sensor Height, and Spectral Pre-Processing Methods When Using NIR Spectroscopy to Identify Different Allergen-Containing Powdered Foods. Sensors. 2020 Jan;20(1):230
  55. Masithoh RE, Rondonuwu FF, Setyabudi FMCS, Cho BK. Development of calibration model for determination of sweeteners additives in Indonesia rice flour-based food by FT-NIR spectroscopy. IOP Conf Ser Earth Environ Sci. 2020 Aug 7;542:012017
  56. Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition [Internet]. 2nd ed. New York: Springer-Verlag; 2009 [cited 2020 Nov 8]. (Springer Series in Statistics). Available from:
  57. Bonah E, Huang X, Yi R, Aheto JH, Yu S. Vis-NIR hyperspectral imaging for the classification of bacterial foodborne pathogens based on pixel-wise analysis and a novel CARS-PSO-SVM model. Infrared Phys Technol. 2020 Mar;105:103220
  58. Qin J, Vasefi F, Hellberg RS, Akhbardeh A, Isaacs RB, Yilmaz AG, et al. Detection of fish fillet substitution and mislabeling using multimode hyperspectral imaging techniques. Food Control. 2020 Aug 1;114:107234
  59. Balabanov PV, Divin AG, Egorov AS, Yudaev VA, Lyubimova DA. Vision system for detection of defects on apples using hyperspectral imaging coupled with neural network and Haar cascade algorithm. IOP Conf Ser Mater Sci Eng. 2020 May 28;862:052058
  60. Shin S, Lee Y, Kim S, Choi S, Kim JG, Lee K. Rapid and non-destructive spectroscopic method for classifying beef freshness using a deep spectral network fused with myoglobin information. Food Chem. 2021 Aug;352:129329
  61. Medus LD, Saban M, Francés-Víllora JV, Bataller-Mompeán M, Rosado-Muñoz A. Hyperspectral image classification using CNN: Application to industrial food packaging. Food Control. 2021 Jul;125:107962
  62. Zeng J, Guo Y, Han Y, Li Z, Yang Z, Chai Q, et al. A Review of the Discriminant Analysis Methods for Food Quality Based on Near-Infrared Spectroscopy and Pattern Recognition. Mol Basel Switz. 2021 Feb 1;26(3)
