Production processes for biopharmaceuticals with mammalian cells have to provide a nearly optimal environment to promote cell growth and product formation. Design and operation of a bioreactor are complex tasks, not only with respect to reactor configuration and size but also with respect to the mode of operation. New concepts for the design and layout of process strategies are required to meet regulatory demands and to guarantee efficient, safe, and reproducible biopharmaceutical production. Key elements are critical process parameters (CPPs), which affect critical quality attributes (CQAs), quality by design (QbD), process analytical technology (PAT), and design of experiment (DoE). In this chapter, some fundamentals including process and control strategies as well as concepts for process development are discussed. Examples of novel model-based concepts for the design of experiments to identify suitable fed-batch feeding strategies are shown.
- cell culture
- process strategy
- design of experiment (DoE)
- kinetic model
- quality by design
The increasing demand for biopharmaceuticals in recent decades has led to diverse methods of process development and process establishment [1, 2]. Besides small molecules, biopharmaceuticals are used for the medication of formerly untreatable diseases and provide a novel class of therapeutics. At the same time, applications such as personalized medicine are emerging. The share of approved biopharmaceuticals produced with mammalian cells increased from 33% in 1989 to 60% between 2010 and 2014. As shown in Figure 1, 30% of the approved products in the USA were novel therapeutic antibodies, followed by hormones and blood-related medications.
The global biopharmaceuticals market was valued at approximately US$160 billion in 2014, and it is expected to grow at a compound annual growth rate of 9.6% from 2015 to 2020. In 2013, US$63 billion in sales were generated by the three tumor necrosis factor (TNF)-binding antibodies Humira® (adalimumab), Enbrel® (etanercept), and Remicade® (infliximab) alone. According to BioPlan Associates, 36% of the top 1000 biopharma companies are in North America, followed by Europe with 24.3%. Seventeen percent are in Japan, China, and other Asian regions. Indian companies account for 7.6%, and all unlisted regions (e.g., Russia and Latin America) are under 4% each.
Trends for the future indicate a growing market share of up to 50% of the top 100 pharmaceuticals to be bio-based. At the same time, the costs for the development of biopharmaceutical drugs increased rapidly from US$413 million (1980, in year-2013 dollars) to US$2558 million in 2013. Reasons are the increasing regulatory restrictions due to a lack of transparency of pharma companies as well as cost-intensive drug development, preclinical, and clinical trials. The majority (approx. 70%) of recombinant protein therapeutics are nowadays produced in suspension Chinese hamster ovary (CHO) cell cultures. More than two decades of experience in the biopharmaceutical industry have demonstrated that CHO cells not only possess the characteristics required for reproducible and efficient large-scale production of these drugs but can also be regarded as safe for expressing drugs for human use [8, 9]. To ensure the safety of biological therapeutics, regulatory guidance requires adventitious agent testing of the bulk harvest.
Even if a novel pharmaceutical enters the market, selling prices in cost-conscious health-care systems are relatively low. In response to past variations in product quality, the Food and Drug Administration (FDA) has moved from end-product-based quality control (quality by inspection), with few restrictions concerning the production process, to a more defined, knowledge-based process approach [quality by design (QbD)]. In this chapter, fundamentals including process and control strategies as well as concepts for process development in the field of quality by design are discussed. Examples of novel model-based concepts for the design of experiments (DoEs) are shown.
2. The regulatory perspective: quality by design
Process harmonization for the production of biopharmaceuticals is driven forward by regulatory authorities and industrial companies. Therefore, the International Conference on Harmonisation (ICH) was founded by regulatory authorities and industry associations of Japan, Europe, and the US. The World Health Organization (WHO) acts as a standing observer and helps to connect the regulatory authorities within the United Nations. Harmonization of regulatory requirements enables industry to reduce development times by removing the duplication of studies that was previously required to gain market approval for a new drug in each of the three regions. This directly affects the bottom line through reduced development and regulatory review times. Therefore, these activities shall not only ensure the highest level of safety for production but also enhance the competitive position of those companies that choose to operate according to these standards.
Reasons for the harmonization of biopharmaceuticals are the globalization of companies and the former diverse regulatory requirements to produce and sell biopharmaceuticals. Besides this, the development of robust bioprocesses to manufacture active, stable, and high-quality medicine with predefined quality attributes is still challenging. To harmonize the requirements for biopharmaceutical production, the ICH quality guidelines ICH Q8 (R2), ICH Q9, and ICH Q10 introduced the concept of quality by design. This concept includes quality-risk management systems and the implementation of Pharmaceutical Quality Systems [13–15].
The aim of quality by design is to guarantee a production process with high operational and process stability with respect to product quality. Key elements are critical process parameters (CPPs), which affect critical quality attributes (CQAs), and design of experiment (DoE). By combining these techniques, an efficient, safe, and reproducible biopharmaceutical production process can be designed and will be approved by the regulatory authorities. Besides the implementation of good manufacturing practice (GMP) and environmental regulations, rigorous safety constraints need to be fulfilled. In addition, process analytical technologies (PAT) should be implemented to measure critical process parameters online and to support a knowledge-based process design.
ICH Q8 describes the basic concepts for science and risk-based pharmaceutical manufacturing process development and defines their minimal requirements. It is necessary to keep in mind that
“In all cases, the product should be designed to meet patients’ needs and the intended product performance.” .
At first, the properties of the pharmaceutical with respect to quality need to be defined in the Quality Target Product Profile (QTPP). It contains all characteristics needed to guarantee the quality, safety, and efficacy of the drug, taking into account, for example, the dosage form, dosage strengths, route of administration, dissolution, and different quality measures. It is the fundamental definition of the drug and essential for process development. Therefore, it should include all sources of prior knowledge, for example, literature, knowledge achieved during drug development, and experience within the companies. As an example, Goetze et al. used attribute studies with clinical trial samples for a mAb treatment and described how this information can be used to define the QTPP. After the QTPP of the drug is defined, quality attributes that have an effect on the product quality should be identified.
These properties, or critical quality attributes (CQAs), can be of “physical, chemical, biological or microbiological” origin and can change as the product and process knowledge increases. The CQAs should cover the complete characteristics of the drug and are the basis of a robust manufacturing process development. Established quality attributes are the cell concentration during the cell culture production process, the product titer, aggregation, glycosylation, and the sialylation grade [17, 18]. Afterwards, the production process is designed by using prior knowledge, initial experimental data, and quality-risk management methods (ICH Q9). In contrast to business and government areas, only few examples of the implementation of quality-risk management in the pharmaceutical industry were known in the past. The implementation of these methods, for example, failure mode effects analysis (FMEA), hazard operability analysis (HAZOP), and fault tree analysis (FTA), is described in the ICH Q9 Guideline to guarantee that
“[…] product quality should be maintained throughout the product lifecycle such that the attributes that are important to the quality of the drug (medicinal) product remain consistent with those used in the clinical studies.” .
Quality-risk management methods offer a systematic approach to risk assessment, control, communication, and review during the product lifecycle. The aim is to iteratively increase the knowledge about the potential CQAs and the impact of variations on the quality of the drug product. After the identification of CQAs, process parameters and material attributes should be linked to each other to increase the process understanding. This should be done using design of experiment methods and/or mathematical models and leads to the definition of the Design Space.

The Design Space links process parameters and input variables to the CQAs and defines the ranges within which the drug quality is not affected during manufacturing. The Design Space is defined in the regulatory approval, and variations within the Design Space are not considered a change in the view of the regulators, whereas variations outside the Design Space are changes that need new regulatory approval. The main advantage of the Design Space is the flexibility of the manufacturing process within predefined ranges. Due to the deeper knowledge obtained during the risk assessment, the influence of variations is identified and evaluated.

Afterwards, a control strategy is designed, based on the identification of CQAs and the definition of the Design Space. It should describe how critical process parameters and material attributes are controlled to ensure the product quality, covering, for example, input materials, product specifications, unit operations in downstream processing, in-process testing, container closure systems, and/or intermediates. This can result in less effort for end-product testing and real-time release of the drug. After a process is approved by the regulatory agencies, continuous improvement and product lifecycle management are recommended. Therefore, the ICH Q10 Guideline describes the implementation of pharmaceutical quality systems to
“[…] enhance the quality and availability of medicines around the world in the interest of public health.” .
It defines knowledge management, quality-risk management, and responsibilities of the company management to implement continuous improvement of the manufacturing process, product quality, and pharmaceutical quality system.
3. Requirements for process development
There is a rising demand for accelerated process development and increased efficiency and economics of production processes. Especially for the development of processes for biopharmaceutical products, requirements for increased process understanding have evolved from the PAT and quality by design philosophy. Processes become more complex and sophisticated, for example, by switching from simple batch to more complex fed-batch or continuous perfusion processes. The number of process variables that have to be monitored and their complexity have increased. Furthermore, the demands related to quality management and documentation (GMP) have increased dramatically. A benefit of these higher development efforts is the increasing process knowledge. This can speed up technical transfer from development into manufacturing, deliver a more optimized, robust process with higher titers and better reproducibility, and aid in troubleshooting and root-cause analysis of deviations during production. In the following, this is described for the development of process and control strategies.
3.1. Process and control strategies
Batch and fed-batch-operated processes are predominantly used in the industry, although perfusion processes have proven advantageous for certain applications, for example, when fragile and highly glycosylated products are involved [19–21]. In principle, the reactor content is supplemented with nutrients during fed-batch cultivations. The following challenges occur during the fed-batch cultivation of mammalian cell cultures:
- Due to exponential growth, the demand for nutrients and oxygen increases exponentially, which can be challenging for the feed-control strategy.
- Changes in the metabolism of mammalian cells commonly occur during the course of a cultivation or between different cultivations.
- Fed-batch-operated processes typically use feed(s) with high substrate concentrations, which increase production but also promote the accumulation of inhibiting metabolites, especially when medium with high substrate concentrations is used.
- Only a few cultivation parameters can be measured online during the process, making the design of the control strategy an even more challenging task.
- Process control at low substrate concentrations keeps the specific substrate uptake and metabolite production rates low. However, substrate limitation or metabolite inhibition can decrease cell growth and trigger programmed cell death, that is, apoptosis.
The main advantage of fed-batch operation compared to batch mode is the extension of the growth phase as well as the stationary phase, resulting in higher cell and product concentrations and a higher product yield. At the same time, this operation mode requires increased effort for equipment and control. The optimization of fed-batch processes requires detailed knowledge regarding the composition of the feed and the operation of the feed flow. The latter is commonly realized without the use of additional data from the process (repeated fed-batch, constant feed, linear or exponential increase of the feed flow), and an adequate feed flow is mainly determined experimentally [22, 23]. Feeding strategies that use additional data from the process include bolus feeding, that is, the substitution of substrate consumed in a given time interval, and proportional feeding, which is based on the oxygen uptake rate (OUR) [22, 24, 25].
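Two of the feeding schemes mentioned above, an open-loop exponential feed profile and a data-driven bolus feed, can be sketched in a few lines. This is a minimal illustration, not an implementation from the cited references; all parameter values (initial feed rate, set-point growth rate, concentrations) are arbitrary example numbers.

```python
import math

def exponential_feed_rate(t_h, F0_L_per_h=0.002, mu_set_per_h=0.03):
    """Open-loop exponential feed profile F(t) = F0 * exp(mu_set * t).

    F0 is the initial feed rate and mu_set the desired specific growth
    rate; both values here are illustrative, not from a real process.
    """
    return F0_L_per_h * math.exp(mu_set_per_h * t_h)

def bolus_feed_volume(S_target_g_L, S_measured_g_L, V_L, S_feed_g_L):
    """Bolus feed: substitute the substrate consumed since the last
    sample, restoring the target concentration in reactor volume V."""
    deficit_g = max(0.0, S_target_g_L - S_measured_g_L) * V_L
    return deficit_g / S_feed_g_L  # feed volume in litres

# Example: feed rate after 48 h, and the bolus volume needed to bring
# glucose from 1 g/L back to 4 g/L in a 2 L culture with a 200 g/L feed.
F_48 = exponential_feed_rate(48.0)
V_bolus = bolus_feed_volume(4.0, 1.0, 2.0, 200.0)
```

Note that the exponential profile assumes the culture actually grows at the set-point rate; in practice, metabolic shifts (see the list above) make purely open-loop feeding drift away from the real demand.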
4. Design of experiment as a powerful tool for process optimization
The development of complex biotechnological production processes in combination with the ICH Guidelines is mostly done using established design of experiment (DoE) methods. After a brief introduction, advantages and limitations of these methods are discussed in the following part.
4.1. History of DoE
The concept of design of experiment was first introduced in 1926 by the statistician Ronald Fisher in the article “Field Experiments: How They Are Made and What They Are.” He proposed the statistical planning and evaluation of field experiments in agriculture to determine the effect of treatments. To this end, he introduced randomization, replication, and blocking. The basic assumption underlying these statistical methods is that the observations are normally distributed and drawn independently. The applicability of this assumption to real experimental settings was still limited in the 1920s. Therefore, Fisher proposed in 1923 the randomization of experiments in order to simulate independently drawn samples. The main advantages are the elimination of the experimenter's influence on the experiment (selection bias) and the mitigation of external factors (accidental bias).
Replication is defined as the repetition of experiments with the same conditions to estimate the errors and variability. Although replication was known in the 1920s, Fisher demonstrated the need for it in the book “Statistical Methods for Research Workers,” which was published in 1925 . He evaluated the treatment of 12 potato varieties with basal dressing and additional dressing of sulfur and chloride.
To test the significance, he split the field into two parts and took the standard deviations of the yield into account. The responses were not significant, and no effect of additional sulfur and chloride was found. Fisher arranged different plots of land in blocks with the same experimental conditions. By comparing the blocked experiments, he could reduce the error and increase the precision. After Fisher introduced the need for randomization, replication, and blocking, they were implemented by various authors and became state-of-the-art practice in experimental design [30–32].
In 1951, Box and Wilson adapted these techniques to the optimization of industrial settings and developed the response surface methodology (RSM). In contrast to Fisher's field experiments, the response of a given technical setting can be determined quickly. By evaluating the responses of the experiments, subsequent experiments can be designed. In particular, response surface methods are widely used because of their simple structure, their independence from underlying relationships, and the opportunities for optimization of factors in order to maximize a response [31, 34]. In the following years, these concepts were adopted by researchers when little information about the underlying mechanisms was available.
4.2. DoE applied to QbD
In contrast to trial-and-error or one-factor-at-a-time methods, statistical design of experiment (e.g., response surface) methods are widely used to develop biopharmaceutical processes [35–37]. They offer a tool to increase the process understanding and to identify the effect of process parameters on quality attributes. The main advantage of DoE methods is the systematic planning of efficient experiments in order to find relationships between factors and responses. Factors (e.g., medium components, feeding rates, and temperature) are the variables in the experimental design; they are defined beforehand by the experimenter and serve as inputs to the bioprocess.
The definition of boundary conditions for the factors is mainly based on heuristics and information from the literature. Subsequently, experiments are planned by DoE software algorithms and performed in the laboratory. Then, the response (product titer and quality attributes) is determined and evaluated. Due to the high number of factors involved in bioprocesses, the identification of significant factors is performed by first applying screening designs [38, 39]. In the context of designing biopharmaceutical production processes, they have been used in upstream as well as downstream processing. As an example for the design of a bioprocess, Tai et al. implemented a definitive screening design in combination with an automated bioreactor system (250 mL) to identify the effect of 10 process parameters on the production of a recombinant protein in Escherichia coli. Afterwards, the process parameters were optimized, and recommendations for the Design Space of a pilot-scale plant (450 L) were given. As an example from product purification, Horvath et al. used a screening design with eight experiments to determine the effect of different process parameters on the isoelectric point of a therapeutic antibody expressed in CHO cell culture. The pH, temperature, and the time of the temperature shift were significant. These factors were then evaluated on three levels in a concluding response surface design to optimize the isoelectric point.
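The simplest experimental design, a full two-level factorial, can be generated mechanically from the factor boundaries; screening designs such as fractional factorials or the definitive screening design used by Tai et al. reduce the number of runs further. The sketch below uses hypothetical factors with illustrative low/high levels, not the factors or ranges from the studies cited above.

```python
from itertools import product

def two_level_design(factors):
    """Full two-level factorial design: every combination of the
    low/high settings of each factor, i.e., 2**k runs for k factors."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Hypothetical cell culture factors with illustrative boundary values.
factors = {
    "temperature_C": (33.0, 37.0),
    "pH": (6.8, 7.2),
    "feed_rate_mL_h": (1.0, 4.0),
}
design = two_level_design(factors)  # 2**3 = 8 experimental runs
```

In practice, the runs would additionally be randomized and supplemented with replicated center points before execution, following the principles introduced by Fisher.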
Due to the high number of parameters affecting the processes and the product quality, a variety of high-throughput platform technologies such as the Ambr (Sartorius), Ramos (Kuhner), and DASGIP (Eppendorf) systems were developed. The advantage of DoE methods is the possibility to identify the effect of different factors with a reduced number of experiments. Depending on the method used, even interactions between factors can be identified [31, 43]. Limitations arise from the still high number of experiments during process development and from the reduction of complex relationships into simple linear correlations.
5. Mathematical models for cell culture processes
The modeling of cell culture processes has been extensively discussed and summarized in the literature [43–46]. Mathematical models are used for the design, optimization, and control of cell culture processes. Established and simple model structures, such as unstructured, unsegregated, and empirical kinetic models, are state of the art [46, 47].
With the development of novel analytical methods, structured, segregated, and mechanistic models that allow a deeper comprehension of the mechanisms of growth and production of mammalian cells are gaining significance. It has been stressed [48–50] that mathematical modeling is a substantial part of QbD, since the model contributes to a scientific understanding of the product formation as well as to process control and monitoring. However, the performance of a model structure is highly dependent on the identification of key parameters. The main objective of a model is to find solutions by analyzing the model instead of performing numerous experiments. The model mainly serves as a starting point for obtaining the deeper process understanding that is required for process optimization. Models are useful if they describe the real phenomena with sufficient accuracy. For this purpose, the model structure should be kept as simple as possible, with no unnecessary terms. According to Ref. , a model should be able to describe all phases of a cultivation with respect to the different operation modes (batch, fed-batch, etc.). At the same time, the parameters considered should have physical significance and should be determinable by simple experiments. Furthermore, it is favorable if models used for process optimization are applicable over a broad range of bioreactor scales [52, 53].
Due to the complexity of biological processes, simple models might be unsuitable for representing the real phenomena. But even with complex models, the behavior of cells may change and predictions can differ from the observed behavior. Reasons are the limited precision of the estimated model coefficients and the difficulty of determining the model parameters. Therefore, a compromise between the accuracy of the model and the experimental effort required for the determination of its parameters needs to be found for each application.
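The kind of unstructured, unsegregated kinetic model discussed above can be illustrated with a minimal Monod-type batch model. This is a generic sketch with made-up parameter values, not a model fitted to any data set from the cited studies, and it uses a simple explicit Euler scheme instead of a production-grade ODE solver.

```python
def simulate_batch(mu_max=0.04, K_S=0.5, Y_XS=1.0e6, X0=3.0e5, S0=20.0,
                   dt=0.1, t_end=120.0):
    """Minimal unstructured, unsegregated batch model with Monod kinetics:
        dX/dt = mu(S) * X
        dS/dt = -mu(S) * X / Y_XS
    X: viable cell density [cells/mL], S: limiting substrate [g/L],
    Y_XS: biomass yield [cells/mL per g/L]. All values are illustrative.
    Integrated with explicit Euler (step dt, in hours) for simplicity."""
    X, S, t = X0, S0, 0.0
    trace = [(t, X, S)]
    while t < t_end:
        mu = mu_max * S / (K_S + S) if S > 0.0 else 0.0  # Monod term
        X += mu * X * dt
        S = max(0.0, S - mu * X / Y_XS * dt)  # substrate cannot go negative
        t += dt
        trace.append((t, X, S))
    return trace

trace = simulate_batch()
t_final, X_final, S_final = trace[-1]
```

Even this two-equation model reproduces the qualitative phases of a batch culture (exponential growth followed by substrate exhaustion and growth arrest), which is why such simple structures are often sufficient starting points for process design.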
6. Model-based design of experiment
The combination of model-based simulations with DoE methods for the development of cell culture processes is a novel tool for process development in the context of QbD. As shown in Section 4, DoE methods are well established and state of the art for the optimization of process development, feeding, and media optimization. However, they can result in a high number of time-consuming and cost-intensive experiments. Even if high-throughput systems can handle this number of experiments in parallel, the heuristic restriction of the boundaries and the high number of factors result in stepwise iterations with multiple runs.
The use of DoE methods for the optimization of feeding strategies is still limited due to their complexity. Furthermore, narrowing the Design Space using only experimental methods requires plenty of time and experimental effort, especially in cases where a high number of relevant factors is targeted. Even if the experiments are planned and performed with much experience, several experimental runs are to be expected. In contrast to the chemical industry, process design based on mathematical models is not established for cell culture processes. Reasons are the challenges of appropriately modeling the complex metabolic pathways of mammalian cells with regard to cell growth and product formation. In addition, models targeting the metabolism of cell cultures demand more effort than those applied in chemical or microbial processes. Even though mathematical models are a promising tool for the development of stable processes that comply with the principles of QbD, examples have so far only been published in the field of product purification and polishing. In this way, Siegfried et al. designed a blending unit using a discrete element model. The flow characteristics of the blender used in production were predicted without performing lab-scale experiments. Rácz et al. used model predictions to predict the deviation among 12 chromatography columns that came from different lots of the same manufacturer. In Ref. , critical quality characteristics of a wet-granulation process were determined by population balance modeling and compared to experimental results. Even though the prediction of the resin's porosity was of insufficient accuracy, the simulation allowed a deeper understanding of the process, which led to a more efficient determination of the Design Space.
In the general procedure of process development, the use of complex, demanding, and time-consuming methods based on DoE tools has become common practice, although mathematical models are already used in the operation, control, and optimization of existing production processes [47, 58, 59]. Novel tools for process development and optimization combine DoE tools with a growth model. As shown in Figure 2, their main objective is to increase the efficiency of DoE methods in context with QbD principles to identify critical process parameters, for example, fed-batch strategies or medium compositions [60, 61].
These methods are used to reduce the number of experiments during model-based DoE (mDoE) and the time needed for the development of more knowledge-based cell culture processes. In contrast to purely experimental or purely model-based methods, mathematical models are used to identify significant process variables in silico, which are later verified experimentally. In this way, a reduced number of experiments is planned by mDoE. The key element of an mDoE is the model, which represents a compromise between the modeling approach, the identification of the kinetic parameters, and the quality of the model with regard to the process. In contrast to purely simulation-based process designs involving highly complex models, which can deliver high-quality results, mDoE allows process optimization with lower model complexity. In this way, Sercinoglu et al. developed a tool for the evaluation of feeding strategies during fed-batch cultivation of AGE1.hn cells. A simple unstructured and unsegregated model was used to plan four fed-batch experiments.
Möller et al. used a model to describe the dynamics of the cell metabolism of CHO-XM-111 cells in a two-step growth process with medium exchange followed by a fed-batch. Model parameters were fitted to averaged data from three parallel shake-flask cultivations, and model predictions were used to decrease the experimental space and the number of cultivation parameters in silico. To increase the total cell number (N) in the fed-batch, the relevant cultivation parameters were varied in simulations.
Instead of performing multiple experiments to generate data for response surface plots, simulated data were used. In this way, a reduced experimental space for a concluding optimization based on experiments was determined. One set of cultivation parameters in the range of the reduced experimental design was experimentally tested to evaluate the behavior of CHO-XM-111 cells during medium exchange and fed-batch. As shown in Figure 4, the total cell number was predicted successfully during fed-batch and medium exchange.
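The in silico screening step described above can be sketched as follows: a simple growth model is evaluated over a grid of factor combinations, and only the most promising region is carried forward to laboratory verification. The toy model, the two factors (feed start time and feed rate), and all numerical values below are hypothetical illustrations, not the model or parameters used by Möller et al.

```python
from itertools import product

def simulate_fedbatch(feed_start_h, feed_rate_g_h, mu_max=0.04, K_S=0.5,
                      Y_XS=1.0e6, X0=3.0e5, S0=5.0, V_L=1.0,
                      dt=0.1, t_end=120.0):
    """Toy fed-batch variant of an unstructured Monod growth model:
    from feed_start_h onward, substrate is added at feed_rate_g_h
    (volume change neglected for simplicity). Returns the final viable
    cell density [cells/mL]. All parameter values are illustrative."""
    X, S, t = X0, S0, 0.0
    while t < t_end:
        mu = mu_max * S / (K_S + S) if S > 0.0 else 0.0
        feed = feed_rate_g_h / V_L if t >= feed_start_h else 0.0  # g/L/h
        X += mu * X * dt
        S = max(0.0, S + (feed - mu * X / Y_XS) * dt)
        t += dt
    return X

# In-silico screening: rank factor combinations by the simulated
# outcome, then restrict the experimental design to the best region.
grid = list(product([12.0, 24.0, 48.0],      # feed start times [h]
                    [0.05, 0.1, 0.2, 0.4]))  # feed rates [g/h]
ranked = sorted(grid, key=lambda p: simulate_fedbatch(*p), reverse=True)
promising = ranked[:4]  # reduced experimental space for lab verification
```

Only the combinations in `promising` would then be tested experimentally, which is the reduction in experimental effort that mDoE targets.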
The combination of model-based simulations with DoE is suitable for generating a deeper understanding of processes, for example, the linkage of different process parameters to quality attributes. Furthermore, cultivation strategies for mammalian cell lines can be compared and evaluated before experiments have to be performed in the laboratory. This results in a significant reduction of the number of experiments required during process development and process establishment. The strategy is especially intended for use with multiple parallel single-use devices to speed up process development.
Ponce-García thanks CONACyT for postdoctoral scholarship support.