
Knowledge Management and Knowledge Leadership in the Fourth Industrial Revolution: Resolving the Automation-Augmentation Paradox

Written By

Hadi El-Farr and Kevin Sevag Kertechian

Submitted: 09 November 2023 Reviewed: 11 March 2024 Published: 06 May 2024

DOI: 10.5772/intechopen.1005236


From the Edited Volume

The Changing Landscape of Workplace and Workforce [Working Title]

Hadi El-Farr


Abstract

As acknowledged by scholars and practitioners, the rise of artificial intelligence and cyber-physical systems has led to a shift from the third to the fourth industrial revolution. Knowledge management as a discipline evolved in the late twentieth century, reflecting the increasing importance of knowledge as a resource in the knowledge-economy era. This chapter explores how organizations manage their knowledge in the fourth industrial revolution, which arguably should differ from how they did in the 1990s. The chapter begins by identifying the major characteristics of the four industrial revolutions. It then examines organizations' strategies for managing knowledge during the third industrial revolution. Subsequently, alternative knowledge management strategies are highlighted to address the changes brought about by the fourth industrial revolution. We argue that organizations may prioritize either augmentation or automation, or they may embrace an organic relationship between the two, which calls for a third approach to managing knowledge: DeParadoxication. By reviewing relevant literature, this chapter proposes a theoretical framework for knowledge management in the twenty-first century.

Keywords

  • artificial intelligence
  • industry 4.0
  • augmentation
  • automation
  • knowledge management
  • leadership
  • strategy
  • IT adoption
  • job design
  • employee skills

1. Introduction

The rise of the knowledge economy in the 1990s stressed the importance of knowledge as the most critical resource for achieving a competitive advantage [1]. Reflecting the central role of knowledge in production, the term knowledge management was introduced, highlighting the need for organizations to actively manage knowledge creation and exploitation [2]. Knowledge has always been a source of competitive advantage, even before the knowledge-economy era; that said, its importance grew with the expansion of the service sector and knowledge-intensive industries [1]. Thus, advanced information technology solutions and the quality of the knowledge embedded in workers, together with workers' willingness to share knowledge and their ability to create it, arguably became more important to competition than financial and physical resources. Stated differently, in the knowledge-economy era, products and services are characterized by rapid obsolescence; therefore, the speed of knowledge creation and innovation, and the dissemination and widespread use of knowledge within organizations, among other knowledge-intensive activities, matter more to organizational competitiveness than any other resource [3].

Many scholars argue that we are in the fourth industrial revolution (Industry 4.0), a term made famous by Klaus Schwab, the founder and chairman of the World Economic Forum [4]. The shift from the third to the fourth industrial revolution was caused by the development of several general-purpose technologies (GPTs), which infuse the physical world with the digital and biological [5]. Among those GPTs are artificial intelligence (AI), cyber-physical systems (CPS), intelligent robots, and advancements in nanotechnology and biotechnology, which enabled a major leap in innovation, automation, productivity, quality, efficiency, and customization [5, 6]. With the emergence of the fourth industrial revolution, knowledge management is far from being obsolete. The importance of knowledge in achieving a competitive advantage has been increasing exponentially. Agreeing with some scholars in the field of knowledge management, we argue that the approach to knowledge management should adjust to accommodate the technological advancements of the current era [7].

This chapter will begin by highlighting the major features of the first three industrial revolutions, leading to an in-depth presentation of the fourth industrial revolution. It will then present an overview of the knowledge management strategies debated during the third industrial revolution. The chapter will then turn to three knowledge management strategies suited to the fourth industrial revolution. Afterward, the concept of knowledge leadership will be addressed, highlighting the best-fit leadership styles and roles in the digital era. Finally, the chapter will briefly present some technological and human challenges, with a focus on the essential role of leadership in managing those challenges and their interrelations.


2. The first three industrial revolutions

An industrial revolution is marked by technological breakthroughs that lead to substantial shifts in the socio-economic landscape. Such innovations, termed general-purpose technologies (GPTs), can be widely integrated into various industries (pervasiveness), enhance productivity and efficiency (dynamism), and foster innovation and the development of diverse technical applications (complementarities) [8]. In the context of this chapter, we define an industrial revolution as the rise of multiple technological advancements that deeply transform the economy, particularly in terms of production methods, the creation of new industries or the obsolescence of existing ones, and the impact on the workplace and workforce.

In essence, although the introduction of GPTs during each industrial revolution marks a significant surge in automation, productivity, and efficacy, it is essential to highlight that the pace at which organizations adopt these technologies can vary based on financial constraints and their ability to integrate innovations, among other factors. So, while some enterprises are at the forefront, leveraging cutting-edge technologies, others may still rely on tools from past industrial phases.

Before the first industrial revolution and the factory system, manufacturing took place within the domestic system, also referred to as the putting-out system or cottage industry. Craftsmen produced goods at home or in workshops. Merchant capitalists delivered raw materials to craftsmen, who processed them into final goods. The merchant capitalist then collected the goods, compensated the craftsmen, and sold the products in the market. The domestic system, thus, was characterized by high decentralization, low efficiency, and low production capacity. That said, the knowledge of the entire production process was embedded in the artisan, and the production quality depended on their skill level, with a high ability for customization. Moreover, craftsmen had more control over their schedules and working methods, which meant merchant capitalists had a diminished role in directing the production process.

The Industrial Revolution, a transformative era, unfolded in three distinct phases, fundamentally reshaping society and the economy. The First Revolution introduced mechanization with steam power, the Second ushered in mass production with electricity, and the Third revolutionized industries with digital technology. Each phase marked a significant leap in technological advancement, collectively setting the stage for the modern industrialized world.

2.1 The first industrial revolution

The onset of the First Industrial Revolution, often termed Industry 1.0, can be traced back to the mid-eighteenth century, marked by significant innovations like the steam engine. Initially invented by Thomas Newcomen in 1712, the steam engine underwent transformative enhancements under James Watt in 1769 [9]. The era’s primary General-Purpose Technologies (GPTs) included steam and water power, facilitating the shift toward mechanized production. By the mid-nineteenth century, the adoption of steam engines and the factory system intensified, propelling industrialization to new heights.

These technological advancements ushered in heightened automation, lessening reliance on manual craftsmanship and amplifying production speed and efficiency. The outcome was a monumental increase in product diversity, volume, and affordability. Steam engines became instrumental across multiple sectors, including textiles, transportation (like steam trains and ships), construction, farming, mining, and glassmaking [9]. This boom spurred a greater need for resources such as coal and iron. Consequently, cities grew as more people flocked to them for factory jobs, replacing the artisan-based workforce of yesteryears. This shift also birthed new socio-economic divisions, notably the working class and the capitalist elites.

However, the gifts of the Steam Age came with their set of challenges. While production and transportation underwent revolutionary enhancements, there were adverse repercussions like heightened pollution, subpar working environments, meager wages, and the exploitation of child labor. These developments further deepened social disparities and introduced a range of societal concerns.

2.2 The second industrial revolution

The late nineteenth century heralded the onset of the Second Industrial Revolution, frequently referred to as Industry 2.0, primarily propelled by the innovation of electricity. This pivotal change witnessed the shift from steam to electric power, bringing about a transformative wave in factory automation. The genesis of this transformation can be traced back to 1870 with the invention of electric generators and trains. This period also saw the introduction of assembly lines and a structured division of labor, leading to the large-scale production of goods. The work process became more standardized, tasks more simplified, and labor more task-specialized. In this framework, workers were often perceived as mere cogs in the production wheel, required to possess only rudimentary skills for their specific tasks. One evident advantage of this was the substantial augmentation in productivity and operational efficiency.

Beyond just electrification, this era was marked by groundbreaking advancements in domains like steel manufacturing, chemicals, combustion engines, and communication [9]. This period birthed or significantly enhanced industries like automotive, aviation, construction, agriculture, and the oil and petrochemical sectors.

Building upon the foundations of the First Industrial Revolution, Industry 2.0 further accelerated economic expansion and urbanization. This led to a surge in oil demand, and cities expanded, necessitating enhanced infrastructure. Products became more accessible to consumers due to their increased affordability. However, this period also brought challenges: deteriorating work conditions, rising unemployment, heightened work-related stress, and growing social disparities became more pronounced. Additionally, as with its predecessor, the era was marked by escalating environmental pollution and a worrying trend of resource exhaustion.

2.3 The third industrial revolution

The Third Industrial Revolution, often termed Industry 3.0 or the Digital Revolution, emerged in the mid-twentieth century, marked predominantly by the advent of microprocessors. This era witnessed a pivotal transition from primarily hardware-focused technologies to digitalization, heralding what many consider the dawn of the Information Age [9]. A benchmark moment in this revolution was around 1970 when advancements in computers and automation systems reached a point of widespread industrial application. This period’s pivotal General-Purpose Technologies (GPTs) encompassed computers, the internet, advanced electronics, and robotics. This technological surge enabled the automation of complex tasks, fostering an era of specialized production combined with heightened efficiency.

These innovations allowed factories to operate with minimal human intervention. Concurrently, there was significant progress in both nuclear energy and sustainable energy sources. The demands of Industry 3.0 emphasized a more educated and highly skilled workforce, amplifying the importance of advanced education and training programs. Additionally, the revolution ushered in dramatic improvements in communication, knowledge dissemination, storage, and analytical capabilities due to the evolution of information and communication technology.

Digital transformation reshaped numerous sectors, from e-commerce, digital and social media platforms, IT, and telecom, to biotech, 3D printing, green energy, financial services, healthcare, and manufacturing. These sectors have been reshaped through automation, advanced analytics, and advanced manufacturing technology. The ripple effects of Industry 3.0’s innovations persist, with many experts positing that we are still navigating this transformative period. A notable societal shift during this time has been a heightened consciousness about work-life harmony, environmental sustainability, and addressing socio-economic disparities.


3. The fourth industrial revolution

In this twenty-first century, it’s undeniable that trends toward automation and digitalization are intensifying. However, there’s ongoing debate regarding whether we are navigating the Fourth Industrial Revolution or merely witnessing an extension of the Third Industrial Revolution’s core technological advancements [8]. Despite differing views, a common consensus is that both productivity and customization are on an upward trajectory, bringing along substantial shifts in employment patterns and technological innovation.

Among those who advocate that we are amidst the Fourth Revolution, many emphasize a suite of interrelated technologies underpinning Industry 4.0, rather than a single predominant one. Prominent among these GPTs are artificial intelligence (AI), big data, the Internet of Things (IoT), cloud computing, machine learning (inclusive of deep learning), advanced robotics, cyber-physical systems (CPS), additive manufacturing, and blockchain [8, 10, 11]. Notably, AI and CPS stand out as particularly emphasized GPTs, as they are also inclusive of most of the previously mentioned GPTs.

The innovations have undeniably left a profound mark across various sectors, ushering in enhanced intelligence in industries such as manufacturing, healthcare, transportation, energy, information and communication technology, as well as commerce. Most job roles will likely see shifts, with further automation of routine tasks and, notably, of creative and non-routine tasks that were once deemed uniquely human. Consequently, there’s a growing discourse about the requisite skills for the modern workforce and how frequently these skills might become outdated. This directly influences individual employability and raises concerns about heightened unemployment rates as technology adoption and advancement accelerate. Moreover, Industry 4.0 raises significant concerns about the loss of craftsmanship and an increase in wage disparity [12]. The accompanying Figure 1 illustrates the core characteristics of the four industrial revolutions, underscoring the consistent rise in automation through each phase.

Figure 1.

Major characteristics of the four industrial revolutions.

Given the significant influence of both GPTs and the widespread agreement among scholars regarding their pivotal role in the fourth industrial revolution, the subsequent sections will delve deeper into the realms of AI and CPS. Furthermore, we’ll underscore the economic paradigm of this revolution, characterized by a transition toward mass personalization.

3.1 Artificial intelligence

AI’s definition remains debated, with no universal agreement [13]. Typically, AI is perceived as a system that learns from past experiences, comprehends its environment, modifies based on new data, and makes decisions aligned with its designated objectives. Advanced AI systems exhibit autonomy, implying decision-making without human intervention. Depending on the AI’s sophistication, it might replace human activities, ranging from basic repetitive tasks to intricate cognitive ones. The ultimate objective is to enhance prediction quality while saving time and costs [13]. Therefore, a comprehensive AI definition suggests a system capable of partially or wholly substituting human roles, amplifying capabilities beyond traditional automation.

Numerous taxonomies have been proposed by both researchers and professionals. For instance, in 2017, PwC introduced a classification with four AI categories, based on the level of human interaction with machine decisions and on whether the AI merely automates existing tasks or explores new ones independently [14]. The report thus identified four types of AI: assisted intelligence, automated intelligence, augmented intelligence, and autonomous intelligence, as shown in the following Figure 2.

Figure 2.

Artificial intelligence forms based on human involvement and level of adaptability (adapted from: [14]).

Another classification differentiates between weak (narrow) and strong (general) AI [11]. Weak AI refers to systems designed and trained for specific tasks, with no general problem-solving capabilities. Strong AI refers to systems that can perform intellectual tasks such as reasoning, problem-solving, judgment, learning, and understanding natural language. Some theorize a third type, super AI: a system capable of matching humans in creativity, social intelligence, and wisdom. It is noteworthy that strong and super AI systems are still theoretical, and their realization may not come soon [11]. Thus, current AI solutions may reduce the human element in production, but the total replacement of humans is not probable for the time being (Figure 3).

Figure 3.

Major characteristics of weak, strong, and super AI systems.

Alternatively, AI can be classified into reactive machines and limited-memory systems. Reactive machines do not have the capacity to learn from past experience, while limited-memory AI can learn from data input and amend decisions accordingly. A typical example of reactive AI is machine learning models, and a typical example of limited-memory AI is deep learning algorithms. Some add the theory of mind and self-aware systems, which might evolve in the future. The theory of mind suggests the development of AI systems that can have emotions, beliefs, desires, and mental capabilities, similar to what we identified before as super AI. Self-aware AI goes one step further by developing systems that have their own consciousness and emotions. Alternatively, AI could be categorized based on its function, such as machine learning, deep learning, neural networks, natural language processing, cognitive computing, robotic automation, and so on (Figure 4).

Figure 4.

Major attributes of reactive, limited-memory, mind-like, and self-aware AI systems.

Decision-making in a system necessitates an algorithm, processing power, and relevant data for algorithm training. As algorithms grow more sophisticated and computers advance [15], the quality and quantity of the data become pivotal. A richer dataset ensures superior algorithm outcomes. However, AI concepts, like expert systems, machine learning, and neural networks, aren’t novel. Introduced around the mid-twentieth century, their evolution was previously constrained by the absence of substantial data and limited computational capacities [10]. With the advent of big data, permitting extensive data accumulation, coupled with amplified processing abilities, AI’s potential has surged dramatically [13]. Given the expected advancements in data solutions and processing capabilities, future prospects seem brighter.
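To make the three ingredients named above concrete, the following is a minimal, purely illustrative sketch: it uses scikit-learn and a synthetic dataset, neither of which appears in the chapter. The synthetic data stands in for an organization's historical records, the logistic-regression model for the algorithm, and the training step for the processing power consumed.

```python
# Illustrative only: a minimal supervised-learning pipeline showing the three
# ingredients discussed above -- an algorithm, compute to train it, and data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Data: a synthetic dataset stands in for the records an organization holds.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Algorithm: a simple model; richer data and more compute enable more complex ones.
model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)  # processing power is consumed here, during training

# Decision quality is checked on unseen data, mirroring the point that outcomes
# depend on the quality and quantity of the training data.
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

With fewer or noisier training records, the same algorithm and the same compute yield a weaker hold-out score, which is the dependency on data quality and quantity the paragraph describes.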

The era of digitalization witnesses exponential growth in data volume, speed, and diversity. Whether structured or unstructured, harnessing this data for knowledge extraction is invaluable. This extracted knowledge, besides being shareable, is a potent innovation catalyst. However, given big data’s sheer volume and complexity, conventional processing tools often fall short. Thus, advanced predictive analytics are essential to derive meaningful insights. The cloud further facilitates efficient big data storage and enhances accessibility, offering computational prowess and diverse IT tools. The pay-as-you-use nature of cloud services eliminates hefty initial IT infrastructure investments, especially benefiting small and medium enterprises with limited capacity to invest in intricate IT solutions. Therefore, current AI development and adoption resulted from several interdependent technological advancements, and future systems will likewise depend on their continued evolution.

3.2 Cyber-physical systems

CPS fuses the realm of physical processes with virtual computations [11]. This is achieved through several components: sensors, actuators, computation systems (AI systems), and communication networks, which enable these components to interact with each other. Sensors are physical endpoints that monitor the surroundings and send data through the network to the computation system. The computation system analyzes the data and generates decisions, which are implemented by the actuators, also physical endpoints. Prime examples of CPS include autonomous vehicles, sophisticated self-operating robots, and smart factories.

While CPS encompasses the concept of IoT, it emphasizes a deep synchronicity between hardware and digital computations. IoT refers to a network of devices, each boasting its unique digital identifier on the internet. Linked by specific protocols, these devices gather, disseminate, analyze, execute, and exchange information. This intricate web of interconnected devices amasses a treasure trove of insights, paving the way for further innovation. Beyond mere data logging and sharing, these devices can collaborate to oversee and fine-tune one another. In essence, IoT bridges the tangible world with its digital counterpart, increasing optimization, efficiency, quality, and effectiveness, and minimizing the need for human operators [11]. The following Figure 5 provides a simplified CPS model.

Figure 5.

Simplified model of a cyber-physical system.
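To make the sensor-computation-actuator flow of Figure 5 more concrete, the following is a minimal, hypothetical sketch of a single control loop. The thermostat-style scenario and all class and function names are illustrative assumptions of ours, not drawn from any referenced CPS implementation, and the network hop is simulated by a plain function call.

```python
# A minimal, hypothetical sketch of one cyber-physical control cycle:
# sensors observe the environment, a computation component decides,
# and actuators apply the decision back to the physical process.
from dataclasses import dataclass
import random


@dataclass
class SensorReading:
    temperature_c: float  # physical measurement sent over the network


def read_sensor() -> SensorReading:
    """Physical endpoint: monitor the surroundings (simulated here)."""
    return SensorReading(temperature_c=random.uniform(15.0, 35.0))


def compute_decision(reading: SensorReading, setpoint_c: float = 22.0) -> str:
    """Computation system (in practice an AI model or controller)."""
    if reading.temperature_c > setpoint_c + 1.0:
        return "cool"
    if reading.temperature_c < setpoint_c - 1.0:
        return "heat"
    return "hold"


def actuate(command: str) -> None:
    """Physical endpoint: implement the decision (simulated here)."""
    print(f"actuator command: {command}")


if __name__ == "__main__":
    for _ in range(3):  # three sense-analyze-actuate cycles
        actuate(compute_decision(read_sensor()))
```

In a real CPS, the decision step would sit on a networked computation platform and the sensor and actuator calls would be physical devices, but the loop structure is the same.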

3.3 The economic model of the fourth industrial revolution

The GPTs mentioned earlier pave the way for enhanced mass customization in production. They even hold the promise of advancing to mass personalization, where products can be jointly developed with individual customers without driving up production costs. Today’s consumers crave personalized experiences. The capability to understand and cater to individual preferences while sustaining large-scale production became feasible primarily due to intelligent systems [12]. These systems have refined customization, evident in platform-based business models, online supply chains driven by real-time demand, and tools that involve customers in the design process [16]. Moreover, organizations that focus on customers rather than products rely on consumer data, communication, and behavioral analysis to further cater to individual customers [17].

While many companies are yet to fully embrace mass personalization, many are pushing the boundaries of mass customization, aiming to serve increasingly specific niches and ultimately individual needs. Failing to do so might lead them to lose out to competitors, bear higher inventory expenses, and miss out on innovative breakthroughs, potentially eroding their market edge [18].


4. What is knowledge management?

Although scholars differ in defining knowledge management, the consensus is that organizations should seek to extract the maximum value from their existing knowledge and strive to add to their knowledge repository through knowledge creation and acquisition [19]. Doing so achieves continuous innovation in products and services, which leads to a sustainable competitive advantage. Some scholars presented knowledge management as a system or strategy with supporting processes that allow knowledge-intensive activities to flourish within a firm [20, 21, 22, 23]. Others looked at knowledge management as the active management or facilitation of a set of knowledge-intensive activities. A taxonomy of these activities could group them under three major ones: knowledge sharing (including dissemination, usability, and accessibility), knowledge creation, and knowledge codification (including transformation, storage, protection, and representation) [24]. For some, knowledge management should treat knowledge as a resource, and thus the focus should be on accumulating intellectual assets. Many scholars viewed knowledge management as the active management of various knowledge types and their interaction and transformation, such as explicit versus tacit knowledge and organizational versus personal knowledge [22, 25, 26, 27].

Regardless of the definition, knowledge management focuses on two dimensions, the information technology and the human, and on how to manage each and their interaction to maximize innovation, production, revenues, and profitability [28]. Management can foster knowledge-intensive activities, whether embedded in humans and networks, in information systems, or in a combination of both, through formal and informal mechanisms. Formal mechanisms include strategy formation, processes, systems, rules, and procedures; informal mechanisms include the organizational culture, encompassing norms and behaviors within and across teams and networks [28, 29].

4.1 How was knowledge managed during the first and second industrial revolutions?

Before Industry 1.0, the putting-out system was highly dependent on the embodied and embrained knowledge embedded in the craftsman. Rarely was any of this knowledge encoded in processes, structures, or documents. Therefore, the knowledge of the full production process was tacit in nature, hard to articulate, and represented a trade secret fully owned by the artisan. Encultured knowledge was of less significance, as most production operations were based in households or small workshops. Moreover, each craftsman’s knowledge and skills were unique, which resulted in variations in quality and product differentiation, both dependent on the artisan’s level of skill. Merchant capitalists specialized in managing the supply chain and marketing, complementary yet completely separate processes from the production process. Due to the dominance of tacit knowledge, the transfer of knowledge took place through apprenticeship. The master artisan took an apprentice, who would shadow and work under direct supervision. Thus, the process of learning relied on intensive tacit-knowledge exchange (imitation, mentorship, on-the-job training, trial and error, and so on) and was gradual (from novice to tradesman-in-training to master), taking years before the apprentice reached the level of mastery.

The introduction of machines during Industry 1.0 resulted in the automation of much of the knowledge previously held by craftsmen. For instance, tasks like knotting, cutting, and stitching, once done with simple tools by craftsmen, were gradually replaced by machines. This shift accelerated during the Second Industrial Revolution when machines became more complex, and assembly lines were introduced. Division of labor and time-and-motion studies led to increased specialization in tasks, with employees performing simple, repetitive tasks that complemented the assembly line. As a consequence, much of the knowledge inherent in craftsmen’s skills was decoded and substituted by machines. The comprehensive knowledge of the entire production process was now owned by a new class of supervisors and managers [30]. These managers acted as agents of capitalists, overseeing planning, control, direction, coordination, resource allocation, scheduling, and staffing of operations to ensure efficiency, predictability, and consistency [31].

In this context, employees required only a few basic, easily trainable skills, reducing the need for extensive knowledge transfer among workers and making them readily replaceable. The critical knowledge for organizational success was now vested in managers and supervisors, and a significant portion of it was documented in processes and documents, transforming it from personal to organizational knowledge. As a result, capitalists gained greater control over their investments, and organizations operated in a standardized manner, with reduced risk of losing essential knowledge due to employee turnover.

4.2 What knowledge management strategies were proposed in the twentieth century?

Many scholars proposed various knowledge management strategies with the emergence of the knowledge management concept in the 1990s. Most notable is the work of Hansen et al., who proposed two main strategies: codification and personalization [23]. The codification strategy follows an IT perspective on knowledge management, which aims to logically codify and store information in databases to increase its accessibility and widespread usability. In other words, the focus is on converting tacit knowledge, which is embedded in employees, into explicit knowledge. The competitive advantage of codification is speed, reliability, efficiency, and quality control, aiming for standardization and competitive pricing.

On the other hand, the personalization strategy follows an HR perspective on knowledge management, which aims to rely on the knowledge embedded in employees, relationships, and managerial styles. Information systems are viewed here as mere enablers to augment work, while the key factors are employee experience, networking, interaction, knowledge-sharing with others, and building a knowledge-sharing culture. The competitive advantage of personalization is promoting creativity, uniqueness, and flexibility – aiming for customization and allowing higher pricing for differentiation. The following Figure 6 presents the major attributes of codification and personalization.

Figure 6.

Major attributes of the codification and personalization strategies (adapted from: [23]).

In reality, even when one strategy is dominant or primary, the other acts as supportive or secondary. Hansen et al. warned that only one strategy should be dominant, for if management pushed for both, the result might be confusion and failure [23]. Several scholars criticized this standpoint, arguing that organizations’ desire to harvest the benefits of both strategies cannot be ignored and stressing their equal importance [32]. Thus, some scholars proposed a combination strategy, where decision-makers in organizations support both personalization and codification, utilizing each to the extent that it is useful to the activity, product, and context [19]. Another critique is that Hansen et al. focused on management consulting firms as a study sample, and such a duality might not be applicable in other sectors, such as manufacturing.

Another foundational work is Nonaka and Takeuchi’s knowledge spiral model, which aims to increase knowledge sharing and creation within organizations, thus enabling them to innovate products and processes faster and sustain their competitive advantage [27]. They proposed the SECI process, which stands for socialization, externalization, combination, and internalization. Socialization is when employees share their tacit knowledge and transfer it to others as tacit knowledge, through interaction and mentorship. The receivers of tacit knowledge embed it through imitation, observation, and practice. This leads to externalization, where tacit knowledge is articulated, that is, converted into explicit knowledge through dialogs and reflections. This explicit knowledge is then combined into documents and databases, which allows widespread dissemination. Finally, the explicit knowledge becomes available for employees to internalize and convert back into their own tacit knowledge through acquisition, sense-making, analysis, and reflection, leading them to be creative in generating new concepts and ideas, that is, new tacit knowledge. That said, the four stages of the SECI model do not occur in a vacuum. They take place in group and organizational contexts, referred to as “Ba”. Ba includes the mental space (including experiences, values, and ideas), the physical space, and the virtual space (such as databases, platforms, and information and communication technologies) (Figure 7) [33].

Figure 7.

The SECI model of knowledge creation in organizations (adapted from: [25]).

Therefore, based on a study of the Japanese manufacturing sector, which they argued could be applied to various sectors and national cultures, Nonaka and Takeuchi provided a strategy for how management should manage knowledge to make it more organizational and to achieve a learning, creative, and innovative firm [33]. That said, a major critique of this model is that the conversion of tacit knowledge to explicit knowledge is never fully accomplished, as much of the knowledge that is personal or embedded in relationships is hard to articulate and convert, and even the converted knowledge is subject to interpretation [33]. So, although the SECI model provides a framework for formulating a knowledge management strategy and implementation practices, hardly can anyone claim that it is complete and comprehensive in addressing knowledge management activities.

These are some of the many knowledge management strategies and frameworks proposed during the 1990s, in the midst of the third industrial revolution. Although they remain relevant to this date, we will argue that alternative strategies are more suitable for the fourth industrial revolution era, reflecting the substantial technological advancements of the twenty-first century.


5. Knowledge management strategies for industry 4.0

In light of the opportunities and challenges presented by the fourth industrial revolution, organizations must develop a strategic approach to Knowledge Management to harness the advantages and address the risks. The management of knowledge activities and processes is still influenced by two principal dimensions: the technological and the human. AI and CPS are progressively assuming cognitive responsibilities once handled by humans, leading to increased automation. However, it’s important to note that AI and CPS also have the potential to enable augmentation, where humans and machines collaborate closely to accomplish tasks. As a result, automation and augmentation appear to represent two contrasting strategic directions for Knowledge Management, especially within the Human Resources dimension. The former seeks to minimize, if not entirely replace, human involvement, while the latter aims to retain and enhance human performance.

In line with these two dimensions, Industry 5.0 represents a significant shift from the automation and efficiency focus of Industry 4.0 toward a model that integrates human creativity and collaboration with smart technologies. This transition necessitates a nuanced approach to knowledge management, highlighting the importance of human-centric values, sustainability, and resilience. The progression from Industry 4.0 to 5.0 amplifies the importance of these processes but adds layers of complexity and opportunity, particularly in terms of leveraging human-machine collaboration. In Industry 5.0, knowledge management must not only ensure that information flows effectively but also that there is a symbiotic relationship between the cognitive and creative capabilities of humans and the analytical, data-processing powers of AI and machine learning systems. Additionally, the transition from Industry 4.0 to Industry 5.0 is not marked by a clear boundary but rather represents an evolving paradigm.

If the ultimate goal is hyper-automation, meaning the automation of nearly all organizational processes from start to finish, the result is a more efficient, logical, and comprehensive workflow [15]. This can lead to higher levels of quality control and faster production, potentially increasing profitability. However, machines, as they currently stand, have various limitations that prevent them from completely replacing humans. These limitations include areas such as emotional intelligence, comprehension, problem-solving, intuition, creativity, consciousness, and judgment. In essence, current AI solutions are specialized; commercialized strong or super AI systems do not exist and remain in the realm of theory and the future. Consequently, existing solutions excel in rule-based decision-making and in managing structured and semi-structured tasks. Yet, they fall short when dealing with unstructured, complex, and ambiguous tasks due to their inability to adapt beyond their predefined algorithms.

Alternatively, the augmentation approach suggests harnessing the capabilities of machines while keeping humans in the loop to address the limitations mentioned earlier. However, the allure of automation is hard to resist whenever the opportunity arises to automate a task at greater benefit and lower cost. For the time being, humans working alongside machines are better equipped to handle the tasks envisioned for strong and super AI. They are better positioned to exercise responsibility, judgment, substantial reasoning, and complex decision-making thanks to their adaptability and interpersonal skills. The following Figure 8 outlines some advantages of both automation and augmentation, where the strengths of one approach can be seen as the weaknesses of the other, underscoring the inherent contradiction between these two approaches.

Figure 8.

Major advantages of automation and augmentation.

Recognizing the distinctions and conflicts between the automation and augmentation strategies, Raisch and Krakowski also emphasized their interconnectedness [15]. They contend that favoring one approach over the other restricts the potential to reap the advantages of both, as each can enhance the capabilities of the other. This interrelationship is highlighted through both the timing and the scope of implementation, which they debated under the following two scales: temporal and spatial.

5.1 Temporal scale

At a certain juncture, management may opt for augmentation when dealing with complex tasks that require the nuanced expertise of professionals, making them challenging to automate. Managers and experts collaborate with data scientists and engage with intelligent systems, where the output of these systems enhances their capabilities and knowledge. They assess the systems’ results, make choices based on their expertise, and compensate for the system’s limitations. Over time, the intelligent system learns and refines its output, leading to a transition from augmentation to automation, resulting in improved efficiency, precision, and overall effectiveness.

However, as time progresses, circumstances can evolve, and the intelligent system’s output may no longer lead to optimal outcomes. This can be attributed to the limited capacity of systems to adapt to significant contextual changes. At this stage, augmentation is reintroduced to restore optimized results that align with the evolving context. This shift from augmentation to automation and, subsequently, from automation to augmentation forms a cyclical relationship. Therefore, the assertion is that to sustain long-term performance, organizations should effectively manage the periodic transitions between these two approaches.
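The cyclical hand-off described above can be illustrated with a short sketch. The snippet below is a hypothetical toy, not an implementation drawn from the literature: it tracks a rolling window of an intelligent system's recent accuracy and switches a task between "augmentation" and "automation" modes, with the thresholds and accuracy stream chosen purely for illustration.

```python
# Hypothetical sketch of the temporal cycle: a task starts in augmentation
# (human reviews machine output), moves to automation once recent accuracy is
# consistently high, and reverts to augmentation when accuracy degrades after
# a contextual change. All thresholds and values are illustrative.
from collections import deque


def choose_mode(recent_accuracy: deque, current_mode: str,
                promote_at: float = 0.95, demote_at: float = 0.85) -> str:
    if not recent_accuracy:
        return "augmentation"
    avg = sum(recent_accuracy) / len(recent_accuracy)
    if current_mode == "augmentation" and avg >= promote_at:
        return "automation"      # the system has learned enough to run alone
    if current_mode == "automation" and avg < demote_at:
        return "augmentation"    # context drift: bring the expert back in
    return current_mode


# Simulated accuracy stream: the system improves, then the context shifts.
stream = [0.80, 0.90, 0.96, 0.97, 0.97, 0.88, 0.78, 0.75]
window, mode = deque(maxlen=3), "augmentation"
for accuracy in stream:
    window.append(accuracy)
    mode = choose_mode(window, mode)
    print(f"accuracy={accuracy:.2f} -> mode={mode}")
```

Running the loop shows the task promoted to automation once the rolling accuracy stabilizes above the upper threshold and demoted back to augmentation when it falls below the lower one, which is the periodic transition the temporal scale describes.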

5.2 Spatial scale

Tasks are interconnected, and when one task is automated, managers and experts engage with the inputs and outcomes of the automated task to carry out other related tasks that come before or after it. This interaction leads to an enhancement of their performance in the adjacent preceding or subsequent tasks. Furthermore, the ongoing collaboration between humans and machines serves to fine-tune the automated outputs, as experts adapt the input tasks based on their assessment of the results.

For instance, the process of problem-solving involves three primary sequential tasks: defining the problem, generating alternative solutions, and choosing the best solution. If automation is applied to the task of generating alternative solutions, decision-makers input the problem definition and constraints into the system and then select from the alternative solutions provided by the system. The interaction between decision-makers and the system empowers them to improve the task of defining the problem and the task of selecting the best solution. Additionally, drawing from the input and output, decision-makers propose adjustments to enhance the efficiency of the automated task of generating alternative solutions.
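A sketch can make this human-machine hand-off explicit. The snippet below is hypothetical and simplified: the option pool, the budget constraint, and the scoring rule are invented stand-ins for, respectively, the automated alternative generator and the expert judgment applied in the surrounding tasks.

```python
# Hypothetical sketch of the problem-solving example: the middle task
# (generating alternative solutions) is automated, while humans define the
# problem and select the final solution, adjusting inputs based on the output.
from typing import Callable


def generate_alternatives(problem: str, constraints: dict) -> list[str]:
    """Automated task: in practice an AI system; here a stand-in that filters
    a fixed option pool against a simple budget constraint."""
    options = {"outsource": 50_000, "automate line": 120_000, "hire temps": 30_000}
    return [name for name, cost in options.items() if cost <= constraints["budget"]]


def human_select(alternatives: list[str], judge: Callable[[str], float]) -> str:
    """Augmented task: the expert applies judgment the system lacks."""
    return max(alternatives, key=judge)


# Task 1 (human): define the problem and its constraints.
problem, constraints = "reduce order backlog", {"budget": 60_000}

# Task 2 (machine): generate alternatives within the stated constraints.
alternatives = generate_alternatives(problem, constraints)

# Task 3 (human): choose, with tacit expertise reduced here to a toy scoring rule.
decision = human_select(alternatives, judge=lambda option: len(option))
print(alternatives, "->", decision)
```

If the chosen alternative proves unworkable, the decision-maker revises the problem definition or constraints and reruns the automated step, which is the feedback from the adjacent human tasks to the automated one described above.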

5.3 The automation-augmentation co-existence

The shift from automation to augmentation and vice versa is conditional on optimizing the tasks’ effectiveness and efficiency. It is also important to highlight that current AI solutions cannot fully replace humans, as they have not reached the point of singularity, which is unlikely to be reached in the foreseeable future, if ever. Thus, automation will always co-exist with augmentation, where some tasks are easier to automate and others are optimal to keep augmented. Also, as highlighted before, the cyclical relationship between both is vital to maintain optimization and sustain competitive advantage.

5.4 What are the knowledge management strategies in the fourth industrial revolution?

The aforementioned debate leads us to identify three distinct Knowledge Management strategies. The first takes the approach of automation, aiming to reach hyper-automation to the fullest extent possible. Some scholars refer to this approach as “Robonomics”, where production is achieved overwhelmingly through robotics, artificial intelligence, and automation technologies [11].

The second strategy recognizes the importance of embracing technological advancements to enhance and amplify human performance. This approach encourages high levels of interaction between humans and machines, constituting a knowledge management strategy labeled as “Cybernetication.” Cybernetication seeks to augment human capabilities through artificial intelligence systems and robotics. Some scholars even anticipate that the future development and widespread adoption of general-purpose technologies such as virtual and augmented realities, cobots, cognitive systems, brain-computer interfaces, cybernetic implants, wearable technologies, and genetic modifications will propel a significant leap in augmentation and human-machine interactions, potentially paving the way for the fifth industrial revolution [12, 17].

The choice between hyper-automation and Cybernetication represents a tradeoff in approach. Management decisions may lean toward using AI to reduce human involvement and enhance efficiency, or they may prioritize augmenting human abilities and increasing human-machine interaction. Raisch and Krakowski presented these two approaches as a paradox, highlighting their contradictory yet interconnected and complementary nature, an argument supported by their temporal and spatial scales [15].

A third Knowledge Management strategy emerges, labeled as “DeParadoxication.” This strategy aims to resolve the contradictions between the two approaches and effectively manage their interplay and transitions. In other words, under the third Knowledge Management strategy, management seeks to minimize the tension between automation and augmentation by selectively adopting one approach over the other for specific tasks and timeframes. It also manages the transition between these approaches as needed. Moreover, this approach appears to be the most suitable strategy for achieving mass personalization, given its adaptability and efficiency.

The following Figure 9 illustrates these three strategies within KM 4.0, replacing the strategies prevalent in the twentieth century (codification, personalization, and combination). These strategic alternatives align with the shift from Industry 3.0 to Industry 4.0, reflecting the widespread adoption of Industry 4.0 General-Purpose Technologies.

Figure 9.

Industry 4.0 knowledge management strategies: HyperAutomation, Cybernetication, and DeParadoxication.


6. Knowledge leadership

When discussing strategies for managing knowledge, it is essential to emphasize the crucial role of leadership in shaping the vision, goals, and values related to knowledge [33]. “Knowledge leaders”, also referred to as “e-leaders”, influence others to achieve what needs to be done to manage organizational knowledge. Being in a position of influence and decision-making power, leaders contribute to the cultural and structural contexts that allow knowledge sharing and creation to flourish [34]. Additionally, the increasing pace of advancements in information technology has placed pressure on leaders to adapt their leadership styles and use their influence to bring about changes in attitudes, behaviors, and performance at all levels, including individuals, groups, and the organization as a whole [35]. Moreover, it’s not just that technology impacts leadership; leadership also has a significant influence on how technological solutions are integrated into organizational structures and behaviors, leading to the social construction of technological solutions throughout the change management process [35].

The literature examines the leadership styles and behaviors that work best to create knowledge-creating and learning organizations, by influencing individuals, groups, and networks and by establishing processes and norms that support the creation of tangible and intangible knowledge assets. It should be noted that, considering the diverse approaches to knowledge management strategies, the role and style of leadership may differ significantly in each case. Furthermore, these leadership roles may evolve based on the changes required to achieve optimization and competitiveness at different phases of knowledge creation or in response to alterations in knowledge management strategies driven by internal and external contextual factors.

6.1 What is the best leadership style to manage knowledge?

Several organizational factors influence successful digital changes, such as organizational culture, background, external environment, market competition, and technological advancement [36]. Among those, organizational leadership tends to be the most cited factor. There is considerable debate in the literature on the best leadership style for fostering innovation and ensuring the adoption of new technologies. That said, transformational and shared leadership styles are empirically the most effective for IT innovation, adoption, and implementation [34, 37].

In this context, transformational leadership influences employee behaviors and values beyond self-interest to care for organizational well-being, thus enhancing knowledge sharing and innovation and minimizing resistance to IT adoption. Shared leadership is also stressed due to the increasing level of complexity, where leadership roles shift based on the situation, making leadership contingent, dynamic, mutual, and emergent [37]. The various leadership styles are not mutually exclusive alternatives, as they can be connected and complementary to each other. Transformational leaders focus on individuals and groups, intellectually stimulating and inspiring them to be innovative and participative in the process of change [37]. This leads them to own the change initiative and jointly work toward achieving the goals and organizational objectives, thus acting in a shared leadership mode. Moreover, other scholars highlighted the importance of charismatic, distributed, empowering, visionary, and servant leadership styles in supporting knowledge management initiatives, such as the knowledge sharing and creation processes and the adoption of new IT systems [34, 36]. Those leadership styles are not contradictory either, and they reinforce transformational and shared leadership, which are needed to achieve successful innovative and technological changes.

Many scholars focused on the roles, skills, and attributes of knowledge leaders. The most common roles are being visionary and strategic in designing and implementing a knowledge management strategy [34]. This vision is then implemented by influencing employees to adopt it, and by collaborating and participating in translating the vision into goals and implementation tactics. Other leadership roles are essential in the process, such as being an effective communicator, change agent, motivator, coach and mentor, learner, facilitator of learning, intellectual stimulator, educator, supporter, role model, and technologist [34]. As for skills, soft skills were mostly stressed in the literature, especially networking and communication skills, both face-to-face and through digital media. Moreover, e-leaders should be skilled in high-speed decision-making and in effectively managing disruptive change, connectivity, and teams (both in-person and virtual) [35]. Common knowledge-leader attributes include being resilient, creative, proactive, competitive, trustworthy, trusting, humble, ethical, and empathic [34].

Although most of these abilities, skills, and roles are people-oriented, suggesting a similar yet evolutionary track to the trend of leadership studies in past decades, the new phenomenon is the high inclusivity of digital communication and platforms and the high decentralization and openness of organizations. Moreover, there is a return to technical skills when it comes to effective leaders. Knowledge leaders should be able to understand and use various digital solutions, but they also need to stay updated with technological advancements, monitoring which ones should be adopted to ensure a sustainable competitive advantage. The combination of mastering current IT solutions and staying up-to-date with new advancements puts the e-leader under the need for lifelong learning of digital and technical skills [35]. Leaders at various levels need to coordinate closely with IT specialists, who are now positioned, more than ever, in a strategic role. This is true whether the technological solutions lead to more automation, more augmentation, or both. The adoption of IT solutions has both technical and human dimensions that determine its success, and the interaction between both dimensions is inevitable.

6.2 Technological challenges

To remain competitive, organizations must increase their investments in AI and CPS solutions, to enhance automation and augmentation. One significant factor influencing the adoption of these new solutions is their cost. Another crucial consideration is determining which tasks should be automated, and conducting a comprehensive cost-benefit analysis, considering both short-term and long-term implications.

It is imperative to assess how the automation of a specific task will impact other tasks, with a focus on generating value by minimizing costs and/or enhancing differentiation, all while minimizing disruptions and negative side effects [15]. This approach ensures sustainable competitiveness. Furthermore, the availability of high-quality data is a key determining factor, as any algorithm, regardless of its complexity, will be ineffective without access to robust data sources.

Adopting a new solution or upgrading an existing one carries a socio-technical dimension that cannot be underestimated [13]. Several important questions must be addressed, such as:

  1. Why is a new IT solution necessary, and how will it impact tasks and productivity?

  2. Is the organizational culture adaptable enough to embrace the new solution?

  3. What change management techniques are required to minimize resistance and foster acceptance?

  4. Do employees possess the necessary complementary and augmented skills? If not, what is the plan for acquiring and developing these skills?

  5. Is the current organizational structure and existing procedures compatible with the new solution? If not, what changes are necessary, including potential restructuring of departments and jobs, and adjustments to procedures, to ensure the effective integration of the AI solution?

6.3 Human challenges

Automation and augmentation necessitate significant reskilling efforts, requiring the workforce to acquire new skills, upskill, or even deskill to complement AI solutions effectively. In the context of Industry 4.0, many scholars emphasize the importance of soft skills and abilities since most hard skills can be readily replaced by machines [38]. This assertion holds true, particularly regarding cognitive and social skills, which remain challenging to replicate using automation. Among the most frequently cited irreplaceable skills are leadership, teamwork, collaboration, experimentation, problem framing, complex problem-solving, networking, creativity, abstract and design thinking, interpersonal and emotional intelligence, communication, and mentoring. Furthermore, certain abilities continue to be deemed crucial, such as adaptability, responsibility, the ability to learn, sensemaking, insightfulness and intuition, critical thinking, and integrity. Overall, employees with skills that are easily replaceable by machines face a higher risk of job displacement. The more advanced the technical solutions adopted by a company, the less need there is for a large workforce, with an increased focus on the quality of employees.

However, organizations that lose subject-matter expertise may encounter significant setbacks that could result in a competitive disadvantage, even if they have successfully automated hard skills [15]. It’s important to note that if the context changes, hard skills are necessary to adjust AI solutions. Additionally, for augmentation, a deep understanding of how automated tasks function is vital for making necessary adjustments to their inputs and outputs. In essence, human knowledge and hard skills continue to play a critical role in exercising judgment and responsibility since machines are currently incapable of adapting their rules and purpose. Some argue that AI-generic solutions could replace AI-specific solutions, but achieving general intelligence equivalent to human intelligence remains a complex and ongoing experiment.

Therefore, it is essential to maintain a minimum level of human capital with hard skills, especially since organization-specific hard skills are not easily acquired on demand due to time-consuming learning and experience curves. As a result, the number of experts required may decrease due to automation or augmentation, but the level of expertise that must be retained will need to be higher. Some key hard skills that are still in demand include high-level technical expertise (subject-matter knowledge), interpretive and investigative reporting, digital literacy, analytics, and strategy development.

For knowledge leaders, a significant challenge lies in HR planning, encompassing both workforce planning and job design. Striking the right balance is essential, as organizations must retain key, specialized talent while also maintaining functional and numerical flexibility. This necessitates the adoption of various staffing techniques, such as outsourcing, contracting, offshoring, and strategic partnerships. Achieving this balance is critical for maintaining competitiveness while enhancing workforce flexibility and efficiency.


7. Conclusion

This chapter commences with an exploration of the first three industrial revolutions, culminating in an extensive discussion of the fourth industrial revolution. It places a particular emphasis on Artificial Intelligence and Cyber-Physical Systems as the primary General-Purpose Technologies of Industry 4.0. Subsequently, it delves into the concept of knowledge management and provides an overview of how knowledge was managed during the initial two industrial revolutions. The discussion then shifts to major knowledge management strategies that were implemented during Industry 3.0, underlining the necessity for a novel approach to knowledge management in Industry 4.0 due to the emergence and adoption of technological advancements.

Within this context, three distinct knowledge management strategies are proposed: HyperAutomation, Cybernetication, and DeParadoxication. HyperAutomation strives to maximize automation, minimizing human involvement to the greatest extent possible using current technological solutions. However, achieving this goal can be challenging, given the existing limitations of narrow-focused artificial intelligence systems. Future advancements in general-purpose systems, including strong or even super AI, remain theoretical and experimental but hold promise for realizing HyperAutomation.

Conversely, Cybernetication seeks to enhance human capabilities through technological means, such as AI and robotics. Further developments in this domain may lead to even more augmented employees, potentially involving cybernetic implants, wearable technologies, and genetic modifications. Many experts argue that such advancements could usher in the Fifth Industrial Revolution and may come to fruition sooner than the development of strong or super AI systems.

The third strategy, DeParadoxication, focuses on resolving the contradictions between automation and augmentation, effectively managing their interplay and transitions. Arguably the most complex strategy to manage because of its highly disruptive nature, it requires continuous change management encompassing both structural and cultural adaptation. Nevertheless, it is argued to be the most effective way to sustain a competitive advantage in the current era.

The chapter underscores the critical role of knowledge leaders in orchestrating the change required to embrace new IT solutions. It suggests that transformational and distributed leadership styles are the most effective, and highlights the significant roles, skills, and attributes of effective e-leaders. Finally, the chapter identifies some of the challenges knowledge leaders face along both the IT and HR dimensions, which are closely interrelated given the intensive human-machine interaction required in Industry 4.0.

References

1. Drucker P. The Age of Discontinuity: Guidelines to Our Changing Society. 2nd ed. New York: Routledge; 1992
2. Nonaka I. The knowledge-creating company. Harvard Business Review. 1991;69(6):96-104
3. Powell WW, Snellman K. The knowledge economy. Annual Review of Sociology. 2004;30(1):199-220
4. World Economic Forum. The Fourth Industrial Revolution, by Klaus Schwab [Online]. Cologny/Geneva: World Economic Forum; 2017. Available from: https://www.weforum.org/about/the-fourth-industrial-revolution-by-klaus-schwab [Accessed: July 27, 2023]
5. Schwab K. The Fourth Industrial Revolution. Geneva: World Economic Forum; 2016
6. Agrawal M, Eloot K, Mancini M, Patel A. Industry 4.0: Reimagining Manufacturing Operations after COVID-19 [Online]. McKinsey & Company; 2020. Available from: https://www.mckinsey.com/capabilities/operations/our-insights/industry-40-reimagining-manufacturing-operations-after-covid-19 [Accessed: July 27, 2023]
7. Manesh MK, Pellegrini MM, Marzi G, Dabic M. Knowledge management in the fourth industrial revolution: Mapping the literature and scoping future avenues. IEEE Transactions on Engineering Management. 2021;68(1):289-300
8. Martinelli A, Mina A, Moggi M. The enabling technologies of industry 4.0: Examining the seeds of the fourth industrial revolution. Industrial and Corporate Change. 2021;30(1):161-188
9. Hayat A, Shahare V, Sharma AK, Arora N. Introduction to industry 4.0. In: Namasudra S, et al., editors. Blockchain and its Applications in Industry 4.0. Singapore: Springer; 2023. pp. 29-59
10. Haenlein M, Kaplan A. A brief history of artificial intelligence: On the past, present, and future of artificial intelligence. California Management Review. 2019;61(4):5-14
11. Dolanay SS. Artificial intelligence, smart robots, types of artificial intelligence and a new economic order. In: de Souza GHS, editor. An Overview on Business, Management and Economics Research Vol. 2 [Online]. B P International; 2023. pp. 138-160
12. Santhi AR, Muthuswamy P. Industry 5.0 or industry 4.0S? Introduction to industry 4.0 and a peek into the prospective industry 5.0 technologies. International Journal on Interactive Design and Manufacturing. 2023;17:947-979
13. Duan Y, Edwards JS, Dwivedi YK. Artificial intelligence for decision making in the era of big data – Evolution, challenges, and research agenda. International Journal of Information Management. 2019;48:63-71
14. Rao AS, Verweij G. Sizing the Prize: What’s the Real Value of AI for your Business and How Can You Capitalize? [Online]. PricewaterhouseCoopers; 2017. Available from: https://www.pwc.com/gx/en/issues/analytics/assets/pwc-ai-analysis-sizing-the-prize-report.pdf [Accessed: May 5, 2024]
15. Raisch S, Krakowski S. Artificial intelligence and management: The automation–augmentation paradox. The Academy of Management Review. 2021;46(1):192-210
16. Baranauskas G. Mass personalization vs. mass customization: Finding variance in semantical meaning and practical implementation between sectors. Social Transformations in Contemporary Society. 2019;7:6-15
17. Narkhede G, Pasi B, Rajhans N, Kulkarni A. Industry 5.0 and the future of sustainable manufacturing: A systematic literature review. Business Strategy & Development. 2023;6:704-723
18. Fenech C, Perkins B. The Deloitte Consumer Review: Made-to-Order: The Rise of Mass Personalization [Online]. Deloitte; 2019. Available from: https://www2.deloitte.com/content/dam/Deloitte/ch/Documents/consumer-business/ch-en-consumer-business-made-to-order-consumer-review.pdf [Accessed: May 5, 2024]
19. El-Farr H. Aligning Human Resource Management to Knowledge Management within the UK Management Consulting Sector. Leeds: University of Leeds; 2011
20. Neef D. Making the case for knowledge management: The bigger picture. Management Decision. 1999;37(1):72-78
21. Hislop D. Knowledge management. In: Redman T, Wilkinson A, editors. Contemporary Human Resource Management. Essex: Pearson Education Limited; 2006
22. Alavi M, Leidner DE. Review: Knowledge management and knowledge management systems: Conceptual foundations and research issues. MIS Quarterly. 2001;25(1):107-136
23. Hansen MT, Nohria N, Tierney TJ. What’s your strategy for managing knowledge? Harvard Business Review. 1999;77(2):106-116
24. Anshari M, Syafrudin M, Fitriyani NL. Fourth industrial revolution between knowledge management and digital humanities. Information. 2022;13(6):292
25. Nonaka I. A dynamic theory of organizational knowledge creation. Organization Science. 1994;5(1):14-37
26. Bhatt GD. Information dynamics, learning and knowledge creation in organizations. The Learning Organization. 2000;7(2):89-99
27. Nonaka I, Takeuchi H. The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation. Oxford: Oxford University Press; 1995
28. El-Farr H, Hosseingholizadeh R. Aligning human resource management with knowledge management for better organizational performance: How human resource practices support knowledge management strategies. In: Wickham M, editor. Current Issues in Knowledge Management [Online]. London, UK: IntechOpen; 2019
29. Tsoukas H, Vladimirou E. What is organizational knowledge? Journal of Management Studies. 2001;38(7):973-993
30. McGrath R. Management’s three eras: A brief history. Harvard Business Review [Online]. 2014. Available from: https://hbr.org/2014/07/managements-three-eras-a-brief-history [Accessed: October 31, 2023]
31. Unyimadu SO. Management and industrial revolution in Europe, United States of America and Japan. Engineering Management International. 1989;5:209-218
32. Edwards JS, Handzic M, Carlsson S, Nissen M. Knowledge management research & practice: Visions and directions. Knowledge Management Research & Practice. 2003;1(1):49-60
33. Kahrens M, Früauff DH. Critical evaluation of Nonaka’s SECI model. In: Syed J, Murray PA, Hislop D, Mouzughi Y, editors. The Palgrave Handbook of Knowledge Management. Cham: Palgrave Macmillan; 2018
34. Hosseingholizadeh R, El-Farr H, Kerman NT, Lotfi H, Ahmadi M, Akhoondi M, et al. A systematic review and synthesis of empirical research on “knowledge leadership”: A new insight in the field of knowledge management. International Journal of Information Science and Management. 2022;20(4):167-190
35. Cortellazzo L, Bruni E, Zampieri R. The role of leadership in a digitalized world: A review. Frontiers in Psychology. 2019;10:1938
36. Tseng SM. Investigating the moderating effects of organizational culture and leadership style on IT-adoption and knowledge-sharing intention. Journal of Enterprise Information Management. 2017;30(4):583-604
37. Bunjak A, Bruch H, Černe M. Context is key: The joint roles of transformational and shared leadership and management innovation in predicting employee IT innovation adoption. International Journal of Information Management. 2022;66:102516
38. Mabe K, Bwalya K. Critical soft skills for information and knowledge management practitioners in the fourth industrial revolution. SA Journal of Information Management. 2022;24(1):a1519
