Open access

Introductory Chapter: Challenges in Neuro-Memristive Circuit Design

Written By

Alex James

Submitted: September 16th, 2018 Published: May 27th, 2020

DOI: 10.5772/intechopen.91969


1. Introduction: what makes memristors attractive for neural networks?

The ability of a memristor to change its conductance, i.e. to behave like a resistor yet retain that conductive state, and to change state under a control voltage, makes it resemble a neuron. The spiking neurons in the brain respond to stimuli in different ways; the continuous application of stimuli and the neuron's changing response to them are associated with learning. In the same way, applying voltage pulses of a certain amplitude and frequency can change the conductance state of a memristor, which appears as a change in the amplitude of its output current [1, 2, 3]. Voltage pulse trains below the threshold voltage for a given conductance state produce a current output that follows the input voltages, reflecting a learning ability. As such, this idea can be translated to emulate spiking neurons with memristors [4, 5].
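As a toy illustration of this behaviour, the sketch below treats a memristor as a bounded conductance that steps on supra-threshold pulses and acts as a plain resistor below threshold. The threshold, step size and conductance bounds are illustrative assumptions, not measured device parameters:

```python
V_TH = 1.0        # assumed switching threshold (V)
DELTA_G = 1e-4    # assumed conductance step per supra-threshold pulse (S)
G_MIN, G_MAX = 1e-4, 1e-2   # assumed conductance bounds (S)

def apply_pulse(g, v):
    """Return (new_conductance, output_current) for one voltage pulse."""
    if abs(v) > V_TH:   # supra-threshold: the state changes (write)
        g = min(G_MAX, max(G_MIN, g + DELTA_G * (1 if v > 0 else -1)))
    return g, g * v     # sub-threshold pulses are pure read-outs

g = G_MIN
currents = []
for v in [0.5, 0.5, 1.5, 1.5, 0.5]:   # two reads, two writes, one read
    g, i = apply_pulse(g, v)
    currents.append(i)

print(g, currents[-1])   # state has grown; the final read reflects it
```

The two sub-threshold reads return identical currents, while the final read returns a larger current because the intervening writes raised the conductance, mirroring the "response changes with stimulation" picture above.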

Another major design use case for memristors is the crossbar arrangement. The memristors are arranged in a crossbar architecture, with each memristor accessible through its row and column. The memristors are programmed using transistor switch control or selector switch control, often referred to as the 1T1M or 1S1M configuration [6, 7]. Multiple transistors are usually required in practical control circuits, and depending on the complexity of the task, such as the need to access multiple conductance states, the design becomes complicated [8]. Nonetheless, a single crossbar can emulate the dot-product computation required for the weighted summation of inputs in a neural network layer. From a design perspective, this simplifies the multiply-and-accumulate operation at a higher level and can reduce the design complexity.
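The dot-product role of the crossbar can be sketched in a few lines: with read voltages applied on the rows and conductances at the crosspoints, Ohm's and Kirchhoff's laws sum the per-device currents on each column, so each column current is one weighted sum. The conductance and voltage values below are purely illustrative:

```python
def crossbar_mac(G, V):
    """Column currents of an ideal (sneak-path-free) crossbar:
    I_j = sum_i G[i][j] * V[i]."""
    rows, cols = len(G), len(G[0])
    return [sum(G[i][j] * V[i] for i in range(rows)) for j in range(cols)]

G = [[1e-3, 2e-3],
     [3e-3, 4e-3]]   # crosspoint conductances in siemens (the "weights")
V = [0.2, 0.1]       # row read voltages in volts (the "inputs")

out = crossbar_mac(G, V)
print(out)           # one matrix-vector product, computed in a single step
```

In an actual array all columns settle simultaneously, which is where the analogue speed and energy advantage comes from; this loop only mimics the mathematics.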

A neuro-memristive system requires architectural-level combinations of crossbars and memristor neurons, and must be fabricable alongside CMOS devices. Sensors, control circuits and memories are usually required for scaling the neural network to a large size. The larger or deeper the network, the greater the implementation complexity. Large crossbar arrays suffer from sneak-path currents and device non-idealities, which introduce errors in the dot-product computations that propagate from one layer to the next. While these errors can be compensated to some extent by learning algorithms, such algorithms do not fully compensate for changes in real-time conditions. Online learning is one possible way to compensate for real-time errors; however, online learning systems are not easy to realise in analog circuits and often consume a large amount of on-chip area and power. Digital implementations of online learning circuits generally consume more area and incur higher delays than their crossbar-based analog counterparts.
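A rough, illustrative sketch of how per-layer device errors compound with depth: each layer's ideal weights are perturbed by a random relative error (standing in for non-idealities), and the output drifts away from the ideal as the error propagates. The layer sizes, error magnitude and seed are all assumptions for illustration:

```python
import random

def forward(x, layers, rel_err, rng):
    """Matrix-vector products through `layers`, each weight off by up to rel_err."""
    for W in layers:
        x = [sum(w * (1 + rng.uniform(-rel_err, rel_err)) * xi
                 for w, xi in zip(row, x)) for row in W]
    return x

rng = random.Random(0)
layers = [[[0.5, 0.5], [0.5, 0.5]]] * 4   # four identical 2x2 layers
x = [1.0, 1.0]
ideal = forward(x, layers, 0.0, rng)      # error-free reference: [1.0, 1.0]
noisy = forward(x, layers, 0.1, rng)      # 10% conductance error per device
print(ideal, noisy)                        # the deviation accumulates per layer
```

With a 10% per-device error, the worst-case output deviation after four layers is already bounded only by (1.1)^4 - 1, about 46%, which is why layer-by-layer compensation matters.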


2. Main challenges

2.1 Modelling issues

Modelling realistic memristor devices is a challenging task [3]. There have been arguments for and against the existence of "ideal" memristor devices, based on electrical, physical, chemical and philosophical grounds [9, 10]. From a neural circuit design perspective, the arguments on the existence of such idealistic devices are of little practical relevance. The more important question for the circuit designer is the accurate modelling of a practical device that can be used either as a spiking neuron or in a crossbar.

When models are incorporated into a simulator, it is important that they accurately represent the true behaviour of the device and are computationally fast [3]. The ability to integrate the models easily into SPICE-like simulators that can handle simulations of millions of neurons is important for building neural networks [11]. Currently, simulations with memristor models are extremely slow for deep neural networks, and designers often resort to scripting languages such as Python to get around this issue.
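To make the trade-off concrete, here is a minimal behavioural sketch loosely following the linear ion-drift model of [1], integrated with forward Euler: the memristance is a mix of boundary resistances weighted by a normalised state w, and w drifts in proportion to the current. All parameter values below are assumptions for illustration, not fitted device data:

```python
R_ON, R_OFF = 100.0, 16e3   # assumed boundary resistances (ohm)
K = 1e6                     # assumed drift constant lumping mobility and geometry
DT = 1e-6                   # integration time step (s)

def step(w, v):
    """One Euler step: M(w) = R_ON*w + R_OFF*(1-w), dw/dt = K * i(t)."""
    m = R_ON * w + R_OFF * (1.0 - w)   # instantaneous memristance
    i = v / m
    w = min(1.0, max(0.0, w + K * i * DT))   # state stays in [0, 1]
    return w, i

w = 0.1
for _ in range(1000):       # 1 ms of a constant 1 V bias
    w, i = step(w, 1.0)
print(w)                     # the state drifts toward the low-resistance end
```

Even this toy model needs a thousand tiny time steps per millisecond per device, which hints at why full-SPICE simulation of millions of such devices is so slow.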

2.2 Lack of design tools

There is limited availability of process design kits (PDKs) for use in standard design tools such as those provided by Cadence [12], Mentor Graphics [13] and Silvaco [14]. Support for memristor PDKs suitable for integration with CMOS is largely an open problem. The accuracy of the design files is not comparable with that of CMOS processes, and the variability data are not well disclosed. Design tools that can accurately capture realistic memristor devices are not common, and this remains an active topic of study.

2.3 Reliability issues of memristors

The memristor devices suffer from a range of reliability issues. Some of the main issues include:

Ageing – when switched ON and OFF over a long period, the devices suffer a loss of conductance state. This creates a major problem for analog dot-product computations with the crossbar architecture. Binary neural networks tolerate ageing better [15, 16].

Noise – electrical and thermal noise can alter the output response of the memristors, which can interfere with the operation of the neurons. The exact interplay of device noise within different configurations of the network is largely an open question [17].

Variability – variability of the conductance due to process and fabrication challenges creates design challenges for the crossbars. Neural network designs have been shown to be tolerant to large variations in conductance [18, 19, 20]. Signal integrity and electromagnetic issues related to packaging also need to be taken into account.
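The effect of device-to-device variability on a column current can be estimated with a simple seeded Monte Carlo sketch: each programmed conductance is perturbed by a Gaussian relative error, and the spread of the resulting multiply-accumulate output is sampled. The array size, 10% spread and seed are illustrative assumptions:

```python
import random

def noisy_column_current(g_nominal, v_in, sigma, rng):
    """Column current with each conductance off by a Gaussian relative error."""
    return sum(g * (1 + rng.gauss(0.0, sigma)) * v
               for g, v in zip(g_nominal, v_in))

rng = random.Random(0)
g = [1e-3] * 8            # eight nominally identical crosspoints (S)
v = [0.2] * 8             # row read voltages (V)
samples = [noisy_column_current(g, v, 0.1, rng) for _ in range(2000)]

mean = sum(samples) / len(samples)
ideal = sum(gi * vi for gi, vi in zip(g, v))
print(ideal, mean)        # the mean stays near the ideal; single reads spread
```

This also illustrates the tolerance noted above: zero-mean variability partly averages out across a column, which is one reason neural network designs survive surprisingly large conductance variations.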

2.4 Complexity issues for programming memristors

Programming a memristor requires applying a series of voltage pulses for a sustained period until its conductance reaches the desired value. The state changes depend on the magnitude of the applied voltages. The practical design issue is that controlling the voltage across several memristors is not an easy task. Memristors in a crossbar are prone to non-idealities and often have variable threshold voltages, which makes the design of the programming logic complex [21, 22]. The ability to program memristor devices in parallel at a low cost in power and on-chip area is a challenging problem, especially if the design is for analog neural networks [23].
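The pulse-until-target procedure is usually implemented as a program-and-verify loop: write, read back, and repeat until the conductance is within tolerance. The sketch below assumes an idealised, fixed conductance step per pulse; real devices have nonlinear and stochastic pulse responses, which is exactly what makes the real programming logic complex:

```python
DELTA_G = 5e-5      # assumed conductance change per write pulse (S)
TOL = 2.5e-5        # acceptable deviation from the target (S)

def program_verify(g, target, max_pulses=100):
    """Nudge g toward target one pulse at a time; return (g, pulses_used)."""
    for n in range(max_pulses):
        if abs(g - target) <= TOL:   # verify (read-back) step
            return g, n
        g += DELTA_G if g < target else -DELTA_G   # choose pulse polarity
    return g, max_pulses             # give up after the pulse budget

g, pulses = program_verify(1e-3, 2e-3)
print(g, pulses)   # reaches the target conductance in 20 pulses
```

Scaling this serial loop to thousands of devices, each with its own drifting threshold, is the power and area cost the text refers to.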

2.5 Architectural challenges

There are several types of neural networks. Many designs have multiple layers and involve convolution layers that require dendrite logic [24]. This makes generalising the architecture design complex. While crossbar-based designs can be used for a large number of neural network architectures, optimising the design for hardware is an entirely different problem [6, 25]. Architectural changes need to be aligned with the circuit design challenges, especially when the design constraints involve chasing accuracy and system-level performance metrics. Architectural designs also need to address a wide range of generalisation issues, including those related to hardware-software co-design and system-on-chip solutions [25].

2.6 Scaling and 3D integration

Scaling the CMOS circuits and improving the packing density of the memristors is not a well-studied problem. There have been several proposals to use 3D technologies and vertical devices for very-large-scale integration [26, 27]. The main challenge in this regard has been device variability, which prevents large-scale 3D integration of crossbar-based designs. There are as yet no foolproof solutions for scaling up in density or in size. The best architecture-level scale-up is the use of modular designs that combine several small crossbars to create larger ones [28, 29]. However, these designs are yet to be fully tested in a realistic commercial application.
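The modular idea in [28, 29] can be sketched as tiling: a large weight matrix is split into fixed-size row blocks, each mapped to a small crossbar, and the partial column currents from the tiles are summed. The tile size below is an arbitrary illustrative choice:

```python
TILE = 2   # assumed rows per small crossbar tile

def tiled_mac(G, V):
    """Matrix-vector product computed tile-by-tile over the rows."""
    cols = len(G[0])
    total = [0.0] * cols
    for r0 in range(0, len(G), TILE):        # one small crossbar per row block
        for j in range(cols):
            total[j] += sum(G[i][j] * V[i]   # partial column current of this tile
                            for i in range(r0, min(r0 + TILE, len(G))))
    return total

G = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
V = [1.0, 1.0, 1.0, 1.0]
out = tiled_mac(G, V)
print(out)   # matches the monolithic product [16.0, 20.0]
```

Mathematically the tiled sum is identical to the monolithic crossbar; the engineering gain is that each small tile keeps sneak-path currents and IR drop bounded.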

2.7 Neuron model

There are several types of neurons in the human brain [30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40]. Cognition is a result of interactions between varied types of neurons in the cortex. Most neural networks are inspired by cortical neural networks and are often oversimplifications of the biological networks. How intelligence forms over a human lifetime is not well enough understood to build a completely equivalent machine intelligence. At best, what we have achieved today in neuro-chips is weak intelligence: implementing some specific functionality of the human brain, and even that not in its entirety. Hardware AI research is in its very early stages, with a scalable design similar to the human brain practically limited by the chemistry of how neurons work. The organic nature of the brain offers several advantages over the silicon neuron. There are many electrical models, but they all tend to be bulky and complex when implemented in silicon. Achieving a functionally complex neuron with a simple implementation is a major challenge in the system design of memristive neural networks.
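To make the complexity trade-off concrete: even the leaky integrate-and-fire model, one of the simplest widely used electrical neuron models (see [30, 39]), already needs persistent state, a leak and a threshold, and everything below is a simplification of biology. The time constant, threshold and drive current are illustrative values:

```python
TAU = 10.0      # assumed membrane time constant (ms)
V_TH = 1.0      # assumed firing threshold
DT = 1.0        # integration step (ms)

def lif_step(v, i_in):
    """One Euler step of dv/dt = (-v + i_in) / tau, with reset on spike."""
    v = v + DT * (-v + i_in) / TAU
    if v >= V_TH:
        return 0.0, True    # reset the membrane and emit a spike
    return v, False

v, spikes = 0.0, 0
for _ in range(100):        # 100 ms of constant supra-threshold drive
    v, fired = lif_step(v, 2.0)
    spikes += fired
print(spikes)               # a regular spike train emerges
```

Every extra piece of biological realism (adaptation, dendrites, ion channels) adds state variables to this loop, and each state variable costs silicon area, which is the bulkiness the text refers to.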


3. Discussions and future outlook

While these challenges exist, there are many practical uses for neural networks built with crossbars and with memristive spiking neurons. Several problems involving only a few sensors, such as biomedical sensing applications, need only small neural networks to make the sensor intelligent. Likewise, many time-series prediction problems use one-dimensional data and again need only simple recurrent neural networks.

Practical implementation of large-scale networks is required to match the scale and size of the human brain [41, 42, 43, 44, 45, 46]. Packing billions of neurons into a single chip is a major challenge that requires matching energy benchmarks and complexity. Current circuit implementations fail to match the energy benchmarks of the human brain, mainly because the scaling of power supplies on chips is practically limited by electrical design and device constraints. In addition, packaging and electromagnetic effects also play a major role in building systems with neural chips. Precision engineering of these chips for reliable use is important for long-term acceptance in higher-intelligence tasks. Further, data processing with neurochips can be prone to adversarial attacks, which means the system needs to be secured using dedicated cryptographic coprocessors. Going further, it will also be important to see applications of these neurochips in human-machine interfaces and in building connected and collective intelligence solutions.

Ageing is a time-dependent process in which the conductance of the memristors changes over time and with use [15, 16, 47, 48, 49]. The more the memristors are used, i.e. written and read, the weaker their ability to hold the expected conductance levels becomes. This is a wear-out phenomenon caused by continuous electrical stress, which affects the chemistry and physics of the device. Over time, the multiple conductance states merge or disappear, making the reliable programming of memristors challenging. This makes fine-tuning an essential part of the memristor programming and test stages. Any change in the conductance values introduces undesirable errors in the output of the crossbar arrays, far from the expected ideal behaviour. This is a serious issue when multiple conductance states are used extensively for building analog neural networks with crossbar arrays. The conductances of the memristors are equated to the weights of the analog neural network; hence, if a conductance state goes missing, training becomes more complicated. Additional rules need to be applied to the pre-trained network models to further adjust the weight values and achieve convergence. Learning and self-tuning in this sense become an online process for analog neural networks with memristor crossbar arrays. Nonetheless, the advantages of analog neural networks with crossbars outweigh their digital-only counterparts for smart sensor integration and edge AI computing [50, 51, 52, 53, 54, 55, 56, 57, 58, 59].
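The drift-then-fine-tune cycle described above can be sketched with a toy multiplicative decay model and a re-programming pass; the decay rate, step size and tolerance are invented for illustration and stand in for the real, far messier wear-out physics:

```python
DRIFT = 0.995    # assumed fractional conductance retained per 100 write cycles

def age(g, cycles):
    """Toy wear-out model: conductance decays multiplicatively with use."""
    return g * DRIFT ** (cycles // 100)

def retune(g, target, step=1e-5, tol=1e-5):
    """Fine-tuning pass: pulse the device back toward its nominal weight."""
    while abs(g - target) > tol:
        g += step if g < target else -step
    return g

target = 1e-3                 # nominal weight conductance (S)
g = age(target, 10_000)       # after 10k write cycles the state has decayed
g = retune(g, target)         # fine-tuning restores it within tolerance
print(g)
```

The point of the sketch is the workflow, not the numbers: without the periodic re-tuning pass, the decayed conductance silently corrupts every dot product that uses it.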

When noise is added to the signals at the inputs, within the network layers or at the outputs of an analog neural network, it introduces errors across the layers. The noise can originate in different ways, such as from thermal effects, electromagnetic effects or external sources. Noise is typically seen as a problem in circuits; with neural networks, however, it may offer some advantages, such as helping to avoid overfitting during training. Noise also plays an immense role in the human brain, shaping the way intelligence and perception emerge [60, 61].
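The regularising use of noise mentioned above is commonly realised in software by injecting zero-mean noise into a layer's activations during training, something analogue circuit noise provides for free. A seeded sketch with an assumed noise level:

```python
import random

def noisy_activation(x, sigma, rng):
    """Add zero-mean Gaussian noise to each activation (training-time only)."""
    return [xi + rng.gauss(0.0, sigma) for xi in x]

rng = random.Random(42)
clean = [0.5, -0.2, 0.8]
noisy = noisy_activation(clean, 0.05, rng)
print(noisy)   # a slightly perturbed copy of the activations
```

At inference time the clean activations are used unchanged; the perturbation is applied only during training so that the network cannot overfit to exact activation values.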


4. Conclusions

There are several open challenges in neuro-memristive circuit design, ranging from classical circuit analysis to computer-aided design issues. The major bottleneck in creating a billion-neuron chip is the limitations imposed at the device and architecture levels. There are as yet no practical tools that can address all the design challenges in a systematic way. Unlike software, where debugging is a well-studied topic, neuro-memristive hardware is not easy to debug due to the variety of non-idealities of the crossbar and memristor devices. There have been several proofs of concept of circuit designs and a growing body of literature on architectures that aim to address these very challenges. However, there is a long way to go before many of these designs can be put to commercial use on a large scale. At this point in time, digital designs of neural networks are much more feasible than analog ones. In the future, analog neural networks are expected to play a much more important role in making sensors smarter and in making intelligent computing energy-efficient.


1. Strukov DB, Snider GS, Stewart DR, Williams RS. The missing memristor found. Nature. 2008;453(7191):80-83
2. Prodromakis T, Toumazou C. A review on memristive devices and applications. In: 2010 17th IEEE International Conference on Electronics, Circuits and Systems. IEEE; 2010. pp. 934-937
3. James AP. Deep Learning Classifiers with Memristive Networks. 1st ed. Cham: Springer; 2020
4. Babacan Y, Kaçar F, Gürkan K. A spiking and bursting neuron circuit based on memristor. Neurocomputing. 2016;203:86-91
5. Serrano-Gotarredona T, Masquelier T, Prodromakis T, Indiveri G, Linares-Barranco B. STDP and STDP variations with memristors for spiking neuromorphic learning systems. Frontiers in Neuroscience. 2013;7:2
6. Krestinskaya O, James AP, Chua LO. Neuromemristive circuits for edge computing: A review. IEEE Transactions on Neural Networks and Learning Systems. 2020;31(1):4-23
7. Li C, Hu M, Li Y, Jiang H, Ge N, Montgomery E, et al. Analogue signal and image processing with large memristor crossbars. Nature Electronics. 2018;1(1):52
8. Cai F, Correll JM, Lee SH, Lim Y, Bothra V, Zhang Z, et al. A fully integrated reprogrammable memristor–CMOS system for efficient multiply–accumulate operations. Nature Electronics. 2019;2(7):290-299
9. Vongehr S, Meng X. The missing memristor has not been found. Scientific Reports. 2015;5:11657
10. Abraham I. The case for rejecting the memristor as a fundamental circuit element. Scientific Reports. 2018;8(1):1-9
11. Biolek D, Kolka Z, Biolkova V, Biolek Z. Memristor models for SPICE simulation of extremely large memristive networks. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE; 2016. pp. 389-392
12. Cadence Design Systems, Inc. Available from: [Accessed: 20 February 2020]
13. Mentor Graphics, Inc. Available from: [Accessed: 20 February 2020]
14. Silvaco, Inc. Available from: [Accessed: 20 February 2020]
15. Mozaffari SN, Gnawali KP, Tragoudas S. An aging resilient neural network architecture. In: Proceedings of the 14th IEEE/ACM International Symposium on Nanoscale Architectures; 2018. pp. 25-30
16. Zhang S, Zhang GL, Li B, Li HH, Schlichtmann U. Aging-aware lifetime enhancement for memristor-based neuromorphic computing. In: 2019 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE; 2019. pp. 1751-1756
17. Georgiou PS, Köymen I, Drakakis EM. Noise properties of ideal memristors. In: 2015 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE; 2015. pp. 1146-1149
18. Biolek Z, Biolek D, Biolková V, Kolka Z. Variation of a classical fingerprint of ideal memristor. International Journal of Circuit Theory and Applications. 2016;44(5):1202-1207
19. Krestinskaya O, Irmanova A, James AP. Memristive non-idealities: Is there any practical implications for designing neural network chips? In: 2019 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE; 2019. pp. 1-5
20. Liu B, Li H, Chen Y, Li X, Wu Q, Huang T. Vortex: Variation-aware training for memristor x-bar. In: Proceedings of the 52nd Annual Design Automation Conference; 2015. pp. 1-6
21. Berdan R, Prodromakis T, Toumazou C. High precision analogue memristor state tuning. Electronics Letters. 2012;48(18):1105-1107
22. Merced-Grafals EJ, Dávila N, Ge N, Williams RS, Strachan JP. Repeatable, accurate, and high speed multi-level programming of memristor 1T1R arrays for power efficient analog computing applications. Nanotechnology. 2016;27(36):365202
23. Sung C, Hwang H, Yoo K. Perspective: A review on memristive hardware for neuromorphic computation. Journal of Applied Physics. 2018;124:151903. DOI: 10.1063/1.5037835
24. Li C, Belkin D, Li Y, Yan P, Hu M, Ge N, et al. Efficient and self-adaptive in-situ learning in multilayer memristor neural networks. Nature Communications. 2018;9(1):1-8
25. James AP. A hybrid memristor–CMOS chip for AI. Nature Electronics. 2019;2(7):268-269
26. Cheng KT, Strukov DB. 3D CMOS-memristor hybrid circuits: Devices, integration, architecture, and applications. In: Proceedings of the 2012 ACM International Symposium on Physical Design; 2012. pp. 33-40
27. Payvand M, Madhavan A, Lastras-Montaño MA, Ghofrani A, Rofeh J, Cheng KT, et al. A configurable CMOS memory platform for 3D-integrated memristors. In: 2015 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE; 2015. pp. 1378-1381
28. Mikhailenko D, Liyanagedera C, James AP, Roy K. M2CA: Modular memristive crossbar arrays. In: 2018 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE; 2018. pp. 1-5
29. Mountain DJ, McLean MR, Krieger CD. Memristor crossbar tiles in a flexible, general purpose neural processor. IEEE Journal on Emerging and Selected Topics in Circuits and Systems. 2017;8(1):137-145
30. Gerstner W, Kistler WM. Spiking Neuron Models: Single Neurons, Populations, Plasticity. 1st ed. Cambridge University Press; 2002. p. 496
31. Gerstner W, Naud R. How good are neuron models? Science. 2009;326(5951):379-380
32. Rall W, Burke RE, Holmes WR, Jack JJ, Redman SJ, Segev I. Matching dendritic neuron models to experimental data. Physiological Reviews. 1992;72(suppl_4):S159-S186
33. Ostojic S, Brunel N. From spiking neuron models to linear-nonlinear models. PLoS Computational Biology. 2011;7(1):e1001056. DOI: 10.1371/journal.pcbi.1001056
34. Huys QJ, Ahrens MB, Paninski L. Efficient estimation of detailed single-neuron models. Journal of Neurophysiology. 2006;96(2):872-890
35. Jolivet R, Kobayashi R, Rauch A, Naud R, Shinomoto S, Gerstner W. A benchmark test for a quantitative assessment of simple neuron models. Journal of Neuroscience Methods. 2008;169(2):417-424
36. Druckmann S, Banitt Y, Gidon AA, Schürmann F, Markram H, Segev I. A novel multiple objective optimization framework for constraining conductance-based neuron models by experimental data. Frontiers in Neuroscience. 2007;1:1
37. Rossant C, Goodman DF, Fontaine B, Platkiewicz J, Magnusson AK, Brette R. Fitting neuron models to spike trains. Frontiers in Neuroscience. 2011;5:9
38. Coombes S, Thul R, Wedgwood KC. Nonsmooth dynamics in spiking neuron models. Physica D: Nonlinear Phenomena. 2012;241(22):2042-2057
39. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 1st ed. Cambridge University Press; 2014. p. 590
40. Makino T. A discrete-event neural network simulator for general neuron models. Neural Computing and Applications. 2003;11(3-4):210-223
41. Prieto A, Prieto B, Ortigosa EM, Ros E, Pelayo F, Ortega J, et al. Neural networks: An overview of early research, current frameworks and new challenges. Neurocomputing. 2016;214:242-268
42. Hecht-Nielsen R. Neurocomputing: Picking the human brain. IEEE Spectrum. 1988;25(3):36-41
43. Zhu XH, Qiao H, Du F, Xiong Q, Liu X, Zhang X, et al. Quantitative imaging of energy expenditure in human brain. NeuroImage. 2012;60(4):2107-2117
44. Navarrete A, van Schaik CP, Isler K. Energetics and the evolution of human brain size. Nature. 2011;480(7375):91-93
45. Tomasi D, Wang GJ, Volkow ND. Energetic cost of brain functional connectivity. Proceedings of the National Academy of Sciences. 2013;110(33):13642-13647
46. Bélanger M, Allaman I, Magistretti PJ. Brain energy metabolism: Focus on astrocyte-neuron metabolic cooperation. Cell Metabolism. 2011;14(6):724-738
47. Kumar S, Wang Z, Huang X, Kumari N, Davila N, Strachan JP, et al. Oxygen migration during resistance switching and failure of hafnium oxide memristors. Applied Physics Letters. 2017;110(10):103503
48. Valov I, Kozicki M. Non-volatile memories: Organic memristors come of age. Nature Materials. 2017;16(12):1170-1172
49. Lohn AJ, Mickel PR, Aimone JB, Debenedictis EP, Marinella MJ. Memristors as synapses in artificial neural networks: Biomimicry beyond weight change. In: Cybersecurity Systems for Human Cognition Augmentation. Cham: Springer; 2014. pp. 135-150
50. Kozma R, Pino RE, Pazienza GE. Are memristors the future of AI? In: Advances in Neuromorphic Memristor Science and Applications. Dordrecht: Springer; 2012. pp. 9-14
51. Wang Z, Li C, Lin P, Rao M, Nie Y, Song W, et al. In situ training of feed-forward and recurrent convolutional memristor networks. Nature Machine Intelligence. 2019;1(9):434-442
52. Jeong DS, Kim KM, Kim S, Choi BJ, Hwang CS. Memristors for energy-efficient new computing paradigms. Advanced Electronic Materials. 2016;2(9):1600090
53. Li C, Wang Z, Rao M, Belkin D, Song W, Jiang H, et al. Long short-term memory networks in memristor crossbar arrays. Nature Machine Intelligence. 2019;1(1):49-57
54. Vourkas I, Sirakoulis GC. Memristor-Based Nanoelectronic Computing Circuits and Architectures. Cham: Springer International Publishing; 2016
55. Zhang X, Huang A, Hu Q, Xiao Z, Chu PK. Neuromorphic computing with memristor crossbar. Physica Status Solidi. 2018;215(13):1700875
56. Jiang W, Xie B, Liu CC, Shi Y. Integrating memristors and CMOS for better AI. Nature Electronics. 2019;2(9):376-377
57. Chakraborty D, Raj S, Fernandes SL, Jha SK. Input-aware flow-based computing on memristor crossbars with applications to edge detection. IEEE Journal on Emerging and Selected Topics in Circuits and Systems. 2019;9(3):580-591
58. Versace M, Chandler B. The brain of a new machine. IEEE Spectrum. 2010;47(12):30-37
59. Vourkas I, Stathis D, Sirakoulis GC. Massively parallel analog computing: Ariadne's thread was made of memristors. IEEE Transactions on Emerging Topics in Computing. 2015;6(1):145-155
60. Fraiman D, Chialvo DR. What kind of noise is brain noise: Anomalous scaling behavior of the resting brain activity fluctuations. Frontiers in Physiology. 2012;3:307
61. Bernasconi F, De Lucia M, Tzovara A, Manuel AL, Murray MM, Spierer L. Noise in brain activity engenders perception and influences discrimination sensitivity. The Journal of Neuroscience. 2011;31(49):17971-17981
