Alex James

Nazarbayev University

Alex Pappachen James received his PhD degree from the Queensland Micro- and Nanotechnology Centre, Griffith University, Brisbane, QLD, Australia. He is internationally known for his contributions to memristive networks, neuromorphic computing, and image processing. He is currently a Full Professor in AI and Neuromorphic Electronics at IIITMK. He is a mentor to several tech start-ups and has co-founded companies in machine learning and computer vision hardware. Dr. James was a founding chair of the IEEE R10 Circuits and Systems Society Chapter and an executive board member of the IET Vision and Imaging Network. He was an editorial board member of Information Fusion (Elsevier, 2010–2015) and is an associate editor of HCIS (Springer, 2015–present), IEEE Access (2017–present), IEEE Transactions on Emerging Topics in Computational Intelligence (2017–present), and IEEE Transactions on Circuits and Systems I (2018–present). He is also a Senior Member of the IEEE, a Life Member of the ACM, and a Senior Fellow of the HEA.

2 books edited

4 chapters authored

Latest work with IntechOpen by Alex James

This book covers a range of models, circuits, and systems built with memristor devices and networks, with applications to neural networks. It is divided into three parts: (1) Devices, (2) Models, and (3) Applications. Resistive switching is an important property of memristors, and several designs that exploit it are discussed in this book, such as metal oxide/organic semiconductor nonvolatile memories, nanoscale switching and degradation of resistive random access memory, and graphene oxide-based memristors. Modelling of memristors is required to ensure that the devices can be put to use and to improve emerging applications. The book discusses various memristor models, from mathematical frameworks to implementations in SPICE and Verilog, which will help practitioners and researchers gain a grounding in the topic. Applications of these memristor models in neuromorphic networks are then covered, spanning several neural network models as well as implementations in A/D converters and hierarchical temporal memories.
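As a rough illustration of what such a mathematical memristor framework looks like (this is the widely cited linear dopant drift model of Strukov et al., not an equation quoted from this volume), the device voltage and internal state evolve as

$$ v(t) = \big(R_{\mathrm{ON}}\,x(t) + R_{\mathrm{OFF}}\,[1 - x(t)]\big)\, i(t), \qquad \frac{dx}{dt} = \frac{\mu_v R_{\mathrm{ON}}}{D^{2}}\, i(t), \qquad x = \frac{w}{D} \in [0, 1], $$

where $w$ is the width of the doped region, $D$ the device thickness, $\mu_v$ the dopant mobility, and $R_{\mathrm{ON}}$, $R_{\mathrm{OFF}}$ the limiting resistances; SPICE and Verilog-A implementations of models of this kind are the sort discussed in the book.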
