TY  - CHAP
AU  - Valeri A. Makarov
AU  - José Antonio Villacorta-Atienza
ED  - Hubert Cardot
Y1  - 2011-02-09
PY  - 2011
T1  - Compact Internal Representation as a Functional Basis for Protocognitive Exploration of Dynamic Environments
N2  - Recurrent Neural Networks (RNNs) are a generalization of artificial neural networks in which connections are not restricted to being feed-forward. In RNNs, connections between units may form directed cycles, providing an implicit internal memory. This makes RNNs well suited to problems involving signals that evolve over time, since their internal memory allows them to take time into account naturally. Valuable approximation results have been obtained for dynamical systems.
BT  - Recurrent Neural Networks for Temporal Data Processing
SP  - Ch. 6
UR  - https://doi.org/10.5772/15127
DO  - 10.5772/15127
SN  - 
PB  - IntechOpen
CY  - Rijeka
Y2  - 2024-04-25
ER  - 