Temporal Dynamics is an umbrella term for how neural systems change and evolve over time. It covers how neurons and neural circuits process information, respond to stimuli, and adapt through learning and plasticity.
In Machine Learning, temporal dynamics show up in recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, which are designed to handle sequential data and capture temporal dependencies. We have also seen novel architectures like Continuous Thought Machines that aim to model continuous temporal dynamics more explicitly.
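
To make that concrete, here is a minimal PyTorch sketch of the standard recurrent case: an LSTM whose prediction depends on the order of inputs across time rather than on any single timestep. The class name, layer sizes, and dimensions are illustrative assumptions, not taken from any particular model.

```python
import torch
import torch.nn as nn

# Minimal sketch: an LSTM that maps a sequence of feature vectors to a single
# prediction, so the output depends on temporal context, not just one timestep.
class SequenceClassifier(nn.Module):
    def __init__(self, input_dim=8, hidden_dim=32, num_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                      # x: (batch, time, input_dim)
        outputs, (h_n, c_n) = self.lstm(x)     # h_n: (1, batch, hidden_dim)
        return self.head(h_n[-1])              # read out from the final hidden state

model = SequenceClassifier()
batch = torch.randn(4, 20, 8)                  # 4 sequences, 20 timesteps each
logits = model(batch)                          # shape: (4, 3)
```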

But in Computational Neuroscience, we are more interested in how neural systems in the brain leverage synchronization over time to process information. This includes studying oscillatory activity, phase locking, and temporal coding mechanisms that underlie cognitive functions such as perception, attention, and memory.
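
One standard way to quantify this kind of synchronization is the phase-locking value (PLV) between two signals, computed from their instantaneous phases. Here is a minimal sketch using NumPy and SciPy; the signals and frequencies are made up purely for illustration.

```python
import numpy as np
from scipy.signal import hilbert

# Minimal sketch: phase-locking value (PLV) between two signals.
# PLV is near 1 when the phase difference stays constant over time,
# and drops toward 0 when the phases drift independently.
def phase_locking_value(x, y):
    phase_x = np.angle(hilbert(x))             # instantaneous phase of x
    phase_y = np.angle(hilbert(y))             # instantaneous phase of y
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

t = np.linspace(0, 1, 1000)
a = np.sin(2 * np.pi * 10 * t)                 # 10 Hz oscillation
b = np.sin(2 * np.pi * 10 * t + 0.5)           # same rhythm, fixed phase lag
c = np.sin(2 * np.pi * 10 * t + np.cumsum(np.random.randn(1000)) * 0.05)  # drifting phase

print(phase_locking_value(a, b))               # close to 1: strongly phase locked
print(phase_locking_value(a, c))               # lower: the phase relationship drifts
```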
This aspect of temporal dynamics is still missing from most machine learning models, which tend to operate on static representations of data rather than on dynamic temporal processes. Continuous Thought Machines are a step in the right direction, but there is still much to explore in how temporal dynamics can be integrated into machine learning architectures to better mimic the brain’s processing capabilities.
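
As a rough illustration of the general principle, and not the actual Continuous Thought Machine architecture, one can imagine tracking each unit's activation history and treating pairwise synchronization across time as the representation that downstream layers read from. Everything in this sketch is an assumption for illustration.

```python
import torch

# Toy illustration: use synchronization across time as a representation.
# Loosely inspired by the idea behind Continuous Thought Machines; this is
# not the published architecture, just a sketch of the principle.
def synchronization_matrix(histories):
    # histories: (num_neurons, num_timesteps) of post-activation traces
    z = histories - histories.mean(dim=1, keepdim=True)
    z = z / (z.norm(dim=1, keepdim=True) + 1e-8)
    return z @ z.T                             # pairwise correlation over time

num_neurons, num_timesteps = 16, 50
traces = torch.randn(num_neurons, num_timesteps).cumsum(dim=1)  # dummy temporal traces
sync = synchronization_matrix(traces)          # (16, 16) matrix a readout head could consume
```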