There's something I've been thinking about recently: is it possible to delay a chain of events taking place in a machine/computer/programme?
To give an example, say there is a live video feed recording certain events (CCTV in a room), and I am using a machine/programme that shows the live feed on screen while I make notes on the events taking place at a given time. Is there a way to make the video lag behind by, say, 30 seconds? In a nutshell: if I'm watching the room directly with my own eyes, I can predict what the camera is going to show 30 seconds in advance, since I've already seen what's taken place; the system, however, will show the event 30 seconds later because of the delay, while still treating it as a live feed.
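For what it's worth, the delay being described here is essentially a rolling buffer: hold on to each incoming frame for 30 seconds, and only display it once it's old enough. A minimal sketch of that idea in Python (the `DelayBuffer` class and all its names are my own illustration, not the API of any particular CCTV or video software):

```python
import collections
import time


class DelayBuffer:
    """Holds items until they are at least `delay` seconds old.

    A capture loop would push() each frame as it arrives, and the
    display loop would show whatever pop_ready() returns, giving a
    feed that lags behind reality by `delay` seconds.
    """

    def __init__(self, delay):
        self.delay = delay
        # Each entry is (arrival_time, frame), oldest first.
        self.queue = collections.deque()

    def push(self, frame, now=None):
        """Record a frame along with the time it arrived."""
        now = time.monotonic() if now is None else now
        self.queue.append((now, frame))

    def pop_ready(self, now=None):
        """Return frames that are now old enough to display, in order."""
        now = time.monotonic() if now is None else now
        ready = []
        while self.queue and now - self.queue[0][0] >= self.delay:
            ready.append(self.queue.popleft()[1])
        return ready
```

In a real setup the frames would come from a capture library (OpenCV's `cv2.VideoCapture`, for instance), but the buffering logic is the same: memory permitting, the "live" pipeline never needs to know it's showing the past.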
I'm sorry if this doesn't make sense, but I hope you guys get the gist. Thanks