Charlie and I have a disagreement. Charlie says 'the industry' uses the term 'real-time analytics'. I say there's no such thing as 'real-time'. Let the debate begin!
Actually, it's not really much of a debate or disagreement. It's more just friendly banter. You see, we can both agree on the intent of the terms 'real-time' vs. 'near real-time': processing that happens seemingly instantaneously.
The definition of real-time lends credence to MY position though.
COMPUTING - relating to a system in which input data is processed within milliseconds so that it is available virtually immediately as feedback.
HA! See? It's right there! PROCESSED WITHIN MILLISECONDS... VIRTUALLY IMMEDIATELY. You see, NOTHING is 'immediate'. Even you reading this article isn't 'immediate' - light travels across a distance from your screen to your eye, and traveling that distance takes some time, though it is 'virtually immediate'. The same goes if we had a conversation: sound waves travel across a distance to your ears, and traveling that distance takes some time, though it is 'virtually immediate'. I equate this to a scene in the classic movie The Princess Bride. It seems real-time... but is it really?
This is the same argument I make when talking about 'real-time processing'. The 'low latency' is still 'latency'. Some time passes. It may be 50 milliseconds... 500 milliseconds... but it is still time.
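You can see this for yourself in a few lines of code (a minimal sketch - the `process` function and the data size here are made up for illustration):

```python
import time

def process(record):
    # A stand-in "analytics" step (hypothetical): square every value.
    return [x * x for x in record]

start = time.perf_counter()
result = process(list(range(10_000)))
elapsed_ms = (time.perf_counter() - start) * 1000

# Even a trivial in-memory step takes a nonzero amount of time.
print(f"Processed {len(result)} values in {elapsed_ms:.3f} ms")
```

However small the number printed, it is never zero - some time always passes between input and output.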
Hence, I say 'near real-time', whereas Charlie uses the term 'real-time'. Same definition... just a small nuance in terminology.
Vote in the comments... Team Charlie or Team Gary?