@Raedwald said:
> Statistically speaking, what happens is that after a few hours you'll realise you just broke everything. What was once stored as `1 * MINUTE = 60` is now read back as `60 = 0.06 * SECOND`. That's why you should use a reliable library with constant time units, not something you wrote in 10 minutes with "constant but can change in the future" time units.

So, the variables of interest hold a time (or duration) in seconds. But why must the unit be seconds? What happens if you later realise your program needs millisecond precision, so the variables ought to hold time in milliseconds? Much of the work can be done by simply changing to this:
const int SECOND = 1000;