Timers - Typically a WTF?



  • Am I the only one here who doesn't use timers (e.g. in .NET), and who automatically questions any code with timers in it?

    My earliest background is Win32 programming in C and assembly language. Timers exist in that realm, but (as with anything) it's more difficult to use them, since multiple API calls are necessary. Instead of a timer, this is how I would do something at one-second intervals in a C program:

     

    do {
        // Whatever
        Sleep(950); // Can be tweaked, but should achieve a ~1 s interval
    } while (/* some condition */);

     

    The resultant timing is not exact, but it's typically close enough for a one-eyed Bosniak peasant such as myself.

    When I enter the world of "timers," on the other hand, I find that .NET has at least three varieties. At least one of these runs one's event handler on another thread, although its syntax doesn't make this obvious. Two other timer types seem to require the application to service a message queue, again without anything syntactic to indicate this. Of course, all these timers also have much more overhead than my simplistic approach. I think the whole concept is an architectural disaster, which apparently dates back to some things in .NET's ugly ancestor, VB6.
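    To make that concrete, here is a rough sketch (not production code; the class names are the actual .NET timer types, but the callbacks and intervals are only illustrative):

    using System;
    using System.Threading;

    class TimerZoo
    {
        static void Main()
        {
            // System.Threading.Timer: the callback runs on a thread-pool thread,
            // not on the thread that created the timer; nothing in the syntax hints at that.
            var threadTimer = new Timer(
                _ => Console.WriteLine("tick on thread " + Thread.CurrentThread.ManagedThreadId),
                null, 0, 1000);

            // System.Timers.Timer: also fires on a thread-pool thread, unless you set
            // SynchronizingObject to marshal the Elapsed event back to a UI control.
            var serverTimer = new System.Timers.Timer(1000);
            serverTimer.Elapsed += (s, e) => Console.WriteLine("Elapsed");
            serverTimer.Start();

            // System.Windows.Forms.Timer (not shown) posts its ticks through the message
            // queue, so it only fires while the application is pumping messages.

            Thread.Sleep(5000);   // let a few ticks happen before exiting
            threadTimer.Dispose();
            serverTimer.Stop();
        }
    }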

    And yet, I find that programmers who are more immersed in the world of .NET than I am make heavy use of timers. Typically, if they can find something like "8 hertz" in a specification, they interpret it as prima facie architectural evidence that somewhere a timer with an interval of 125 ms must be used.

    I don't get it. In fact, I feel like I'm actually giving up way too much information about real-time programming for free by even posting this. Of course, in Today's Modern World there's little chance of anyone actually using my technique for their benefit. It's easier to just fire off a witty response post (e.g. "You are the weakest link. G'bye.") than it is to actually consider my ideas.



  • @beau29 said:

    .NET's ugly ancestor, VB6.
     

    What?

     

    Generally, I don't consider timing to be a good coding practice, because you open yourself up to all sorts of race conditions. Waiting a little bit before proceeding means you're hoping everything will be all right on every iteration, because you usually depend on some outside state that is too slow for your code to execute at full speed.

    One example is a Google Maps geocode request: you can't cram a loop in there and expect your complete list of 20 locations to be transformed into coordinates, so I had to invent a timed for loop:

    var array = someArrayWithThings;
    var i = 0;
    var loopTimer = setInterval(function () {
       if (i == array.length) {
          clearInterval(loopTimer);  // all items handled; stop the timer
          return;
       }

       // do your stuff with array[i] here (e.g. fire off the geocode request)

       i++;
    }, 250);

    @beau29 said:

    Typically, if they can find something like "8 hertz" in a specification, they interpret it as prima facie architectural evidence that somewhere a timer with an interval of 125 ms must be used.

    That makes some sense. Without more context, it's hard to see the problem with thinking of a timer when you read "Hertz".

     



  • @dhromed said:

    @beau29 said:

    .NET's ugly ancestor, VB6.
     

    What?

    I'm an intuitive person and my intuition tells me this. Also, it seems that most VB coders have migrated to .NET; architecture aside, the VB6 target audience is the .NET target audience. In VB6, it was quite typical to drag a "timer" onto a form from a Toolbox. I think it is this mentality that's led to the .NET timer abuse I cite.

    @dhromed said:

    @beau29 said:

    Typically, if they can find something like "8 hertz" in a specification, they interpret it as prima facie architectural evidence that somewhere a timer with an interval of 125 ms must be used.

    That makes some sense. Without more context, it's hard to see the problem with thinking of a timer when you read "Hertz".

    Practically, when doing real-time programming in .NET, I find the only timing strategy that works reliably is to be greedy. If a program tells .NET it needs something every half second, .NET will respond by running the handler at (for example) 550 ms, then 540 ms later, then maybe 1200 ms later, then perhaps 490 ms later, and so on.

    The looping approach I suggested is more deterministic; if the sleep time is 0, then 100% / (number of cores) CPU utilization will be observed. This percentage can be throttled downward by increasing the sleep time. So, when doing real-time programming in .NET, I typically decide on an acceptable CPU utilization level, and tweak the sleep time to achieve this. If performance isn't adequate at that level, then I go to management (or to whoever wrote that spec) and let them make a decision.

    In my experience trying to beat real-time performance out of .NET, this is the technique that's worked. And whoever wrote the spec does not care that, instead of "8 times a second," they are getting their data "as fast as .NET can manage."
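    In .NET terms, the loop looks roughly like this (just a sketch; DoWork, the 50 ms value, and the exit condition are placeholders to be tuned against whatever CPU level is acceptable):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class GreedyLoop
    {
        // Placeholder for whatever the spec says must happen "8 times a second".
        static void DoWork() { /* read the hardware, push the data, whatever */ }

        static void Main()
        {
            int sleepMs = 50;   // tune until Task Manager shows an acceptable CPU level;
                                // 0 means "burn one core flat out"
            long iterations = 0;
            var watch = Stopwatch.StartNew();

            while (!Console.KeyAvailable)   // stand-in for a real exit condition
            {
                DoWork();
                Thread.Sleep(sleepMs);
                iterations++;
            }

            Console.WriteLine("Averaged {0:F1} iterations/second",
                              iterations / watch.Elapsed.TotalSeconds);
        }
    }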

    Also, the looping approach does away with all the message-pumping, reentrancy, and overhead issues associated with timers. I just don't see what timers bring to the table, except a smug sense that "management said this has to happen 8 times a second, so I'm basically letting them write my code for me."

    Another big problem with the message-pump-based timers is that (in WinForms, at least) the messages can "back up" and cause delays and even crashes. In cases where the handler code takes just slightly longer than the timer interval, this problem can take quite a while to surface. To put it in broken German, "das ist nicht gut."
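    (For what it's worth, the usual band-aid if you're stuck with one of these timers is to disable it while the handler runs, so ticks can't stack up behind a slow handler. A sketch only; the timer, handler, and method names are made up:)

    // Inside a Form that owns a System.Windows.Forms.Timer called updateTimer
    // (the names here are purely illustrative).
    private void updateTimer_Tick(object sender, EventArgs e)
    {
        updateTimer.Stop();               // no more ticks until this one has finished
        try
        {
            RefreshBoilerTemperature();   // placeholder for the slow handler body
        }
        finally
        {
            updateTimer.Start();          // re-arm; the next tick is a full interval away
        }
    }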

      


  • Discourse touched me in a no-no place

    @beau29 said:

    Practically, when doing real-time programming in .NET

    My turn.

    What??!?

    We either have different definitions of what 'real-time' means or you're using entirely the wrong language.

    (In a previous life I was involved with fiscal metering which had to be real-time, you can't integrate a pressure over time if you can't guarantee how long it's going to be between measurements. Languages of choice were assembly for the interface for the input hardware and C for the calculations. Windows wasn't involved either.)



  • @PJH said:

    @beau29 said:

    Practically, when doing real-time programming in .NET

    My turn.

    What??!?

    We either have different definitions of what 'real-time' means or you're using entirely the wrong language.

    (In a previous life I was involved with fiscal metering which had to be real-time, you can't integrate a pressure over time if you can't guarantee how long it's going to be between measurements. Languages of choice were assembly for the interface for the input hardware and C for the calculations. Windows wasn't involved either.)

     

    I agree about the overall unsuitability of .NET for real time applications. If one defines "real time" as meaning that events happen in predictable or even bounded time, then yes, .NET definitely doesn't fit the bill.

    However, every place I have worked since 2000 has at least been playing around with .NET. So, I definitely have seen what it can do in a wide variety of applications. And if one defines "real time" more loosely, e.g. if one uses it to say something like "this app needs to display boiler temperature in real time" then plenty of .NET apps fall into that category. Those apps will perform better using the loop / sleep approach (or a loop without a sleep) than they will with a timer. 

    More generally, my original comment wasn't about the suitability or unsuitability of .NET for any particular purpose; it was that I perceive that a great many programmers display an attachment to the "timer" paradigm which I find infantile. 


  • Garbage Person

    Timers are non-blocking, meaning I can throw one directly into GUI code and it'll only block when the events fire, which, when you're using it to update a GUI, is exactly the way you want it. The other alternative would be to spin up a whole new thread that updates the GUI, with all the associated complexities. If I sleep the GUI thread, it becomes nonresponsive except when it's updating, which means you can't use it. Period.
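    A minimal WinForms sketch of what I mean (the label and the one-second interval are just for illustration):

    using System;
    using System.Windows.Forms;

    class ClockForm : Form
    {
        public ClockForm()
        {
            var label = new Label { Dock = DockStyle.Fill };
            Controls.Add(label);

            // The Tick event is delivered through the same message loop as the rest of
            // the UI, so the handler runs on the UI thread and can touch controls directly.
            var timer = new Timer { Interval = 1000 };
            timer.Tick += (s, e) => label.Text = DateTime.Now.ToLongTimeString();
            timer.Start();
        }

        [STAThread]
        static void Main()
        {
            Application.Run(new ClockForm());
        }
    }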

     But what if you're not working in UI code? Then it all depends on exactly how you want the timing to work out.

     

    Let's say I want a report to run every 30 minutes. Using a timer, you set the timer for 30 minutes and it starts generating a report every 30 minutes or so. If you use the sleep method, you need to know how long your report is going to take to generate, and it needs to take that long consistently, because you'll need to sleep for 30 minutes minus the duration it takes to run the report. If you have a report that takes 0 seconds to run in the middle of the night, but 25 minutes to run during peak time, how do you set up your sleep intervals?

     

    Yes, your method has its place, but it's more at home in actual real-time programming, rather than the "close enough" realm of .NET.



    Come on people, "real-time" is easy. It just means "you need the result between time X and Y; too early is wrong, too late is wrong."



    And by this definition you can do real-time stuff in any language. As long as you watch the following:

    - How bad is it to be wrong? If you are working on a pacemaker, then being 'wrong' (too late or too early with a shock) means someone dies. Being wrong is VERY bad then.

    - How hard do you need to prove that you will be right? Do you need to be right 99.9% of the time, or only 90% of the time?



    Most, if not all, applications have real-time requirements.



    Examples:

    When I hit search on Google, I expect a result within 1 second, 90% of the time. It's a pretty soft requirement, but it is there.

    When I hit the brakes on my car I expect the brakes to work within 100ms (a blink of an eye) 99.999% of the time.



    Real-time does not mean "very fast" or "instant". It's usually interpreted as "you need to be on time many 9s of the time".



  • @beau29 said:

    I feel like I'm actually giving up way too much information about real-time programming for free by even posting this.

     

     This really goes well with your common references to the academic mentality of the "upper echelon" of programmers.

     

    BTW, you are the weakest link. G'bye



  • @Weng said:

    Let's say I want a report to run every 30 minutes. Using a timer, you set the timer for 30 minutes and it starts generating a report every 30 minutes or so. If you use the sleep method, you need to know how long your report is going to take to generate, and it needs to take that long consistently, because you'll need to sleep for 30 minutes minus the duration it takes to run the report. If you have a report that takes 0 seconds to run in the middle of the night, but 25 minutes to run during peak time, how do you set up your sleep intervals?

    Easy.

    Either: Kick off the report in another thread and go to sleep for 30 minutes.

    Or: Run the report, then calculate the time interval between now and when the report is next due, and sleep for that long.
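    Something like this, roughly (RunReport, the Interval constant, and the scheduling details are placeholders):

    using System;
    using System.Threading;

    class ReportScheduler
    {
        static readonly TimeSpan Interval = TimeSpan.FromMinutes(30);

        // Placeholder: takes anywhere from seconds to 25 minutes, depending on load.
        static void RunReport() { }

        static void Main()
        {
            var nextDue = DateTime.UtcNow;

            while (true)
            {
                RunReport();

                nextDue += Interval;                    // schedule from the previous due time,
                                                        // not from "now", so drift doesn't accumulate
                var wait = nextDue - DateTime.UtcNow;
                if (wait > TimeSpan.Zero)
                    Thread.Sleep(wait);                 // sleep until the next run is due
                else
                    nextDue = DateTime.UtcNow;          // the report overran its slot; reset the schedule
            }
        }
    }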

     



  • @tster said:

    @beau29 said:

    I feel like I'm actually giving up way too much information about real-time programming for free by even posting this.

     

     This really goes well with your common references to the academic mentality of the "upper echelon" of programmers.

     

    BTW, you are the weakest link. G'bye

    Wait, isn't beau29 the whiny socialist who is always complaining about "greedy" corporations?  And he's developing real-time software in .NET?  And he's afraid of sharing knowledge "for free"?



  • @morbiuswilters said:

    Wait, isn't beau29 the whiny socialist who is always complaining about "greedy" corporations?  And he's developing real-time software in .NET?  And he's afraid of sharing knowledge "for free"?

     

    That is exactly my point.  I am not going to bother arguing any actual points in the OP (which is fucking ludicrous by the way) because he is an epic troll along these lines.



  • @SCB said:

    Easy.

    Either: Kick off the report in another thread and go to sleep for 30 minutes.

    Or: Run the report, then calculate the time interval between now and when the report is next due, and sleep for that long.

     

    That code wouldn't pass a code review with me.  Timers are a good feature and are much better than writing your own threading code.



  • @SCB said:

    Or: Run the report, then calculate the time interval between now and when the report is next due, and sleep for that long.

     

    I think that many game loops work that way. The examples I've looked at worked exactly like that.

    @tster said:

    That code wouldn't pass a code review with me.  Timers are a good feature and are much better than writing your own threading code.

    Code reviews are a waste of time. Every one I've ever been involved in has devolved into either 1) a pissing contest over variable naming, indenting, etc. or 2) interminable discussion along the lines of "you could have done X. That's an example of the <speakers_pet_pattern> pattern. That pattern is so sick! Oh, wait, you couldn't use <speakers_pet_pattern>; it's only for classes with non-covariant semantical paradigms. But you could have used <speakers_second_favorite_pattern>. That one's pretty sick too." I can endure such sessions only by poking my palm with a sharpened pencil and mentally promising myself a 64 oz. daiquiri.

    In response to the OP, I have found that timers appeal to programmers who want everything abstracted away from them. It is "nice" sometimes that they can post their events into the same message queue as the GUI. But as the OP pointed out, this feature can also undermine the programmer in subtle ways.



  • @bridget99 said:

    Code reviews are a waste of time. <words>

    Honestly, I didn't need to read past this to dismiss the whole post.  I've worked with plenty of people who don't care what other programmers think about code.  They tend not to work well on teams and usually stagnate with design as they don't learn from the people around them.

    Other than that, what makes you think timers run code within the UI thread?   You can have timers without having a UI at all!  



    @beau29 said:

    Practically, when doing real-time programming in .NET, I find the only timing strategy that works reliably is to be greedy. If a program tells .NET it needs something every half second, .NET will respond by running the handler at (for example) 550 ms, then 540 ms later, then maybe 1200 ms later, then perhaps 490 ms later, and so on. The looping approach I suggested is more deterministic; if the sleep time is 0, then 100% / (number of cores) CPU utilization will be observed. This percentage can be throttled downward by increasing the sleep time. So, when doing real-time programming in .NET, I typically decide on an acceptable CPU utilization level, and tweak the sleep time to achieve this. If performance isn't adequate at that level, then I go to management (or to whoever wrote that spec) and let them make a decision. In my experience trying to beat real-time performance out of .NET, this is the technique that's worked. And whoever wrote the spec does not care that, instead of "8 times a second," they are getting their data "as fast as .NET can manage."

     

     

    The reason the .NET "Timer" works this way has nothing to do with its "ancestor" and everything to do with how a Windows timer works.

     

    Now, given that they seem to require a form to work, it's safe to assume that the implementation uses the message-pump-based handling technique, whereby WM_TIMER messages are dispatched to that window's message handler. The unpredictable nature of how the messages are handled - and how quickly - is what causes the observed inconsistencies in the event intervals.

     

    The "real-time programming" approach is filled with flaws, not the least of which being that the User Interface becomes completely unresponsive unless you perform the "blocking" on a separate thread. and even then, the Application- or thread - will be consuming the entire timeslice that is available, whereby using a Timer will be relatively processing time free, (relying instead on timer interrupts).

     

    Additionally, it is often implemented poorly. Sometimes a real-time program can "run too fast" - for example, a game. The solution I have often seen employed in such cases was something along the lines of:

    for(int x=0;x<50000;x++);

    or something similarly processor-specific. Of course the programmer's mildly WTF-ey solution when he upgrades? Easy! Increase the number. Problem solved. And then when the users complain that it runs slow as molasses on their machines? Make it a setting. Of course this all side-steps the fact that these "timing" issues are directly related to the time that is passing. Therefore it isn't too hard to put a delay in each loop iteration that is exactly the right amount, by keeping track of the average execution time per iteration and then compensating for the variances, to make the game or simulation run as smoothly as possible.
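    Something along these lines, as a rough sketch (the 60 updates-per-second target and UpdateWorld are placeholders):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class CompensatedLoop
    {
        const double TargetFrameMs = 1000.0 / 60.0;   // aim for roughly 60 updates per second

        // Placeholder: move sprites, run physics, spin the stupid cursor, etc.
        static void UpdateWorld() { }

        static void Main()
        {
            var watch = new Stopwatch();

            while (true)   // stand-in for "while the game is running"
            {
                watch.Restart();
                UpdateWorld();

                // Sleep off only the time this iteration did NOT use, instead of burning
                // a fixed, processor-specific number of empty loop iterations.
                double remaining = TargetFrameMs - watch.Elapsed.TotalMilliseconds;
                if (remaining > 0)
                    Thread.Sleep((int)remaining);
            }
        }
    }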

     

    Now, that being said, both have their advantages. A basic timer control is fine when you need something to, say, check for the existence of a file (assuming you're too dumb to use SHRegisterChangeNotification() or FindFirstChangeNotification()) or something along those lines.

     

    On the other hand, since the "real-time" approach is almost always far more resource-hungry in terms of CPU power, it's a bit of a trade-off. And for most of the circumstances where perfect or near-perfect timing is necessary, it's almost unavoidable (high-resolution timers notwithstanding, but hey, if they don't exist in the .NET Framework, they simply don't exist. dllimport? Never heard of it. Go away).

     

    Of course "real-time applications in .NET" is a total WTF anyway.




  • @BC_Programmer said:

    The reason the .NET "Timer" works this way has nothing to do with it's "ancestor" and everything to do with how a Windows Timer works.

    Now, given that they seem to require a form to work...

    .NET Timers do not require a form.



  • @BC_Programmer said:

    Additionally, it is often implemented poorly. Sometimes a real-time program can "run too fast" - for example, a game. The solution I have often seen employed in such cases was something along the lines of:

    for(int x=0;x<50000;x++);

    or something similarly processor-specific. Of course the programmer's mildly WTF-ey solution when he upgrades? Easy! Increase the number. Problem solved. And then when the users complain that it runs slow as molasses on their machines? Make it a setting.

    Which games have used this in the last 2 decades?  Maybe some terribly shitty VB written by a moron and not used in production.



    I never said it was a new game! Actually, come to think of it, I don't even think it was a game. If memory serves it was one of the flavours of Borland C or C++ for DOS. Some boring utility; it probably had something like that for a stupid rotating cursor effect. Yeah, that's time well spent.

     

     



    Oh... so they don't... hmm, well, the runtime might send it to a hidden window, or just use a TimerProc callback... I think that requires a message pump on the application's main thread, not sure though. Read it somewhere; I believe it was Raymond Chen's blog.

     

    I haven't used any timer control in ages, I hate them... too... restricting... I prefer to construct my own timers with SetTimer() and associated functions, and go from there. (Recently I believe the only use I had was for a method to force VB6 to effectively "create a new thread" by calling back into an object in the TimerProc. A tad on the odd side, but it worked for that purpose.)

     

    Didn't realize the whole "timer needs a form" stupidity was fixed yet. My mistake. (Yes, I'm a bit behind the times. But hey, it's not my job, it's my hobby, at least for the time being. Now if it were my job, it would be kinda negligent of me to insist that new development ever occur in an 11-year-old tool.)



  •  You have much to learn grasshopper.



  •  @tster said:

     You have much to learn grasshopper.

     

    Ahh, multiple implementations; but are they all not subject to the seemingly random delays caused by servicing the timer messages?



  • @morbiuswilters said:

    Which games have used this in the last 2 decades?
     

    Assumed extra meaning: "newer games don't fuck up time anymore."

    While probably not caused by the Dumbass Delay method, even a modern game such as Fallout 3 suffers from timing issues. While it's expected that its world clock runs a tad slower when the scene is hard to render, it also speeds up when the environment is dead simple, instead of just running at 100% speed with a higher framerate. This is both odd and annoying, because running through corridors as if it's a movie played back at 130% speed really breaks immersion.

    You get the same shit with GTA2 (the top-down sprite-based one, you know), and I used to keep Frame Rate Limiter off because the gameplay was unbearably sluggish at the authors' apparent "intended" world speed. On a new computer with the limiter off, the world zips by at breakneck speeds and it's again unplayable. :(

     

    PS.
    This is not an F3 thread hijack. No really!



  • @BC_Programmer said:

    are they all not subject to the seemingly random delays caused by servicing the timer messages?
    What?



  •  @tster said:

    @BC_Programmer said:

    are they all not subject to the seemingly random delays caused by servicing the timer messages?
    What?

     

     

    What I mean is, since I assume (possibly wrongly) that each Timer implementation is using a Windows timer under the covers, are they also subject to the slightly erratic timing issues, turning a 15 ms interval timer into an "I'll try to fire every 15 ms, but I won't make any promises" timer?

     

    Not that I personally think this is "bad" behaviour; in fact, in most instances where you supposedly need a timer to fire at exact intervals, it's likely you shouldn't be using a timer in the first place and should instead, as mentioned previously, use some sort of idle-loop construct.



  • @dhromed said:

    @morbiuswilters said:

    Which games have used this in the last 2 decades?
     

    Assumed extra meaning: "newer games don't fuck up time anymore."

    While probably not caused by the Dumbass Delay method, even a modern game such as Fallout 3 suffers from timing issues. While it's expected that its world clock runs a tad slower when the scene is hard to render, it also speeds up when the environment is dead simple, instead of just running at 100% speed with a higher framerate. This is both odd and annoying, because running through corridors as if it's a movie played back at 130% speed really breaks immersion.

    You get the same shit with GTA2 (the top-down sprite-based one, you know), and I used to keep Frame Rate Limiter off because the gameplay was unbearably sluggish at the authors' apparent "intended" world speed. On a new computer with the limiter off, the world zips by at breakneck speeds and it's again unplayable. :(

     

    PS.
    This is not an F3 thread hijack. No really!

    My God, I thought frame rendering rate had been decoupled from game time long ago.  I'm going to go cry now.


  • Garbage Person

    @morbiuswilters said:

    My God, I thought frame rendering rate had been decoupled from game time long ago.  I'm going to go cry now.

    It DID get decoupled once. However, because most games are now developed for consoles first and ported to PC, it's come back and is EXCESSIVELY common. Even on net-centric games, it'll often be linked to the framerate on the host (on dedicated servers they use a proper timer running at the ideal framerate instead). This is why most games have a frame limit cap (the better ones will allow you to adjust the cap and scale its timing accordingly)


  • @Weng said:

    It DID get decoupled once. However, because most games are now developed for consoles first and ported to PC, it's come back and is EXCESSIVELY common. Even on net-centric games, it'll often be linked to the framerate on the host (on dedicated servers they use a proper timer running at the ideal framerate instead). This is why most games have a frame limit cap (the better ones will allow you to adjust the cap and scale its timing accordingly)
     

    This sucks.

    I always thought of framerate and resolution as a kind of currency, that you may exchange for graphical quality, and you can get some back if you buy less quality or if it is maxed out. Instead, I get a game hopped up on speed.

    I feel fortunate in a stupid ironic way that my system is only a little bit over the max specs for Fallout. My next computer will have it suffer the GTA2 effect once more. :\

    More games:

    • Painkiller: suffers from the same, but actually enters a strange slow-motion mode when your system doesn't cut it for a scene, without dropping frames. Speedup is up to a ridiculous 200% (eyeball figure). It's hard to play, but not as bad as dying needlessly because the game decided not to show you that some creature had advanced already.
    • GTA: SA: suffers from it, but only slightly, for some reason. I still play it.
    • Half-Life 2+episodes: no issues at all! Moar system => moar framez & moar res & moar AA! Wasn't this game conceived on the PC and ported to the console? Valve wins again.
    • Raptor, Call of The Shadows: runs without issues. This is a 320×240 DOS game from somewhere in the late Victorian age. No emulation required.


  • @dhromed said:

    Raptor, Call of The Shadows: runs without issues. This is a 320×240 DOS game from somewhere in the late Victorian age. No emulation required.
    You win for knowing this game.


  • Garbage Person

    @dhromed said:

    Raptor, Call of The Shadows: runs without issues. This is a 320×240 DOS game from somewhere in the late Victorian age. No emulation required.
    That is, unless you're a real man and run 64-bit Windows, at which point you need emulation and need to spend half an hour beating on DOSBox and Raptor's configurator to get a workable sound configuration (one that doesn't crash and uses the full-fidelity audio tracks).

    Also, why do we still develop games? Apogee wrote Raptor like 20 years ago and it's still the greatest thing ever.



  • @Weng said:

    spend half an hour beating on DOSBox and Raptor's configurator to get a workable sound configuration

    What the hell are you using sound for? You're just going to hold down fire continuously anyway, making for a damn annoying sound experience.

    @Weng said:

    Apogee wrote Raptor like 20 years ago and it's still the greatest thing ever.

    This is true. But I've beaten it once, and have the optimal weapon configuration (twin laser + auto-aim laser), and I get through the first world mostly on muscle memory.  :)

    That third space world is a bitch, though. Enemies are tough enough to not be bothered by your auto-aim laser, so they often plod on to the bottom of the screen -- not to mention those awful base bosses that rain down fire in ways uncomparable to any of the flying bosses.



  •  I'm actually looking for shooters comparable to Raptor, but have been unsuccessful so far. Something with similar gameplay, but with mega spicy graphics.



  • @dhromed said:

     I'm actually looking for shooters comparable to Raptor, but have been unsuccessful so far. Something with similar gameplay




    Tyrian was something like Raptor, but it's from the same century, therefore old graphics.



  •  @Nelle said:

    Tyrian was something like Raptor, but it's from the same century, therefore old graphics.

    Old graphics do not scare me per se; they would be nice to have though.

    There was that other old shooter where you control a spaceship, and all your bullets looked like little blue-grey cylinders, kind of like the bookmark icons in Visual Studio. The view was squarish, non-scrolling, and the surrounding interface was space-tech. Kind of like Tyrian, but grey bullets and grey HUD. In a sense, it resembled a shooter version of Arkanoid, rather than a free-flight shooter.


  • Garbage Person

    @dhromed said:

    What the hell are you using sound for? You're just going to hold down fire continuously anyway, making for a damn annoying sound experience.

    True, the music alone is perfectly serviceable, and the best setting for that works out of the box. There's just something therapeutic about things that go "BMMMPH" when they explode, though (many world 1 ground structures for example).

     @dhromed said:

    This is true. But I've beaten it once, and have the optimal weapon configuration (twin laser + auto-aim laser), and I get through the first world mostly on muscle memory.  :)

     I've beaten the first 2 levels of world 1 with the bloody monitor turned off, playing strictly by sound and memory.



  • @Weng said:

     I've beaten the first 2 levels of world 1 with the bloody monitor turned off, playing strictly by sound and memory.
     

    Oh yeah? I've beaten wave 1 and 2 with... the computer turned... off...

    oh well.

    you win.



  • @Nelle said:

    @dhromed said:
     I'm actually looking for shooters comparable to Raptor, but have been unsuccessful so far. Something with similar gameplay


    Tyrian was something like Raptor, but it's from the same century, therefore old graphics.
    Tyrian was great.  Also, it was released as freeware in 2004, so it can be had (legally!) for free.



  • @Weng said:

    It DID get decoupled once. However, because most games are now developed for consoles first and ported to PC, it's come back and is EXCESSIVELY common. Even on net-centric games, it'll often be linked to the framerate on the host (on dedicated servers they use a proper timer running at the ideal framerate instead). This is why most games have a frame limit cap (the better ones will allow you to adjust the cap and scale its timing accordingly)

    Yay! People are finally starting to program correctly again!



  • Screw you bridget.  I read way too many posts in this thread before I realized it was from 2009.



  • @Sutherlands said:

    Screw you bridget.  I read way too many posts in this thread before I realized it was from 2009.

    A "timeless" thread?



  • @serguey123 said:

    @Sutherlands said:

    Screw you bridget.  I read way too many posts in this thread before I realized it was from 2009.

    A "timeless" thread?

    It's fun to review though, because there are SO MANY WTFs, from the misuse of "real time", to the assertion that .NET is based on VB6, to the assertion that .NET Timers require forms (I guess I was just imagining all those Services I have timers in...)

    One down side, though: there's no me in it. I like reading me.



  • Timers != Delay

    For all those smart braines using delay methods to slow down process, remember that  timer's are not suposed to halt aplications.

     

    Your loops are doing exactly that only and thereby creating negativ impact on application.




  • What happened to the Nagesh who was around 6 months ago? I'm guessing body-snatchers.



  • @blakeyrat said:

    What happened to the Nagesh who was around 6 months ago? I'm guessing body-snatchers.

    Obivious that this user acount is shared by many people in softwear develop team. So each one has his own style and trying to keep up with predecesor's style prove dificult to next person.

    Or Nagesh is sufering from multipel personal disorder.



  • @Nagesh said:

    For all those smart braines using delay methods to slow down process, remember that  timer's are not suposed to halt aplications.

    Unless you are writing a program called TURTLE.EXE, as I did years ago in VB 1.0 to slow down DOS games that were old even at that time so they would be playable! Silly game programmers basing the speed of the game on your CPU clock speed...



    A common modern mistake in games programming is applying framerate-independent motion incorrectly. Most engines have an internal floating-point value of seconds since the last frame. Multiply the speed of an object by that and that's how much it should move. Simple! Except a lot of people don't really think, and apply this logic to decay (such as friction) and so on and so forth, causing all kinds of madness at some framerates. "Yeah, I'll just multiply the velocity by 1-delta, that'll work" - as if frame times never go over a full second and decay is linear. As previously mentioned, most development is done in the console environment where framerates are pretty static, so most of it goes unnoticed. Also, it pisses me off because it's damned sloppy and exploitable as hell.
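    A quick illustration of the difference (the Friction value and the numbers are made up; the point is only that the exponential form gives the same answer per second of game time at any framerate):

    using System;

    class DecayDemo
    {
        const float Friction = 0.5f;   // fraction of velocity lost per second (illustrative value)

        // Wrong: treats the per-second decay as if it were linear in delta. The effective
        // friction now depends on the framerate, and delta >= 1s flips the velocity's sign.
        static float LinearDecay(float velocity, float delta)
        {
            return velocity * (1.0f - Friction * delta);
        }

        // Framerate-independent: exponential decay gives the same result per second of
        // game time no matter how many frames that second is sliced into.
        static float ExponentialDecay(float velocity, float delta)
        {
            return velocity * (float)Math.Pow(1.0f - Friction, delta);
        }

        static void Main()
        {
            // One second of game time, simulated as a single frame versus sixty frames.
            float a = LinearDecay(10f, 1f);                                                   // 5.00
            float b = 10f; for (int i = 0; i < 60; i++) b = LinearDecay(b, 1f / 60f);         // ~6.05
            float c = 10f; for (int i = 0; i < 60; i++) c = ExponentialDecay(c, 1f / 60f);    // ~5.00

            Console.WriteLine("linear, 1 frame: {0:F2}; linear, 60 frames: {1:F2}; exponential, 60 frames: {2:F2}", a, b, c);
        }
    }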



  • @blakeyrat said:

    @serguey123 said:
    @Sutherlands said:
    Screw you bridget.  I read way too many posts in this thread before I realized it was from 2009.

    A "timeless" thread?

    It's fun to review though

    I was thinking about the 64oz daiquiri (that's almost 2 litres!) then I saw morbius and thought "He's back?!" then I saw the dates. Then I was disappoint.



  • @ekolis said:

    @Nagesh said:

    For all those smart braines using delay methods to slow down process, remember that  timer's are not suposed to halt aplications.

    Unless you are writing a program called TURTLE.EXE, as I did years ago in VB 1.0 to slow down DOS games that were old even at that time so they would be playable! Silly game programmers basing the speed of the game on your CPU clock speed...

    and what hapens after CPU gets faster?



    If you want quasi-real time in .NET, you should use a multimedia timer.
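    That generally means P/Invoking timeSetEvent from winmm.dll. A rough sketch (the 10 ms period and the console output are only illustrative):

    using System;
    using System.Runtime.InteropServices;
    using System.Threading;

    class MultimediaTimerDemo
    {
        // Native callback signature expected by timeSetEvent (winmm's TIMECALLBACK).
        delegate void TimeProc(uint id, uint msg, UIntPtr user, UIntPtr dw1, UIntPtr dw2);

        [DllImport("winmm.dll")]
        static extern uint timeSetEvent(uint delayMs, uint resolutionMs, TimeProc callback, UIntPtr user, uint eventType);

        [DllImport("winmm.dll")]
        static extern uint timeKillEvent(uint timerId);

        const uint TIME_PERIODIC = 1;

        static void Main()
        {
            // Keep a reference to the delegate so the GC can't collect it while the
            // native timer still holds a pointer to it.
            TimeProc tick = (id, msg, user, dw1, dw2) =>
                Console.WriteLine("tick at {0:HH:mm:ss.fff}", DateTime.Now);

            uint timerId = timeSetEvent(10, 1, tick, UIntPtr.Zero, TIME_PERIODIC);   // ~10 ms period, 1 ms resolution

            Thread.Sleep(200);      // let it fire a few times
            timeKillEvent(timerId);
            GC.KeepAlive(tick);
        }
    }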

