I hate Chrome



  • I debated posting this, because I know someone's going to be a dick about it. But then I realized-- hey! I don't give a shit what you think! So please keep that in mind before calling me a dick for using the following code.

    Chrome has some very very weird behaviors. Ok, let's say I want to pop off an analytics pixel at unload time... we need a little delay to ensure the image request gets sent, so we shove in this little gem:

    var s = new Date(), e = s;

    while (e.getTime() - s.getTime() < 500)
    {
       e = new Date();
    }

    Ok, run it through Chrome and-- Exception? Huh? "Error: Too much time spent in unload handler." Hm. Well, fair enough. Let's fix that:

    var s = new Date(), e = s;

    while (e.getTime() - s.getTime() < 1)
    {
       e = new Date();
    }

    And: "Error: Too much time spent in unload handler."

    1 millisecond is too long for you Chrome? Is that what you're saying? OH NO WAIT this exception is a lying liar with its pants thoroughly on fire. It has nothing to do with the amount of time spent in the handler; it really should read: "Error: Chrome detected you're trying to delay unload to ensure a pixel request goes through. Even though there's no way to accomplish this without delaying unload, we're going to be total dicks about it and pre-empt your attempt."

    So what's the fix?

    var s = new Date(), e = s;

    for( var i = 0; e.getTime() - s.getTime() < 500; i++)
    {
       e = new Date();
    }
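
    For what it's worth, here's a minimal sketch of how the pieces fit together in a BeforeUnload handler, with the spin pulled out into a function. The pixel URL, the 250 ms figure, and the function names are all made up for illustration:

```javascript
// Busy-wait using the for(;;) shape that Chrome tolerates.
// Returns the number of iterations spun (varies by machine).
function busyWait(ms) {
    var s = new Date(), e = s;
    for (var i = 0; e.getTime() - s.getTime() < ms; i++) {
        e = new Date();
    }
    return i;
}

// Fire the tracking pixel, then stall so the request gets on the wire.
function sendPixelAndStall() {
    if (typeof document !== 'undefined') { // guard so the spin runs anywhere
        var img = document.createElement('IMG');
        img.src = 'http://example.com/pixel.gif?evt=unload'; // hypothetical URL
    }
    busyWait(250); // give the request a head start before unload proceeds
}

if (typeof window !== 'undefined') {
    window.onbeforeunload = sendPixelAndStall;
}
```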



  • What the hell. How, and why, does Chrome behave that way?



  •  If you spent half a second creating new objects just to turn around and destroy them, I'd yell at you too.



  • @delta534 said:

    What the hell. How, and why, does Chrome behave that way?

    I fiddled around a bit, it's really quite targeted. A while() loop alone doesn't trigger the behavior, nor does calling getTime() alone. But if you call getTime() in the condition or body of a while() loop, bam.

    I can totally understand preventing infinite loops in BeforeUnload handlers, because duh. What's really stupid is that while the above quirky behavior is obviously designed to do that, they missed a bunch of other ways to lock up their own browser. For example (and excuse the pseudocode, typed from memory in a notepad app):


    var img = document.createElement('IMG');
    img.src = 'http://url_i_want_to_load';
    img.loaded = false;
    img.onload = new Function('e', 'this.loaded = true');

    while( !img.loaded )
    {
      DoSomethingToTrickChromeIntoThinkingThisCodeDoesSomethingUseful();
    }

    Now obviously that while condition is never going to become false, because there's no way the image's onload handler can be triggered while we're in the middle of another handler. But you'd expect Chrome to do the usual "chug along for about 30 seconds, then give us a 'this page stopped responding' dialog"... right?

    Nope! Chrome happily just keeps on chugging along until the end of time, or the user kills the tab.

    You know it's a fun day at work when you accidentally figure out how to write malware.

    Oh, and BTW: the above code? Exists in an extremely popular web analytics program. Their saving grace is that they check a timer *and* the img.loaded property, but apparently it never occurred to the huge corporation whose name rhymes with "badobe" (which recently bought this web analytics program whose name sounds like "omniturd") to check whether the image onload handler could ever possibly execute. Fucking idiots. Our 2-man team collects ten times the data in half the JS, and our JS doesn't have retarded bugs in it.
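
    For the record, that "saving grace" guard is easy to sketch: wait on the flag, but never past a deadline. The function name and shape here are mine, not Omniture's, and the for(;;) spin is deliberate, since (per the fiddling above) Chrome's heuristic flags getTime() inside a while loop:

```javascript
// Wait for obj[flagName] to become truthy, but give up after maxMs.
// The deadline check guarantees the loop terminates even if the flag
// can never flip (e.g. an onload that can't fire during another handler).
function waitForFlag(obj, flagName, maxMs) {
    var deadline = new Date().getTime() + maxMs;
    for (;;) {
        if (obj[flagName]) return true;              // flag came up
        if (new Date().getTime() >= deadline) break; // don't spin forever
    }
    return false;
}
```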

    Edit:

    @TehFreek said:

    If you spent half a second creating new objects just to turn around and destroy them, I'd yell at you too.

    Well, since the entire point of the exercise is to delay the browser, it doesn't really matter what we do in the loop.

    Believe me, if there was a way to tell the browser, "look, I know the page is unloading, but I really need this image request to go through first-- could you just spawn off another thread and do that for me while you're loading the next page? I'll be your best buddy!" then I'd use it. But there ain't, so this is what we're stuck with.



  • @blakeyrat said:

    Ok, let's say I want to pop off an analytics pixel at unload time... we need a little delay to ensure the image request gets sent
    Is there not another way to do things? I know nothing at all about this, so maybe I've got hold of the wrong end of the stick again, but surely doing anything at all after I try to close a page is not the right way to go about things if you're trying to be ethical?


    Instead of firing off a pixel at unload, why not just load it every ten seconds (or at whatever interval is suitable) whilst the page is still open?

    [Edit: Or to generalise that a bit more, check if the page is still open, rather than trying to get it to do something after it closes.]



  • Good thought, but nope.

    The goal is to record what link the user clicked to leave the page. The instant they click the link, all execution of timers on the page stops and the only way of executing JS at all is in the BeforeUnload handler. Now, if the link happens to be going to another page on the same domain, what we can do is record the clicked link in a cookie, then have the next page read the cookie and send the pixel on page load. (In fact, that's exactly what Badobe Omniturd does with internal links.) But for links going to a different domain-- there's no alternative to this I know of.

    (By all means, though, if you come up with any ideas, I'd love to hear them. Anything to give us an edge over other analytics vendors is good.)



  • Redirect external links through another page? That is, instead of linking to http://example.com/, link to extlink.php?url=http://example.com/. Lots of sites do this already.
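
    A rough sketch of that approach, assuming a hypothetical tracker domain and the extlink.php endpoint named above (the DOM-rewriting half needs a live page, so it's illustrative only):

```javascript
// Build the redirect URL for a given target (endpoint name from the post,
// domain made up).
function toRedirectUrl(targetUrl) {
    return 'http://tracker.example.com/extlink.php?url=' +
           encodeURIComponent(targetUrl);
}

// In a browser you'd rewrite each external anchor once the DOM is ready.
function rewriteExternalLinks(doc) {
    var anchors = doc.getElementsByTagName('a');
    for (var i = 0; i < anchors.length; i++) {
        var a = anchors[i];
        if (a.hostname && a.hostname !== doc.location.hostname) {
            a.href = toRedirectUrl(a.href);
        }
    }
}
```

    (Doing this bug-free on arbitrary pages is another matter, as the thread goes on to note.)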



  • Question.

    Why do you do this:

    @blakeyrat said:

    img.onload = new Function('e', 'this.loaded = true');
     

    Instead of this:

    img.onload = function (e) {
       this.loaded = true;
    };

    thus not having script in a string?
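
    For anyone following along, the two forms behave identically once wired up; the difference is that new Function compiles its body from a string at runtime (and can't close over local variables). A quick sketch with plain objects standing in for the image:

```javascript
// String-compiled handler, as in the quoted code.
var img1 = { loaded: false };
img1.onload = new Function('e', 'this.loaded = true;');

// Ordinary function expression.
var img2 = { loaded: false };
img2.onload = function (e) {
    this.loaded = true;
};

// Simulate the browser firing each handler with the image as `this`.
img1.onload.call(img1, {});
img2.onload.call(img2, {});
```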


  • Discourse touched me in a no-no place

    @immibis said:

    Redirect external links through another page? [...] ~~Lots of~~ Too many sites do this already.
    That way leads to madness: http://royal.pingdom.com/2010/09/22/is-the-web-heading-toward-redirect-hell/ (or just google for "redirect hell" for other pages.)



  • Ah, the laziest of optimizations.

    Does IE9 exhibit the same behaviour?



  •  Hm? I'm not seeing the problem, to be honest... the following works fine for me:

    <!doctype html>
    <html>
        <body>
        <script>
        function test(a)
        {
            alert('Do anything, like an ajaxy tracking call?');
            var d = new Date();
            var now = d.getTime();
            while (now + 1000 > (new Date()).getTime()) {
                // pause...
            }
            return true;
        }
        </script>
        <a href="http://google.com" onclick="return test(this)">test</a>
        </body>
    </html>

    You could also return false and have the Ajax call redirect after success, thus eliminating the need for a timer completely.

    Either that, or the Linux version of Chrome differs wildly from the Windows version.



  • Disable the link default action until you're done :

    <a href="link_url_whatever" onclick="var a = this; setTimeout(function () { out(a); }, 100); return false;">link text</a>

    ...with an "out" function in JS like :
    function out(el) {
       // do all you want to do with your images here
       doStuff();
       // then link
       document.location.href = el.href;
    }



  • Either I didn't fully understand what you want, or toshir0's answer is the one that seems obvious to me — and in that case I wonder why it didn't occur to you.

    BTW, before "hanging" Chrome for a while, remember to cover the whole page with a semi-transparent div with style="cursor: wait". Giving feedback to the user is always important, and lack thereof has always been a pet peeve of mine.



  • @dhromed said:

    Question.

    Why do you do this:

    @blakeyrat said:

    img.onload = new Function('e', 'this.loaded = true');
     

    Instead of this:

    Because I was using Badobe code as an example of what not to do. Yes, not only is the code completely non-functional, even harmful, it's also bad style... but guess what? That code is live on every site that uses Badobe Omniturd right now this instant. I'm not making shit up to make them look worse than they are.

    @PJH said:

    That way leads to madness: http://royal.pingdom.com/2010/09/22/is-the-web-heading-toward-redirect-hell/ (or just google for "redirect hell" for other pages.)

    Also, I don't own the HTML of the page. I'm only allowed to install a single JS file.

    Yes, yes, theoretically, the JS file could surf the DOM for links and replace every link with a redirect, but that's virtually impossible to do in a bug-free manner. It'll probably break their cookie handling, too, since our redirect would go through another domain. Edit: Oh, and of course that wouldn't work for form submissions, something that causes an unload that we really need to know about.

    @zipfruder said:

    Does IE9 exhibit the same behaviour?

    Honestly, not 100% sure. But I doubt it. I do know the modified code (with the for() loop) works in IE9. Also, I checked out this code in IE9 when it came out a month ago, and I didn't notice any problem receiving unload pixels, so I'm pretty sure it's always worked in IE.

    [soapbox]Of course, IE is the only web browser not coded by open source hippies and head-in-the-sky academics, so it stands to reason they'd be the one to understand the necessity of code like this.[/soapbox]

    @Monomelodies said:

    Hm? I'm not seeing the problem, to be honest... the following works fine for me:

    What you're missing is that the code runs in the BeforeUnload handler. I should have made that clearer in my original post.

    If you moved your code into the BeforeUnload handler, and made an actual AJAX call instead of popping an alert, you'd see that the AJAX request never actually gets made-- instead the BeforeUnload handler stops execution on the page before the browser has a chance to request the image/XML/whatever. The entire point of this code is to delay the browser so that the image request goes through before execution stops.

    @toshir0 said:

    Disable the link default action until you're done :
    <a href="link_url_whatever" onclick="var a = this; setTimeout(function () { out(a); }, 100); return false;">link text</a>

    ...with an "out" function in JS like :

    I don't have control over the HTML of the page. Anyway, that wouldn't work-- the instant a link is clicked, all execution other than the BeforeUnload handler stops. That includes timers. As I pointed out in this post nobody read, apparently.

    @Zecc said:

    BTW, before "hanging" Chrome for a while, remember to cover the whole page with a semi-transparent div with style="cursor: wait". Giving feedback to the user is always important, and lack thereof has always been a pet peeve of mine.

    For 250 milliseconds? During the page unload? Seriously?


  • ♿ (Parody)

    So, TRWTF is web analytics?



  • @boomzilla said:

    So, TRWTF is web analytics?

    Yeah, pretty much. And how browser makers and the W3C hold their hands up to their ears and yell "NAH NAH NAH NAH!!!" whenever the need for web analytics comes up. There really should be support for something this simple in the spec. Hell, Google's own Google Analytics has code that looks... exactly like this.


  • Considered Harmful

    @blakeyrat said:

    Anyway, that wouldn't work-- the instant a link is clicked, all execution other than the BeforeUnload handler stops. That includes timers.

    Even when the link click event is blocked as clearly shown in this example? I call bullshit.



  • @joe.edwards said:

    @blakeyrat said:
    Anyway, that wouldn't work-- the instant a link is clicked, all execution other than the BeforeUnload handler stops. That includes timers.

    Even when the link click event is blocked as clearly shown in this example? I call bullshit.

    Fine; maybe it is bullshit.

    But you also have to remember, I don't own the HTML of the page. Surfing the DOM and putting a delay like this on every link is going to be extremely error-prone. Plus, I still don't believe it would work on form submissions. (Maybe it would.)

    I could explore the possibility, but a lot smarter people than me have worked on this for a lot of years, and every analytics package out there uses the same solution we're using. In any case, I don't see much difference between "delay the page for 250 milliseconds with a timer slightly before Unload" and "delay the page for 250 milliseconds in the BeforeUnload handler". So even if it worked, and if it could be installed on any arbitrary site using a generic .js file, I don't think it actually improves the situation in any meaningful way.



  • I'm sure you've tried some alternative ways already, right?

    Like storing the condition in a boolean, and checking that boolean in the while. Using a for rather than a while loop. Building an infinite loop and breaking out of it with an if() and a break. Making a counting loop that counts from 1 to 1000, resetting to 1 at 1000, and again breaking out of it with an if() and a break.

    I have no opinion on the matter nor about Chrome (I don't use it), but just trying to give some suggestions.
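
    In case anyone wants to try them, those variants sketch out like this (each spins for roughly the requested wall-clock time; the open question is which shapes Chrome's unload heuristic flags, and the function names are mine):

```javascript
// Condition cached in a boolean, checked by the while.
function spinWhileBool(ms) {
    var end = new Date().getTime() + ms;
    var keepGoing = true;
    while (keepGoing) {
        keepGoing = new Date().getTime() < end;
    }
}

// "Infinite" loop broken out of with an if() and a break.
function spinForever(ms) {
    var end = new Date().getTime() + ms;
    for (;;) {
        if (new Date().getTime() >= end) break;
    }
}

// Counting loop from 1 to 1000, resetting at 1000, broken by an if/break.
function spinCounter(ms) {
    var end = new Date().getTime() + ms;
    for (var i = 1; ; i = (i === 1000) ? 1 : i + 1) {
        if (new Date().getTime() >= end) break;
    }
}
```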



  • @blakeyrat said:

    @boomzilla said:
    So, TRWTF is web analytics?

    Yeah, pretty much. And how browser makers and the W3C hold their hands up to their ears and yell "NAH NAH NAH NAH!!!" whenever the need for web analytics comes up. There really should be support for something this simple in the spec. Hell, Google's own Google Analytics has code that looks... exactly like shit.

    ATFY (anagrammed this for you !)



  •  Time to switch back to Chrome I guess. I hate pages that have timer loops in their unload events, or use unload events in general. And I seem to remember one of the most upvoted feature requests being to have Chrome block those, but I don't have time to look it up.



  • @dtfinch said:

    Time to switch back to Chrome I guess. I hate pages that have timer loops in their unload events, or use unload events in general.

    Me too, and if browser makers or the W3C extracted their head from their ass, I'd remove that code as soon as possible.

    @dtfinch said:

    And I seem to remember one of the most upvoted feature requests being to have Chrome block those, but I don't have time to look it up.

    Ironic, since it's demonstrably worse at detecting infinite loop BeforeUnload handlers than other browsers I've tested.



  • A big problem, as alluded to above, is how marketers abuse things like unload events to do highly annoying crap like asking you "Did you really want to exit our site? Didn't you know that you can get an exciting trial offer..." (and I've had to endure bosses ordering me to add crap like that to sites I was developing). This leads to a demand that browser makers block some of the more annoying permutations of those things, which can have collateral damage.

     And there's also the "privacy freak" crowd who hates all sorts of analytics regarding their own web browsing, too.



  • @dtobias said:

     And there's also the "privacy freak" crowd who hates all sorts of analytics regarding their own web browsing, too.

    Hey, I resent that. We are not freaks.


  • ♿ (Parody)

    @dtobias said:

    A big problem, as alluded to above, is how marketers abuse things like unload events to do highly annoying crap like asking you "Did you really want to exit our site? Didn't you know that you can get an exciting trial offer..."


    Heh. This brings to mind the Raymond Chen posts where customers ask how to make their app overcome some aspect of the system to make their app always on top of even other on top windows, etc. I understand the usefulness of analytics to the people serving the pages, but fortunately they don't get to simply dictate what our browsers do in order to make blakey's life a little easier and less WTFy.



  • So you're whining because Chrome kept you from freezing up the browser? Aww, I feel for you.

    As a slightly more sane approach, have you tried synchronous XHR (setting the last parameter of xhr.open() to false)? At least in Firefox that still seems to work, even during the unload handler. Haven't tried it in Chrome yet, though.



  • @PSWorx said:

    As a slightly more sane approach, have you tried synchronous XHR (setting the last parameter of xhr.open() to false)? At least in Firefox that still seems to work, even during the unload handler. Haven't tried it in Chrome yet, though.

    What does your code look like? That doesn't work in FF or Chrome for me.

    It does work in IE8, I didn't try IE9.


  • BINNED

    Well, from the perspective of a user leaving your site I couldn't give a shit if your unload handler runs or your analytics pixel gets through. So Chrome works exactly as intended here.

    But of course the hacks people will inevitably come up with to get around this will be worse than allowing it in the first place.



  • @blakeyrat said:

    @Zecc said:
    BTW, before "hanging" Chrome for a while, remember to cover the whole page with a semi-transparent div with style="cursor: wait". Giving feedback to the user is always important, and lack thereof has always been a pet peeve of mine.

    For 250 milliseconds? During the page unload? Seriously?

    Never mind. I was thinking of the case where you waited for the request to be answered, which could potentially be quite long on bad internet days. But you're only waiting for the request to be sent, so it doesn't matter.

    In any case, are you dealing with users of tabbed browsing? I usually open a bunch of links from the same page in several tabs and only return/close the original tab much later.



  • @Zecc said:

    In any case, are you dealing with users of tabbed browsing? I usually open a bunch of links from the same page in several tabs and only return/close the original tab much later.

    I'd have to second this question. I find myself opening links in a new tab and then closing the old tab a lot, especially if there's something I want to finish reading in the main tab (for example, for a blog entry that cites a news article in the opening paragraph, I'd open the news article in a new tab and finish reading the blog entry first).



  • @MiffTheFox said:

    @Zecc said:

    In any case, are you dealing with users of tabbed browsing? I usually open a bunch of links from the same page in several tabs and only return/close the original tab much later.

    I'd have to second this question. I find myself opening links in a new tab and then closing the old tab a lot, especially if there's something I want to finish reading in the main tab (for example, for a blog entry that cites a news article in the opening paragraph, I'd open the news article in a new tab and finish reading the blog entry first).

    I don't understand the relevance.


  • ♿ (Parody)

    @MiffTheFox said:

    @Zecc said:

    In any case, are you dealing with users of tabbed browsing? I usually open a bunch of links from the same page in several tabs and only return/close the original tab much later.

    I'd have to second this question.

    I don't know a hell of a lot about browsers, javascript, DOM, etc, but I think you guys are looking at this completely wrong. Blakeyrat wants to have some js that phones home to let him know which link you clicked on. And there's something about the browsers that stops the js from firing. What does tabbed browsing and closing tabs have to do with that? I suppose blakey can answer what happens to his code in that case...


  • @boomzilla said:

    Blakeyrat wants to have some js that phones home to let him know which link you clicked on. And there's something about the browsers that stops the js from firing. What does tabbed browsing and closing tabs have to do with that?
    Except I can click a bucket-load of links without ever firing the Unload event, and then I can close the tab without clicking a link.

    If he wants to send home what links are being clicked on, he'll be using the Clicked events on the links themselves; in which case I refer you back to toshir0's post.

    If he wants to know when the page is being closed.. well that's what the Unload event supposedly represents. But just because I have the page open — or should I say, non-closed — it doesn't mean I haven't fled the site.



  • @Zecc said:

    @boomzilla said:

    Blakeyrat wants to have some js that phones home to let him know which link you clicked on. And there's something about the browsers that stops the js from firing. What does tabbed browsing and closing tabs have to do with that?
    Except I can click a bucket-load of links without ever firing the Unload event, and then I can close the tab without clicking a link.

    If he wants to send home what links are being clicked on, he'll be using the Clicked events on the links themselves; in which case I refer you back to toshir0's post.

    If the link doesn't cause an unload, we have all the time in the world to send our tracking pixel. In fact, our script is designed for this-- we buffer events and only bother to actually send them if the buffer is reaching the URL length limit, or if the BeforeUnload event triggers.

    Whether or not the browser is using tabs doesn't really change anything at all.
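
    The buffering scheme described above sketches out roughly like this; the parameter name, the length cap, and the pixel transport are guesses, not the real script:

```javascript
var MAX_URL_LEN = 2000; // conservative cap; real URL limits vary by browser

function EventBuffer(baseUrl, sendFn) {
    this.baseUrl = baseUrl;
    this.send = sendFn; // e.g. function (url) { new Image().src = url; }
    this.events = [];
}

// The pixel URL carrying the buffered events.
EventBuffer.prototype.url = function () {
    return this.baseUrl + '?evts=' +
           encodeURIComponent(this.events.join(','));
};

// Buffer an event; flush early if the URL is nearing the length limit.
EventBuffer.prototype.record = function (evt) {
    this.events.push(evt);
    if (this.url().length >= MAX_URL_LEN) {
        this.flush();
    }
};

// Send whatever is buffered; a BeforeUnload handler would also call this.
EventBuffer.prototype.flush = function () {
    if (this.events.length === 0) return;
    this.send(this.url());
    this.events = [];
};
```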



  • @blakeyrat said:

    If the link doesn't cause an unload, we have all the time in the world to send our tracking pixel. In fact, our script is designed for this-- we buffer events and only bother to actually send them if the buffer is reaching the URL length limit, or if the BeforeUnload event triggers.

    Whether or not the browser is using tabs doesn't really change anything at all.

    Okay. So you're buffering events. That explains why you care about the page being closed.



  • @Zecc said:

    Okay. So you're buffering events. That explains why you care about the page being closed.

    Well, even if we weren't, the last event before unload is generally the most important one to know.



  • I don't know about you, but from testing with a minimal script and some netcat goodness, I've found this to work fine:

     

    <script type="text/javascript">
        function myUnload() {
            var request = new XMLHttpRequest();
            request.open("GET", "http://myservernamehere:65534/123", false);
            request.send();
        }
        window.onbeforeunload = myUnload;
    </script>

     

    I've tested with FF3.6, IE9, and Chrome 11.  You need the false or it won't work.  Play around with it, see if you can get ajax on beforeunload to work right, because this is probably the "correct" way to do it.



  • Aren't XMLHttpRequests restricted based on same-origin policy?



  • @Shishire said:

    I've tested with FF3.6, IE9, and Chrome 11.  You need the false or it won't work.  Play around with it, see if you can get ajax on beforeunload to work right, because this is probably the "correct" way to do it.

    Nope. If our web server goes down for whatever reason, this'll lock up the user's browser-- if you pass in async=false, the function will stop execution until it gets a response from the remote server. Definitely not the "correct" way to do it.



  • Is this really a problem with Chrome, or is it a problem with devs not testing for Chrome? Doesn't IE have a similar check?



  • @blakeyrat said:

    1 millisecond is too long for you Chrome? Is that what you're saying? OH NO WAIT this exception is a lying liar with its pants thoroughly on fire. It has nothing to do with the amount of time spent in the handler; it really should read: "Error: Chrome detected you're trying to delay unload to ensure a pixel request goes through. Even though there's no way to accomplish this without delaying unload, we're going to be total dicks about it and pre-empt your attempt."
     

    Unless it's just a bug in its infinite loop detection code, in which case the developers may have no idea that it does this. If it is written by open source hippies as you say, then you should be able to find out what actually triggers the error message yourself.

    But, since you've already found something which does what you want, it's probably not worth the time.



  • @jasmine2501 said:

    Is this really a problem with Chrome, or is it a problem with devs not testing for Chrome? Doesn't IE have a similar check?

    I didn't check IE9. IE8 and below have no problem with the original code. IE9 has no problem with the new code. Also, all IE versions correctly detect the infinite loop in the "omniturd" code sample, which Chrome fails to do.

    @__moz said:

    Unless it's just a bug in its infinite loop detection code,

    Yeah, but that's a different error message. (And behavior-- in that case it's supposed to pop up the "this page is stuck" dialog.)

    @__moz said:

    But, since you've already found something which does what you want, it's probably not worth the time.

    Moreover, from past experience, I know nobody in the Chromium project or at Google reads the fucking bug tracker (like most open source projects), so there's also no point in putting this in there. Assuming they'd even consider it a bug, which they probably wouldn't.



  • @blakeyrat said:

    from past experience, I know nobody in the Chromium project or at Google reads the fucking bug tracker (like most open source projects)
    For major open source projects (Chrome, Firefox, etc) they really do read the bug tracker. But it appears that they ignore submitted bugs because their priorities are... strange and random. I've submitted bugs that were fixed in a few days while others just sit there with no action.

    @blakeyrat said:

    Assuming they'd even consider it a bug, which they probably wouldn't.

    The biggest problem is there's no detectable pattern to what is important enough to get fixed right away and what isn't. I like this one. Submitted in 1999. No activity from 2002 to 2010. Finally marked as fixed last month.



  • @El_Heffe said:

    The biggest problem is there's no detectable pattern to what is important enough to get fixed right away and what isn't.  I like this one.  Submitted in 1999.  No activity from 2002 to 2010.  Finally marked as fixed last month.

    Frédéric's kids didn't grow up until 2010.



  • @El_Heffe said:

    For major open source projects (Chrome, Firefox, etc) they really do read the bug tracker. But it appears that they ignore submitted bugs because their priorities are... strange and random. I've submitted bugs that were fixed in a few days while others just sit there with no action.

    I've submitted 4-5 genuine (meaning: obvious, even to open source developers) bugs to Chrome. A couple were actually violations of the DOM spec. None have ever been acted on, except one... and the only reason it was is that I found another bug that had been acted on and personally emailed the developer who fixed it to take a look at the one I just put in. Maybe someone's looking at the bugs, but not bothering to comment or look for dupes or anything... but I highly doubt it.

    For Firefox, though, it's a really mixed bag... first of all, I believe all Firefox bugs are *read*, which already puts it ahead of Chrome. For DOM errors, Firefox seems to spring into action, fix it in the next dev release, then send me a "Firefox 4 Beta Team" t-shirt for contributing... in short, I'll be reporting bugs to FF in the future, because they really made it feel worthwhile.

    That said, Firefox does have issues. For example, this bug: Bug 100085 - disabled form elements steal events and prevent context menu or dialogs has been open since 2001. It's not a DOM violation because (natch) the DOM is too fucking vague to mention what should happen in this case, but it's pretty obvious to most thinking people that it's a bug, and none of the comments are "this is intentional" comments. 2001. Target: mozilla1.6alpha.

    I've actually thought about just putting in a new bug that says the same thing; maybe if I play it off as a Firefox 4 bug instead of a "this has been in your fucking Mozilla renderer since dinosaurs roamed the Earth" bug, it'd get some attention.



  • @Zecc said:

    BTW, before "hanging" Chrome for a while, remember to cover the whole page with a semi-transparent div with style="cursor: wait". Giving feedback to the user is always important, and lack thereof has always been a pet peeve of mine.

    This is basically what WinForms / Win32 apps do when they're busy, at least under Windows 7. But somehow this is not good enough for Microsoft, who 1) puts up a misleading "Not Responding" title when this happens, encouraging support calls 2) throws up stupid, cryptic messages during development if an app running in debug mode doesn't pump messages and 3) bombards the web with propaganda about multithreaded UIs and how they're mandatory.

    Well then... I'll just let Microsoft and its legion of fanboys explain to my boss why I have to spend extra time making sure my app pumps messages at all times, even while it's basically running a batch process. I'll just tell him I'm doing this because it makes sense to accept user input when the code can do #$%@ - all about it. Anything less would be unprofessional. And programming is a profession, right? Otherwise, people couldn't get hired without actually knowing CS.



  • @bridget99 said:

    But somehow this is not good enough for Microsoft, who 1) puts up a misleading "Not Responding" title when this happens, encouraging support calls

    How dare Microsoft tell users my buggy-ass application stopped responding to the UI thread! Why, I'm so angry I'm going to call the message "misleading" when in fact it's about the clearest error message Microsoft has ever written!

    @bridget99 said:

    2) throws up stupid, cryptic messages during development if an app running in debug mode doesn't pump messages and 3) bombards the web with propaganda about multithreaded UIs and how they're mandatory.

    Those bastards! Encouraging developers to write bug-free applications! They should be tarred and feathered!

    @bridget99 said:

    Well then... I'll just let Microsoft and its legion of fanboys explain to my boss why I have to spend extra time making sure my app pumps messages at all times, even while it's basically running a batch process.

    Dude, first of all, if you're writing apps that simple and using Win32 to do it, you're a gigantic fail. Shove it in .net and use a BackgroundWorker thread; that's what it's for. (You know, doing work. In the background.)

    Secondly, if it really is a batch process, put it in a fucking console app. Then you don't have a UI thread, then Microsoft won't gripe at you for not responding to UI messages. "Damn these screwdrivers! It makes it so hard to pound in these nails!"

    Thirdly, if you don't give a shit about the quality of your end-product to the extent that you're actually yelling at the OS maker for having the audacity to say "hey, you should probably give a shit about the quality of your end-product" at you during development... well... you really should enter another field. Maybe digging ditches. Or being a movie producer. You know, somewhere where it doesn't matter if you really give a shit or not.

    @bridget99 said:

    Anything less would be unprofessional. And programming is a profession, right?

    Not the way you're doing it. I actually kind of hope you're trolling.



  • On Win7 you can now use

    pITasklist3->SetProgressState(hmainwnd, TBPF_INDETERMINATE);

    to turn your taskbar button into an indeterminate progress bar.

    Sure, it doesn't do more than an indeterminate progress bar inside a form; or an animated waiting cursor, for that matter. But it looks nice and pretty.

     

    I'm sure I'd use it if I worked on desktop applications. On Windows. 7.



  • Look at this lovely Firefox bug, extraordinarily trivial, utterly obvious, and unfixed in version 1, 2, 3, 4.

    My new favorite.


  • ♿ (Parody)

    @blakeyrat said:

    Look at this lovely Firefox bug, extraordinarily trivial, utterly obvious, and unfixed in version 1, 2, 3, 4.

    My new favorite.

    Who is sillier? The devs for not correcting this? Or the guy who's been commenting on it for almost 5 years? If it were my project, I might see how long I could string him along, too.
