@belgariontheking said:
And I thought beau wasn't a "guy".
This is true. I consider myself more of a "fop" or "raconteur" than a "guy."
@morbiuswilters said:
@belgariontheking said:
@morbiuswilters said:
The mockery of MVPs is less heavy-handed than your unhinged IE6 rant
That was Bridget99, not this guy.
They're not the same person? And I thought beau wasn't a "guy".
I switched to the similar-looking Bridget99 after I started getting long, menacing PMs. I'm certain I mentioned that... go back and read all my posts and I'm sure you can find it. The name is a tribute to a really intelligent user of IMDB who posts all sorts of astute commentary on that site.
Apparently this computer still "remembers" the old ID... I usually just walk up to any old kiosk here at the Penitentiary's A/V center so it's hard to avoid that sort of thing.
@morbiuswilters said:
I'm guessing you're still a little gun-shy after your last effort ended so badly.
I'm not quite sure what you mean by that. I just looked at that IE6-related thread for the first time in a few days. It turns out that people in certain parts of the world experience even more difficulty getting their IE8 search bar to use Google than I did. In fact, it seems to be well nigh impossible for some people. So, in summary:
1) I posted something perceived as inflammatory
2) The inevitable conventional wisdom was loudly articulated by others
3) A different group of people began pointing out an even more severe version of the problem I had originally mentioned
4) The thread suddenly (and conveniently, for the conventional-wisdom-repeaters) went off-topic
5) In this second thread, the usual suspects grimly declared the first thread to be a personal debacle from which I should "learn," as if I had just drunkenly run over someone in my Camaro
I am sorry, but #5 makes no sense to me. The conclusion doesn't follow its premises.
@morbiuswilters said:
The real problem is that I've already come to associate your username with highly contrived attempts at flamebait, so it's kind of a wash. I also think you could have taken the MVP bashing a bit further and garnered more laughs, but I'm guessing you're still a little gun-shy after your last effort ended so badly.
I'm not really aiming to get a high number of stars. I do like to provoke discussion.
I think we're still very much in the "barber surgeon" era of software development and, as such, widely held opinions in the field are often still immature and even incorrect. There are a great many people involved in the field who are basically incompetent: people who only program in one "pet" language; people without the necessary formal training for their jobs; people who eschew formal thinking and parrot vendor talking points instead; and so on.
When I read comments like (paraphrasing) "Golly, don't make fun of Professor So-and-So; he could probably write a compiler single-handedly!" this only tends to confirm my analysis. If one hasn't written a compiler, or at least done maintenance-type work on one in a class, then what makes one feel competent to participate in this discussion? I ask this rhetorically- if you haven't done these things, then you're not qualified to dispute CS topics with me, or to hold many development jobs. And yet, such comments typify the state of thinking in this young discipline.
I am not alone in my assessment of the state of software development as a field, or in my use of provocative commentary. I would give Prof. Ed Lee (author of "The Problem with Threads") and the late Prof. Edsger Dijkstra (author of "To Hell With Meaningful Identifiers" - as pure an example of flamebait as one might name) as examples of people with similar views who are nonetheless highly regarded.
@alegr said:
Well, if you had just 1% of his experience and expertise, you might have some hope for my respect.
I said he was my favorite Microsoft MVP! One must admit, though, that he doesn't really fit the typical MVP mold.
@alegr said:
This guy could single-handedly write a compiler.
I don't doubt it. That's probably not as big a deal as you might imagine. In my experience, compiler development is similar to skeet shooting... it's both easier and more enjoyable than it looks from afar.
@alegr said:
Of course, you don't know that Dr. Newcomer currently has to use an electric "scooter" to get around.
We've probably all got a mental picture of the typical Microsoft MVP: the khaki pants; the dev-conference travel coffee mug and laptop case; the obsession with "sick new desktop gadgets" and "CTPs"; the 2-year-old white Toyota Highlander; etc.
Somehow this guy ( http://www.flounder.com/ ) just doesn't seem to fit. But apparently he made it through their, um, rigorous selection process.
The impression I get is almost as if that "Lord of the Web" guy were to get a PhD from Carnegie-Mellon. I guess Best Buy would really have to improve their benefits package for that to ever happen.
Sadly, he is probably my favorite Microsoft MVP now. There's something endearingly stubborn about just not giving a damn about design and not wanting to screw around with people that do... since they probably wouldn't get along anyway.
In fairness, this particular individual doesn't really seem to focus on the "Web" per se, he really is more of a nuts-and-bolts C++ expert.
@bridget99 said:
A moment ago while I was absent-mindedly picking the scabs off of my elderly neighbor's "missing" chihuahua, I had an unrelated (and hopefully less inflammatory) thought:
Are there just too damn many programmers out there? Think about how many search tools there are: Bing; Google; Dogpile; Hotbot; Yahoo; Altavista; etc. I get the sense that there are too many workers chasing the same piece of the metaphorical pie and the result is a bunch of duplicated work. I think "synch" is another example of a fun-to-play-with problem foisted off onto excess programmers by their baby sitters, e.g. Ray Ozzie, Scott Guthrie, etc. And how many custom drag-and-drop conference room schedulers does a Fortune 500 company need, anyway? Not nearly so many as they've gotten, I'd venture to say.
Consider, for example, the bundleware that came with my laptop: the stupid launch utility; the audio program with the horribly aliased icons; the back-door ActiveX rootkit left by their tech support. Somebody wrote this stuff, and the whole time they were probably drawing a big salary and wasting time with sarcastic posts like the ones I see here.
I ask myself, "Is that most programmers? Are we (or 'they,' since I'm a Manager; or 'you,' since this is Reddit) just a bunch of con artists?" Sadly, it is actually even worse than that. Those bundleware programmers are actually fairly elite, inasmuch as they get to write commercial software that's made it to production. A typical Java or .NET programmer, slaving away at some ill-defined task for a dyspeptic "internal customer," is even more of a surplus burden!
Like I said, I was busy fooling with some stupid dog while I thought all of this up, so maybe it's not completely thought through.
That's a really good question and I'm glad it's out there. But are you even halfway serious about that Chihuahua? The thing about Chihuahuas is that the surface area of their skin is very small. Even a small wound, inflicted "absentmindedly" as you described it, can prove fatal for such a small dog. Hopefully you are just kidding....?
@morbiuswilters said:
@bridget99 said:
I had an ex-girlfriend who competed in the Special Olympics, and she was quite vocal about the fact that using such an epithet in the general sense is offensive.
You dated a retard? And you have a girl's name. How odd.
I'm not a girl. My nickname is a tribute to one of my all time favorite IMDB trolls. And yes, I had a girlfriend who competed in the Special Olympics, although she was really only my "show" girlfriend for family gatherings, business parties, etc. I apparently had something even less presentable for side action. But enough about the strange relationship between WPF and Silverlight...
@PJH said:
@beau29 said:
Practically, when doing real-time programming in .NET
My turn.
What??!?
We either have different definitions of what 'real-time' means or you're using entirely the wrong language.
(In a previous life I was involved with fiscal metering which had to be real-time, you can't integrate a pressure over time if you can't guarantee how long it's going to be between measurements. Languages of choice were assembly for the interface for the input hardware and C for the calculations. Windows wasn't involved either.)
I agree about the overall unsuitability of .NET for real time applications. If one defines "real time" as meaning that events happen in predictable or even bounded time, then yes, .NET definitely doesn't fit the bill.
However, every place I have worked since 2000 has at least been playing around with .NET. So, I definitely have seen what it can do in a wide variety of applications. And if one defines "real time" more loosely, e.g. if one uses it to say something like "this app needs to display boiler temperature in real time" then plenty of .NET apps fall into that category. Those apps will perform better using the loop / sleep approach (or a loop without a sleep) than they will with a timer.
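The loop-and-sleep approach being described can be sketched in a few lines. This is an illustrative Python stand-in, not the poster's actual .NET code; `read_value` and `display` are hypothetical placeholders for a data source and a display routine:

```python
import time

def poll_loop(read_value, display, interval_s=0.5, iterations=5):
    """Poll a data source at a roughly fixed interval using loop + sleep."""
    results = []
    for _ in range(iterations):
        results.append(display(read_value()))
        time.sleep(interval_s)  # throttles CPU use; 0 would mean busy-waiting
    return results

# Hypothetical stand-ins for a boiler-temperature feed and a display formatter.
readings = iter([98.2, 98.6, 99.1, 99.4, 99.0])
out = poll_loop(lambda: next(readings), lambda t: f"{t:.1f} C",
                interval_s=0.01, iterations=5)
```

The design tradeoff is exactly the one the post names: a shorter (or zero) sleep gives fresher data at the cost of higher CPU utilization, and there is no message pump involved.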
More generally, my original comment wasn't about the suitability or unsuitability of .NET for any particular purpose; it was that I perceive that a great many programmers display an attachment to the "timer" paradigm which I find infantile.
@dhromed said:
@beau29 said:
.NET's ugly ancestor, VB6.
What?
I'm an intuitive person and my intuition tells me this. Also, it seems that most VB coders have migrated to .NET; architecture aside, the VB6 target audience is the .NET target audience. In VB6, it was quite typical to drag a "timer" onto a form from a Toolbox. I think it is this mentality that's led to the .NET timer abuse I cite.
@dhromed said:
@beau29 said:
Typically, if they can find something like "8 hertz" in a specification, they interpret it as prima facie architectural evidence that somewhere a timer with an interval of 125 ms must be used.
That makes some sense. Without more context, it's hard to see the problem with thinking of a timer when you read "Hertz".
Practically, when doing real-time programming in .NET, I find the only timing strategy that works reliably is to be greedy. If a program tells .NET it needs something every half second, .NET will respond by running the handler at (for example) 550 ms, then 540 ms later, then maybe 1200 ms later, then perhaps 490 ms later, and so on. The looping approach I suggested is more deterministic; if the sleep time is 0, then 100%/(number of cores) CPU utilization will be observed. This percentage can be throttled downward by increasing the sleep time. So, when doing real time programming in .NET, I typically decide on an acceptable CPU utilization level, and tweak the sleep time to achieve this. If performance isn't adequate at that level, then I go to management (or to whoever wrote that spec) and let them make a decision. In my experience trying to beat real-time performance out of .NET, this is the technique that's worked. And whoever wrote the spec does not care that, instead of "8 times a second," they are getting their data "as fast as .NET can manage."
Also, the looping approach does away with all the message-pumping, reentrancy, and overhead issues associated with timers. I just don't see what timers bring to the table, except a smug sense that "management said this has to happen 8 times a second, so I'm basically letting them write my code for me."
Another big problem with the message-pump-based timers is that (in WinForms, at least) the messages can "back up" and cause delays and even crashes. In cases where the handler code takes just slightly longer than the timer interval, this problem can take quite a while to surface. To put it in broken German, "das ist nicht gut."
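The backlog effect is easy to demonstrate with a little arithmetic. This is a deliberately crude simulation of my own devising, not tied to any real timer API: whenever the handler takes longer than the tick interval, unprocessed ticks accumulate.

```python
def simulate_timer_backlog(interval_ms, handler_ms, run_ms):
    """Crudely model a message-pump timer whose handler may outlast its interval.

    Ticks are queued every `interval_ms`; the handler processes them back to
    back, taking `handler_ms` each. Returns the queue depth at the end of the
    run (0 when the handler keeps up).
    """
    ticks_generated = run_ms // interval_ms
    ticks_handled = run_ms // handler_ms
    return max(0, ticks_generated - ticks_handled)

# A 125 ms timer with a 150 ms handler: after one minute the queue has grown.
backlog = simulate_timer_backlog(125, 150, 60_000)
```

With the handler just 25 ms too slow, the backlog grows steadily, which matches the observation that the problem "can take quite a while to surface."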
Am I the only one here who doesn't use timers (e.g. in .NET), and who automatically questions any code with timers in it?
My earliest background is Win32 programming in C and assembly language. Timers exist in that realm, but (as with anything) it's more difficult to use them, since multiple API calls are necessary. Instead of a timer, this is how I would do something at one-second intervals in a C program:
do {
    // Whatever needs to happen about once per second
    Sleep(950); // Can be tweaked but should achieve ~1s interval
} while (/* some condition */);
The resultant timing is not exact, but it's typically close enough for a one-eyed Bosniak peasant such as myself.
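If tighter timing were ever needed, one common refinement (my suggestion, not something from the post itself) is to compute each sleep against an absolute deadline, so the work's own runtime doesn't accumulate as drift. A Python sketch:

```python
import time

def run_at_interval(work, interval_s, iterations):
    """Run `work` at fixed absolute deadlines, compensating for its runtime."""
    next_deadline = time.monotonic()
    for _ in range(iterations):
        work()
        next_deadline += interval_s
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # sleep only the time left until the deadline

start = time.monotonic()
run_at_interval(lambda: None, interval_s=0.02, iterations=5)
elapsed = time.monotonic() - start  # should be close to 5 * 0.02 s
```

This keeps the average rate on target without a timer, at the cost of a couple of extra lines over the hard-coded `Sleep(950)` style.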
When I enter the world of "timers," on the other hand, I find that .NET has at least three varieties. At least one of these runs one's event handler in another thread, although its syntax doesn't make this obvious. Two other timer types seem to require the application to service a message queue- again, without anything syntactic to indicate this. Of course, all these timers also have much more overhead than my simplistic approach. I think the whole concept is an architectural disaster, which apparently dates back to some things in .NET's ugly ancestor, VB6.
And yet, I find that programmers who are more immersed in the world of .NET than I am make heavy use of timers. Typically, if they can find something like "8 hertz" in a specification, they interpret it as prima facie architectural evidence that somewhere a timer with an interval of 125 ms must be used.
I don't get it. In fact, I feel like I'm actually giving up way too much information about real-time programming for free by even posting this. Of course, in Today's Modern World there's little chance of anyone actually using my technique for their benefit. It's easier to just fire off a witty response post (e.g. "You are the weakest link. G'bye.") than it is to actually consider my ideas.
@movzx said:
This ad is geared to women who have problems conceiving.
Are you quite certain of that? I was fairly certain it was directed at Albanian separatists.
@morbiuswilters said:
Then pay for it yourself. If the oil business is at all profitable (and it will be) it can offset the costs of having to rebuild every time the city floods. If the cost of rebuilding is more than what the oil market will bear, the industry will move elsewhere, there's no need to tax the rest of us just to keep rebuilding your crappy city.
Louisiana is just a staging area for the oil business, not that different from Nigeria. The profits certainly do not get re-invested back into the State.
Besides that, one huge reason we're so vulnerable down here is that the oil companies have turned so much of the swamp into open canal. They've been allowed to do this based on the same free market, laissez-faire ideology you seem to espouse. To me it seems obvious that some sort of government intervention into this free market is warranted. This is just basic economics: an example of an "externality" almost as obvious as the canonical pig-farm-next-door story (paraphrasing into WTF-ese, "The pig put foot. Grunt. The pig disgusting. The government move pig."). Perhaps the oil companies should be taxed to pay for repairing the damage. But ultimately that's not very different in effect from just having the federal government foot the bill.
@morbiuswilters said:
when left to your own (meaning Louisiana) devices you did the same thing the Feds did and let the fucking levees go unmaintained until they failed. It takes a special level of corruption and incompetence to do no better than the Feds but New Orleans has that down pat.
What are you referring to? I am not aware of any other levee failure. Every storm prior to Katrina came, killed some people, caused major wind damage and brief flooding, and then left. The levees always held. If, as you seem to imply, at some point in the past someone other than the Army Corps of Engineers was maintaining the levees, then I think they must have done a fairly good job.
@morbiuswilters said:
@beau29 said:
Of course, they're helped along by the petty bourgeois proponents-of-Texas-independence-types here at WTF, and countless other communities of VB-addled Microsoft apologists.
You misspelled "Micro$oft".
If I ever incorporate a business, I'll probably call it Macrohard just out of spite.
@bridget99 said:
"[People] might use Outlook when they're driving around; they might want to open up their laptop and type something in."
What an idiot. Spolsky really hammers Ray Ozzie in that link, and I can see why. Outlook is dangerous enough on a desktop.
@TwelveBaud said:
@bridget99 said:Suppose, now, that the same app is doing the same thing on a Vista machine. Microsoft's claim (I think) is that the GDI32.DLL code will do its work into some kind of scratchpad memory bitmap which DWM will then further process. As you note, the "Windows Accelerator" hardware is now excluded from the process, since DWM needs a RAM-resident bitmap (for the new ALT+TAB previews, the transparency effect on Window title bars, etc).... Perhaps I've been misinformed, but aren't the things you listed done on the GPU as well, without any RAM-resident bitmap? In Beryl, window previews were as simple as "render the texture you use for the window on this quad (the preview window space) as well. I don't know about the transparency blur effect offhand, but that could be done with a simple du/dv map texture on the same surface as the RGB texture with the window contents.
If by "RAM" she meant "main memory," then you're correct. DWM is not using main memory to do its obnoxious little blurring effects. However, the crux of her explanation remains true. The legacy (GDI / USER) stuff is basically (and needlessly) locked out of the optimal code paths it ran through under previous versions of Windows.
Maybe Microsoft has backed off of this in Windows 7, but that would seem to be very out-of-character. Usually they just plod forward with a bad idea until it reaches a sort of inevitability based on critical mass. Of course, they're helped along by the petty bourgeois proponents-of-Texas-independence-types here at WTF, and countless other communities of VB-addled Microsoft apologists. You know the sort... they typically own khaki slacks, at least one Toyota Camry, and a Blackberry, and they periodically emit prerecorded snippets about how "sick" Silverlight Deep Zoom is.
@lpope187 said:
My experience with VS (2005, 2008, & 2010) closely matches tster's. My machine is a 2.4 GHz Dual Core, 4GB DDR2, on Vista SP2. The only caveat is that Visual Studio starts up slower the first time (about 15s) after a reboot, because the system has to load all the .NET assemblies, including add-ins. After that, VS starts almost instantaneously. This is par for the course in a lot of applications, like Adobe Acrobat, Siemens NX 6, and a host of others.
The one place where VS2008 really sucks compared to VS2005 is on the first edit after loading a project / solution. The IDE hangs for about 20-30s before it does anything. This behavior is not in VS2005 nor is it in VS2010, so I'm not sure what's going on.
It seems to me that your experience more closely matches my own than that of tster. That 20-30 second delay when you attempt first edit, for example, seems like exactly the sort of thing I complained about. To me, this does not seem like acceptable behavior from a program which (at the point-in-time we're discussing) I am only asking to do basic text editing.
To me, your statements seem brazenly contradictory, in a way that wouldn't pass muster in a more objective forum. Unfortunately, this kind of flawed reasoning seems to thrive at Daily WTF.
I think the reason is that this site has become a collection of very like-minded individuals. In particular, it has become "ground zero" for a species of big-business apologist which is otherwise ill-at-ease in the creative, somewhat academic field of software development.
There are many otherwise intelligent programmers who lack the intellectual polish to cut it socially with the granola-munching, mountain-biking crowd that dominates the upper echelon of the computing field. These people come here and spout simplistic right-wing, pro-Microsoft arguments whose truth they consider obvious. Because their proto-capitalist morals are considered self-evident (the primacy of the market, the immaturity of anyone who questions Microsoft, etc.), no one here really bothers constructing convincing arguments around them.
So, we get statements like yours (paraphrasing):
"Gosh, I dunno, that sounds like crazy commie talk to me. Say, I always thought Visual Studio was plenty fast! Course, it takes dang near a minute to open up a simple text file and add a one-line comment, but jiminy! It's not like there's anything better out there!"
I take comfort in knowing that none of these views is taken seriously outside of Microsoft's marketing material or (perhaps) a meeting of the John Birch society.
@tster said:
@beau29 said:
So I guess my overall feeling is that before Microsoft gives us whiz-bang new features that seem to defy basic CS, they ought to work on the responsiveness of their IDE.
Visual Studio is very responsive. In fact, I don't know of a more responsive and better performing IDE.
Yeah, I'm gonna have to go ahead and disagree with you on that one. I ran a little test just to give you some hard numbers. This was on a 2000 MHz Core Duo laptop with 2GB RAM running Windows XP. I started with a fresh boot, with nothing running except IEXPLORE. It takes about 40 seconds for VS2008 to open after I double-click its icon on the desktop. When I click a menu, there's another 8-15 second delay before I see my choices. If I then press F1, the IDE freezes up for about 20 seconds, after which I see some sort of "progress bar" followed fairly quickly by the help screen... which is non-responsive for another 15 seconds or so. Keep in mind that I've actually done nothing at this point; I'm just testing VS2008's ability to respond to user input under the most generous of conditions.
The most similar software tool I have on this computer is the MPLAB IDE, which is used for C and assembly language. This opens in less than 20 seconds and is completely responsive. Its help system opens virtually instantaneously.
Google Earth is completely responsive within 20 seconds after I double-click on its desktop icon. In fact, if I close it and then reopen it a second or third time, it opens virtually instantaneously. And I don't think I've ever seen it just wander off with the CPU for (apparently) a summer abroad in Italy the way VS2008 will.
I have a really hard time working with anything as sluggish as VS2008. I tend to switch to another program or task far more often than I otherwise would, and productivity suffers as a result. I am far more productive in something slightly more primitive, like MPLAB or even Borland C++ 5.0, where my keystrokes and thoughts can flow in uninterrupted fashion.
@tster said:
@beau29 said:
The overall situation makes me want to hitchhike up to Redmond with an Elmo projector and re-hash CS101 for them.
I like the feature. It is extremely useful. I'm pretty sure they understand stacks and frames, considering that they actually made it possible to edit and continue. Personally I debug all the time and then see a problem on the line and wish I could fix it without restarting the program.
OK, that is a positive reply. I agree it's impressive that they've made this work. Obviously they've mastered CS101. (My problem is the "magic" they've grafted on top of it.)
I think the feature probably makes more sense in .NET than in Visual C++. Things are more predictable, and builds are also faster. Working in a large C++ project, rebuilds are not really interactive events that can "just happen" on a background thread. And apparently, for whatever reason, opening online help isn't either... So I guess my overall feeling is that before Microsoft gives us whiz-bang new features that seem to defy basic CS, they ought to work on the responsiveness of their IDE.
@belgariontheking said:
@beau29 said:
Like, you, I too am using the special software! And not seeing the scum on title of window bar. Except, for me this is the firm wear! Can you guess woo is?
Did you suddenly turn into a rezzing markov chain?
There was a point in the whole "cold fusion" affair where another scientist managed to pin down one of "cold fusion's" creators - Martin Fleischmann - in a line of questioning for which he obviously had no answer. Fleischmann responded by telling jokes. I think I have reached that point in my CS career. One might accurately describe it as "cloud-of-hot-air computing."
@PJH said:
And is there nothing in the options dialog to turn this off?
It's a project-level setting. So, I can't just change it once, or even once-per-solution. Also, because this is a project-level setting, any change gets checked into source control and affects others. And in any case, making a bad idea configurable doesn't excuse it.
Beyond that, I guess I am asking, "does anyone really want this?" Yes, it does seem like the sort of thing that some "programmers" might request. But my suspicion is that said "programmers" must be ignorant, VB-addicted mouse-jockeys to want such a thing.
I guess I can imagine, somewhat, how Microsoft translates a runtime environment onto a newly built program. But as somebody with a basic idea of what's going on underneath the hood, I feel skeptical about toting a runtime state from one program to another. The fact that the program that originally generated the runtime state had some kind of error, which may or may not be corrected in the newly built version of the program, only increases the level of mind-bending non-determinism inherent to the whole exercise.
For .NOT developers, "Edit-and-Continue" is the reason for all those stupid ".VSHOST.EXE" versions of your program that hang around in the background while using the later versions of Visual Studio. Anyone who writes code that inspects the process list, e.g. to implement a singleton EXE in .NET, must sort through that garbage, and I've seen that really confuse less experienced developers. There are undoubtedly other unforeseen consequences as well, e.g. in build scripts.
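The process-list cleanup being described might look something like this (a hedged Python sketch over a list of hypothetical process names; real singleton code would enumerate live processes via the Win32 or .NET APIs, which this does not attempt):

```python
def real_instances(process_names, exe_name):
    """Find instances of exe_name in a process list, skipping Visual Studio's
    *.vshost.exe hosting process, whose name shares the executable's stem and
    would otherwise be mistaken for a real instance."""
    stem = exe_name.lower().rsplit(".", 1)[0]
    hits = []
    for p in process_names:
        name = p.lower()
        if name.endswith(".vshost.exe"):
            continue  # VS debugger hosting process, not a real instance
        if name.startswith(stem + "."):
            hits.append(p)
    return hits

# Hypothetical snapshot of a process list during a debugging session.
procs = ["MyApp.exe", "MyApp.vshost.exe", "devenv.exe", "myapp.exe"]
hits = real_instances(procs, "MyApp.exe")
```

Without the `.vshost.exe` filter, stem-based matching would count the hosting process as a running copy of the application, which is exactly the confusion described above.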
The overall situation makes me want to hitchhike up to Redmond with an Elmo projector and re-hash CS101 for them.
@DOA said:
I use special software for this. It's called Windows XP.
Like, you, I too am using the special software! And not seeing the scum on title of window bar. Except, for me this is the firm wear! Can you guess woo is?
I used to call that Polydorkism, in that the perpetrator is apparently unaware of the usefulness of Polymorphism, and is thus a dork. I remember catching a team lead doing this once. He was not a bad programmer overall, but he suffered from the same more-is-better attitude toward source code that underlies products like Vista, Multics, or Office 2007. I think his response to my critique was something like "well, at least this way anyone needing subclass-specific logic will have a hook to hang it on," as if he was doing us some kind of favor by going ahead and typing in all the possible subclass names...
The fact that these particular coders are "very serious about catching all their exceptions" is revealing, I think. I typically don't agree with people who say things like that. Overall, many applications really need far less exception handling code than they actually contain, and I get the impression sometimes that the mental lines between "debugging" and "error handling" are blurred for people who write code like this. This is one of the many forms of architectural tunnel-vision that I run into (other examples are obsession with logging, obsession with configurability, etc.) </offtopicflamebait>
Does anyone use this new feature? To me, it seems like a nuisance. Accidentally hitting F1 while coding in Visual Studio has, for years, been a productivity drain. Now, accidentally hitting basically any key at all while debugging will similarly bring one's workday to a grinding halt. Because, in the brave new world of VS2008, the compiler interprets any source code edits during debugging as an invitation to ride off into the sunset with the CPU.
Does anyone else find this annoying? Or is anyone making productive use of this new feature?
Also, am I the only one who finds the whole concept of edit-and-continue to be wildly non-deterministic? How in the hell do things like stack frame, program counter, etc. get mapped onto a completely new program?
@morbiuswilters said:
@tster said:
@amischiefr said:
I hate Government intervention as much as the next guy, but I wish they would put a cap on these increases. At least here in Florida the governor told Allstate to fuck off when they wanted to nearly double their home owners insurance premiums.
A) You're a fuckin' commie.
B) If you don't want to pay high home owners insurance, don't live where hurricanes come through every year.
+1, Obvious.
Really, fuck Floridians who somehow think they are entitled to having the rest of us pay to rebuild their house every 3 years because they're too fucking stupid to move out of a place where God and Nature are doing their best to murder them.
I can't speak for Florida, but I'm in south Louisiana and I often hear the same complaint directed at us. It's completely misguided to say that we should just leave. Your complaints about economic fairness are incorrect as well.
People have to live here because of the oil business. If we had to stage the offshore oil business out of Baton Rouge, this would completely change the economics of domestic oil production. We would become even more dependent on foreign oil, and even less technically skilled. Our manufacturing and industrial base would erode even further. To a lesser extent, the same argument can be made about the fishing industry as well.
And yes, we are "entitled" to more than we've gotten. The majority of hurricane damage in Louisiana over the last 40 years or so occurred due to a single levee failure. This resulted from documented fraud and incompetence by employees of the federal government, i.e. we bought dirt and engineering services from YOUR government and were defrauded as to their quality.
Besides that, the federal government takes a percentage of the gross oil revenue attributable to drilling off the coast of Louisiana. The state government gets $0, per federal law, because there's nothing left after dealing with the needs of the majority of the country.
I can't speak for Florida. I'm not sure they even allow offshore drilling. But when people complain about how Louisiana or New Orleans is getting undeserved money, it makes me cringe. And in a way, you included us in your comment. Certainly, this is a place "where God and Nature are doing their best to murder" us.
What I'm saying is that this doesn't necessarily mean that it makes sense to leave, or that we're idiots, or expect a handout. All we really need is for some dirt to be piled up in strategically chosen locations, and your damn Yankee Army can't seem to figure it out. I wish my job (as an ignorant Southerner) were as easy as collecting money and piling up dirt.
By the way, it's been 4 years since Katrina, and this place (outside the Yankee tourist parts) looks about like Sarajevo on a bad day. The poverty down here is incredible: the bombed out buildings, the rat attacks, etc. Enjoy your bratwurst and hope the tables never turn.
@Justice said:
One similar thing I've seen recently is something I'll call the "what the hell is wrong with you" approach.
I've seen ads on TV recently for one of those for-profit trade school/college establishments (community college education at a private school price!) which does this. The ad features some black dude in an off-center baseball cap, berating the viewer for not getting off their ass and getting some kind of education. One line actually goes something like "You're sitting there watching TV, your life is passing you by."
It's weird, but considering the target audience, maybe that's the message they need to send.
That's great, I wish we had those ads around here. I really enjoy snickering at those "ITT Tech" commercials, in which people find happiness due to their l33t server-rebooting skills, as the camera pans affectionately around some cubicle farm like it's the #$%$-ing Vietnam Memorial or something. Seeing the ads you're describing would take me to a whole new level of elitist smugness.
I guess the quality of the educational institution is inversely proportional to the amount of bullying-people-through-the-front door that goes on. At the bottom level, the juvenile detention center has men with guns. One level up, public schools have truant officers. One more level up, ITT Tech hits you with the "hard sell." Beyond that, real colleges won't even return your call without looking at your tax return and an SAT score.
In school, I recall going over a half-dozen or so strategies used in advertising: the bandwagon strategy, the elitist strategy, the "apple pie" strategy, etc. Lately, I have been running into a strategy that wasn't discussed in school, and which I don't understand. That is, I don't see how this approach could ever work, and I really question the collective sanity of a society that could give rise to such advertisements. Basically, the approach is to insult the potential customer.
The two examples that pop to mind are Cintas (the "uniform people") and LoJack (who want to put some kind of tracking device in your car, in case it's stolen).
Cintas runs radio ads where they spend 20 seconds or so trying to sell you on the comfort and attractiveness of their product... so far, so good. Then, the narrator says something like "With Cintas, your employees will look like valuable members of a team. Without them.... well, take a look for yourself." I think they even insert the old "record needle lifted off a running 33 1/3 RPM record" sound effect.
At this point, I imagine that the listener (who, of course, is manager of a business) is supposed to look over at his latest slacker employee, who will presumably be wearing Crocs, a pair of clam diggers, and a Dinosaur Jr. tube top. Thus enlightened, this listener is supposed to call Cintas to rectify the problem.
Do I misinterpret this ad somehow? I find the ad insulting and, as a result, Cintas has no shot at getting my money.
LoJack has been running print ads whose tone I can only describe as stern. They read about like the speech I got from my dad about auto insurance when I was 16. The tag line is something like "It's your property. Protect it."
One particularly puzzling part of the LoJack ad attempts to deconstruct the obvious reason for NOT buying LoJack: that if my car gets stolen, I probably won't want it back. I'll probably just want the insurance money. At this point, the ad moves from fatherly to downright abrasive. "Here's some news for you," it reads, "If you finance your car, it's not your decision." That's about one "buddy boy" away from being a Denis Leary rant. Paraphrasing, they are arguing that if my car gets stolen, the bank will want to give it back to me so I can keep paying on it, even in its now questionable condition. And I'm supposed to help them do this by buying LoJack, out of a sense of obligation gleaned from this stern advertisement.
At a rational level, this is ridiculous; I am about as likely to do Chase Bank a favor as I am to sprout a third eye out of my forehead. They make me carry insurance to cover their risk, and I do, but I'm sure not doing them any additional favors. That would be (contrary to LoJack's hectoring ad) irresponsible on my part, considering how I have children to feed and must build a retirement for my wife and me.
Because this makes so little sense at a rational level, I think these ads must be directed at people who, for mental health reasons, respond to abuse. I mean, I think one would almost have to be suffering from PTSD or Battered Wife Syndrome to respond to that LoJack ad.
Have I misinterpreted something? All in all, this is one of the many things I run into in the mass media that just make me feel out-of-touch with people. I can write off a great many things I dislike as bad taste (disco, "glare effects" on buttons, Pokemon, that new "Silverback" thing Microsoft is pushing, etc.) but this transcends mere bad taste. It's downright #$@%-ing irrational...
Still sorting through all your comments... I am pleasantly surprised to see that all of your previous bravado toward me is backed up by a solid knowledge of Microsoft's debugger. Clearly, this is an all-important topic, since as we've already established Microsoft is the world's greatest company and anyone who doesn't use a symbolic debugger is an idiot. So it's no surprise that I've gotten so many good responses. Anything less would be a form of hypocrisy, and surely a bunch of respectable VB.NET developers such as yourselves would never indulge in hypocrisy.
More seriously, don't ever ask Windbag to do the obvious. For example, don't set its "symbol path" to the folder where the appropriate copies of your app's PDB files get built. If you do, not only will Windbag still not work, it will also trash that folder with a bunch of folders named after all the DLL files in C:\windows\system32.
The more I think about this, I think that I need to empower the scalable enterprise before my remote GAC tier will G.C. the W.C. Does it sound like I'm on the right track?
Wow, what an overwhelming response. You're such a beautiful audience... give yourselves a round of applause.
I have started trying to get what I need using the CDB tool. My favorite granola-slurping widget-marshallers provided this to me in the same MSI as Windbag. I managed to get basically the same information out of CDB as I was getting from Windbg (i.e. an infuriatingly lame, bogus excuse for a real call stack), but without the annoying preponderance of animated dogs or effeminate-looking gradient effects. So, I was starting to develop some small optimism for CDB.
Then, I accidentally typed "syncheck" into it instead of "symcheck." Of course, there's only one proper response to such an error... 5 minutes of silent, unresponsive lockup followed by 20 pages of gibberish.
Oh, gosh, I sure hope no one ever expects me to do anything without Visual Studio, golly, how would I ever get stack traces? Oh, gee, how would I ever figure out how to include files without StdAfx.h? Oooh, I hope no one ever asks me to do something hard like assembler, and if they do I hope it's x86 because having 5,000 instructions sure will make everything easier. Jiminy, if this were any easier college students from India could do it!
After years of dodging it, I am finally making an effort to learn the WINDBG.EXE symbolic debugger for Win32 apps, or, as I have taken to calling it, "Windbag." I think this name fits because this tool seems to:
1) Never give you a straight answer to what you are trying to ask (e.g. "where did my executable crash?"); and
2) Always give you a bunch of other boilerplate information about other things, which is of highly dubious utility, e.g.
Product: WinNt, suite: EmbeddedNT SingleUserTS
Machine Name:
Debug session time: Mon Jun 1 11:52:38.000 2009 (GMT-6)
System Uptime: not available
Process Uptime: 0 days 0:00:16.000
or
SYMCHK: ~glh00008.tmp FAILED - Image is split correctly, but ~glh00008.dbg is missing (is that even object code? and what does "split" mean?)
I am posting here because I seriously do need as much help with this thing as I can possibly get. I am posting a list of questions below. But my inane side rant is that Windbag has perhaps the worst user interface I have ever seen in a Windows application. It reminds me of some UNIX app that was really-not-meant-to-have-a-GUI-but-does. Unfortunately, Windbag doesn't have the excuse of being old, open-source, or unimportant. This Windbag thing makes me feel like a hyperactive 6-year-old trying to configure Emacs. The entire program seems to have been designed in an effort to provoke me into throwing things.
My questions:
What is the role of the /Oy (omit frame pointers) optimization with respect to PDB files? If I have an EXE built with /Oy, and have the correct PDB files, should I then be able to see a real call stack for the app? Or does /Oy prevent this even in the presence of PDBs?
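For reference, here is the sort of build line I would expect to sidestep the question entirely (file names invented; as I understand it, /Oy- explicitly re-enables frame pointers even when /O2 would otherwise imply /Oy, and /Zi plus /DEBUG is what actually produces the PDB):

```text
rem optimize, but keep frame pointers and emit full debug info in a PDB
cl /O2 /Oy- /Zi app.cpp /link /DEBUG /PDB:app.pdb
```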
Why are the "Open Executable," "Open Crash Dump," etc. menu items grayed out once I've opened or saved a "workspace?" How do I get these items to be enabled once more without closing Windbag entirely? I tried to use "Clear Workspace" but the dialog that popped up provoked some kind of primeval aversion in my adrenal cortex (it's one of those dialogs where the user has to pick various things to throw back and forth over a fence using buttons with labels like >> and <... Grrrrr! How does that have anything to do with "Clear Workspace?" )
Generally, how am I supposed to use this thing? As I understand it, I am supposed to take a .dmp file from the problematic machine and then feed it to Windbag, along with the relevant symbols files (PDBs and perhaps some other kind I don't know about). Some of these symbols will come from a Microsoft website, and some of them will have to come from me. There is an SRV* syntax that I can put into the relevant dialog box in Windbag to combine these symbols paths, e.g.
SRV*c:\work_done_begrudgingly\symbols*http://msdl.microsoft.com/download/symbols
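As far as I can tell, the same setup can also be typed into the WinDbg command window instead of fighting the workspace dialogs. A sketch (the cache and PDB paths here are made up) might be:

```text
$$ Microsoft's public server plus a local download cache
.sympath SRV*c:\work_done_begrudgingly\symbols*http://msdl.microsoft.com/download/symbols
$$ append the folder holding my own PDBs
.sympath+ c:\my_app\build\pdb
$$ force symbols to load now rather than lazily
.reload /f
$$ then ask for a call stack
k
```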
I think that setting this path is a part of my Windbag "workspace," and that as a result if I have different versions of my own PDBs hanging around, I will need distinct "workspaces" for each. Is this correct? It is the impression I get from the, um, Windbag GUI, but it seems ridiculous to make me browse around all of the time for different versions of my PDBs. I think a much better approach would have been to simply require the PDBs and the DMP to coexist in the same folder... is there support for that mode of operation? Or would that have been too, um, primitive?
In any case, once Windbag knows where to get symbols, I think I should be able to see stack traces after loading my DMP file. But this isn't working. I see a few examples of what seem like function names in the Windbag call stack, but none of them tell me anything other than "you're dumber than me" (which, I think, is all the Wizard-of-Oz @$$-clowns up in Redmond really want me to know.)
Does anyone have any suggestions? As always, I have failed to consider the impact of DCOM widget-thriftiness or CWhiffleSnaffle storage padding or some other equally crucial aspect of Win32 that I should feel guilty about not knowing more thoroughly. I know there is some provision for giving Windbag source code files as well, but I can't see why it should need these to show me a stack trace, and the source is very scattered. I really don't want to have to browse to every single folder that Windbag might need, especially since it uses that nausea-inducing folder selection dialog (the one with a big treeview with a textbox below it... this program is like a museum of bad GUI ideas).
Incidentally, I noticed that the first Windbag-related link I found on Microsoft.com had a video explaining symbol servers posted, which I did not watch. I think that the type of person who thinks that watching videos on the web is a good way to learn is generally not the same type of person who needs to do in-depth debugging of a Win32 app. I don't know anyone over 30 who voluntarily watches web videos (particularly instructional videos), nor do I know anyone under 30 who does real Win32 programming any more ("it's too hard... what would I do with all that money anyway... waaaaaaaaah.") So I don't think that video is getting many hits. Feel free to flame me if you actually watched it.
@Michael Casadevall said:
@beau29 said:
And I don't think I'll ever quit programming because, quite honestly, it gratifies me to earn a good living doing something that my colleagues say I'm bad at. I've been insulted by people who were probably even more stupid and incompetent than you, and my general response has been to go out into the open market and write myself a 20% raise. Believe me, as I sit here doing real-time programming in C++, I'm not a bit worried about the crummy database programmers (like you, probably) who used to run me down at my old jobs. But if it makes you FEEL better you can pretend I'm rotting in the gutter ;)
Quite frankly, your attitude sucks. People don't call your code or method of doing something crap without just cause. Learn to listen to some criticism and maybe, just maybe, improve yourself a bit. No matter how much of an expert you are on something, there is someone out there who is 10 times the expert, and by ignoring them, all you do is make yourself look like an idiot to those who you could learn something from.
Yes, I agree, that post was mean-spirited, and probably did a disservice to some very smart people. But please remember that I only wrote this in response to someone who said that I should "just quit programming" and made all sorts of sarcastic comments about "flipping burgers" and so on. I thought that was inappropriate. MorbiusWilters posted some basically reasonable arguments about my high-level judgment about products, paradigms, etc.... but then he couched it in this personal attack on my competence as a programmer. In my experience, good programmers are hard to find and it's not helpful to discourage them. Please do not chastise me for responding mildly in-kind to such a post.
Please, I am not a troll and I'm sorry if I offended you all. I don't think I've ever even read anything hosted on Slashdot.com.
My suggestion is just that Microsoft isn't making very good development tools right now. Give me credit for at least posting a bunch of real examples and for trying to offer real help about a real alternative.
Much of what I posted here amounted to very specific advice like "when you want to make a .cpp application for Windows, don't pay lots of money for Visual Studio; use GCC instead; here's how you get it for free and an example of how to use it."
I don't think that any of this necessarily constitutes a rant. When I got an ad hominem response, I got a bit ugly. Frankly, I think Microsoft needs its comfy little cage rattled a bit.
God, everyone knows that this kind of discussion is the whole reason Al Gore invented the internet, and his special rhythms.
@sceptre said:
You are comparing apples to oranges. If you want to compare VS, compare it with Eclipse, or something. If you want to compare GCC, compare it with... "Microsoft (R) 32-bit C/C++ Optimizing Compiler", CL.
OK, so I can buy Visual Studio and then use its command line tools to develop using Microsoft's brain-dead command prompt. Or, I can get Cygwin (or MinGW) and do the same thing for free, except I will get to use BASH or ZSH, which (I think we can all agree) are much better, and (unlike CMD.EXE) are widely implemented standards. I'm not sure what you're trying to prove by bringing this up.
I guess you could argue that if one likes IDEs then Microsoft's is the best. My feeling is that most programmers will not really want an IDE anyway once they understand exactly how little it is doing for them. That was the main point of my last post. This is not an apples-to-oranges comparison... it's a broad comparison between two interchangeable development strategies that produce an identical outcome.
Besides that, I'm not at all convinced that Microsoft makes the best IDE. As I remember things, back in the 1990s products like SQLWindows and Delphi allowed a great deal of RAD / GUI design in an IDE environment. Visual Basic caught up with these products in the late 1990s, and basically ran them out of the market. Since then, Visual Studio has only regressed in terms of RAD and GUI app design. The WPF designer in VS2008 is really pretty bad... consider the lack of support for Viewbox for example.
In my experience most VS2008 users simply don't use the visual designers. It's too easy to break designer support. Nested master pages are an ASP.NET example of something reasonable that completely hoses the VS designer. I think Microsoft is simply too ambitious architecturally to limit themselves to things that work well in a designer, and I think their products suffer because of it.
I guess one can argue that there are other benefits to the IDE beyond RAD or being able to draw your GUI. But at this point you're offering up a pretty thin gruel. Yes, IDE users get the benefit of IntelliSense, and The World's Slowest Help System, and the schizophrenic ability to build-while-editing-and-also-recompiling-the-Intellisense-database, and a whole lot of homoerotic talking paper clips and gradient backgrounds. I just don't think that's a worthwhile tradeoff. UNIX has features like "fork" and "man" that do a better job of these things anyway.
@sceptre said:
> rc MyApp.rc
I'm not seeing your point.
@sceptre said:
Ever wondered what the "Create directory for solution" checkbox is for?
No, I don't have a whole lot of curiosity about Microsoft's build logs, command line tools, and so on. For one thing, the command line isn't used as much in the Microsoft arena as in the UNIX arena, probably because their command prompt is so bad.
But there's a more subtle reason as well. When I deconstruct the nuts-and-bolts of a Microsoft product, I don't feel like I'm really learning much about computing in general. I'm just seeing a whole bunch of annoying details decided on by people I don't like and don't respect.
I remember once I did a project where I implemented serial communication over a single wire. Obviously, things have to be electrically grounded in a certain way to do this. And I had to learn a great deal about RS-232 and other standards. When I was doing this, I really felt like I was learning about the basics of computing, along with some immutable, universal laws about physics. But agonizing over, say, the ASP.NET page life cycle, or how Visual Studio 2008 compiles resources, or WPF dependency properties, just doesn't rise to the same level of learning or excitement for me. It just feels like I'm playing around in Scott Guthrie's underwear drawer, learning a whole lot of otherwise useless details about someone I don't really like anyway.
@morbiuswilters said:
The next step (which should be obvious) is for you to quit trying to program totally. After all, you're clearly not qualified - you don't know the difference between Visual Studio (an Integrated Development Environment - IDE for those who have the brains) and GCC (a command line compiler). Perhaps flipping burgers is more in line with your talents? While they're sizzling, you can stand there and shout, "Down with the MAN!!!" to entertain yourself.
I don't know what I did to deserve such an ad hominem attack... how or why people manage to get so defensive about Microsoft is a question that I can't even begin to answer. We're not talking about a puppy with heartworms, for @#$@ sake, we're talking about an incredibly rich private business. They don't need a cheerleader.
And I don't think I'll ever quit programming because, quite honestly, it gratifies me to earn a good living doing something that my colleagues say I'm bad at. I've been insulted by people who were probably even more stupid and incompetent than you, and my general response has been to go out into the open market and write myself a 20% raise. Believe me, as I sit here doing real-time programming in C++, I'm not a bit worried about the crummy database programmers (like you, probably) who used to run me down at my old jobs. But if it makes you FEEL better you can pretend I'm rotting in the gutter ;)
@KenW said:
@beau29 said:
I'll just sit here and re-write GDI32.DLL to be free-threadable while I'm waiting for it.
Will that keep you busy enough so you don't waste our time with your idiotic, meaningless rants that pointlessly bash MS? If so, please start now.
A minor bug in a product doesn't mean that the entire company is stupid. Neither does the fact that, in today's economy, they laid off game programmers; people are spending money on important stuff now (like food, mortgages, etc.) and not wasting as much on entertainment. When you need to cut jobs, keeping the employees in a division that will see reduced income because of a spending slump in its target market is just bad business. Anyone who isn't a moron can see that; the fact you can't should worry you.
First, I was kidding about GDI32.dll... I actually think this is already as thread-safe as it reasonably should be. I was just trying to make up a long, pointless, and probably impossible task as an example.
Second, this thread (your response in particular, but all of it) really pissed me off. So here's what I did: Sunday, I decided to stop using Microsoft development tools as rapidly as possible. I have been porting my work to GCC. Initially at least, my environment is Cygwin. "Port" is probably too strong a verb... GCC really builds just about everything I have with minimal changes. I have to use OpenGL where before I used DirectX, but now that I'm past the initial learning curve I don't miss DirectX at all.
The most obvious benefit I have seen is responsiveness. When I hit a key or click a menu item in Visual Studio, the time it can take to respond is unpredictable and unbounded. Accidental actions like inadvertently pressing F1 can cause the whole system to grind to a halt. Strange, computationally costly things seem to constantly be going on in other threads.
Using GCC, there are perhaps a few more steps to do certain things (at least for a non-bash expert), but each of the necessary user input actions is responded to quickly. The overall flow of my work and thinking is never interrupted by the 30-second detours that seem endemic to anything Microsoft writes. It's much easier to tackle complex programming problems when I don't have to wait 30 seconds between steps. I work fast and think fast and breaking the flow with long waits really hurts my productivity.
The second big benefit I have seen is that I really understand and feel in control of the overall development process. The Visual C++ build process and App Wizard have always seemed obscure to me. So many files and folders are created out-of-the-box for a new project, and I never really felt like I was in control of these or understood them. The build process was always F5 or high-level calls to MSBUILD.EXE, and everything just kind of ended up in the right place... hopefully. I always just glossed over things like resource compilation, and I tolerated the lines and lines of boilerplate code and seemingly redundant files and folders that Microsoft was foisting upon me.
For example, as an exercise I just went and created a new "Win32 Project" in VS2008. This is the most basic C++ GUI project except for "Empty Project" which really is 100% empty. Just accepting the default options and building one time, I got all of the following:
1) A top-level folder holding a .sln file, some associated files, and a "debug" folder.
2) A subfolder of that folder with the same @$#@#ing name
3) 12 files in that subfolder, having 7 distinct extensions
4) Another "debug" folder in the subfolder
5) 11 files in this second "debug" folder, with 8 distinct extensions
6) An actual .EXE in the first "debug" folder from item #1
7) 19 total files all starting with "ProjectName."
And all this project does is display an empty window.
I love GCC, on the other hand, because I am in control. If I need a resource file, I have to add it. It's not just assumed by the tool. I end up knowing exactly how and when the resource gets compiled. It's not hard... the .RC is the source and the target is a 'COFF' file:
$ windres MyApp.rc MyApp.coff
If you can't handle that you don't need to be programming. The .RCs themselves actually make a great deal of sense in a text editor.
The link process is also laid bare in GCC... the programmer tells GCC what to compile when by just stringing together all the C/C++/COFF files in the correct order, e.g.:
$ g++ app.cpp library1.cpp library2.cpp MyApp.coff
This is not some magical, cryptic process that takes place behind a velvet curtain... but looking at Visual C++ you wouldn't know that. Microsoft really wants you to think they are doing more than they actually are. I am no longer falling for that trick.
At this point, I can write new Win32-target apps of any sort using GCC, and I have successfully migrated the first GUI app I identified for testing, as well as my favorite Direct3D demo. Coming from a Microsoft background, I have been really surprised at how quickly and smoothly these things have transpired. I didn't have to stay up all night drinking Jolt cola to accomplish any of this (unlike, for instance, my first experiences with Direct3D). I watched the Super Bowl on Sunday, then attended an NBA game Monday. I have to cook dinner every night for my wife and daughter, do my regular job, commute about an hour each day, etc.
My conclusion is that programming doesn't have to be an agonizing series of time-consuming accidental difficulties. With the right tools, you can get to the heart of the problem and apply your skills. With Microsoft tools, you will be too busy playing slap-and-tickle with WinSxS or DCOM to ever get to the heart of the matter. You can call what you're doing programming, but don't call it productivity. I call it "playing footsie with the Office Assistant."
Finally, consider that all of my GCC code is cross-platform. It's simply more valuable than the code I generated previously. Microsoft code has value only to the extent that Microsoft continues to sell, license, and promote the runtime. Real GCC code - in particular the OpenGL code - is a more timeless asset.
I really hope that maybe some of you will follow the same path I am. I know Microsoft is a comfortable security blanket, and that the open source crowd can be very insular and nerdy. But please believe me on this: Microsoft is crap. This is not coming from some nerd sitting in his mom's basement trying to make a router do stupid tricks. Rather, this is my inescapable, objective professional assessment, in spite of years of really trying to make a go of it with Microsoft. It's just not worth it.
@morbiuswilters said:
I find it pretty odd that anyone over the age of 20 can't easily get a credit card even with a few problems on their credit history, so you may have something major on there you don't know about. If it's something you can't get corrected, you could try applying for a very-low limit card with high APR and pay it off promptly to build credit history. As you demonstrate a history of successful credit utilization, you will get better offers from other companies. It's generally a good idea to keep your oldest card open, though, as having 1 card open for years is generally preferred to having a bunch of accounts that get closed out after a short period of time. Good luck!
Things are pretty tight right now. I think the "perfect credit" crowd can still get financing, but those with "good credit" (say, 700-750) are quite likely to be rejected.
The good news (for the fiscally irresponsible, at least) is that the whole system seems to be breaking down somewhat. Almost any credit report you look at has something negative on it, due simply to the preponderance of dubious data out there. Every business seems to have this supposedly-potent ability to put something bad on your credit report. But they've used that power overzealously (e.g. reporting people for breaking a lease a month early, for moving and forgetting a $15 water bill, or for no reason at all except to see if they'll pay). Now there are so many black marks out there on so many reports that (like "A"s at an elite college) they don't pack the meaning they once did. I mean, you might be surprised what passes for a 750-score credit report these days.
My advice to the young is to build credit, but not to focus just on the number; having a 700 with the wrong kind of adverse entry can be worse than having a 600 that's low for other reasons. If you had a car repossessed, you probably won't get good (or any) car financing, even if you've somehow managed to put together a relatively high score by way of other accounts. It just depends on the company reading the report, and most of them can and do use information beyond the raw number.
@Kyanar said:
The first bit was a bit of a WTF. The second was a long, boring, completely irrelevant (oh, and incorrect) rant.
No, firing the Flight Simulator team is a WTF, and saying so is not a rant. That team put together a rock solid real time simulation which (unique among Microsoft products) evolved gracefully and incrementally over the years, without a hitch. Some version of Microsoft Flight Simulator exists for just about every piece of crap API and/or underpowered system the market has provided (C64, TRS-80 CoCo, MS-DOS, etc.) in the last 25 years. I've run several of those implementations and I've never seen a blue screen or a cryptic message box. And I get the sense that these implementations share a core of well-written metaphorical DNA. If Microsoft has accomplished this with any other product or team, please let me know. But before you do, ask yourself:
1) Which were the crap versions of Windows? I bet you know the answer...
2) Which were the crap versions of MS-DOS? Again, if you're of a certain minimum age, you'll know...
3) Which was the crap version of MS Flight Simulator? (There wasn't one.)
I realize that comparing the OS product line to a game product line is not entirely fair... but suffice it to say that the rest of Microsoft could probably have learned some things from the Flight Simulator team. And I think the OS task and the real-time simulation task can both be roughly described as "system programming" or "highly advanced" tasks. It's just that MS failed (technically, at least) at one and succeeded at the other... then they fired the team that succeeded.
And as far as all that enterprisey crap that supposedly someone needs (Workflow, BizTalk, etc.), maybe we should just put that question out there for the group. If anyone reading this has a wonderful BizTalk story to tell (perhaps a touching vignette about the first time they saw an "enterprise bus" or "business logic" tier as a youngster and how it inspired them to an IT career) then go ahead and share it. Or maybe you have a knee-slapper about the first time you marshalled a widget to the remote tier. I'll just sit here and re-write GDI32.DLL to be free-threadable while I'm waiting for it.
I found that "Find in Files" wasn't working in Visual Studio 2008. "No files were found to look in," the program said. I checked my settings (file extensions, case insensitivity, etc.) and found them to be correct. I exited and re-entered the IDE. Eventually I rebooted (for other reasons). Still, "Find in Files" wouldn't work. Finally, I decided I'd attempt to find a solution on the Internet. And I stumbled onto some equally confused developers reporting the same problem... with something like the last 3 or 4 versions of this product. But The Real WTF (sm) is the fix (obtained from http://blogs.ugidotnet.org/franny/archive/2005/12/08/31303.aspx): give the IDE focus and press CTRL+SCROLLLOCK. If that doesn't work, try ALT+BREAK. If that still doesn't work (as was the case for me) try, simply, PAUSE.
And this situation has persisted for 7+ years and counting. Worse, Microsoft won't even admit the problem or give the real fix ( http://msdn.microsoft.com/en-us/library/z613zk0e(VS.80).aspx ).
These clowns (i.e. the ones in Redmond) really piss me off. When I heard they were firing people last week, my first thought was that they really didn't have to do that, with $60BB+ in the bank. My second thought was maybe it was just an opportunity to do some necessary housecleaning... Microsoft would clear out some of the obvious deadwood working on projects like BizTalk, SharePoint, "Work Flow," Hailstorm (or whatever they're calling it now) and other enterprisey crap that we don't really want or need. Maybe they'd take the opportunity to rid themselves of the idiots behind the latest (flickering, slow, blurry, confusing) version of Windows. Perhaps whoever had been responsible for ruining SourceSafe would finally get their just deserts (i.e. an ignominious 10AM walk to the Audi TT with a box full of personal property).
And then they fired the !@#!ING FLIGHT SIMULATOR PROGRAMMERS!!!1!!!1!
They gutted one of the few development teams at Microsoft that had ever really impressed me.
I WILL NOT buy products from these people any more. I won't even pirate them. I will do without before I will send any more cash to these idiots. CTRL+SCROLLLOCK indeed.
When I come on to a project, or inherit code, it seems like I find one of two things:
1) The author of the code was simply not very skilled. There will be awkwardness, bugs, etc., and things won't happen as quickly or predictably as one might hope; OR
2) The author of the code was very skilled indeed. His programming activities occupy most of his time, and the result is an advanced codebase, which typically functions better than in case #1 but is also larger.
Counterintuitively, I find myself hoping for #1. In case #2, I find myself overwhelmed by code that's basically unnecessary. This can take the form of an overly abstract, often OO codebase which fails to simply do what it's asked to do (in the hope of being something more). Or, it can take the form of a very low-level codebase which fails from an economic standpoint- it simply takes too long to modify or maintain. I have even seen both at once (the deeply nested class with __asm blocks at the top of each method) and that to me is the worst kind of WTF. I mean, I can do OO, or I can do Intel __asm, but to do both in the same @#$@-ing file just seems masochistic.
It's very rare, on the other hand, that I find the passionate, skilled programmer that knows when it's appropriate to apply these skills. I get the sense that my colleagues are basically programming for programming's sake. They're using __asm (or the decorator pattern, or OO, or whatever else) because that's what they associate with top-notch programmers, not because they think it will give the best overall quality and functionality for the time.
Some of the worst turf battles in the job seem to come when one of these advanced programmer types doesn't "get" to write a library for the group (as if doing the work would be some kind of a privilege) and instead some other hotshot wants to do him the "favor" of writing it instead.
This really bothers me. I'm competent but I don't want to program for programming's sake. I don't want to be a part of a profession that throws away everything else of value - from economics to having a real life to actually making money - in the service of some macho hacker code of ethos. The value of a program, to me, comes first in the money it brings and second in the fact that it actually gets used. I get the sense that my colleagues don't give a damn about these things.
Does anyone else feel this way? Or did I just choose a job I didn't like?
@drinkingbird said:
@Jimmy Savile said:
Is this some kind of joke, its not 1986 any more!

Yeah, because everything should be run by needlessly powerful computers, and there's absolutely no market for low cost, low power, reliable electronic devices that just do the job they're designed for.
His attitude really pisses me off as well... you can't just assume resources are endless because it's 2008. All these people walking around saying how nobody needs to learn C, or pointers, or assembly, or optimization any more (and that everything can be written in C# / Java / Python) are the same people producing crappy bloatware like Word, Outlook, and IE.
If you confront them with reality, these people will admit something like "oh, well sure, if you're writing some oddball embedded app, then maybe you need to worry about that stuff, but for typical desktop programs it's not necessary." Of course, as a result most desktop apps are agonizingly slow.
I remember using an Apple ][ at school and experiencing keystroke lag for the first time. I was used to the faster TI-99/4A, and I rightly associated this lag with primitive, overtaxed hardware. The fact that we still have to deal with keystroke lag (which, if anything, has got steadily worse over time) in 2008 makes me want to vomit.
I'd venture to say that the majority of CPUs out there are 8- or 16-bit devices running object code that came from an assembler or a C compiler. You business programmers, with your profligate refusal to collect your garbage or lift a finger in the way of optimization, could learn a lot from the people who program these devices.
@lolwtf said:
So this device controls elevators, and you have no support or documentation? Sounds dangerous. Suppose due to some quirk in the code that you missed when disassembling it, the elevator decides to accelerate straight down at some insane speed? If I were you I'd point that out, hopefully something can be done to improve the situation.
I've seen much, much worse in the world of industrial controls. At least the elevator can't move in open space where it can do real damage; and at least it doesn't use ASP.NET or Java. The bottom line is that software engineering is still in its infancy, like medicine was when it was practiced by barbers. (The "barber-surgeons" in this analogy are the electrical engineers, managers, degreeless grandkids, etc. trying to design software without formal CS training).
Hopefully the elevator (like all good control applications) has some minimal hardware failsafes to keep it from chewing up any puppies, babies, or $100 bills.
@Kermos said:
2 - The platform for the device I have to work with is based on an Intel 8051. That chip has got to have the absolute worst assembly language i have *ever* worked with. Give me a motorola 6809. Give me a PIC. Give me an ARM. But for crying out loud, keep the frigging 8051 away from me.
You're setting a pretty high bar there: the 6809 and the PIC are both eminently programmable. I once spent weeks writing ML for the 6809 (CoCo graphics library) for no purpose other than nostalgia / recreation. Right now, I am trying to finish my real work (.NET / WPF - ugh) so that I can do some recreational PIC programming. If Intel ever makes anything half as user-friendly as a PIC or a 6809, it's a fair bet they stole the design.
I've never done 8051 programming but I work with a guy who does. I really think his rationale is to obfuscate certain parts of the system which would supposedly not be secure on a PIC. (Of course, I don't necessarily buy this argument... I'm pretty good with PICs but I've never been able to steal anyone else's code).
@Kermos said:
5 - The device can't be remotely updated in the field so releasing software updates is impossible. The device only has a 485 port (required to communicate with the controller) and no way to hook it up to a PC without special 485 hardware that most people don't tend to own.
Hmm... RS485 doesn't scare me that much if the PC has an RS232 port. Am I missing something? When I do serial communications it typically involves an industrial PC; do these have more RS485 support than a typical PC (which is technically RS232)? Or are you just pointing out that most modern PCs don't even have RS232 support?
Incidentally, the more I work with RS485 the less I like Ethernet. Amazingly, Microsoft officially deprecated RS232 in 1997, which to me just typifies their affection for bloatware. Those serial protocols are pretty much the core of computer engineering. They're timeless in my view, not subject to obsolescence by USB or (ugh) Ethernet.
@Kermos said:
6 - Oh, one of my bosses is under the firm belief that this shouldn't take longer than a week or two to have a fully working, 100% compatible tool.
Let me guess, he's an "idea man" whose visible role consists of spouting out supposedly brilliant ideas and then hectoring you into making them work in the exact form he imagined. Yeah, I really can't stand that kind of boss. Their performance expectations seem to be based on "24" or "CSI," where everyone has a grandkid that can game the NASDAQ while whipping Steve Wozniak's ass at chess and cracking an iPod.
@Jake Grey said:
Now I come to think about it, it's kind of a WTF that PC hardware manufacturers still haven't come up with something more user-friendly than either beep codes or the two-digit figure on a POST card readout after all these years;
Sounds like an opportunity. Maybe it's a good thing that the provenance of these ideas is recorded here...
@Jake Grey said:
MIDI-synthesised voice reading out the error instead of a beep code? (Admittedly the latter might add to the price of the board a bit, as the BIOS would have to go on a microSD card or something.)
Yeah, but that wouldn't have anything to do with MIDI, would it? My experience with MIDI is that it basically contains data like "turn on the Saxophone sound at 440 Hz." What you're talking about is a waveform sound.
I like the idea of putting the failure information on the net (e.g. emailing some network admin type). The target market for the device would be people administering distant or isolated computers.
@The Vicar said:
Customer: I want [request which is impossible for whatever reason]
Sometimes "whatever reason" ends up being plain physics, or more specifically hardware limitations. I mean, a CGA display adapter can only display 4 colors at 320x200 resolution... asking one's supposedly brilliant grandkid for a second opinion is not going to change that fact. I'm not sure what's to blame here... self-centeredness? the schadenfreude of seeing a supposed expert proven wrong? a belief that the proverbial "squeaky wheel" gets the metaphorical "grease?"
The Real WTF™ is that JPA and I have basically articulated a potential solution to the problem here, in a quasi-recreational forum for programmers. I have always said that the world should listen more closely to what its CS graduates are saying...
@jpa said:
It could just take its power from USB, and actually have a microphone to decode the boot failures.
TRWTF is either PICs or the fact that I use them also...
Taking this a step further, couldn't one cut the wires off of the PC Speaker (or unplug it) and wire it into one of the PIC's digital inputs, thus allowing the PIC to "hear" the failure beeps? The PIC could power on with the motherboard and count the beeps. The e-mail could probably be generated if, say, two or three beeps occur instead of the expected single beep. This is, of course, somewhat motherboard-specific (or at least BIOS-specific), but I can believe that perhaps a few DIP switches could be used to switch amongst Phoenx mode, AMI mode, etc. and support just about everything.
I don't know if this would meet Navya's spec (USB is not involved), but it seems like a workable design.
@dhromed said:
I really have a hard time putting myself in the heads of the Navya's of the world. How do you get through to them, if not in their native tongue?
There is something in Linguistics called the Sapir-Whorf Hypothesis which asserts that people's entire way of thinking is shaped by their language. To some extent, we think in our language(s) even when we're not speaking or hearing them. Sapir-Whorf notices the big differences amongst languages and tries to draw parallels in attitude and behavior.
I'm not sure I completely believe Sapir-Whorf. I feel capable of "thinking" things I can't neatly express in English, and in amateur hands Sapir-Whorf can seem like a thinly disguised form of ethnic stereotyping. But there's probably some truth to it. I mean, reading those posts, I can't help but think Navya's native language must be much more suited to the practice of law than the practice of engineering.
Lately I have been doing circuit design / assembly language programming using PIC chips. These are cheap ($US 0.50 and up), mostly 8- and 16-bit devices with very limited capabilities. The one I am using has something like 1KB of RAM (as an on-chip register file), runs at 4 MHz, and has a "RISC" instruction set with no hardware multiply or divide capabilities.
Initially, at least, my foray back into low-level computer engineering seemed like a pleasant escape from the world of desktop software. Standards are more exacting. Documentation is more thorough. One knows exactly what's happening when, and buzzwords will get you nowhere. After 10 years of marshalling widgets to the remote tier, I was ready for (forgive me) a paradigm shift. Rightsizing my clock oscillator down to about 4 MHz seemed to be that shift.
But now there are signs of trouble. Today, after another long and productive day of programming assembly / feeling better than people who have to use CSS, I decided to unwind reading the manufacturer's forums (microchip.com). Soon enough, I discovered that the "plz e-mail teh codez" phenomenon is by no means isolated to the world of software. Consider the following forum post, in which circuit designs are demanded from afar:
http://forum.microchip.com/tm.aspx?m=199351&mpage=1&key=MMC%2cCard%2cQuestion
If you don't want to click the link, the first post speaks volumes:
Hi,
Can anyone help me in designing post diagnostic card? i need the circuit design.
Thanks & regards,
Navya
The design in question turns out to be a completely implausible USB device that transmits POST codes over Ethernet. Yeah, it would be nice if a USB device could detect POST failures and e-mail them to me. But how is any of that supposed to happen if the computer just beeps and does nothing when power is applied? Reading down just a bit, I found that this same person (who claims to be an electronics engineer) has basically spammed microchip.com as well as usb.org with the same nonsense. A thousand people (kinder than I it seems) have tried to explain the fundamentally unworkable nature of the request in a thousand ways:
Navya,
The answers are going to be pretty similar to when you asked about doing the same thing via USB.
http://forum.microchip.com/tm.aspx?m=192237&mpage=1&key=
i.e. You would have to modify the PC BIOS to send the info via ethernet; you cannot take control of the PC before the OS loads without modifying the BIOS.
(I'll qualify that. If the ethernet card has support for a BOOT ROM, then by placing custom code into the BOOT ROM, you might be able to access the POST code and send it. However, if you have to ask, I don't think you're capable of doing that...)
Ric

Does this deter our protagonist? Not in the slightest:
Hi,
Ric, I have a doubt. I understood that we cant use usb for interfacing our device(ethernet controller+microcontroller) to PC.Is there any oher hardware connection ?(except serial ports as some laptops wont be having serial ports)we want a plugin type device only..ie y we went for usb plugin option(which will be violating specification)
expecting ur early help,
Navya
I love the closing ("expecting ur early help"). A few posts later, Navya seems to be treating the thread something like a cross-examination:
Hi,
Can we access the PCI or ISA bus using USB device, if the system is ON? Can you please send me a suggestion?
thanks,
Navya

And then, the cross-examination continues with the surprise introduction of Exhibit A, a block diagram of a motherboard:

Hi,
One more question please.. I found in net that the USB host controller IC in motherboard has connection to the PCI bus. USB has no seperate bus in motherboard. Can you please clarify it?
Thanks,
Navya
This guy (or girl) really, really, really, really wants to support USB on what's essentially a boat anchor, and it's just not happening. And programmers can be stupid, but this represents an entirely different level of futility and desperation!