@joe.edwards said:
@Adanine said: FTFY: The rite of piracy
I never knew about that one.
@morbiuswilters said:
I'd protest, but you have a biological imperative to hunt giraffes, and I don't want to be a bigot.
FTFY
@esoterik said:
straight C malloc/delete is a paradigm that just refuses to die
That's unfortunate, since straight C malloc/free is a paradigm that isn't all that difficult, and is actually correct.
@spamcourt said:
@bridget99 said: reason-biased
I had never heard that term before, what does it mean?
It's defined right there.
@bridget99 said:
I'm not sure what the intent could be that would make any difference. It is a bit unfair when people get turned into an example... but it's their own damned fault.
So you agree that she should have been fired?
@joe.edwards said:
I want a Visual Studio add-in or extension that adds an arsenal of weapons you can use to shoot, burn, bomb, and nuke bad code. Each weapon would destroy a different amount of code (gun shoots an identifier, flamethrower burns a line, bomb blows up a method, nuke annihilates a class, etc). Of course it should have vivid animations.
@KrakenLover said:
How so many businesses, so many business people, accountants, sales people, executives, can still not get this simple concept and be so resistant to the idea
There's no reason for any of those people to understand the technical details, nor why they matter. From their perspective, any explanation of why a WTF of this caliber is a Really Bad Thing is only going to sound like you're trying to fleece them, and hey, they're just not gullible enough to fall for that one. Perhaps TRWTF is their legal department.
@Ben L. said:
Yes, because chrome doesn't use Adobe Flash, it uses Google Chrome-brand Adobe Flash-like Substance.
Interestingly, my Chrome just updated (I should say, I updated it, because it starts to become unresponsive whenever there is an update pending) and managed to break Flash.
I can't shake the unsettling feeling that "salary ASC" is the result of wishing to convey "increase your salary by taking a job with us!" and losing something in the translation.
@boe2 said:
not sure if I am missing some weird form of satire here
It's Lorne Kates. Of course it's satire.
@Sutherlands said:
@Kittemon said:I can't tell whether or not this is meant to be ironic
Well, even though you don't understand irony,
@Sutherlands said:
I was being serious in what I said.
@Sutherlands said:
Which makes me neither of your follow-up questions.
@BC_Programmer said:
a loose translation being "Shit, I have no idea what the words I used meant and have been called out on it, quickly! let's backpedal, beat around the bush and hope somebody else jumps in talking about something else!"
A precise translation being: "If I provide a definition, someone will say it's wrong, even if it isn't wrong. If I copy a definition from a dictionary, someone will say that I don't understand the definition, while someone else will deride my choice of dictionary brand as being considered non-authoritative and generally looked down upon on account of its sloppy definitions, etc."
I wasn't in the mood to delve into that bit of nonsense; naturally, it occurred anyway.
@Zecc said:
@Kittemon said: @Zecc said: Not before you declare what you understand by "sarcastic".
The same as sarcastic, I wager.
You lose the wager.
Something different than ironic, obviously.
<font size="1">Loosely speaking, for the purposes of this increasingly silly conversation, sarcasm ⊂ irony. Discuss this concept with your shoulder alien.</font>
@Sutherlands said:
@Kittemon said: I can't tell whether or not this is meant to be ironic, and assuming so, at which one of us it's directed.
What do you think ironic means?...
@Zecc said:
The same as sarcastic, I wager.
@Sutherlands said:
What? How long do you think it would take to add a debugger to Ruby? I would guess a very large chunk of time. I'm sure that blakey's company doesn't want to spend their money for him to develop a debugger. I would also guess that blakey doesn't want to do it in his spare time, because that's work. It's not fun, all it does is make him more productive at work. Why would he spend his free time doing that? That's why people pay for things that work properly. They pay for things like debuggers.
I can't tell whether or not this is meant to be ironic, and assuming so, at which one of us it's directed.
@Sutherlands said:
I don't understand how anyone in this thread could argue with blakey when he says that ruby is a worse language than it would be if it had a good debugger. It boggles the mind.
I'm pretty certain nobody has made that argument.
@blakeyrat said:
Say I had a physical disability, like I was missing a hand. Would you endorse a programming language that required the use of both hands?
I'm trying hard to make sense of this metaphor. While I'm sure that some of us would appreciate a programming environment that allows us to have one hand free, it's likely that we would get more useful work done by applying both to the task at hand.
@blakeyrat said:
If I don't have the rote memory skills to effectively use a CLI or to effectively use an editor without things like Intellisense, should I be excluded from coding in Ruby?
Frankly, that's silly. Nothing about Ruby per se excludes you from using it effectively, except of course that it is open source, therefore you choose to exclude yourself from it.
@blakeyrat said:
Usability isn't just about making the software usable by the average person; it's also about accessibility, making the software accessible to everybody regardless of their capabilities (physical or mental).
Accessibility has its limits. Not everything can be made fully accessible, and where it is possible, it requires additional design consideration and implementation effort. Programming environments for Ruby probably lack accessibility considerations purely because nobody has devoted any effort to that end; not because it isn't fun, not because the Ruby community hates blakeyrat, but merely because it hasn't proved to be necessary or useful to anyone who cares. Which leads nicely to a point that I've been wanting to make for some time now:
@blakeyrat said:
Not that anybody involved in the Ruby ecosystem ever gave two shits about usability. This shit is important to me. I think it should be important to you.
It's open source. There's nothing preventing you from making the supremely useful tool that you want and popularizing it, except that you would apparently rather continually complain about not being able to mooch off others' work instead of just doing that work yourself. Why don't you go involve yourself in the Ruby ecosystem? Then there will be someone in it that gives two shits about its usability. That's how open source works, derp.
@blakeyrat said:
I don't believe the open source philosophy and culture leads to good software development. Quite the opposite: the philosophy says "ship early, ship often" regardless of quality level. The culture seems dead-set against any QA process or user testing.
I think you might genuinely believe this, and I don't know whether to lol or facepalm.
@snoofle said:
The reason this is one wide table is because it was originally about 20 tables. There were > 2 billion rows of data in each table. Doing the join was expensive. This turned out to be a pure performance boost.
...which makes the desire to use EAV even more of a WTF than usual.
@Wrongfellow said:
Let's put this in context here, shall we? Specifically, this bit of context:
Protip: context goes before the anecdote, not after it.
@blakeyrat said:
The mental illness is posting a WTF about an application and not mentioning <something important>.
We're all so fortunate that you never pull that kind of crap!
@dhromed said:
Tit for tat is not a logical conclusion. It spawned wholesale from snoofle's mind. ... Snoofle fabricated the situation as tit for tat, and somehow you and others are defending this as though it was implied by the words "used to it".
They know each other, and appear to have a tit-for-tat running prank-fest going on; otherwise the Director wouldn't have dared.
But the Director did dare, so the logical conclusion is...?
The syllogism may well be incorrect, but it would be a far greater leap to make that assumption based entirely on a blakeyrant.
@dhromed said:
I urge you two to not do any large-scale research. Perhaps a career as religious leader would be more suitable, where you can invent your own fantasy stories and claim they were suggested by existing texts.
You seem a bit uptight lately. Here, try a purple dildo.
@blakeyrat said:
@Kittemon said: @blakeyrat said: You could not break the law.
Maybe you could not, but what when your neighbor couldn't not?
Unless you're talking about your neighbor hacking into your wifi
Which was, of course, precisely the question mooted in the post to which you initially replied, making your response seem... peculiar.
@blakeyrat said:
You could not break the law.
@arh said:
@ASheridan said:
Developers just don't like writing documentation. I've never known a developer to like writing documentation.
Now, when you're being paid to do a job, if documentation writing is part of that job, then you don't have a lot of choice. If you're working on something for free, then you're unlikely to be doing stuff you don't like doing.
It's not, as Blakeyrant suggests, that open source developers are any worse than non-open source developers (and they're not mutually exclusive states), it's just that if you're working on something for free, you'll trend toward the tasks that you enjoy more than those you really don't.
Sorry, but idiots don't write documentation. It doesn't matter if you do it for free or not, if you don't write documentation without being asked, you're one of those mediocre idiots I do everything to avoid working with.
Are you suggesting that you actually want the idiots to be writing documentation? Because I don't think that will actually produce a more useful result.
@ASheridan said:
@blakeyrat said: @nonpartisan said: The point is scope. You said you were a "system administrator". If all you ever did was to restart a service when it had problems, you were not a server administrator.
completely irrelevant "no true scotsman" bullshit
What he said is pretty much on the ball though. If all you did was restart a service every now and again, that hardly qualifies as a system administrator.
What blakeyrat actually said was that he was running a SMAUG-based MUD for seven years. So that isn't seven years of experience as a sysadmin; that isn't even one year of experience repeated seven times over. That's more like -7 years experience.
@Xyro said:
Wtf, where did this thread come from? [...] Anyone wish to summarize?
blakeyrat tried to use Linux again.
@PSWorx said:
@Kittemon said: @blakeyrat said: So if a library exists, WHY THE HOLY SHIT IN HELL IS THE GUI TOOL CALLING THE CLI APP!??!??!!?>!?@>
Your post explains nothing, it just raises more questions!
I'm surprised nobody brought up this point previously: the CLI defines a protocol. Anything built against that protocol can be insulated from irrelevant internal changes to the underlying library, so by default, it's preferable to write other software against an app's CLI rather than its library API. It's a basic *nixy philosophy.
So that explains 1) why it's done, and 2) why there is such strong resistance to changing the way it's done.
The weird fixation with strings that seems so prevalent in the *nix world is something that has been bugging me for a long time already. Of course you can treat the input syntax and output formatting of a specific CLI utility as a protocol - just like you can treat the, I dunno, the public Application Programming Interface of the underlying library as one. Comparing the two approaches, I see the following pros and cons:
Using the CLI as a protocol:
- + Output can be interpreted by humans
- - You're restricted to exchanging strings. Good luck solving the problems of packaging structured information into strings for the 100th time.
Just how much structure do you need? Many useful programs can format their output in a trivial way; that's somewhat the point of the "do one thing only" philosophy. OTOH there isn't anything that restricts you to just the CLI if you really need to do something complex; there are plenty of programs that pass around information in disk files, and plenty of existing formats which can solve your packaging problem.
@PSWorx said:
- - Your tools waste CPU by formatting data into a human-readable string, which is then parsed back by the next tool - without any human ever looking at it.
Unlikely to be significant, and the previous response also applies.
@PSWorx said:
- - There can only be one version of the protocol all the time. You have to take great pains to stay backwards compatible to every previous version ever.
- - Actually, there are no versions at all. There is no structured description of the protocol, except convention and a likely outdated manpage.
History suggests that the first point isn't as painful for developers as you're making it out to be. The second point is speculative.
@PSWorx said:
- - Corollary: Every client of your protocol probably parses it subtly differently, depending on which particular regex the developer felt like using.
s/felt like using/needed to use/ Besides which, who cares? I can't control what the consumer does with the data I produce, no matter what format I've presented it in.
@PSWorx said:
- - If you want to do anything non-trivial, you'll want to chain different tools together, which all have their own rules on string parsing. Good luck figuring out the correct syntax when multiple tools interact. Examples: find -exec, using bash wildcards on directories with more than 10.000 files.
"their own rules on string parsing" is a red herring. To use a tool correctly, you have to know how it works, whether that involves parsing output from the CLI or linking against the API. Moreover, multiple unrelated tools interacting will always have a chance of failing to work as expected.
You forgot something quite important:
@PSWorx said:
Ok, now let's treat the API of the library as a protocol:
- + Debuggers can generate a "human readable" view of the protocol exchange for you - no need to burden the protocol itself with that.
That's reasonable, but now you've burdened the programmer with the need to reconcile two different ways of viewing the protocol, as well as increased your own maintenance load.
@PSWorx said:
- + You have an explicit description of the protocol as header files, type libraries, etc. You can add new functions while keeping your old API completely unchanged for legacy clients.
An explicit description is nice, but it still doesn't tell you anything about how to use it correctly. Moreover, your header files and type libraries may be less than useful when the client is written in a different programming language than the library. The CLI offers a simple and standard way to access your output.
Keeping your old API unchanged is also nice, until you realize that you really do need to fix it, at which point you're in the same boat as breaking the CLI. As for adding new functions, how is that going to break the CLI? Legacy clients are not going to invoke your new functions, because those functions didn't exist at the time the clients were written. They shouldn't be receiving the output from your new functions either.
@PSWorx said:
- + There is a standard way for launching tools (loading the library), invoking commands (calling function) and exchanging all kinds of data (arguments). You can chain two arbitrary libraries together and don't have to think about how to pass arguments between them! Unbelievable, I know!
Please leave the idiotic rants to blakeyrat.
@PSWorx said:
So, please, I'd really like to know this: Apart from "because we've always done it like this", why would the former EVER be a good idea?
@blakeyrat said:
So if a library exists, WHY THE HOLY SHIT IN HELL IS THE GUI TOOL CALLING THE CLI APP!??!??!!?>!?@>
Your post explains nothing, it just raises more questions!
I'm surprised nobody brought up this point previously: the CLI defines a protocol. Anything built against that protocol can be insulated from irrelevant internal changes to the underlying library, so by default, it's preferable to write other software against an app's CLI rather than its library API. It's a basic *nixy philosophy.
So that explains 1) why it's done, and 2) why there is such strong resistance to changing the way it's done.
@Cassidy said:
@bridget99 said:
it does not work as advertised
two PHP apps accessing the same DB
Isn't SQLite specifically contraindicated for this use case?
@JoeCool said:
@bridget99 said: thread
There are specific flags you compile with to enable multi-threaded support. Read the documentation.
So, you're saying that it does, in fact, work exactly as advertised?
@bridget99 said:
if you really think all of the bugs have been worked out of foo, you're deluding yourself.
Brillant insight.
@Cassidy said:
@CodeNinja said:
then we ran across the resume for our current DB person. In fact, she found it.
Should have put it forwards and - with a completely straight face - advise that this was the only person with the necessary skills and they should call Sutherlands up for an interview immediately.
FTFY
@Lorne Kates said:
Who had to program those subroutines? Who had to do QA on them?
The same guys who thought them up and then wrote them, I should think.
@Xyro said:
@blakeyrat said: @Xyro said: You mean "NB!!" or "URGENT!"?
No. That's totally different. "NT" means "no text", that is, "no text in email body", that is, "you don't need to actually open this email it's just a subject line". It has no relation with urgency.
I have not heard of that one before. That really doesn't even make sense. That's like sending out an email with the subject line of "This email intentionally left blank".
@blakeyrat said:
@Xyro said: Yes sir.
Why'd you ask if you were just going to pull your own answer out of your ass?
Because I thought if I'd ask for clarification ("Was NT a typo for NB?"), I'd get made fun of for being a pedantic dickweed, and the thread would descend into flames. Last time that happened, a few of you were making fun of us, calling us aspies, and telling us to use inferencing and assumptions. Well, I tried that, and apparently that doesn't work either. :( I suppose I'm supposed to get all up in a fury that you gave an acronym/initialism/abbreviation without defining it, but I'm beat today.
Wait, what? As I read this thread I thought, hey, Xyro is making a funny and blakeyrat is being a pedantic dickweed and/or idiot. But... but...
...my whole world is coming unraveled
@Xyro said:
@Speakerphone Dude said:In my experience only white people between 25 and 32 with a lisp, a slightly above average academic record and a secret fantasy on the 14 year-old living next door use that expression.
>.> Speakerphone Dude, you seem like you're in a really bad mood today. Maybe you should put down the internets and play outside for a little while...
Maybe the neighbors moved out this morning.
@blakeyrat said:
don't use the quote button
@blakeyrat said:
Now go to hell.
@blakeyrat said:
Why the fuck aren't you clicking the "in reply to" link.
I did, despite thinking it obvious that you were replying to me. You have deduced... poorly.
@Lorne Kates said:
Still, though-- why the fuck didn't you click the "Quote" button?
If he quotes something, he may find himself obliged to actually read the quoted text, so as to avoid an accidental non sequitur. And reading may lead to accidental comprehension... the very antithesis of the blakeyrant.
@joe.edwards said:
Is that a (non-pedantic)-dickweed, or a non-(pedantic-dickweed)?
Is that a logical or, or a bitwise or?
@PJH said:
Seems the undef is ignored (at least with GCC)
#undef foo is always ignored for non-defined foo.
@PJH said:
@Kittemon said: @PJH said: Absent the defines, my code is perfectly valid C.
Is this C? If so, one wonders what happens with stuff like:
char buffer[512];
[...]
if (fread(buffer, sizeof buffer, fp)){....
If it's C then that's obviously a syntax error.
@lettucemode said:
First of all, tags.
Secondly, would it matter? IIRC, foo(x) and foo (x) are different in a macro definition: with the space, foo becomes an object-like macro whose replacement text begins with "(x)", rather than a function-like macro taking a parameter.
@lettucemode said:
Secondly, there is often a difference between what a person is saying, and the words they use to say that thing.
@blakeyrat said:
BTW this is a fascinating look into the pedantic dickweed mind. A normal non-dickweed would probably go, "let's see a couple of lines added so the programmer doesn't have to deal with errors even though it could lead to data corruption further on... yeah, that's pretty goddamned similar to 'on error resume next' isn't it?"At this point, a non-pedantic-dickweed can use an amazing deductive ability called "inference" to understand what is really being said.
When something stupid has been said, it's unsafe to assume that something non-stupid was meant. A non-pedantic-dickweed will happily use an amazing deductive ability called "inference" to pretend to understand what a person is saying, despite the words they use to say that thing; whereas a pedantic dickweed will point out that something stupid and ambiguous has been said, preferring to put the onus of clarification on the originator of the stupidity instead of muddling through all the possible permutations of stupid to (possibly incorrectly) guess at a useful meaning.
@PJH said:
Is this C? If so, one wonders what happens with stuff like:
char buffer[512];
[...]
if (fread(buffer, sizeof buffer, fp)){....
If it's C then that's obviously a syntax error.
@blakeyrat said:
@morbiuswilters said: What's more likely is that I'll pass you at around 8200, then vanish for another 3 years. When I come back, you'll be over 20k and Blakeyrat will have transcended space and time to become a being of pure energy and disgruntlement.
@nat42 said:
A waitress asks what you and your friend will be drinking, you order a beer and your friend orders a double scotch on the rocks, you think "that sounds good" and tell the waitress to change your order to "what he's having".
Am I to believe that half of all computer science students couldn't figure out what two drinks should be sent to the table in problem above [...]?
Obviously, that will be a beer, a double scotch on the rocks, and a double scotch on the rocks (counting from 0 of course).
@blakeyrat said:
@DOA said: It's not like there's a quota and someone's using up the threads.
With CS, that might not actually be true.
I was going to be all clever and suggest looking that up in the CS help but OMG WTF THAT ACTUALLY DISPLAYS SOMETHING USEFUL NOW?!
@nexekho said:
Talking to the code is a great method of debugging, actually. Coding Horror refers to it as rubber duck problem solving.
@a comment from a link from your link said:
For a while, I used a dinosaur hand puppet for this, and gave him Strong Bad's voice.
TRWTF is you for apparently reading neither Section 14 nor the resources at the end of the document that help you understand how Facebook works.
@Severity One said:
@TheCPUWizard said: @Severity One said: Exceptions in C++ have always seemed to me as a bit of an afterthought. After all, you can write something like
throw 5
and a bit further on catch an integer.
Actually the constraint to use an arbitrary base class (such as System.Exception in .NET) imposes some limitations. The ability to throw "anything" can easily be abused, but also does not impose constraints.
Sure, but what's the point? C doesn't pose any restrictions on you to trash your memory; I wouldn't call that an advantage. This 'restriction' is that an exception is a full-blown class with a couple of well-defined methods. I don't know .NET, but I would imagine it's similar to Java's java.lang.Exception, which lets you cascade exceptions and produce a stack trace.
Presumably, the point is that in C++, throw is a syntactical construct, whereas an exception is purely a library construct. Coming as it does from a C philosophy, it wouldn't make sense to have those tightly coupled unless it were absolutely necessary. Managed languages do it differently.
@Daniel Beardsmore said:
@HunterM said: windows 7? ....same bag of fit?
honestly i brought 4GB RAM fr my laptop not knowing tht this bull *** is also there, im just a an effin normal customer tht wants to pay my money and buy a good system.... y the *** shud i hav to know about all this effin bottlenecks tht screw around with my operations? how do u increase GDI handle limit in windows 7, and does it make a difference????? im in the middle of my dissertation and pdf-xchange viewer keeps running out of resources to open files! die microsoft!
[Really, MSDN gonna let me edit this guys post??? I didn't even write this post... but I can not help myself, I think this post sucks and is not productive. I DO need to know these types of things... so users like chraniac here should not have too. I recommend using a different pdf viewer application... not Adobe's. His anger is real, his spelling is atrocious, but the direction of anger is misdirected IMO! I have said my peace... now lets see if I can "submit" this edit...]
1 WTF point for a senseless rant against buggy software, and 2 bonus WTF points for a garbled MSDN comment entry. A final bonus WTF point for saying "Not Adobe's" when the commenter specifically noted that the bug lay, for a change, not with Adobe Reader but Tracker Software's PDF-XChange Viewer.
You missed the obligatory wtf point.
@morbiuswilters said:
@Kittemon said: Middle of the day? Dude, it's morning.
Yeah, I know, I was just trying to make it sound better than it actually is.
It's an IPA sampler; hard to get much better than that!
@morbiuswilters said:
@Kittemon said: Wrong timezone.
Portland, Oregon is not on the coast.
You know all the Portlands?
...the one in Oregon is pretty close to a coast. Close enough to make a lame joke of it, anyway.
@morbiuswilters said:
@Kittemon said: Also, unconventional working hours.
Hey, I'm not judging, but unconventional working hours or no, you're still drunk in the middle of the day.
Middle of the day? Dude, it's morning.