Maybe we're lucky and he did it as a joke.
*prays*
My roommate found this one yesterday:
^[a-zA-Z0-9_\{\}:\.\-\+]+.|.[^a-zA-Z0-9_%\{\}:\.\-\+]+[a-zA-Z0-9_\{\}:\.\-\+]+.*
I've seen this same kind of construct in MCF, one of the applications in the SPEC benchmark suite (widely used for performance evaluation in computer engineering/science research).
I guess if you're making a suite representative of real applications you need to represent stupidity as well.
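For anyone wondering what it actually does: that top-level | isn't grouped, so it splits the pattern into two independent halves, and the ^ anchor only applies to the left one. The net effect is that it matches nearly anything. A quick sanity check in Python (my own sketch; I'm only guessing it was meant to validate something address-like, given the % exclusion):

    import re

    # The pattern exactly as posted. The unparenthesized | means it's really
    # two separate patterns: "^[word chars]+ then any char" OR "any char,
    # some non-word chars, some word chars, anything".
    pattern = re.compile(
        r'^[a-zA-Z0-9_\{\}:\.\-\+]+.|.[^a-zA-Z0-9_%\{\}:\.\-\+]+[a-zA-Z0-9_\{\}:\.\-\+]+.*'
    )

    print(bool(pattern.search('a?')))    # True - left half: word char, then any char
    print(bool(pattern.search('}}')))    # True - } counts as a "word" char here
    print(bool(pattern.search('@@a')))   # True - right half, anchored nowhere
    print(bool(pattern.search('!')))     # False - about all it manages to reject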
So, I'm working with some research code, and lo and behold, when I run with an input size of 64 I get a failed assertion from deep inside malloc. I present to you the full glory of this highly informative and useful assertion:
malloc.c:3096: sYSMALLOc: Assertion `(old_top == (((mbinptr) (((char *) &((av)->bins[((1) - 1) * 2])) - __builtin_offsetof (struct malloc_chunk, fd)))) && old_size == 0) || ((unsigned long) (old_size) >= (unsigned long)((((__builtin_offsetof (struct malloc_chunk, fd_nextsize))+((2 * (sizeof(size_t))) - 1)) & ~((2 * (sizeof(size_t))) - 1))) && ((old_top)->size & 0x1) && ((unsigned long)old_end & pagemask) == 0)' failed.
...thanks, assertion-writing guy. Thanks a lot.
Just wondering... was the title a deliberate reference to The Decemberists' The Island?
@tgape said:
Sounds very much like some code I got to rewrite back in 2001. Except your guy seems to understand the importance of wrapping lines at some point - potentially even at meaningful ones (although maybe not so much). The mess I had to work with had many lines over 200 characters long.
Manually wrapping lines is just as bad as having 200-character lines in the first place. A manual wrap that looks good on your screen might not look good on someone else's, and editing the line later breaks the formatting. Modern text editors have automatic word wrap for a reason.
Aside from which, the solution to long lines is not wrapping, it's breaking the logic down into multiple, shorter statements, something like this:
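(Made-up code, but the kind of refactoring I mean:)

    from dataclasses import dataclass

    @dataclass
    class Item:
        price: float
        qty: int
        in_stock: bool

    TAX_RATE = 0.08
    cart = [Item(9.99, 2, True), Item(4.50, 1, False)]

    # The 200-character-line style: everything jammed into one statement.
    total = sum(i.price * i.qty for i in cart if i.in_stock) * (1 + TAX_RATE)

    # Broken into multiple lines of *code*, not wrapped text: each step
    # gets a name, and no line needs wrapping on anyone's screen.
    in_stock = [i for i in cart if i.in_stock]
    subtotal = sum(i.price * i.qty for i in in_stock)
    total = subtotal * (1 + TAX_RATE)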
It's probably a liability thing. If _they_ block a critically important email because they think it's spam, the customer calls up and bitches once he finds out. But if a _customer's_ spam blocker blocks a critically important email and they miss it, well hey, that's their fault.
On another note, that's why I use gmail's tagging ability. If I sign up for an account at, say, abc.com, I'll do so with myemail+abc@gmail.com. It still gets delivered to myemail@gmail.com, but with the tag intact, so if that address starts getting spam you can filter on the tag and delete everything sent to it. (Unfortunately, it doesn't help if you're not already using gmail, and a crafty spammer could theoretically parse out the tag, but I doubt many would bother putting in the effort.)
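(For the record, here's roughly how little code "parsing out the tag" would take - a hypothetical sketch, not something I've seen a spammer actually run - which is why it only deters the lazy ones:)

    def strip_plus_tag(address: str) -> str:
        # Drop everything between the first '+' and the '@'.
        local, _, domain = address.partition('@')
        base = local.split('+', 1)[0]
        return base + '@' + domain

    print(strip_plus_tag('myemail+abc@gmail.com'))  # myemail@gmail.com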
@DOA said:
On a side note, is there a point to this creature creator? Without the actual game, does it do anything interesting?
It's pretty fun to play with for a little while, at least. That's about it though, just creating a creature and making it dance, pose, or do other things like that. There's a free trial version of the creature creator (with a limited number of parts), so if you're interested in the game at all you may as well pick it up.
To play a little bit of devil's advocate: imagine, say, a medical or biological researcher with samples he had to tend to that had been growing for several months. When your research is something you can drop and pick back up somewhere else, or right away, it's one thing; but if your research involves time-sensitive work, or work that can be undone by neglect, it's a completely different case.
If this guy is the same guy I think he is (and if I understand his ramblings well enough), essentially what he's trying to propose is a data flow architecture. You take a giant mesh of processing elements (either logical or physical) and map instructions to them. Then, when all the inputs to an instruction have arrived at its PE, it fires and the output gets routed to another PE. No need to use parallel algorithms or break things down into threads. Just map the instructions onto the mesh and let the data flow.
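To make the firing rule concrete, here's a toy interpreter (entirely my own sketch - the names and structure are illustrative, not his proposal):

    from collections import deque

    class Node:
        def __init__(self, op, n_inputs, targets):
            self.op = op                # function this PE applies
            self.n_inputs = n_inputs    # operands needed before it can fire
            self.targets = targets      # list of (node, input slot) to route to
            self.inputs = {}            # operands received so far

    def run(tokens):
        ready = deque(tokens)           # (node, slot, value) tokens in flight
        while ready:
            node, slot, value = ready.popleft()
            node.inputs[slot] = value
            if len(node.inputs) == node.n_inputs:   # the firing rule
                result = node.op(*(node.inputs[i] for i in range(node.n_inputs)))
                node.inputs = {}
                for target, tslot in node.targets:
                    ready.append((target, tslot, result))

    # (a + b) * (a - b) as a data flow graph; no threads, no schedule.
    out = Node(print, 1, [])
    mul = Node(lambda x, y: x * y, 2, [(out, 0)])
    add = Node(lambda x, y: x + y, 2, [(mul, 0)])
    sub = Node(lambda x, y: x - y, 2, [(mul, 1)])
    run([(add, 0, 5), (add, 1, 3), (sub, 0, 5), (sub, 1, 3)])   # prints 16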
In general the concept is sound, but it has serious problems of its own which have kept it from being used extensively. First, it takes significant compiler work to transform imperative, sequential code (like C) into data flow code. Second, you end up simply unable to map enough instructions onto the processor to extract thread-level parallelism; there just isn't enough storage. Lastly, poor programming or algorithmic choices can still create lengthy critical paths.
All that being said, I may have misinterpreted some of what the guy's said and he may be even more loony than I thought.
Isn't it ironic that a site about blunders in IT messes up its own April Fools' prank by inadvertently breaking the links to the forums from the main page?