I'm TRWTF (javascript eval)
-
-
Yes. Thank god for express trains.
-
I learned FORTRAN in 1988 in college. I managed to get through the class with a passing grade, but also managed to forget everything I'd learned before the end of the class.
I learned FORTRAN about 1976, or a bit after. (F77 may or may not have been a thing, but the college was still teaching FORTRAN IV.) I would have had a perfect score, but for one question on one test. It was a true/false question of the form, "You can(not) do X in FORTRAN." I don't remember what X was, but I gave the opposite answer from the one the instructor expected. X was possible, but the instructor hadn't taught it. I showed him where the textbook said it was possible; he admitted I was right, but still wouldn't give me the point.
I actually used COBOL professionally once
You have my sympathy. I learned very early that COBOL was something to stay as far away from as possible. Halfway through the semester, the COBOL class was in the lab trying to compile and run their first program (we wrote "Hello, world" the first class session), and students were getting hundreds of errors. Never even looked at COBOL since.
-
TRWTF is that I learned FORTRAN in 2008. Previously, at Uni I had learned wonderful things like Java (WTF), C (OMG I hated it), Haskell (don't hate it, but of limited commercial usefulness) and a lot of assembler for the MIPS architecture (relevant, isn't it?). So they advertised this position for a C developer and I got it, and then when I was there it was all.... and here is our codebase in FORTRAN. If you could do some ALGOL that would be great too.
-
F77
Some research tells me this was what I was dealing with. But I think I only had access to an F95 compiler.
-
TRWTF is that I learned FORTRAN in 2008. [tale of woe elided]
Well, you took the job, so you posted this in the right thread. :)
In my case with COBOL, I took a semester of it in college, then in my first real job, one of our customers wanted to pull some data from an old COBOL system into their new ERP, and I knew just enough to know how to write the program to dump to a CSV that could be imported.
-
Girl's gotta eat. XD
-
-
-
I'm not sure if that makes for a "poor" or "lucky" @Arantor.
I won't eat him out of house and home. Promised.
Fun fact: in German we say "eat the hair off somebody's head" ("jemandem die Haare vom Kopf fressen") when we mean eat someone out of house and home. In @Arantor's case I think I could munch for a while. Not that it would be tasty.
-
-
-
i may not agree with crockford on all his rules of JS
On a whim I've decided to take a look at Crockford's code conventions. Some deserve to be mentioned on this site.
Do not use _ (underbar) as the first or last character of a name. It is sometimes intended to indicate privacy, but it does not actually provide privacy. If privacy is important, use the forms that provide private members. Avoid conventions that demonstrate a lack of competence.
Because using naming conventions precludes using the correct privacy mechanisms, and vice versa. Using a naming convention with no explicit support is a sign of incompetence. Well, unless you're talking about:
Global variables should be in all caps. (JavaScript does not have macros or constants, so there isn't much point in using all caps to signify features that JavaScript doesn't have.)
Because (again) using naming conventions to compensate for language limitations is a bad idea. You should instead use a different convention from every other language out there.
Avoid use of the continue statement. It tends to obscure the control flow of the function.
So what's the alternative? Surrounding the rest of a loop's body with a conditional? Using a goto?
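For what it's worth, here is a minimal sketch of the two styles side by side (my own example, not from Crockford's document) — `continue` versus guarding the rest of the loop body with a conditional:

```javascript
// Style Crockford warns against: skip odd numbers with continue.
function sumEvensWithContinue(nums) {
    var total = 0;
    for (var i = 0; i < nums.length; i += 1) {
        if (nums[i] % 2 !== 0) {
            continue; // jump straight back to the loop header
        }
        total += nums[i];
    }
    return total;
}

// The implied alternative: wrap the rest of the body in a conditional.
function sumEvensWithIf(nums) {
    var total = 0;
    for (var i = 0; i < nums.length; i += 1) {
        if (nums[i] % 2 === 0) {
            total += nums[i];
        }
    }
    return total;
}
```

Both compute the same result; for a body this short the conditional reads fine, but with a long body the extra nesting arguably obscures more than the `continue` did.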
I'm relatively sure ordering variable declarations alphabetically is a WTF too, but I can't be bothered to think of a good argument at the moment.
-
Dirty, dirty boy. That is 50 with the tawse for you.
-
Felt like a change again. Thank you.
-
So what's the alternative? Surrounding the rest of a loop's body with a conditional? Using a goto?
It would be interesting (but not so interesting that I'm willing to do the legwork / research) to hear the reasoning behind some of these, because there is usually a back story where it all makes sense. And then you can decide if it applies in your circumstance.
I'm relatively sure ordering variable declarations alphabetically is a WTF too, but I can't be bothered to think of a good argument at the moment.
Seems like it makes them easier to find / parse (by the programmer) by keeping them orderly.
-
Because using naming conventions preclude using the correct privacy mechanisms
I am a firm believer that often a naming convention is the correct privacy mechanism. I am all for properly encapsulated code with good separation of concerns and all that, but there are times when monkey patching is just so much less work that it's worth it.
I'm relatively sure ordering variable declarations alphabetically is a WTF too
The main benefit of sorting things without intrinsic order alphabetically is for merging. It eliminates some conflicts compared to always appending to the list as happens otherwise and it makes it easier to see whether all elements from both sides were properly included since the lists are in the same order.
-
Seems like it makes them easier to find / parse (by the programmer) by keeping them orderly.
I don't know. Maybe. Personally I find it more intuitive to order them following a mix of criteria:
- Life-time scope of what the variable refers to. References to objects existing outside the function come first, followed by results of calculations not depending on intermediate results of this function; finally, our own scratch variables come last;
- How high in the model's hierarchy each referenced entity is. "Big guys" come first;
- Variables should generally follow the order in which they are used in this function's body.
These criteria are rather abstract and imply no definite ordering, but in practice I feel like there is usually an order that makes more sense.
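As a rough illustration of those criteria (the function and all its names are invented for the example):

```javascript
// Hypothetical function whose declarations follow the criteria above.
function summarize(config, records) {
    // 1. References to objects existing outside the function.
    var limit = config.limit;
    // 2. Results of calculations not depending on our intermediates.
    var total = records.length;
    // 3. Our own scratch variables last, roughly in order of use.
    var shown;
    var summary;
    shown = Math.min(limit, total);
    summary = 'showing ' + shown + ' of ' + total;
    return summary;
}
```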
-
The main benefit of sorting things without intrinsic order alphabetically is for merging. It eliminates some conflicts compared to always appending to the list as happens otherwise and it makes it easier to see whether all elements from both sides were properly included since the lists are in the same order.
Does that happen that frequently? I don't really see that as a benefit anyway, assuming you run a lint over your code.
-
Does that happen that frequently?
Depends on how often you are merging. And I have experience with this more in C++ (includes) and python (imports) and such where the lists are longer. For variables, you should only have a few variables at any one place anyway and for something like 3-element list it does not matter.
-
Yeah, but I'll deal with them, or disable the behaviour, to use JSLint. It is still probably the best JS linter out there, even if it does force most of Crockford's style choices on you (and a lot of those rules are quite valid and actually a good idea; others though....)
-
Do not use _ (underbar) as the first or last character of a name. It is sometimes intended to indicate privacy, but it does not actually provide privacy. If privacy is important, use the forms that provide private members. Avoid conventions that demonstrate a lack of competence.
Because using naming conventions precludes using the correct privacy mechanisms, and vice versa. Using a naming convention with no explicit support is a sign of incompetence.
If you want to use prototypical inheritance, that's the only way to get any kind of "private" members. He can whine to the sky, but the only way to avoid the "new functions for each new object" conundrum is to use prototypes.
Sacrificing performance for readability is a legitimate tradeoff (and the one I'm making myself in most of my code), but it's not suitable as an absolute good-practice rule everyone should blindly follow.
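To make that tradeoff concrete — a sketch (names are mine) of closure-based privacy versus the prototype-plus-underscore convention:

```javascript
// Closure privacy: count is genuinely unreachable from outside,
// but every instance gets its own copy of both methods.
function CounterClosure() {
    var count = 0;
    this.increment = function () { count += 1; };
    this.value = function () { return count; };
}

// Prototype version: the methods are shared across all instances,
// but "privacy" is only the underscore convention Crockford dislikes.
function CounterProto() {
    this._count = 0;
}
CounterProto.prototype.increment = function () { this._count += 1; };
CounterProto.prototype.value = function () { return this._count; };
```

With the closure version nothing outside can touch `count`; with the prototype version `p._count` is one assignment away, but two instances share a single `increment` function instead of allocating fresh ones.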
Global variables should be in all caps. (JavaScript does not have macros or constants, so there isn't much point in using all caps to signify features that JavaScript doesn't have.)
Because (again) using naming conventions to compensate for language limitations is a bad idea. You should instead use a different convention from every other language out there.
Personally, I'm reserving all caps for consts and quasi-enums. But whatever. You should have so few global vars, it shouldn't matter how they are named.
Avoid use of the continue statement. It tends to obscure the control flow of the function.
Fuck you Crockford. That's your personal coding religion, has nothing to do with javascript.
-
I am all for properly encapsulated code with good separation of concerns and all that but there are times when monkey patching is just so much less work than it's worth it.
And that, ladies and gentlemen, is why so much Javascript is awful. I prefer to write in languages where doing it right is easier than doing it wrong because of this exact attitude.
JSLint. It is still probably the best JS linter
Have you found it to be better than eslint? Because eslint lets me specify the exact standards I want to use, but I'm new to linting JS in general.
-
Well, it does help that my coding style was already closely aligned with Crockford's. I'll check ESLint because maybe there's a better one that I haven't found yet.
-
I believe the defaults are closely aligned with JSLint, but it allows every single rule to be adjusted.
-
I will need to investigate. JSLint allows most rules to be adjusted, but not all of them.
-
The one that ruled it out for my workplace is the unnecessary semicolons rule. We prefer extra semicolons to guard against accidentally omitting an important one.
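For reference, in ESLint that is a one-line rule override — a hypothetical `.eslintrc.json` fragment (`no-extra-semi` and `semi` are real ESLint rule names; the exact setup is illustrative):

```json
{
    "rules": {
        "no-extra-semi": "off",
        "semi": ["error", "always"]
    }
}
```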
-
That one annoys me; unnecessary semicolons are unnecessary. [image meme.bmp]
But then my standard JS file looks a lot like this:
/*jslint node: true, indent: 4 */
/**
 * @module FILL ME
 * @author Accalia
 * @license MIT
 * @overview blah blah blah
 */
(function () {
    'use strict';
    //code goes here
}());
-
You probably don't need a self-executing function in node.
-
Probably not, but it is habit and I hate polluting the global namespace.
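The habit is harmless; here is a browser-oriented sketch (my own example) of what the wrapper actually buys you:

```javascript
// The IIFE keeps internals out of the enclosing scope and
// exposes only what it returns.
var api = (function () {
    'use strict';
    var internal = 'hidden'; // invisible outside the closure
    function greet(name) {
        return 'hello, ' + name;
    }
    return { greet: greet };
}());
```

In Node each file already gets its own module scope, so as noted above the wrapper is redundant there; in a browser `<script>` tag it is what keeps `internal` off of `window`.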
-
-
..... again?
-sigh- What did I do this time?
-
-
var variable = 2;
that semicolon is required for me
var variable = 2;;
that's an unnecessary semicolon.
if (false) {
    //something
};
so's that one.
-
Oh. Then we're good.
-
so.... forgiven?
-
-
:dancing:.... hmm not good enough....
-
-
..... why?
-
Because it's easier to check in a code review that there are semicolons everywhere than to pay attention to where they must go, and therefore, someone will accidentally turn a closure after a function into a self-executing function call.
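The classic failure mode looks something like this (a contrived sketch):

```javascript
// The missing semicolon after the first function expression makes
// the parenthesized IIFE on the next line its argument list:
var greet = function () {
    return 'hi';
}   // <-- no semicolon here

(function () {
    // intended as a standalone self-executing function
}());

// greet is now 'hi' (the result of calling the first function with
// the IIFE's return value as an argument), not a function at all.
```

With a trailing semicolon after every statement, even the "unnecessary" ones, this parse can't happen, which is the review-time safety the rule buys.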
-
-
well yes, but the point was to show an unnecessary semicolon.
would you have preferred:
if (true) {
} else {
    //Something
}
‽
-
No, this is far more efficient:
switch (true) {
case true:
    break;
default:
    //Something
}
-
<noscript>
    <script>
        // something
    </script>
</noscript>
-
$(".happy-discourse-user:first").each(function () {
    // something
});
-
for (var i = 1; i < 1000; ++i) {
    switch (i) {
    case 0:
        // Something;
    default:
        break;
    }
}
Threw in an optimization loop as a bonus.