Coding Confession: how not to use LINQ


  • BINNED

    @TwelveBaud said:

    but side effects are the only way to interact with a computer, therefore functional programming can't be useful on a computer

    You're going to have to qualify this for me. What side-effects? We have some kind of input and we produce some kind of output. That's all we're doing, really. I don't see space for side effects that are necessary or even desired in a well written program.

    @TwelveBaud said:

    Just saying where you're gonna end up going with this.

    Eh, I mostly wanted to see how far the pedantic dickweedery will go. I mean, take JS when manipulating the DOM: most often the output (where I count "what happens on the screen" as output too, not just the return value) depends on the state of many elements in the DOM. Now, I could enforce a rule that all DOM nodes that are relevant have to get passed to the function. I could then argue that a checkbox that is checked is a different input than the same checkbox that is unchecked, even though it's the same DOM node that we're dealing with.

    My point being, most of the functional vs. non-functional debate is ideological masturbation. Every correct program must produce the same output when given the same input, otherwise it's not correct (unless there is inherent randomness - in which case we can argue whether randomness is even allowed in a "functional" language).

    Just write stuff that is maintainable, is my point. I don't give a rat's ass about what you want to call it in the end.


  • ♿ (Parody)

    @Onyx said:

    You're going to have to qualify this for me. What side-effects?

    The simplest is any sort of output, like printing to STDOUT. That's a side effect. So is changing the state of anything outside the scope of the function.

    @Onyx said:

    I don't see space for side effects that are necessary or even desired in a well written program.

    They're necessary for a useful program. At least in the sense of output sort of side effects.
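
    In C# terms, something like this (a minimal sketch with made-up names, not from any real codebase): both functions hand back the same value, but only the first one is "pure" in the sense the functional crowd means.

    using System;

    static class SideEffectDemo
    {
        // Pure: depends only on its arguments and does nothing but return a value.
        static int Add(int a, int b) => a + b;

        // Not pure: it touches state outside the function and writes to STDOUT.
        static int callCount = 0;

        static int AddAndLog(int a, int b)
        {
            callCount++;                              // side effect: mutates external state
            Console.WriteLine($"adding {a} and {b}"); // side effect: output
            return a + b;
        }

        static void Main()
        {
            Console.WriteLine(Add(2, 3));       // 5, and nothing else happens
            Console.WriteLine(AddAndLog(2, 3)); // 5, plus a log line and a counter bump
        }
    }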


  • BINNED

    @boomzilla said:

    The simplest is any sort of output, like printing to STDOUT. That's a side effect. So is changing the state of anything outside the scope of the function.

    So this is coming down to either me not understanding the definition of "side effect", or not understanding English.

    If I meant for stuff to happen on STDOUT, how is that a side effect? What, because I changed STDOUT's memory, but STDOUT is not the part of memory my function resides in?

    That's some pedantic shit right there.



  • @RaceProUK said:

    True, but in C#, functions aren't first-class citizens;

    They kind of are.

    @RaceProUK said:

    you have to use one of the many Action<> or Func<> classes to represent them. And even then, they're limited. And that's not including some APIs which want Predicate<> or Delegate<>.

    This is one of those criticisms people make where I just kind of sit back and go, "so what? Who gives a shit how it's implemented as long as it works?"
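
    For what it's worth, the plumbing in question looks something like this in practice (a quick sketch using nothing but Func<>, Action<>, Predicate<> and LINQ; the names are made up):

    using System;
    using System.Linq;

    static class DelegateDemo
    {
        static void Main()
        {
            Func<int, int> square = x => x * x;                          // takes an int, returns an int
            Action<string> shout = s => Console.WriteLine(s.ToUpper()); // returns nothing
            Predicate<int> isEven = n => n % 2 == 0;                     // int -> bool; some older APIs want this

            // LINQ happily takes the Func<> as-is.
            var squares = Enumerable.Range(1, 5).Select(square).ToArray();
            shout(string.Join(", ", squares));     // prints "1, 4, 9, 16, 25"
            Console.WriteLine(isEven(squares[1])); // True (4 is even)
        }
    }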


  • ♿ (Parody)

    @Onyx said:

    That's some pedantic shit right there.

    Well. That's what you get when you wander into the functional section of programming.



  • The pedantic dickweeds of functional programming believe any change of the state outside of the function invalidates something as functional programming; input is only the parameters to a method and output is only the return values. Thus, printing something to the screen is a side effect. Building a buffer of input keystrokes is a side effect (since the buffer is retained after the key-hit-handler returns). Interacting with outside hardware is a side effect.

    Pragmatists will accept some side effects as necessary (interaction with a user, e.g. fetching parameters or job control) or unnecessary but desired (saving a results file). They segregate the functional code from the side-effect-inducing code, usually by tagging the side-effect-inducing parts with a special character like !, and then say "side-effect-free programming allows us to use memoization (a fancy term for "function result caching") and get consistent, deterministic results". Purists deny even that concession, and go write proofs on the wall of their padded cells.
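
    The memoization trick, sketched out (a hand-rolled cache with hypothetical names; only legitimate because the wrapped function has no side effects):

    using System;
    using System.Collections.Generic;

    static class MemoDemo
    {
        // Wraps a function in a result cache. Only safe if f is side-effect free,
        // i.e. calling it twice with the same input is indistinguishable from calling it once.
        static Func<int, long> Memoize(Func<int, long> f)
        {
            var cache = new Dictionary<int, long>();
            return n =>
            {
                if (!cache.TryGetValue(n, out var result))
                {
                    result = f(n);
                    cache[n] = result;
                }
                return result;
            };
        }

        static void Main()
        {
            Func<int, long> slowSquare = n =>
            {
                System.Threading.Thread.Sleep(500); // pretend this is expensive
                return (long)n * n;
            };

            var fastSquare = Memoize(slowSquare);
            Console.WriteLine(fastSquare(12)); // slow the first time
            Console.WriteLine(fastSquare(12)); // instant: same input, cached result
        }
    }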



  • @boomzilla said:

    The simplest is any sort of output, like printing to STDOUT. That's a side effect.

    It's only a side-effect if you didn't intend to do it. Otherwise, it's an effect.

    @boomzilla said:

    They're necessary for a useful program. At least in the sense of output sort of side effects.

    But then they aren't side-effects they're just effec-- nevermind. I give up.



  • @TwelveBaud said:

    The pedantic dickweeds of functional programming believe any change of the state outside of the function invalidates something as functional programming; input is only the parameters to a method and output is only the return values. Thus, printing something to the screen is a side effect. Building a buffer of input keystrokes is a side effect (since the buffer is retained after the key-hit-handler returns). Interacting with outside hardware is a side effect.

    By that logic, it's impossible to write a program that actually does anything at all.



  • As defined by moronic functional programmers, a side effect is any effect other than "getting a return value I can substitute in place of the function".

    @blakeyrat said:

    By that logic, it's impossible to write a program that actually does anything at all.

    EXACTLY!
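
    For reference, here's what the "substitute in place of the function" business actually means in practice (a tiny sketch, made-up names):

    using System;

    static class SubstitutionDemo
    {
        static int counter = 0;

        static int Pure(int n) => n * 2;               // Pure(21) can always be replaced with 42
        static int Impure(int n) => n * 2 + counter++; // every call changes counter, so it can't

        static void Main()
        {
            Console.WriteLine(Pure(21) == Pure(21));     // True, and either call could just be the literal 42
            Console.WriteLine(Impure(21) == Impure(21)); // False: the hidden state leaks into the result
        }
    }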


  • FoxDev

    @blakeyrat said:

    They kind of are.

    Well, not really; it's not possible to define a function outside of a class or struct, whereas in a proper functional language, you can.



  • @TwelveBaud said:

    As defined by moronic functional programmers, a side effect is any effect other than "getting a return value I can substitute in place of the function".

    Ok well instead of redefining common well-understood words, maybe they should have just made up a new one like "frusluate" so we wouldn't all be so fucking confused.


  • ♿ (Parody)

    @blakeyrat said:

    It's only a side-effect if you didn't intend to do it. Otherwise, it's an effect.

    That's only true if you change the definition of side effect WRT a function. This sort of thing is very important to compilers. If you don't use the result of a function, it can be optimized away if it doesn't have side effects.

    @blakeyrat said:

    I give up.

    That's probably for the best.

    @blakeyrat said:

    By that logic, it's impossible to write a program that actually does anything at all.

    Yes. If you write everything as strictly purely functional.



  • @RaceProUK said:

    Well, not really; it's not possible to define a function outside of a class or struct, whereas in a proper functional language, you can.

    Can I do functional programming in C#? Yes.

    QED.

    I give no shits for what you consider "proper". It's implemented. It fucking works. It's even kind of debuggable (which is more than any other "properly functional" language can claim.)

    Everything else is pedantic dickweedery.



  • @boomzilla said:

    That's only true if you change the definition of side effect WRT a function.

    I'm not the one pulling new definitions out of thin-air here. I'm the one confused because until a few minutes ago I thought I knew what words meant.


  • ♿ (Parody)

    @blakeyrat said:

    I'm not the one pulling new definitions out of thin-air here.

    Yes, you are. Or you're saying that English words can't have different meanings depending on context, especially when that context is particular subject matters.

    Either way, you're the one who's wrong.



  • You know what words mean when used by regular people who talk about regular things.

    You don't know what words mean when used by ivory tower mathematicians who write their "code" with Greek and Hebrew letters and ridiculous symbols whose meaning can't be divined by mere mortals.


  • BINNED

    So after reading all of this stuff that Dickhorse refused to stream in as it was written:

    Who gives a shit, it works. I'd rather have an ugly, hacky looking piece of C (as long as it's commented and explained) that does the job than a "pure" piece of Haskell or whatever that does jack squat because it refuses to touch STDOUT, and then apologises for being useful by marking it with a !.

    Ideological wanking, as I assumed. Now excuse me, I have shit to make that will actually do something useful.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    Ok well instead of redefining common well-understood words, maybe they should have just made up a new one like "frusluate" so we wouldn't all be so fucking confused.

    The thing is, "side effect" does already have a well-established meaning in this context, although the weenies @Twelvebaud mentions are carrying it to an extreme.

    If you call printf() and it formats your hard drive, in addition to being a bug, that's also a side effect.


  • Discourse touched me in a no-no place

    @boomzilla said:

    Or you're saying that English words can't have different meanings depending on context, especially when that context is particular subject matters.

    Nothing goes over his head. He would catch it.


  • BINNED

    @FrostCat said:

    If you call printf() and it formats your hard drive, in addition to being a bug, that's also a side effect.

    Which is not a program that works correctly, then.


  • Discourse touched me in a no-no place

    @Onyx said:

    Which is not a program that works correctly, then.

    Well, now we're getting into semantics. If it printed the right thing, it worked correctly in addition to doing a bad thing. It's got layers, like an ogre.


  • BINNED

    Well then. I'm going to thank the developer in addition to garotting him with a VGA cable. Fair?



  • @FrostCat said:

    The thing is, "side effect" does already have a well-established meaning in this context, although the weenies @Twelvebaud mentions are carrying it to an extreme.

    It has a well-established meaning in every context.

    My problem is it also has a very poorly-established meaning, completely different from the well-established one, for this particular context only. Apparently.


  • ♿ (Parody)

    @blakeyrat said:

    My problem is it also has a very poorly-established meaning, completely different from the well-established one, for this particular context only. Apparently.

    That's not your problem.



  • Right; it's the problem of whoever overloaded the commonly-understood term.

    But since that's established, now it becomes 500,000 other people's problems.


  • Discourse touched me in a no-no place

    @Onyx said:

    I'm going to thank the developer in addition to garotting him with a VGA cable. Fair?

    I'm Ok with it.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    My problem is it also has a very poorly-established meaning, completely different from the well-established one, for this particular context only. Apparently.

    Semantics, again. It's well-established among computer science people and other nerds. If it helps, think of it as jargon.


  • ♿ (Parody)

    @blakeyrat said:

    Right; it's the problem of whoever overloaded the commonly-understood term.

    :rolleyes:

    How do you make it through each day without being throttled by the people around you?



  • @FrostCat said:

    Semantics, again. It's well-established among computer science people and other nerds. If it helps, think of it as jargon.

    Right; but I hate jargon.

    If it's a new concept, they should have made up a new word.



  • @boomzilla said:

    How do you make it through each day without being throttled by the people around you?

    I don't understand how you make it through each day without grabbing a rifle and heading to Cuba.


  • ♿ (Parody)

    @blakeyrat said:

    If it's a new concept, they should have made up a new word.

    It's not a new concept. It actually matches the "everyday" meaning, but you refuse to think your way through it in a precise fashion.

    @blakeyrat said:

    I don't understand how you make it through each day without grabbing a rifle and heading to Cuba.

    I'm there right now.



  • @boomzilla said:

    I'm there right now.

    Oh man, pick me up some José L. Piedra's.


  • BINNED

    @boomzilla said:

    It's not a new concept. It actually matches the "everyday" meaning, but you refuse to think your way through it in a precise fashion.

    I'm with Blakey here. This use seems to me to be co-opted from "common" meaning (not intended) into something very specific and thus confusing.

    The word "theory" suffered the same fate, but the other way around - the original meaning is pretty precise, but the common use is something rather vague.


  • ♿ (Parody)

    @Onyx said:

    I'm with Blakey here. This use seems to me to be co-opted from "common" meaning (not intended) into something very specific and thus confusing.

    Well, let's look at this from the standpoint of the caller. You call the function and get a result back. That's what this thing does. If something else happens somewhere, that's a side effect of calling the function. The function's purpose (from a code / abstract / mathematical POV) is to do something and return a value.

    And that's the level we're talking about when we're discussing programming paradigms. Now, it's perfectly fine in my book if you want to talk about the intended effect of a function at the level of the main program and the things you expect to happen. But now you're talking at a different level.

    It's possible and reasonable to not care about certain levels some of the time or even at all. My eyes certainly glaze over when people get down to the hardware level and talk about gates and transistors and stuff. But what I don't do is rant about how a gate is the fucking thing you lock to keep the little shits off your lawn and why did you hardware nerds not use a proper word instead of intentionally confusing me and compromising the security of my grass with your illegitimate word games.



  • @boomzilla said:

    But what I don't do is rant about how a gate is the fucking thing you lock to keep the little shits off your lawn and why did you hardware nerds not use a proper word instead of intentionally confusing me and compromising the security of my grass with your illegitimate word games.

    "gate" is actually a very well-chosen term. YOUR EXAMPLE SUCKS THEREFORE YOU SUCK!


  • BINNED

    @boomzilla said:

    Well, let's look at this from the standpoint of the caller. You call the function and get a result back. That's what this thing does. If something else happens somewhere, that's a side effect of calling the function. The function's purpose (from a code / abstract / mathematical POV) is to do something and return a value.

    What is the purpose of a function called showResult? Is printing to STDOUT a side-effect in that case?


  • FoxDev

    I do believe a lot of this debate is down to the difference between functional programming, where functions are first-class citizens, and pure functional programming, which guarantees no side-effects



  • But functions are first-class citizens in C#. All you're missing is type inference at var declaration time. You can pass them around, use them as parameters, return them, check them for safety, invoke them with this this or that, all kinds of things. You just need to give them a typedef.
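
    For instance, a throwaway sketch (made-up names, nothing from any real API):

    using System;

    static class FirstClassDemo
    {
        // Takes a function as a parameter.
        static int ApplyTwice(Func<int, int> f, int x) => f(f(x));

        // Returns a function (a closure over 'amount').
        static Func<int, int> MakeAdder(int amount) => x => x + amount;

        static void Main()
        {
            Func<int, int> addFive = MakeAdder(5);     // store it in a variable
            Console.WriteLine(ApplyTwice(addFive, 0)); // 10: pass it around like any other value
        }
    }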


  • FoxDev

    @TwelveBaud said:

    But functions are first-class citizens in C#

    Can a function exist without a class? No, ergo functions aren't first-class citizens 😛


  • ♿ (Parody)

    @blakeyrat said:

    "gate" is actually a very well-chosen term.

    And as I demonstrated, so is "side effect."

    @blakeyrat said:

    YOUR EXAMPLE ~~SUCKS~~ RULES THEREFORE YOU ~~SUCK~~ RULE!


  • ♿ (Parody)

    @Onyx said:

    What is the purpose of a function called showResult? Is printing to STDOUT a side-effect in that case?

    Are you asking from a programming paradigm perspective or from the perspective of your program?

    Maybe it returns the Best of Breed winners at the last dog show.


  • ♿ (Parody)

    Again, look at the function from the perspective of the compiler. It doesn't really understand STDOUT. But it can (probably) figure out if a function only touches the internals of the function (pure function) or touches something outside (side effect: changes a value, emits output).

    So if there are side effects, more happens than just a returned value. So there's more to consider in how you can optimize it, for one.



  • By that logic, objects aren't first-class citizens. Neither are variables. Neither are classes themselves -- they require namespaces and assemblies.

    This level of pedantry is beneath you. Rise above it. Or learn F# or VB.NET, where I know from experience you can spray functions everywhere your little hedgy heart desires.


  • BINNED

    @boomzilla said:

    Are you asking from a programming paradigm perspective or from the perspective of your program?

    From the perspective of making sense?

    @boomzilla said:

    Maybe it returns the Best of Breed winners at the last dog show.

    #include <stdio.h>

    /* Prints the result; returns 1 if the write succeeded, 0 otherwise. */
    int showResults(const char* stuff)
    {
        return (printf("%s\n", stuff) >= 0);
    }

    There. Replace with bool for languages that support it.

    Or did I do a boo-boo here and use a thing that might actually fail and the function will then return 0 if it does? Am I allowed to do error handling like this even?


  • FoxDev

    @TwelveBaud said:

    This level of pedantry is beneath you. Rise above it.

    But if I do that, I won't get a pedantry badge 😢


  • ♿ (Parody)

    @Onyx said:

    From the perspective of making sense?

    Are you saying compilers don't make sense?

    @Onyx said:

    There. Replace with bool for languages that support it.

    From the compiler's standpoint, this returns an int and has a side effect, so it can't be safely optimized away if the returned value isn't needed.

    @Onyx said:

    Or did I do a boo-boo here and used a thing that might actually fail and the function will then return 0 if it does? Am I allowed to do error handling like this even?

    In talking about whether a function has side effects, you can return whatever you want. That doesn't matter here. The only thing that matters is if the function has some effect on the system other than returning a value. Output and changing data are the main things that would happen (are there others? I don't know and I'm not going to do any research here.)


  • BINNED

    So in the end, the whole functional thing is trying to turn me into a compiler? As in, I have to make its job easier?

    I'm trying to understand the appeal of it here. I see some valuable ideas in the paradigm, but the whole "purity" thing rubs me the wrong way. It seems to be designed so I have to do more work for the benefit of the ideal, and the only reason is actually the ideal, not any actual benefits.



  • @boomzilla said:

    Again, look at the function from the perspective of the compiler.

    Why would anybody ever do that.

    Plus a compiler is an inanimate object, it doesn't have a "perspective". You can't just anthropomorphize shit. You have to tell me what the compiler's like first. Is it generous? Happy? Warlike? I know of no human qualities I can assign to a compiler.



  • There are two appeals. First, it's easier to transform to and from mathematical equations, so you can have proof of correctness (for some value of proof). Second, if a function has no side effects, you can replace it with its return value. You can do this at compile time (a couple of the -O flags do this) or at run time ("memoization", a lookaside cache of return values).

    People with strong mathematics or theoretical computer science backgrounds enjoy working in this style; the benefit isn't in the numerical results, but in the mathematical rigor. People with business intelligence or applied computer science backgrounds prefer OO style because they're more comfortable with modeling the real world around them. People with managerial or legacy computer science backgrounds prefer procedural style because they're comfortable telling the computer exactly what to do, just like they would a human.


  • ♿ (Parody)

    @Onyx said:

    So in the end, the whole functional thing is trying to turn me into a compiler?

    No?

    @Onyx said:

    As in, I have to make its job easier?

    I'm not a big functional programming guy (paging @Captain / @antiquarian / @tarunik ), but not having lots of side effects can make things easier for you.

    @Onyx said:

    I see some valuable ideas in the paradigm, but the whole "purity" thing rubs me the wrong way.

    I agree. Like with most things, you can get carried away with it. But it is what it is. I'm just trying to help you understand what's going on there.

    @blakeyrat said:

    Why would anybody ever do that.

    Just to fuck with you. That's pretty much all I can come up with, too.

    @blakeyrat said:

    You can't just anthropomorphize shit.

    Not in a discussion with you.

    @blakeyrat said:

    You have to tell me what the compiler's like first.

    But I'm a sucker so I'll play along some more anyways.

    One of the things the compiler "wants"¹ to do is to optimize the code given as input to make it run efficiently. So it will analyze pieces of code to see if there is anything other than a straightforward compilation that it can do.

    Ideally, the compiler simply omits the code, because no code executes faster than not executing any code at all (and now some PD will tell me about some processor and conditions where that's false...we can deal with them later). For instance, if the input code calls a particular function but never actually relies on the return value of the function, the compiler can safely skip the call altogether and the program works just the same except faster. Success!

    But this is not always something that can happen. Some functions do more than simply do some computing and return a value. They might change some bit of data elsewhere in the system or output something. If the compiler skips the call now, the program will not function identically to the way it did when it still called the function. If it omits the call, we would say that the compiler has introduced a bug, or has a bug itself.

    That's a glimpse into the perspective of the compiler and a function call.

    ¹ Of course, it doesn't literally want to do this stuff², that's just a metaphor for how its programmers programmed it and the options its users supplied when they ran it.
    ² Then again, maybe it does. Should we have another 'What is consciousness?' flamewar?
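
    To put the two cases side by side (a sketch with made-up names; whether any particular compiler or JIT actually elides the first call is its own business, the point is only that it legally could):

    using System;

    static class DeadCallDemo
    {
        static int Pure(int n) => n * n; // only computes and returns

        static int WithSideEffect(int n)
        {
            Console.WriteLine("called!"); // observable effect besides the return value
            return n * n;
        }

        static void Main()
        {
            Pure(7);           // result unused, no side effects: safe to skip the call entirely
            WithSideEffect(7); // result unused, but skipping it would change what the program prints
        }
    }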

