Poll: Computer Programming - Art or science?



  • Why wouldn't magic be a craft, even if it's fictional? Every single story about magic shows it as such, with people going to school and being apprentices.



  • @boomzilla said:

    I think of coding as applied math.

    Onyx's post got me thinking: is there a difference between applied math and implemented math?

    When you make a support structure of wood, is that applied physics, or implemented physics?

    We're talking about the meaning of words now anyway (what's a craft? what's an art? what is programming?), so might as well take it there.


  • ♿ (Parody)

    @dhromed said:

    is there a difference between applied math and implemented math?

    I've never heard the phrase "implemented math," but it seems like it would be synonymous with applied math, to me. Or maybe it's an instance of it. So, the act of programming involves using applied math. The program itself is an example of implemented math.


  • 🚽 Regular

    @dhromed said:

    Why wouldn't magic be a craft, even if it's fictional? Every single story about magic shows it as such, with people going to school and being apprentices.
    I wasn't saying fictional magic isn't a craft either.

    My post starts with "No", but that was because I initially interpreted your post as putting words in my mouth. That's unhygienic*.

    *Thank you, working spellchecker.

    Edit: Ugh, let me clarify. You said:

    @dhromed said:

    So you're saying magic is a craft as well.
    And in my follow-up post I meant to say "No, that's not what I was saying, I was saying something else (but I don't disagree that magic is a craft, in particular real-life stage magic, but also fictional magic [except in cases where magic is portrayed as something innate])".



  • Why would innate magic not be a craft? You still have to learn (creative) use and control. I have the innate ability to think and use my hands to type things, but programming is still a craft.


  • 🚽 Regular

    Now we're discussing the meaning of the word "innate"!

    I meant to refer to something which you're born with and don't have to learn or practice. What would you call that?


  • BINNED

    @boomzilla said:

    That said, I think of coding as applied math. People who think math is numbers won't understand that.

    When I started college in the 80s, that was the first year Computer Science was available there as a major. Before that, all of the courses were part of the Applied Math major.



  • @Zecc said:

    I meant to refer to something which you're born with and don't have to learn or practice. What would you call that?

    Innate!

    My point still stands, obviously.


  • Considered Harmful

    @dhromed said:

    My point still stands, obviously.

    I think your point is bending over and showing its bare ass.



  • @dkf said:

    @Captain said:

    Most programming languages don't have the tools to approximate mathematical notation

    Most of them do, but don't use it because we still tend not to use editors that support fancy notation reliably. Instead, programmers use multi-glyph tokens much more extensively.

    No, they really don't. There is a lot more to notation than the symbols that get used. How would you define a binary tree in C? Here's how you do it in Haskell:

    data Tree a = Leaf a
                | Branch { left :: Tree a, right :: Tree a }
    

    The data type's structure as an initial F-algebra is immediately apparent. You can do recursion over this thing with absolutely zero syntactic overhead. In fact, it is so simple that we can mechanically derive recursion schemes for the tree.
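
    For instance, here's a minimal sketch of two hand-rolled recursions over this tree (sumLeaves and depth are just illustrative names, not from any library):

    -- One equation per constructor; the compiler warns on missed cases.
    sumLeaves :: Num a => Tree a -> a
    sumLeaves (Leaf a)     = a
    sumLeaves (Branch l r) = sumLeaves l + sumLeaves r

    depth :: Tree a -> Int
    depth (Leaf _)     = 1
    depth (Branch l r) = 1 + max (depth l) (depth r)
    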

    The functor, when defined explicitly (though you can ask your compiler to do it for you), is:

    instance Functor Tree where
      fmap f (Leaf a) = Leaf (f a)
      fmap f (Branch a b) = Branch (fmap f a) (fmap f b)
    

    How would you define a tree in C? You'd need to use pointers. And you wouldn't be able to keep the "cases" together. And you'd have to manually make sure you don't blow the stack. Most languages make the programmer do the bookkeeping, when the bookkeeping is utterly mechanical and should have been abstracted away in the first place. And if you don't use recursion, you need to do other "crazy" things to unroll loops and keep track of depth and other nonsense. The problem isn't wanting to express a simple data type in a clear way.

    Now, consider that the "factory pattern" from OO programming is the functor pattern in different notation. (Notice that the compiler can define these for free, in sensible languages. Can your language automatically make factories for you?) How would a "tree factory" look in C? Can it make any static guarantees?
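
    To sketch the "compiler can define these for free" point, assuming GHC: the DeriveFunctor extension generates exactly the fmap written out above.

    {-# LANGUAGE DeriveFunctor #-}

    data Tree a = Leaf a
                | Branch { left :: Tree a, right :: Tree a }
                deriving Functor
    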

    @dkf said:

    Mathematics tends to suffer from being totally impenetrable to someone not in that exact part of the discipline because of the use of symbols to mean exactly what a small community wants at that time.

    And yet there is a simple and extremely flexible vocabulary that all mathematicians use. That vocabulary is enough to provide all the context you need when you're reading a proof or a program.

    What you're saying here is pretty much wrong. It's not the symbols that cause confusion. It's that the relationships the symbols embody define complex ideas that can push the limits of human ability to understand them. Fair enough, math can be hard. But mathematicians have learned through hard graft not to make the notation any harder than it has to be to express complex ideas.



  • @chubertdev said:

    It's useful, so it's not art.

    That level of sarcastic commenting will cause you not to grow in life.


  • Discourse touched me in a no-no place

    @Captain said:

    What you're saying here is pretty much wrong. It's not the symbols that cause confusion. It's that the relationships the symbols embody define complex ideas that can push the limits of human ability to understand them. Fair enough, math can be hard. But mathematicians have learned through hard graft not to make the notation any harder than it has to be to express complex ideas.

    Congratulations on missing my point. The problem isn't that symbols themselves cause confusion. The problem is that symbols need a consistent interpretation so that a system as stupid and bad at context as a computer can reliably understand exactly what they mean, yet mathematicians do not use symbols consistently.

    Oh, they use them consistently within a particular field. (Well, mostly.) But different fields use things in different ways. This matters a lot once you start trying to integrate between different fields.

    Most mathematicians that I've dealt with are also inclined to over-simplify. They might describe the general case, but they often omit the awkward (and not at all general) edge cases that don't contribute to proving the point that they are trying to make. This usually works well enough in mathematics, but it really doesn't with computing. Computers are stupid, so you have to prepare for all the damn edge cases.

    I tire of this thread. My final point is that while in theory there's no difference between theory and practice, in practice there is a difference. Math is all theory.



  • @dkf said:

    Computers are stupid, so you have to prepare for all the damn edge cases.

    That is a quote worth making a frame for.


  • Considered Harmful

    [image: sig.php signature]

  • What is sig.php?


  • Considered Harmful

    @Nagesh said:

    What is sig.php?

    It picks a user from a local database, scrapes the old forums for their user info, picks a quote and theme, and renders out an image.



  • Nice idea.





  • I refreshed to see if the sigs are cached or actually change. They do change, but I found this one particularly entertaining:

    @ben_lubar probably ought to see a shrink about that self-rape problem.


  • Considered Harmful

    @abarker said:

    I refreshed to see if the sigs are cached or actually change. They do change, but I found this one particularly entertaining:

    Yeah, he's putting in the ?admin=true query parameter, which triggers the "I suck" messages. This is from when Ronald was hotlinking my sig.


    Filed under: Clicking the image to enlarge actually reloads the image, so you can see a new quote without refreshing.


  • ♿ (Parody)

    @dkf said:

    But different fields use things in different ways. This matters a lot once you start trying to integrate between different fields.

    So you're saying mathematicians should be coding in C++.



  • @boomzilla said:

    So you're saying mathematicians should be coding in C++.

    Most programmers/developers I know shouldn't be coding in C/C++.



  • Most people I know have no formal training in programming, being mechanical engineers or civil (construction) engineers. However, they are able to string together code that gets by and passes test cases.

    They don't care about bit shifts or decimal floating-point issues with processors. They are able to write decent code that works and makes its way to production. That is important.



  • @Nagesh said:

    mechanical engineers or civil (construction) engineers … don't care about bit shifts or decimal floating-point issues with processors

    That is ab-so-fucking-lutely scary. Engineers of any kind should be WAY concerned with the inaccuracies of floating point math in programs.



  • @DrakeSmith said:

    Engineers of any kind should be WAY concerned with the inaccuracies of floating point math in programs.

    Aware of, certainly. Concerned about cumulative inaccuracies, yes. But for most practical engineering purposes, if the final answer is accurate to, say, a half-dozen significant digits, that is more precision than you can achieve in whatever you're trying to build, so it's quite sufficient.
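
    A minimal sketch of the distinction (using Haskell's Float and Double purely for illustration):

    import Data.List (foldl')

    -- Naively summing 0.1 ten million times: single precision drifts
    -- visibly, while double precision stays accurate to well beyond
    -- six significant digits.
    main :: IO ()
    main = do
      print (foldl' (+) 0 (replicate 10000000 (0.1 :: Float)))
      print (foldl' (+) 0 (replicate 10000000 (0.1 :: Double)))
    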



  • Plus the standard safety margins used in different engineering contexts tend to be pretty big (IIRC in civil it's 3× what's needed for most things, and 10× if you are containing pressure).



  • @HardwareGeek said:

    Aware of, certainly. Concerned about cumulative inaccuracies, yes. But for most practical engineering purposes, if the final answer is accurate to, say, a half-dozen significant digits, that is more precision than you can achieve in whatever you're trying to build, so it's quite sufficient.

    Unless they are, say, comparing the result of said math to a constant; then they need to know to compare floats using some epsilon figure.
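
    Something like this minimal sketch (the tolerance is an illustrative choice, not a universal one):

    -- Direct equality falls over: 0.1 + 0.2 == 0.3 is False in Double.
    approxEq :: Double -> Double -> Bool
    approxEq x y = abs (x - y) <= eps * max 1 (max (abs x) (abs y))
      where eps = 1e-9  -- pick a tolerance suited to the problem's scale

    main :: IO ()
    main = do
      print (0.1 + 0.2 == (0.3 :: Double))  -- False
      print (approxEq (0.1 + 0.2) 0.3)      -- True
    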



  • Just looking at the topic, the only proper answer to this question I can think of is "Yes."



  • @xacheron said:

    Just looking at the topic, the only proper answer to this question I can think of is "Yes."

    I am glad that you did not say FILE_NOT_FOUND!



  • This. It's both science and art, much like artistic carpentry, for example.


  • Discourse touched me in a no-no place

    @boomzilla said:

    So you're saying mathematicians should be coding in C++.

    I never said “should”. I'm saying that some do indeed code in C++ (and are in fact better equipped to do so than most programmers, as they at least try to make their types algebraic), and the rest do horrific things in/with/to Mathematica.

    Better than some solar physicists I've known. IDL is almost as terrible as MUMPS…


  • Discourse touched me in a no-no place

    @xacheron said:

    only proper answer to this question I can think of is "Yes."

    Or “No”…



  • @dkf said:

    Or “No”…

    or FILE_NOT_FOUND


  • ♿ (Parody)

    @dkf said:

    I never said “should”.

    It was implied by your mention of mathematicians being all about operator overloading.


    Filed Under: Dammit now you made me explain the joke



  • @boomzilla said:

    I generally think about it this way. There's a lot of science and a lot of art.

    That said, I think of coding as applied math. People who think math is numbers won't understand that.

    That lack of understanding is a testament to the shallowness of most education in mathematics. CS is indeed applied math, but applied discrete math. The art comes in when developing elegant software, in much the same way that it comes into play for elegant proofs of mathematical theorems.

    In short, programs and proofs are the same type of object, modulo all things IO (sketched below). While I'm not a Haskell guru of @Captain's caliber, I can at least appreciate this argument. I'll go as far as saying that this is what is lost on the code-monkeys, and goes unappreciated by systems purists and embedded engineers as well.

    That being said, there is also an engineering discipline lurking under the covers, albeit one that has very different cost constraints than more physically grounded engineering disciplines.
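
    To make the programs-as-proofs point concrete, a minimal Curry-Howard sketch in Haskell: read each type as a proposition, and any total function of that type as its proof.

    -- "A and B implies A": the left projection is the proof.
    projLeft :: (a, b) -> a
    projLeft (a, _) = a

    -- "A implies (B implies A)": the K combinator is the proof.
    weaken :: a -> b -> a
    weaken a _ = a
    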



  • @Zecc said:

    That's unhygienic

    Well, if you use syntax-case instead of syntax-rules, you can abstract away the... sorry, I was channeling Kent Dybvig for a moment there.

