WTF Bites


  • Considered Harmful

    @topspin said in WTF Bites:

    But it also seems to extend to hiring people: “needs X years of experience in language Y.”
    Well, I’ve never used that, but how hard can it be?
    (It’s normally not the language but the library/tech stack that you need experience with.)

    If you're looking for senior devs, they also benefit a lot from industry-of-application experience. Heck, even devs in general - the industry tends to determine a lot of NFR priorities.


  • Discourse touched me in a no-no place

    @topspin said manager think in WTF Bites:

    I could tell the computer to do it myself

    If that's the case, they should get on and do it.

    As I said to my brother (very much a manager, albeit not of engineers) “learn to read the code, sure, good idea, but learning to write it well is a whole different thing. Picking the right thing from a vast range of wrong things is the hard part, so let the specialists do it.”



  • @topspin said in WTF Bites:

    (It’s normally not the language but the library/tech stack that you need experience with.)

    Largely this. The thing that takes time is learning the "proper" way of doing things in that language/environment. You need to learn what's available and how people use it typically.

    There's the old joke that you can write Fortran in any language. There's something to it. I can "write C" in many other languages, and I'd reckon that's enough to solve most problems that would get thrown at me. Doesn't mean that this results in good/idiomatic code, and people actually experienced in ${other language} will likely rightfully go :wtf:.
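    To make that concrete, here's a small, hedged illustration (Python, invented example, not from anyone's actual codebase): the same task written "as C" versus the way someone who lives in the language would write it. Both run and give the same answer; only one would pass review by someone actually experienced in it.

    ```python
    # "Writing C in Python": index-driven while loop with a manual accumulator.
    def sum_of_squares_c_style(values):
        total = 0
        i = 0
        while i < len(values):
            total = total + values[i] * values[i]
            i = i + 1
        return total

    # The idiomatic version a Python regular would expect to see.
    def sum_of_squares_idiomatic(values):
        return sum(v * v for v in values)

    if __name__ == "__main__":
        data = [1, 2, 3, 4]
        assert sum_of_squares_c_style(data) == sum_of_squares_idiomatic(data) == 30
        print("both agree:", sum_of_squares_idiomatic(data))
    ```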



  • @cvi said in WTF Bites:

    @topspin said in WTF Bites:

    (It’s normally not the language but the library/tech stack that you need experience with.)

    Largely this. The thing that takes time is learning the "proper" way of doing things in that language/environment. You need to learn what's available and how people use it typically.

    There's the old joke that you can write Fortran in any language. There's something to it. I can "write C" in many other languages, and I'd reckon that's enough to solve most problems that would get thrown at me. Doesn't mean that this results in good/idiomatic code, and people actually experienced in ${other language} will likely rightfully go :wtf:.

    What usually takes time for me is getting the business rules, both the official and the actual, and the data flows in the current implementation. Platforms have pretty similar themes, much like languages and once you've worked with a slew, getting into a new one is fairly easy.


  • Discourse touched me in a no-no place

    @Carnage said in WTF Bites:

    What usually takes time for me is getting the business rules, both the official and the actual, and the data flows in the current implementation. Platforms have pretty similar themes, much like languages and once you've worked with a slew, getting into a new one is fairly easy.

    The other thing that takes a lot of time is figuring out the data model. If you're lucky, you don't have to do that. It's often the data model that is the cause of the Build One To Throw Away effect, where the first prototype version of anything is a pile of shit that you use to learn what not to do.



  • @dkf said in WTF Bites:

    @Carnage said in WTF Bites:

    What usually takes time for me is getting the business rules, both the official and the actual, and the data flows in the current implementation. Platforms have pretty similar themes, much like languages and once you've worked with a slew, getting into a new one is fairly easy.

    The other thing that takes a lot of time is figuring out the data model. If you're lucky, you don't have to do that. It's often the data model that is the cause of the Build One To Throw Away effect, where the first prototype version of anything is a pile of shit that you use to learn what not to do.

    My current gig doesn't have a data model. It's an amorphous big ball of mud in a state of constant change.
    And if you ask the systems you are sending data to what they want and how they want it, they either don't reply at all or tell you to just send what you think they should have.
    Stabby feels.



  • @dkf said in WTF Bites:

    recommend coding for workflow/rules exactly because it is so much simpler than GUI-based block configuration stuff as you describe.

    The project I am currently working on revolves around an IDE for a language that is specified only as an XML-serialized AST, and in the IDE you edit it as a flow chart: drag-and-drop boxes, but typing is still required for naming and various constants.

    The approach has the benefit that pulling things from toolboxes makes it obvious what you can use, and is therefore a bit easier to start with, and seeing the structure is sometimes useful too. But typing would often be quicker, and there is no way to search in this, so when you need to modify a more complex script you created three years ago, you are in a bit of a fix.

    So we said that if we had time, or were starting over, we'd create some syntax for it and have two views: one showing the flow chart (with automatic layout; I think it uses that already) and another showing the text form, with both synchronizing to the underlying AST.

    I think that would be the reasonable approach for many similar domain-specific languages, including NiFi. And it shouldn't be too hard to do, either. The graphic editing, which is the harder part, is already there, and there has to be some underlying AST the engine works with. So it's just a matter of adding an alternate visualization as structured text.
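    As a rough sketch of what that text view could be (the XML element names here are invented for illustration, not taken from the actual language): parse the serialized AST and render it as indented, searchable text. A real version would also have to go the other way, from text back to the AST, and keep the flow chart in sync.

    ```python
    # Sketch: render a hypothetical XML-serialized AST as structured text.
    # Element names (<sequence>, <call>, <if>, ...) are invented for illustration.
    import xml.etree.ElementTree as ET

    SAMPLE = """
    <sequence name="checkBattery">
      <call target="readVoltage" result="v"/>
      <if condition="v &lt; 11.5">
        <then><call target="raiseWarning"/></then>
        <else><call target="logOk"/></else>
      </if>
    </sequence>
    """

    def render(node, indent=0):
        pad = "  " * indent
        if node.tag == "sequence":
            yield f"{pad}procedure {node.get('name')}:"
            for child in node:
                yield from render(child, indent + 1)
        elif node.tag == "call":
            result = node.get("result")
            prefix = f"{result} = " if result else ""
            yield f"{pad}{prefix}{node.get('target')}()"
        elif node.tag == "if":
            yield f"{pad}if {node.get('condition')}:"
            for branch in node:
                if branch.tag == "else":
                    yield f"{pad}else:"
                for child in branch:
                    yield from render(child, indent + 1)
        else:
            yield f"{pad}# unhandled node: {node.tag}"

    if __name__ == "__main__":
        print("\n".join(render(ET.fromstring(SAMPLE))))
    ```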


  • Discourse touched me in a no-no place

    @Bulb said in WTF Bites:

    The graphic editing, which is the harder part, is already there, and there has to be some underlying AST the engine works with.

    Now imagine how much more would have been done without the graphical editing, if the effort spent on that had gone on other things.

    I've written several of these graphical editing things. They're time sinks, and they don't help very much when it comes to programming. (There are domains where graphical editing is sensible.)



  • @topspin said in WTF Bites:

    I could tell the computer to do it myself, but I don’t understand why you need this crazy moon language to do it.

    A corollary of what I just said is that the crazy moon language and the crazy moon pictures you get with graphical programming are just as difficult to understand, because they are isomorphic after all.

    The only case where you can simplify things is when you can use a domain-specific language with just a few features. But the moment you add variables, conditions and loops, you have a general-purpose language, and you should have just written a library for an existing language instead.

    (Side note: in the project I work on there is an executor for the AST, but the part I work on compiles it into Lua with a special library instead, because the current direct executor wouldn't run on the target hardware.)
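    For the curious, that compile-to-Lua step is conceptually something like the sketch below (toy AST as plain dicts, and `rt` as a stand-in name for the support library; none of this is the project's real code):

    ```python
    # Rough sketch: walk a toy AST (plain dicts standing in for the XML form)
    # and emit Lua source. "rt" is a hypothetical runtime support library.
    def compile_node(node, indent=0):
        pad = "  " * indent
        kind = node["kind"]
        if kind == "seq":
            return "\n".join(compile_node(c, indent) for c in node["body"])
        if kind == "call":
            args = ", ".join(repr(a) for a in node.get("args", []))
            return f'{pad}rt.call("{node["target"]}", {{{args}}})'
        if kind == "if":
            return (f'{pad}if {node["cond"]} then\n'
                    + compile_node(node["then"], indent + 1)
                    + f'\n{pad}else\n'
                    + compile_node(node["else"], indent + 1)
                    + f'\n{pad}end')
        raise ValueError(f"unknown node kind: {kind}")

    ast = {
        "kind": "seq",
        "body": [
            {"kind": "call", "target": "readVoltage", "args": []},
            {"kind": "if", "cond": "v < 11.5",
             "then": {"kind": "call", "target": "raiseWarning", "args": []},
             "else": {"kind": "call", "target": "logOk", "args": []}},
        ],
    }

    if __name__ == "__main__":
        print(compile_node(ast))
    ```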


  • Banned

    @Zerosquare said in WTF Bites:

    I don't think the ones who actually develop those tools are naive enough to believe that learning the syntax is what makes programming difficult.

    I didn't think that's possible either but then I met Blakey "The Only Reason Programmers Exist As A Profession Is Extreme Elitism In Form Of Intentionally Difficult Tools" and Levicki "All We Have To Do Is Follow The Footsteps Of Inform And Make It Possible To Write Programs In Plain English".



  • @Gąska said in WTF Bites:

    I didn't think that's possible either but then I met Blakey "The Only Reason Programmers Exist As A Profession Is Extreme Elitism In Form Of Intentionally Difficult Tools" and Levicki "All We Have To Do Is Follow The Footsteps Of Inform And Make It Possible To Write Programs In Plain English".

    I had to program in Labview when I was studying. The keyword is "program". Yes, the only thing that you do is connect different boxes that do different stuff to pipe data between them. Well, then there are loops, of course, and conditions (which sets it apart from the way you create shader graphs in rendering). You could combine data into structs. You could have multiple concurrent loops (in fact, creating a consumer-producer system was quite easy, have to give them that).

    But ultimately, to use that system successfully, you had to learn the (visual) language and environment. It was an awful lot like programming overall. I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity, but I can see how someone might like it.
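    For comparison, the producer/consumer thing that a two-loop LabVIEW diagram expresses graphically is roughly this in textual form (a generic sketch, nothing LabVIEW-specific): two concurrent loops connected by a queue.

    ```python
    # Generic producer/consumer sketch: two concurrent loops connected by a queue.
    import queue
    import threading

    def producer(q, count):
        for i in range(count):
            q.put(i)          # pretend this is a measurement being acquired
        q.put(None)           # sentinel: tell the consumer we're done

    def consumer(q, results):
        while True:
            item = q.get()
            if item is None:
                break
            results.append(item * item)   # pretend this is processing/logging

    if __name__ == "__main__":
        q = queue.Queue()
        results = []
        threads = [
            threading.Thread(target=producer, args=(q, 5)),
            threading.Thread(target=consumer, args=(q, results)),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(results)   # [0, 1, 4, 9, 16]
    ```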


  • BINNED

    @Gąska said in WTF Bites:

    @Zerosquare said in WTF Bites:

    I don't think the ones who actually develop those tools are naive enough to believe that learning the syntax is what makes programming difficult.

    I didn't think that's possible either but then I met Blakey "The Only Reason Programmers Exist As A Profession Is Extreme Elitism In Form Of Intentionally Difficult Tools" and Levicki "All We Have To Do Is Follow The Footsteps Of Inform And Make It Possible To Write Programs In Plain English".

    Which is funny, because other than this superficial agreement, I bet neither would agree with the other on anything. Not even the time of day.



  • @cvi said in WTF Bites:

    I had to program in Labview when I was studying.
    (...)
    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    It does scale badly, by quickly degenerating into the visual equivalent of spaghetti code. Which actually looks a bit like spaghetti.


  • BINNED

    @Zerosquare said in WTF Bites:

    @cvi said in WTF Bites:

    I had to program in Labview when I was studying.
    (...)
    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    It does scale badly, by quickly degenerating into the visual equivalent of spaghetti code. Which actually looks a bit like spaghetti.

    Is that because you can’t modularize well or because people who program with it generally are the same people who would write 1000 line functions with gotos?


  • Banned

    @topspin said in WTF Bites:

    @Gąska said in WTF Bites:

    @Zerosquare said in WTF Bites:

    I don't think the ones who actually develop those tools are naive enough to believe that learning the syntax is what makes programming difficult.

    I didn't think that's possible either but then I met Blakey "The Only Reason Programmers Exist As A Profession Is Extreme Elitism In Form Of Intentionally Difficult Tools" and Levicki "All We Have To Do Is Follow The Footsteps Of Inform And Make It Possible To Write Programs In Plain English".

    Which is funny, because other than this superficial agreement, I bet neither would agree with the other on anything. Not even the time of day.

    And their hatred for me personally. Two superficial agreements. Programming for masses, hatred for me, and love for 90s Microsoft. THREE superficial agreements. I'll come in again...



  • @Zerosquare said in WTF Bites:

    It does scale badly, by quickly degenerating into the visual equivalent of spaghetti code. Which actually looks a bit like spaghetti.

    Yes, that was my experience as well. Never spent that much time with it, though, so I don't know if that was the typical experience or just my inexperience. I was already rather familiar with "normal" programming, so I definitely tried to force "normal" programming concepts and patterns on top of it (with varying degrees of success).

    However, on the plus side, de-tangling the line-spaghetti was great for procrastination.


  • Banned

    @topspin said in WTF Bites:

    @Zerosquare said in WTF Bites:

    @cvi said in WTF Bites:

    I had to program in Labview when I was studying.
    (...)
    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    It does scale badly, by quickly degenerating into the visual equivalent of spaghetti code. Which actually looks a bit like spaghetti.

    Is that because you can’t modularize well or because people who program with it generally are the same people who would write 1000 line functions with gotos?

    With visual programming, in addition to all the stuff you have to deal with in regular programming, you also have to maintain the visual part. All the functions, all the variables, all the data flows occupy physical space now. Also, you know that feeling when you know that MS Word has some particular function available but can't for the life of you find which submenu it's located in? Imagine every line of code you write being like that - having to click everything together instead of just typing it.



  • @topspin said in WTF Bites:

    @Zerosquare said in WTF Bites:

    @cvi said in WTF Bites:

    I had to program in Labview when I was studying.
    (...)
    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    It does scale badly, by quickly degenerating into the visual equivalent of spaghetti code. Which actually looks a bit like spaghetti.

    Is that because you can’t modularize well or because people who program with it generally are the same people who would write 1000 line functions with gotos?

    In the specific case I am familiar with:

    • You certainly can modularize; functions and modules are supported.
    • There is no goto, just conditions, loops and scoped(!) threads (the “parallel” construct takes n blocks to run in threads, but it joins them all before returning itself).

    but even then the flow chart simply becomes too large to see at a glance at roughly the statement count and cyclomatic complexity where a function starts becoming convoluted, or a bit earlier.
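    A rough textual sketch of what that scoped "parallel" construct's behaviour amounts to (going only by the description above, not the actual specification): run the given blocks on threads and join them all before returning, so no thread outlives the construct.

    ```python
    # Sketch of a scoped "parallel" construct: run the given blocks on threads
    # and join them all before returning, so no thread outlives the construct.
    from concurrent.futures import ThreadPoolExecutor

    def parallel(*blocks):
        with ThreadPoolExecutor(max_workers=len(blocks)) as pool:
            futures = [pool.submit(block) for block in blocks]
            # Leaving the with-block joins the pool; collecting results also
            # re-raises any exception thrown inside a block.
            return [f.result() for f in futures]

    if __name__ == "__main__":
        results = parallel(
            lambda: sum(range(1_000)),
            lambda: "block two done",
            lambda: [x * x for x in range(5)],
        )
        print(results)
    ```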


  • Discourse touched me in a no-no place

    @cvi said in WTF Bites:

    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    That's the biggest problem with graphical programming. Heck, scaling with complexity is plain old hard in the first place, and graphical programming is also hard. Combining two hard things doesn't usually make an easy thing, at least not in most professional contexts. (There are ways to do the equivalent of named subprograms, but making them work naturally is a bit tricky.)

    I remember using a graphical programming language where I stopped doing toy examples and instead cranked out something real with it and scared the language developers quite a bit. They'd not really considered what happened when you have 10k components on the screen at once. Yes, that was with subprogram components. It worked fine at controlling a large cluster doing a lot of complex work, but production-scaling the UI wasn't really ever going to happen. (I wrote the program by writing a program to manipulate the serialized model of the GUI directly so that I could generate everything using a semantic composition engine that the project had built. I'm far too lazy to point-and-click enough to do that correctly all by hand.)



  • @dkf said in WTF Bites:

    @cvi said in WTF Bites:

    I personally wasn't a fan of the visual style, as it seemed to scale quite badly with complexity

    That's the biggest problem with graphical programming. Heck, scaling with complexity is plain old hard in the first place, and graphical programming is also hard. Combining two hard things doesn't usually make an easy thing, at least not in most professional contexts. (There are ways to do the equivalent of named subprograms, but making them work naturally is a bit tricky.)

    I remember using a graphical programming language where I stopped doing toy examples and instead cranked out something real with it and scared the language developers quite a bit. They'd not really considered what happened when you have 10k components on the screen at once. Yes, that was with subprogram components. It worked fine at controlling a large cluster doing a lot of complex work, but production-scaling the UI wasn't really ever going to happen. (I wrote the program by writing a program to manipulate the serialized model of the GUI directly so that I could generate everything using a semantic composition engine that the project had built. I'm far too lazy to point-and-click enough to do that correctly all by hand.)

    It might be interesting to have the visual interface be VR. Still utterly impractical, but interesting.
    I think abstraction layers would have to be part of the visual approach, so you can zoom in and out as your needs require. That would possibly deal with the information overload you get when trying to understand something. On the other hand, proper abstraction is insanely hard to get right, so that just adds another hard thing to the pile.


  • Discourse touched me in a no-no place

    @Bulb said in WTF Bites:

    cyclomatic complexity

    That basically corresponds directly to visual complexity. The only way you can possibly tame it is by having some way to take a group of existing components and wrap it into a black box, a new component that you don't usually see inside of. Except now you need to give the thing a name (because otherwise you've got a pile of little boxes that all look identical) and a description and maybe its own icon and fuck it's all starting to get complicated again and you've got the same tasks as writing a function except there's even more of them.

    And you can't use recursion in visual programming. (OTOH, looping over streams/pipes becomes much easier. Indeed, you really shouldn't consider connections between things to be one-off…)
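    That last point, treating connections as streams rather than one-off calls, maps fairly directly onto generator pipelines in a textual language; a tiny sketch:

    ```python
    # Sketch: treat connections between components as streams rather than
    # one-off calls, i.e. each box consumes an upstream iterator and yields downstream.
    def source(count):
        yield from range(count)

    def scale(stream, factor):
        for item in stream:
            yield item * factor

    def only_even(stream):
        for item in stream:
            if item % 2 == 0:
                yield item

    if __name__ == "__main__":
        pipeline = only_even(scale(source(10), 3))
        print(list(pipeline))   # [0, 6, 12, 18, 24]
    ```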


  • Discourse touched me in a no-no place

    @Carnage said in WTF Bites:

    It might be interesting to have the visual interface be VR. Still utterly impractical, but interesting.

    You could spend a few man-years making that work, and probably only work badly. Or you could just plug your secret sauce into your favourite scripting language in a few hours and the job's done and will work well.



  • @dkf said in WTF Bites:

    You could spend a few man-years making that work, and probably only work badly. Or you could just plug your secret sauce into your favourite scripting language in a few hours and the job's done and will work well.

    But how would you milk VC for a decade goofing off then? 🤔


  • Discourse touched me in a no-no place

    @Carnage said in WTF Bites:

    @dkf said in WTF Bites:

    You could spend a few man-years making that work, and probably only work badly. Or you could just plug your secret sauce into your favourite scripting language in a few hours and the job's done and will work well.

    But how would you milk VC for a decade goofing off then? 🤔

    I prefer to milk Big Government, and as you get up to where the major money is, you end up having to have some actual results to show, and the reviewers tend to have some idea of what the state of the art actually is.


  • Considered Harmful

    @Gąska said in WTF Bites:

    @topspin said in WTF Bites:

    @Gąska said in WTF Bites:

    @Zerosquare said in WTF Bites:

    I don't think the ones who actually develop those tools are naive enough to believe that learning the syntax is what makes programming difficult.

    I didn't think that's possible either but then I met Blakey "The Only Reason Programmers Exist As A Profession Is Extreme Elitism In Form Of Intentionally Difficult Tools" and Levicki "All We Have To Do Is Follow The Footsteps Of Inform And Make It Possible To Write Programs In Plain English".

    Which is funny, because other than this superficial agreement, I bet neither would agree with the other on anything. Not even the time of day.

    And their hatred for me personally. Two superficial agreements. Programming for masses, hatred for me, and love for 90s Microsoft. THREE superficial agreements. I'll come in again...

    I doubt they so much hated you as held you in contempt, tbf. Hate demands a degree of intensity of feeling; this you seem in no danger of inspiring.


  • Considered Harmful

    @dkf said in WTF Bites:

    @Carnage said in WTF Bites:

    @dkf said in WTF Bites:

    You could spend a few man-years making that work, and probably only work badly. Or you could just plug your secret sauce into your favourite scripting language in a few hours and the job's done and will work well.

    But how would you milk VC for a decade goofing off then? 🤔

    I prefer to milk Big Government, and as you get up to where the major money is, you end up having to have some actual results to show, and the reviewers tend to have some idea of what the state of the art actually is.

    Milking The Civil Bureaucradome works similarly, once the market cap is enough to have to weather market forces.


  • 🚽 Regular

    @dkf said in WTF Bites:

    And you can't use recursion in visual programming.

    Why not?


  • Considered Harmful

    @Zecc said in WTF Bites:

    @dkf said in WTF Bites:

    And you can't use recursion in visual programming.

    Why not?

    I did it in Authorware... I had ickle object-like flowcharts that were even reentrant iirc



  • @Carnage said in WTF Bites:

    It might be interesting to have the visual interface be VR. Still utterly impractical, but interesting.

    Do it for something that has a limited scope. E.g. home automation with s/VR/AR. Logic (in the average case) is much simpler, and drawing lines/"wires" between devices in AR makes a bit more sense (e.g., you connect that smart-IoS-switch to that smart-IoS-bulb).

    As for your original idea, either it's been done to some degree, or there are some HCI nerds who would get rather excited about that. (And some that would hate every aspect of it, to be fair.)



  • @dkf said in WTF Bites:

    @Bulb said in WTF Bites:

    cyclomatic complexity

    That basically corresponds directly to visual complexity. The only way you can possibly tame it is by having some way to take a group of existing components and wrap it into a black box, a new component that you don't usually see inside of. Except now you need to give the thing a name (because otherwise you've got a pile of little boxes that all look identical) and a description and maybe its own icon and fuck it's all starting to get complicated again and you've got the same tasks as writing a function except there's even more of them.

    In my case in particular you have to give names even to the boxes that contain the primitive functions, so… most authors end up with a bunch of macro 1, macro 2, macro 3, action 1, action 2, etc.

    And you can't use recursion in visual programming. (OTOH, looping over streams/pipes becomes much easier. Indeed, you really shouldn't consider connections between things to be one-off…)

    Well, this particular case has functions, so you can do recursion in it. Of course the visual diagram does fuck all for understanding it.


  • Considered Harmful

    @Bulb said in WTF Bites:

    Well, this particular case has functions, so you can do recursion

    You also need at least one of scoping or an explicit call stack.



  • @Gribnit Yes, the functions create scopes like in any other (sane, modern) programming language. It is really a normal structured programming language, except it is defined only as an XML serialization of its AST instead of any specific syntax. It's a miracle the industry could mostly agree on standardizing at least that; a syntax would be too bikesheddy to ever get standardized.
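    To make the scoping-and-recursion point concrete, here's a toy interpreter sketch (plain Python dicts standing in for the serialized AST; the node shapes are invented and have nothing to do with the actual ISO 13209 schema). Each call binds its arguments in a fresh environment, which is exactly what lets recursion work without any extra machinery.

    ```python
    # Toy interpreter sketch: each call gets a fresh environment (scope),
    # which is what makes recursion "just work". Node shapes are invented.
    def evaluate(node, env, funcs):
        kind = node["kind"]
        if kind == "const":
            return node["value"]
        if kind == "var":
            return env[node["name"]]
        if kind == "binop":
            ops = {"*": lambda a, b: a * b,
                   "-": lambda a, b: a - b,
                   "<=": lambda a, b: a <= b}
            left = evaluate(node["left"], env, funcs)
            right = evaluate(node["right"], env, funcs)
            return ops[node["op"]](left, right)
        if kind == "if":
            branch = "then" if evaluate(node["cond"], env, funcs) else "else"
            return evaluate(node[branch], env, funcs)
        if kind == "call":
            func = funcs[node["name"]]
            # Fresh environment per call: the "functions create scopes" part.
            new_env = {p: evaluate(a, env, funcs)
                       for p, a in zip(func["params"], node["args"])}
            return evaluate(func["body"], new_env, funcs)
        raise ValueError(f"unknown node kind: {kind}")

    # factorial(n) = if n <= 1 then 1 else n * factorial(n - 1)
    FUNCS = {
        "factorial": {
            "params": ["n"],
            "body": {
                "kind": "if",
                "cond": {"kind": "binop", "op": "<=",
                         "left": {"kind": "var", "name": "n"},
                         "right": {"kind": "const", "value": 1}},
                "then": {"kind": "const", "value": 1},
                "else": {"kind": "binop", "op": "*",
                         "left": {"kind": "var", "name": "n"},
                         "right": {"kind": "call", "name": "factorial",
                                   "args": [{"kind": "binop", "op": "-",
                                             "left": {"kind": "var", "name": "n"},
                                             "right": {"kind": "const", "value": 1}}]}},
            },
        }
    }

    if __name__ == "__main__":
        call = {"kind": "call", "name": "factorial",
                "args": [{"kind": "const", "value": 5}]}
        print(evaluate(call, {}, FUNCS))   # 120
    ```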


  • Considered Harmful

    @Bulb said in WTF Bites:

    @Gribnit Yes, the functions create scopes like in any other (sane, modern) programming language. It is really a normal structured programming language, except it is defined only as an XML serialization of its AST instead of any specific syntax. It's a miracle the industry could mostly agree on standardizing at least that; a syntax would be too bikesheddy to ever get standardized.

    Ugh, fucking BPMN? I might have to read that soon.



  • @Gribnit No, ISO 13209. But there are a lot of similar ones.


  • Discourse touched me in a no-no place

    @Gribnit said in WTF Bites:

    Ugh, fucking BPMN? I might have to read that soon.

    Kiss goodbye to your remaining sanity.



  • @cvi said in WTF Bites:

    Do it for something that has a limited scope. E.g. home automation with s/VR/AR. Logic (in the average case) is much simpler, and drawing lines/"wires" between devices in AR makes a bit more sense (e.g., you connect that smart-IoS-switch to that smart-IoS-bulb).

    That would actually be quite nifty. But useless to most people: you don't change your IoS stuff every two days (well, actually you do, because 1) IoS is IoS so it breaks every two days and 2) if you're deep enough into IoS stuff to need something like this, you've drunk so much kool-aid that you're likely buying new IoS stuff every two days...).

    It would probably work as some sort of specialised installer tool. Like someone coming to your home to set up a whole raft of IoS stuff at once, and using such an AR interface to quickly set it up (possibly showing you the AR stuff on a second device while they do the work).

    But that's all sci-fi. I don't think there is such a need for IoS that there would be an actual market for it, and IoS is such... S that there is no hope of ever being able to manage all the stuff from a single interface.


  • Considered Harmful

    @dkf said in WTF Bites:

    @Gribnit said in WTF Bites:

    Ugh, fucking BPMN? I might have to read that soon.

    Kiss goodbye to your remaining sanity.

    It was more an exchange of blows, iirc



  • I don't have high expectations when it comes to MS Office. Even when it works, it's kinda shit. However, at the moment, it manages to not only be shit but also totally shit itself:

    [screenshot: f26ee64d-4185-4050-b56b-9dfc53e7aa2d-image.png]

    For the record, pressing Alt-Shift-A did nothing.


  • Considered Harmful

    @cvi said in WTF Bites:

    I don't have high expectations when it comes to MS Office. Even when it works, it's kinda shit. However, at the moment, it manages to not only be shit but also totally shit itself:

    [screenshot: f26ee64d-4185-4050-b56b-9dfc53e7aa2d-image.png]

    For the record, pressing Alt-Shift-A did nothing.

    What's your issue? Everything looks sorted from here.



  • @remi said in WTF Bites:

    But that's all sci-fi. I don't think there is such a need for IoS that there would be an actual market for it, and IoS is such... S that there is no hope of ever being able to manage all the stuff from a single interface.

    There are some attempts at making things work together, e.g. this.

    But, I agree otherwise. Difficult to justify developing and making it work well enough for the limited use cases. Possibly for PR reasons or to have a "cool" campaign (didn't Ikea do something AR-ish where you could place virtual furniture in your home by looking "through" your phone?).



  • @cvi said in WTF Bites:

    (didn't Ikea do something AR-ish where you could place virtual furniture in your home by looking "through" your phone?).

    I don't know about that, but they have a relatively non-WTF-y room simulator (at least for the kitchen, which is what I used it for recently), which includes the ability to move around, so there is a working (more or less) 3D render in there. I guess it might not be too much work to make that into an AR thing? Find some lib to map what your camera sees to a 3D model and slap the simulation on top of that.

    In any case, that's still in the cool-for-PR domain. I don't think you'd really get a realistic-enough photo-render to really add value compared to their current tool. Or to get there you'd need too much work from the user to properly set the exact room dimensions and lighting conditions, meaning that for a random average user it would be more frustrating than helpful.

    (I guess the bottom line of that train of thought is that AR is probably much easier for small, clearly out-of-universe annotations, and much harder if trying to inject "real" objects?)


  • Considered Harmful

    @cvi said in WTF Bites:

    (didn't Ikea do something AR-ish where you could place virtual furniture in your home by looking "through" your phone?).

    I'm still using my virtual Snori.


  • I survived the hour long Uno hand

    @Zecc said in WTF Bites:

    @dkf said in WTF Bites:

    And you can't use recursion in visual programming.

    Why not?

    Then the rubes that write your check would see it’s turtles all the way down.


  • Considered Harmful

    @izzion said in WTF Bites:

    @Zecc said in WTF Bites:

    @dkf said in WTF Bites:

    And you can't use recursion in visual programming.

    Why not?

    Then the rubes that write your check would see it’s turtles all the way down.

    No, they'd get lost at the first turtle.


  • Notification Spam Recipient

    @dkf said in WTF Bites:

    (I wrote the program by writing a program to manipulate the serialized model of the GUI directly so that I could generate everything using a semantic composition engine that the project had built. I'm far too lazy to point-and-click enough to do that correctly all by hand.)

    Mad lad!


  • Notification Spam Recipient

    @Carnage said in WTF Bites:

    It might be interesting to have the visual interface be VR.

    I've off-and-on pondered how to implement that for Hypatia. Problem is, not sure how to make it useful in the context of physical space...


  • Notification Spam Recipient

    @dkf said in WTF Bites:

    Or you could just plug your secret sauce into your favourite scripting language in a few hours and the job's done and will work well.

    Current plan was to just shove Lua in and pop a message telling them to pull off the headset and edit it. 😆


  • Considered Harmful

    @Tsaukpaetra said in WTF Bites:

    @dkf said in WTF Bites:

    (I wrote the program by writing a program to manipulate the serialized model of the GUI directly so that I could generate everything using a semantic composition engine that the project had built. I'm far too lazy to point-and-click enough to do that correctly all by hand.)

    Mad lad!

    Mad is fucking around in a visual editor.


  • Considered Harmful

    @dkf said in WTF Bites:

    And you can't use recursion in visual programming.

    Eh, you can't express it sanely; other than that, what's stopping you? The runtime could magic up a call stack, maybe.


  • Banned

    My phone has failed me at its basic function: receiving phone calls. Phone calls from my sick sister, who needed an immediate ride home.

    It got positioned badly in my pocket and the power button got pressed, resulting in a reboot. After every reboot, it locks out the fingerprint sensor and requires you to use the PIN. TIL it also locks out incoming calls until you unlock the screen once.

    I want to be back in 2003 when cell phones just worked.

    Anyone know if all Android 10 phones are like that, or is it just mine?

