UNISON architecture astronauts



  • It's actually an interesting idea, but reading that about page reminded me SO MUCH of this:

    That's one sure tip-off to the fact that you're being assaulted by an Architecture Astronaut: the incredible amount of bombast; the heroic, utopian grandiloquence; the boastfulness; the complete lack of reality. And people buy it! The business press goes wild!



  • Perhaps 70% of developer time is spent dealing with parsing, serialization, and persistence. Values are encoded to and from JSON, to and from various binary formats, and to and from various persistent data stores… over and over again.

    Another 25% is spent on explicit networking. We don’t merely specify that a value must be sent from one node to another, we also specify how in exhaustive detail.

    Somewhere in between all this plumbing code is a tiny amount of interesting, pure computation, which takes up the remaining 5% of developer time. And there’s very little reuse of that 5% across applications, because every app is wrapped in a different 95% of cruft and the useful logic is often difficult to separate!

    I call bullshit. Which fits the bombast, etc.



  • @boomzilla said:

    I call bullshit. Which fits the bombast, etc.

    Next paragraph:

    These numbers are made up, of course, but if anything they are optimistic.

    They call bullshit on themselves.



  • Programs are edited in a (browser-based) semantic editor which guarantees programs are well-formed and typecheck by construction.

    nty



  • @blakeyrat said:

    And people buy it! The business press goes wild!

    Except nobody buys this guy's Kool-Aid.

    Look at the video. It's like a Click & Play wannabe with fewer options.



  • @cartman82 said:

    Next paragraph:

    I admit, I stopped reading at that crap.



  • A recent example illustrates this. Your typical architecture astronaut will take a fact like "Napster is a peer-to-peer service for downloading music" and ignore everything but the architecture, thinking it's interesting because it's peer to peer, completely missing the point that it's interesting because you can type the name of a song and listen to it right away.

    All they'll talk about is peer-to-peer this, that, and the other thing. Suddenly you have peer-to-peer conferences, peer-to-peer venture capital funds, and even peer-to-peer backlash with the imbecile business journalists dripping with glee as they copy each other's stories: "Peer To Peer: Dead!"

    It's a text from 2001 but it perfectly describes many things from the last 5 years.



  • @boomzilla said:

    Perhaps 70% of developer time is spent dealing with parsing, serialization, and persistence.

    Wow, he must be a really terrible programmer.



  • I'll confess something: in the past, I had my own dreams of inventing the architecture, some kind of magical paradigm that would revolutionize software forever, and fix all the problems with computers. Some kind of universal object format that could capture the structure of every file format, allowing for perfect interoperability with all systems, or some language that could capture what you want instead of just opaque instructions. I was always trying to generalize concepts, seeing if I could somehow unify filesystems and databases because they're all the same in principle, etc.

    But I've realized that all these ideas just don't work. You can improve any single part but you can't make something that just fixes everything. Except for a full AI that could understand what you really want it to do, but that's still out of the question.

    Unfortunately this guy seems stuck on the same path I was... and he's going to build some shitty product that fails miserably to accomplish anything other than getting people to laugh at him.



  • No Silver Bullet.

    Those who do not learn from the past are booger-filled morons. Also: doomed to repeat it.



  • @boomzilla said:

    Perhaps 70% of developer time is spent dealing with parsing, serialization, and persistence. Values are encoded to and from JSON, to and from various binary formats, and to and from various persistent data stores… over and over again.

    Two lines: DEFINE_KEYFIELD(m_nMaxLiveAliens, FIELD_INTEGER, "MaxLiveAliens"), and MaxLiveAliens(integer) : "Max Num. of live aliens" : 1 : "Maximum number of live aliens that can be present in the level from this spawner at any given time."

    @boomzilla said:

    Another 25% is spent on explicit networking. We don’t merely specify that a value must be sent from one node to another, we also specify how in exhaustive detail.

    Two lines: SendPropEHandle( SENDINFO( m_hAimTarget ) ), and CNetworkHandle( CBaseEntity, m_hAimTarget );
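    For what it's worth, the pattern those one-liners rely on (declare a field once, let the framework derive the serialization plumbing mechanically) can be sketched in a few lines of Python; the class and field names here are invented for illustration, and this is obviously nothing like Valve's actual implementation:

```python
import json
from dataclasses import dataclass, asdict

# A declarative analog of DEFINE_KEYFIELD: declare the fields once,
# and the framework derives (de)serialization for free.
@dataclass
class AlienSpawner:
    max_live_aliens: int = 1

spawner = AlienSpawner(max_live_aliens=5)
wire = json.dumps(asdict(spawner))           # serialize to JSON
restored = AlienSpawner(**json.loads(wire))  # deserialize back
assert restored == spawner
```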


  • Discourse touched me in a no-no place

    @anonymous234 said:

    It's a text from 2001 but it perfectly describes many things from the last 5 years.

    It's the Fartner hype curve all over again.



    Well, he sort of makes a point that a lot of time is spent on data modeling and handling.

    For example, I work with TV metadata, and you would expect this to be standard stuff. A movie has a title, duration, etc. The same with a series and an episode. There's an obvious model to work from.

    Well, I have yet to see a single project where two clients' systems use something that looks & behaves the same way. On any project, 30%-40% of the time is spent struggling with the data and the backend.

    For example, some APIs, when you request a series, return all seasons as IDs; others have a special endpoint; and others return all of a series' episodes with a "season" field.

    It's a joke, and so frustrating.



  • If you’ve read the About post, you might be thinking:
    • Whoever wrote that is completely nuts. (No argument there)
    • This sounds like a ten year research project that will be in a perpetual state of vaporware. (Turns out, NO)

    Fuck, this guy can read minds.

    The node is implemented (in Haskell),

    That... surprises me less than it should.

    More realistically, I’ll be working on Unison part-time, in between paid consulting work.

    So does this.

    Anyway, if you scrape away all the buzzword talk, it kinda-sorta sounds sensible. Not enough to make me a Unison Witness, but I'd be curious to see whether it takes off.



  • @anonymous234 said:

    some language that could capture what you want instead of just opaque instructions.

    Then I follow the link, and it all sounds sensible on the surface, but then I notice Charles Simonyi as the idea's author, and I understand it's not going to work.



    It's annoying, because semantic editors are definitely going to be better than plain-text editors, but: a) you don't need to invent a whole new programming language to do it; b) this guy's talking about not allowing people to enter invalid code, and I think I know programmers well enough to know how well that's gonna go down; c) he didn't seem to have much of an opinion about how to navigate in such an editor, and when I think of structured editors, the ability to zip from block to block, navigating the actual tree structure of the code, is definitely one of the killer features, so the fact that he's not waxing eloquent about that seems like a bad sign; d) regardless of all that other shit, though, can we just talk about how great it's gonna be when a stray token in one part of a file can't cause a syntax error in a completely different section?


  • Discourse touched me in a no-no place

    @Buddy said:

    It's annoying, because semantic editors are definitely going to be better than plain-text editors

    Theoretically. Practically, you run the danger of getting sucked into ever more complex reasoning trying to resolve every possible case ahead of time, and you still won't actually know whether the code is correct (unless you've got a full formal description of what it's supposed to be, which is a really rare thing). There will always be a higher-level problem than you can reason about, because the amount of reasoning about Turing-complete systems that is tractable is really quite limited.

    Just being able to prevent syntax errors might be nice I suppose, but it's really not a very valuable thing; they're usually the easiest things to find and fix…



    I'm with you on most of that; what I'm saying is that preventing bad code isn't the value that semantic editors are going to add. The real advantage is that by replacing punctuation with widgets, you reduce the amount of busy-work that programmers have to do, make advanced refactoring tasks more intuitive, allow significantly richer diffs to be stored and displayed, and increase the performance of editors, which means increasing the amount of help they can provide while still feeling responsive (think: the difference between ReSharper and vanilla VS).
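    The tree-navigation and refactoring point can be illustrated with Python's stock ast module: a rename expressed as a tree transformation touches only identifier nodes, so unlike a textual search-and-replace it can never produce a syntax error. A minimal sketch, with made-up function and variable names:

```python
import ast

source = "def area(r):\n    return 3.14159 * r * r\n"
tree = ast.parse(source)

# Rename a variable by rewriting identifier nodes in the tree,
# never touching the surrounding syntax.
class Rename(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "r":
            node.id = "radius"
        return node

    def visit_arg(self, node):
        if node.arg == "r":
            node.arg = "radius"
        return node

new_source = ast.unparse(Rename().visit(tree))
assert "radius" in new_source
```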



  • @dkf said:

    semantic editors

    True semantic editors (more specifically, the analysis engine for understanding the semantics) are incredibly hard. Work has even been done using IBM Watson technology, yet even that has fallen short (so far). That being said, if a reasonable approximation can be achieved, then the impact would be HUGE.



  • @anonymous234 said:

    I'll confess something: in the past, I had my own dreams of inventing the architecture....

    I had my own dream of creating an architecture, and twice came up with designs that had a chance (ok, less than 1%, but still a chance) of actually becoming an architecture.





  • In Unison, terms and types are uniquely identified by a nameless hash of their structure. References stored in the syntax tree are by hash, and human-readable names are separately stored metadata used only for display purposes by the editor. As in Nix, the value associated with a hash never changes. “Modifying” a term creates a new term, with a new hash.

    Content-addressable data is a very interesting idea with very interesting properties. Having thought about it on and off for many years, it seems to me that one of its most interesting properties is just how viciously difficult it is to work with.
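    To make that concrete, here's a toy Python sketch of hash-identified terms; the term encoding is invented for the example and is nothing like Unison's actual format. The value behind a hash never changes, renaming touches only metadata, and any edit to a term's structure yields a brand-new hash:

```python
import hashlib
import json

def term_hash(structure):
    """Hash a term by structure alone; canonical JSON keeps the
    hash independent of dict key ordering."""
    canonical = json.dumps(structure, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Terms reference other terms by hash, never by name.
zero = {"lit": 0}
zero_h = term_hash(zero)
succ = {"add": [{"ref": zero_h}, {"lit": 1}]}
succ_h = term_hash(succ)

# Human-readable names are display-only metadata kept off to the side.
names = {zero_h: "zero", succ_h: "succ"}

# "Modifying" a term creates a new term with a new hash...
succ_v2 = {"add": [{"ref": zero_h}, {"lit": 2}]}
assert term_hash(succ_v2) != succ_h

# ...while renaming changes metadata but never the hash.
names[succ_h] = "increment"
assert term_hash(succ) == succ_h
```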


  • Discourse touched me in a no-no place

    @TheCPUWizard said:

    True semantic editors (more specifically the analysis engine for understanding the semantics) are incredibly hard.

    And they're hard for about the same reason that the semantic web is hard. We can describe these things easily enough, but really understanding them is a major challenge, and getting anyone or anything else to understand them is at least an order of magnitude harder than that. The reasoners have severe limitations, and the language for describing the system is itself a nightmare because it has an open set of descriptive terms (it's as hard as programming, except you usually can't find any of the interface definitions; all you've got is a few notes scribbled on the back of a bar mat in crayon, or nothing much better than that).

    By the time we've realized the benefit from these things, we'll be wondering if it was really worth all that work. From what I've seen, the semantic web people are retreating to controlled vocabularies, which users (and myself) really can see the benefit of. I wonder how long before the semantic editor people return to an equivalent position just to get shit out there and used. (Or are they going to be the Smug Lisp Weenies For A New Generation?)


  • Winner of the 2016 Presidential Election

    @dkf said:

    (it's as hard as programming, except you usually can't find any of the interface definitions; all you've got is a few notes scribbled on the back of a bar mat in crayon, or nothing much better than that)

    If I ever saw an apt description of W3C specs...



  • @ben_lubar said:

    @boomzilla said:
    Perhaps 70% of developer time is spent dealing with parsing, serialization, and persistence. Values are encoded to and from JSON, to and from various binary formats, and to and from various persistent data stores… over and over again.

    Two lines: DEFINE_KEYFIELD(m_nMaxLiveAliens, FIELD_INTEGER, "MaxLiveAliens"), and MaxLiveAliens(integer) : "Max Num. of live aliens" : 1 : "Maximum number of live aliens that can be present in the level from this spawner at any given time."

    @boomzilla said:

    Another 25% is spent on explicit networking. We don’t merely specify that a value must be sent from one node to another, we also specify how in exhaustive detail.

    Two lines: SendPropEHandle( SENDINFO( m_hAimTarget ) ), and CNetworkHandle( CBaseEntity, m_hAimTarget );

    Of course, Valve has an entire framework set up to make SendProp/RecvProps work. Including having both client and server copies of each class.



  • @powerlord said:

    Of course, Valve has an entire framework set up to make SendProp/RecvProps work. Including having both client and server copies of each class.

    And that's exactly the point. Most people use these frameworks to do this sort of thing. His numbers, to me, imply that people are rolling their own. That's a lie. It's an ass pull for drama.



  • If you're not making a new game engine, you shouldn't be dealing with the low level stuff at all.

    If you are making a new game engine, you'd better have a pretty good reason for it.


  • Discourse touched me in a no-no place

    @ben_lubar said:

    If you are making a new game engine, you'd better have a pretty good reason for it.

    Would “Because I can and I want to” be a good enough reason?

    What you say is what I'd say to anyone saying that they want to make a new programming language, and what I reply with is one of the very few truly acceptable reasons for going ahead anyway. :smile: It's easy to write an esoteric language, but a full multi-platform production-grade language is really astoundingly hard, especially if you're also setting out to do a good, broad standard library to go with it. If you're going to take on a task of that magnitude, you'd better both want to do it and feel driven to do it; you'll need both to see it through. (Or a battalion of grad students junior devs minions.)



  • @dkf said:

    Would “Because I can and I want to” be a good enough reason?

    I'd say it depends on your expectations. If you intend it as an interesting / learning experience, go for it.


  • Discourse touched me in a no-no place

    @boomzilla said:

    If you intend it as an interesting / learning experience, go for it.

    The hard part is the step from “oh, I learned by doing this” to “this is a useful product for other people to build businesses on top of”.



  • @ben_lubar said:

    If you are making a new game engine, you'd better have a pretty good reason for it.

    I was shocked 4A made a game engine for Metro: 2033 and Metro: Last Light.

    Think of how much better those games could have been if they'd just used Unreal or Crytek or something.

    That said, their engine is excellent. But there's nothing in their game that couldn't have been done just as well in Unreal 3.0.



  • You stoked about Unreal 4 integrating with Visual Studio?



  • I don't do game dev so I don't really have any opinion on it.

    I am glad to see people following-up on the promise of XNA, even if I'm annoyed it's not Microsoft themselves.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    But there's nothing in their game that couldn't have been done just as well in Unreal 3.0.

    The really stand-out feature of those games was their sound design, which was much better than most games achieve. I don't know if their use of a custom engine was responsible for that, or whether they could have solved it with a common engine platform.



  • I believe that's more an effect of having really strong talent on design and spending plenty of time to release a perfected game.



  • The monogame people have been following up on the promise of XNA for some time now. They even got it working on Windows Universal. I'm just glad to see good tools become more readily available, especially alongside a good engine.

    @dkf said:

    The really stand-out feature of those games was their sound design, which was much better than most games achieve.

    I haven't played them, but that alone sounds like a very good reason. Some games (DOTA for one!) think you can get away with saying you have good 'sound design' because your characters banter with each other, when a game like HoN beats them hands-down at even attack hit sounds.

    I wish more companies would put that much thought into it.



  • @Magus said:

    I haven't played them,

    The two Metro games are perhaps the best FPS games of the last 10 years. Definitely in the top 5.


  • Discourse touched me in a no-no place

    @Magus said:

    I haven't played them, but that alone sounds like a very good reason.

    The graphics are at the same sort of level that many games achieve now, though much thought appears to have been put into how things are interacted with. But the soundscape… that's something truly special. Eerie and claustrophobic, and downright scary at times, especially as you hear your character really struggling for breath as your air filter runs out while knowing you've not got time to change it because of the enemies you can hear closing in on your position…



  • I'm just not sure it's my genre. I've been quite enjoying Human Revolution, now that I'm finally bothering to play through it, and I intend to play that Wolfenstein game soon.

    I'm just not into horror.



  • @Magus said:

    and I intend to play that Wolfenstein game soon.

    Which one is "that" one? There's two out right now. What's with the pointless vagueness? Just type the damned name of the game.

    (Wolfenstein: The New Order is shockingly good, BTW. I haven't yet played The Old Blood.)

    @Magus said:

    I'm just not into horror.

    It's not really horror per se.

    No more so than, say, Marathon or System Shock 2 were horror games. Sure, it has a lot of scary parts and takes place in a really creepy setting, but the game itself isn't really horror. Not like The Evil Within is horror.



  • @blakeyrat said:

    Wolfenstein: The New Order

    That one. Couldn't remember the name. I might try the other too afterward.

    @blakeyrat said:

    Sure it had a lot of scary parts, and takes place in a really creepy setting

    Those are things I really like in some types of game, like Diablo. It may not make much sense, but that's how it is.



  • If you like the very concept of gaming at all, you owe it to yourself to play Metro: 2033 (and STALKER: Shadow of Chernobyl if you haven't.)



  • I probably will at some point, but I never prioritize things (except at work), so even I never know when I'll get around to something.



  • Question... if I were to purchase the Metro games, should I get the originals or the HD versions?



  • Is your graphics card under 10 years old? If not, you're an idiot. If so, why are you asking?



  • I haven't played the HD versions.


  • Winner of the 2016 Presidential Election

    @blakeyrat said:

    If you like the very concept of gaming at all, you owe it to yourself to play Metro: 2033

    I got pissed off at the fact that even though I was playing with keyboard and mouse, it detected my controller and gave me all the prompts based on that.

    I'm also not much of a fan of FPS games in general, so it didn't do much for me, but I could tell it was a quality product for people who like that sort of thing.



  • Are there already HD versions of those games?!
    TIL: http://steamcommunity.com/app/286690/discussions/0/34094415777794897/



  • @Jaloopa said:

    I got pissed off at the fact that even though I was playing with keyboard and mouse, it detected my controller and gave me all the prompts based on that.

    Seriously?

    If that's the most negative thing you can say about the game, it must be a pretty great game. Because how fucking petty can you get?

    "Oh noes!!! A game assumed I'd use the GAME CONTROLLER plugged into my computer to CONTROL the GAME!!! THIS IS NAZI STUFF!!!!!"


  • Winner of the 2016 Presidential Election

    Normally I use a controller over keyboard and mouse, but an FPS is one of the situations where the mouse is better. I got to a bit with something like "press X to jump", and it wasn't the X key. This immediately broke the flow for me, leading to trying a few keys before going into the options to look it up in the list of keybindings. Similar stuff with putting on the gas mask and other pretty fundamental operations.

    It's not hard to detect which input the player is using. The Saints Row games get this right: if you use a controller on the menu, it has button prompts; if you decide to use the mouse or arrow keys to navigate, it automatically changes the UI. Metro required me to unplug the controller before I started if I wanted it to correctly tell me how to control the game.

    This is all incidental though, Like I said, the main reason I didn't enjoy it, despite seeing that it was obviously a quality product, was that the genre doesn't appeal to me.

