This site design made me say WTF



  • @blakeyrat said:

    I say that to prop-up my own ego and impress chicks.

    ...you always leave an impression on folks


    Flagged for "Thread derailed": suddenly we're talking about the War of 1812.

    At least I am.


  • ♿ (Parody)

    One time I went to see the Fireworks show on the National Mall. We were right in front of the Capitol. The Marine Corps Band was there playing the 1812 Overture, including actual cannons firing at appropriate times. They echoed off the Capitol. It was very cool and a little bit creepy.



  • You haven't lived until you've heard 1812 performed with actual artillery. I recommend it to everybody.

    ... did they seriously use cannons? Or modern rifled guns? Cannons would have been even more awesome.



  • @blakeyrat said:

    Given the choice between Python and Ruby, I'd rather work in Ruby. But I don't particularly dislike Python. It just has shitty tools. But so does Ruby. And Java. And GoLang, and Node, and etc.

    Out of curiosity, what do you think are the least shitty tools? And are we talking IDEs and the like or did you mean something else?



  • @kilroo said:

    Out of curiosity, what do you think are the least shitty tools? And are we talking IDEs and the like or did you mean something else?

    This was already answered in another thread (TL to dig it up). Pre-empting the Blakeyrant:

    1. Visual Studio full stop, but only for C#. (I think he'd go flying off the deep end if asked to hack SML.NET)
    2. IDEs, as he cannot imagine how people develop without them.


  • Well, I should have specified, but I was actually wondering (assuming he meant IDEs) whether he has ever bothered to identify a runner-up to Visual Studio.

    I'm guessing the "full stop" means you don't think he has.

    Personally, despite its warts, I find Eclipse to work pretty well for what I do...but that is admittedly partly because (1) I really like Mylyn/Tasktop and (2) the Eclipse compare editor was the first compare editor I ever used enough to understand it.



  • @blakeyrat said:

    You haven't lived until you've heard 1812 performed with actual artillery instead of an orchestra.

    FTFY



  • @kilroo said:

    Well, I should have specified, but I was actually wondering (assuming he meant IDEs) whether he has ever bothered to identify a runner-up to Visual Studio.

    I'm guessing the "full stop" means you don't think he has.

    He has not, from the Blakeyrants on the topic that I have read. He also somehow believes that learning anything 'beyond' C# is a waste of his time. Blub paradox much?

    @kilroo said:

    Personally, despite its warts, I find Eclipse to work pretty well for what I do...but that is admittedly partly because (1) I really like Mylyn/Tasktop and (2) the Eclipse compare editor was the first compare editor I ever used enough to understand it.

    Eclipse has plenty of warts. But I manage well enough to get along with Eclipse (C++, ColdFusion, Clojure) and VS (for C++) (at the same time!) as well as being able to operate without an IDE. (I had quite the shock during my Computer Engineering senior capstone when none of my teammates could figure out how to do command line builds using CMake + make.) I've also poked at IntelliJ IDEA (whilst poking at Scala).



  • @tarunik said:

    Visual Studio full stop, but only for C#. (I think he'd go flying off the deep end if asked to hack SML.NET)

    Where do you get this shit from?

    Last time I had to do Ruby work, I used IronRuby which plugs into VS and did a decently good job, just the debugger sucked-ass.

    @tarunik said:

    IDEs, as he cannot imagine how people develop without them.

    I can imagine exactly how people develop without them, I just think it's fucking stupid to do so.

    @tarunik said:

    He has not, from the Blakeyrants on the topic that I have read. He also somehow believes that learning anything 'beyond' C# is a waste of his time. Blub paradox much?

    What. The. Fuck.



  • Well, Microsoft dropped XNA, but the community picked it up and made Monogame, which has the added advantage of being cross-platform. I use it. It's pretty cool.



  • @trithne said:

    Wait, this is Azure? Maybe I've not been paying enough attention, but I didn't think that was what Azure was for.

    Filed Under: Still sore about dropping XNA

    Yes, sure it is. Azure is for cloud computing. Why shouldn't you offload serious number crunching to a scalable number crunching platform?

    That was what the presentation was all about.



  • Nah, you need to hear it at the Hatch Shell...



  • @Groaner said:

    I'll believe that when I see a physics-heavy game server (on the order of hundreds or thousands of simultaneous rigid bodies) written in it. Bonus points if it accomplishes this without depending on an unmanaged game engine.

    Working on it! (No, really, I am.)



  • I flinched when I saw that page. That is a crime against the interwebs.



  • @blakeyrat said:

    You haven't lived until you've heard 1812 performed with actual artillery.

    I've heard recordings, but that's not at all the same thing, of course. I have a vague memory of I think having heard it live at the Hollywood Bowl once, several decades ago. Maybe. If that memory is accurate and not just a figment of my imagination, it would probably have been in the late 70's or very early 80's.


  • Discourse touched me in a no-no place

    @Magus said:

    Microsoft dropped XNA

    I'd love to know what the hell they were thinking[1] and I'd love to know what the replacement is. I wanted to toy around with some 3D stuff and I don't know what to use in .Net now.

    [1] yes, I know, welcome to the party. I'm fashionably late.



    Like I said, what you now use is Monogame, which works under .NET perfectly. I'm using it for a Windows Store app. XNA can be ported to it in most cases with no effort, and all the tutorials apply. Of course, if you just want to learn 3D stuff, I'd suggest looking into the library Monogame uses for OpenGL, OpenTK. I had done very basic OpenGL in university in C++ and was pleasantly surprised to see a wrapper that was so much like the base library, yet so much easier to use (think GL. rather than gl_, and what that means to IntelliSense!).

    All the DirectX tutorials I tried started with "WELL FIRST YOU NEED TO LEARN HLSL AND WRITE A VERTEX SHADER SO THAT YOU CAN SEE POINTS YOU DRAW! DIRECTX IS USELESS UNLESS YOU KNOW EVERYTHING, SO WE'LL START WITH THE BASICS. BTW YOU WILL NOW USE C++.", whereas the OpenGL tutorials explain how to paint something in a solid color in one line, and very quickly move on. Not sure what's with that.



    Sorry if I stated your Blakeyworld views rather than the reality you have dealt with in the past.



  • @Arantor said:

    Microsoft were demoing such a thing at a game dev conference a couple of months ago. It was primarily a showcase for Azure where the physics modelling for a highly destructible terrain was being done on the server with graphics work being done on the client; on the order of 40,000 physics objects being modelled simultaneously. Watching the demo never waver from the 30 FPS benchmark was quite interesting (as compared to the fully local model doing the same thing, struggling to make even 2 FPS).

    This one?

    @One of the more sensible commenters said:

    This would mean that the data packets -- for all of the geometry calculations happening server-side -- would have to be transferred to the Xbox One, just as they're happening in the cloud. The system would have to ping the servers for every update of the physics happening on the server (for every chunk that falls or every particle that breaks).

    You would need a zero-latency connection for it to work the way it appears in the video, as any minor hiccup would cause a stutter in the render update for the physics.

    Having the cloud handle the heavy lifting and letting client machines be dumb terminals is certainly an interesting idea, and very much possible. I would feel uneasy about the architecture of such a solution, however. There are a lot of moving parts. Suddenly, you would have to worry about latencies between individual nodes, and have some form of reliable communication between them (since a small anomaly in a physics simulation can have butterfly-like effects). Memory accesses are nanoseconds. Pings are microseconds to milliseconds. Then there's all the synchronization and staying within the ~30ms frame budget.

    I only consider myself a novice* when it comes to netcode, but from what I've read and what I've seen, it's a pretty hard problem, and developers use all sorts of dirty cheating tricks to make the most of their bandwidth. Single-precision floats receive as lossy a compression as possible. TCP is still considered the Great Satan, even if it shortens development time. Eye candy effects (particles, ragdolls, etc.) usually exist only on the client.

    I'm going to be slightly more optimistic than the commenter above: one could let the server be completely authoritative and not run a physics simulation on the client machines at all, save for simple linear interpolation/prediction. That might work under ideal circumstances (and smooth out minor jitter), but as soon as you get a ping spike, you get some interesting phenomena like objects intersecting or tunneling through one another.

    @chubertdev said:

    This one narrow use case means that it's awesome!

    It's a narrow use case that's particularly important to those of us developing multiplayer games in our spare time that will be the Next Big Thing™ and allow us to get rich and retire from maintaining other people's WTF code so that we only have to maintain our own WTF code!

    Besides, he used an absolute. Absolutes are irresistible pedantic dickweed bait.

    @blakeyrat said:

    I have a buddy who made a slick demo with 3000 (IIRC) bodies using the new System.Threading.Tasks classes.

    Not too shabby. Which physics engine did s/he use? (I'm assuming s/he didn't write one from scratch, because that's a whole separate world of hurt...)

    @Matches said:

    Working on it! (No, really, I am.)

    I know, and you're using Unity, so you have both my sympathies and my best wishes.

    I'm going to stay in C++ land until I see a compelling upgrade path for my project to a superior platform that won't require vendor lock-in or a major rewrite.

    *Reverse Dunning-Kruger?
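
    The snapshot interpolation and lossy float compression mentioned above can be sketched in a few lines. This is a toy Python sketch, not anyone's actual netcode: the world bounds, the 16-bit quantization, and the snapshot timing are all invented for illustration.

```python
# Assumed world bounds for quantization -- invented for this example.
WORLD_MIN, WORLD_MAX = -1024.0, 1024.0

def quantize(x):
    """Lossily pack a coordinate into 16 bits (one of the 'dirty bandwidth tricks')."""
    t = (x - WORLD_MIN) / (WORLD_MAX - WORLD_MIN)
    return max(0, min(65535, round(t * 65535)))

def dequantize(q):
    """Recover an approximate coordinate on the client side."""
    return WORLD_MIN + (q / 65535) * (WORLD_MAX - WORLD_MIN)

def lerp(a, b, t):
    """Client-side linear interpolation between two server snapshots."""
    return a + (b - a) * t

# Server sends quantized positions at, say, 10 Hz; the client renders at 60 Hz
# and interpolates between the two most recent snapshots.
snap0 = dequantize(quantize(100.0))   # position from snapshot at t = 0.0 s
snap1 = dequantize(quantize(130.0))   # position from snapshot at t = 0.1 s

# A client frame falling 60% of the way between the two snapshots:
render_pos = lerp(snap0, snap1, 0.6)
```

    Note the failure mode described above: if the next snapshot is late (a ping spike), the client either freezes at the last snapshot or extrapolates past it, which is exactly where the intersecting/tunneling artifacts come from.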


  • Yes, that's the one. Fun conference.

    Oh yes, the cloud as physics farm is a work in progress and it has some serious issues to contend with but the fact that it's possible at all is an interesting development - and the economies of scale mean it's possible even at indie scale to try this stuff.



    Using Unity is a yes-and-no statement. My architecture essentially does IPC to move each game event immediately to a local game server, which is true .NET 4.5; objects are shared as JSON, the Unity client focuses entirely on rendering, and the local server manages sync events, network multiplayer stuff, and AI. (Dumbed-down explanation, but yeah.)

    Unity's Mono simply isn't advanced enough to handle the task. True async would be nightmarish, with features relegated to .NET 2.0 or 3.5. Not good.

    Things are changing, but my game back end is completely independent of the Unity front end, and porting to a new engine would only mean fixing up the UI, which is not too terribly complex.
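
    The front-end/back-end split described above can be sketched roughly as follows. Python stands in for the actual Unity/.NET code, and the event shape, field names, and socketpair transport are all invented for illustration; the point is just that the renderer ships JSON events over local IPC and the server process owns the authoritative state.

```python
import json
import socket

def encode_event(kind, **payload):
    """A game event as a JSON object -- the shared wire format between front end and back end."""
    return (json.dumps({"event": kind, **payload}) + "\n").encode("utf-8")

class LocalGameServer:
    """Stand-in for the local game server: owns authoritative state and applies events."""
    def __init__(self):
        self.state = {"players": {}}

    def apply(self, raw):
        msg = json.loads(raw)
        if msg["event"] == "move":
            self.state["players"][msg["player"]] = (msg["x"], msg["y"])
        return self.state

# The rendering client and the local server talk over a local socket (IPC).
client, server_sock = socket.socketpair()
client.sendall(encode_event("move", player="p1", x=3.0, y=4.0))
server = LocalGameServer()
state = server.apply(server_sock.recv(4096).decode("utf-8"))
```

    Because the renderer only ever sees JSON over a socket, swapping Unity for another engine leaves the server side untouched, which is the portability claim above.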



  • @Magus said:

    All the DirectX tutorials I tried started with "WELL FIRST YOU NEED TO LEARN HLSL AND WRITE A VERTEX SHADER SO THAT YOU CAN SEE POINTS YOU DRAW! DIRECTX IS USELESS UNLESS YOU KNOW EVERYTHING, SO WE'LL START WITH THE BASICS. BTW YOU WILL NOW USE C++.", whereas the OpenGL tutorials explain how to paint something in a solid color in one line, and very quickly move on. Not sure what's with that.

    DirectX's inscrutable nature led me to learn OpenGL.
    OpenGL's difficulty in doing anything more complex than rendering a few polygons led me to learn Ogre.

    @Arantor said:

    the economies of scale mean it's possible even at indie scale to try this stuff.

    If anything, I like that I can quickly and cheaply spin up a game server, or hundreds.

    @Matches said:

    my game back end is completely independent from unity front end, and porting to a new engine would only mean fixing up the ui which is not too terribly complex.

    That's a good position in which to be.


  • Discourse touched me in a no-no place

    @Groaner said:

    There are a lot of moving parts.

    That'd be the big worry. Latency and jitter would be the killers. However, they wouldn't be a particular problem within the cloud part; it's not that hard to set things up so that the resources for a particular user are physically colocated with a dedicated network (well, up to a certain size, above which everything gets hard). Right now, only expensive configs work that way (e.g., the top end of the Amazon allocations), but if it is your major use case, you can easily crank them out.

    No, it's the latency and jitter of the client that will really be a problem, and that's a problem that is hard for the game system makers to do anything about. [spoiler](Cue the “Blame the ISPs” thread…)[/spoiler]


  • Discourse touched me in a no-no place

    @Zemm said:

    At uni my "web publishing" unit requested CGI scripts written in C. Circa 2002. I did it in perl and still got full marks.

    Ah, but Perl is written in C. That perl script? It's just a minor configuration file…

    (I've heard of this argument being used for real in some companies.)



    @tufty linked this in another thread; I middle-click-opened it and kept reading Discourse.

    I was closing tabs, and this site popped up, and my visual cortex felt assaulted.


  • Discourse touched me in a no-no place

    @Magus said:

    Not sure what's with that.

    Sturgeon's law, obviously.


  • Grade A Premium Asshole

    @blakeyrat said:

    Last time I had to do Ruby work, I used IronRuby which plugs into VS and did a decently good job, just the debugger sucked-ass.

    A programmer of your caliber needs to use a debugger? My world is all askew now. Fuckall, I am sure of nothing now.

    @blakeyrat


  • Discourse touched me in a no-no place

    WTF. I replied in this topic, and now it's muted.



  • @Intercourse said:

    A programmer of your caliber needs to use a debugger?

    Using a debugger is what makes a programmer high-caliber.


  • Grade A Premium Asshole

    I thought compilers changed how they worked to fit your code, because obviously you are never wrong.



  • You're doing some kind of really shitty comedy bit here that I'm just not following, buddy.



  • I would have thought your dry wit would have picked up on it.

    Basically, you're now officially TDWTF's answer to Chuck Norris. You don't need a debugger, because you're never wrong and if you are wrong, the compiler will adjust the universe on your behalf.



  • Ok; and when does it get... funny?



  • @blakeyrat said:

    Ok; and when does it get... funny?

    What made you assume the humour was all for your benefit? Don't be selfish, let the rest of us have a laugh too.


  • Grade A Premium Asshole

    @Arantor said:

    What made you assume the humour was all for your benefit? Don't be selfish, let the rest of us have a laugh too.

    @blakeyrat can divide by zero.


  • Grade A Premium Asshole

    @Arantor said:

    What made you assume the humour was all for your benefit? Don't be selfish, let the rest of us have a laugh too.

    @blakeyrat can write a "Hello World" app in three bytes. He would do it in less, but he does not want to show off.


  • Grade A Premium Asshole

    @Arantor said:

    What made you assume the humour was all for your benefit? Don't be selfish, let the rest of us have a laugh too.

    @blakeyrat never includes System. That's cheating.


  • Grade A Premium Asshole

    @Arantor said:

    What made you assume the humour was all for your benefit? Don't be selfish, let the rest of us have a laugh too.

    @blakeyrat always removes the F1 key from any keyboard he uses. Computers ask him for help. Not the other way around.



  • @blakeyrat said:

    Using a debugger is what makes a programmer high-caliber.

    Couldn't agree more.

    I was surprised when I realized how many programmers are out there who have no idea how to use the tooling they have available. It's especially prevalent among the designers-turned-frontend-people I have encountered. A "senior" frontend guy using console.logs, not even noticing the Sources tab in Chrome.

    Really people, learn your fucking tools.



    Funny blakey facts are funny, but nothing beats the Jon Skeet facts page.

    Hilarious.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    Using a debugger is what makes a programmer high-caliber.

    That and a subsidised vending machine.



  • @cartman82 said:

    jon skeet facts page.

    Now with official sanction from the powers that be!


    Stack Overflow Is You
    11-24-08 by Jeff Atwood. 31 comments
    (...)
    Funny stuff. We do prefer that questions on Stack Overflow stay on the topic of programming, but as Joel and I have discussed before on the podcast, this is somewhat subjective, and it’s OK to err on the side of “fun” every now and then. Not all the time, mind you, but occasional peripherally related digressions that the community enjoys (and upvotes) are perfectly fine.


    locked by Robert Harvey Sep 22 '11 at 0:57
    This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. This question and its answers are frozen and cannot be changed. More info: help center.

    @cartman82 said:

    Hilarious.


  • ♿ (Parody)

    @blakeyrat said:

    ~~Using a debugger~~ The Second Amendment is what makes a programmer high-caliber.

    USATFY



  • @boomzilla said:

    @blakeyrat said:

    ~~Using a debugger~~ The Second Amendment is what makes a programmer high-caliber.

    Strictly speaking, it prohibits the USG from making laws restricting your right to be a high-caliber programmer.


    But what about the militia thing? Are open-source projects the programmers' militia?

    EDIT: Oh; forgot. Blakey's avatar went where?... Discurse.



  • @Intercourse said:

    @blakeyrat always removes the F1 key from any keyboard he uses. Computers ask him for help. Not the other way around.

    Exactly. He's TDWTF's answer to Chuck Norris.

    And I gotta admit, if the battle were Chuck Norris vs @blakeyrat, I know where my money would be.


  • BINNED

    @Arantor said:

    I know where my money would be.

    On JCVD ?



  • @Luhmann said:

    On JCVD ?

    Something like that.


  • Grade A Premium Asshole

    @Arantor said:

    And I gotta admit, if the battle were Chuck Norris vs @blakeyrat, I know where my money would be.

    My money would be on @boomzilla swooping in and kicking both of their asses. Chuck Norris does martial arts to make money. @boomzilla does it as a hobby. He thinks kicking people's asses is fun. While boomzilla is kicking ass, my money would be on @blakeyrat complaining and telling him how he is doing it wrong while contributing nothing positive to the situation.



  • Like I said, I knew where my money would be. I didn't say it would be on either of them. 😃


  • ♿ (Parody)

    @Intercourse said:

    Chuck Norris does martial arts to make money.

    Yes, but he's been practicing forever and I suspect he does it for the sake of it more than for the money.

    @Intercourse said:

    My money would be on @boomzilla swooping in and kicking both of their asses.

    Then I will have failed. 😢

