Optifine modder rips the Minecraft devs for the code in the newest version.


  • Discourse touched me in a no-no place

    @Eldelshell said:

    No way man, she's already running a dedicated 1GB NVidia. It used to run quite fine until 1.8.

    Hell, that card moves Skyrim without any problem!

    ROFLOL--you say that like the comparison is meaningful. Just out of curiosity, which model?

    But do check the VBOs, and also make sure that if you try to downgrade you do it on a backup like I said, just in case they didn't put a conversion routine in. In case you didn't know, one of the 1.7->1.8 changes was moving from numeric block and entity ids to text ones (e.g., "minecraft:stone"), and at least early 1.7s didn't understand the new format, so blocks would get converted to different types and entities basically vanished: your inventory and that of all your chests went poof. They may have added code in later releases to deal with that, but I don't remember.
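
    For the curious, here's roughly what such a conversion routine has to do -- a minimal sketch with a made-up mapping table and class name, not Mojang's actual code:

    import java.util.HashMap;
    import java.util.Map;

    class BlockIdUpgrader {
        // Hypothetical slice of the 1.7 numeric-id -> 1.8 text-id table.
        private static final Map<Integer, String> LEGACY_IDS = new HashMap<>();
        static {
            LEGACY_IDS.put(1, "minecraft:stone");
            LEGACY_IDS.put(2, "minecraft:grass");
            LEGACY_IDS.put(3, "minecraft:dirt");
        }

        // An id the table doesn't know falls back to air -- which is
        // exactly the "blocks turn into something else and items go
        // poof" failure mode you get when a mapping is missing.
        static String upgrade(int legacyId) {
            return LEGACY_IDS.getOrDefault(legacyId, "minecraft:air");
        }
    }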



  • I don't know, I'm currently in Sweden and not able to access her box. But anyway, it was running at 60fps with MCA and other big mods until I downloaded 1.8.

    And why are Minecraft and Skyrim not comparable? They are both 3D games! Skyrim should be bringing that box to its knees, and far from that, it runs at 30-40 fps without any mod.


  • ♿ (Parody)

    @Eldelshell said:

    I don't know, I'm currently in Sweden and not able to access her box.

    Hubba hubba.


  • Grade A Premium Asshole

    @Eldelshell said:

    I don't know, I'm currently in Sweden and not able to access her box.

    https://www.youtube.com/watch?v=YWFW0B9EBx0


  • Banned

    @FrostCat said:

    one of the 1.7->1.8 changes was moving from numeric block and entity ids to text ones (e.g., "minecraft:stone")

    So much hate.


  • Discourse touched me in a no-no place

    @boomzilla said:

    Hubba hubba.

    You're talking about his daughter.


  • Discourse touched me in a no-no place

    @Eldelshell said:

    And why are Minecraft and Skyrim not comparable?

    One's well-made, and the other's not?

    Yeah, sorry, couldn't keep a straight face, they're both coded by idiots.



  • You sick bastards.


  • Discourse touched me in a no-no place

    @Gaska said:

    So much hate.

    What?


  • Discourse touched me in a no-no place

    @Eldelshell said:

    You sick bastards.

    I assume they didn't realize.


  • ♿ (Parody)

    @FrostCat said:

    You're talking about his daughter.

    @Eldelshell said:

    You sick bastards.

    Hey, @Eldelshell was the one talking about her box.

    @FrostCat said:

    I assume they didn't realize.

    I remembered, but the joke won out over propriety.


  • Discourse touched me in a no-no place

    @boomzilla said:

    I remembered, but the joke won out over propriety.

    As someone with a daughter I would actually skip that joke.



  • @Maciejasjmj said:

    Do you have a fetish for manually deallocating objects?

    It gets me so hot.

    @tarunik said:

    On the other hand, this attitude can get toxic at times, with folks swearing off proper abstractions in favor of spaghetti code and not having any trust at all in their compilers. It's much easier to optimize what your profiler is telling you is a hotspot when you can isolate that hotspot from the rest of your code!

    Hear, hear. Also, as much as I hate Java with the fire of 10,000 supernovas, the whole GC debate seems to have its tits akilter. GC exists to make development easier and programs less prone to segfaults and other major issues. It is not, and has never been, the silver bullet of having no memory management worries that it gets cast as. Plus, it's not especially relevant to Minecraft, which (if the Optifine dev is to be believed) ran fine until they introduced these wasteful throwaway objects. Even in Java, there must be a better way to do this than creating a couple hundred megs of immutables to be GCed every second, even if the devs aren't willing to juggle int triplets.
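
    To make that concrete, here's a hedged sketch of the two styles -- Pos and World are stand-ins of my own invention, not Minecraft's actual classes:

    // Allocation-heavy style: a fresh immutable object per block lookup.
    final class Pos {
        final int x, y, z;
        Pos(int x, int y, int z) { this.x = x; this.y = y; this.z = z; }
        Pos up() { return new Pos(x, y + 1, z); } // new object every call
    }

    class World {
        static final int SIZE = 64;
        final int[] blocks = new int[SIZE * SIZE * SIZE];

        // Pretty to read, but on a path that runs millions of times a
        // second it churns out garbage for the collector:
        int blockAt(Pos p) {
            return blocks[(p.y * SIZE + p.z) * SIZE + p.x];
        }

        // The int-triplet alternative: identical result, zero allocations.
        int blockAt(int x, int y, int z) {
            return blocks[(y * SIZE + z) * SIZE + x];
        }
    }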

    @FrostCat said:

    If you're making 50-200MB of objects a second only to throw them immediately away, to the point that you're GCing every 4 seconds and killing the framerate, you are, in fact, Doing It Wrong™.

    I don't have enough likes in the likebox to like this as well as it deserves.

    @Gaska said:

    No abstraction doesn't mean spaghetti code, or vice versa.

    It's not a logical implication, no, but it does tend to happen that way. Also, bear in mind that abstraction doesn't necessarily require OOP style, which is another thing that gets closely associated with spaghetti. Indeed, that seems to be the problem we're discussing: that the MC devs chose to bundle their troubles into small objects which then implode the GC every few seconds.

    @Gaska said:

    If disabling VBOs increases performance, I don't want to ever, ever, ever see their codebase. Ever.

    This probably has more to do with hardware than software; trying to use VBOs on underpowered integrated chips usually means using something akin to software emulated graphics, which ends about as well as you can imagine.

    @FrostCat said:

    I bet Java Hello World doesn't take only a couple dozen megs of memory.

    Because I just can't leave well enough alone:

    HEAP SUMMARY:
    in use at exit: 31,555,444 bytes in 1,155 blocks
    total heap usage: 4,063 allocs, 2,908 frees, 31,920,144 bytes allocated

    LEAK SUMMARY:
    definitely lost: 4,331 bytes in 30 blocks
    indirectly lost: 5,104 bytes in 23 blocks
    possibly lost: 173,909 bytes in 150 blocks
    still reachable: 31,372,100 bytes in 952 blocks
    suppressed: 0 bytes in 0 blocks

    From running valgrind on this code:

    class HelloWorldApp {
        public static void main(String[] args) {
            System.out.println("Hello World!");
        }
    }

    @FrostCat said:

    Snort. I'm kind of tempted to write a .asm version, or something in IL, just for comparison.

    Shockingly, node.js had these results:

    HEAP SUMMARY:
    in use at exit: 85,339 bytes in 61 blocks
    total heap usage: 2,848 allocs, 2,787 frees, 17,269,877 bytes allocated

    LEAK SUMMARY:
    definitely lost: 1,144 bytes in 45 blocks
    indirectly lost: 8,192 bytes in 1 blocks
    possibly lost: 856 bytes in 3 blocks
    still reachable: 75,147 bytes in 12 blocks
    suppressed: 0 bytes in 0 blocks

    I actually expected node to be much bulkier. Maybe I should create an object to print Hello World and compare that...

    @Gaska said:

    Still don't want to see their codebase. Mostly because of Java. I programmed in it for two days, and don't want anymore ever in my life.

    This is a common reaction to Java that I also share.

    @blakeyrat said:

    Most spaghetti code I've seen in the last 5-10 years has been specifically due to abstraction. Dozens of pointless levels of unnecessary abstraction which make it impossible to just grab a debugger and step through what the code is actually doing.

    This would be that needless OOP I mentioned earlier. The litmus test is usually whether an object has more than one private element that makes no sense as part of another object; 80% of the time the object is just fluff and can be cannibalized by another.

    This is one of the unforeseen side effects of C++: abstraction is so cheap, it lets those crazy abstraction junkies create inheritance trees that make the family trees of the European monarchs look simple. I'm always a little surprised that Java commonly has the same issue. (AbstractFactory, anyone?)

    @Gaska said:

    FrostCat:
    one of the 1.7->1.8 changes was moving from numeric block and entity ids to text ones (e.g., "minecraft:stone")

    So much hate.

    Seriously; I can understand representing those in an editor or in the console as text, but storing them that way?

    So, I notice no one has brought up MS's acquisition. Anyone care to speculate how long it'll take them to port MC to .NET?


  • Discourse touched me in a no-no place

    @VaelynPhi said:

    ran fine

    Ha! Maybe in comparison to the current version, but not in an absolute sense.


  • Discourse touched me in a no-no place

    @VaelynPhi said:

    Seriously; I can understand representing those in an editor or in the console as text, but storing them that way?

    Have you ever seen the NBT format spec? It's not necessarily bad, but it's braindead. I take that as a portent of the quality of the rest of the codebase.

    @VaelynPhi said:

    Anyone care to speculate how long it'll take them to port MC to .NET?

    I don't think they'll do it, when it's already been ported to 7 other consoles.



  • @blakeyrat said:

    Most spaghetti code I've seen in the last 5-10 years has been specifically due to abstraction. Dozens of pointless levels of unnecessary abstraction which make it impossible to just grab a debugger and step through what the code is actually doing.

    That's the opposite problem to what I'm referring to. Jenga code is its own can of worms...

    @VaelynPhi said:

    Hear, hear. Also, as much as I hate Java with the fire of 10,000 supernovas, the whole GC debate seems to have its tits akilter. GC exists to make development easier and programs less prone to segfaults and other major issues. It is not, and has never been, the silver bullet of having no memory management worries that it gets cast as.

    BINGO! People don't understand that just because they don't have to manage memory explicitly, they aren't absolved of the need to manage resources.

    @VaelynPhi said:

    It's not a logical implication, no, but it does tend to happen that way. Also, bear in mind that abstraction doesn't necessarily require OOP style, which is another thing that gets closely associated with spaghetti. Indeed, that seems to be the problem we're discussing: that the MC devs chose to bundle their troubles into small objects which then implode the GC every few seconds.

    This as well: when I say 'proper abstraction', I mean 'not having manually-written-out tree-traversal in the middle of your culling algorithm', not 'every last little thing in its own object'. As I said above, Jenga code (i.e. code that suffers from an excess of invented abstraction layers, and resembles a half-finished Jenga game as a result) is its own can of worms, and it can really kill performance as well, because you lose the ability to implement efficient algorithms to the excessive and wrongly placed boundaries introduced -- whether between functions, between threads/processes, or between machines. For reasonably large n, getting a better big-O beats micro-tweaking, every last time.
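
    A toy illustration of that last point, in this thread's lingua franca (the method names and scenario are mine):

    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    class Dedup {
        // Micro-tweaked O(n^2): however tight you make the inner loop,
        // it loses once n gets big.
        static int countDuplicatesSlow(List<String> items) {
            int dupes = 0;
            for (int i = 0; i < items.size(); i++)
                for (int j = i + 1; j < items.size(); j++)
                    if (items.get(i).equals(items.get(j))) { dupes++; break; }
            return dupes;
        }

        // O(n) with a hash set: a better big-O, no micro-tweaking needed.
        static int countDuplicatesFast(List<String> items) {
            Set<String> seen = new HashSet<>();
            int dupes = 0;
            for (String s : items)
                if (!seen.add(s)) dupes++;
            return dupes;
        }
    }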

    @VaelynPhi said:

    This would be that needless OOP I mentioned earlier. The litmus test is usually whether an object has more than one private element that makes no sense as part of another object; 80% of the time the object is just fluff and can be cannibalized by another.

    This is one of the unforeseen side effects of C++: abstraction is so cheap, it lets those crazy abstraction junkies create inheritance trees that make the family trees of the European monarchs look simple. I'm always a little surprised that Java commonly has the same issue. (AbstractFactory, anyone?)


    Well put. People split out objects that don't need to be split out, then abuse inheritance instead of using composition in a sane fashion, simply to satisfy some sort of 'object fetish'. TRWTF is OOP evangelism?



  • @tarunik said:

    TRWTF is OOP evangelism?

    I think it's overcompensation for an imagined problem. OOP as a tool can make software very well-structured. However, people are often taught programming from a dimwitted imperative perspective: you type in instructions and the computer follows them exactly like a recipe. Of course, even here the metaphor hints that this is too simplistic, since any good chef knows recipes contain recipes, and composing them is not as straightforward as dumping one set of steps into the other. Yet, we get CS grads whose approach to development is significantly less sophisticated than someone coming out of culinary school...

    So in step the OOP evangelists with what they imagine is the solution. Then, instead of reteaching those devs decent coding practices, they just introduce another tool, and the devs use that tool in just as blank a manner as they used the rest of them: like a shitty recipe.

    You get basically the same kind of nonsense from the functional evangelists, but with a different effect. I think this is because Lisp is difficult enough that it baptises in fire. The OOP standard, C++, isn't nearly hard enough. Actually, it occurs to me that the OOP standard might be Java... maybe that does explain things. Hrm.


  • Banned

    @FrostCat said:

    What?

    They went away from The Good Thing in order to replace it with The Bad Thing. Integers are fixed size, stored directly by value, can be used as array indices, and comparison is cheap. Strings are variable size, gated behind two dereferences, comparison is expensive, and they need non-trivial hashing to be usable as array indices.
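
    In Java terms that difference looks something like this (a sketch of the general point, not Minecraft's actual internals):

    import java.util.HashMap;
    import java.util.Map;

    class IdLookup {
        static final float[] HARDNESS_BY_ID = new float[4096];

        // Numeric id: a bounds check and one indexed load.
        static float hardness(int blockId) {
            return HARDNESS_BY_ID[blockId];
        }

        static final Map<String, Float> HARDNESS_BY_NAME = new HashMap<>();

        // Text id: hash the whole string, probe the table, compare
        // character by character on collision -- and the key itself is
        // a separate heap object reached through a reference.
        static float hardness(String blockId) {
            return HARDNESS_BY_NAME.getOrDefault(blockId, 0f);
        }
    }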

    @VaelynPhi said:

    This probably has more to do with hardware than software; trying to use VBOs on underpowered integrated chips usually means using something akin to software emulated graphics, which ends about as well as you can imagine.

    [citation needed]

    No, seriously, I can't find any source on that, and it seems like very useful and important information. Also, bear in mind that 99% of computers nowadays support DX10.1/OGL3.3.

    @tarunik said:

    This as well: when I say 'proper abstraction', I mean 'not having manually-written-out tree-traversal in the middle of your culling algorithm', not 'every last little thing in its own object'.

    That's a very poor definition of 'proper abstraction'. It doesn't really define anything. Reminds me of that xkcd: "What is Google+?" "Not Facebook".

    Abstraction, of any kind, means indirection. Indirection means a pointer dereference in one form or another. Pointer dereferences mean a risk of cache misses. Cache misses are dozens of CPU cycles lost, each.
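
    The textbook Java demonstration of that chain (my example): summing boxed Integers chases a reference per element, while the primitive array is one contiguous block.

    class Indirection {
        // Integer[]: each slot holds a reference to an object somewhere
        // on the heap -- one dereference, and one potential cache miss,
        // per element visited.
        static long sum(Integer[] values) {
            long total = 0;
            for (Integer v : values) total += v;
            return total;
        }

        // int[]: contiguous memory, no per-element dereference,
        // prefetcher-friendly.
        static long sum(int[] values) {
            long total = 0;
            for (int v : values) total += v;
            return total;
        }
    }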

    @VaelynPhi said:

    we get CS grads whose approach to development is significantly less sophisticated than someone coming out of culinary school...

    Cooking as a profession is over 5,000 years old. CS is one-hundredth of that.



  • @Gaska said:

    VaelynPhi:
    This probably has more to do with hardware than software; trying to use VBOs on underpowered integrated chips usually means using something akin to software emulated graphics, which ends about as well as you can imagine.

    [citation needed]

    No, seriously, I can't find any source on that, and it seems like very useful and important information. Also, bear in mind that 99% of computers nowadays support DX10.1/OGL3.3.

    It's less useful now than it used to be, actually. It's difficult for me to find a source on it too. These days, it's mostly trivia. It boils down to the fact that, on platforms without hardware support for VBOs, they were translated into immediate-mode calls. For small-time things this isn't such a big deal, but for really complex scenes it can be enough to crash the program. The iPhone had this issue back around v3 or so. Same with some integrated graphics chips from around the same generation.

    I doubt it's a common problem now, but it is the only explanation I could think of for why disabling VBOs could cause a jump in performance (presumably because code written purely in immediate-mode calls is optimized for it, whereas VBO code might not survive the translation so well).

    @Gaska said:

    Cooking as profession is over 5000 years old. CS is one hundredth of that.

    So is prostitution, but that doesn't mean anyone's made any major developments in it recently. :P



  • @Gaska said:

    That's a very poor definition of 'proper abstraction'. It doesn't really define anything. Reminds me of that xkcd: "What is Google+?" "Not Facebook".

    I was contrasting examples, not trying to approximate a rigorous definition there. What I am saying is that 'abstraction should be used when you have eminently reusable factors available to you' instead of the abstraction-for-abstraction's-sake brainworms that have infected the OO world.

    @Gaska said:

    Abstraction, of any kind, means indirection. Indirection means a pointer dereference in one form or another. Pointer dereferences mean a risk of cache misses. Cache misses are dozens of CPU cycles lost, each.

    So std::vector is still inferior to juggling C-style arrays and pointers directly, even when it's been shown (WARNING: SO link as the original thing the answer links to died) that industrial-grade compilers generate the same assembler for basic operations in both cases? Also keep in mind that std::vector has much more flexibility available (the C-style approach has no equivalent to .capacity() separate from .size()), and can match C-arrays easily in just about all aspects (WARNING: SO link again) when used properly?

    So: having consistent abstractions for things as important as basic data structures is a good idea. Remember that an abstract data type in C++ is syntactic sugar for a struct in C plus functions to manipulate it...
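
    And since this thread is half Java anyway, here's the same idea as a minimal sketch in that language -- a growable primitive buffer with the size/capacity split a bare array doesn't give you (my toy class, not anything from a library):

    import java.util.Arrays;

    class GrowableIntArray {
        private int[] data = new int[8]; // capacity: room before regrowing
        private int size = 0;            // size: elements actually stored

        void push(int value) {
            if (size == data.length)
                data = Arrays.copyOf(data, data.length * 2); // amortized O(1)
            data[size++] = value;
        }

        int get(int i) {
            if (i < 0 || i >= size) throw new IndexOutOfBoundsException();
            return data[i];
        }

        int size() { return size; }            // like std::vector::size()
        int capacity() { return data.length; } // like std::vector::capacity()
    }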


  • Discourse touched me in a no-no place

    @Gaska said:

    They went away from The Good Thing in order to replace it with The Bad Thing. Integers are fixed size, stored directly by value, can be used as array indices, and comparison is cheap. Strings are variable size, gated behind two dereferences, comparison is expensive, and they need non-trivial hashing to be usable as array indices.

    Ah, the true nature of the complaint. I figure in the grand scheme of things it's a minor WTF, and it "easily" solves the problem they were facing. I'm much more concerned with the more serious problems like runaway memory usage.


  • Discourse touched me in a no-no place

    @Gaska said:

    [citation needed]

    No, seriously, I can't find any source on that, and it seems like very useful and important information. Also, bear in mind that 99% of computers nowadays support DX10.1/OGL3.3.

    They were explicitly griping about having to continue to support OpenGL 1.x, so draw your own conclusion.

    My guess is buggy video cards, somehow, but I haven't heard any specifics. You know how forums are: "Help my minecraft don't work sincne i got neew Java".


  • Discourse touched me in a no-no place

    @VaelynPhi said:

    So is prostitution, but that doesn't mean anyone's made any major developments in it recently.

    The Internet, as in so many areas, has enabled disintermediation.


  • Banned

    @VaelynPhi said:

    The iPhone had this issue back around v3 or so. Same with some integrated graphics chips from around the same generation.

    Smartphones are a special case - they appeared around 2007, and their architecture is so different from PCs that crappy drivers and GPU designs were unavoidable. But over the years they matured, and they all use OpenGL ES now, which is very, very close to PC OpenGL, both in features and compatibility.

    @tarunik said:

    So std::vector is still inferior to juggling C-style arrays and pointers directly, even when it's been shown (WARNING: SO link as the original thing the answer links to died) that industrial-grade compilers generate the same assembler for basic operations in both cases?

    Okay, maybe I went too far with that "of any kind" of mine. Anyway - you're right, vector is as good as C-style arrays. Because essentially, it is a C-style array.

    @tarunik said:

    the C-style approach has no equivalent to .capacity() separate from .size()

    Not if you don't program it yourself. But it's definitely doable in C. Everything doable in C++ is doable in C, though sometimes with much more effort (in this particular case, with a minimal amount of extra effort, actually).

    @tarunik said:

    and can match C-arrays easily in just about all aspects (WARNING: SO link again) when used properly?

    Vector elements can't be allocated on the stack. But that's not a bad thing - it's just a situation where vector is the wrong choice for the task at hand, and one should use std::array.

    @FrostCat said:

    Ah, the true nature of the complaint. I figure in the grand scheme of things it's a minor WTF, and it "easily" solves the problem they were facing. I'm much more concerned with the more serious problems like runaway memory usage.

    You don't get that it's going to at least double the memory consumption of every instance of a brick in the whole infinite Minecraft world, do you?

    @FrostCat said:

    They were explicitly griping about having to continue to support OpenGL 1.x, so draw your own conclusion.

    They're braindead fucktards. No hardware that doesn't support at least OGL2 even has the 512MB of RAM needed to start their motherfucking game, let alone play it.



  • The quality of both recipes and programs are determined by one main metric: execution.


  • Discourse touched me in a no-no place

    @Gaska said:

    You don't get that it's going to at least double the memory consumption of every instance of a brick in the whole infinite Minecraft world, do you?

    I do. But given that the world will only have so many blocks in memory at any given time, it doesn't particularly bother me the way other things do. While I like the idea of not using more memory than you need, and I appreciate a clever way to reduce memory usage, I'm not the kind of purist that gets outraged simply over wasteful memory usage: I have a 4GB machine for a reason. The id type switch isn't what causes the frame rate to nosedive every few seconds; that's much more troubling to me.


  • Discourse touched me in a no-no place

    @Gaska said:

    No hardware that doesn't support at least OGL2 even has the 512MB of RAM needed to start their motherfucking game, let alone play it.

    Since that was a snipped-out quote presented without more context, it's remotely possible they were talking about the Android port. I dunno.

    Regardless, many people have a fetish involving not dropping ancient hardware, as I discovered when I proposed removing DOS-specific code (I mean DOS as distinct from Windows) from the Omega Roguelike source tree in approximately 2000.


  • Banned

    @FrostCat said:

    I do. But given that the world will only have so many blocks in memory at any given time, it doesn't particularly bother me the way other things do.

    Except it's not just visible blocks. It's any blocks. Inventories. Counters. Crafting formulas. Everything that has anything to do with any bricks.

    @FrostCat said:

    Regardless, many people have a fetish involving not dropping ancient hardware, as I discovered when I proposed removing DOS-specific code (I mean DOS as distinct from Windows) from the Omega Roguelike source tree in approximately 2000.

    Those people deserve their own circle in hell.


  • Discourse touched me in a no-no place

    @Gaska said:

    Except it's not just visible blocks. It's any blocks. Inventories. Counters. Crafting formulas. Everything that has anything to do with any bricks.

    Disk is cheap, too, man.

    I get that this bothers you, but it doesn't trip my outrage meter. I reserve that for other things, like the fact that there's no built-in minimap, or the world doesn't spawn enough diamond.


  • ♿ (Parody)

    @FrostCat said:

    As someone with a daughter I would actually skip that joke.

    As someone with a daughter I would have said something like "I can't access her computer."


  • Discourse touched me in a no-no place

    @boomzilla said:

    As someone with a daughter I would have said something like "I can't access her computer."

    Point.


  • Banned

    @FrostCat said:

    Disk is cheap, too, man

    Don't even get me started on how much it will affect things that AREN'T kept in RAM the whole time the game is running.


  • Discourse touched me in a no-no place

    @Gaska said:

    Don't even get me started on how much it will affect things that AREN'T kept in RAM the whole time the game is running.

    This must be what @intercourse felt like poking @blakeyrat.



  • I find it funny that this thread got mentioned on CodingConfessional over a similar situation involving wasted objects and memory.



  • @FrostCat said:

    If disabling VBOs increases performance, I don't want to ever, ever, ever see their codebase. Ever.

    I did say enable, just in case you misread me. I assume anyone for whom disabling them made it worse had some kind of defective display adapter.

    Was watching a video yesterday where two established Minecraft YouTubers with presumably good gaming PCs were bitching about the 1.8 lag. One told the other to turn VBOs off and everything was instantly better.

    https://www.youtube.com/watch?v=_japkkOaByA#t=5m50s

    (please don't judge me for my youtube tastes)


  • Discourse touched me in a no-no place

    @dookdook said:

    One told the other to turn VBOs off and everything was instantly better.

    There is definitely something wrong if that works! I just don't know what.



  • I have no idea what I just watched but those guys cracked me up. "take some diamonds!" "argl argl argl"



  • Only 4?


  • Discourse touched me in a no-no place

    Yeah, I didn't think I'd need more when I built it and I haven't gotten around to getting another 4.



  • I just realised.

    Minecraft is this generation's Crysis.


  • FoxDev

    i am in the process of designing my winter wrapup present to myself and its ability to handle minecraft is a major design constraint.


  • Discourse touched me in a no-no place

    @accalia said:

    i am in the process of designing my winter wrapup

    I know spring comes late to Maine, but isn't that a little excessive?


  • FoxDev

    well.... maybe. i'll be getting it in December so..... midwinter gala?


  • Banned

    Since this thread is already ponified... I watched Equestria Girls: Rainbow Rocks recently. It was awful. Worse than all season 3 episodes combined.



  • Pray we do not ponify it further?

    EDIT: Where's the damn cornify plugin already?


  • Fake News

    Scratch that, ask PJH to put a Halloween-themed unicorn in the top-right corner.


  • Discourse touched me in a no-no place

    @JBert said:

    Scratch that, ask PJH to put a Halloween-themed unicorn in the top-right corner.

    http://freequotesimages.com/wp-content/uploads/2014/10/Halloween-Unicorn-1.jpg

    http://www.puredelish.com/images/cache/665ece91edbc6e1b5ca3fb4eb2b650bd.jpg

    Also: Googling "Halloween Unicorn" is not necessarily for the faint of heart.


  • FoxDev

    hmmmmm..... bot idea?



  • Needs to be an actual plugin for bonus points.



  • @Gaska said:

    Worse than all season 3 episodes combined.

    I don't get everyone's hate for S3. Except for the two Spike episodes and the Applejack episode (and arguably the Babs episode), they were all solid. That's a better hit-to-miss percentage than any other season.

    Unless it's all just residual butthurt about the alicorn thing.

