Don't need no loops



  • stmd is "more correct" in this case, but "less correct" in the general case (which is "I fucked up my list numbering; Markdown, fix it! Fix it, Markdown!")
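    (For the curious, here's a minimal sketch of the difference being argued about, assuming the third-party Python packages commonmark (CommonMark-py, an stmd descendant) and markdown (Python-Markdown) are installed; the broken-numbered list is made up for illustration.)

    ```python
    # Minimal sketch: feed the same broken ordered list to a CommonMark
    # renderer and to a classic-Markdown renderer, then compare the HTML.
    # Assumes `pip install commonmark markdown` (both real PyPI packages).
    import commonmark
    import markdown

    broken_list = "1. first\n5. second\n3. third\n"

    # Per the CommonMark spec, only the first item's number matters: it sets
    # the <ol> start value; every later number is ignored and the items are
    # renumbered sequentially.
    print(commonmark.commonmark(broken_list))

    # Classic Markdown implementations also renumber, but each one decides
    # for itself what to do with the numbers you actually wrote; that is the
    # inconsistency being complained about here.
    print(markdown.markdown(broken_list))
    ```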



  • @TwelveBaud said:

    "I fucked up my list numbering; Markdown, fix it! Fix it, Markdown!"

    As I've stated before - if you break formatting, it's gonna be broken. It should, however, be consistently broken - so that you know what to avoid, and can figure out the rules which govern the system.

    Markdown, on the other hand...

    Filed under: case in point - this post



  • @cartman82 said:

    There's some deeper psychological mechanics at work here that I just don't understand.

    It's pretty easy for me to understand. Goes like this:

    Computers are completely fucking stupid; less intelligent than cockroaches. People who grew up alongside personal computers know this instinctively. People whose first exposure to computers has happened since about 1995, or whose brains have been warped by too much CS exposure, less so.

    The simple fact is that dealing with computers absolutely requires that users come to grips with a certain irreducible minimum of complexity. There is simply no way to make this untrue. Over the years there have been some designs that put that irreducible minimum briefly within the reach of the mostly untrained - the original Mac OS is notable in this regard - but then the real world re-intruded in the form of storage devices more capable than a 400Kbyte floppy disk, and the ubiquity of data formats made possible by those devices, and networks, and the expectation of apparently-typeset business correspondence with the advent of laser printing, and the genuine usefulness of being able to have the machine work on several things at once, and the Internet...

    There's a substantial cohort of programmers and designers who have looked at the Mac's design wins and extrapolated from them a belief in a general principle: that computers can and should be made both less intimidating and more useful, and that this can be done by hiding anything even vaguely complex from the end user and having the computer just get really good at guessing what it is that the user wants to achieve rather than allowing or requiring any such result to be expressed in detail.

    But for that belief to be correct, computers would need not only human-comparable AI (which has been ten years away for as long as I've been alive and looks set to remain so for the foreseeable future) but consummate social skills as well. Not gonna happen. Most people are obtuse enough to be irritating when you're trying to get them to do exactly what you want; what hope is a sub-cockroach CPU ever going to have?

    So what happens instead is that a bit more code gets added here, and a corner case gets handled better there, and on and on and so it goes. Of course this never really cuts to the heart of the problem because computers are completely fucking stupid, and after a while the added cruft (like customizable toolbars, and obscure options tucked away in odd corners of the UI because they need to go somewhere but don't seem a natural fit anywhere) becomes annoying enough that some bold Young Turk will run a New Broom right through an entire design and dumb it right back down to being even more completely fucking stupid than version 1.0 was.

    The latest manifestations of this kind of New Broom tendency are two fairly new memes: Everything Is A Phone (Touch Screens FTW), and the Skeuomorphism Is For Pussies school of flat UI design. When these combine synergistically, the result is designs that are both completely baffling for newbies and insanely irritating for those of us who actually needed attention paid to all those odd corner cases.

    Jeff Atwood sees himself as a bold Young Turk, and Discourse is his New Broom. Consequently it is even more completely fucking stupid than anything that came before it.

    The correct take-away message from the Mac OS experience is not that dumbing computers down makes them easier: it's that sometimes - not often, but sometimes - a bunch of actual geniuses gets together and makes something genuinely revolutionary and really cool.

    Atwood isn't an actual genius, and this thing he has made and foisted on the rest of us? It blows goats.

    FUCK OFF, TOASTERS.


  • Discourse touched me in a no-no place

    @flabdablet said:

    But for that belief to be correct, computers would need not only human-comparable AI (which has been ten years away for as long as I've been alive and looks set to remain so for the foreseeable future) but consummate social skills as well. Not gonna happen. Most people are obtuse enough to be irritating when you're trying to get them to do exactly what you want; what hope is a sub-cockroach CPU ever going to have?

    I think we're actually closer than we've ever been before, as we're both getting much more processing power and a better idea of what intelligence actually is (it appears to be more closely linked with attention and less with raw calculation power; it's sure got fuck all to do with playing chess). Fortunately, the people working on this aren't pissing it away on jumping straight for Strong AI, but are rather doing more immediately useful stuff that just happens to be in the right direction. This Is Why I'm More Hopeful; the definitional quagmire of what Strong AI actually is gets side-stepped.

    There's also been a lot of work on the social skills problem, again, not trying to boil the ocean. It's the ocean-boiling superprojects that I'm suspicious of.



  • @dkf said:

    I think we're actually closer than we've ever been before, as we're both getting much more processing power and a better idea of what intelligence actually is

    We continue to learn about intelligence at a rate that keeps the imbalance between what we think is probably required to implement it abiologically, and current levels of feasible IT energy density, pretty stable at about "ten years from now". It's been that way for as long as I've been paying attention (about four decades) and I can't see it changing any time soon.



  • Looks like they fixed most of the Markdown issues you were trying to illustrate in that topic. It's been rebaked a few times since you wrote it.


  • Discourse touched me in a no-no place

    @flabdablet said:

    We continue to learn about intelligence at a rate that keeps the imbalance between what we think is probably required to implement it abiologically, and current levels of feasible IT energy density, pretty stable at about "ten years from now". It's been that way for as long as I've been paying attention (about four decades) and I can't see it changing any time soon.

    Half the problem is that the goalposts keep getting moved. Yeah, that makes achieving things hard.

    True AI may continue to be difficult for a long time to come, but I suspect we'll get computers up to the level of the average voter inside a few years. Computers might not be able to really think, but hardly any people do either.



  • Don't worry, CrazyJim1 is on the case!!!!!



  • @dkf said:

    I suspect we'll get computers up to the level of the average voter inside a few years.

    I suspect that even the average voter will retain a massive common sense advantage over any descendant of Clippy for at least the next several decades - Siri, Cortana, Maluuba et al. notwithstanding.

    The main innovation that might induce me to start revising that opinion would be a Google self-driving car that doesn't rely on predigested terrain models and could safely and successfully take me for a tour around my local bush tracks without getting bogged, then navigate and drive me 350km to the centre of my nearest capital city, hack its way through city traffic while avoiding all the toll routes, and park itself in the closest available spot to my favourite restaurant.

    About ten years away, I expect :-)


  • Discourse touched me in a no-no place

    @flabdablet said:

    Maluuba

    That sounds like it is illegal in 37 states and the District of Columbia…

    @flabdablet said:

    The main innovation that might induce me to start revising that opinion would be a Google self-driving car that doesn't rely on predigested terrain models and could safely and successfully take me for a tour around my local bush tracks without getting bogged, then navigate and drive me 350km to the centre of my nearest capital city, hack its way through city traffic while avoiding all the toll routes, and park itself in the closest available spot to my favourite restaurant.

    Most of the routing does rely on that sort of thing, but then again most people did too prior to electronic navigation aids.



  • @flabdablet said:

    Jeff Atwood sees himself as a bold Young Turk, and Discourse is his New Broom. Consequently it is even more completely fucking stupid than anything that came before it.

    The correct take-away message from the Mac OS experience is not that dumbing computers down makes them easier: it's that sometimes - not often, but sometimes - a bunch of actual geniuses gets together and makes something genuinely revolutionary and really cool.

    Atwood isn't an actual genius, and this thing he has made and foisted on the rest of us? It blows goats.


    This, +∞

