COSA is about to burst onto the multicore scene



  • In response to Ask Slashdot - Are Academic Journals Obsolete?, a user named MOBE2001 writes:

    The only purpose of peer review is not quality control but control,
    period. It is a mechanism used by an elitist group to keep outsiders at
    bay. Thus science becomes immune to public scrutiny, not a very good
    thing. As Paul Feyerabend said, (paraphrasing) we did not get rid of
    the dictatorship of the one true religion to fall under the tyranny of
    another.

    Peer review is an incestuous process that works for a
    while but eventually engenders ridiculously hideous monsters. Examples
    are time travel, cats that are both dead and alive when nobody is
    looking, parallel universes, dimensions that curled up into little
    balls so tiny as to be unobservable, etc... This is the reason that
    Feyerabend wrote in Against Method that "the most stupid procedures and
    the most laughable results in their domain are surrounded with an aura
    of excellence. It is time to cut them down in size and give them a more
    modest position in society."

    When challenged that he's only bitter because his own ideas are being rejected, he writes:

    On the contrary, my work (Project COSA) has never been more popular.
    COSA is about to burst onto the multicore scene like a locomotive.
    Surprise, surprise.

    Project COSA is a solution for parallel processing:

    Right away, even before I knew enough assembly language to write my
    first program, it was clear to me that computers were fundamentally
    flawed. I don’t know how but I immediately understood that a computer
    program should be more like a neural network. I knew that processors
    should be designed and optimized to emulate the reactive parallelism of
    the brain at the instruction level. I also realized that, for
    performance purposes, the emulation mechanism had to reside within the
    processor hardware itself.

    I find it interesting to get a glimpse of the world through the eyes of someone who knows all that is wrong and feels persecuted for trying to set it right. 

     



  • From your tags you're obviously aware of the parallels to Swampy. This guy writes better than Swampy, but might be deluded in a more confounding way. Who wants to drag him here for our amusement?



  •  @bstorer said:

    Who wants to drag him here for our amusement?

    Not me. Don't we have enough slashdotians around here spewing their opinions and being flamed to death?



  • @MasterPlanSoftware said:

    Not me. Don't we have enough slashdotians around here spewing their opinions and being flamed to death?
    Yeah, but what's one more?



  • @bstorer said:

    Yeah, but what's one more?
     

    Look... even if I have 40 porcupine spikes stuck in my anus, I somehow doubt I am going to say "What's one more? Pound that one right in there."



  • @MasterPlanSoftware said:

    Look... even if I have 40 porcupine spikes stuck in my anus, I somehow doubt I am going to say "What's one more? Pound that one right in there."
    I guess we're going to have to disagree on that one.



  • @bstorer said:

    @MasterPlanSoftware said:
    Look... even if I have 40 porcupine spikes stuck in my anus, I somehow doubt I am going to say "What's one more? Pound that one right in there."
    I guess we're going to have to disagree on that one.
     

    I just consulted our official charter and it says we now have to resort to sexual experimentation. Please bring the crisco. I will get the porcupine.



  • @MasterPlanSoftware said:

    @bstorer said:

    @MasterPlanSoftware said:
    Look... even if I have 40 porcupine spikes stuck in my anus, I somehow doubt I am going to say "What's one more? Pound that one right in there."
    I guess we're going to have to disagree on that one.
     

    I just consulted our official charter and it says we now have to resort to sexual experimentation. Please bring the crisco. I will get the porcupine.

    I'll bring beer and cupcakes.



  • @Kiss me I'm Polish said:

    I'll bring beer and cupcakes.
     

    WTF who invited the European with the identity crisis?





  • @MasterPlanSoftware said:

    @Kiss me I'm Polish said:

    I'll bring beer and cupcakes.
     

    WTF who invited the European with the identity crisis?

    Who cares? He's bringing cupcakes!



  • @TwelveBaud said:

    That would be #330, entitled "Indecision."
     

    Thanks. You are an idiot.



  • MOBE2001 is a lot of fun.  He posts in pretty much every multi-threading discussion about how "algorithms are holding us back," and how all the current computer manufacturers and computer scientists foolishly have it wrong.  As proof, he links to his own blog, which has a lot of speculation, no mathematical proofs, and no working implementation.  Heck, with all the time he spends advocating this he could at least write an emulator (for his theoretical hardware) and a first-pass prototype language.

    He also believes that quantum mechanics is hogwash and that most modern physicists are charlatans.

    And yes, it's sad that I know all this.  It's just too damn fun to try and imagine what it must feel like to be that delusional.



  • I don't know which amuses me more: that the guy wrote on his blog "I hate algorithmic programming", or that "cosa" translates as "thing" in Spanish.



  • Uh, wow. That's some really well presented garbage. I don't really see him doing anything that someone with an FPGA and knowledge of VHDL/Verilog can't do already today. HDL synthesis already handles "objects" with defined in/out ports, vendors have SOPC builder software that allows integration of these components via a GUI, including automatic connecting of ports, buses and even bus arbitration logic. Indeed, the real world is parallel, and so are logic gates on a CPU.

    I don't see how true parallel logic helps achieve bug-free components, though. It's quite the opposite, actually. It's almost impossible to design purely combinatorial circuits, as they inevitably lead to timing problems due to differing signal latencies. There is a very good reason why virtually every circuit beyond a few gates is built around a reference clock that ensures well-defined states. Even then, development is hardly bug free. One just has to look at the errata of every CPU and microprocessor manufactured. Even the best and brightest companies end up with bugs in their silicon (most notable are probably the Pentium bugs.) How can that be happening if parallelism solves these problems?

    Everything he describes sounds like a basic VHDL compiler plus functional simulator. I'm not going to waste any more time looking at this crap, but so far I haven't seen anything that couldn't be done with ModelSim today. 



  • If this guy is the same guy I think it is (and if I understand his ramblings well enough), essentially what he's trying to propose is a data flow architecture.  You take a giant mesh of processing elements (either logical or physical) and map instructions to them.  Then when all inputs to an instruction arrive at its PE, it fires and the output gets routed to another PE.  No need to use parallel algorithms or break things down into threads; just map the program onto the mesh.

    In general the concept is good, but it has serious problems of its own which have kept it from being used extensively.  First of all, it requires significant transformations to turn imperative sequential code (like C) into data flow code.  Secondly, you end up simply being unable to map enough instructions onto the processor to extract thread-level parallelism; there just isn't enough storage.  Lastly, poor programming or algorithmic choices can still cause lengthy critical paths.

    All that being said, I may have misinterpreted some of what the guy's said, and he may be even more loony than I thought.



  • @ZippoLag said:

    I don't know which amuses me more: that the guy wrote on his blog "I hate algorithmic programming", or that "cosa" translates as "thing" in Spanish.

    Those are the only options? I guess I gotta go with the first one, then, but I do so under protest.



  • @bstorer said:

    @ZippoLag said:

    I don't know which amuses me more: that the guy wrote on his blog "I hate algorithmic programming", or that "cosa" translates as "thing" in Spanish.

    Those are the only options? I guess I gotta go with the first one, then, but I do so under protest.
     

    Yay! I love it when someone quotes me with the biggest amount of sarcasm possible (though this is not entirely the case).

    Anyway, for those who went on and read something on the blog: doesn't this guy's architecture remind you of Befunge, of sorts? 



  • @Nandurius said:

    Everything he describes sounds like a basic VHDL compiler plus functional simulator. I'm not going to waste any more time looking at this crap, but so far I haven't seen anything that couldn't be done with ModelSim today. 

     

    Thank you, I was wondering if I was insane for immediately thinking "FPGA".  Personally I've thought for a while that it would be great to have an FPGA or two on a motherboard or a PCI card, which could be programmed from userspace for the more intense stuff that's better suited to it (audio and video encoding / decoding, serious image processing, etc.), BUT you'd have to be insane to think that would work well for everything.  If everything's done in hardware, sequential or timed stuff would probably get to be a pain in the ass: you have to create a state machine or counter + clock or something.  Plus, as you said, debugging would be a complete nightmare.  At least with software, you can still step through each instruction (or line of code) one-by-one to see things happen one step at a time, but all that goes out the window with his ideas.

    ...and, his whole "Look at me, everyone else is completely wrong, and total morons, and evil too, but I'm the savior of humanity" attitude makes me think he's a huge asshole. 



  • I think that it bears a striking resemblance to CSP.

