Geri the SubLEq Guy



  • Because threads are free. This is the saga of the Dawn OS, and its creator, Geri the SubLEq Guy.

    I've mentioned this before, but just to make sure everyone is on the same page: one of the (erstwhile) members of the OSDev forums, Geri, was obsessed with the idea that if Reduced Instruction Set Computers were faster than Complex Instruction Set Computers, then surely the fastest would be a computer with only one instruction, right?

    All of this business about 'pipelines' and 'load/store discipline' and such has to be a massive conspiracy by the chip manufacturers, making things deliberately complicated for... uh, reasons!

    Well, he wasn't going to stand for it, so he looked at some of the literature about One Instruction Set Computers (OISCs), mostly on the esolang message boards, and settled on one called 'SubLEq', "subtract the first two data words of the triplet starting at the instruction pointer, and branch to the third one if the result is less than or equal to zero". He decided that he would design a Truly Open Source CPU Architecture which would run like greased lightning, and would be really easy to implement as an interpreter, or as a TTL board, or using an FPGA, and everyone would be FREEEEEE and the software would be easy to write and never crash again.
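    For the curious, the semantics he latched onto really are that small. Here's a minimal interpreter sketch in C — the word type and the halt-on-negative-branch-target convention are my own assumptions for illustration, since his 'specification' never nailed such details down:

    ```c
    #include <stdio.h>

    /* One SUBLEQ instruction is a triplet (a, b, c): do mem[b] -= mem[a],
       and if the result is <= 0, branch to c, else fall through to the
       next triplet. A negative branch target halts (an assumption here). */
    static void subleq_run(int *mem, int pc) {
        while (pc >= 0) {
            int a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
            mem[b] -= mem[a];
            pc = (mem[b] <= 0) ? c : pc + 3;
        }
    }

    int main(void) {
        /* One triplet: subtract cell 6 from itself (clearing it); since
           0 <= 0, it branches to -1, which halts. */
        int mem[] = { 6, 6, -1, 0, 0, 0, 42 };
        subleq_run(mem, 0);
        printf("mem[6] = %d\n", mem[6]);   /* mem[6] = 0 */
        return 0;
    }
    ```

    Note that real SUBLEQ variants keep code and data in the same flat memory, which is the whole trick — and the whole problem.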

    He wrote a crappy interpreter in C for it, because he knew jack shit about hardware but, hey, anyone who did know it can make this so it's all good, and he decided on a 'specification' for compatibility that made no attempt to consider how any of this would actually work. He then wrote a C subset compiler, and an operating system targeting his simulated system (with a GUI and everything, woohoo). Never mind that it ran like me trying to finish a marathon, it was perfect and sunshines and everyone needs to drop what they are doing and buy a copy!

    Buy, you say? Of course. Being from the Gavino_Learning School of Business (where the word 'Capitalism' means whatever you say it means, no more or less, thank you Humpty Dumpty), he didn't want to get any of that nasty FOSS stuff on his software, so he naturally started to sell it, closed source. For DogeCoin. He claims he even got at least one buyer... though the evidence he gave for the transaction is a bit, ah, questionable, nor does it address the reasons the alleged purchase was made.

    He was also rather... tactless is a good way to put it, and eventually his insulting and occasionally racist BS got him bounced from the group. But I understand he is still out there, and still trying to sell Dawn OS and his OISC idea to all and sundry.



  • Well, at the instruction level an OISC can be "blindingly fast". 🙂



  • Also this guy is Hungarian, he uses the uw.hu free hosting which has been around since the 90s I think, and has a freemail.hu email address.



  • @marczellm

    • An own p2p wireless networking protocol based on geolocation and requiring no external network provider cormprations

    I want that feature. I hate cormprations.

    Imagine the most dystropic world you can:

    -There are computers, but they instruction set and hardware set is so complex that nobody on the entry world fully understands how they work.

    This is the most "dystropic" world he can imagine? Jesus. Not one where you, for example, get beaten senseless for wearing sunscreen even though it's super sunny all the time and that's just off the very top of my head.

    -Imagine that software development become so complex and expensive that basically no software is being written any more, only ,,apps'' designed in ,,devtools''.

    Those BASTARDS! Using TOOLS!

    -Imagine a world, where the education of the students on computers is about showing the shapes of computers, and they call it ,,science''.

    Ok kids, welcome to Computer Science 101. Rectangle. That concludes this course.

    -Imagine a world where being an IT professional means fake-expertising a $30000 hardware with clicking in a specific software basically as a teached operator, with a 0.001% efficiency rate teached by fake professionals payed by corporations through bribed public servants - from your tax.

    That sounds dystopic-- or would if I had any clue what it's saying.

    -Imagine a computer, which requires 1 billion transistors to flicker the cursor on the screen.

    You'd think with all those transistors they'd implement double-buffering so the graphics don't flicker.

    I have bad news: This is not an imagination, this is our current world.

    0_1515628334303_inceptionbutton.mp3


  • area_can

    @blakeyrat said in Geri the SubLEq Guy:

    Imagine a computer, which requires 1 billion transistors to flicker the cursor on the screen.

    VS Code uses 13% CPU when focused and idle, draining battery...MacBook Pro



  • @bb36e But does it flicker or is the motion smooth?


  • Fake News

    -Imagine a world where being an IT professional means fake-expertising a $30000 hardware with clicking in a specific software basically as a teached operator, with a 0.001% efficiency rate teached by fake professionals payed by corporations through bribed public servants - from your tax.

    ... I honestly tried to figure out what that sentence meant. I did put in a good-faith effort. But I think my brain ran away. Send help.



  • @bb36e - https://news.ycombinator.com/item?id=13941293

    Powerful* text editors built on the web stack cannot rely on the OS text caret and have to provide their own.
    In this case, VSCode is probably using the most reasonable approach to blinking a cursor: a step timing-function with a CSS keyframe animation. This tells the browser to only change the opacity every 500ms. Meanwhile, Chrome hasn't completely optimised this yet, hence http://crbug.com/361587.
    So currently, Chrome is doing the full rendering lifecycle (style, paint, layers) every 16ms when it should only be doing that work at a 500ms interval. I'm confident that the engineers working on Chrome's style components can sort this out, but it'll take a little bit of work. I think the added visibility on this topic will likely escalate the priority of the fix. 🙂



  • @thecpuwizard said in Geri the SubLEq Guy:

    Well, at the instruction level an OISC can be "blindingly fast". 🙂

    And that has always been the essential trade-off for RISC-vs-CISC. You reduce the instruction set to get rid of the fluff (e.g. System/360's jolly instruction "EDIT" - itoa() in a single machine instruction) and thereby you make it possible to make the CPU simpler inside (e.g. it becomes feasible to do away with a microsequencer and run the code in pure logic, etc.) and therefore individual instruction cycles become faster.

    But you stop reducing it when the additional volume of instructions starts making the machine slower.

    Alternative version: yeah, blindingly fast: you'll have to write so many instructions you'll go blind.
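    That quip is easy to make concrete: on a SUBLEQ machine even an unconditional add has to be synthesized from several triplets through a scratch cell. A sketch in C (the memory layout and the halt-on-negative-branch-target convention are my own assumptions, not anyone's actual spec):

    ```c
    #include <stdio.h>

    /* Synthesizing one ADD (B += A) out of nothing but subleq, via a
       scratch cell Z held at zero. One CISC-style ADD becomes three
       triplets: nine memory words. */
    static void run_subleq(int *mem) {
        int pc = 0;
        while (pc >= 0) {                       /* negative target = halt */
            int a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
            mem[b] -= mem[a];                   /* the one instruction    */
            pc = (mem[b] <= 0) ? c : pc + 3;
        }
    }

    int main(void) {
        int mem[] = {
            9,  11,  3,   /* subleq A, Z : Z = -A, always <= 0, fall through */
            11, 10,  6,   /* subleq Z, B : B -= (-A), i.e. B += A            */
            11, 11, -1,   /* subleq Z, Z : restore Z = 0, then halt          */
            5,            /* cell  9: A */
            7,            /* cell 10: B */
            0             /* cell 11: Z (scratch) */
        };
        run_subleq(mem);
        printf("B = %d\n", mem[10]);   /* B = 12 */
        return 0;
    }
    ```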



  • @steve_the_cynic - Indeed 🙂



  • @steve_the_cynic Wasn't there also a bit about the inbuilt complex instructions either having bugs or being slower than a supplied implementation?



  • @pleegwat said in Geri the SubLEq Guy:

    @steve_the_cynic Wasn't there also a bit about the inbuilt complex instructions either having bugs or being slower than a supplied implementation?

    Possibly (and if there is a bug, it will be very hard to fix - or to distribute the fixes - if it's in microcode), but the key part is that in the original RISC theory, such instructions force an architectural structure that makes the non-complex instructions slower. Subsequent CISC advances have largely invalidated that underlying assumption, of course.



  • @scholrlea said in Geri the SubLEq Guy:

    surely the fastest would be a computer with only one instruction, right?



  • @steve_the_cynic It might be noted that whenever anyone mentioned anything like this to Geri himself, he blew it off saying, in effect, that he's not a hardware geek and it wasn't his problem - he knew he was right, it was up to the guys making the hardware implementations to follow his lead.


    Nice to have such faith in oneself, I suppose. Were it warranted, that is.



  • @blakeyrat said in Geri the SubLEq Guy:

    Those BASTARDS! Using TOOLS!

    In the ideal world, computers would be simple enough that you could program them by poking the hard disk with a magnetised needle. No tools necessary.


  • Impossible Mission - B

    @anonymous234 Or, if you're a power user, just use butterflies.



  • @masonwheeler said in Geri the SubLEq Guy:

    @anonymous234 Or, if you're a power user, just use butterflies.

    Good ol' M-x M-c M-butterfly...



  • @steve_the_cynic Idea: a processor with only one instruction, but the instruction is "read the byte at the specified memory position and do whatever it would do if this were an x86 processor"



  • @scholrlea said in Geri the SubLEq Guy:

    @steve_the_cynic It might be noted that whenever anyone mentioned anything like this to Geri himself, he blew it off saying, in effect, that he's not a hardware geek and it wasn't his problem - he knew he was right, it was up to the guys making the hardware implementations to follow his lead.

    After all, no one said it would be easy to make good hardware! 🔥



  • @anonymous234 said in Geri the SubLEq Guy:

    a magnetised needle. No tools necessary.

    Sounds like a tool to me.



  • @scholrlea said in Geri the SubLEq Guy:

    then surely the fastest would be a computer with only one instruction, right?

    What about a computer with no instructions?

    @anonymous234 said in Geri the SubLEq Guy:

    a processor with only one instruction, but the instruction is "read the byte at the specified memory position and do whatever it would do if this were an x86 processor"

    It'd need 15 modes to accommodate that; after adding a bunch of prefixes like "REX.L REX.W lock xacquire", an instruction can be up to 15 bytes long.



  • I should probably mention that there was another, even crazier thread he started in which he posted what he called a 'tutorial' on writing an operating system.

    About half the original post was a screed about the evils of x86 and ARM, and how he knew more than everyone else in the forum; the other half was basically 'how to copy Dawn OS'. To say it was not received well would be an understatement.

    The thread eventually got burned, but I did save some of the parts of it I had had a hand in, as well as parts of a related joke post I had made in the AD section. I posted these sections to my own subfolder in the wiki, with the intention of using the cogent parts for something or other which I have now forgotten about.

    Most of this is actually my replies to Geri, but some of his glorious prose is in them as well. Feel free to spork me along with Geri and anyone else over these.

    Unfortunately, the Wayback Machine can't seem to find the original, though I may be mistaken. I will try to collect some more choice Geri-isms RSN.



  • Geri Sez:

    In extern "C" block:

    in short: extern c will keep function and symbol names in compatible ways by the compiler, so if you make a library then it will stay usable widely,

    In Simple Random Number Generator:

    Geri: in the Dawn operating system, i **** the time, shuffle it with previous results, add pixels from screen, and mouse position.
    Dozniak: I'm quite curious as to what you do to the time to offend the profanity filter.
    Geri:

    In Smaller C:

    yes, i only need to emulate the subleq instruction set, paint the screen, and manage io.
    however, the subleq emulator algo also runs terrible, only half million instruction per second or less (code compiled with gcc does 200-400 million per second)

    but i did additional tests. with inline assembly in smallerc unreal mode, i get beethwen 1-1.5 gbyte/sec memory write and overally really nice performance. so i just will rewrite the subleq instruction emulation to inline assembly, and that magically will fix the most important performance problems. not all problems, becouse the rest of the code will stay in c, that will add 10-20 seconds to the boot process.

    In Has anyone here tried to use their OS as their primary OS?:

    hardware accelerated graphics drivers are not necessary any more, the cpus are very fast
    [..]
    cpus are very fast, for example the previous debian version used software opengl rasterizer with amd cards as default, i had 1360x768 screen, i had amd athlon2 x4 635 cpu @ 3,1 ghz, it was even fluid when 3d kde display effects was enabled, even if i watched hd videos, everything was perfectly fluid, it not even utilized more than 1 core generally.

    now debian switched back to hardware 3d with radeons, the display is now actually SLOWER.

    a modern i7 should be capable of rendering a HD desktop with just one core even when you do bilinear filtering and alphamapped display elements and effects, it should stay far above 25 fps in all cases, unless you pack out tons of windows. actually even an 1,6 ghz atom should stay usable.

    In Do you need a full working gui, programs and compiler in it?:

    I seen a lot of people who put a good working bootable kernel together that was able to input from keyboard and mouse, give out sound, or to raw access disks - and these people later sadly had to stop with developments, becouse they stuck at creating the gui.

    putting a gui together is a very hard and pretty much a different type of work than doing the kernel.
    it needs a totally different mind-set than googling x86/arm behaviors to have something to be accessed, it needs a different kind of work that most people cant afford to learn just for a kernel project.

    but i have a solution for you: You can use Dawn operating system as a GUI! Lets combine our forces, and create something great!

    Translation: If you want to make your own OS... help me make my OS? 😵

    In OS Secruity (sic - mind you, it was the OP who tyop'ed that, not Geri):

    maybe this sounds pessimistic, but i dont think this area is having a future. there is basically 2 kind of data:

    1. your public data, its free to be accessed by anyone. this means your pictures of your summer vacation, your public pictures of your family. you have it uploaded to public, or semi-public places like social networking sites, cloud shares, on your cell phones, etc

    2. your private data. you protect this, you carry it with yourself, if you upload it, you are uploading it with encryption. this includes your works, your contracts, your personal documents, your source codes, your private photos, list of your business contacts, money transactions, private and health documents, business secrets, emails.

    most of people dont have the second, they are perfectly fine with sharing they online life, however, that kind of person is not contect creator, its strictly a content consumer. content creators need having both the two points.

    the question is, can this notably evolve into something else? there can be better tools to encrypt and upload data to virtual online drives easyer than the current ones, there can be better wireless devices with better software sets giving great data share access possibilities for the users. maybe there will be better file sharing protocols in the future. maybe the whole networking will be replaced one time.

    but there is prety much just this two great category of data handling, and if we think on internet, and cloud integration into an OS, the industry alreday invented all, and the facebook-type exhibicionism became the primary type of user data handling behavior.

    i dont think these areas will change in any forms, becouse it did changed in the last 3-10 years INTO this direction, and the evolution of this data handling is seems to be finished, both on business and both on social levels. encrypted file system drivers are also finished (veracrypt, tcnext). there is nothing left to invent, just to make better integration and compatibility with the existing conceptions.


  • BINNED

    File sharing rant

    Clearly, we need to introduce Geri to Swampy. Port SSDS to Dawn OS and you're golden!


    Filed under: Nobody shares knowledge in one instruction better than this!



  • @heterodox said in Geri the SubLEq Guy:

    -Imagine a world where being an IT professional means fake-expertising a $30000 hardware with clicking in a specific software basically as a teached operator, with a 0.001% efficiency rate teached by fake professionals payed by corporations through bribed public servants - from your tax.

    ... I honestly tried to figure out what that sentence meant. I did put in a good-faith effort. But I think my brain ran away. Send help.

    If single instruction is faster... parse the sentence means with one brain cell... much more efficientific and bigly.

    I mean... it's obvious that that's how that sentence was written. That's how you have to decipher it.


  • area_can

    @wernercd said in Geri the SubLEq Guy:

    If single instruction is faster... parse the sentence means with one brain cell... much more efficientific and bigly.

    Why waste time? Say lot word when few word do trick:

    Kevin's Small Talk – The Office US – 02:33



  • @onyx said in Geri the SubLEq Guy:

    File sharing rant

    Clearly, we need to introduce Geri to Swampy. Port SSDS to Dawn OS and you're golden!


    Filed under: Nobody shares knowledge in one instruction better than this!

    If you are going to try to hook @SpectateSwamp up on a blind date, you really ought to summon him to the thread.

    Filed Under: I'm going to regret this, aren't I?


  • BINNED

    @scholrlea said in Geri the SubLEq Guy:

    Filed Under: I'm going to regret this, aren't I?

    0_1515948747055_6bcd0d3b-3aae-4c1e-bace-349f7e570791-image.png

    Probably.


