Arduino in Node


  • I survived the hour long Uno hand

    So apparently some people are doing Arduino programming from Node.js: http://nodebots.io/

    Is there any logical reason why I'd do this, given that I know C++ enough to work with the standard Arduino libraries? What possible advantage is there for someone who's already a polyglot? Or is this just for webdev mavericks who can't stand the idea of learning another language?


  • SockDev

    @Yamikuronue said:

    Is there any logical reason why I'd do this

    because NodeJS is currently the new hotness and ultimate geek cred (until something new comes along to displace it (maybe Rust will do that? i dunno))

    @Yamikuronue said:

    What possible advantage is there for someone who's already a polyglot?

    other than the geek cred? not really.

    @Yamikuronue said:

    Or is this just for webdev mavericks who can't stand the idea of learning another language?

    a little of this, but as i understand it, mostly just a "because we could" mentality along with the aforesaid sexiness of NodeJS (at the moment anyway)



  • :headdesk:

    Who the @#$!@@ thought this was a good idea?

    There's a reason you use native-compiled systems languages in the embedded world...



  • @tarunik said:

    There's a reason you use native-compiled systems languages in the embedded world...

    Because embedded people hate change! They're all grumpy Luddites!!!

    Seriously, though, if it works, who gives a shit? You'll be 50 times more productive in a memory-managed environment.



  • @Yamikuronue said:

    Is there any logical reason why I'd do this

    So beards can say they've done it.





  • @blakeyrat said:

    Because embedded people hate change! They're all grumpy Luddites!!!

    Seriously, though, if it works, who gives a shit? You'll be 50 times more productive in a memory-managed environment.

    Unless it's Java.


  • Grade A Premium Asshole

    When your only tool is a hammer, you tend to treat all problems as nails.



  • Sure, but the thing is: we know that memory-managed languages produce better code, faster. Much faster, usually.

    The trade-off is you have to buy more expensive hardware to run them. I don't know exactly where the tipping point is there, but acting like, "embedded will ALWAYS BE DONE IN C++!" is stupid. Because hardware gets cheaper every year, and if your competitor can finish the product in 6 months while you're dicking around trying to solve memory allocation errors, they'll eat your lunch.

    So reactions like tarunik's just sound quaint to me, like he's coming forward in time from the 1980s.


  • BINNED

    @accalia said:

    because NodeJS is currently the new hotness and ultimate geek cred

    I use PHP, C++ and as little JS as I can get away with. I waited for the Qt people to finish their websocket implementation just so I could avoid NodeJS.

    I, apparently, have no geek cred. I am fine with this.



  • @Yamikuronue said:

    Is there any logical reason why I'd do this, given that I know C++ enough to work with the standard Arduino libraries?

    You're 100% more hip.

    Seriously though, there might be some advantage in more event-driven programming than C++ reasonably allows, seeing how embedded programming is mostly waiting for input to get triggered, reacting, and then going back to sleep.

    @blakeyrat said:

    The trade-off is you have to buy more expensive hardware to run them.

    And the problem is that Arduino is supposed to be one of the most cheap-ass boards obtainable.


  • Grade A Premium Asshole

    Oh, wait, my reply was not meant for you. It was for the OP. If all a person knows is JS, then Node comes along and they use it to solve every problem.



  • @Yamikuronue said:

    Is there any logical reason why I'd do this, given that I know C++ enough to work with the standard Arduino libraries?

    Probably not.

    It seems like a neat idea to me though. Arduinos are mainly for hobbyists and inventors making prototypes; lowering the barrier for entry means more people can play around and come up with cool stuff.

    I have a personal project I'm working on with an Arduino, and the slowest-going part has been the programming since I know very little C++. I have no motivation to devote hours/days/weeks to learning the language better when there is a strong chance I won't use that knowledge outside of this project.



  • @blakeyrat said:

    Seriously, though, if it works, who gives a shit? You'll be 50 times more productive in a memory-managed environment.

    Write cycle-accurate bitbanging code in C#, and then come back to me.

    @blakeyrat said:

    So reactions like tarunik's just sound quaint to me, like he's coming forward in time from the 1980s.

    I'd rather have static memory safety analysis in my embedded systems -- a la Rust. Managing memory for me at runtime is OK when there's oodles of resources that the computer can spend on doing it for me, but when you're counting clocks because you have to talk to some wacko nonstandard peripheral chip as fast as you can, you don't have that luxury. Or do your shoulder aliens deny the existence of wacko nonstandard peripherals and bitbanged interfaces?

    @Maciejasjmj said:

    And the problem is that Arduino is supposed to be one of the most cheap-ass boards obtainable.

    It's really not that cheap, compared to, say, an STM32 Discovery board. But I'm rather familiar with the ATmega328 and its friends, and expecting a managed, bytecoded language to run on an 8-bit microcontroller, even with the AVR ISA's nods towards high-level-language support, is absurd.



  • @blakeyrat said:

    Sure, but the thing is: we know that memory-managed languages produce better code, faster. Much faster, usually.

    The trade-off is you have to buy more expensive hardware to run them. I don't know exactly where the tipping point is there, but acting like, "embedded will ALWAYS BE DONE IN C++!" is stupid. Because hardware gets cheaper every year, and if your competitor can finish the product in 6 months while you're dicking around trying to solve memory allocation errors, they'll eat your lunch.

    So reactions like tarunik's just sound quaint to me, like he's coming forward in time from the 1980s.

    If you are solving memory allocation errors on an 8-bit micro for 6 months, you are doing it wrong (TM).



  • @tarunik said:

    @Maciejasjmj said:
    And the problem is that Arduino is supposed to be one of the most cheap-ass boards obtainable.

    It's really not that cheap, compared to, say, an STM32 Discovery board. But I'm rather familiar with the ATmega328 and its friends, and expecting a managed, bytecoded language to run on an 8-bit microcontroller, even with the AVR ISA's nods towards high-level-language support, is absurd.

    An Ancient AVR can run Linux via emulating ARM.

    Just run Node.JS on top of that.



  • @Onyx said:

    I use PHP, C++ and as little JS as I can get away with. I waited for the Qt people to finish their websocket implementation just so I could avoid NodeJS.

    I, apparently, have no geek cred. I am fine with this.

    Node.JS is actually not bad at all. I expected it to be another hipster Ruby thing that is difficult, buggy, makes no sense, and only done for hipster cred, but Node is actually a very nice platform.

    I just wish it used something other than JavaScript, but that's the only downside I've found so far.



  • Theoretically speaking -- yes, you're right, this is a consequence of the universality of Turing machines...but practically speaking? Just look at how dog slow that emulation is, and now imagine having to run a production system atop it.



  • @tarunik said:

    Theoretically speaking -- yes, you're right, this is a consequence of the universality of Turing machines...but practically speaking? Just look at how dog slow that emulation is, and now imagine having to run a production system atop it.

    Well, I basically do embedded chips in actual products. Honestly for what the video shows, its pace is actually perfectly acceptable in certain applications.

    Heck, I recently reversed a competitor's product where they had a 210ms hard delay before they executed any further logic on a PIC every main loop cycle. In other news, security bits are fun until someone lasers them off under a high power microscope.



  • @delfinom said:

    Well, I basically do embedded chips in actual products. Honestly for what the video shows, its pace is actually perfectly acceptable in certain applications.

    Heck, I recently reversed a competitor's product where they had a 210ms hard delay before they executed any further logic on a PIC every main loop cycle.

    *chuckles* If you can make your system work in the face of a 210ms hard delay per main loop cycle, more power to you man...either that, or they had some sort of situation where they were trying to run their CPU too fast for the rest of their world. :P



  • @delfinom said:

    In other news, security bits are fun until someone lasers them off under a high power microscope.

    ;) They sure are...(I talk with someone on IRC on a semi-regular basis who specializes in IC reverse engineering, hardware-level security, and related areas...)



  • @delfinom said:

    An Ancient AVR can run Linux via emulating ARM.

    Just run Node.JS on top of that.

    To emulate an x86 to run Windows Vista on it.


    Filed under: gotta go slow



  • @blakeyrat said:

    Because hardware gets cheaper every year, and if your competitor can finish the product in 6 months while you're dicking around trying to solve memory allocation errors, they'll eat your lunch.

    You are forgetting that hardware costs more than just the money you spend on the chip. Upsizing hardware can cost you dearly in power consumption (translation: battery life!), heat output (read: heatsinks, fans, temperature limitations), and hardware NRE (applications processors are almost always in fine-pitch BGA packages that require multilayer, sub-6/6-pitch boards and high-speed layout techniques to match -- which makes your prototyping awfully expensive!).

    Oh, and after that six months? They might have the product feature-complete, but if they can't get the high-speed layout and EMC stuff right, all bets are off, because they'll have to do a costly respin after it fails FCC testing. As for you, who chose the slower processor and did a bit more software slogging? You pass with flying colors, because basic layout best practices and a bit of up-front testing got you a practical guarantee of first-pass Part 15 compliance.

    @delfinom said:

    If you are solving memory allocation errors on an 8-bit micro for 6 months, you are doing it wrong (TM).

    Has Blakey ever considered what people did before malloc() was invented?


  • BINNED

    @mott555 said:

    Node.JS is actually not bad at all. I expected it to be another hipster Ruby thing that is difficult, buggy, makes no sense, and only done for hipster cred, but Node is actually a very nice platform.

    To be fair, I wanted to avoid yet another dependency, and the ONLY thing I needed it for would be to work as a message bus between a C++ service and a web interface. Installing yet another piece of software on the server just to do that didn't sit right with me.

    @mott555 said:

    I just wish it used something other than JavaScript, but that's the only downside I've found so far.

    In this specific case JS is giving me enough grief by randomly casting my strings to integers already. Having it do that serverside as well... no, thank you.



  • Not all embedded systems require high performance. If you're working on a refrigerator thermostat, why not use Node.js or some other system instead of native? Or if you want to start learning embedded systems, why not use something that's halfway easy and familiar to learn some basics instead of diving straight into C and assembler?



  • @mott555 said:

    Not all embedded systems require high performance. If you're working on a refrigerator thermostat, why not use Node.js or some other system instead of native? Or if you want to start learning embedded systems, why not use something that's halfway easy and familiar to learn some basics instead of diving straight into C and assembler?

    I'm personally of the take that we should have a better language than C and assembler for embedded work -- it's just that throwing the techniques we used in the PC world at 8-bit micros isn't going to get us there.



  • Just tried this: https://www.npmjs.com/package/nodebot-workshop
    It's like a node workshop, only for the nodebot stuff. It gives you an emulator and everything.
    Did the first example. API is pretty nice and all... but

    I remember listening to a Hanselminutes podcast with a woman who was pimping out this javascript arduino stuff. It all sounded good, until she revealed that you can't actually run javascript code on the hardware itself. As far as I understood, you basically need to have your widget physically connected to a PC. So node.js code is running on the box and transmitting commands over to the microcontroller.

    If that's true, that's some weak sauce, man. I suppose you can make a web server that controls an airconditioner or something, but no little robots walking around, which was my first impulse. If Lego Mindstorm can beat your robotics framework, you have a problem.



  • @tarunik said:

    I'm personally of the take that we should have a better language than C and assembler for embedded work

    How's that micro .NET thing doing? I've heard it's okay, but I haven't really seen any entry-level boards around, while you can pick up an ATMega in your local electronic store.



  • @tarunik said:

    I'm personally of the take that we should have a better language than C and assembler for embedded work -- it's just that throwing the techniques we used in the PC world at 8-bit micros isn't going to get us there.

    Rust has promise in that direction, if only they would fix their horrid alien syntax.

    Or you can join the ADA crowd and go buy a "secret price" compiler to actually compile shit.



  • @Maciejasjmj said:

    How's that micro .NET thing doing? I've heard it's okay, but I haven't really seen any entry-level boards around, while you can pick up an ATMega in your local electronic store.

    I wouldn't know -- if it can't run on a sub-ten-dollar board like mine, it's probably not going to get any attention from me. You probably hit on why it isn't taking off like wildfire, though... ;)



  • @delfinom said:

    Rust has promise in that direction, if only they would fix their horrid alien syntax.

    I have one eye on Rust as well...

    @delfinom said:

    Or you can join the ADA crowd and go buy a "secret price" compiler to actually compile shit.

    GCC has had an Ada front end for how long now?



  • @tarunik said:

    I have one eye on Rust as well...

    @delfinom said:

    Or you can join the ADA crowd and go buy a "secret price" compiler to actually compile shit.

    GCC has had an Ada front end for how long now?

    Only if you want features from 6 years ago, sure.



  • @cartman82 said:

    I remember listening to a Hanselminutes podcast with a woman who was pimping out this javascript arduino stuff. It all sounded good, until she revealed that you can't actually run javascript code on the hardware itself. As far as I understood, you basically need to have your widget physically connected to a PC. So node.js code is running on the box and transmitting commands over to the microcontroller.

    I was wondering about this - they don't really mention it on their homepage as far as I could see (although, reading between the lines, the fact that their library is mainly a serial communications module sort of hints at it).

    Now, running javascript+node code directly on an arduino would have been impressive, even if it were a heavily scaled down version. Mainly because it's rather difficult given the kind of hardware you get with an arduino (i.e. something like 16MHz, ~32k flash and 2k RAM).



  • @cvi said:

    Mainly because it's impossible given the kind of hardware you get with an arduino

    Can't you just get it compiled? I mean, you won't be able to do much, but you should be able to do something.

    @tarunik said:

    I have one eye on Rust as well...

    It's a little ugly, but those abstractions are sweet for a systems language.



  • @mott555 said:

    If you're working on a refrigerator thermostat, why not use Node.js or some other system instead of native?

    Because of cost.
    Most low-performance embedded devices are high-volume-low-margin devices.

    If you can trim 50 cents off the manufacturing cost by burning a couple of person-weeks or even months of programming time, then you will, because it is absolutely worth it.


  • I survived the hour long Uno hand

    @cartman82 said:

    As far as I understood, you basically need to have your widget physically connected to a PC. So node.js code is running on the box and transmitting commands over to the microcontroller.

    Wut.

    Look, I'll admit I have the least standard needs ever: I'm looking at incorporation of minor robotics elements into costuming to make moving parts in a mobile situation (walking around a convention center). It can be nice to do complex logic in something I've used at least once since school. But I'm not sticking a raspberry pi into a costume just so it can run a node interpreter for my servos. That's insane.



  • But think about the garbage collector! The amount of time you'll save!



  • @Yamikuronue said:

    sticking a raspberry pi into a costume just so it can run a node interpreter for my servos. That's ~~insane~~ way cool

    FTFY

    For the record, if you don't have any sensors or whatnot, you'd probably be able to get away with a MSP430 (the whole kit with a programmer is like $9 or so, and you get two uCs proper in it), a servo driver and a few wires.


  • I survived the hour long Uno hand

    @Maciejasjmj said:

    MSP430

    The idea was I'd need something that can run linux so I can hook it up to puppet the arduino constantly, whereas I just want a LilyPad or some such



  • @Maciejasjmj said:

    Can't you just get it compiled? I mean, you won't be able to do much, but you should be able to do something.

    Yeah, that's the reason I ended up changing "impossible" to "rather difficult".

    I'm questioning how much such a compiled/reduced node.js would actually resemble normal node.js code. You'd probably have to throw out some of JS's language features. You'd also run into a lot of limitations with the very small amount of memory you have. The whole everything-is-an-associative-array thing might be tricky to support due to that. Dynamic allocations would be severely limited (so, no closures for you). Pre-compiling means that eval() is impossible (but chances are you shouldn't be doing that anyway). And so on.

    And then there's still a whole world of performance problems to be had. Most Arduinos are 8-bit, so all that 32-bit arithmetic is going to be rather sluggish, and floating-point stuff even more so.



  • They could still go for something asm.js-y, and while emulating most of the features would be slow as hell, at least you're CPU-bound and not memory-bound like with an interpreter.

    Besides, aren't JS engines doing JIT nowadays? I'm guessing compiling things like associative arrays is a solved problem. Scratch that, it might indeed be tricky with RAM measured in bytes.



    The associative arrays are really a weird one, the more I think about it. If you say that you only accept the a.b syntax (and maybe a["b"] where "b" must be a literal string/value), you could get the cost down quite a bit. It still would have to be a dynamic structure (hash table, or whatever) though, unless you somehow prevent people from randomly inserting or removing keys+values into/from objects and can figure out what fields should be there in the first place.

    Doesn't asm.js throw out most of Javascript as well? IIRC you can only use typed variables and arrays from a handful of primitive types - from what I understand, it isn't exactly meant to be written by hand, but rather generated by a compiler?
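    For reference, the asm.js-flavoured subset being discussed looks something like this -- integer-only arithmetic with `|0` coercions and one preallocated typed array as the "heap". This is just a hand-written sketch in the asm.js *style* (not a valid asm.js module), but it shows how restricted the dialect is; ordinary Node runs it fine:

```javascript
// asm.js-flavoured sketch: 32-bit ints only, no objects, no
// dynamic allocation -- all storage is one preallocated buffer.
const heap = new Int32Array(16);

function sumFirst(n) {
  n = n | 0; // coerce the argument to int32, asm.js style
  let total = 0;
  for (let i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
    // every intermediate result is forced back to int32
    total = (total + heap[i]) | 0;
  }
  return total | 0;
}

heap[0] = 40;
heap[1] = 2;
console.log(sumFirst(2)); // 42
```

    No associative arrays, no closures capturing heap-allocated state, no `eval()` -- which is roughly the feature set people upthread suggest a microcontroller-sized JS would have to shrink to.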



    Minimal node.js RAM usage in my experience is 8MB. Just the source code text (which javascript keeps in memory in its entirety) will probably blow the Arduino's RAM out of the water.



    Thinking about it more hypothetically: how much would you have to abuse Javascript (the language) to create something that could imaginably run on an arduino? I'm thinking it's going to be a whole lot.

    (Not keeping the source in RAM would be trivial compared to all the other atrocities that would be committed if this were attempted.)



  • @cvi said:

    Doesn't asm.js throw out most of Javascript as well?

    Yeah, and that's the point - there's simply no way to have a full feature set on a uC with bytes of RAM. Even the best optimizing compiler won't be able to do that - you cannot change the laws of physics, Jim. But if you cut it down to a workable minimum, there might just be a chance.

    I'm not saying to use asm.js, since it's in general optimized for speed and not for memory, but some kind of thing like that.



  • I like the various mbed ARM boards for this kind of costume thing.
    The boards are small to tiny, cheap to very cheap and have an onboard USB mass-storage interface for programming and a virtual COM port endpoint for basic debugging.
    Some of them have interesting on board peripherals like gyros and accelerometers.



  • @cartman82 said:

    As far as I understood, you basically need to have your widget physically connected to a PC. So node.js code is running on the box and transmitting commands over to the microcontroller.

    I'm a complete newbie to node.js, but as I understood introductions, it does just-in-time compilation for all Javascript anyway; it doesn't interpret the code. I wonder if the reason for this setup is that the compilation is done on the PC and the Arduino only executes the final code?



  • @CoyneTheDup said:

    I wonder if the reason for this setup is that the compilation is done on the PC and the Arduino only executes the final code?

    I think it just loads some basic serial port listener on the Arduino and it's Node which does the actual processing, only sending commands like "raise pin 10" or something like that.

    That would be horribly inefficient, but it's pretty much the only way to make it workable.
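    The "raise pin 10" idea can be sketched in a few lines. This is a hypothetical 3-byte wire format invented purely for illustration (real nodebots setups use the Firmata protocol over serial, which is more elaborate, but the shape of the idea is the same -- the PC encodes tiny commands, the microcontroller just executes them):

```javascript
// Hypothetical command frame: [opcode, pin, value].
const OP_DIGITAL_WRITE = 0x01; // made-up opcode for illustration

// Encode a "set this pin high/low" command for the serial link.
function encodeDigitalWrite(pin, high) {
  return Buffer.from([OP_DIGITAL_WRITE, pin, high ? 1 : 0]);
}

// "raise pin 10"
const frame = encodeDigitalWrite(10, true);
console.log(frame); // <Buffer 01 0a 01>
```

    In a real setup that buffer would be written to something like a `serialport` stream; the sketch running on the Arduino side is little more than a loop that reads frames and flips pins.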



    Yes, that's what it looks like. Hard to be sure of anything, though.

    I tried reading some docs in that community and it's like they speak some foreign language or something. It's worse than business buzzwords.

