Your math teacher lied to you
-
This seems like an interesting approach...getting computer hardware closer to our hardware.
-
@boomzilla How is this fundamentally different than floating point numbers?
-
@Yamikuronue FP operations are deterministic. This is not.
-
@boomzilla Huh, interesting. I'll have to see if it goes anywhere.
-
@Yamikuronue It is less like FTZ (flush-to-zero), and more like the floating point of the signal processing domain. It cures the computer's obsessive-compulsive disorder. Current computers work great for astrophysicists and researchers who want high accuracy, but not for data scientists who want a global picture very quickly. I would buy one.
-
@boomzilla AFAICT this is still deterministic. It's just less accurate.
Although, the article doesn't say one way or the other. It's possible that they're running at the ragged edge of the logic levels, so it makes non-deterministic bit errors and just doesn't care.
-
@dse said in Your math teacher lied to you:
It cures the computer's obsessive compulsive disorder.
Sentences like this one are why I have trouble understanding these articles sometimes. What? My computer doesn't have OCD; it neither obsesses nor exhibits compulsive behaviors. It is a complete and utter math nerd though, is that what you mean?
-
@Yamikuronue it's the fact that its arithmetic operations take a long time to get as close as possible to the exact result, and for the type of image processing they're doing they really want rough, fast approximations. It's like Quake's sqrt hack, but in the CPU itself.
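For reference, Quake III's trick replaces an exact `1/sqrt(x)` with a bit-reinterpretation guess plus one Newton-Raphson step. A minimal Python rendition (the magic constant and steps are the well-known Quake ones, nothing to do with the article's chip):

```python
import struct

def fast_inv_sqrt(x):
    # Reinterpret the float's bits as a 32-bit integer (the famous trick).
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    i = 0x5f3759df - (i >> 1)          # magic constant yields a rough first guess
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson iteration tightens the approximation.
    y = y * (1.5 - 0.5 * x * y * y)
    return y

print(fast_inv_sqrt(4.0))  # close to 0.5, but deliberately not exact
```

Same spirit as the chip: trade a little accuracy for a lot of speed, except the chip does it in the transistors instead of in software.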
-
@anotherusername said in Your math teacher lied to you:
AFAICT this is still deterministic. It's just less accurate.
FTFA:
Bates is not the first to pursue the idea of using hand-wavy hardware to crunch data more efficiently, a notion known as approximate computing (see “10 Breakthrough Technologies 2008: Probabilistic Chips”). But DARPA’s investment in his chip could give the fuzzy math dream its biggest tryout yet.
The title of that linked article: TR10: Probabilistic Chips
The resulting decrease in signal-to-noise ratio means those circuits would occasionally arrive at the wrong answer, but engineers can calculate the probability of getting the right answer for any specific voltage.
-
@boomzilla I knew that dude wasn't really her "pool cleaner".
-
@boomzilla Okay, yeah, that's exactly what they're doing: running at the ragged edge of the logic levels, so it makes non-deterministic bit errors and just doesn't care.
That link has much more detail:
In order to overcome noise and ensure that their transistors register the correct values, most chips run at a relatively high voltage. Palem’s idea is to lower the operating voltage of parts of a chip–specifically, the logic circuits that calculate the least significant bits
[Cryptography and machine learning algorithms] are typically designed to arrive quickly at an approximate answer. [PCMOS - the P stands for probabilistic] chips ... could achieve in hardware what must be done with software today–with a significant gain in both energy efficiency and speed.
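You can toy-model that description in software. This is only an illustrative sketch (the `flip_prob` and `noisy_bits` values are invented, not from the article): compute the exact sum, then let each low-order bit flip with some probability, the way an undervolted least-significant-bit circuit might.

```python
import random

def noisy_add(a, b, flip_prob=0.05, noisy_bits=4):
    # Exact sum first; then each of the low-order bits may flip with
    # probability flip_prob -- a stand-in for running those logic
    # circuits at a lower, noise-prone voltage.
    s = a + b
    for bit in range(noisy_bits):
        if random.random() < flip_prob:
            s ^= 1 << bit
    return s

random.seed(1)
results = [noisy_add(1, 1) for _ in range(10_000)]
wrong = sum(r != 2 for r in results) / len(results)
print(wrong)  # should land near the predicted 1 - 0.95**4, i.e. about 0.19
```

The point being: the answer is wrong a *predictable* fraction of the time, which matches the "engineers can calculate the probability" line from the article.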
-
@Yamikuronue said in Your math teacher lied to you:
Sentences like this one are why I have trouble understanding these articles sometimes. What? My computer doesn't have OCD; it neither obsesses nor exhibits compulsive behaviors. It is a complete and utter math nerd though, is that what you mean?
So you're complaining not that the article weirdly anthropomorphizes computers, but that it doesn't weirdly anthropomorphize them in the way you would have.
My computer's just a box of metal and plastic. It doesn't have opinions, emotions, or DSM-5 diagnosees.
This post contains 3 legit words that Chrome's moron dictionary doesn't know, one word I apparently made up, and anotherusername can go fuck himself.
-
What's certain is that we'll be seeing a lot of new chip types in the coming decades, now that processors aren't getting any faster.
I wonder if FPGAs could adopt this technology too?
-
@anotherusername said in Your math teacher lied to you:
[Cryptography [...] algorithms] are typically designed to arrive quickly at an approximate answer.
? If cryptography used approximate answers it wouldn't be cryptography!
-
@blakeyrat said in Your math teacher lied to you:
diagnosees.
This post contains 4 legit words that Chrome's moron dictionary doesn't know
It isn't a word. Webster's says it's not. But if it were a word, then Wiktionary would have an opinion, because it thinks it means:
Noun
diagnosee (plural diagnosees):
One who is diagnosed
Your computer hasn't diagnosed any mental patients? Slacker.
Did you by happenstance intend to mean the plural of "diagnosis"? Because that's "diagnoses".
-
@Yamikuronue said in Your math teacher lied to you:
It is a complete and utter math nerd though, is that what you mean?
No, that would be me(?), but I won't try to obsessively compare
a < b
below Planck's number. If the new chip could do some fast and more importantly low-power vector arithmetic I could use it right away in a product.
-
@boomzilla At the point where it says 1+1 = 2.06, we don't really need to do anything more than round, right?
I'd like to think of it like this, and maybe I'm wrong.
But if I'm looking at a picture of 2 apples, and one is slightly bigger than the other, and I ask it to count the apples, it gives me 2.06, because it just doesn't really care to realize that apples don't have to be the same size.
Then, you realize that when you look at the picture of 2 apples, you kind of do the same thing. You recognize the volume is bigger than one apple and smaller than three (2.06) and then the discrete part kicks in and says it must be 2 apples.
My example may be totally off what's actually happening, but is it pointed in the right direction?
-
@xaade They talk about how it would be useful for things like music players or images, where we already have lossy compression. This would be like that, except the compression losses would be slightly different every time. The trade-off is that you can run the chip with a lot less power.
If you were adding 1+1, sometimes you'd get something other than 2. Based on the voltage, you'd be able to predict how often that would happen but not necessarily detect it in specific instances.
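The "predict how often" part is just independent-bit probability. Assuming (purely for illustration) that each of n low-order bits flips independently with probability p, the chance of getting the exact answer is (1 - p)^n:

```python
def p_exact(flip_prob, noisy_bits):
    # Probability that an approximate adder returns the exact answer,
    # assuming each low-order bit flips independently with flip_prob.
    return (1 - flip_prob) ** noisy_bits

# Lower voltage -> higher flip_prob -> exactness drops, but predictably.
for p in (0.01, 0.05, 0.10):
    print(p, round(p_exact(p, 4), 3))
# 0.01 -> 0.961, 0.05 -> 0.815, 0.10 -> 0.656
```

So you can dial voltage against error rate and know the odds in advance, even though you can't flag which individual additions came out wrong.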
-
@ben_lubar said in Your math teacher lied to you:
@anotherusername said in Your math teacher lied to you:
[Cryptography [...] algorithms] are typically designed to arrive quickly at an approximate answer.
? If cryptography used approximate answers it wouldn't be cryptography!
Fuzzy logic is how something like a retina or a fingerprint is reliably encrypted. The logic has to be fuzzy, because the input contains a lot of noise. Instead of wasting a lot of time and effort filtering out noise, you just build an algorithm that's slightly imprecise and it doesn't even see the noise.
Fuzzy Identity-Based Encryption (Fuzzy-IBE) is the next evolution of identity-based schemes, adding error-tolerance feature to the scheme, making it a convenient choice for IBE systems using biometric identities. The motivation derives from the fact that biometric readings are not exact and always carry an amount of noise.
http://www.diva-portal.se/smash/get/diva2:725347/FULLTEXT01.pdf
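To be clear, real Fuzzy-IBE does this with secret sharing over biometric attributes; the sketch below only shows the core error-tolerance idea (the bit strings and threshold are invented for the example): accept a reading that's merely *close* to the enrolled template, since no sensor reproduces it exactly.

```python
def hamming(a, b):
    # Number of differing bits between two equal-length bit strings.
    return sum(x != y for x, y in zip(a, b))

def fuzzy_match(enrolled, reading, threshold=3):
    # A biometric reading never reproduces the enrolled template exactly,
    # so accept it if it's within `threshold` bit flips of the template.
    return hamming(enrolled, reading) <= threshold

template = "1011001110100101"
noisy    = "1011011110100001"   # two bits flipped by sensor noise
print(fuzzy_match(template, noisy))  # True: close enough to accept
```

That's the sense in which the crypto is "approximate" -- the decryption tolerates noisy input, not that the ciphertext math itself is sloppy.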
-
@anotherusername cryptography and passwords that can never be changed don't go together well.
-
@boomzilla said in Your math teacher lied to you:
@xaade They talk about how it would be useful for things like music players or images, where we already have lossy compression.
Or, I'm guessing, for image processing. You have to deal with noise, artifacts and so on in your images anyway, so if some of the processing you do adds a bit of noise, you might be OK with that, especially if it means that your stuff will run faster or with less power.
-
@ben_lubar well, you asked. It would also apply to some things that you can change, like a spoken password or a pass-"gesture" (e.g. Stride, although I don't know if it uses fuzzy logic to encrypt the gesture; it would certainly be applicable though).
There are actually a bunch of results if you just google "fuzzy logic cryptography".
-
@cvi said in Your math teacher lied to you:
Or, I'm guessing, for image processing. You have to deal with noise, artifacts and so on in your images anyway, so if some of the processing you do adds a bit of noise, you might be OK with that, especially if it means that your stuff will run faster or with less power.
That was actually one of the examples given in the source. In fact, it's what they're being funded for (by the military, of course).
Bates reports promising results for applications such as high-resolution radar imaging, extracting 3-D information from stereo photos, and deep learning ...
In a simulated test using software that tracks objects such as cars in video, Singular’s approach was capable of processing frames almost 100 times faster than a conventional processor restricted to doing correct math—while using less than 2 percent as much power.
DARPA funded Singular’s chip as part of a program called Upside, which is aimed at inventing new, more efficient ways to process video footage. Military drones can collect vast quantities of video, but it can’t always be downloaded during flight, and the computer power needed to process it in the air would be too bulky.
-
So, you simply combine them - one processor for logic that has to be correct and one to deal with the fuzzy stuff :)
-
This smells like they're trying to make processors that understand Common Core.
-
@Tsaukpaetra We need something that does, clearly, since it's apparently impossible to stop teaching it at this point. Hopefully our new fuzzy logic overlords will be able to teach it in a way that is understandable.
-
@Fox said in Your math teacher lied to you:
in a way that is understandable.
Maybe. After all, apparently the researchers that produced the AlphaGo instance can't get it to explain how or why it beat that grandmaster Go player, so hopes are somewhat muted on that front...
-
@Tsaukpaetra So what you're saying is that we should take these processors and implant them into our brains directly so that we can understand Common Core instead of trying to learn it from them. Brillant! It'd save a lot on education costs, too.
-
@Fox said in Your math teacher lied to you:
take these processors and implant them into our brains directly so that we can understand Common Core instead of trying to learn it from them
Welcome to the Collective, @Fox! Your reservation ticket has been verified. You are currently assigned to Um19N3W4s4G1.
The tutorial will automatically launch while installation is in progress as part of the calibration and installation setup routines. If you wish to skip the tutorial, don't think about skipping the tutorial in 3.... 2.... 1....
-
@Yamikuronue said in Your math teacher lied to you:
@dse said in Your math teacher lied to you:
It cures the computer's obsessive compulsive disorder.
Sentences like this one are why I have trouble understanding these articles sometimes. What? My computer doesn't have OCD; it neither obsesses nor exhibits compulsive behaviors. It is a complete and utter math nerd though, is that what you mean?
No, even math nerds make mistakes. Only people with OCD are "perfectionists" who never make mistakes. (The analogy falls flat because a person with OCD would double/triple/quadruple-check their work, while a computer "knows" it's right from the get-go.)
-
@Captain Huh. Okay.
-
@Captain autism would be a better analogy.
-
@anotherusername said in Your math teacher lied to you:
@Captain autism would be a better analogy.
Yes and no. Only some autistics are savants and only some autistic-savants never make mistakes about their specialty.
-
@tharpa what about the ones who aren't savants but get bogged down easily when trying to process way too much information, without knowing how to filter out noise and ignore it?
-
@anotherusername Better make it a car analogy. Those are always the best.
-
@blakeyrat said in Your math teacher lied to you:
My computer's just a box of metal and plastic. It doesn't have opinions, emotions, or DSM-5 diagnosees.
The thing is, though, sometimes that kind of thing is a useful abstraction, as long as you're not the kind of person with shoulder aliens that take said abstraction too far.
-
@boomzilla What do you think this is, Slashdot?
-
@RaceProUK listen if I'm driving from Chiswick to Newcastle, I don't go via Birmingham, do I?
-
@RaceProUK said in Your math teacher lied to you:
@boomzilla What do you think this is, Slashdot?
Where do you think I got the link to the article?
-
@boomzilla Definitely not Slashdot. Nobody bothers to read the article there. Sometimes they don't even read the summary. Or the comment they're replying to.
-
@anotherusername said in Your math teacher lied to you:
That was actually one of the examples given in the source. In fact, it's what they're being funded for (by the military, of course).
I was more or less guessing that (didn't read the source [obviously], work etc etc). The idea has been around for a while and tends to pop up every now and then. A few years back I had some colleagues who were (briefly) looking into applications of this in computer graphics (that is, generating the images rather than analyzing them). IIRC the conclusion was that it was a bit more tricky there, with visible artifacts being somewhat undesirable. Also, they weren't hardware people, so the point was a bit moot.
-
@Buddy said in Your math teacher lied to you:
@RaceProUK listen if I'm driving from Chiswick to Newcastle, I don't go via Birmingham, do I?
For all I know, maybe you do.
-
@FrostCat Hmm… Chiswick to Newcastle-under-Lyme would route via Birmingham
-
@RaceProUK said in Your math teacher lied to you:
@FrostCat Hmm… Chiswick to Newcastle-under-Lyme would route via Birmingham
You know that I'm not going to bother to look up the routes, right, and that's what I meant?
-
@FrostCat that's like putting a Ferrari on top of a Porsche: you don't go any faster, but somebody, somewhere gets turned on by it.