WTFs from History



  • I've just been reading Singh's "The Code Book", and came across a good WTF in the German implementation of key-generation.

    Okay, a bit of background - there are two aspects of Enigma's operation that we're concerned with here.

    The first is that each daily key specifies which scrambler-disks, of the available 5, are to be used in the 3 slots of the machine. So the key for this aspect might be 4-3-5, meaning "place disk 4 in slot A, disk 3 in slot B and disk 5 in slot C".

    The second aspect is that up to 20 letters could be swapped by another part of the machine prior to encoding via the scramblers. So, the key for this aspect might be ag,eb,pv,sy,cm,etc.

    All pretty good.... except the Germans recognised that the more random these arrangements, the harder the key would be to crack. Unfortunately, the true meaning of "random" seems to have eluded them. They specified that no two consecutive days' keys could place a disk in the same slot. They also specified that no letter-swap key could swap two letters that were adjacent in the alphabet.
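    A quick brute-force count (a sketch, not period-accurate; yesterday's key of 1-2-3 is a made-up example) shows what the first rule costs:

    ```javascript
    // Count ordered arrangements of 3 of the 5 scrambler disks, then count
    // how many remain legal under "no disk in the same slot as yesterday".
    // Yesterday's key (1 in slot A, 2 in B, 3 in C) is hypothetical.
    var yesterday = [1, 2, 3];
    var total = 0, allowed = 0;

    for (var a = 1; a <= 5; a++)
      for (var b = 1; b <= 5; b++)
        for (var c = 1; c <= 5; c++) {
          if (a === b || a === c || b === c) continue; // disks are distinct
          total++;
          if (a !== yesterday[0] && b !== yesterday[1] && c !== yesterday[2])
            allowed++;
        }

    console.log(total + ' arrangements, ' + allowed + ' still legal');
    // → 60 arrangements, 32 still legal
    ```

    So a codebreaker who knows the rule and yesterday's key only has to try about half the disk orders.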

    I guess they wanted it Random 2.0



  • I don't know much of anything about cryptography and code-breaking, so be gentle...

    If the potential code-breaker knew those 2 specifications, it would obviously make his/her job a little easier. But as long as those specifications were kept secret, did this make the encryption stronger or not?



  • It limits the amount of possibilities and technically weakens the encryption. Though not by a whole lot.

    I think.



  • The trouble was that the Bletchley Park team could crack the keys, so they noticed the pattern, which subsequently made cracking each day's daily key much easier.



  • @dhromed said:

    It limits the amount of possibilities and technically weakens the encryption. Though not by a whole lot.

    I think.


    It makes a difference of a significant order of magnitude. Bear in mind that they were using mechanical devices to go through the possible keys, so the difference between 10,000 keys and 100,000 is more important than we are used to today.



  • There are some good articles on Wikipedia about how the Enigma machine worked, and about the cryptanalysis used against it.

    Later versions of the Enigma actually provided very good resistance to cryptanalysis, and if it weren't for various operational lapses, they probably would never have been "cracked" at all.

     



  • Several orders of magnitude.  I'd need to do a bit of mechanical checking, but I think it's more like going from 128-bit keys to 32-bit keys.

    I mean, the specification that they can't use the disk in the same space two days running alone reduces the entire problem size by almost 2/3s.  [Though not the initial problem size, assuming you have at minimum a set of two days and are trying to decrypt one, or better, have one decrypted.  For a single day, knowing that helps none.  Also assuming no overlap, which.... wouldn't work in their favor either] Quite a bit more if the scrambler disks are used for generation of whatever they used for a key stream, and I imagine they were.  In that case, it's closer to cube-rooting the problem size [this depends on operational mechanics], and there are further useful patterns since the N+2 date key has some pattern basis on the N date.

    The second operational restriction... well, I don't know much of how Enigma worked, but with a rotating keystream, that stricture could... well, it doesn't affect the problem size as much [like, maybe a 10% decrease if you have two days], but it makes some changes for the pattern-matching... though since I don't know Enigma well, I don't understand how much.  Basically, it would make both finding known strings and attempting to find the original key much easier.



  • @TheDauthi said:

    I mean, the specification that they can't use the disk in the same space two days running alone reduces the entire problem size by almost 2/3s.


    Less, I think.

    Without the restriction there are 5*4*3 = 60 combinations of disks.

    With the restriction, there are (I think) 3*3+1*4 = 13 combinations for disks 1 and 2.

    Of those 13, 6 of those allow 3 different choices for the 3rd disk, and 7 allow 2 choices, which makes a total of 18 + 14 = 32. I think....



  • @dhromed said:

    It limits the amount of possibilities and technically weakens the encryption.


    Yes. A bit like saying "to make your password more secure, it must contain at least one numeric character", which always makes me slap my forehead: now the password hacker knows something about my password.
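    To put a number on that forehead-slap (a sketch; the 8-character, lowercase-plus-digit alphabet is a made-up example):

    ```javascript
    // How much does "must contain at least one digit" shrink the search space?
    // Hypothetical setup: 26 lowercase letters + 10 digits, password length 8.
    var length = 8;
    var anyChar = 1, lettersOnly = 1;
    for (var i = 0; i < length; i++) {
      anyChar *= 36;      // no rule: any of the 36 symbols anywhere
      lettersOnly *= 26;  // the passwords the rule forbids (no digit at all)
    }
    var mustHaveDigit = anyChar - lettersOnly;

    console.log(anyChar, mustHaveDigit);
    // the attacker can skip roughly 7% of the space
    ```

    So the rule does hand the attacker something, though far less than the Enigma restrictions did.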



  • @tomandlu said:

    @dhromed said:
    It limits the amount of possibilities and technically weakens the encryption. Though not by a whole lot.

    I think.


    It makes a difference of a significant order of magnitude. Bear in mind that they were using mechanical devices to go through the possible keys, so the difference between 10,000 keys and 100,000 is more important than we are used to today.


    AH

    I always forget how quickly powers can increase or reduce a number.

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you have to add all chessboard squares, which doubles the amount of rice.



  • @teedyay said:

    @dhromed said:
    It limits the amount of possibilities and technically weakens the encryption.


    Yes. A bit like saying "to make your password more secure, it must contain at least one numeric character", which always makes me slap my forehead: now the password hacker knows something about my password.


    Maybe I should ask my bank about that for my online banking password.



  • @teedyay said:


    Yes. A bit like saying "to make your password more secure, it must contain at least one numeric character", which always makes me slap my forehead: now the password hacker knows something about my password.


    Hmm, probably six of one, half-a-dozen of the other in this case, since it limits the ease of a dictionary attack. Of course, in an ideal world, it shouldn't be necessary, since users would understand what makes a good password, and a decent length password would probably contain at least one numeric.

    Still, in an ideal world, we wouldn't need passwords anyway....



  • @dhromed said:


    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1





  • If you enjoyed Singh (on my shelf here) you might also enjoy Stephenson's Cryptonomicon.  It's good fiction with a solid foundation in crypto (including an excellent crypto algorithm developed just for the book) and a good number of laughs. If you absolutely adore Cryptonomicon (and only if you do) try the Baroque Cycle trilogy. There are many tie-ins between them but the Baroque Cycle is a bit tough going at times.



  • Considering that all of this happened 60 years ago, as the real birth
    of cryptography, I feel that the Germans did a commendable job of
    making things difficult. The cat and mouse games played in the German
    and British intelligence communities in that war were amazing.



    And yes, if they'd kept the notion about the adjacent letters and the disk
    positions secret then all would be good. But then, security by obscurity
    always is... until it isn't.



    To their credit, the second version of the Enigma machine had additional disks, making the number of possibilities that much larger.



    And to their credit, the whole damn thing was mechanical, widely
    distributed, and incredibly secret. It would have been interesting to
    see what would happen if the Germans discovered that they had been
    hacked.



  • @Stan James said:

    If you enjoyed Singh (on my shelf here) you might also enjoy Stephenson's Cryptonomicon.  It's good fiction with a solid foundation in crypto (including an excellent crypto algorithm developed just for the book) and a good number of laughs. If you absolutely adore Cryptonomicon (and only if you do) try the Baroque Cycle trilogy. There are many tie-ins between them but the Baroque Cycle is a bit tough going at times.


    Loved Cryptonomicon (big Stephenson fan in general - The Diamond Age is my favourite). Not yet got around to the Baroque Cycle, but it's on my list.

    As an aside, is it me or does Stephenson write great books with crappy endings?



  • @ohng69 said:

    Considering that all of this happened 60 years ago, as the real birth
    of cryptography, I feel that the Germans did a commendable job of
    making things difficult. The cat and mouse games played in the German
    and British intelligence communities in that war were amazing.



    And yes, if they'd kept the notion about the adjacent letters and the disk
    positions secret then all would be good. But then, security by obscurity
    always is... until it isn't.


    Well, I wouldn't call it the birth - Singh has many examples from BC, and the Renaissance in particular was a very fertile period.

    That aside, the adjacent letters/disk position thing was just plain dumb. Together, they represented a significant weakness and offered no benefits.

    The odd thing, IMHO, is that surely the Germans knew that this was dumb (i.e. not random) - it's only little kids that think rolling two sixes in a row is a less random result than two distinct numbers.



  • @ammoQ said:

    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1





    What a great way to find out that javascript craps out at 2^54. From that point on, the total is equal to 2^squares.


    var boardsize = 53;
    var rice = 1;
    var board = [];

    for (var i = 0; i < boardsize; i++)
    {
        board.push(rice); // put rice on board
        if (i < (boardsize - 1))
            rice = rice * 2; // double grains
    }

    document.write('maxrice: ' + rice + '<br>');

    var twoplus = Math.pow(2, boardsize);

    document.write('plusone: ' + twoplus + '<br>');

    var summa = 0;
    for (var i = 0; i < board.length; i++)
    {
        summa += board[i];
    }

    document.write('total: ' + summa + '<br>');
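    For what it's worth, a BigInt rewrite (a sketch; BigInt is a much later JavaScript addition) gets the exact 64-square total, since plain numbers lose integer exactness past 2^53:

    ```javascript
    // Exact rice total for all 64 squares using BigInt arithmetic.
    var rice = 1n, total = 0n;
    for (var square = 1; square <= 64; square++) {
      total += rice;  // put rice on this square
      rice *= 2n;     // double for the next one
    }
    console.log(total.toString()); // → 18446744073709551615, i.e. 2^64 - 1
    ```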



  • Wonderful Forum, slices posts




  • Re: WTFs from History - Enigma you can play with

    For those who are in the Washington DC area, the NSA sponsors the National Cryptologic Museum, with a working model of an Enigma machine that you can play with.....the museum is fascinating and well worth visiting.  (Don't know if all your calls are monitored, tho :-O  )



  • @dhromed said:


    What a great way to find out that javascript craps out at 2^54. From that point on, the total is equal to 2^squares.


    Please tell me you didn't have to write a program to prove my point.

    Anyway, here is the proof (by induction)...

    statement: for all n>=1, sum(i in 1..n; 2^(i-1)) = 2^n - 1

    basis (n=1): sum(i in 1..1; 2^(i-1)) = 2^0 = 1 = 2^1 - 1
    inductive step: sum(i in 1..(n+1); 2^(i-1)) = sum(i in 1..n; 2^(i-1)) + 2^n = 2^n - 1 + 2^n = 2*(2^n) - 1 = 2^(n+1) - 1
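    A quick numeric spot-check of the same closed form (a sketch using plain doubles, so it stays safely below 2^53):

    ```javascript
    // Verify sum_{i=1..n} 2^(i-1) === 2^n - 1 for the first 50 board sizes.
    var sum = 0;
    for (var n = 1; n <= 50; n++) {
      sum += Math.pow(2, n - 1); // grains on the nth square
      if (sum !== Math.pow(2, n) - 1) console.log('mismatch at n=' + n);
    }
    console.log('checked up to n=50');
    ```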



  • @tomandlu said:

    @TheDauthi said:
    I mean, the specification that they can't use the disk in the same space two days running alone reduces the entire problem size by almost 2/3s.


    Less, I think.

    Without the restriction there are 5*4*3 = 60 combinations of disks.

    With the restriction, there are (I think) 3*3+1*4 = 13 combinations for disks 1 and 2.

    Of those 13, 6 of those allow 3 different choices for the 3rd disk, and 7 allow 2 choices, which makes a total of 18 + 14 = 32. I think....

    You're correct.  I forgot the obvious case that the item used in disk 1 is the item used in disk 2 on the prior day, which makes the first two disks 3*4+1 instead of 3*4.  From there, it goes downhill, giving those blasted Germans 8 more combinations than I expected.  Honestly, I just didn't think it all the way out.



  • @ammoQ said:

    @dhromed said:

    What a great way to find out that javascript craps out at 2^54. From that point on, the total is equal to 2^squares.


    Please tell me you didn't have to write a program to prove my point.



    I didn't.

    I just wanted to see the numbers, and enjoy writing code in general.

    I'm also not much of a mathhead.



  • @TheDauthi said:

    @tomandlu said:

    @TheDauthi said:
    I mean, the specification that they can't use the disk in the same space two days running alone reduces the entire problem size by almost 2/3s.


    Less, I think.

    Without the restriction there are 5*4*3 = 60 combinations of disks.

    With the restriction, there are (I think) 3*3+1*4 = 13 combinations for disks 1 and 2.

    Of those 13, 6 of those allow 3 different choices for the 3rd disk, and 7 allow 2 choices, which makes a total of 18 + 14 = 32. I think....

    You're correct.  I forgot the obvious case that the item used in disk 1 is the item used in disk 2 on the prior day, which makes the first two disks 3*4+1 instead of 3*4.  From there, it goes downhill, giving those blasted germans 8 more combinations than I expected.  Honestly, I just didn't think it all the way out. 



    I sympathise - when I first read that calculating the number of valid sudokus is not simple, I played with that to understand the problem. This is similar, so I was ready for it.

    BTW I didn't find the real number by anything but brute force - anyone know if there is a more algorithmy way?


  • Wow! An enigma-based google-ad...



  • @ammoQ said:

    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.




  • @Thanny said:

    @ammoQ said:
    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.




    BEEP.
    That's actually the opposite of the mistake I made. :)

    The total on a board of N fields is always  (2^N) - 1

    For example, three fields:

    [1] [2] [4]
    final field = 2^(N-1) = 2^2 = 4
    total = (2^N) - 1

    Four fields:



    [1] [2] [4] [8]

    final field = 2^(N-1) = 2^3 = 8

    total = (2^N) - 1 = 15

    Etcetera.



  • @Thanny said:

    @ammoQ said:
    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.



    I'm sorry, I've learned the other kind of mathematics. Where 2^0=1.



  • @ohng69 said:



    And yes, if they'd kept the notion about the adjacent letters and the disk
    positions secret then all would be good. But then, security by obscurity
    always is... until it isn't.


    Let's say you and I each choose a secret crypto algorithm.  You choose
    AES-256, but don't tell me.  I, in my brilliance and wisdom,
    choose ROT-13.  Nobody knows what my algorithm is!  But
    keeping that a secret would be useless, because cryptanalysis would
    crack it instantly.



    Security by obscurity is not always good; cryptanalysis can find
    patterns in the encrypted signal without knowing the original
    algorithm.  Having a smaller set of encryption keys ultimately
    means that those patterns will be easier to detect.




  • @ammoQ said:

    @Thanny said:
    @ammoQ said:

    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.



    I'm sorry, I've learned the other kind of mathematics. Where 2^0=1.



    Powers between 0 and 1 are weird.
    Back in preschool, I was taught that the ^N is the [i]amount[/i] of times a number should be multiplied with itself, and that interpretation doesn't lend itself to anything other than integers > 0 for N. Powers not in that range just don't compute in my mind and I'm very curious how it's done.

    In a moment of clarity I came up with x^y = x / x^(-y + 1) but, um, that doesn't actually help anything.



  • @dhromed said:



    Powers between 0 and 1 are weird.
    Back in preschool, I was taught that the ^N is the [i]amount[/i] of times a number should be multiplied with itself, and that interpretation doesn't lend itself to anything other than integers > 0 for N. Powers not in that range just don't compute in my mind and I'm very curious how it's done.

    In a moment of clarity I came up with x^y = x / x^(-y + 1) but, um, that doesn't actually help anything.


    In the case of 0, it helps to consider x^(n-1) = (x^n)/x ; in the case of n=1, you get x^0 = x/x = 1

    For non-integer values of y:

    exp(y) = lim(n->inf; (1+y/n)^n)

    x^y = exp(y*ln(x))
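    Numerically, the two routes agree (a sketch; x=9, y=2.5 is just a convenient example):

    ```javascript
    // x^y computed directly vs. via the identity x^y = exp(y * ln(x)).
    var x = 9, y = 2.5;
    var direct = Math.pow(x, y);
    var viaExpLog = Math.exp(y * Math.log(x));
    console.log(direct, viaExpLog); // both ~243, equal up to floating-point noise
    ```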




  • @dhromed said:

    @ammoQ said:
    @Thanny said:
    @ammoQ said:

    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.



    I'm sorry, I've learned the other kind of mathematics. Where 2^0=1.



    Powers between 0 and 1 are weird.
    Back in preschool, I was taught that the ^N is the [i]amount[/i] of times a number should be multiplied with itself, and that interpretation doesn't lend itself to anything other than integers > 0 for N. Powers not in that range just don't compute in my mind and I'm very curious how it's done.

    In a moment of clarity I came up with x^y = x / x^(-y + 1) but, um, that doesn't actually help anything.

    Assuming you remember some algebra, I'll go over it in medium gear, and if it needs clarification, I can go over it in more detail tomorrow. I don't remember the formal proofs anymore, but I can show why it works. This works a lot better with a chalkboard...
    X^0 is defined as 1 via some ugly logarithmic summation and some recursive logic that basically says, 'because that's the only way it makes sense'. [The one I remember really is a reductio ad absurdum]. It doesn't seem to make a lot of sense, but it's really whatever your multiplicative identity is. However, the other way to approach the problem kinda follows from the explanation of the rest of the problem. [It doesn't REALLY work without the limit, though, for you technical people. And also breaks at X=0]


    X^N 0<N<1 is a simpler case.  Just remember that, from Algebra, the three exponent rules:

    (X^M) * (X^N) = X^(M+N)

    X^(M-N) = (X^M)/(X/N)

    (X^M)^N = X^(M*N)

    Now, take that first rule, then substitute (1/2) as both M and N. This shows that X^(1/2) is the same as sqrt(X), by definition.


    Expanded, that handles your simple roots, such as X^(1/N), where N is any whole number greater than 1: it's the Nth root of that number.
    So, we know how to get a root, which tells us what the denominator does. Now, take the third rule. From that, with a little rearranging of formulae, you can see that any number X^(N * 1/M) is the same as (X^N)^(1/M), which, from above, is EXACTLY the same as the Mth Root Of X to the Nth power.



    IOW, X to a fractional power is the same as denominator's root of (X to the numerator's power).
    Or, in javascript, X^(N/M) is Math.pow(Math.pow(X, N), 1/M), assuming I didn't screw that up. It's possible, I'm suffering from insomnia again.

    Example: 9^(5/2) should be the same as (9^5)^(1/2). That becomes, as earlier, sqrt(9^5), which becomes sqrt(59049), which becomes 243. Checking that on a calculator shows it to be correct.
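    The 9^(5/2) example above, checked both ways (a sketch):

    ```javascript
    // X^(N/M) === (X^N)^(1/M): the denominator's root of (X to the numerator).
    var direct = Math.pow(9, 5 / 2);
    var rootOfPower = Math.sqrt(Math.pow(9, 5)); // sqrt(59049)
    console.log(direct, rootOfPower); // 243 both ways (up to rounding)
    ```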



  • @ammoQ said:

    @dhromed said:


    Powers between 0 and 1 are weird.
    Back in preschool, I was taught that the ^N is the [i]amount[/i] of times a number should be multiplied with itself, and that interpretation doesn't lend itself to anything other than integers > 0 for N. Powers not in that range just don't compute in my mind and I'm very curious how it's done.

    In a moment of clarity I came up with x^y = x / x^(-y + 1) but, um, that doesn't actually help anything.


    In the case of 0, it helps to consider x^(n-1) = (x^n)/x ; in the case of n=1, you get x^0 = x/x = 1

    For non-integer values of y:

    exp(y) = lim(n->inf; (1+y/n)^n)

    x^y = exp(y*ln(x))

    Bah.  Beat me to it, and with the more technically accurate version.  Still, it doesn't work at x=0, as 0/0 is an indeterminate form, and thus undef.  And, as I recall, the limit is 1, but it needs to equal 1 for little things like the binomial theorem, probability, set theory.



  • @Thanny said:

    @ammoQ said:
    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.




    The real WTF is that nobody's ever going to eat that much rice.



  • @TheDauthi said:

    X^(M-N) = (X^M)/(X/N)


    Oeh, small typo:

    x^(m-n) = x^m / x[b]^[/b]n


    I'm very much enjoying this.



  • @RayS said:

    @Thanny said:
    @ammoQ said:
    @dhromed said:

    It's like that chessboard request: one grain of rice on the first, two on the second, four on the third, eight, sixteen etc.
    Many people will end up at 2^64, but they forget that you've to add all chessboard squares which doubles the amount of rice.


    Hmmm...
    1st field: 2^0
    2nd field: 2^1
    .
    .
    .
    64th field: 2^63
    ----------------------
    Sum: 2^64-1


    2^64-2



    The first square has one grain of rice, not two.




    The real WTF is that nobody's ever going to eat that much rice.


    Dunno.
    Maybe if you bring a few relatives?



  • @TheDauthi said:

    Still, doesn't work at x=0, as 0/0 is indeterminate(sp) form, and thus undef.  And, as I recall, the limit is 1, but it's needed to equal 1 by little things like the binomial theorem, probability, set theory.

    0^0 is undefined

    look at
    x^y = exp(y*ln(x))

    for x=0, we get the subterm ln(0) which is -inf; exp(-inf)=0

    therefore:
    lim(x->0; x^0) = 1
    lim(y->0; 0^y) = 0
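    You can watch the two limits collide at the origin (a sketch; note that IEEE-754, which JavaScript follows, simply picks 1 for pow(0, 0)):

    ```javascript
    // Approaching (0,0) along the two axes gives different answers,
    // which is the whole 0^0 trouble.
    var eps = 1e-12;
    console.log(Math.pow(eps, 0)); // → 1   (x -> 0 while y = 0)
    console.log(Math.pow(0, eps)); // → 0   (y -> 0 while x = 0)
    console.log(Math.pow(0, 0));   // → 1   (the convention JavaScript ships with)
    ```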









  • @ammoQ said:

    @TheDauthi said:
    Still, doesn't work at x=0, as 0/0 is indeterminate(sp) form, and thus undef.  And, as I recall, the limit is 1, but it's needed to equal 1 by little things like the binomial theorem, probability, set theory.

    0^0 is undefined

    look at
    x^y = exp(y*ln(x))

    for x=0, we get the subterm ln(0) which is -inf; exp(-inf)=0

    therefore:
    lim(x->0; x^0) = 1
    lim(y->0; 0^y) = 0



    exp(n) is e^n, right?



  • yes, exp(n) = e^n



  • @ammoQ said:

    yes, exp(n) = e^n


    Oh good.

    Them 6 years of highschool math haven't completely been wasted. It's been 7 years now.

    Pity that we only covered e and ln with a single small chapter. e is the base for which the graph of e^x is the same as that of its derivative, (e^x)'.

    Though that may be enough to extrapolate all the mathy goodness.

    There was a huge one on matrix calculations, but I've forgotten most of that because, let's just say, I have plenty of HD space but little cache on my brain's CPU, and matrices require that you remember above-average amounts of number stuff while you work on 'em.



  • @ammoQ said:

    @TheDauthi said:
    Still, doesn't work at x=0, as 0/0 is indeterminate(sp) form, and thus undef.  And, as I recall, the limit is 1, but it's needed to equal 1 by little things like the binomial theorem, probability, set theory.

    0^0 is undefined

    look at
    x^y = exp(y*ln(x))

    for x=0, we get the subterm ln(0) which is -inf; exp(-inf)=0

    therefore:
    lim(x->0; x^0) = 1
    lim(y->0; 0^y) = 0



    Yes, which is why it SHOULD be undef.  However, per Knuth, it must be defined as 1 for the binomial theorem to work for some X/Y values.  There's also some strong backing in set theory that says that it should be 1.  I understand why it should be undefined from calculus, but it's defined as 1 because a bunch of things don't work if it isn't.  Basically, the value is context-sensitive.



  • @TheDauthi said:


    @ammoQ said:
    @TheDauthi said:
    Still, doesn't work at x=0,
    as 0/0 is indeterminate(sp) form, and thus undef.  And, as I recall,
    the limit is 1, but it's needed to equal 1 by little things like the
    binomial theorem, probability, set theory.

    0^0 is undefined

    look at
    x^y = exp(y*ln(x))

    for x=0, we get the subterm ln(0) which is -inf; exp(-inf)=0

    therefore:
    lim(x->0; x^0) = 1
    lim(y->0; 0^y) = 0


    Yes, which is why it SHOULD be undef.  However, per Knuth, it must be defined as 1 for the binomial theorem to work for some X/Y values.  There's also some strong backing in set theory that says that it should be 1.  I understand why it should be undefined from calculus, but it's defined as 1 because a bunch of things don't work if it isn't.  Basically, the value is context-sensitive.


    One day, we will find the unified theorem that will remove the silly "context sensitivity" of 0^0 and 0/0

