Karmic Revenge: should I screw this guy or not?



  • IMO you are better off if you just let that guy work. Making the life of your subordinates miserable will most likely backfire on you; at the very least, it shows a lack of leadership. If that guy really is an arrogant a**hole, he will give you a better reason to get rid of him sooner or later.



  • @Lexarius said:

    @mister said:
    There you have it, the full function. It works flawlessly if you call it in the correct universe (the one where the algorithm is already sorted). If you run it in the wrong universe... well, I can't help you with that, can I?

    Ah, I see. That works, but my QA guys aren't happy with your error handling. I guess if they want it to work in all (remaining) universes, they'll just have to deal with O(n).

     

    Ah, you puny humans with your aleph-zero multiverse! In aleph-one space, the spaghetti sort algorithm can be extended to yield the desired sorted array in one machine cycle, for O(1). There is no need for error handling, since the algorithm is guaranteed to yield the correct result for all arrays with fewer than aleph-zero elements.  8-))

    And could someone please open that box - the cat knows whether it's alive or dead, already. 



  • @operagost said:

        Do not repay anyone evil for evil. Be careful to do what is right in the eyes of everybody. If it is possible, as far as it depends on you, live at peace with everyone. Do not take revenge, my friends, but leave room for God's wrath, for it is written: "It is mine to avenge; I will repay,"says the Lord. On the contrary:
       "If your enemy is hungry, feed him;
          if he is thirsty, give him something to drink.
       In doing this, you will heap burning coals on his head."

    OK, I have trouble believing that this guy will burn in hell just because he was irritated by the OP's suggestion that it would be possible for a very smart person to cross the absolute O(n log n) theoretical barrier.



  • For a guy like that, the fact that you're his boss is punishment enough.  Even if you're the nicest boss in the world, he'll think you're out to get him, and will look for every opportunity to discredit you behind your back.  Now, it seems that your company is unique in that people actually know this guy is full of shit.  So I say be the best boss you can be, then silently wait for him to self destruct.

    When the time comes for you to hand him his layoff papers, then you can be condescending :)



  • @snoofle said:

    In the context of the interview, the whole sorting question was totally off base w.r.t. the kinds of things that he was supposed to be checking me out on (knowledge of MQ and TIBCO,
     

    People focus too much on that sort of thing in interviews.  What are you using MQ for?  You're putting messages on it and getting messages from it.  So what do you need to know about it?  Queue.put() and Queue.get().  TIBCO?  Same thing.  

    The other attributes you mentioned are definitely important, but I'd put algorithm efficiency over knowledge of a particular product and whether they named the function enqueue or put.  Without stuff like that you'll end up with the guy creating a "connection pool" of MQ connections that are only ever used one at a time by a single thread (i.e. a single connection could have been used)...

    If you can prove that you know the fundamentals, the specific implementations can be learned by reading a farking book.

    /I love the interviews that go like....
    "What kind of databases do you have experience with?"
    "Ummm...job A used Oracle and job B used MySQL, I guess?"
    "Oohhhhh, so you don't have any Sybase experience?  Well, we only use Sybase."
    "I'm interviewing for a developer job, not a DBA, right?"

    If I ever have one of those again, I'm walking out.
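
To make the put/get point concrete, here is a minimal sketch using Python's stdlib queue.Queue as a stand-in for an MQ or TIBCO client (the real client classes and method names differ by product; this only illustrates the shape of the day-to-day API):

```python
from queue import Queue

# Stand-in for a real message-queue client: the day-to-day API surface
# really is just "put a message on, take a message off".
q = Queue()

q.put({"type": "order", "id": 42})   # producer side
msg = q.get()                        # consumer side (blocks until a message arrives)
q.task_done()                        # acknowledge; roughly analogous to an MQ commit

print(msg["id"])                     # → 42
```

Everything product-specific (connection factories, channel names, transactionality) can be picked up from the vendor docs once the fundamentals are in place.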

     



  • @poopdeville said:

    particular, some data can be sorted in less than n log n time, using QS.
     

    I seriously doubt that, to put it mildly.



  • @ounos said:

    @poopdeville said:

    particular, some data can be sorted in less than n log n time, using QS.
     

    I seriously doubt that, to put it mildly.

    Your doubts, though evidently serious, are trivially wrong.

    QuickSort has an AVERAGE run time in O(n log n). It has a WORST CASE run time in O(n^2). That is to say, there is a list in the set of lists of length n that requires O(n^2) comparisons to sort with QS. Since n^2 is bigger than n log n, the AVERAGE, we can conclude that there is a list that can be sorted in fewer comparisons than O(n log n). Otherwise O(n log n) wouldn't be the AVERAGE.



  • QuickSort has a best case of Θ(n log n). Enough said.
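
For anyone who would rather check than argue: instrument a quicksort to count comparisons and see how the count scales. A minimal sketch (this naive list-based variant is illustrative only, not a claim about any particular production implementation):

```python
import random

def quicksort(a, counter):
    """Three-way quicksort that tallies pivot comparisons in counter[0]."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    less, equal, greater = [], [], []
    for x in a:
        counter[0] += 1          # one pivot comparison per element per pass
        if x < pivot:
            less.append(x)
        elif x == pivot:
            equal.append(x)
        else:
            greater.append(x)
    return quicksort(less, counter) + equal + quicksort(greater, counter)

random.seed(0)
n = 1024
data = random.sample(range(n), n)
counter = [0]
assert quicksort(data, counter) == sorted(data)
# The count lands on the order of n log2 n (roughly 10 * n for n = 1024),
# far above the n comparisons a sub-n-log-n claim would require.
print(counter[0])
```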



  • @bohica61 said:

    Tell him if he can't design a sort algorithm with consistently better performance than a quicksort, his next performance review will not go well. 

    seconded. There is no better way to remind him of the interview...

     

    Edit: added quote



  • @ounos said:

    QuickSort has a best case of Θ(n log n). Enough said.

    Your assertion implies that O(n log n) = O(n^2). Trivially false.



  • @poopdeville said:

    @ounos said:

    @poopdeville said:

    particular, some data can be sorted in less than n log n time, using QS.
     

    I seriously doubt that, to put it mildly.

    Your doubts, though evidently serious, are trivially wrong.

    QuickSort has an AVERAGE run time in O(n log n). It has a WORST CASE run time in O(n^2). That is to say, there is a list in the set of lists of length n that requires O(n^2) comparisons to sort with QS. Since n^2 is bigger than n log n, the AVERAGE, we can conclude that there is a list that can be sorted in fewer comparisons than O(n log n). Otherwise O(n log n) wouldn't be the AVERAGE.

     

    If I understand you correctly, you are saying that quicksort's best case time complexity is less than Θ(n log(n)). I hope I interpreted your statement correctly.  If so, tell it to these guys:

     

    http://www.inf.fh-flensburg.de/lang/algorithmen/sortieren/quick/quicken.htm


    The best-case behavior of the quicksort algorithm occurs when in each recursion step the partitioning produces two parts of equal length. In order to sort n elements, in this case the running time is in Θ(n log(n)). This is because the recursion depth is log(n) and on each level there are n elements to be treated (Figure 2 a).


    http://www.cs.swan.ac.uk/~csfm/Courses/CS_332/quicksort.pdf

     

    Informal Analysis: PARTITION performs Θ(n) comparisons (where n = r − p + 1).

    Best case: even split every time:

    T(n) = Θ(n) + T(n/2) + T(n/2−1), so T(n) = Θ(n lg n).

    "Average case": probably Θ(n lg n), since even with a bad 99-1 split every time, we get

    [snip un-pasteable formula] = Θ(n lg n).

     


    http://en.wikipedia.org/wiki/Quicksort#Formal_analysis


    In the best case, each time we perform a partition we divide the list into two nearly equal pieces. This means each recursive call processes a list of half the size. Consequently, we can make only log n nested calls before we reach a list of size 1. This means that the depth of the call tree is Θ(log n). But no two calls at the same level of the call tree process the same part of the original list; thus, each level of calls needs only Θ(n) time all together (each call has some constant overhead, but since there are only Θ(n) calls at each level, this is subsumed in the Θ(n) factor). The result is that the algorithm uses only Θ(n log n) time.


    http://www.urch.com/forums/gre-computer-science/7425-why-quicksort-average-case-close-its-best-case.html

    <font size="1"> Note: O(g(n)) is used instead of BigTheta(g(n)) for easier readability.</font>

    <font size="4">Why Quicksort average-case is close to its best-case?</font>

    <font color="#ff0000"> As we know, Quicksort runs in the worst case in O(n^2) and in the best case in O(n*log n). Its average case still runs in O(n*log n).  </font>

     

     

     



  • @poopdeville said:

    @ounos said:

    QuickSort has a best case of Θ(n log n). Enough said.

    Your assertion implies that O(n log n) = O(n^2). Trivially false.

     

    Please go to college. You are making a fool of yourself.

    If college is too much work, at the very least look up what Theta actually means before making "trivial" statements.



  • @poopdeville said:

    @ounos said:

    QuickSort has a best case of Θ(n log n). Enough said.

    Your assertion implies that O(n log n) = O(n^2). Trivially false.

     

    Maybe this article will shed some light on your fallacious assertion:

    http://pages.cs.wisc.edu/~hasti/cs367-common/notes/COMPLEXITY.html#best

    When do Constants Matter?

    Recall that when we use big-O notation, we drop constants and low-order terms. This is because when the problem size gets sufficiently large, those terms don't matter.

    However, this means that two algorithms can have the same big-O time complexity, even though one is always faster than the other. For example, suppose algorithm 1 requires N^2 time, and algorithm 2 requires 10 * N^2 + N time. For both algorithms, the time is O(N^2), but algorithm 1 will always be faster than algorithm 2. In this case, the constants and low-order terms do matter in terms of which algorithm is actually faster.

    I can apply the same reasoning to the best, worst and average case complexity.  In the case of quicksort, the best case will (obviously) always be faster than or just as fast as any average case.  And the difference between the two could be a constant factor or a lower-order term, which will not show up once the time complexity is expressed in O-notation.

     



  • Thanks for proving that a Google search is easy. Some people just don't know that answers exist a few clicks away.

    This is quite to the point: 


    (With a small substitution of "asking dumb questions" to "making dumb statements"). 



  • Call analysis and comparison analysis are not the same. And TIME is relativized to the sort of operation you are analyzing.

    QS can sort a list in n comparisons, even though it goes through O(n log n) calls.



  • @poopdeville said:

    Call analysis and comparison analysis are not the same. And TIME is relativized to the sort of operation you are analyzing.

    QS can sort a list in n comparisons, even though it goes through O(n log n) calls.

     

    ??? Time complexity analysis has to take into account the number of (important) steps a program runs through to achieve its goal.  Unless you claim that each of the O(log n) calls takes 0 time?

    Really, when comparing algorithms, computer scientists and mathematicians look at time and/or space complexity.  When you are looking at time complexity, not sure why you would want to restrict yourself to only looking at a certain kind of step and ignoring other (non-trivial) steps.



  •  @poopdeville said:

    Call analysis and comparison analysis are not the same. And TIME is relativized to the sort of operation you are analyzing.

    QS can sort a list in n comparisons, even though it goes through O(n log n) calls.

    Oh the troubles one goes not to accept being "trivially" wrong... :)



  • Well, I was mistaken anyway, but the reason you want to restrict yourself to some operations and not others is that some are slow (and thus significant) and some are fast. Time is always relativized to the number of times an operation of a certain class is performed. A function call is fast, but interchanging array values is slow.



  • @poopdeville said:

    Well, I was mistaken anyway, but the reason you want to restrict yourself to some operations and not others is that some are slow (and thus significant) and some are fast. Time is always relativized to the number of times an operation of a certain class is performed. A function call is fast, but interchanging array values is slow.
     

    I understand that, but it has nothing to do with your argument.  My previous reply to you was a bit off base, because I was confused by your response.  The reason the number of function calls is counted is not because the act of calling the function itself is slow, but because the body of the function is doing something significant.

    If my algorithm's (let's call it...quicksort) helper function (let's call it...partition) takes O(n) time for each call, and partition() is called O(log n) times, then I guess my entire algorithm takes O(n log n) time, regardless of whether I am looking at the function call overhead or not.  And I agree that, in this case, the time analysis only looks at operations such as comparisons and value swapping, not function calls.  Not that it would make a difference, since counting the function calls would only introduce a constant multiplicative factor into the analysis. 



  • It doesn't make a difference in this case, since each function call to your QS implementation partitions and compares. So you will have O(n log n) comparisons if you have O(n log n) calls. I admitted my mistake when I realized this. Combinatorics can be a tricky thing, especially when combined with Landau symbols.

    On the other hand, conditionals in code can make the number of "significant" operations different from the number of function calls for that operation. Algorithms that cache the results of expensive computations, for example.
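
The caching point is easy to demonstrate: with memoization, the number of expensive body executions stays far below the number of calls. A small sketch using Python's stdlib functools.lru_cache (the Fibonacci function here is just a stand-in for any expensive computation):

```python
from functools import lru_cache

computed = []        # records the arguments that actually ran the body

@lru_cache(maxsize=None)
def fib(n):
    computed.append(n)           # only cache misses reach this line
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))        # → 832040
# The naive recursion would make ~2.7 million calls; the memoized body
# runs exactly once per distinct argument.
print(len(computed))  # → 31 (the arguments 0..30)
```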



  • @poopdeville said:

    It doesn't make a difference in this case, since each function call to your QS implementation partitions and compares. So you will have O(n log n) comparisons if you have O(n log n) calls.
     

    Yup, I agree.  That is pretty much what I was trying to say (although I didn't express it very well).  I was previously thrown off by your assertion that only O(n) comparisons are made in the best case, and I replied too quickly without thinking, so I didn't address it directly.

    @poopdeville said:

    On the other hand, conditionals in code can make the number of "significant" operations different from the number of function calls for that operation. Algorithms that cache the results of expensive computations, for example.

    Good point.  You are right to say that function calls (in and of themselves) are usually not considered. I think that's why, IIRC, time complexity is usually fuzzily defined as number of algorithmic steps (I'm speaking informally here), rather than in specific terms like "recursive function calls"/"comparisons"/etc.  For each distinct algorithm, you have to decide what kind of operations count as significant steps (like you said). 

     



  • @utunga said:

    Just in case you are seriously, honestly going to abide by the opinions of TDWTF readers i want to voice my vote for clean slate.
     

    Seconded.

    I'm genuinely surprised how many people say the opposite. Two wrongs don't make a right (*). Presumably they put you in charge because you are a better boss than he is. Prove them right by acting like one. You're in a professional setting, you do not have to like each other but make the best of it, not the worst. Use the strengths of the people below you, not their weaknesses. 

    (*) but three lefts do. 



  • @SenTree said:

    And could someone please open that box - the cat knows whether it's alive or dead, already. 
     

    You open the box and the box is empty.

    Interpret the results. 

    The book "Dirk Gently's Holistic Detective Agency" lies on the table.

    Possible exists: NORTH, EAST, WEST 

    >  




  • @BertBert said:

    (*) but three lefts do. 
     

    not in every situation 



  • @Nelle said:

    @SenTree said:

    And could someone please open that box - the cat knows whether it's alive or dead, already. 
     

    You open the box and the box is empty.

    Interpret the results. 

    "I'm having this terrible difficulty with my ... search for pussy ... too".

    @Nelle said:

    The book "Dirk Gently's Holistic Detective Agency" lies on the table.

    Possible exists: NORTH, EAST, WEST 

    You are indeed correct, the possible does exist (sometimes) !

    > XYZZY - damn, wrong game !

     



  • Anybody who eschews bubble sort is an algorithm elitist. I bubble sort everything just to keep it real.



  • @shakin said:

    Anybody who eschews bubble sort is an algorithm elitist. I bubble sort everything just to keep it real.
    I don't sort at all.  Why fight entropy?



  • @bstorer said:

    I don't sort at all.  Why fight entropy?

    In fact, every time you do sort you accelerate the heat death of the universe, so you're saving lives by leaving your data how it falls.

     

    You, sir, are a hero. 



  • @snoofle said:

    The guy asks me what's the most efficient sort algorithm. I respond with quick-sort, or something similar. He asks for big-O performance. I fire back n-log(n). He asks if I can design something with consistently better performance. I think for a nanosecond before firing back: "If Knuth couldn't do better, what makes you think that I can?" He bristled and began to talk down to me.

    You're presuming a definition of "sort algorithm" that is not stated: one where the only way the items to be sorted can be examined is by comparison.

    Bucket sort.
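
Right: the Ω(n log n) lower bound only applies to comparison sorts. When keys can be examined directly, bucket sort (shown here in its simplest counting-sort form, assuming small non-negative integer keys) runs in O(n + k):

```python
def counting_sort(items, max_key):
    """Sort non-negative integers in [0, max_key] in O(n + k) time.

    No element is ever compared to another element, so the
    comparison-sort lower bound simply does not apply."""
    buckets = [0] * (max_key + 1)
    for x in items:
        buckets[x] += 1                  # index directly by key value
    out = []
    for key, count in enumerate(buckets):
        out.extend([key] * count)
    return out

print(counting_sort([5, 3, 8, 3, 0, 5], max_key=8))   # → [0, 3, 3, 5, 5, 8]
```

The trade-off is the O(k) space for the buckets, which is why this only pays off when the key range is modest relative to n.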



  • You have a choice. You can be an asshole manager, or you can be a respected manager. If you mistreat the guy, other workers will notice it and distrust you. My recommendation is you try your best to find the best spot that matches this guy's skills and get the highest quality work he's able to deliver. Holding grudges will just give you ulcers.

    The wheel of life turns. One day he may be YOUR boss.

     



  • Does he do good work? Some guys have shitty attitudes but manage to make up for it with exceptional work. I like to think I'm one of those guys. 



  • @zzo38 said:

    Do a clean slate, because he now works for you instead, so it is different, so you can forgive him. If he does the bad thing again, then you can fire him.
     

    He is likely to 'do the bad thing' with your other subordinates, when you aren't looking. You need to get rid of him as quickly as possible. As a new manager, you will be required to justify your decision to get rid of him. Keep a detailed list of his wrongdoings. Discuss these with your boss. Schedule documented meetings to discuss these wrongdoings with the a**hole. Take a management course if possible. Don't just attack. Attack with a vengeance. 

