Light sorting
-
Generally, comparison isn't done using the assignment operator, right?
-
I think you need to recheck your 'comparison'
oh.
stupid brain. stop autocorrecting what you see!
-
Maybe you should switch to VB.NET ...
-
In Go,
return x = y
is a syntax error.
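The distinction is easy to demonstrate in a couple of lines of Go (the `equal` helper is made up for illustration):

```go
package main

import "fmt"

// equal reports whether x and y compare equal. Writing
// "return x = y" here would not compile: in Go, = is a
// statement-level assignment, not an expression, so it cannot
// appear where a value is expected. Comparison is spelled ==.
func equal(x, y int) bool {
	return x == y
}

func main() {
	fmt.Println(equal(3, 3), equal(3, 4)) // true false
}
```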
-
In VB.NET,
return x = y
returns the result of an equality comparison between x and y ...
-
...which is exactly useless as a sort comparator. The canonical numeric value for True is -1 in all the MS BASICs, which means that when x equals y the comparator claims x is less, and when they differ it claims they're equal.
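To see why that's useless, here's a hypothetical Go translation of what such a comparator would hand to a sort routine, once the VB.NET Boolean is coerced to a number under the MS-BASIC convention (True = -1, False = 0); `vbCompare` is a made-up name purely for illustration:

```go
package main

import "fmt"

// vbCompare sketches what VB.NET's "Return x = y" would yield as a
// numeric comparator: -1 (True) when the operands are equal,
// 0 (False) when they differ.
func vbCompare(x, y int) int {
	if x == y {
		return -1 // True: equal elements report "x sorts before y"
	}
	return 0 // False: distinct elements report "x and y are equal"
}

func main() {
	// A three-way comparator should return <0, 0, >0 for less, equal,
	// greater; this one can never return a positive value, so no sort
	// can recover a correct order from it.
	fmt.Println(vbCompare(2, 2), vbCompare(1, 2), vbCompare(2, 1)) // -1 0 0
}
```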
-
-
@flabdablet said:
The canonical numeric value for True is -1 in all the MS BASICs
TRWTF
Why? That's all-1s in two's complement; False is all-0s.
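A quick sketch of the bit patterns in Go, for anyone who wants to check:

```go
package main

import "fmt"

func main() {
	// In two's complement, 0 is the all-0s bit pattern and -1 is
	// all-1s, so flipping every bit of zero yields minus one:
	fmt.Println(^0 == -1) // true
	// That's why an all-ones True unifies bitwise and logical
	// operators: & and | behave like AND and OR on -1/0 "booleans".
	t, f := -1, 0
	fmt.Println(t&t, t&f, t|f) // -1 0 -1
}
```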
-
Or you could go with all numbers other than 0.
Then you can have truthiness.
-
true is 1 in most languages.
-
-
Or you could go with all numbers other than 0.
Then you can have truthiness.
Hey, kid, here's a nickel. Go buy yourself a real boolean.
Filed under: get off my lawn
-
Still better than accidental assignment, isn't it?
-
Not really, since if the level field isn't an int you'll at least get a compile-time error.
-
TRWTF
Eh? Using -1 for the canonical true has the upshot of unifying bitwise and logical operators, at the cost of losing out on short-circuit evaluation... (which can work out rather well in a non-infix language).
-
Bool.FileNotFound
-
data WtfBool = True | False | FileNotFound
Filed under: crazy ideas thread...
-
Both are O(n log n) sorts so they're both pretty fast on big datasets, and the extra memory is not usually a problem.
When you want to choose between two O(n log n) algorithms, you'll have to look at the constants. That's where merge sort is faster than quicksort. And a LOT less tricky to implement, too.
-
When you want to choose between two O(n log n) algorithms, you'll have to look at the constants.
While that's true…
That's where merge sort is faster than quicksort.
That's not. When done right (the pivot choice is critical), quicksort is incredibly fast even by comparison with other O(n log n) algorithms.
And a LOT less tricky to implement, too.
And both algorithms are fairly tricky to implement well, enough so that you want to do it once in a library and then just use that. Practically though, the main reason for choosing mergesort is that it's both fast and stable. That's a nice combination. The other high-speed sorts require auxiliary data (i.e., a composite key that includes the original element order) to achieve that.
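For reference, the stability comes straight out of the merge step; here's a minimal Go sketch (an illustration, not a tuned library implementation):

```go
package main

import "fmt"

// mergeSort returns a stably sorted copy of s: equal elements keep
// their original relative order because the merge prefers the left
// run on ties. It allocates O(n) auxiliary space, which is the
// memory trade-off discussed above.
func mergeSort(s []int) []int {
	if len(s) <= 1 {
		out := make([]int, len(s))
		copy(out, s)
		return out
	}
	mid := len(s) / 2
	left, right := mergeSort(s[:mid]), mergeSort(s[mid:])
	out := make([]int, 0, len(s))
	for len(left) > 0 && len(right) > 0 {
		if left[0] <= right[0] { // <= (not <) is what keeps the sort stable
			out = append(out, left[0])
			left = left[1:]
		} else {
			out = append(out, right[0])
			right = right[1:]
		}
	}
	return append(append(out, left...), right...)
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 8, 2, 1})) // [1 2 2 5 8]
}
```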
Unless you have everything sorted and are adding one element to the collection. That's what insertion sort is best for (and it's easy to get right).
-
Unless you have everything sorted and are adding one element to the collection. That's what insertion sort is best for (and it's easy to get right).
Wouldn't a binary search be faster?
-
Not significantly unless comparison is unusually expensive. If you're sorting a new item into an existing array, then you have to shuffle all the things after its correct position up one spot to make room; the major overhead is getting that stuff in and out of cache, and it usually costs almost nothing extra to do the comparisons as you go rather than as a separate binary search step ahead of time.
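One way to fold the comparisons into the shifting pass, as described above (a sketch; `insertSorted` is a made-up name):

```go
package main

import "fmt"

// insertSorted inserts v into the already-sorted slice s, comparing
// as it shifts: it walks from the end, moving larger elements up one
// slot, and drops v into the gap. The comparisons ride along with the
// shuffling that has to happen anyway, rather than running as a
// separate binary search beforehand.
func insertSorted(s []int, v int) []int {
	s = append(s, v) // grow by one; v's final slot is found below
	i := len(s) - 1
	for i > 0 && s[i-1] > v {
		s[i] = s[i-1] // shift larger element up one spot
		i--
	}
	s[i] = v
	return s
}

func main() {
	fmt.Println(insertSorted([]int{1, 3, 5, 9}, 4)) // [1 3 4 5 9]
}
```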
-
If you're sorting a new item into an existing array
And if you're sorting into a linked list where insertion is cheap, you won't be binary searching; that's expensive in linked lists (where indexing is O(n)). It's possible to create hybrid data structures that don't have these problems, but they're getting complex and aren't usually justifiable.
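The linked-list case looks like this in Go (a minimal sketch with a made-up `node` type):

```go
package main

import "fmt"

type node struct {
	val  int
	next *node
}

// insert places v into an ascending-sorted singly linked list and
// returns the (possibly new) head. The scan is linear: binary search
// buys nothing here because reaching the middle of the list is itself
// O(n). The insertion itself, once positioned, is O(1) pointer work.
func insert(head *node, v int) *node {
	if head == nil || v < head.val {
		return &node{val: v, next: head}
	}
	cur := head
	for cur.next != nil && cur.next.val <= v {
		cur = cur.next
	}
	cur.next = &node{val: v, next: cur.next}
	return head
}

func main() {
	var head *node
	for _, v := range []int{3, 1, 2} {
		head = insert(head, v)
	}
	for n := head; n != nil; n = n.next {
		fmt.Print(n.val, " ") // 1 2 3
	}
	fmt.Println()
}
```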
-
And luckily, for the cases where they are justifiable, somebody else has already done the work.