Hacker News

I'm not sure there is a new algorithmic approach to sorting like you're thinking of. From a high level you can divide sorting algorithms into roughly two camps: "those that use divide-and-conquer" and "those that do not". The divide-and-conquer algorithms (split into smaller sub-lists, sort the sub-lists, and merge them while preserving sort order) are asymptotically better, running in O(n log n) rather than the O(n^2) of simple methods like insertion sort. From this perspective, algorithms that we think of as quite distinct, like quicksort and mergesort, are not that different - quicksort just does the "divide" step of divide-and-conquer in a smarter way.
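To illustrate the "smarter divide" point, here's a minimal sketch (not production code, and not anyone's optimized implementation): mergesort splits blindly at the midpoint and does the real work when merging, while quicksort partitions around a pivot so the recursively sorted halves need no merge at all.

```python
def partition(a, pivot):
    # Quicksort's "smart divide": everything left of the pivot value
    # ends up <= everything right of it, so concatenation suffices later.
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return less, equal, greater

def quicksort(a):
    if len(a) <= 1:
        return list(a)
    less, equal, greater = partition(a, a[len(a) // 2])
    # No merge step needed: the divide step already ordered the pieces.
    return quicksort(less) + equal + quicksort(greater)
```

This list-building version trades the in-place partitioning of real quicksorts for clarity; the structural contrast with mergesort is the point.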

In the end, no matter what divide-and-conquer sorting algorithm you pick, you will be doing lots of "small sorts". And it is those small sorts that DeepMind has optimized here.
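A rough sketch of what "lots of small sorts" means in practice (the cutoff value and names here are illustrative, not DeepMind's): production sorts typically recurse only down to some small size, then hand the tiny sub-lists to a simple routine like insertion sort. Those base-case small sorts are the hot paths that were optimized.

```python
INSERTION_CUTOFF = 8  # illustrative threshold; real libraries tune this

def insertion_sort(a):
    # Simple quadratic sort that is fast on very small inputs.
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def merge(left, right):
    # Standard two-way merge of already-sorted lists.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def hybrid_mergesort(a):
    # Divide-and-conquer until sub-lists are small, then "small sort".
    if len(a) <= INSERTION_CUTOFF:
        return insertion_sort(list(a))
    mid = len(a) // 2
    return merge(hybrid_mergesort(a[:mid]), hybrid_mergesort(a[mid:]))
```

Since every element passes through the base case, speeding up the small sort speeds up the whole algorithm, whichever divide-and-conquer scheme sits on top.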



> I'm not sure there is a new algorithmic approach to sorting like you're thinking of. From a high level you can divide sorting algorithms into roughly two camps, "those that use divide-and-conquer", and "those that do not". ...

I think the sci-fi-but-possibly-real hope is that for sorting (among other things), we may have the perspective that there isn't any new algorithmic approach available, but an AI finds one for us.


That would be awesome! Obviously it's hard to imagine what that would look like, since a necessary part of it is "the AI comes up with something we couldn't imagine". But here's one potential idea, based on these DeepMind discoveries being "exploiting guarantees of previous steps" and "you can use simpler sorts when the first part of the list is sorted": the AI might find some way to perform a cheap-but-weak "divide" step and a cheap-but-weak "merge" step, such that the guarantees from each step happen to interact in a way that produces fully correct sorting.


There are lots of accepted sorting algorithms [1]. I'm sure we can come up with novel algorithms, even if they're not optimal. As Wikipedia notes, they all fall into a small number of higher-level categories (e.g. partitioning, merging, selection, insertion). I'm still not convinced that the optimizations presented in the article amount to the discovery of NEW sorting algorithms rather than merely optimizations of existing ones.

[1] https://en.wikipedia.org/wiki/Sorting_algorithm


Indeed, it's easy to prove that any sorting algorithm that works by comparing elements (unlike e.g. radix sort) requires Ω(n log n) comparisons in the worst case: such an algorithm is a binary decision tree that must distinguish all n! possible input orderings, so its depth is at least log2(n!) = Θ(n log n).
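A quick sketch of that counting argument in code (the function name is just illustrative): a binary tree with worst-case depth d has at most 2^d leaves, and the tree needs at least n! leaves, so d >= ceil(log2(n!)).

```python
import math

def comparison_lower_bound(n):
    # Minimum worst-case number of comparisons for any comparison sort:
    # a binary decision tree needs n! leaves, hence depth >= log2(n!).
    return math.ceil(math.log2(math.factorial(n)))

for n in (4, 16, 1024):
    # The bound grows like n*log2(n), matching mergesort up to constants.
    bound = comparison_lower_bound(n)
    assert bound <= math.ceil(n * math.log2(n))
```

By Stirling's approximation, log2(n!) ≈ n log2(n) - n log2(e), which is why no comparison sort can beat mergesort's O(n log n) by more than a constant factor.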



