I am a college student and I recently came across this in an online publication. Is this a fact?


I do feel recruiters believe that. I've not seen scientific evidence.

The way you're wording it, it sounds like a choice between people who just program and people who program but also study algorithms. In that case it's easy: the latter will know more.

He'd also be a better programmer if he instead studied design patterns, learned some extra languages, or learned about type systems...

There are a lot of things you can learn that'll make you a better developer, but that you can avoid if you stay at a high enough level of abstraction. Advanced algorithms are one of those.

EDIT: there are of course some narrow applications where advanced algorithms are essential. Good algorithms can have a huge impact on performance, and some things are only feasible with a good algorithm (the FFT, for example). The reason they're not essential for everyone to study is that they're often reusable: once someone has implemented them in a library, you can call them without knowing the details.
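As a small illustration of that reuse (a hypothetical example, not from the original answer): Python's standard library ships a binary search in the `bisect` module, so a developer gets the O(log n) algorithm without ever having studied it, while a hand-rolled linear scan is O(n).

```python
import bisect
import timeit

data = sorted(range(1_000_000))
target = 987_654

def linear_contains(xs, x):
    # Hand-rolled O(n) scan: needs no algorithmic study
    for v in xs:
        if v == x:
            return True
    return False

def bisect_contains(xs, x):
    # Reuses the library's O(log n) binary search: the algorithm
    # was studied and implemented once, by someone else
    i = bisect.bisect_left(xs, x)
    return i < len(xs) and xs[i] == x

assert linear_contains(data, target) == bisect_contains(data, target)

linear_t = timeit.timeit(lambda: linear_contains(data, target), number=5)
bisect_t = timeit.timeit(lambda: bisect_contains(data, target), number=5)
print(f"linear: {linear_t:.4f}s  bisect: {bisect_t:.4f}s")
```

On a sorted million-element list the library call is faster by orders of magnitude, yet using it requires only reading its documentation, which is exactly the reuse the edit above describes.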