About Big-O Notation – The good, the bad, and the confusing

Before choosing a specific algorithm, it often helps to understand how that algorithm scales as its input grows. To document this, computer science has borrowed a notation from mathematics called Big-O notation.
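As a minimal illustrative sketch (not from the original post), here are two membership tests over a sorted list: one that is O(n) because it may examine every element, and one that is O(log n) because each comparison halves the remaining search space. The function names are my own for illustration.

```python
from bisect import bisect_left

def linear_search(items, target):
    """O(n): in the worst case, every element is examined."""
    for item in items:
        if item == target:
            return True
    return False

def binary_search(sorted_items, target):
    """O(log n): bisect halves the candidate range on each step."""
    i = bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target

data = list(range(1_000_000))
assert linear_search(data, 999_999)   # walks ~1,000,000 elements
assert binary_search(data, 999_999)   # needs only ~20 comparisons
```

Both return the same answer; Big-O describes how the work grows with input size, not which one is faster on any particular input.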

Understanding Big-O notation is critical to making good decisions about the algorithms or libraries used to implement routines. However, it is often misunderstood, and I frequently see it used as justification for poor decisions.