How would Big-O notation help in my day-to-day C# programming? Is it just an academic exercise?
I am reading the answers here and I (seriously) think that big-O is underestimated.
As coders who make money from coding, we need to know what big-O is and why we need it.
Let me explain what I think: Big-O notation describes the efficiency/performance of your code. You have to know how fast your code runs when the inputs get bigger, because in real life you can't know the exact number of inputs in advance. Furthermore, you can't compare two different algorithmic approaches without asymptotic notation, so if you want to choose the better one, you compare them with big-O and see which one fits your situation. Both may be inefficient, but you will know which one is better, as the sketch below shows.
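To make that concrete, here is a minimal C# sketch (the duplicate-check problem and the method names are my own invented example, not from the question). Both methods answer the same question, but one is O(n²) and the other is O(n), and the difference only becomes obvious as n grows:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class BigOExample
{
    // O(n^2): compares every pair of elements.
    static bool HasDuplicateQuadratic(int[] items)
    {
        for (int i = 0; i < items.Length; i++)
            for (int j = i + 1; j < items.Length; j++)
                if (items[i] == items[j])
                    return true;
        return false;
    }

    // O(n): a single pass, remembering seen values in a HashSet.
    static bool HasDuplicateLinear(int[] items)
    {
        var seen = new HashSet<int>();
        foreach (int item in items)
            if (!seen.Add(item)) // Add returns false if the value was already present
                return true;
        return false;
    }

    static void Main()
    {
        foreach (int n in new[] { 1_000, 10_000, 100_000 })
        {
            // Worst case for both: no duplicates, so every element is examined.
            int[] data = new int[n];
            for (int i = 0; i < n; i++) data[i] = i;

            var sw = Stopwatch.StartNew();
            HasDuplicateQuadratic(data);
            sw.Stop();
            Console.WriteLine($"n={n,7}  quadratic: {sw.ElapsedMilliseconds,6} ms");

            sw.Restart();
            HasDuplicateLinear(data);
            sw.Stop();
            Console.WriteLine($"n={n,7}  linear:    {sw.ElapsedMilliseconds,6} ms");
        }
    }
}
```

At small n both look instant; at n = 100,000 the quadratic version does roughly five billion comparisons while the linear one does 100,000 hash lookups. That is exactly what the notation predicted before you ran anything, which is why timing a single input size is no substitute for knowing the big-O.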