I am learning about Big O Notation running times and amortized times. I understand the notion of O(n) linear time, meaning that the size of the input affects the growth of the algorithm proportionally.
Both of the following cases take O(log n) time:
case 1:

    void f(int n) {
        int i;
        /* i doubles each pass: 1, 2, 4, 8, ... so the loop runs about log2(n) times */
        for (i = 1; i < n; i = i * 2)
            printf("%d", i);
    }

case 2:

    void f(int n) {
        int i;
        /* i halves each pass: n, n/2, n/4, ... so this loop also runs about log2(n) times */
        for (i = n; i >= 1; i = i / 2)
            printf("%d", i);
    }
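To convince myself, I wrote a small sanity check (my own sketch, not from any reference) that counts the iterations of the case 1 loop for a few input sizes and compares the count against log2(n):

    #include <stdio.h>
    #include <math.h>

    /* Count how many times the doubling loop from case 1 executes. */
    int count_doubling_steps(int n) {
        int steps = 0;
        int i;
        for (i = 1; i < n; i = i * 2)  /* same loop as case 1 */
            steps++;
        return steps;
    }

    int main(void) {
        int sizes[] = {8, 64, 1024, 1048576};
        int k;
        for (k = 0; k < 4; k++) {
            int n = sizes[k];
            printf("n = %7d  iterations = %2d  log2(n) = %5.1f\n",
                   n, count_doubling_steps(n), log2(n));
        }
        return 0;
    }

For n = 8, 64, 1024, and 1048576 (2^20) this prints 3, 6, 10, and 20 iterations, matching log2(n) exactly. So if I understand correctly, both loops are O(log n) because each pass either doubles i toward n or halves it toward 1, and that can only happen about log2(n) times before the loop condition fails.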