I am refreshing on the Master Theorem a bit, and I am trying to figure out the running time of an algorithm that solves a problem of size n by recursively solving 2 subproblems of size n − 1 and combining the solutions in constant time.
Don't even think about the Master Theorem. The Master Theorem only applies to recurrences of the form T(n) = aT(n/b) + f(n) with b > 1, that is, where the subproblem size shrinks by a constant *factor*. Here the size shrinks by *subtraction* (n to n − 1), so the theorem does not apply.
Instead, think of it as a recursion tree. Each call on input of size n does constant O(1) work and makes two recursive calls on input of size n − 1, and the recursion bottoms out when the size reaches 1. The tree therefore has depth n, and the number of calls doubles at each level, so the total number of calls is 2^0 + 2^1 + … + 2^(n−1) = 2^n − 1. Since each call costs O(1), the total running time is O(2^n).