Question
The formula for the minimum number of nodes in an AVL tree of height h is recursive: n(0) = 1, n(1) = 2, n(h) = 1 + n(h-1) + n(h-2).
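A small Python sketch (my own illustration, not part of the original question) of that recurrence shows why the height grows so slowly: the minimum number of nodes increases roughly exponentially with h, so the height of an AVL tree with N nodes is O(log N).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def min_nodes(h):
    """Minimum number of nodes in an AVL tree of height h."""
    if h == 0:
        return 1
    if h == 1:
        return 2
    return 1 + min_nodes(h - 1) + min_nodes(h - 2)

for h in range(11):
    print(h, min_nodes(h))
# h = 10 already requires at least 232 nodes; each extra unit of
# height needs roughly 1.6x more nodes, so the height is O(log N).
```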
On the other hand, I found this on the internet as an explanation of the complexity of adding N elements to an empty AVL tree:
Well, imagine the tree being built.
One by one, the elements go through the tree nodes and choose their place by going either left or right. The tree rebalances itself every time the heights become too skewed.
Of course, the cost of balancing the tree is an O(1) operation, so I do not consider that in the complexity analysis.
Complexity: log(1) + log(2) + log(3) + ... + log(n)
= log(n!)
= n log n - n + O(log n)   (by Stirling's approximation)
= O(n log n)
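The step from log(n!) to O(n log n) is just Stirling's approximation. A quick numeric check (my own sketch, not from the quoted explanation), using math.lgamma for log(n!), shows the two quantities growing at the same rate:

```python
import math

for n in (10, 100, 10_000, 1_000_000):
    log_factorial = math.lgamma(n + 1)   # natural log of n!
    n_log_n = n * math.log(n)
    print(n, round(log_factorial), round(n_log_n),
          round(log_factorial / n_log_n, 3))
# The ratio slowly approaches 1 as n grows, i.e. log(n!) = Theta(n log n).
```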
But here is what I do not understand: why is the calculation log(n!) when the height does not increase every time I add an element? Since the recursive formula above implies that for large N the AVL height increases only after many more elements are added, shouldn't the bound asymptotically be better than log(n!)?
Also, what is the worst case here? Is there a worst case and a best case for this complexity? For example, if I add a specific set of N elements, could different runs have different running times? Or is it known in advance where each element will be inserted, so that the running time is tightly bounded?
Answer 1:
Simpler Upper Bound Explanation
If you have n elements, the most time one insert can take is log(n). If we assume this worst-case insert time for all n items, we get O(n*log(n)) without the more involved derivation.
Another way of looking at it is:
log(1) + log(2) + log(3) + ... + log(n) < log(n) + log(n) + ... + log(n) = n*log(n)
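To see where the per-insert log(n) comes from, here is a minimal AVL-insertion sketch in Python (an assumed implementation for illustration, not code from the answer): each insert walks one root-to-leaf path and does at most a couple of rotations, so a single insert is O(log n) and n inserts into an empty tree are O(n log n).

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1  # height of the subtree rooted here (leaf = 1)

def height(node):
    return node.height if node else 0

def balance(node):
    return height(node.left) - height(node.right)

def update(node):
    node.height = 1 + max(height(node.left), height(node.right))

def rotate_right(y):
    x = y.left
    y.left = x.right
    x.right = y
    update(y)
    update(x)
    return x

def rotate_left(x):
    y = x.right
    x.right = y.left
    y.left = x
    update(x)
    update(y)
    return y

def insert(node, key):
    # Standard BST descent: O(height) = O(log n) for an AVL tree.
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)

    update(node)
    b = balance(node)

    # Rebalance with at most two rotations: O(1) extra work per level.
    if b > 1 and key < node.left.key:       # left-left
        return rotate_right(node)
    if b > 1 and key >= node.left.key:      # left-right
        node.left = rotate_left(node.left)
        return rotate_right(node)
    if b < -1 and key >= node.right.key:    # right-right
        return rotate_left(node)
    if b < -1 and key < node.right.key:     # right-left
        node.right = rotate_right(node.right)
        return rotate_left(node)
    return node

if __name__ == "__main__":
    root = None
    for k in range(1, 1025):   # sorted input: worst case for a plain BST
        root = insert(root, k)
    print(height(root))        # stays small, close to log2(1024) = 10
```

Inserting the keys 1..1024 in sorted order would degenerate a plain BST into a list of height 1024; the AVL rebalancing keeps the height near log2(1024) = 10, which is what makes the n*log(n) bound hold.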
Source: https://stackoverflow.com/questions/36809377/running-time-of-adding-n-elements-into-an-empty-avl-tree