I have the following task (as part of a bigger task): I need to take the k-th element from an array-like data structure and delete it (k is any possible index).
The only data structure that has a small overhead for adding and removing elements is a hash table. The only overhead is the cost of the hash function (and that is considered O(1) if you take a purely theoretical approach).
But, if you want it to be extremely efficient, you will need to:
If you manage to get everything right, then you should be optimal.
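As a minimal illustration of the hash-table idea, here is a sketch in Python (its built-in dict is a hash table; the setup and names are mine, not from the question):

    # Python's dict is a hash table: insertion, lookup and deletion by key
    # are all O(1) on average.
    items = dict(enumerate([3, 7, 6, 4, 3]))

    value = items[2]   # read the element stored under key 2
    del items[2]       # remove it in O(1) average time

    # Note: the remaining keys are not re-indexed, so this behaves like a
    # map from original positions to values rather than like an array.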
The "dating" problem as described involves continuously choosing and removing the "best" member of a set: a classic priority queue. In fact, you'll need to build two of those (one for men and one for women). You can either build them in O(N log N) time (a sorted list) and get O(1) removal, or build them in linear time (a heap) and get O(log N) removal. Either way the total is O(N log N), since you'll be removing all of one queue and most of the other.
So then the question is what structure supports the other part of the task: choosing the "chooser" from the circle and removing him and his choice. Since this too must be done N times, any method that accomplishes the removal in O(log N) time won't increase the overall complexity of your algorithm. You can't get O(1) indexed access with fast deletions, given the re-indexing requirement. But you can in fact get O(log N) for both indexed access and deletion with a tree (something like a rope, as mentioned). That gives you O(N log N) overall, which is the best you can do anyway.
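To make the "O(log N) indexed access and deletion" part concrete, here is a rough sketch in Python that uses a Fenwick (binary indexed) tree of alive/removed flags instead of a rope; the class and method names are hypothetical, not from any library:

    # Sketch: "take the k-th remaining element and delete it" in O(log N),
    # using a Fenwick (binary indexed) tree of alive/removed counts.
    class IndexedDeleter:
        def __init__(self, values):
            self.values = list(values)
            self.n = len(self.values)
            self.fen = [0] * (self.n + 1)      # 1-based Fenwick array of counts
            for i in range(1, self.n + 1):     # linear-time build, every slot alive
                self.fen[i] += 1
                j = i + (i & -i)
                if j <= self.n:
                    self.fen[j] += self.fen[i]
            self.top = 1                       # largest power of two <= n
            while self.top * 2 <= self.n:
                self.top *= 2

        def pop(self, k):
            """Return and remove the k-th (0-based) still-present element."""
            k += 1                             # 1-based rank among alive elements
            pos = 0
            step = self.top
            while step:                        # binary lifting: find the rank-k slot
                nxt = pos + step
                if nxt <= self.n and self.fen[nxt] < k:
                    pos = nxt
                    k -= self.fen[nxt]
                step //= 2
            pos += 1                           # 1-based physical index of that slot
            i = pos
            while i <= self.n:                 # mark the slot as removed
                self.fen[i] -= 1
                i += i & -i
            return self.values[pos - 1]

For example, on [3, 7, 6, 4, 3], pop(2) returns 6, and a subsequent pop(3) returns the final 3, matching what an array would contain after the first deletion.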
There is a solution that may be satisfactory in some cases. Use an array plus a vector that records deletions. Every time you delete an element, you put its index into the vector. Every time you read an element at some index, you recalculate that index based on the previous deletions.
Say you have an array:
A = [3, 7, 6, 4, 3]
You delete the 3rd element:
A = [3, 7, 6, 4, 3] (no actual deletion)
d = [3]
Then you read the 4th element:
i = 4
3 < 4 => i += 1 (i becomes 5)
A[5] = 3
This is not exactly O(1), but it does not depend on the array length, only on the number of deleted elements.
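Here is a small sketch of that idea in Python (0-based indices, unlike the 1-based walkthrough above; the class name is hypothetical):

    import bisect

    # Keep the array intact; record deleted *physical* indices in sorted order
    # and translate a logical index on every read. Each translation costs O(d),
    # where d is the number of deletions so far, independent of the array length.
    class LazyDeleteArray:
        def __init__(self, values):
            self.a = list(values)
            self.deleted = []                  # sorted physical indices

        def _physical(self, i):
            for d in self.deleted:             # shift past every deleted slot
                if d <= i:                     # at or before the current position
                    i += 1
                else:
                    break
            return i

        def get(self, i):
            return self.a[self._physical(i)]

        def delete(self, i):
            bisect.insort(self.deleted, self._physical(i))

With A = [3, 7, 6, 4, 3], delete(2) followed by get(3) returns 3, mirroring the 1-based example above.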
I believe what you're asking for is impossible.
However, if you can relax your requirement for indexing to O(log n), then ropes may be able to satisfy it, although I'm not sure if they have a probabilistic or deterministic guarantee (I think it's probabilistic).