I was given this problem in an interview. How would you have answered?
Design a data structure that offers the following operations in O(1) time: insert, remove, contains (search), and getRandom (return a random element).
Can't we do this using Java's HashSet? It provides insert, delete, and search, all in O(1) on average. For getRandom we can make use of the set's iterator, which anyway gives arbitrary-looking ordering; we can just take the first element from the iterator without worrying about the rest of the elements:
public Integer getRandom() {
    // s is the backing HashSet<Integer>; take the first element the iterator yields
    Iterator<Integer> sitr = s.iterator();
    Integer x = sitr.next();
    return x;
}
Why don't we use epoch % arraySize to find a random element? Finding the array size is O(n), but the amortized complexity will be O(1).
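A minimal sketch of that suggestion, assuming al is the backing ArrayList and using the epoch time in milliseconds as the "random" source:

// Not uniformly random: repeated calls within the same millisecond
// return the same element, and the distribution depends on call timing.
int index = (int) (System.currentTimeMillis() % al.size());
Integer picked = al.get(index);

Note that this trades statistical quality for simplicity; an index from java.util.Random is just as cheap and actually uniform.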
The best solution is probably the hash table + array: it's really fast and deterministic.
But the lowest-rated answer (just use a hash table!) is actually great too!
People might not like it because of "possible infinite loops", and I've seen very smart people have this reaction too, but it's wrong! Astronomically unlikely events simply don't happen.
Assuming the good behavior of your pseudo-random source -- which is not hard to establish for this particular purpose -- and that the hash table is always at least 20% full, it's easy to see that:
It will never happen that getRandom() has to try more than 1000 times. Just never. Indeed, the probability of such an event is 0.8^1000 ≈ 10^-97 -- so we would have to repeat the experiment 10^88 times to have one chance in a billion of it ever happening once. Even if this program were running full-time on all of humankind's computers until the Sun dies, it would never happen.
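For the record, here is a minimal sketch of that retry loop. Java's HashSet does not expose its buckets, so this assumes a hand-rolled open-addressing table whose slot array (slots, a hypothetical name) we can index directly and which is kept at least 20% full:

// Each probe hits a non-empty slot with probability >= 0.2,
// so the expected number of iterations is at most 5.
static Integer getRandom(Integer[] slots, java.util.Random rnd) {
    while (true) {
        Integer candidate = slots[rnd.nextInt(slots.length)];
        if (candidate != null) {
            return candidate;
        }
    }
}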
For this question I will use two data structures:
1. An ArrayList<Integer> (al) that stores the elements.
2. A HashMap<Integer, Integer> (mp) that maps each element to its index in the list.
Steps:
1. Insert: if the element is not already in the map, append it to the list and record its index in the map.
2. Remove: swap the element with the last element of the list, drop the last slot, update the moved element's index in the map, and delete the removed element's entry.
3. Search: look the element up in the map.
4. Random: pick a uniform random index into the list and print the element there.
Code:
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.Random;
import java.util.Scanner;

public class JavaApplication1 {
    public static void main(String args[]) {
        Scanner sc = new Scanner(System.in);
        ArrayList<Integer> al = new ArrayList<Integer>();
        // Maps each element to its current index in the ArrayList.
        HashMap<Integer, Integer> mp = new HashMap<Integer, Integer>();
        Random rm = new Random();
        while (true) {
            System.out.println("**menu**");
            System.out.println("1.insert");
            System.out.println("2.remove");
            System.out.println("3.search");
            System.out.println("4.random");
            int ch = sc.nextInt();
            switch (ch) {
                case 1:
                    System.out.println("Enter the element");
                    int a = sc.nextInt();
                    if (mp.containsKey(a)) {
                        System.out.println("Element is already present");
                    } else {
                        // Append to the list and record its index in the map.
                        al.add(a);
                        mp.put(a, al.size() - 1);
                    }
                    break;
                case 2:
                    System.out.println("Enter the element which you want to remove");
                    a = sc.nextInt();
                    if (mp.containsKey(a)) {
                        int size = al.size();
                        int index = mp.get(a);
                        int last = al.get(size - 1);
                        // Swap the element with the last one, then drop the last slot.
                        Collections.swap(al, index, size - 1);
                        al.remove(size - 1);
                        // The former last element now lives at 'index'.
                        mp.put(last, index);
                        // Delete the removed element's entry (also handles a == last).
                        mp.remove(a);
                        System.out.println("Data deleted");
                    } else {
                        System.out.println("Data not found");
                    }
                    break;
                case 3:
                    System.out.println("Enter the element to search");
                    a = sc.nextInt();
                    if (mp.containsKey(a)) {
                        System.out.println(mp.get(a));
                    } else {
                        System.out.println("Data not found");
                    }
                    break;
                case 4:
                    if (al.isEmpty()) {
                        System.out.println("Data structure is empty");
                    } else {
                        // Uniform random index into the backing list.
                        int randomIndex = rm.nextInt(al.size());
                        System.out.println(al.get(randomIndex));
                    }
                    break;
            }
        }
    }
}
Time complexity: O(1) per operation (amortized for insert, since the ArrayList occasionally resizes). Space complexity: O(N).
Consider a data structure composed of a hashtable H and an array A. The hashtable keys are the elements in the data structure, and the values are their positions in the array.
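A minimal sketch of this scheme in Java (the class and method names here are my own, not from the thread):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Random;

class RandomizedSet {
    // H: element -> its position in A
    private final HashMap<Integer, Integer> positionOf = new HashMap<Integer, Integer>();
    // A: the elements themselves, packed contiguously
    private final ArrayList<Integer> elements = new ArrayList<Integer>();
    private final Random rnd = new Random();

    // O(1) amortized: append to A and record the position in H.
    public boolean insert(int x) {
        if (positionOf.containsKey(x)) {
            return false;
        }
        positionOf.put(x, elements.size());
        elements.add(x);
        return true;
    }

    // O(1): move the last element of A into the vacated slot, then shrink A.
    public boolean remove(int x) {
        Integer index = positionOf.remove(x);
        if (index == null) {
            return false;
        }
        int lastIndex = elements.size() - 1;
        int last = elements.get(lastIndex);
        if (index != lastIndex) {
            elements.set(index, last);
            positionOf.put(last, index);
        }
        elements.remove(lastIndex);
        return true;
    }

    // O(1): plain hash lookup.
    public boolean contains(int x) {
        return positionOf.containsKey(x);
    }

    // O(1): uniform random index into A; assumes the set is non-empty.
    public int getRandom() {
        return elements.get(rnd.nextInt(elements.size()));
    }
}

Filling the vacated slot with the last element is what keeps the array contiguous, so getRandom can index it uniformly without scanning.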
Since the array needs to grow automatically, adding an element is going to be amortized O(1), but I guess that's OK.
O(1) lookup implies a hashed data structure. By comparison: an unsorted array or linked list needs O(N) for lookup, and a balanced search tree needs O(log N).