Question
I've seen explanations for why RAM is accessed in constant time (O(1)) and explanations for why it's accessed in logarithmic time (O(log n)). Frankly, neither makes much sense to me: what is n in the big-O notation, and how does it even make sense to measure the speed of a physical device with big-O? I understand one argument for RAM being accessed in constant time: if you have an array a, then the kth element is at address a + k*size_of_type, so the address can be computed directly. If you know the address you want to load from or store to, wouldn't that take a constant amount of time, in the sense that it takes the same time no matter the location? Someone told me that looking something up in RAM (like an element in an array) takes longer than constant time because it needs to find the right page. This is wrong, as paging pertains to the hard disk, not RAM.
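A minimal C sketch of the address arithmetic described above; the array a and the index k are arbitrary example values, not anything from the question:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    int a[1000];        /* example array */
    size_t k = 742;     /* example index */

    /* The address of a[k] is just the base address plus k times the
     * element size: no search is involved, and the arithmetic does not
     * depend on how large k is. */
    uintptr_t base = (uintptr_t)a;
    uintptr_t computed = base + k * sizeof(int);

    printf("&a[k]              : %p\n", (void *)&a[k]);
    printf("a + k*sizeof(int)  : %p\n", (void *)computed);
    return 0;
}

Both lines print the same address, which is the sense in which array indexing is "constant time" at the level of address calculation; whether the actual memory access is constant time is the separate question being asked here.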
Answer 1:
I think it is on the order of nanoseconds, which is far faster than accessing the disk (5 to 80 ms).
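A rough way to see such a number yourself is a pointer-chasing loop. The sketch below is not from the answer; it assumes a POSIX system (for clock_gettime), and the buffer size and step count are arbitrary choices. It walks a chain of dependent loads through a buffer much larger than the CPU caches, so the average time per access approaches main-memory latency:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (1 << 24)   /* 16M entries, roughly 128 MB of size_t */
#define STEPS (1 << 22)   /* number of dependent accesses to time */

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    if (!next) return 1;

    /* Build a single pseudo-random cycle (Sattolo's algorithm) so each
     * load depends on the previous one and the prefetcher cannot guess
     * the next address. rand() is crude but adequate for a sketch. */
    for (size_t i = 0; i < N; i++) next[i] = i;
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    size_t idx = 0;
    for (size_t s = 0; s < STEPS; s++) idx = next[idx];

    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("avg latency: %.1f ns per access (idx=%zu)\n", ns / STEPS, idx);

    free(next);
    return 0;
}

On typical desktop hardware this reports something in the tens to low hundreds of nanoseconds per access, which is consistent with the nanoseconds-versus-milliseconds comparison in the answer.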
Source: https://stackoverflow.com/questions/34312820/how-much-time-does-a-program-take-to-access-ram