Using the Google URL shortener API, things were working fine until I started testing at load. I quickly started getting back 403 Rate Limit Exceeded errors from Google, even though my request rate was well below the documented limit.
I don't think the "1 request / per second / per user" limit as written in the docs is 100% correct, at least not in my case with the Google URL shortener. (FYI: I am using "Public API access", not "OAuth".)
I had almost the same problem, but for me it was more like "I get this error for some URLs for some period of time." What does that mean? Please keep reading.
Here is what I found: the error is tied to specific URLs rather than to the overall request rate; while one URL keeps failing, other URLs can still be shortened; and a failing URL starts working again after some time.
So my guess is that Google's server keeps some per-URL state (a cache of each URL). If a URL fails, you have to wait a while for that cache entry to be released before retrying it.
So I had to write some creepy retry code along these lines to work around it:
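Roughly, the idea is the sketch below. `shortenOnce` is just a placeholder for the actual Urlshortener insert call, the retry count and wait time are illustrative (Google does not document them), and the exact exception type you catch depends on which client library you use (here I assume the Google HTTP client's GoogleJsonResponseException):

```java
import java.io.IOException;

import com.google.api.client.googleapis.json.GoogleJsonResponseException;

public class ShortenWithRetry {

    // Illustrative values: how many times to retry a failing URL and how long
    // to wait between attempts. Tune these for your own load.
    private static final int MAX_ATTEMPTS = 5;
    private static final long WAIT_MS = 2000L;

    public static String shortenWithRetry(String longUrl)
            throws IOException, InterruptedException {
        IOException lastError = null;
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            try {
                return shortenOnce(longUrl);   // the actual API call
            } catch (GoogleJsonResponseException e) {
                if (e.getStatusCode() != 403) {
                    throw e;                   // only retry the rate-limit response
                }
                lastError = e;
                Thread.sleep(WAIT_MS);         // wait for the per-URL "cache" to clear
            }
        }
        throw lastError;                       // still rate-limited after all attempts
    }

    // Placeholder: in the real code this builds the Url model and executes the
    // insert request against the Urlshortener client.
    private static String shortenOnce(String longUrl) throws IOException {
        throw new UnsupportedOperationException("wire up the Urlshortener client here");
    }
}
```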
However, this creepy code works fine in my case because I submit the tasks to an ExecutorService with a fixed thread pool of size 10. So if one URL fails and has to wait, the other tasks can still get their shortened URLs. It solves the problem... at least for me.
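For completeness, the submission side looks roughly like this (the ShortenAll class and the reuse of shortenWithRetry from the sketch above are just for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ShortenAll {

    // Each URL is its own task: if one URL hits the rate limit and sleeps inside
    // shortenWithRetry(), the other worker threads keep shortening their URLs.
    public static List<Future<String>> shortenAll(List<String> longUrls) {
        ExecutorService pool = Executors.newFixedThreadPool(10);
        List<Future<String>> results = new ArrayList<>();
        for (String longUrl : longUrls) {
            Callable<String> task = () -> ShortenWithRetry.shortenWithRetry(longUrl);
            results.add(pool.submit(task));
        }
        pool.shutdown(); // no new tasks; the submitted ones still run to completion
        return results;
    }
}
```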