Avoid Rate limit with rtweet get_timeline()

Asked by 抹茶落季, 2021-01-06 18:18

Is there any way to stop my loop from being interrupted by the rate limit? Ideally the code would wait until the rate-limit window has passed before continuing.


2 Answers
  • 2021-01-06 18:37

    I was able to resolve it by wrapping the get_timeline() function in the code below. The function get_timeline_unlimited calls itself recursively after waiting the required time for the rate limit to reset. So far it has worked well for me with no issues.

    # Requires rtweet (get_timeline, rate_limit), glue, and dplyr (bind_rows)
    get_timeline_unlimited <- function(users, n) {

      if (length(users) == 0) {
        return(NULL)
      }

      rl <- rate_limit(query = "get_timeline")

      if (length(users) <= rl$remaining) {
        # Enough calls left in the current window: fetch everything at once
        print(glue("Getting data for {length(users)} users"))
        tweets <- get_timeline(users, n, check = FALSE)
      } else {
        if (rl$remaining > 0) {
          # Use up the remaining calls now, defer the rest
          users_first <- users[1:rl$remaining]
          users_rest <- users[-(1:rl$remaining)]
          print(glue("Getting data for {length(users_first)} users"))
          tweets_first <- get_timeline(users_first, n, check = FALSE)
          rl <- rate_limit(query = "get_timeline")
        } else {
          tweets_first <- NULL
          users_rest <- users
        }
        # rl$reset is the time until the window resets, in minutes
        wait <- rl$reset + 0.1
        print(glue("Waiting for {round(wait, 2)} minutes"))
        Sys.sleep(wait * 60)

        tweets_rest <- get_timeline_unlimited(users_rest, n)
        tweets <- bind_rows(tweets_first, tweets_rest)
      }
      return(tweets)
    }
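
    The core of the function is the split at the rate-limit boundary. A minimal standalone illustration of that indexing, with a made-up `remaining` value standing in for a live `rate_limit()` result:

    ```r
    # Pretend rate_limit() reported 4 remaining calls for 10 users
    users <- paste0("user", 1:10)
    remaining <- 4

    users_first <- users[1:remaining]      # fetched immediately
    users_rest  <- users[-(1:remaining)]   # deferred until the window resets

    stopifnot(length(users_first) == 4)
    stopifnot(length(users_rest) == 6)
    stopifnot(identical(c(users_first, users_rest), users))
    ```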
    
  • 2021-01-06 18:50

    What I ended up doing was creating a while loop that checked the number of users left in my Users vector, ran my for loop, and then put the system to sleep for 15 minutes. This approach works, but there are some things to account for. I have the while loop break at 200 in case some users had no data to save into a CSV. That turned out to be a good move because, as you may notice, the for loop starts iterating at 80: as you move across your vector of users, the good users are removed iteratively, leaving only the users that cause errors. An improvement for someone up to the task would be to handle this programmatically.

    Users <- usrs$user_id
    # Recover already-downloaded users from files on disk by stripping the
    # 11-character file suffix, then drop them from the work list
    goodUsers <- substring(list.files(), 1, nchar(list.files()) - 11)
    Users <- setdiff(Users, goodUsers)

    while (length(Users) > 200) {
        for (i in 80:length(Users)) {
            a <- tryCatch({
                get_timeline(Users[i], usr = FALSE)
            }, error = function(e) { message(e) })
            tryCatch({
                save_as_csv(a, Users[i])
                goodUsers <- append(goodUsers, Users[i])
            }, error = function(e) { message(e) })
        }
        Users <- setdiff(Users, goodUsers)
        Sys.sleep(900)  # wait 15 minutes for the rate limit to reset
    }

    length(Users)
    length(goodUsers)
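
    The "handle this programmatically" improvement mentioned above might look something like the following sketch (untested, assumes the same rtweet session; `badUsers` is a new name introduced here): record users whose download or save fails and exclude them along with the good ones, so the hard-coded start index of 80 is no longer needed.

    ```r
    # Hypothetical sketch: drop failing users instead of skipping a fixed prefix
    badUsers <- character(0)
    for (u in Users) {
        a <- tryCatch(get_timeline(u), error = function(e) NULL)
        if (is.null(a) || nrow(a) == 0) {
            badUsers <- c(badUsers, u)   # remember the failure, never retry it
        } else {
            tryCatch({
                save_as_csv(a, u)
                goodUsers <- append(goodUsers, u)
            }, error = function(e) { badUsers <<- c(badUsers, u) })
        }
    }
    Users <- setdiff(Users, c(goodUsers, badUsers))
    ```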
    