I am new to web scraping and am trying to scrape tables on multiple web pages. Here is the site: http://www.baseball-reference.com/teams/MIL/2016.shtml
I am able to
One way would be to make a vector of all the URLs
you are interested in and then use sapply():
library(rvest)
years <- 1970:2016
urls <- paste0("http://www.baseball-reference.com/teams/MIL/", years, ".shtml")
# head(urls)
# Download one page and pull out the team-batting table as a data frame
get_table <- function(url) {
  url %>%
    read_html() %>%
    html_nodes(xpath = '//*[@id="div_team_batting"]/table[1]') %>%
    html_table()
}
results <- sapply(urls, get_table)
results
should then be a list of 47 data.frame
objects, each named with the URL
(i.e., year) it represents. That is, results[[1]]
corresponds to 1970 and results[[47]]
corresponds to 2016. (Note the double brackets: results[[1]]
extracts the data frame itself, whereas results[1]
returns a one-element list.)
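If you would rather key the list by year instead of the full URL, and stack everything into one table, one possible follow-up looks like the sketch below. Here results and years are stand-ins for the objects built above (toy data frames are used in place of the scraped tables so the snippet runs on its own), and do.call(rbind, ...) assumes every year's table has the same columns:

```r
# Toy stand-in for the scraped list: two small data frames with
# identical columns (the real `results` would hold 47 of them).
results <- list(
  data.frame(Name = c("A", "B"), HR = c(10, 20)),
  data.frame(Name = c("C", "D"), HR = c(5, 15))
)
years <- c(1970, 1971)

# Replace the long URL names with the years themselves
names(results) <- years

# Tag each data frame with its year, then stack them into one table
combined <- do.call(
  rbind,
  Map(function(df, yr) transform(df, Year = yr), results, years)
)
```

After this, combined$Year lets you filter or group by season directly, which is usually easier than working with 47 separate list elements.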