I need to read data from hundreds of links, and some of the links contain no data. My code so far:
urls <- paste0(
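For context, a minimal hypothetical sketch of the setup (the base URL and station ids below are placeholders, since the original paste0() call is truncated):

station_ids <- sprintf("%03d", 1:300)                                 # hypothetical station ids
urls <- paste0("http://example.com/stations/", station_ids, ".txt")   # hypothetical base URL
myData <- lapply(urls, read.table, header = TRUE, sep = '|')          # errors out on the empty links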
Does this help?
dims <- sapply(myData, dim)[2, ]   # number of columns in each downloaded table
bad_Ones  <- myData[dims == 1]     # single-column results (no real data)
good_Ones <- myData[dims > 1]      # results with actual data
If myData still grabs something off the station page, the above code should separate the myData list into two groups. good_Ones is the list you want to work with (assuming the above is accurate, of course).
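For instance, a minimal toy demonstration of that idea (the data frames here are made up, standing in for the downloaded tables):

myData <- list(
  data.frame(a = 1:3, b = 4:6),   # real data: 2 columns
  data.frame(x = 1:5),            # placeholder page: 1 column
  data.frame(a = 7:9, b = 1:3)    # real data: 2 columns
)

dims <- sapply(myData, dim)[2, ]  # c(2, 1, 2)
good_Ones <- myData[dims > 1]
length(good_Ones)                 # 2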
Here are 2 possible solutions (untested because your example is not reproducible):
Using try:
myData <- lapply(urls, function(x) {
  tmp <- try(read.table(x, header = TRUE, sep = '|'))
  if (!inherits(tmp, 'try-error')) tmp   # implicitly returns NULL on failure
})
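Note that try() prints each error message to the console as it occurs; pass silent = TRUE to suppress that.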
Using tryCatch:
myData <- lapply(urls, function(x) {
  tryCatch(read.table(x, header = TRUE, sep = '|'),
           error = function(e) NULL)     # return NULL for links with no data
})
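With either approach, the unreadable links end up as NULL entries in myData. A short follow-up sketch to drop them and, assuming all the readable tables share the same columns, stack the rest:

good_Ones <- Filter(Negate(is.null), myData)   # drop the NULL entries
combined  <- do.call(rbind, good_Ones)         # assumes identical columns across tables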