I hope you don't frown on me too much, but this should be answerable by someone fairly easily. I want to read a file on a website into a string, so I can extract information from it.
You need an HTTP client library; libcurl is one of many. You would then issue a GET request to a URL and read the response back however your chosen library provides it.

Here is an example to get you started. It is C, so I am sure you can work it out.
#include <stdio.h>
#include <curl/curl.h>
int main(void)
{
    CURL *curl;
    CURLcode res;

    curl = curl_easy_init();
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");

        /* perform the request; res holds the return code */
        res = curl_easy_perform(curl);
        if(res != CURLE_OK)
            fprintf(stderr, "curl_easy_perform() failed: %s\n",
                    curl_easy_strerror(res));

        /* always cleanup */
        curl_easy_cleanup(curl);
    }
    return 0;
}
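The example above just writes the body to standard output. Since you want the contents in a string, here is a minimal sketch of how you could capture the response instead, using libcurl's CURLOPT_WRITEFUNCTION and CURLOPT_WRITEDATA from C++. The callback name and the std::string buffer are my own choices, not anything libcurl mandates.

#include <string>
#include <iostream>
#include <curl/curl.h>

/* libcurl calls this for each chunk of the body; we append it to the
   std::string passed in via CURLOPT_WRITEDATA. */
static size_t write_to_string(char *ptr, size_t size, size_t nmemb, void *userdata)
{
    std::string *out = static_cast<std::string *>(userdata);
    out->append(ptr, size * nmemb);
    return size * nmemb;
}

int main()
{
    std::string body;
    CURL *curl = curl_easy_init();
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_to_string);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

        CURLcode res = curl_easy_perform(curl);
        if(res == CURLE_OK)
            std::cout << body << std::endl;   // the whole page is now in 'body'

        curl_easy_cleanup(curl);
    }
    return 0;
}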
But you tagged this C++, so if you want a C++ wrapper for libcurl, use curlpp:
#include <iostream>
#include <curlpp/cURLpp.hpp>
#include <curlpp/Easy.hpp>
#include <curlpp/Options.hpp>
using namespace curlpp::options;
int main(int, char **)
{
    try
    {
        // That's all that is needed to do cleanup of used resources (RAII style).
        curlpp::Cleanup myCleanup;

        // Our request to be sent.
        curlpp::Easy myRequest;

        // Set the URL.
        myRequest.setOpt<Url>("http://example.com");

        // Send request and get a result.
        // By default the result goes to standard output.
        myRequest.perform();
    }
    catch(curlpp::RuntimeError & e)
    {
        std::cout << e.what() << std::endl;
    }
    catch(curlpp::LogicError & e)
    {
        std::cout << e.what() << std::endl;
    }

    return 0;
}
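Again, curlpp sends the result to standard output by default. If you want it in a string, one way (a sketch, assuming curlpp's WriteStream option, which points the output at any std::ostream you give it) is to write into a std::ostringstream and take its str():

#include <sstream>
#include <string>
#include <iostream>
#include <curlpp/cURLpp.hpp>
#include <curlpp/Easy.hpp>
#include <curlpp/Options.hpp>

int main()
{
    try
    {
        curlpp::Cleanup myCleanup;
        curlpp::Easy myRequest;
        myRequest.setOpt(curlpp::options::Url("http://example.com"));

        // Redirect the response body into an in-memory stream
        // instead of standard output.
        std::ostringstream response;
        myRequest.setOpt(curlpp::options::WriteStream(&response));

        myRequest.perform();

        std::string contents = response.str();   // the page as a std::string
        std::cout << contents.size() << " bytes received" << std::endl;
    }
    catch(std::exception & e)
    {
        std::cout << e.what() << std::endl;
    }
    return 0;
}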