Question
I use the WWW::Mechanize library to fetch the content of URLs and save the data into MySQL tables. But when a page's content is too large, it gives this error message:
DBD::mysql::st execute failed: MySQL server has gone away at F:\crawling\perl_tests\swc2.pl line 481.
For example, it throws this error when I try to extract the content of this page: https://www.e-conomic.com/secure/api1/EconomicWebService.asmx?wsdl
I also added the following line, but it still does not work:
$connection->{max_allowed_packet}=1000000000;
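For context, a minimal sketch of the kind of crawl-and-insert code in question might look like this (the pages table, DSN, and credentials are hypothetical placeholders, not from the original post):

use strict;
use warnings;
use WWW::Mechanize;
use DBI;

my $mech = WWW::Mechanize->new();
my $dbh  = DBI->connect('DBI:mysql:database=crawl;host=localhost',
                        'user', 'password', { RaiseError => 1 });

$mech->get('https://www.e-conomic.com/secure/api1/EconomicWebService.asmx?wsdl');
# The insert fails when the fetched content is larger than the server's
# max_allowed_packet, because the whole row must fit into one packet.
my $sth = $dbh->prepare('INSERT INTO pages (url, content) VALUES (?, ?)');
$sth->execute($mech->uri->as_string, $mech->content);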
Answer 1:
Are you using the latest DBI and DBD::mysql? It should have mysql_auto_reconnect enabled.
If you're using mysql under forks, you may need to set
$dbh->{InactiveDestroy} = 1;
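As a sketch, enabling auto-reconnect at connect time might look like this (the DSN and credentials are placeholders):

my $dbh = DBI->connect(
    'DBI:mysql:database=crawl;host=localhost',
    'user', 'password',
    { RaiseError => 1, mysql_auto_reconnect => 1 },
);
$dbh->{InactiveDestroy} = 1;  # only needed when the handle survives a fork

Note that auto-reconnect only papers over dropped connections; if a single row exceeds max_allowed_packet, the insert will still fail.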
Answer 2:
You might want to look at DBIx::Connector. From the docs:
You can store the connection somewhere in your app where you can easily access it, and for as long as it remains in scope, it will try its hardest to maintain a database connection. Even across forks (especially with DBI 1.614 and higher) and new threads, and even calls to $conn->dbh->disconnect. When you don't need it anymore, let it go out of scope and the database connection will be closed.
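A minimal sketch of that approach, reusing the hypothetical pages table from the question ($url and $content stand for the crawled values):

use DBIx::Connector;

my $conn = DBIx::Connector->new(
    'DBI:mysql:database=crawl;host=localhost',
    'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

# In fixup mode the block runs on the cached handle; if the statement
# dies because the connection was lost, DBIx::Connector reconnects
# and retries the block once on a fresh handle.
$conn->run(fixup => sub {
    $_->do('INSERT INTO pages (url, content) VALUES (?, ?)',
           undef, $url, $content);
});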
Answer 3:
Try adding
max_allowed_packet=32M
to the MySQL configuration file.
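The directive belongs under the [mysqld] section, and the server must be restarted for the file change to take effect:

[mysqld]
max_allowed_packet=32M

Alternatively, the value can be raised at runtime for new connections without a restart (size in bytes):

SET GLOBAL max_allowed_packet=33554432;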
Source: https://stackoverflow.com/questions/10414682/mysql-server-has-gone-away-during-crawling-in-perl