MySQL server has gone away during crawling in Perl

Submitted by 独自空忆成欢 on 2019-12-04 05:26:55

Question


I use the WWW::Mechanize library to fetch the content of URLs and save the data into MySQL tables. But when a page's content is too large, it gives this error message:

DBD::mysql::st execute failed: MySQL server has gone away at F:\crawling\perl_tests\swc2.pl line 481.

For example, it throws this error when I try to extract the content of this page: https://www.e-conomic.com/secure/api1/EconomicWebService.asmx?wsdl

I added this code as well, but it still does not work:

$connection->{max_allowed_packet}=1000000000;

Answer 1:


Are you using the latest DBI and DBD::mysql? It should have mysql_auto_reconnect enabled.

If you're using MySQL under forks, you may need to set

$dbh->{InactiveDestroy} = 1;
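Putting both suggestions together, a minimal connection sketch might look like this (the DSN, database name, and credentials are placeholders for your own setup):

```perl
use strict;
use warnings;
use DBI;

# mysql_auto_reconnect tells DBD::mysql to transparently reopen the
# connection when the server has gone away. DSN and credentials below
# are hypothetical -- substitute your own.
my $dbh = DBI->connect(
    'dbi:mysql:database=crawler;host=localhost',
    'user', 'password',
    { RaiseError => 1, mysql_auto_reconnect => 1 },
) or die $DBI::errstr;

# In a forked child, stop the handle's destructor from closing the
# parent's underlying connection:
$dbh->{InactiveDestroy} = 1;
```

Note that mysql_auto_reconnect helps with dropped connections, but it will not fix a "gone away" caused by a single packet exceeding the server's max_allowed_packet limit.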



Answer 2:


You might want to look at DBIx::Connector. From the docs:

You can store the connection somewhere in your app where you can easily access it, and for as long as it remains in scope, it will try its hardest to maintain a database connection. Even across forks (especially with DBI 1.614 and higher) and new threads, and even calls to $conn->dbh->disconnect. When you don't need it anymore, let it go out of scope and the database connection will be closed.
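A minimal sketch of the DBIx::Connector approach, again with a placeholder DSN and credentials, and a hypothetical pages table standing in for whatever the crawler writes to:

```perl
use strict;
use warnings;
use DBIx::Connector;

# Connection parameters are assumptions -- adjust for your environment.
my $conn = DBIx::Connector->new(
    'dbi:mysql:database=crawler;host=localhost',
    'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

my ($url, $body) = ('https://example.com/', '...page content...');

# In 'fixup' mode, run() retries the block once with a fresh handle
# if the database connection turns out to be dead.
$conn->run(fixup => sub {
    my $dbh = shift;
    $dbh->do('INSERT INTO pages (url, body) VALUES (?, ?)',
             undef, $url, $body);
});
```

The advantage over raw mysql_auto_reconnect is that DBIx::Connector re-runs your statement after reconnecting, instead of silently giving you a new handle mid-transaction.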




Answer 3:


Try adding

max_allowed_packet=32M

to the MySQL configuration file (my.cnf, or my.ini on Windows), then restart the server.
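Concretely, the setting belongs in the server section of the configuration file; the 32M value is just a starting point and should exceed your largest page:

```ini
[mysqld]
# Largest packet (single query or row) the server will accept.
# Raise this above the size of the biggest page body you insert.
max_allowed_packet=32M
```

This is the server-side counterpart of the client-side $connection->{max_allowed_packet} setting from the question; both limits must be large enough, which is why setting only the client attribute did not help.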



Source: https://stackoverflow.com/questions/10414682/mysql-server-has-gone-away-during-crawling-in-perl
