How do I transfer wget output to a file or DB?

Posted by 北战南征 on 2019-12-25 11:26:56

Question


I'm trying to use a small script to download a field from multiple pages. For one thing, I'm only able to get it from one page..., but the real problem is that I don't know how to hand the output off to a database table. How can I take the output from curl/lynx | grep (which is going to be all the list items) and move it, list item by list item, into a table in my DB, or into a CSV that is ready for import into the DB?

#!/bin/bash
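# fetch the page source, split each line on double quotes and take field 8,
# then keep only the lines that contain <li>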

lynx --source "http://www.thewebsite.com"|cut -d\" -f8|grep "<li>"

The database I would connect to is MySQL. We could call the dummy table "listTable". Please try to stick to bash: I'm not allowed to compile on the server I'm using, and I can't seem to get curl to work with PHP. Anyway, I'm thinking I need to make a variable and then systematically pass its contents to the database, right?


Answer 1:


Use something like awk, sed, or perl to create INSERT statements, then pipe the result to your SQL client (psql or mysql).
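A minimal sketch of that approach, assuming a hypothetical table listTable with a single text column item, GNU grep's -o option, and mysql credentials available via ~/.my.cnf (none of these details are in the original answer):

#!/bin/bash
# Hypothetical sketch: pull out the <li> items, turn each one into an
# INSERT statement, and pipe the generated SQL straight to the mysql client.
# Assumes listTable(item TEXT) and credentials in ~/.my.cnf.
lynx --source "http://www.thewebsite.com" \
  | grep -o '<li>[^<]*</li>' \
  | sed -e 's/<[^>]*>//g' -e "s/'/''/g" \
  | awk '{printf "INSERT INTO listTable (item) VALUES ('\''%s'\'');\n", $0}' \
  | mysql dbname

Each list item becomes one INSERT statement; doubling single quotes in the sed step is a crude way to keep a quote inside an item from breaking the SQL.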




Answer 2:


Just write a Python script that reads everything from stdin and puts it into the database, then do something like:

curl http://www.google.com | ./put_to_db.py
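The put_to_db.py referenced above isn't shown in the answer; a rough sketch of what it could look like, assuming the MySQLdb driver is installed and using the question's listTable with a hypothetical item column and made-up connection details:

#!/usr/bin/env python
# Hypothetical put_to_db.py: read lines from stdin and insert each non-empty
# line into listTable. Table, column, and credentials are assumptions.
import sys
import MySQLdb  # assumes the MySQLdb/mysqlclient driver is available

def main():
    conn = MySQLdb.connect(host="localhost", user="dbuser",
                           passwd="dbpass", db="dbname")
    cur = conn.cursor()
    for line in sys.stdin:
        item = line.strip()
        if item:
            # parameterized query, so the driver handles quoting/escaping
            cur.execute("INSERT INTO listTable (item) VALUES (%s)", (item,))
    conn.commit()
    cur.close()
    conn.close()

if __name__ == "__main__":
    main()

With the question's pipeline that would look like: lynx --source "http://www.thewebsite.com" | cut -d\" -f8 | grep "<li>" | ./put_to_db.py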



Source: https://stackoverflow.com/questions/4702535/how-do-i-transfer-wget-output-to-a-file-or-db
