Do you think it's technically possible to take a screenshot of a website programmatically?
I would like to craft a scheduled Python task that crawls a list of websites and captures a screenshot of each.
Are you looking for functionality like what browsershots.org offers?
The source code is available at Google Code/Browsershots.
I used Selenium with PhantomJS:

from selenium import webdriver

# PhantomJS must be on your $PATH for this to work.
driver = webdriver.PhantomJS()
driver.get("http://anyurl.com")
# save_screenshot() expects a file path, not a folder.
driver.save_screenshot("/path/to/folder/screenshot.png")
driver.quit()
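To turn this into the scheduled crawl the question describes, here is a minimal stdlib sketch; the URL list, interval, filename pattern, and the `take_screenshot` callable are all assumptions, with the callable standing in for the Selenium calls above:

```python
import time

def crawl(urls, take_screenshot, interval_seconds=3600, rounds=1):
    # Visit every URL once per round, then sleep until the next round.
    shots = []
    for round_no in range(rounds):
        for i, url in enumerate(urls):
            path = "shot_r%d_%d.png" % (round_no, i)
            # e.g. driver.get(url); driver.save_screenshot(path)
            take_screenshot(url, path)
            shots.append(path)
        if round_no < rounds - 1:
            time.sleep(interval_seconds)
    return shots
```

In practice you would pass a wrapper around the Selenium calls shown above, or drive `crawl` from cron/Task Scheduler instead of sleeping in-process.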
It's certainly technically possible.
You would probably have to render the HTML directly onto an image file (or more likely, onto an in-memory bitmap that's written to an image file once completed).
I don't know of any libraries that do this for you (apart from a modified WebKit, perhaps), but there are certainly websites that do.
Of course, this is a bit more involved than just opening the page in a browser on a machine and taking a screenshot programmatically, but the result would likely be better if you don't care about matching a specific browser's rendering.
You can check out webkit2png (OS X only), khtml2png (Linux), and this post (which uses PyQt and WebKit).
How about PyGTK?
import gtk.gdk

# Note: this grabs the entire desktop (the root window), so the page
# must already be visible in a browser; it does not render the URL itself.
w = gtk.gdk.get_default_root_window()
sz = w.get_size()
print "The size of the window is %d x %d" % sz
pb = gtk.gdk.Pixbuf(gtk.gdk.COLORSPACE_RGB, False, 8, sz[0], sz[1])
pb = pb.get_from_drawable(w, w.get_colormap(), 0, 0, 0, 0, sz[0], sz[1])
if pb is not None:
    pb.save("screenshot.png", "png")
    print "Screenshot saved to screenshot.png."
else:
    print "Unable to get the screenshot."
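Whichever approach you choose, a scheduled task should verify that it actually produced an image. One cheap stdlib check is the fixed PNG file signature (the path is an assumption, standing in for whatever filename your task writes):

```python
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def is_png(path):
    # Every valid PNG file begins with the same fixed 8-byte signature.
    with open(path, "rb") as f:
        return f.read(8) == PNG_SIGNATURE
```

This only confirms the file starts like a PNG, not that the page rendered correctly, but it catches empty or truncated output before the next scheduled run.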