Hey everyone, I'm a bit new to SEO. I built a one-page website that initially had several pages (now removed), and it looks as if Google has still indexed those old pages. How do I get them removed from the search results?
Make sure that your server/application sends the correct 404 (Not Found) HTTP status code for requests to non-existent pages, then wait for Google to re-crawl and drop them. A 410 (Gone) status can speed this up, since it signals the removal is permanent.
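A quick way to check this is to request the removed URLs yourself and confirm the status code. The sketch below spins up a throwaway local server that mimics the setup (the paths /url1.html and /url2.html are hypothetical removed pages, matching the robots.txt example in another answer) and verifies what each request returns; point status_of() at your real URLs instead.

```python
import http.server
import threading
import urllib.request
import urllib.error

# Hypothetical removed pages; substitute your own URLs.
REMOVED = {"/url1.html", "/url2.html"}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REMOVED:
            # 410 Gone: tells crawlers the removal is permanent
            self.send_error(410)
        elif self.path == "/":
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>one page site</body></html>")
        else:
            # Anything else never existed: plain 404 Not Found
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep output quiet

def status_of(url):
    """Return the HTTP status code for a GET request to url."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# Start the demo server on a free port and probe it.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

print(status_of(base + "/"))           # the live one-page site
print(status_of(base + "/url1.html"))  # removed page
print(status_of(base + "/nope.html"))  # never existed
server.shutdown()
```

If a removed URL still answers 200 (a common misconfiguration with single-page apps that serve index.html for every path), Google has no reason to drop it from the index.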
Use a robots.txt file to tell crawlers not to fetch the removed URLs.

Example robots.txt:

User-agent: *
Disallow: /url1.html
Disallow: /url2.html

Alternatively, add a robots meta tag to the head of any page you want kept out of the index:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
Google has a page describing how to remove your site, or individual pages, from their search results.
You can use robots exclusion rules. This site details them: http://www.robotstxt.org/