Can search engines index JavaScript generated web pages?

小鲜肉 2020-12-01 14:30

Can search engines such as Google index JavaScript-generated web pages? When you right-click and select View Source on a page that is generated by JavaScript (e.g. using GWT), you see only the initial host page, not the dynamically generated HTML, so I suspect search engines cannot see that content either.

8 Answers
  • 2020-12-01 14:57

    Yes, Google (and most likely Bing) will index dynamically generated HTML. See more details here: http://searchengineland.com/tested-googlebot-crawls-javascript-heres-learned-220157.

  • 2020-12-01 14:57

    There are a few ways to handle this in GWT; there is a great discussion on the subject. It seems the best option is to serve static SEO content when the user-agent is a bot, as long as the SEO content is identical to what is served via the GWT route. This can be a lot of work, but if you really want a fully rich GWT app that is optimized for search engines, it may be worth it. A rough sketch of the bot-detection idea follows.
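
    As a rough sketch of that idea (not GWT-specific; Express, the bot list, and the file paths here are illustrative assumptions, since the original setup would do this on the Java side):

        // Hypothetical Express route: when the User-Agent looks like a
        // crawler, serve a pre-rendered static snapshot instead of the
        // page that bootstraps the JS app.
        const express = require('express');
        const path = require('path');

        const app = express();
        const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot/i;

        app.get('/', (req, res) => {
          const userAgent = req.get('User-Agent') || '';
          if (BOT_PATTERN.test(userAgent)) {
            // Static snapshot whose content matches the JS-built page.
            res.sendFile(path.join(__dirname, 'static', 'index.html'));
          } else {
            // Normal users get the host page that bootstraps the JS app.
            res.sendFile(path.join(__dirname, 'app', 'host.html'));
          }
        });

        app.listen(3000);

    Note that this only stays on the right side of the "cloaking" line if the snapshot's content really is identical to what the JS route renders.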

  • 2020-12-01 14:58

    A good rule of thumb: if you can see it in Lynx, it can be indexed by Google.

    Lynx is an excellent test because it also gives you an idea of how screen readers for the blind will see your page.

  • 2020-12-01 15:00

    Your suspicion is correct: JS-generated content cannot be relied on to be visible to search bots. It also can't be seen by anyone with JS turned off. The last time I added tests to a site I was working on (a large, mainstream-audience site with hundreds of thousands of unique visitors per month), approximately 10% of users were not running JavaScript in any form. That includes search bots, PC browsers with JS disabled, many mobiles, blind people using screen readers, and so on.

    This is why content generated via JS (with no fallback option) is a Really Bad Idea.

    Back to basics. First, create your site using bare-bones (X)HTML, on REST-like principles (at least to the extent of requiring POST requests for state changes). Use simple semantic markup, and forget about CSS and JavaScript for now.

    Step one is to get that right, and have your entire site (or as much of it as makes sense) working nicely this way for search bots and Lynx-like user agents.

    Then add a visual layer: CSS/graphics/media for visual polish, but don't significantly change your original (X)HTML markup; allow the original text-only site to stay intact and functioning. Keep your markup clean!

    Third, add a behavioural layer: JavaScript (Ajax). Offer things that make the experience faster, smoother, and nicer for users/browsers with Ajax-capable JS... but only for those users. Users without JavaScript are still welcome, and so are search bots, the visually impaired, many mobiles, etc.

    This is called progressive enhancement in web design circles. Do it this way and your site works, in some reasonable form, for everyone.
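
    A minimal sketch of that behavioural layer, assuming a hypothetical plain HTML form (id="search", posting to /search) that already works on its own for bots, Lynx, and no-JS users:

        // Upgrade the form to Ajax only when JS (and fetch) is available;
        // otherwise the plain HTML form keeps working untouched.
        document.addEventListener('DOMContentLoaded', () => {
          const form = document.getElementById('search');
          if (!form || !window.fetch) return; // leave the baseline intact

          form.addEventListener('submit', async (event) => {
            event.preventDefault(); // take over only now that JS has loaded
            const response = await fetch(form.action, {
              method: form.method,
              body: new FormData(form),
            });
            // Hypothetical results container, also an assumption.
            const results = document.getElementById('results');
            if (results) results.innerHTML = await response.text();
          });
        });

    The baseline markup serves everyone; the script changes behaviour only for capable browsers, which is exactly the "only those users" point above.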

  • 2020-12-01 15:01

    Google is working on executing simple JavaScript to uncover some content, but they certainly don't execute full scripts. If you are worried about SEO, you need to consider providing static versions of pages; one historical mechanism is sketched below.
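
    One historical way to do that was Google's AJAX crawling scheme (since deprecated), in which crawlers request ?_escaped_fragment_=... in place of a #! URL. A sketch, where Express and renderStaticSnapshot() are illustrative assumptions:

        const express = require('express');
        const app = express();

        // Hypothetical helper that renders, as plain HTML, the same
        // content the client-side JS would otherwise build.
        function renderStaticSnapshot(fragment) {
          return `<html><body><h1>Snapshot for ${fragment}</h1></body></html>`;
        }

        app.get('*', (req, res, next) => {
          const fragment = req.query._escaped_fragment_;
          if (fragment !== undefined) {
            res.send(renderStaticSnapshot(fragment)); // crawler path
          } else {
            next(); // normal request: serve the JS-driven page as usual
          }
        });

        app.listen(3000);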

  • 2020-12-01 15:05

    Even if they execute basic JavaScript, most websites use libraries and frameworks, and I don't think a bot like Googlebot or any other spider will also load the JS files linked from the webpage; without loading them, the JS code will produce errors.

    /*Correct Me If I am wrong*/
    