Angular 2 SEO - How to make an Angular 2 app crawlable


The great thing about Angular 2 is that, once the app bootstraps, everything inside your root app element is replaced. This means you can put whatever you want in there from the server, and that is exactly what crawlers will pick up.

You can generate that content with a server-rendered version of your app, or with custom logic on the server.
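For example, a minimal index.html could look like the sketch below (the markup inside the root element is my own illustration, not from the original post; Angular swaps it out when the app bootstraps, but it is what non-JavaScript crawlers see):

<body>
  <app-root>
    <!-- Static or server-generated fallback content for crawlers;
         replaced as soon as the Angular app bootstraps -->
    <h1>Website Name</h1>
    <p>Short, keyword-rich summary of the page content.</p>
  </app-root>
</body>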

You can find some more information here: https://angularu.com/VideoSession/2015sf/angular-2-server-rendering and here: https://github.com/angular/universal

I just created ng2-meta, an Angular 2 module that updates meta tags based on the current route.


import { Routes } from '@angular/router';

const routes: Routes = [
  {
    path: 'home',
    component: HomeComponent,
    data: {
      meta: {
        title: 'Home page',
        description: 'Description of the home page'
      }
    }
  },
  {
    path: 'dashboard',
    component: DashboardComponent,
    data: {
      meta: {
        title: 'Dashboard',
        description: 'Description of the dashboard page',
        'og:image': 'http://example.com/dashboard-image.png'
      }
    }
  }
];

You can update meta tags from components, services, etc. as well.


import { Component, OnInit } from '@angular/core';
import { MetaService } from 'ng2-meta';

@Component({ ... })
class ProductComponent implements OnInit {
  product;

  constructor(private metaService: MetaService) {}

  ngOnInit() {
    this.product = ...; // HTTP GET for the product in the catalogue
    this.metaService.setTitle('Product page for ' + this.product.name);
    this.metaService.setTag('og:image', this.product.imageURL);
  }
}

While this caters to JavaScript-capable crawlers (like Google's), you can set fallback meta tags in index.html for crawlers that don't execute JavaScript, such as Facebook's and Twitter's.

<head>
    <meta name="title" content="Website Name">
    <meta name="og:title" content="Website Name">
    <meta name="og:image" content="http://example.com/fallback-image.png">
    ...
</head>

Support for server-side rendering is in progress.

Server-side rendering is not a requirement for a decent Google ranking...

I had a forum with about 33,000 entries in its Google sitemap files. The site was written in ASP.NET Web Forms and had a decent stream of incoming requests from Google. It did, however, have very poor mobile readability (something Google penalizes; it was actually flagged in my Google Search Console).

I rewrote everything with Angular (the deployed version is Angular 5). I use the Title and Meta services to set my title and meta tags. All routes contain keywords extracted from the actual content. I also made sure that every element with a [routerLink] attribute is an <a> tag on which I also specify the href attribute (that is what a crawler looks for), and of course I paid a lot of attention to mobile readability. A sketch of this approach is shown below.
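A minimal sketch of that approach using Angular's built-in Title and Meta services from @angular/platform-browser (the component, route, and content here are hypothetical, not taken from the original site):

import { Component, OnInit } from '@angular/core';
import { Meta, Title } from '@angular/platform-browser';

@Component({
  selector: 'app-thread',
  // routerLink on a real <a> element renders an href that crawlers can follow
  template: `<a [routerLink]="['/forum', threadId]">Thread title</a>`
})
export class ThreadComponent implements OnInit {
  threadId = 123;

  constructor(private titleService: Title, private metaService: Meta) {}

  ngOnInit() {
    // Set the title and description from the actual content of the page
    this.titleService.setTitle('3ds Max rendering on Threadripper - My Forum');
    this.metaService.updateTag({
      name: 'description',
      content: 'Benchmark discussion: 3ds Max rendering on AMD Threadripper.'
    });
  }
}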

Result: I actually get more incoming traffic than before, and in the Search Console I can clearly see that my indexed pages went up: of the 30,000+ pages, only about 10,000 used to be included in the index; now almost 25,000 pages are indexed.

I am not saying that server-side rendering is irrelevant. Using Universal or other methods will result in faster load times, which will probably lead to a higher score. But Google is definitely able to properly index an Angular SPA.

Edit: some proof: if you google "3ds max threadripper", you'll see that my site actually outranks one of the biggest hardware sites on the internet.
