Question
For marketing purposes I maintain one identical website with two different domains. In TYPO3 v8 I would simply add a domain record on the root page and create a personalised robots.txt with TypoScript for each site (through realurl).
With v9 I cannot find a way to do this. I tried entering various annotations in config.yaml manually, but nothing works (i.e. I tried to replicate the annotation used for the URL):
routes:
  -
    route: robots.txt
    type: staticText
    content: "User-agent: *\r\nDisallow: /"
    contentVariants:
      -
        content: "User-agent: *\r\nAllow: /"
        condition: 'getenv("HTTP_HOST") == "2dn-domain.com"'
Does anyone know a working annotation, or a different approach?
Answer 1:
In my opinion there is no need to load the robots.txt with all the TYPO3 overhead, unless you want to add content to it dynamically.
You can handle multiple robots.txt with Webserver rewrite rules, e.g. with Apache:
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
This rule serves the robots.txt depending on HTTP_HOST from a subdirectory:
- robots/domain-a.xy.txt
- robots/domain-b.xy.txt
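A minimal .htaccess sketch of this approach (assuming mod_rewrite is enabled and the per-host files live in a robots/ subdirectory named exactly after each HTTP_HOST; the RewriteCond guard is an addition, so hosts without a dedicated file keep whatever default robots.txt exists in the web root):

```apache
RewriteEngine On
# Only rewrite when a per-host file actually exists,
# e.g. robots/domain-a.xy.txt for requests to domain-a.xy/robots.txt
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
```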
Answer 2:
I like to stay within the 'regular' solutions, so I found a middle ground:
In the backend you enter a route of type Page, File or URL [uri]
with the value t3://page?type=201,
so as to address a dedicated page type for robots.
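In terms of the TYPO3 v9 site configuration, the resulting route entry in config.yaml might look like this (a sketch; the type number 201 is the one used in the TypoScript below):

```yaml
routes:
  -
    route: robots.txt
    type: uri
    source: 't3://page?type=201'
```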
With TypoScript you then define your conditional robots file:
# Theme robots.txt
robots = PAGE
robots {
  typeNum = 201
  config {
    disableAllHeaderCode = 1
    additionalHeaders.10.header = Content-Type: text/plain; charset=utf-8
    xhtml_cleaning = 0
    admPanel = 0
    debug = 0
    index_enable = 0
    removeDefaultJS = 1
    removeDefaultCss = 1
    removePageCss = 1
    INTincScript_ext.pagerender = 1
    sourceopt.enabled = 0
  }
  10 = TEXT
  10.value (
User-Agent: *
Allow: /
# indexed search
User-agent: googlebot
Disallow: /*?tx_indexedsearch
# folders
Disallow: /typo3/
Disallow: /typo3conf/
Allow: /typo3conf/ext/
Allow: /typo3temp/
# parameters
Disallow: /*?id=* # non speaking URLs
Disallow: /*&id=* # non speaking URLs
Disallow: /*cHash # no cHash
Disallow: /*tx_powermail_pi1 # no powermail thanks pages
Disallow: /*tx_form_formframework # no forms
# sitemap
Sitemap: {$theme.configuration.sitemap}
  )
}
# AdWords site: closed to indexing
[globalString = ENV:HTTP_HOST=adw-domain.com]
robots.10.value (
User-Agent: *
Disallow: /
)
[global]
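Note that [globalString = ...] is the legacy condition syntax, deprecated in TYPO3 v9 in favour of Symfony expression language. A sketch of the equivalent v9 condition (same hypothetical host name) would be:

```typoscript
[request.getNormalizedParams().getHttpHost() == 'adw-domain.com']
robots.10.value (
User-Agent: *
Disallow: /
)
[global]
```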
I also set a constant in constants.typoscript for the SEO site (note that with two assignments the later one overrides the earlier):

theme.configuration {
  sitemap = /?eID=dd_googlesitemap
  sitemap = http://seo-domain.com/sitemap/seo-domain.xml
}
Source: https://stackoverflow.com/questions/59918427/multisite-typo3-v9-distinct-robots-txt-for-multiple-domains-on-one-rootpage