How do you disallow a sub-domain using robots.txt?

In your root .htaccess file, add the following:

RewriteEngine on
# When the request comes from CloudFront (User-Agent "Amazon CloudFront"),
# serve the CDN-specific robots file instead of the real robots.txt
RewriteCond %{HTTP_USER_AGENT} ^Amazon.CloudFront$
RewriteRule ^robots\.txt$ robots-cdn.txt [L]

Then create a separate robots-cdn.txt alongside it:

User-agent: *
Disallow: /
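
For contrast, the robots.txt served on the main domain might simply allow everything (a hypothetical example; replace with your site's actual rules):

User-agent: *
Disallow: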


When robots.txt is requested through the CDN at http://cdn.domain.com/robots.txt, CloudFront fetches it from the origin with the User-Agent "Amazon CloudFront", so the rewrite returns the contents of robots-cdn.txt and all crawlers are blocked on the sub-domain. For every other request the rewrite does not apply and the true robots.txt is served.
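
If you would rather not depend on the CDN's User-Agent string, a host-based condition is a common alternative (a minimal sketch, assuming cdn.domain.com is a placeholder for your sub-domain and that it points at the same document root):

RewriteEngine on
# Apply only to requests whose Host header is the CDN sub-domain
RewriteCond %{HTTP_HOST} ^cdn\.domain\.com$ [NC]
RewriteRule ^robots\.txt$ robots-cdn.txt [L]

Either way, http://cdn.domain.com/robots.txt returns the blocking rules while http://domain.com/robots.txt is left untouched.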
