Robots.txt is a text file located in your root WordPress directory. You can access it by opening your-website.com/robots.txt in your browser. It lets search engine bots know which pages on your website should be crawled and which shouldn't.
- 1 Where is robots txt located in WordPress?
- 2 How do I find my robots txt file?
- 3 What is robots txt WordPress?
- 4 How do I use robots txt in my website?
- 5 Where is robots txt FTP?
- 6 Do I need a robots txt file?
- 7 What is robots txt in SEO?
- 8 How do I unblock robots txt in WordPress?
- 9 Where do robots find what pages are on a website?
- 10 Where do I put robots txt in cPanel?
- 11 What is robots txt file in websites?
- 12 WordPress Robots.txt Guide – What It Is and How to Use It
- 13 What Is a WordPress Robots.txt?
- 14 How To Create And Edit Your WordPress Robots.txt File
- 15 What To Put In Your Robots.txt File
- 15.1 How To Use Robots.txt To Block Access To Your Entire Site
- 15.2 How To Use Robots.txt To Block A Single Bot From Accessing Your Site
- 15.4 How To Use Robots.txt To Block Access To A Specific Folder Or File
- 15.5 How to Use Robots.txt To Allow Access To A Specific File In A Disallowed Folder
- 15.6 How To Use Robots.txt To Stop Bots From Crawling WordPress Search Results
- 15.7 How To Create Different Rules For Different Bots In Robots.txt
- 16 Testing Your Robots.txt File
- 17 What Popular WordPress Sites Put In Their Robots.txt File
- 18 Use Robots.txt The Right Way
- 19 The Complete Guide to WordPress Robots.txt
- 20 What a WordPress robots.txt File Is (And Why You Need One)
- 21 Where the WordPress robots.txt File Is Located
- 22 What Rules to Include in Your WordPress robots.txt File
- 23 How to Create a WordPress robots.txt File (3 Methods)
- 24 How to Test Your WordPress robots.txt File and Submit It to Google Search Console
- 25 Conclusion
- 26 How To Access & Modify Robots.txt In WordPress (Guide)
- 27 Basic information about the robots.txt file
- 28 4 ways to access the robots.txt in WordPress
- 28.1 1: Use an SEO plugin
- 28.2 2: Use a dedicated robots.txt plugin
- 28.3 3: Access robots.txt via cPanel in your hosting
- 28.4 4: Use FTP to access robots.txt
- 29 How to test the robots.txt file on your website
- 30 How to Optimize Your Robots.txt for SEO in WordPress (Beginner’s Guide)
- 31 What is robots.txt in WordPress?
- 32 How to Optimize WordPress Robots.txt File for Better SEO
- 33 What is a WordPress robots.txt file and do I need to worry about it?
- 34 How to optimize WordPress robots.txt file for better SEO
- 35 Conclusion
Where is robots txt located in WordPress?
Robots.txt usually resides in your site's root folder. You will need to connect to your site using an FTP client, or use your cPanel's file manager, to view it. It's just an ordinary text file that you can then open with Notepad.
How do I find my robots txt file?
Test your robots.txt file
- Open the tester tool for your site, and scroll through the robots.txt code to locate any highlighted syntax warnings and logic errors.
- Type in the URL of a page on your site in the text box at the bottom of the page.
- Select the user-agent you want to simulate in the dropdown list to the right of the text box.
- Click the TEST button to test access.
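If you'd rather check rules programmatically than in the tester tool, Python's standard-library `urllib.robotparser` can simulate a crawler's decision. A minimal sketch; the rules and URLs below are made-up placeholders, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt body to test against; in practice you would fetch
# your live file from https://your-website.com/robots.txt instead.
RULES = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# can_fetch() answers: may this user-agent crawl this URL?
print(parser.can_fetch("Bingbot", "https://example.com/wp-admin/"))      # False
print(parser.can_fetch("Bingbot", "https://example.com/wp-login.php"))   # False
print(parser.can_fetch("Bingbot", "https://example.com/blog/post/"))     # True
```

Note that Python's parser applies rules in file order, which can differ from Google's most-specific-match behavior when `Allow` and `Disallow` overlap.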
What is robots txt WordPress?
Robots.txt is a text file which allows a website to provide instructions to web-crawling bots. Bots check it to see if a website's owner has special instructions on how to crawl and index the site. The robots.txt file contains a set of instructions that request the bot to ignore specific files or directories.
How do I use robots txt in my website?
How to use a robots.txt file?
- Define the User-agent. State the name of the robot you are referring to (e.g. Google, Yahoo).
- Disallow. If you want to block access to pages or a section of your website, state the URL path here.
- Blocking sensitive information.
- Blocking low quality pages.
- Blocking duplicate content.
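Putting those steps together, a minimal file following this pattern might look like the sketch below (the paths are placeholder examples of sensitive, low-quality, and duplicate sections, not recommendations for any specific site):

```
User-agent: *
Disallow: /private/
Disallow: /drafts/
Disallow: /print-versions/
```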
Where is robots txt FTP?
As the name suggests, robots.txt is a simple text file. It is stored in the root directory of your website. To find it, simply open your FTP tool and navigate to your website directory under public_html.
Do I need a robots txt file?
No, a robots.txt file is not required for a website. If a bot comes to your website and there isn't one, it will just crawl your website and index pages as it normally would. A robots.txt file is only needed if you want more control over what is being crawled.
What is robots txt in SEO?
What is robots.txt? The robots exclusion protocol, better known as robots.txt, is a convention to prevent web crawlers from accessing all or part of a website. It is a text file used for SEO, containing commands for the search engines' indexing robots that specify pages that can or cannot be crawled.
How do I unblock robots txt in WordPress?
To unblock search engines from indexing your website, do the following:
- Log in to WordPress.
- Go to Settings → Reading.
- Scroll down the page to where it says “Search Engine Visibility”
- Uncheck the box next to “Discourage search engines from indexing this site”
- Hit the “Save Changes” button below.
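For context, checking that box does not edit robots.txt in recent WordPress versions; instead, WordPress prints a robots meta tag on every page, roughly like this (the exact markup can vary by version):

```html
<meta name='robots' content='noindex, nofollow' />
```

Unchecking the box removes this tag, allowing search engines to index the site again.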
Where do robots find what pages are on a website?
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl.
Where do I put robots txt in cPanel?
How to create the robots.txt file
- Log into your cPanel account.
- Navigate to FILES section and click on File Manager.
- Browse File Manager to the website directory (e.g. public_html), then click on “New File” and type in “robots.txt”.
- Now you are free to edit the content of this file by double clicking on it.
What is robots txt file in websites?
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, block indexing with noindex or password-protect the page.
WordPress Robots.txt Guide – What It Is and How to Use It
- In this article, we will discuss what a WordPress robots.txt file is and how it might benefit your website.
- How do I include a robots.txt file in WordPress?
- How do I write robots.txt rules, and what types of rules can I include?
- What is the best way to test my robots.txt file?
- What is the best way for large WordPress websites to use robots.txt?
There’s a lot to cover, so let’s get started right away!
What Is a WordPress Robots.txt?
For the sake of this discussion, it is necessary to explain what a “robot” is in the context of WordPress's robots.txt file. Robots are any sort of “bot” that accesses websites on the Internet and collects information about them. Crawlers for search engines are the most well-known example. These bots “crawl” around the web to help search engines such as Google index and rank the billions of pages on the Internet. As a result, bots are generally seen as beneficial to the Internet, or at the very least as necessary.
- During the mid-1990s, the desire to be able to control how web robots interacted with websites led to the development of a standard known as the robots exclusion standard.
- Bots can be blocked completely, or their access to specific portions of your site can be restricted, among other things.
- The robots.txt file cannot compel a bot to obey its instructions.
- Furthermore, even respected firms will ignore some of the commands that you might include in your Robots.txt file.
- If you are seeing a high volume of bot activity, a security solution such as Cloudflare or Sucuri may be beneficial.
Why Should You Care About Your Robots.txt File?
According to most webmasters, the advantages of a well-structured robots.txt file fall into two groups:
- Optimizing search engines' crawl resources by instructing them not to waste time on pages that are not intended to be indexed. This helps ensure that search engines concentrate their efforts on crawling the pages that are most important to you.
- Optimizing your server's usage by blocking bots that would otherwise squander resources.
Robots.txt Isn’t Specifically About Controlling Which Pages Get Indexed In Search Engines
Robots.txt is not a failsafe method of controlling which pages are indexed by search engines. If your primary purpose is to prevent certain sites from appearing in search engine results, the most effective option is to use a meta noindex tag or another equally direct strategy. This is due to the fact that your Robots.txt file is not explicitly instructing search engines not to index material — it is just instructing them not to crawl it. In spite of the fact that Google will not crawl the highlighted sections from inside your site, Google itself notes that if an external site links to a page that you have excluded from search results, Google may nonetheless index the page in question.
What he had to say at a Webmaster Central hangout is summarized below: if these pages are blocked by robots.txt, it could theoretically happen that someone links to one of them at random.
In that case, we would have no way of knowing that you do not want those pages indexed.
Whereas if they are not blocked by robots.txt and carry a noindex tag, then when we happen to crawl that link and see the tag, we will know these pages do not need to be indexed, and we can skip them from the indexing process entirely.
So, in that case, if you have anything on these pages that you don't want indexed, don't use the disallow directive; use the noindex keyword instead.
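For pages you control, that noindex signal is a meta tag in the page's head; for non-HTML files such as PDFs, the same signal can be sent as an HTTP response header instead. A hedged Apache example, assuming mod_headers is enabled on your server:

```
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Note that, unlike a robots.txt Disallow, this only works if crawlers are allowed to fetch the URL and see the header.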
How To Create And Edit Your WordPress Robots.txt File
How to Create And Edit A Robots.txt File With Yoast SEO
If you're using the popular Yoast SEO plugin, you can generate (and subsequently edit) your robots.txt file straight from the Yoast interface, which is really convenient. Yoast SEO's advanced features must first be enabled, however. To do so, navigate to SEO > Dashboard > Features and toggle on the Advanced settings pages. Once that's enabled, you can navigate to SEO > Tools and select File editor. If you don't yet have a physical robots.txt file, Yoast will give you the option to create one with a Create robots.txt file button. Once you've clicked that button, you'll be able to edit the contents of your robots.txt file straight from the same interface. As you continue reading, we'll go into further detail about the sorts of directives you should include in your WordPress robots.txt file.
How to Create And Edit A Robots.txt File With All In One SEO
Additionally, if you are using the almost-as-popular All in One SEO Pack plugin, you can generate and edit your WordPress robots.txt file directly from the plugin's interface. All you have to do is go to All in One SEO > Feature Manager and activate the Robots.txt feature.
Once activated, you can manage your robots.txt file by heading to All in One SEO > Robots.txt.
How to Create And Edit A Robots.txt File Via FTP
If you aren't using an SEO plugin with robots.txt capability, you can still create and maintain your robots.txt file over SFTP (Secure File Transfer Protocol). To begin, use any text editor to create an empty file titled “robots.txt”. Then connect to your site via SFTP and upload the file to the root folder of your site. You can make further changes to your robots.txt file by editing it via SFTP or by uploading fresh versions of the file to your web server.
What To Put In Your Robots.txt File
As a result, you now have a physical copy of the robots.txt file on your server, which you may modify as needed. But what exactly are you going to do with that file? Robots.txt, as you learned in the last part, allows you to govern how robots interact with your website. This is accomplished through the use of two fundamental commands:
- User-agent. This allows you to target specific bots. Bots use user agents to identify themselves; with this directive you can, for example, create a rule that applies to Bing but not to Google.
- Disallow. This allows you to instruct robots not to access specific portions of your website.

There is also an Allow command, used to grant access in specific scenarios. By default, everything on your site is allowed, so in 99 percent of cases the Allow command is not needed. It comes in handy, however, when you want to Disallow access to a folder and its child folders while still allowing access to one specific child folder. To add rules, you first specify which User-agent the rule should apply to and then list the rules using Disallow and Allow. Some other commands, such as Crawl-delay and Sitemap, also exist, but they are either ignored by most major crawlers, interpreted in very different ways, or made redundant by tools such as Google Search Console.
Let’s have a look at some concrete use examples to see how it all comes together in practice.
How To Use Robots.txt To Block Access To Your Entire Site
Consider the following scenario: you wish to prevent all crawlers from accessing your site. This is unlikely to be useful on a live site, but it comes in handy for a site under development. To accomplish this, you would include the following code in your WordPress robots.txt file:

```
User-agent: *
Disallow: /
```

What exactly is happening in that code? The * asterisk next to User-agent means the rules apply to all user agents, every single bot that visits. The / slash next to Disallow blocks access to every page on your site.
How To Use Robots.txt To Block A Single Bot From Accessing Your Site
Let's shake things up a little. Assume for the sake of this example that you are unhappy with Bing spidering and indexing your pages. You're firmly on Team Google, and you don't even want Bing to take a peek at your website. To prevent only Bing from crawling your site, you would replace the wildcard * asterisk with Bingbot:

```
User-agent: Bingbot
Disallow: /
```

Essentially, the code above states that the Disallow rule should only be applied to bots with the User-agent “Bingbot”. While it's unlikely that you'll want to block Bing, this pattern can come in handy whenever there's a specific bot you don't want accessing your site.
This website has a comprehensive list of the most commonly used User-Agent names for most services.
How To Use Robots.txt To Block Access To A Specific Folder Or File
For the sake of this example, assume that you simply wish to restrict access to a certain file or folder (and all of that folder's subfolders). To apply this to WordPress, let's say you want to block:

- The entire wp-admin folder
- wp-login.php

You could use the following commands:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
```
How to Use Robots.txt To Allow Access To A Specific File In A Disallowed Folder
Assume, for the sake of argument, that you want to block access to a whole folder, but you also want to allow access to a single file contained within that folder. This is where the Allow command comes in handy, and it has a lot of application in the WordPress world. In fact, the WordPress virtual robots.txt file illustrates the point perfectly:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

This snippet blocks access to the entire /wp-admin/ folder, with the exception of the /wp-admin/admin-ajax.php file.
How To Use Robots.txt To Stop Bots From Crawling WordPress Search Results
One WordPress-specific tweak you may want to make is to stop crawlers from crawling your search results pages. By default, WordPress uses the query parameter “?s=” for search. To block access to those results, you can add rules like the following:

```
User-agent: *
Disallow: /?s=
Disallow: /search/
```

This can also be an effective way to stop soft 404 errors if you are getting them.
How To Create Different Rules For Different Bots In Robots.txt
All of the examples shown thus far have dealt with one rule at a time. But what if you want to apply different rules to different bots? You simply add each set of rules beneath the User-agent declaration for each bot. For example, if you want one rule that applies to all bots and another rule that applies only to Bingbot, you could write it like this:

```
User-agent: *
Disallow: /wp-admin/

User-agent: Bingbot
Disallow: /
```

In this example, all bots are blocked from accessing /wp-admin/, while Bingbot is blocked from accessing your entire site.
Testing Your Robots.txt File
You can check your WordPress robots.txt file in Google Search Console to confirm that it is properly configured. Simply open your site's console and choose “robots.txt Tester” from the “Crawl” menu. From there, you can submit any URL, including your homepage. You should see a green Allowed if everything is crawlable. You can also test URLs you have blocked to confirm that they are, in fact, Disallowed.
Beware of the UTF-8 BOM
BOM is an abbreviation for byte order mark, a non-visible character that is sometimes added to files by old text editors and similar software. If this happens to your robots.txt file, Google may not read it correctly, which is why it is critical to check your file for errors. If our file contained such an invisible character, Google would complain about the syntax not being understood.
This effectively nullifies the first line of our robots.txt file, which is not a good thing at all! Glenn Gabe has written an excellent article on how a UTF-8 BOM in your robots.txt file can harm your SEO.
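You can check for a stray BOM yourself by inspecting the file's first bytes. A minimal sketch; the robots.txt content here is a made-up example:

```python
import codecs

def strip_utf8_bom(data: bytes) -> tuple[bytes, bool]:
    """Return (content without a UTF-8 BOM, whether a BOM was present)."""
    if data.startswith(codecs.BOM_UTF8):  # the bytes EF BB BF
        return data[len(codecs.BOM_UTF8):], True
    return data, False

# Simulate a robots.txt saved by an editor that silently prepended a BOM.
raw = codecs.BOM_UTF8 + b"User-agent: *\nDisallow: /wp-admin/\n"
clean, had_bom = strip_utf8_bom(raw)
print(had_bom)                           # True
print(clean.startswith(b"User-agent"))   # True
```

If a BOM is present, re-save the file as plain UTF-8 without BOM and re-upload it.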
Googlebot is Mostly US-Based
It's also vital not to block the Googlebot from the United States, even if you're targeting a local region elsewhere, because doing so will hurt your SEO. Googlebot is mostly based in the United States, although Google also performs some local crawling from time to time, as Google Search Central (@googlesearchc) noted on November 13, 2017.
What Popular WordPress Sites Put In Their Robots.txt File
To offer some context for the points raised above, here is how some of the most popular WordPress sites use their robots.txt files:
TechCrunch. In addition to restricting access to a number of unique pages, TechCrunch's file notably disallows crawlers from reaching several sections of the site. It also places extra limitations on two bots in particular. In case you're curious, IRLbot is a crawler developed as part of a research project at Texas A&M University. That's strange!
The Obama Foundation
Robots.txt file from the Obama Foundation: The Obama Foundation hasn't added anything unusual, opting only to restrict access to the /wp-admin/ directory.
Angry Birds uses the same default setup as The Obama Foundation. Nothing unique is added.
Finally, Drift chooses to define its sitemaps in its robots.txt file, but otherwise sticks to the same default restrictions as The Obama Foundation and Angry Birds.
Use Robots.txt The Right Way
As we come to the end of our robots.txt guide, we'd like to remind you once again that using a Disallow command in your robots.txt file is not the same as using a noindex tag. Robots.txt prevents crawling, but not necessarily indexing. You can use it to add rules that shape how search engines and other bots interact with your site, but it will not directly control whether or not your material is indexed. For the vast majority of WordPress users who are just getting started, there is no urgent need to modify the default virtual robots.txt file.
We hope you found this article to be helpful, and please feel free to leave a comment if you have any further questions regarding how to use your WordPress robots.txt file.
The Complete Guide to WordPress Robots.txt
02 Feb 2022 / Will M. / 7 min read. It is essential that you make it simple for search engine ‘bots’ to explore your site's most significant pages in order to ensure that your site ranks well in Search Engine Result Pages (SERPs).
A well-structured robots.txt file will make it easier to guide those bots to the pages you want them to index (and away from the rest). In this tutorial, we'll go over the following topics:
If you follow along with our discussion, you'll have all you need to set up a perfect robots.txt file for your WordPress website. Let's get started!
What a WordPress robots.txt File Is (And Why You Need One)
WordPress's default robots.txt file is rudimentary, but you can easily replace it with something more sophisticated. As soon as you publish a new website, search engines dispatch their minions (also known as bots) to ‘crawl’ through it and compile a map of all the pages it contains. That way, they'll know which pages to display as search results when people search for related terms. This is straightforward at a fundamental level. The catch is that modern websites contain much more than just pages.
- You don’t want these to appear in your search engine results, though, because they aren’t relevant to your search terms and phrases.
- “Hey, you may have a peek here, but don’t go into those rooms over there!” the message warns them.
- In practice, search engines will continue to crawl your website even if your robots.txt file is not properly configured.
- By not including this file, however, you're leaving it up to the bots to index all of your content, and because they're so thorough, they may end up revealing sections of your website that you don't want other people to see.
- Letting bots crawl everything can also have a detrimental influence on your site's overall performance.
- After all, there are few things people dislike more than a sluggish website (and that includes us!).
Where the WordPress robots.txt File Is Located
When you build a WordPress website, the software automatically creates a virtual robots.txt file in the main folder of your server. For example, if your site is located at yourfakewebsite.com, you should be able to visit yourfakewebsite.com/robots.txt and see a file that looks something like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

This is a very simple robots.txt file. The portion immediately following User-agent: declares which bots the rules below apply to.
Specifically, the file instructs such bots that they are not permitted to access yourwp-adminandwp-includesdirectories.
You may, on the other hand, choose to include additional rules in your own file.
Most of the time, the WordPress robots.txt file is located in your root directory (often called public_html, or named after your site). However, the robots.txt file that WordPress creates for you by default is virtual: it isn't accessible from any directory, so you cannot edit it directly.
In a moment, we’ll go through a few different approaches to creating a new robots.txt for WordPress. For the time being, though, let’s speak about how to identify which rules should be included in yours.
What Rules to Include in Your WordPress robots.txt File
The last section demonstrated how WordPress generates a robots.txt file for you. It contained only two brief rules, but most websites set up more than that. To demonstrate, let's compare two robots.txt files and discuss what they each do differently. Here is the first WordPress robots.txt example we'll be looking at:

```
User-agent: *
Allow: /
# Disallowed Sub-Directories
Disallow: /checkout/
Disallow: /images/
Disallow: /forum/
```
- This is an example of a genericrobots.txt file for a website that includes a forum.
- If your forum is intended for a certain purpose, you may wish to restrict access to it.
- You might also create rules that indicate which sub-forums should be avoided, and then allow search engines to scan the remainder of the forum.
- The Allow: / line instructs bots that they are permitted to crawl all of your website's pages, with the exception of those for which you have specified exclusions below.
- Check out this other WordPress robots.txt example:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

User-agent: Bingbot
Disallow: /
```
- However, we have also implemented a new set of controls that prevent the Bing search crawler from crawling through our website.
- You have the ability to be quite exact about which search engine’s bots have access to your website and which ones do not, depending on your needs.
- There are, nevertheless, certain malevolent bots that exist on the internet.
- Please bear in mind that, while most bots will follow the instructions you supply in this file, they are not obligated to do so by you or the server administrator.
- If you do some research on the matter, you'll discover that there are several recommendations for what to allow and what to prohibit on your WordPress website.
Consider the following as an example of the format in which we propose your first robots.txt file should be written:

```
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
```

Traditionally, WordPress has preferred to restrict access to the wp-admin and wp-includes directories, among other places, but this is no longer considered a best practice.
Furthermore, if you include metadata in your photographs for the sake of Search Engine Optimization (SEO), it does not make sense to prevent bots from crawling that information.
Instead, the two guidelines listed above cover the requirements of most simple sites. What you include in your robots.txt file, on the other hand, will be determined by your individual site and requirements. So please feel free to conduct more research on your own!
How to Create a WordPress robots.txt File (3 Methods)
Once you've established what should be included in your robots.txt file, the only thing left to do is actually create one. You can edit robots.txt in WordPress either with a plugin or by hand. In this section, we'll show you how to use two popular plugins for the task, as well as how to create and upload the file yourself. Let's get this party started!
1. Use Yoast SEO
Yoast SEO is a well-known name in the SEO community. It’s the most widely used SEO plugin for WordPress, and it allows you to optimize your articles and pages so that they make better use of your keyword phrases. Aside from that, it may also assist you in improving the readability of your material, which means that more people will be able to benefit from it as well. In terms of simplicity of use, we choose Yoast SEO above other similar tools. That holds true for the creation of an arobots.txtfile as well.
There’s also a useful button that says “Create robots.txt file,” which does exactly what you’d expect: it creates a robots.txt file with the information you provide.
Take note that Yoast SEO creates its own default rules, which take precedence over your existing virtual robots.txt file, so be sure to keep that in mind.
Let’s have a look at how another popular plugin accomplishes the same task.
2. Through the All in One SEO Pack Plugin
When it comes to WordPress SEO, the All in One SEO Pack is the other big name to know. It provides the majority of the functions offered by Yoast SEO, though some users prefer it since it is a smaller and more lightweight plugin. In terms of robots.txt, the process of creating the file with this plugin is similarly straightforward. Once you’ve installed the plugin, go to the All in One SEO Feature Manager page in your dashboard to begin. A feature named Robots.txt is located there, with a prominent Activate button directly beneath it.
If you click on it, you’ll be presented with the following options: add new rules to your file, save the modifications you’ve made, or remove the file entirely. Please keep in mind that you will not be able to manually modify your robots.txt file with this plugin.
More crucially, All in One SEO Pack contains a function that can assist you in blocking ‘evil’ bots, which can be accessed from your All in One SEO tab: Bot Blocking Tool.
If you want to go with this strategy, there is nothing further you need to do. However, let’s talk about how to manually create a robots.txt file if you don’t want to install an additional plugin just for this task.
3. Create and Upload Your WordPress robots.txt File via FTP
It couldn’t be much easier to create a .txt file. To get started, just open a text editor of your choice (such as Notepad or TextEdit) and type in a few lines of text. After that, you may save the file with whatever name you wish and the .txt file extension. It literally takes seconds, so it stands to reason that you might want to edit robots.txt in WordPress without the use of a plugin. Here’s a brief sample of one of these files. For the purposes of this lesson, we saved the file directly to our computer’s hard drive.
If you’re not sure how to go about it, we offer a tutorial on how to accomplish it using the FileZilla client, which is designed for beginners.
All you need to do then is upload the robots.txt file from your computer to the root of your server.
As you can see, utilizing this way is almost as straightforward as using a plugin in some cases.
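For those comfortable scripting the process, the create-and-upload steps above can be sketched in a few lines of Python. This is a hedged example, not the guide’s own method: the host, username, and password are placeholders you would replace with your real FTP credentials.

```python
from ftplib import FTP
import io

# The two starter rules discussed earlier in this guide.
RULES = """User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
"""

# Step 1: save the file locally, exactly as you would in Notepad/TextEdit.
with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(RULES)

# Step 2 (sketch): upload it to the site root over FTP.
# Host and credentials are placeholders, not real values.
def upload_robots_txt(host, user, password, rules=RULES):
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary("STOR robots.txt", io.BytesIO(rules.encode("utf-8")))

# upload_robots_txt("ftp.your-website.com", "user", "password")
```

The upload call is commented out because it requires a live server; an FTP client such as FileZilla accomplishes the same thing interactively.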
How to Test Your WordPress robots.txt File and Submit It to Google Search Console
Once your robots.txt file is live, it’s a good idea to verify that it works as intended. Google Search Console includes a robots.txt testing tool: connect your site to Search Console, open the tester, and check that your rules block and allow the URLs you expect. If everything validates, you’re done; Google will pick the file up automatically from your site’s root the next time it crawls.
It is essential that you ensure search engine bots are crawling the most relevant material on your website in order to enhance its visibility online. As we’ve seen, a properly configured WordPress robots.txt file will allow you to specify exactly how those bots interact with your website, which is really useful. They’ll be able to provide searchers with more relevant and valuable material as a result.
Is there anything more you’d like to know about editing robots.txt in WordPress? Please share your thoughts in the comments box below! Will Morris works as a writer on the WordCandy team. Aside from blogging about WordPress, he enjoys performing his stand-up comedy routine around the local area.
How To Access & Modify Robots.txt In WordPress (Guide)
Basic information about the robots.txt file
Robots.txt is a text file that tells search engine bots which pages or files should and should not be crawled. It is located in the root directory of the website.
- This file is intended to prevent a website from being overloaded with crawler requests (see my comprehensive tutorial on optimizing the crawl budget)
- It is not, however, a method of preventing web pages or files from being indexed and appearing in search results
- Search engine crawlers always begin scanning your website by checking the directives in the robots.txt file
- Search engines should, but do not always, adhere to the directives in the robots.txt file (Google, for example, does).
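To see what “checking the directives” looks like from the crawler’s side, here is a small sketch using Python’s standard library: a compliant bot parses the rules and asks permission for each URL before fetching it. The domain and paths are illustrative, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Rules similar to a typical WordPress robots.txt.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/uploads/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler checks each URL against the parsed rules.
print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
print(rp.can_fetch("*", "https://example.com/wp-content/uploads/a.jpg")) # True
```

Note that this check is purely voluntary, which is exactly why robots.txt cannot guarantee that a page stays out of search results.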
In case you want a quick refresher, the following is some basic technical information concerning robots.txt:
- The only valid placement for the robots.txt file is the website’s root directory (the main directory). This holds true for every website, whether it runs WordPress or not. A website can have only one robots.txt file, and robots.txt is the only permissible name for it
- Robots.txt must be a text file encoded in UTF-8.
Keeping this in mind, the placement of the WordPress robots.txt file is the same as for any other website: the root of the site. That is the case for my website, and it is what a standard WordPress robots.txt file looks like. One of the advantages of WordPress is that, even without FTP access, it allows you to reach your robots.txt file in a variety of ways. ☝️ If you want to understand more about the robots.txt file, how it works, and what it is, be sure to read the introduction to robots.txt in Google Search Central, which will teach you all you need to know.
4 ways to access the robots.txt in WordPress
Below are the four different methods by which you may access and alter your WordPress site’s robots.txt file.
1: Use an SEO plugin
There is a plethora of WordPress SEO plugins available, but only a couple are truly effective — Rank Math and Yoast SEO, which I cover below. The All in One SEO plugin is also available, though I am not a huge fan of that particular plugin. All of these SEO plugins make it simple to access and edit robots.txt files.
Access robots.txt with Rank Math
If you are employing Rank Math, the following is what you must do:
- As an administrator, log in to your WordPress Dashboard. Note that only administrators have the ability to edit plugin settings. Then go to Rank Math SEO.
- In the left sidebar, under Rank Math, you can find a number of various Rank Math configuration options. Select General Settings from the drop-down menu.
- You will now be presented with a range of general SEO configuration options. Select Edit robots.txt from the drop-down menu.
- Rank Math will automatically manage robots.txt for you if you do nothing.
- If you want to make changes to robots.txt, all you have to do is start entering in the text field and then click Save Changes.
- If you are unsure about what to include in robots.txt or if you make a mistake, simply click Reset Options.
That concludes the discussion of Rank Math and robots.txt.
Access robots.txt with Yoast SEO
If you are using Yoast SEO, here is what you need to do:
- As an administrator, log in to your WordPress Dashboard. Keep in mind that only administrators have the ability to edit plugin settings. Then go to SEO.
- In the SEO section of the left sidebar, you will find a few options to configure. Select Tools.
- When you’re finished editing, just click Save changes to robots.txt to save your modifications.
These two approaches, using these two SEO plugins, will be enough for the vast majority of WordPress websites. There are, however, alternative options.
2: Use a dedicated robots.txt plugin
There are also a plethora of other WordPress plugins that are designed expressly to allow you to change the robots.txt document. For your convenience, we’ve compiled a list of the most popular robots.txt WordPress plugins that you might want to consider:
- Robots.txt Editor
- WordPress Robots.txt Optimization
- Virtual Robots.txt
Each of these allows you to quickly access and alter the robots.txt file. Follow the simple instructions below to gain access to the robots.txt file using one of these three WordPress plugins. It is important to note that some plugins do produce a robots.txt file that is put at the root of your website, while others build it dynamically on your page.
If robots.txt is generated dynamically, you will not be able to discover it in the root directory of your website (e.g., when accessing it using FTP). Keep in mind that you should not use more than one robots.txt plugin at the same time!
The Virtual Robots.txt plugin may be used in the following way:
- Install and activate Virtual Robots.txt
- Select Virtual Robots.txt from the WordPress dashboard
- Once there, open the file, make your changes, and save them.
That is all there is to it!
- Install and activate Better Robots.txt (also listed as WordPress Robots.txt Optimization by the same developer).
- Open Better Robots.txt. The plugin will be accessible from the left-hand sidebar.
- You will now be presented with a plethora of useful and user-friendly options
- Manually allowing or disabling individual crawlers is also possible with the plugin.
- You may also make changes to some of the settings to safeguard your data or to improve the loading speed.
- There are also some more intriguing options available through the plugin. I propose that you look into them.
Better Robots.txt is a tool that allows you to generate a robots.txt file that is completely customizable without having to write a single line of code.
- You now have access to and control over your robots.txt file. When you’re finished, click Save Changes to save your work.
❗ Keep an eye on things and avoid using several robots.txt editors at the same time. This has the potential to cause major problems. If you are using an SEO plugin, you should only utilize the robots.txt editor provided by the plugin.
3: Access robots.txt via cPanel in your hosting
In this section, I’ll show you how to add robots.txt to WordPress without the need for FTP access. Robots.txt can also be created or modified using the cPanel provided by your hosting provider. Note that several of the plugins listed above generate robots.txt on the fly, which means you won’t be able to locate the file in the root directory of your website. If you use a plugin to create and maintain robots.txt, you should avoid manually adding a new robots.txt file via cPanel or FTP unless absolutely necessary (below).
The following is an example of how to accomplish it using Bluehost:
- If you don’t rely on such plugins, you can manually add a robots.txt file to your WordPress website and update it as needed. Log in to your hosting cPanel and open the File Manager.
- Navigate to the root directory of your website. Unless you’re using a plugin that generates robots.txt on the fly, you should see a robots.txt file there.
- You can now make changes to the contents of the robots.txt file. When you’re finished, click Save Changes
If your website does not already have a robots.txt file, you will need to build and submit one as soon as possible. Here’s what you should do:
- Create a robots.txt file for your website using a plain text editor such as Notepad or Notepad++.
- Upload the file to the root directory of your website. In Bluehost, you just click the Upload button while in the File Manager. Just be sure you are in the root directory.
Both WordPress and non-WordPress websites will be able to benefit from this strategy.
4: Use FTP to access robots.txt
I’m going to demonstrate how to access robots.txt through FTP in this section. Using FTP to access robots.txt is the quickest and most “pro” method for doing so. This strategy is also applicable to any website type, including WordPress-based websites. ❗ In the case when you are using a plugin to create robots.txt for you, I do not advocate using this technique. This approach is identical to what you would do using the cPanel in your hosting account, with the exception that you will be accessing your website using an FTP client.
What you need to do is as follows:
- Use an FTP client to connect to your website, then navigate to its root directory.
- If there is a robots.txt file, download it
- Open it in a text editor such as Notepad++ to see what it contains
- Make any necessary changes and save the file
- Upload the file back to the root directory of your website; this will overwrite the existing robots.txt.
This is also the method for creating and uploading a robots.txt file if your website does not already have one.
How to test the robots.txt file on your website
If you have made any changes to robots.txt, I highly advise that you test the file to ensure that it is correct and that it accomplishes what it is designed to do in the first place. Here’s what you should do:
- Make use of the robots.txt testing tool, which is a component of Google Search Console; this brief tutorial explains how to test a robots.txt file with it.
Here are some additional guides that you might find interesting:
- How to verify your site in Google Search Console with WordPress (in three easy steps)
- How to locate the sitemap of a website (in eight different ways)
- How to find organic traffic in Google Analytics (and filter out spam bot traffic)
Olga Zarzeczna is a senior SEO professional with more than eight years of industry experience in search engine optimization. So far, she has completed more than 150 SEO audits. She has also finished Moz Academy and holds Google credentials: she is a Google Product Expert with expertise in Google Search and Google Webmasters, and she continues to study SEO, which she enjoys.
How to Optimize Your Robots.txt for SEO in WordPress (Beginner’s Guide)
What is a robots.txt file?
A robots.txt file is a text file that website owners can create and store on their servers; search engine bots crawl and index pages based on its contents. The file is normally saved in the root directory of your website, also known as the main folder. In its most basic form, a robots.txt file consists of a User-agent line followed by Disallow, Allow, and Sitemap directives. You can have numerous lines of instructions to allow or block specific URLs, as well as multiple sitemaps, in a single file.
As an example, consider a robots.txt file containing User-Agent: *, Allow: /wp-content/uploads/, Disallow: /wp-content/plugins/, Disallow: /wp-admin/, and a Sitemap line. In this example, we have granted search engines permission to crawl and index the files in our WordPress uploads directory while blocking the plugins and admin directories.
Finally, we’ve included the URL to our XML sitemap for your convenience.
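Assembled as an actual file, the example just described would look like this (the sitemap URL is a placeholder to be replaced with your own):

```text
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap.xml
```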
Do You Need a Robots.txt File for Your WordPress Site?
Even if you do not have a robots.txt file, search engines will still crawl and index your website. You will not, however, be able to tell search engines which pages or directories they should avoid crawling. As a new blogger with little or no material, this will have little effect on your blog’s visibility. However, as your website grows and accumulates a large amount of material, you will most likely want greater control over how it is crawled and indexed by search engines.
- Each website has a crawl quota that search bots must adhere to.
- If they do not finish crawling all of the pages on your site in one session, they will return and continue crawling in the following session.
- To make the most of that quota, you can prevent search engines from crawling pages that aren’t required, such as your WordPress administration pages, plugin files, and themes folder.
- This allows search engines to crawl and scan even more pages on your site, allowing them to index your content as rapidly as possible.
Although disallowing pages is not the safest method of keeping material hidden from the wider public, it will assist you in preventing that material from appearing in search results.
What Does an Ideal Robots.txt File Look Like?
Many well-known blogs make use of a straightforward robots.txt file, whose contents may differ based on the requirements of the particular site. The simplest version allows all bots to index all material on the website and gives them a link to the website’s XML sitemaps. For a WordPress site, we propose allowing the uploads directory, disallowing the wp-admin directory along with the readme.html file and the /refer/ redirect path, and listing your sitemaps. This instructs search engines to index all of the pictures and files in the WordPress uploads folder while keeping core WordPress files out of the crawl.
By including sitemaps in your robots.txt file, you make it easier for Google’s bots to crawl your site and locate all of its pages.
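Put together, the proposed rules for a WordPress site would look something like this (the sitemap URLs are placeholders for your own sitemap locations):

```text
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: https://example.com/sitemap_index.xml
Sitemap: https://example.com/post-sitemap.xml
```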
How to Create a Robots.txt File in WordPress?
There are two different ways to generate a robots.txt file in WordPress; you have the option of selecting the method that best suits your needs. Method 1 is modifying the robots.txt file with All in One SEO. Known as AIOSEO, it is the best-known WordPress SEO plugin on the market, used by more than 2 million websites to date. It is simple to use and includes a robots.txt file generator. If you do not already have the AIOSEO plugin installed on your site, please follow our step-by-step instructions on how to install a WordPress plugin.
- As soon as the plugin is installed and functional, you will be able to use it to generate and change your robots.txt file right from the WordPress administration area.
- To begin, you’ll need to flip the ‘Enable Custom Robots.txt’ toggle to the blue position, which will enable the editing feature.
- The ‘Robots.txt Preview’ area at the bottom of your screen will display the contents of your existing robots.txt file created by All in One SEO.
- When you use these default settings, the search engines are informed that they should not scan your core WordPress files, but they are allowed to index all of your content and are given a link to your site’s XML sitemaps.
- In the ‘User Agent’ section, provide the name of the user agent that will be used to create the rule.
- Then choose whether you want the search engines to be able to crawl your site or not by selecting Allow or Disallow.
- The rule will then be added to your robots.txt file automatically.
We propose that you continue to add rules until you get the optimum robots.txt structure that we discussed before.
Don’t forget to click on the ‘Save Changes’ button after you’re finished to ensure that your changes are saved.
Method 2: Edit the robots.txt file manually using FTP.
Using an FTP client, connect to your WordPress hosting account and look in your site’s root folder for the robots.txt file.
If you don’t see one, it’s likely that your site doesn’t have a robots.txt file yet.
Robots.txt is a plain text file, which means that you may download it to your computer and edit it with any plain text editor, such as Notepad or TextEdit, without any special software. After you’ve saved your modifications, you may upload the file back to the root folder of your website.
How to Test Your Robots.txt File?
Following the creation of your robots.txt file, it’s usually a good idea to run it through a robots.txt testing program to ensure that it’s working properly. However, we recommend that you use the one that is built within Google Search Console rather than any other robots.txt testing tool. First and foremost, you’ll want to make sure that your website is linked to Google Search Console. If you haven’t already, have a look at our advice on how to submit your WordPress site to Google Search Console.
Simply choose your property from the drop-down menu that appears.
Optimizing your robots.txt file discourages search engines from crawling pages that aren’t publicly relevant — pages in your wp-plugins folder, for example, or pages in your WordPress admin folder. A widespread misconception among SEO practitioners is that blocking WordPress category, tag, and archive pages will improve crawl rate, resulting in faster indexing and higher rankings. This is not correct, and it is also against Google’s webmaster guidelines. We recommend that you generate a robots.txt file for your website using the robots.txt format described above.
- In addition, you may be interested in our comprehensive WordPress SEO guide and the finest WordPress SEO tools for growing your website.
- The Editorial Staff at WPBeginner is a group of WordPress specialists, led by Syed Balkhi, who provide guidance and support.
What is robots.txt in WordPress?
Robots.txt is a text file that allows a website to offer instructions to web-crawling bots. Internet search engines such as Google make use of web crawlers, also known as web robots, in order to archive and classify websites. Most bots are programmed to look for a robots.txt file on the server before reading any other files from the site. They do this to determine whether the owner of the website has provided any special instructions on how to crawl and index it.
These files and folders may be hidden for the sake of privacy, or because the website’s owner feels that their contents are irrelevant to how the site is classified by search engines.
It is crucial to remember that not all bots will obey the rules set out in a robots.txt configuration file.
It is also possible for sites to appear in search results despite the fact that a robots.txt file advises bots to avoid certain pages on a website if the pages in question are referred to by other pages that are crawled.
How to Optimize WordPress Robots.txt File for Better SEO
If you’re looking to optimize your WordPress robots.txt file for better search engine rankings, you’ve come to the right place. In this quick article, I’ll explain what a robots.txt file is, why it’s crucial to boosting your search rankings, and how to make changes to it and submit it to Google.
What is a WordPress robots.txt file and do I need to worry about it?
Robots.txt is a text file that you may place on your website to deny search engines access to specific files and directories. Using it, you can prevent Google’s (and other search engines’) bots from crawling specific pages on your website. The following is an example of the file. So, how can restricting search engines’ access to your website actually benefit your SEO? It appears counter-intuitive, but it works like this: the more pages your site has, the more pages Google must crawl — and Google allots each site only a limited crawl budget.
- Crawl budget is crucial because it impacts how soon Google picks up on updates to your site – and, consequently, how quickly you are ranked in search results.
- Just make sure you do it correctly, since if you don’t, it might have a negative impact on your SEO.
- So, do you need to make any changes to the robots.txt file in your WordPress installation? Most likely yes if you are in a highly competitive niche and run a large website.
- If you’re just getting started with your blog, though, building links to your content and producing a large number of high-quality posts should be your top priorities.
How to optimize WordPress robots.txt file for better SEO
What does an ideal robots.txt file look like?
The format of a robots.txt file is really straightforward. The first line typically names a user agent — the search bot you are addressing, such as Googlebot or Bingbot. You can address all bots with the asterisk (*) symbol. Subsequent lines give instructions to search engines: Allow and Disallow directives let them know which portions of your website you want them to index and which you don’t.
If a sitemap line is not added automatically, you may enter it manually, as shown in the sample above.
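The user-agent mechanics described above can be verified with Python’s standard library: a group addressed to a specific bot overrides the * group for that bot. The bot names and paths below are illustrative, not recommendations.

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/

User-agent: Bingbot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Bingbot matches its own group, which blocks the whole site.
print(rp.can_fetch("Bingbot", "https://example.com/page"))         # False
# Googlebot falls back to the * group: only /private/ is blocked.
print(rp.can_fetch("Googlebot", "https://example.com/page"))       # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

This is why a named group should contain every rule that bot needs — the * group is ignored once a more specific group matches.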
What should I disallow or noindex?
As stated in Google’s webmaster guidelines, webmasters should refrain from using their robots.txt file to conceal low-quality material. Likewise, using your robots.txt file to prevent Google from indexing your category, date, and other archive pages isn’t necessarily a sensible decision. Keep in mind that robots.txt controls what bots crawl, not what ultimately gets indexed — a disallowed page can still appear in search results if other crawled pages link to it.
The readme.html file should, however, be excluded from your robots.txt file, as recommended by Google.
A human can still readily view the file by simply navigating to it in a browser.
How do I submit my WordPress robots.txt file to Google?
You don’t have to actively submit your robots.txt file to Google. Once it sits at the root of your domain, Google discovers it automatically the next time it crawls your site. To confirm that the file works as intended, open Google Search Console and use its robots.txt testing tool to check your rules against sample URLs.
You now understand how to optimize the robots.txt file in WordPress for better SEO. Remember to use caution when making significant modifications to your website’s robots.txt file: while these modifications can increase your search traffic, careless changes can do more harm than good. Check out our comprehensive list of WordPress tutorials if you’re still hungry for more knowledge, and please let us know in the comments section if you have any questions about optimizing the WordPress robots.txt file.