When we’re launching a new SEO campaign at FirmFinder, one of the first things we check is a website’s robots.txt file. The robots.txt file tells search engine crawlers which parts of your website they’re allowed to crawl. If the file is set to “disallow” everything, an entire website, or some of its most important pages, can be kept out of search results. Until the robots.txt file is updated, the website is essentially telling Google to act as though it doesn’t exist.
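As a concrete illustration, here is what the two extremes look like. The file lives at the root of the domain (for example, yourfirm.com/robots.txt), and the difference between blocking everything and allowing everything is a single character:

```text
# Blocks ALL crawlers from the ENTIRE site -- nothing gets crawled
User-agent: *
Disallow: /

# By contrast, an empty Disallow line allows crawlers everywhere
User-agent: *
Disallow:
```

Because the two versions look so similar, a "Disallow: /" left over from development is an easy mistake to miss.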
Every site’s robots.txt file looks a little different depending on the design platform, general setup, and the client’s or designer’s preferences. At FirmFinder, we design your law firm’s website in line with SEO best practices for the industry. So, we’ll always make sure that your website’s robots.txt file is set up properly, and that Google indexes your new site quickly.
It rarely makes sense for a website to have a disallow rule in its robots.txt file unless the site is being actively redesigned. If a development site is not yet complete, it can be helpful to tell Google not to crawl it until the redesign is live. Another disallow scenario might be a law firm that has added a new location or service and is building a page about that change over several weeks. The firm can add the URL for the page to the disallow section of robots.txt, but it needs to make sure the rule is removed once the page is ready.
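For the technically inclined, you can verify what a given robots.txt actually blocks using Python’s standard library. This sketch uses a hypothetical in-progress page URL (`/new-houston-office/`) to show that a single-page disallow blocks only that page while leaving the rest of the site crawlable:

```python
from urllib import robotparser

# Hypothetical robots.txt for a firm building a new location page.
# Only the unfinished page is disallowed; everything else stays crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /new-houston-office/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The in-progress page is blocked for all crawlers...
print(parser.can_fetch("*", "https://example.com/new-houston-office/"))
# ...while existing pages remain open to them.
print(parser.can_fetch("*", "https://example.com/practice-areas/"))
```

Remember that the disallow line, like any temporary rule, should be deleted as soon as the new page goes live.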
If you have questions about your website’s robots.txt, or are interested in expert digital marketing services for your law firm, please get in touch with the skilled team at AdvancedMktgSolutions.