Easily create a robots.txt file from the NOP Admin area, with no FTP access or plain-text file editing required. The robots exclusion protocol, better known as the robots.txt file, tells web robots which pages on your site to crawl and, equally important, which pages not to crawl. Googlebot (Google’s web crawler) has a crawl budget, so websites with a lot of pages can improve their search engine optimization by using the robots.txt file to focus Googlebot’s crawl budget on the most valuable pages.
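As a sketch of what such a file might look like, here is a typical robots.txt for a store that keeps crawlers out of cart, checkout, and admin pages (the paths and sitemap URL are placeholders, not the plugin's defaults):

```
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /admin

Sitemap: https://www.example.com/sitemap.xml
```

Blocking low-value pages like these leaves more of the crawl budget for category and product pages.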
Add Google Analytics, Google Tag Manager, LeadFormix, Act On, and Web Trax scripts from the NOP Admin interface. No coding or editing of skin files is required: simply add the script details and the tracking code manager will do the hard work for you, injecting the code in the correct location throughout the site. Need to make adjustments? Codes can be removed or changed just as easily as they are added.
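For reference, the kind of snippet the manager injects looks like the standard Google Analytics (gtag.js) tag below; the measurement ID `G-XXXXXXXXXX` is a placeholder you would replace with your own:

```html
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```

With the tracking code manager you paste this once in the admin area instead of editing the theme's layout files.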
Whether the cause is a website redesign or a change in page categorization, failing to apply 301 redirects when your URL structure changes can be devastating to your traffic. With the FM SEO Helper you can avoid this by entering 301 redirects individually via an easy-to-use web interface or in bulk via a .csv import.
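A bulk import file is simply a list of old-URL/new-URL pairs. The column names below are illustrative only, since the plugin's exact import format isn't specified here:

```csv
OldUrl,NewUrl
/summer-sale-2019,/sale
/books/old-category,/books/new-category
/blog/old-post-title,/blog/new-post-title
```

Each row maps a retired URL to its permanent replacement, so visitors and search engines following old links land on the right page.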
The “Rules Manager” provides an interface for creating custom rules that will be imported into the site’s web.config.
Typical uses for these rules are setting up custom rewrite rules and/or custom redirect rules.
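As a hedged example of what an imported rule might look like in web.config, here is a standard IIS URL Rewrite redirect rule; the rule name and URL patterns are placeholders:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect /old-category/* to /new-category/* -->
      <rule name="RedirectOldCategory" stopProcessing="true">
        <match url="^old-category(/.*)?$" />
        <action type="Redirect" url="new-category{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

`redirectType="Permanent"` issues an HTTP 301, which tells search engines to transfer ranking signals to the new URL.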
The “Outbound Rules Manager” provides an interface for creating custom outbound rules that will be imported into the site’s web.config.
Outbound rewrite rules modify HTTP responses. For example, if your website’s navigation structure has changed, you can create an outbound rule that rewrites the URLs in your content so that your web pages point to the correct locations. You can then create inbound rules that redirect client requests based on cached locations to the new URLs.
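A minimal sketch of an outbound rule in IIS URL Rewrite syntax, rewriting anchor links in HTML responses (rule name, precondition name, and URL patterns are placeholders):

```xml
<rewrite>
  <outboundRules>
    <!-- Rewrite <a href="/old-category/..."> links in outgoing HTML -->
    <rule name="RewriteOldLinks" preCondition="IsHtml">
      <match filterByTags="A" pattern="^/old-category(/.*)?$" />
      <action type="Rewrite" value="/new-category{R:1}" />
    </rule>
    <preConditions>
      <preCondition name="IsHtml">
        <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
      </preCondition>
    </preConditions>
  </outboundRules>
</rewrite>
```

The precondition restricts the rule to HTML responses so binary content such as images is never scanned.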
The “Metadata” manager provides a web interface for quickly finding, in a single location, entities whose search engine metadata needs to be updated or completed.
Simply use the search fields to query the database for topics, categories, or products, and edit the results via the inline editor fields in the results grid.