Note: To use SEO URLs, you need to rename htaccess.txt to .htaccess, and the Apache module mod_rewrite must be installed. If the .htaccess file has already been renamed, you should see its contents in the Text Editor and can skip this step.
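The rename can be done from the command line. A minimal sketch, assuming the store lives in /var/www/html/store (adjust the path to your installation) and Apache runs on a Debian/Ubuntu-style system:

```shell
# Rename the shipped template to the active .htaccess file
cd /var/www/html/store
mv htaccess.txt .htaccess

# Check whether mod_rewrite is already loaded
apachectl -M | grep rewrite

# On Debian/Ubuntu, enable the module and reload Apache if it is missing
sudo a2enmod rewrite
sudo systemctl reload apache2
```

On other distributions the module is usually enabled by adding or uncommenting a `LoadModule rewrite_module` line in the Apache configuration instead.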
Editing the .htaccess file
The file itself does not need any changes. However, if you want to add custom redirects or additional rewrites, you can use the Text Editor. Just make sure to keep a backup of the file first.
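As an illustration of a custom redirect, the snippet below permanently forwards one old URL, and one old URL pattern, to new locations. The paths (/old-page, /new-page, old-category, new-category) are hypothetical placeholders, and the exact placement relative to the existing rewrite rules can matter, so test after saving:

```apache
# Hypothetical example: 301-redirect a single retired page
Redirect 301 /old-page /new-page

# Hypothetical example: 301-redirect a whole renamed category via mod_rewrite
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```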
For NGINX servers, take a look at this article to see how they are configured.
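NGINX has no .htaccess support, so the rewrite lives in the server block instead. A minimal sketch of the commonly used pattern for stores of this kind, where any path that is not a real file is handed to index.php as a route (the `_route_` parameter name is an assumption based on the stock Apache rules and may differ for your store):

```nginx
location / {
    # Hand non-existent paths to index.php as an SEO route (assumed parameter name)
    try_files $uri $uri/ /index.php?_route_=$uri&$args;
}
```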
Robots.txt is a text file in the root folder of your store, which search engines visit periodically. You can use it to add URLs of pages you don’t want to be indexed.
Editing the robots.txt file
If your robots.txt is empty, you can copy the snippet below, which contains the essential lines for the store. It allows crawlers to visit every page of your store except the admin panel, checkout, customer account pages, search pages, and filtered or sorted listing pages:
User-agent: *
Disallow: /admin
Disallow: /*route=product/search
Disallow: /*route=checkout/
Disallow: /*route=account/
Disallow: /*sort=
Disallow: /*order=
Disallow: /*limit=
Disallow: /*page=
Disallow: /*filter_name=
Disallow: /*filter_description=
Note: Robots.txt is cached by the search engines, and the latest cached version can be found in Google Webmaster Tools and/or Bing Webmaster Tools.