One of the main reasons developers reach for popular JavaScript frameworks is how quickly they let you get an SEO-friendly website up and running. But when you need to configure very basic things like a sitemap, robots.txt, or a third-party script, it turns out you have to dig a bit to figure out how. I've noticed people asking where robots.txt goes in Next.js, so I thought I'd write a short post about it.
At least for this file, setup is very straightforward. If you're using Next.js version 13 or later with the App Router, all you have to do is go to the app folder at the root of your project and create a new file called robots.txt.
The most basic robots.txt looks like this:
User-agent: *
Allow: /
Sitemap: https://YOUR_DOMAIN_NAME.com/YOUR_SITEMAP.xml
- User-agent: This directive specifies the web crawler to which the rule applies.
- *: The asterisk is a wildcard that represents "all" user agents. In other words, any web crawler or bot, regardless of its name or origin, should follow the rules defined after this line.
- Allow: /: This explicitly allows web crawlers to access the entire website, starting from the root directory.
- Sitemap: This directive provides search engine crawlers with the location of your website's XML sitemap.
Inside the robots.txt file you can also block certain routes from being crawled with the Disallow directive, for example:
Disallow: /admin/
Disallow: /private/
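Putting it together, a robots.txt that blocks those two routes while allowing everything else might look like this (the domain and sitemap name are placeholders, as above):

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Sitemap: https://YOUR_DOMAIN_NAME.com/YOUR_SITEMAP.xml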
It's not really clear to me why the Next.js team didn't include at least a basic robots.txt file by default. The Next.js docs also explain how to generate a dynamic robots.js/robots.ts file, but I think a static robots.txt is sufficient for most websites out there.
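If you do want the dynamic version, the App Router lets you export a default function from app/robots.ts that returns your robots rules. Here is a minimal sketch based on the Next.js metadata docs; the domain and sitemap name are placeholders, as in the static example:

import type { MetadataRoute } from 'next'

// app/robots.ts: Next.js serves the returned object as /robots.txt
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: ['/admin/', '/private/'],
    },
    // Placeholder domain and sitemap name
    sitemap: 'https://YOUR_DOMAIN_NAME.com/YOUR_SITEMAP.xml',
  }
}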
Since we are talking about robots.txt, I thought I'd mention sitemaps as well. Like robots.txt, sitemap.xml lives at the root of your project inside the app folder, and it is important for SEO and your website's rankings.

To generate a static sitemap, deploy your website and run it through one of the many free services online. I can recommend the Screaming Frog application, which includes the sitemap generator tool I use most often. In my experience it's the best way to get a sitemap for smaller websites, especially when you're working with a JavaScript project; for larger projects, it's a better idea to generate the sitemap in code. Once you have your XML file, just upload it to the app folder and deploy your app. Finally, submit your sitemap URL to Google Search Console in the Sitemaps section to get your website indexed more quickly.
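For the code-generated route, Next.js supports an app/sitemap.ts file analogous to robots.ts. Here is a minimal sketch, again based on the Next.js metadata docs; the domain and the /about route are hypothetical placeholders:

import type { MetadataRoute } from 'next'

// app/sitemap.ts: Next.js serves the returned array as /sitemap.xml
export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: 'https://YOUR_DOMAIN_NAME.com', // placeholder domain
      lastModified: new Date(),
      changeFrequency: 'weekly',
      priority: 1,
    },
    {
      url: 'https://YOUR_DOMAIN_NAME.com/about', // hypothetical route
      lastModified: new Date(),
      changeFrequency: 'monthly',
      priority: 0.8,
    },
  ]
}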
If you need any help with your Next.js or SEO project, we can work together; learn more here.