Commit b95e063

Adding robots txt for cloudflare pages (#418)

* Adding robots txt for cloudflare pages
* template

1 parent 23f1df7, commit b95e063

File tree: 3 files changed, +10 −1 lines changed

.github/pull_request_template.md (+1)

@@ -13,5 +13,6 @@ What part of the application were affected by the changes? What should be tested
 ### Requirement checklist
 
 - [ ] I have validated my changes on a test/local environment.
+- [ ] I have added my changes to the V1 and V2 documentations.
 - [ ] I have checked the SNYK/Dependabot reports and applied the suggested changes.
 - [ ] (Optional) I have updated outdated packages.

website/package.json (+1 −1)

@@ -11,7 +11,7 @@
     "serve": "docusaurus serve",
     "gen-api-docs": "docusaurus gen-api-docs all && node ./api/cleanup-api-docs.js",
     "typecheck": "tsc",
-    "cf": "npm run gen-api-docs && docusaurus build --out-dir build/docs && echo '/ /docs' > build/_redirects"
+    "cf": "npm run gen-api-docs && docusaurus build --out-dir build/docs && echo / /docs > build/_redirects && node testrobots.js"
   },
   "dependencies": {
     "@docusaurus/core": "3.1.1",

website/testrobots.js (new file, +8)

@@ -0,0 +1,8 @@
+const fs = require('node:fs');
+const content = 'User-agent: *\nDisallow: /';
+fs.writeFile('build/robots.txt', content, err => {
+  if (err) {
+    console.error(err);
+  } else {
+  }
+});
