path: root/src/view/shell/Drawer.tsx
author    bnewbold <bnewbold@robocracy.org>  2023-05-07 12:19:56 -0700
committer GitHub <noreply@github.com>        2023-05-07 12:19:56 -0700
commit    6d9e23b1be92f00304ca267b49a8339e9b505ee7 (patch)
tree      8696a178be0afa1655439b1fcda0326a82c69f4c /src/view/shell/Drawer.tsx
parent    0c604ff1c22a7e12be358fbf5eda6d72259632af (diff)
download  voidsky-6d9e23b1be92f00304ca267b49a8339e9b505ee7.tar.zst
bskyweb: update robots.txt (#595)
This is to make crawling more explicitly allowed and to communicate
expectations to crawlers.

If we ever end up with "expensive" routes on this service, we will want to
add a Crawl-Delay directive.
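
The actual robots.txt contents are not shown in this path-limited diff view, so the following is only an illustrative sketch of an explicitly-permissive robots.txt of the kind the message describes, with the Crawl-Delay idea noted for later; the specific directives and values are assumptions, not the committed text. Note that Crawl-Delay is a non-standard directive honored by some crawlers and ignored by others.

    # Explicitly allow all crawlers on all routes
    User-agent: *
    Allow: /

    # If "expensive" routes are added later, a Crawl-Delay line could
    # throttle polite crawlers (value in seconds; crawler-dependent), e.g.:
    # Crawl-Delay: 5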
Diffstat (limited to 'src/view/shell/Drawer.tsx')
0 files changed, 0 insertions, 0 deletions