# @silentsilas/vite-plugin-ai-robots

A Vite plugin that automatically generates and updates a `robots.txt` file blocking AI agents, using the Dark Visitors API.

## Installation

Create a free account at Dark Visitors and obtain your access token.

```bash
npm install @silentsilas/vite-plugin-ai-robots --save-dev
```

Add the token to your environment and pass it to the plugin's config. You should then see a `robots.txt` file in your output directory during builds.

## Usage

### Basic Configuration

```ts
// vite.config.ts
import { defineConfig } from "vite";
import { aiRobots } from "@silentsilas/vite-plugin-ai-robots";

export default defineConfig({
  plugins: [
    aiRobots({
      accessToken: process.env.DARK_VISITORS_TOKEN,
    }),
  ],
});
```
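If you keep the token in a local `.env` file rather than the shell environment, Vite's `loadEnv` helper can read it inside the config. The snippet below is a sketch of that pattern; the key name `DARK_VISITORS_TOKEN` is just a convention, not something the plugin requires.

```ts
// vite.config.ts — a sketch assuming the token lives in a local .env file
import { defineConfig, loadEnv } from "vite";
import { aiRobots } from "@silentsilas/vite-plugin-ai-robots";

export default defineConfig(({ mode }) => {
  // Empty-string prefix loads every variable from .env files, not just
  // those starting with VITE_ (only VITE_ vars reach client code anyway).
  const env = loadEnv(mode, process.cwd(), "");
  return {
    plugins: [
      aiRobots({
        accessToken: env.DARK_VISITORS_TOKEN,
      }),
    ],
  };
});
```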

### Configuration Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `accessToken` | `string` | Required | Dark Visitors API token |
| `agentTypes` | `string[]` | `["AI Data Scraper", "Undocumented AI Agent"]` | Agent types to block |
| `disallow` | `string` | `"/"` | Paths to disallow |
| `cacheHours` | `number` | `24` | Cache duration in hours |
| `outputDir` | `string` | `"static"` | Output directory |
| `debug` | `boolean` | `false` | Enable debug logging |
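For reference, here is a sketch that spells out every option with its default value from the table above, so it behaves the same as the basic configuration:

```ts
// vite.config.ts — every option written out with its documented default
import { defineConfig } from "vite";
import { aiRobots } from "@silentsilas/vite-plugin-ai-robots";

export default defineConfig({
  plugins: [
    aiRobots({
      accessToken: process.env.DARK_VISITORS_TOKEN, // required
      agentTypes: ["AI Data Scraper", "Undocumented AI Agent"], // agent types to block
      disallow: "/", // paths to disallow for matched agents
      cacheHours: 24, // how long the fetched agent list is cached
      outputDir: "static", // where robots.txt is written
      debug: false, // enable debug logging
    }),
  ],
});
```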

## Troubleshooting

Common issues:

- **401 Unauthorized**: Check your access token
- **Empty robots.txt**: Enable `debug: true` (see the sketch below)
- **Cache not updating**: Delete `.ai-robots-cache.json`
  - You may want to add this file to your `.gitignore`
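To get verbose output while diagnosing an empty or stale `robots.txt`, turn on the `debug` option from the table above. A minimal sketch:

```ts
// vite.config.ts — same basic setup with debug logging enabled
import { defineConfig } from "vite";
import { aiRobots } from "@silentsilas/vite-plugin-ai-robots";

export default defineConfig({
  plugins: [
    aiRobots({
      accessToken: process.env.DARK_VISITORS_TOKEN,
      debug: true, // enables the plugin's debug logging during builds
    }),
  ],
});
```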