DevDocs by CyberAGI
Perfect For
Enterprise Software Developers
Web Scrapers
Development Teams
Indie Hackers
Features
Intelligent Crawling
- Smart Depth Control: Choose crawl depth from 1-5 levels
- Automatic Link Discovery: Finds and categorizes all related content
- Selective Crawling: Pick exactly what you want to extract
- Child URL Detection: Automatically discovers and maps website structure
Performance & Speed
- Parallel Processing: Crawl multiple pages simultaneously
- Smart Caching: Never waste time on duplicate content
- Lazy Loading Support: Handles modern web apps effortlessly
- Rate Limiting: Respectful crawling that won't overload servers
Content Processing
- Clean Extraction: Get content without the fluff
- Multiple Formats: Export to MD or JSON for LLM fine-tuning
- Structured Output: Logically organized content
- MCP Server Integration: Ready for AI processing
Enterprise Features
- Error Recovery: Auto-retry on failures
- Full Logging: Track every operation
- API Access: Integrate with your tools
- Team Management: Multiple seats and roles
Why DevDocs?
The Problem
Our Solution
- Discovers all pages related to a given technology
- Extracts meaningful content without the fluff
- Organizes information logically inside an MCP server ready for your LLM to query
- Presents it in a clean, searchable format in MD or JSON for LLM fine-tuning purposes

We want anyone in the world to be able to build amazing products quickly using the most cutting-edge LLM technology.
Pricing Comparison
Getting Started
Prerequisites
- Docker installed on your system
- Git for cloning the repository
Quick Start with Docker (Recommended)
If you encounter permission issues, you may need to run the script as administrator or manually set permissions on the logs, storage, and crawl_results directories. The script uses the icacls command to set permissions, which may require elevated privileges on some Windows systems.

Manually Setting Permissions on Windows: If you need to set permissions manually, you can do so either through Windows Explorer (right-click the directory, open Properties, and edit the Security tab) or from a Command Prompt run as Administrator using icacls.
If you encounter issues with the docker-compose.yml file (such as a "Top-level object must be a mapping" error), the docker-start.bat script automatically fixes this by ensuring the file has the correct format and encoding. The fix is applied every time you run the script, so you don't need to modify the file manually.
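One common cause of the "Top-level object must be a mapping" error is a byte-order mark (BOM) before the first YAML key. The sketch below is an illustration of that kind of fix, not the docker-start script's actual code:

```shell
#!/usr/bin/env bash
# Strip a UTF-8 byte-order mark (BOM) from a file if one is present.
# A BOM before the first key makes YAML parsers reject the file with
# "Top-level object must be a mapping".
strip_bom() {
  local file="$1"
  if head -c 3 "$file" | od -An -tx1 | grep -qi 'ef bb bf'; then
    tail -c +4 "$file" > "$file.tmp" && mv "$file.tmp" "$file"
    echo "BOM removed from $file"
  fi
}

# Apply the fix to docker-compose.yml when it exists in the current directory.
if [ -f docker-compose.yml ]; then
  strip_bom docker-compose.yml
fi
```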
The docker-start script will:
- Create all necessary directories
- Set appropriate permissions
- Build and start all Docker containers
- Monitor the services to ensure they're running properly
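The bullets above can be sketched as a short shell sequence. This is an illustration of the same steps, not the script's actual contents; the directory names are taken from the Windows note earlier in this section:

```shell
#!/usr/bin/env bash
# Sketch of the docker-start steps: create directories, set permissions,
# then build and start the containers.

mkdir -p logs storage crawl_results            # create all necessary directories
chmod -R u+rwX logs storage crawl_results      # set appropriate permissions

# Build and start all containers. Guarded so the sketch is safe to run
# on machines where Docker is not installed.
if command -v docker >/dev/null 2>&1; then
  docker compose up -d --build || echo "docker compose failed; is the Docker daemon running?"
else
  echo "docker not found; install Docker first"
fi
```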
Accessing DevDocs
- Frontend UI: http://localhost:3001
- Backend API: http://localhost:24125
- Crawl4AI Service: http://localhost:11235
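A quick way to verify the three services are reachable is a local TCP probe. This sketch uses bash's /dev/tcp pseudo-device and assumes the default ports listed above:

```shell
#!/usr/bin/env bash
# Report "up" or "down" for each DevDocs service by attempting a local
# TCP connection to its default port.
check_port() {
  if (echo > "/dev/tcp/localhost/$1") 2>/dev/null; then
    echo "up"
  else
    echo "down"
  fi
}

for svc in "Frontend UI:3001" "Backend API:24125" "Crawl4AI:11235"; do
  printf '%s (port %s): %s\n' "${svc%:*}" "${svc##*:}" "$(check_port "${svc##*:}")"
done
```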
Logs and Monitoring
- Container Logs (recommended for debugging): follow the live output of each container while the services run.

To stop the services, press Ctrl+C in the terminal where docker-start is running.

Scripts and Their Purpose
Startup Scripts
start.sh / start.bat / start.ps1
- Start all services (frontend, backend, MCP) for local development.
docker-start.sh / docker-start.bat
- Start all services using Docker containers.
MCP Server Scripts
check_mcp_health.sh
- Verify the MCP server's health and configuration status.
restart_and_test_mcp.sh
- Restart Docker containers with updated MCP configuration and test connectivity.
Crawl4AI Scripts
check_crawl4ai.sh
- Check the status and health of the Crawl4AI service.
debug_crawl4ai.sh
- Run Crawl4AI in debug mode with verbose logging for troubleshooting.
test_crawl4ai.py
- Run tests against the Crawl4AI service to verify functionality.
test_from_container.sh
- Test the Crawl4AI service from within a Docker container.
Utility Scripts
view_result.sh
- Display crawl results in a formatted view.
find_empty_folders.sh
- Identify empty directories in the project structure.
analyze_empty_folders.sh
- Analyze empty folders and categorize them by risk level.
verify_reorganization.sh
- Verify that code reorganization was successful.
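The empty-folder helpers above reduce to a single find invocation at their core. A minimal sketch (an illustration, not the scripts' actual contents):

```shell
#!/usr/bin/env bash
# List every empty directory under the current directory, skipping
# .git internals (which legitimately contain empty directories).
find . -type d -empty -not -path './.git/*'
```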
- Root directory: Main scripts for common operations
- scripts/general/: General utility scripts
- scripts/docker/: Docker-specific scripts
- scripts/mcp/: MCP server management scripts
- scripts/test/: Testing and verification scripts
Built for Developers, by Developers
- Saves Time: Turn weeks of research into hours
- Improves Understanding: Get clean, organized documentation
- Enables Innovation: Build faster with any technology
- Supports Teams: Share knowledge efficiently
- LLM Ready: Using DevDocs with an LLM is easy and intuitive. With minimal configuration, you can run DevDocs alongside the Claude app, which recognizes DevDocs's MCP server and is ready to chat with your data.
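As an illustration, a Claude Desktop entry registering an MCP server looks like the sketch below. The server name, command, and path here are placeholders, not the project's documented values, so consult the repository's MCP setup instructions for the real entry:

```json
{
  "mcpServers": {
    "devdocs": {
      "command": "node",
      "args": ["/path/to/DevDocs/mcp/server.js"]
    }
  }
}
```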
Setting Up Cline/Roo Cline for Rapid Software Development
- Open the "Modes" Interface
- Name
- Role Definition Prompt
- Mode-Specific Custom Instructions Prompt
Join Our Community
- [Reach out to our founder on LinkedIn](https://www.linkedin.com/in/shubhamkhichi/)
Success Stories
Technology Partners
Star History