Built for the AI search era
Every feature in Wrenda is designed to make AI search engines understand, trust and recommend your content.
Every request, intelligently routed
From edge interception to enriched response — all in under 50ms.
Incoming Request → Wrenda Edge → User Agent Detection → Route Decision (Block / Optimize / Pre-render / Pass-through) → Transform → Cache Layer → Response Delivered
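A minimal sketch of what this flow could look like in an edge worker, written in TypeScript. The crawler patterns, cache-key scheme and transform step are illustrative assumptions, not Wrenda's actual implementation.

```typescript
// Illustrative edge-worker routing sketch. Patterns, actions and the
// transform step are placeholders, not Wrenda's production logic.
type RouteAction = "block" | "optimize" | "prerender" | "passthrough";

const CRAWLER_PATTERNS: Array<[RegExp, RouteAction]> = [
  [/GPTBot|ClaudeBot|PerplexityBot/i, "optimize"], // AI crawlers: enriched HTML
  [/Googlebot|Bingbot/i, "prerender"],             // search bots: rendered HTML
];

// Placeholder for the enrichment / pre-render step.
async function transform(res: Response, action: RouteAction): Promise<Response> {
  return res; // a real pipeline would inject schema or fully rendered HTML here
}

async function handleRequest(req: Request, cache: Cache): Promise<Response> {
  const ua = req.headers.get("user-agent") ?? "";

  // Detection: match the user agent against known crawler patterns.
  const action: RouteAction =
    CRAWLER_PATTERNS.find(([pattern]) => pattern.test(ua))?.[1] ?? "passthrough";

  // Humans and unknown agents skip the pipeline entirely.
  if (action === "passthrough") return fetch(req);
  if (action === "block") return new Response("Forbidden", { status: 403 });

  // Cache layer: serve a previously transformed response when possible.
  const keyUrl = new URL(req.url);
  keyUrl.searchParams.set("__route", action);
  const cacheKey = new Request(keyUrl.toString());
  const cached = await cache.match(cacheKey);
  if (cached) return cached;

  // Transform the origin response, cache it, deliver it.
  const enriched = await transform(await fetch(req), action);
  await cache.put(cacheKey, enriched.clone());
  return enriched;
}
```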
Everything you need in one proxy
Seven core pillars that cover the full lifecycle of AI crawler optimization.
AI Content Enrichment
Automatically enriches every page AI crawlers request with semantic depth that language models actually understand and value. A sketch of the injected markup appears after the list.
- Schema.org structured data injection
- Entity definitions and semantic context
- Comprehensive FAQ generation
- Relationship mapping and alternatives
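For illustration, here is roughly what Schema.org FAQ injection looks like: a JSON-LD block added to the page head. The questions, answer text and helper function are placeholders, not Wrenda's generated output.

```typescript
// Illustrative sketch of Schema.org FAQPage injection. The content below
// is placeholder text, not Wrenda's generated enrichment.
const faqJsonLd = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  mainEntity: [
    {
      "@type": "Question",
      name: "What does this product do?",
      acceptedAnswer: {
        "@type": "Answer",
        text: "A short, self-contained answer an LLM can lift verbatim.",
      },
    },
  ],
};

// Insert the JSON-LD block just before </head> in the origin HTML.
function injectFaqSchema(html: string): string {
  const tag = `<script type="application/ld+json">${JSON.stringify(faqJsonLd)}</script>`;
  return html.replace("</head>", `${tag}</head>`);
}
```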
Crawler Detection & Routing
Identifies 50+ crawlers with zero false positives in under 1ms — without ever slowing down real users.
- GPTBot, ClaudeBot, PerplexityBot, Googlebot, Bingbot and more
- Pattern-based user agent matching
- Custom rule priority ordering
JavaScript Pre-rendering
Full browser rendering for search bots that need real HTML. Your React, Next.js or SPA content lands in search. The rendering step is sketched after the list.
- Headless browser rendering with full JavaScript execution
- Configurable viewport and wait conditions
- Pre-render scripts for dynamic content
- KV-cached renders for speed
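A minimal sketch of the pre-rendering step, using Playwright as a stand-in for the headless browser. The viewport, wait condition and selector are placeholder values rather than Wrenda's defaults.

```typescript
import { chromium } from "playwright";

// Illustrative pre-render sketch; values below are placeholders.
async function prerender(url: string): Promise<string> {
  const browser = await chromium.launch();
  const page = await browser.newPage({ viewport: { width: 1280, height: 800 } });

  // Wait until network activity settles so client-side rendering has finished.
  await page.goto(url, { waitUntil: "networkidle" });

  // Optional extra wait condition, e.g. a root element populated by the SPA.
  await page.waitForSelector("#app", { timeout: 5_000 }).catch(() => undefined);

  const html = await page.content(); // fully rendered HTML, ready to cache
  await browser.close();
  return html;
}
```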
Crawler Rules Engine
Granular control over every crawler interaction. Decide exactly what each bot sees, down to URL patterns and priority. A sample rule set is sketched after the list.
- Block / optimize / pre-render / pass-through actions
- URL pattern matching with wildcards
- Per-domain rule sets
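A sketch of what a per-domain rule set could look like. The shape, field names and domain are illustrative assumptions, not Wrenda's actual configuration format.

```typescript
// Hypothetical rule-set shape, shown for illustration only.
type Action = "block" | "optimize" | "prerender" | "passthrough";

interface CrawlerRule {
  crawler: string;     // user-agent substring or pattern to match
  urlPattern: string;  // wildcard path pattern
  action: Action;
  priority: number;    // lower number wins when rules overlap
}

const rulesForDomain: Record<string, CrawlerRule[]> = {
  "docs.example.com": [
    { crawler: "GPTBot",    urlPattern: "/changelog/*", action: "block",     priority: 1 },
    { crawler: "GPTBot",    urlPattern: "/*",           action: "optimize",  priority: 10 },
    { crawler: "Googlebot", urlPattern: "/app/*",       action: "prerender", priority: 5 },
  ],
};
```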
SEO A/B Testing
Statistically valid experiments with Causal Impact analysis. Know exactly what moves the needle in AI search. Bucket assignment is sketched after the list.
- URL template auto-detection
- Control/variant bucket assignment
- Google Search Console integration
- Causal Impact statistical analysis
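As one piece of the picture, here is a sketch of deterministic control/variant bucketing. Hashing the URL rather than the visitor keeps each page in a stable bucket, which is what makes the later comparison of Search Console data meaningful. The experiment name and split logic are illustrative, not Wrenda's implementation.

```typescript
import { createHash } from "node:crypto";

// Illustrative sketch: assign each URL to a stable control or variant bucket.
function assignBucket(url: string, experiment: string): "control" | "variant" {
  const digest = createHash("sha256").update(`${experiment}:${url}`).digest();
  return digest[0] % 2 === 0 ? "control" : "variant";
}

// Pages under the same URL template split roughly 50/50 and never flip buckets.
console.log(assignBucket("/products/blue-widget", "title-rewrite-test"));
```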
Analytics Dashboard
Full visibility into every crawler interaction. See what they request, how fast you serve it, and where you can improve.
- Real-time activity feed
- Cache hit rates and latency
- Crawler type breakdown
- Weekly aggregated reports
UCP Commerce
Built on Google's UCP spec. AI agents on Gemini and Search discover your products and complete checkout with Google Pay.
- Google UCP spec compliant (v2026-01-23)
- Google Pay checkout with PSP token forwarding
- Shopify, WooCommerce, Merchant Center & browser adapters
- Full order lifecycle with Google webhook sync
AI Citations Tracking
See exactly when ChatGPT, Perplexity, Gemini, Copilot, Google AI Overview and Grok cite your brand. Daily visibility scores, share-of-voice vs competitors, and citation opportunity gaps.
- Multi-platform tracking across 6 AI search engines
- Persona fan-out (CMO, SEO Lead, Founder, Buyer)
- Categorised prompts with importance-scored opportunities
- Competitor source-gap analysis + niche query explorer
Markdown & MCP: Serve any AI format
Wrenda automatically detects what format each AI consumer prefers. LLMs get clean markdown via Accept: text/markdown. AI agents get tool manifests via MCP. Search crawlers get enriched HTML. One proxy, every format. Both request paths are sketched after the list.
- Accept: text/markdown support
- MCP server at /.well-known/mcp.json
- JSON-RPC 2.0 tool execution
- Format auto-detection
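A sketch of how the two AI-facing paths might be exercised from a client: a markdown request via the Accept header, and a JSON-RPC 2.0 tool call against the MCP endpoint. The domain, tool name and RPC endpoint path (everything except /.well-known/mcp.json) are placeholder assumptions.

```typescript
// Illustrative client sketch; domain, tool name and the /mcp endpoint path
// are placeholders. Only /.well-known/mcp.json comes from the feature list.
const site = "https://www.example.com";

// 1. An LLM pipeline asks for clean markdown instead of HTML.
const markdown = await fetch(`${site}/pricing`, {
  headers: { Accept: "text/markdown" },
}).then((res) => res.text());

// 2. An AI agent discovers the tool manifest...
const manifest = await fetch(`${site}/.well-known/mcp.json`).then((res) => res.json());

// ...then executes a tool with a JSON-RPC 2.0 request.
const result = await fetch(`${site}/mcp`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: { name: "search_products", arguments: { query: "running shoes" } },
  }),
}).then((res) => res.json());
```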
Built for global scale
Wrenda runs at the edge across 300+ locations — sub-50ms globally, 99.9% uptime, zero cold starts. Every request is handled in the data centre closest to the crawler, so enriched HTML lands fast enough to matter.
300+
Edge locations
<50ms
P99 latency
99.9%
Uptime SLA
Unlimited
Cache storage
Wrenda vs. alternatives
See how Wrenda stacks up against other approaches to AI crawler optimization.
Based on publicly available capabilities as of 2026. Custom SSR assumes a bespoke implementation.
Platform FAQs
Common questions about how Wrenda fits into your stack.
Why do I need a reverse proxy for AI search?
AI engines tokenise and weight HTML very differently from the way humans skim a page. Pages built for human readability often perform poorly when an AI is trying to extract a clean answer. A reverse proxy lets you serve a different — richer, more structured — version to AI crawlers without rebuilding your site or changing what humans see.
How long does setup take?
Sign up, add your domain, and create a DNS record pointing to Wrenda. SSL provisions automatically — typically 5–15 minutes. Then enable a routing template and you're live. End-to-end setup for the first domain is usually under 30 minutes.
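For illustration, the DNS record is usually a CNAME on the hostname you want proxied; the target below is a placeholder, since the real value comes from your Wrenda dashboard.

```
www.example.com.  300  IN  CNAME  <target-hostname-from-your-dashboard>.
```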
Will Wrenda slow down my site for human visitors?
No. Human traffic takes a fast path that bypasses the optimisation pipeline entirely — sub-50ms added latency, then your origin's response. Only AI crawler and search-bot traffic goes through the full pipeline, and even that is typically cached after the first hit.
What's the difference between AI optimisation and pre-rendering?
AI optimisation transforms HTML — adds schema, FAQs, entity expansions — for AI crawlers like GPTBot and ClaudeBot that want richer semantic content. Pre-rendering executes JavaScript and produces static HTML for traditional search bots like Googlebot that can't reliably run JS. Most sites use both, routed by user agent.
Do I need to change my code?
No code changes. Just point your domain at Wrenda via DNS. All optimisation rules, content templates, cache settings and tests are configured from the dashboard. Your origin keeps serving HTML the way it always has.
How is Wrenda different from SEO plugins or pre-rendering services?
Generic plugins inject schema once at build time. Pre-rendering services like Prerender.io serve only Googlebot. Wrenda handles AI crawlers (GPTBot, ClaudeBot, Perplexity, etc.), traditional bots (Googlebot, Bingbot), AND humans differently — all from one config — plus adds GEO citation tracking, MCP tool servers and UCP commerce on top.
Ready to win AI search?
Deploy in minutes. No infrastructure changes required.
Start Free Trial →
14-day free trial — no credit card required