Trust & Security

How we handle your data during analysis.

DATA LIFECYCLE

T+0s      URL submitted
          Only the URL is stored in our database
T+5s      Sandbox crawl begins
          Isolated browser context, in-memory only
T+30-60s  Analysis & scoring
          All analyzers run on in-memory data
T+60s     Results stored
          Only scores, tier, and metadata are persisted
T+60s     Raw data purged
          HTML, network data, screenshots deleted from RAM
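As a rough illustration of this lifecycle, the sketch below keeps raw page data in local scope only and persists nothing but scores and metadata. The function name, score fields, and stubbed values are hypothetical, not the actual pipeline.

```python
import time

def analyze(url: str) -> dict:
    """Hypothetical lifecycle sketch: raw page data exists only inside
    this function's scope and is dropped before anything is returned."""
    # T+0s: only the URL (plus a timestamp) is recorded.
    record = {"url": url, "submitted_at": time.time()}

    # T+5s: sandbox crawl (stubbed here) - raw data lives in local variables.
    raw = {"html": "<html>...</html>", "screenshots": [b"..."], "network": []}

    # T+30-60s: analyzers run against the in-memory data (scores stubbed).
    scores = {"surface": 0.8, "functional": 0.7, "moat": 0.5}
    scores["overall"] = round(sum(scores.values()) / 3, 2)

    # T+60s: only scores, tier, and metadata are kept...
    record.update(scores=scores, tier="B")

    # ...and the raw crawl data is explicitly discarded before returning.
    del raw
    return record
```

Nothing outside the function can reach the crawled HTML or screenshots; once the function returns, the only surviving artifacts are the URL, scores, tier, and timestamp.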

Zero-Retention Policy

  • We never store the pages we crawl. Analysis runs in an isolated browser sandbox.
  • Page content (HTML, screenshots, cookies, localStorage) is processed exclusively in-memory.
  • All crawled data is discarded immediately after scoring completes.
  • Your source code, assets, and page content never touch our disk or database.

What We Store

  • The URL you submitted
  • Computed scores (surface, functional, moat, overall)
  • Tier classification and tech stack fingerprint
  • Analysis metadata (timestamp, duration, confidence level)

What We Never Store

  • Page HTML, DOM snapshots, or rendered screenshots
  • Cookies, session tokens, or localStorage data
  • API responses or network request bodies
  • User credentials or authentication tokens from crawled sites
  • Uploaded file contents (processed in-memory, purged after analysis)

Upload Security

  • Uploaded files are validated against an allowlist (.md, .txt, .json, .yaml, images only).
  • All text files are scanned for secrets (API keys, tokens, credentials) before processing.
  • Files are processed via in-memory buffers (SpooledTemporaryFile) and never written to disk.
  • File contents are purged from memory immediately after context extraction.
  • Code files (.js, .py, .env, etc.) are blocked at upload time.
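The steps above can be sketched in a few lines. The allowlist contents, secret patterns, and function name below are illustrative assumptions; only the use of `SpooledTemporaryFile` for in-memory buffering comes from the policy itself.

```python
import re
from tempfile import SpooledTemporaryFile

# Hypothetical allowlist and secret patterns modeled on the policy above.
ALLOWED_EXTENSIONS = {".md", ".txt", ".json", ".yaml", ".png", ".jpg"}
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                  # AWS access key ID shape
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),  # PEM private key header
]

def validate_upload(filename: str, data: bytes) -> SpooledTemporaryFile:
    """Reject disallowed types and likely secrets; return an in-memory buffer."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"blocked file type: {ext or 'none'}")
    text = data.decode("utf-8", errors="ignore")
    if any(p.search(text) for p in SECRET_PATTERNS):
        raise ValueError("possible secret detected; upload rejected")
    # SpooledTemporaryFile stays in RAM while under max_size, so the
    # contents never touch disk for files this small.
    buf = SpooledTemporaryFile(max_size=10 * 1024 * 1024)
    buf.write(data)
    buf.seek(0)
    return buf
```

A `.py` upload or a file containing an AWS-style key would be rejected before any processing begins.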

Browser Sandbox

  • Every analysis runs in an isolated Playwright browser context with no shared state.
  • Blocked URL schemes: file://, data:, and javascript: prevent local file access and script injection.
  • 5-minute TTL auto-kill timer prevents runaway analyses.
  • Browser context is explicitly destroyed and purged after each analysis.
  • No cross-analysis data sharing is possible.
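The scheme blocking amounts to a check like the one below, which a request-interception hook (e.g. a Playwright route handler) can apply to every navigation. The function name and denylist set are illustrative; the blocked schemes come from the policy above.

```python
from urllib.parse import urlparse

# Scheme denylist mirroring the sandbox rules above.
BLOCKED_SCHEMES = {"file", "data", "javascript"}

def is_navigable(url: str) -> bool:
    """Return False for URL schemes the sandbox refuses to load."""
    return urlparse(url).scheme.lower() not in BLOCKED_SCHEMES
```

Any navigation failing this check would be aborted before the browser context ever dereferences the URL.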

CLI & Localhost Analysis

  • Localhost analysis uses a temporary tunnel with a random UUID subdomain.
  • Tunnel auto-destructs after 5 minutes (configurable via --timeout).
  • Only HTTP traffic on your specified port is exposed through the tunnel.
  • No data is retained after the tunnel closes.
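A minimal sketch of the tunnel's two safety properties, the unguessable UUID subdomain and the TTL auto-destruct, might look like this. The class name, base domain, and use of a `threading.Timer` are assumptions for illustration; the random UUID subdomain and 5-minute default come from the policy above.

```python
import threading
import uuid

class TemporaryTunnel:
    """Hypothetical sketch: expose one local port under a random UUID
    subdomain and tear the tunnel down after a TTL."""

    def __init__(self, port: int, base_domain: str = "tunnel.example.com",
                 timeout: float = 300.0):
        self.port = port
        # uuid4 gives an unguessable subdomain for the tunnel hostname.
        self.hostname = f"{uuid.uuid4()}.{base_domain}"
        self.open = True
        # Auto-destruct timer mirrors the CLI's 5-minute default (--timeout).
        self._timer = threading.Timer(timeout, self.close)
        self._timer.daemon = True
        self._timer.start()

    def close(self) -> None:
        """Tear down the tunnel; nothing about the session is retained."""
        self.open = False
        self._timer.cancel()
```

Once `close()` runs, whether explicitly or via the timer, the hostname stops resolving to anything and no session state survives.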