QualityPilot

Security at QualityPilot

We're a small team building a developer tool that touches your CI logs and your test code. Here's exactly what we do with that access — no marketing fluff, no NDA wall.

Last updated: April 19, 2026

What we read from your repo

QualityPilot uses GitHub OAuth (or, optionally, a GitHub App when installed) to:

  • Read failing-test logs from your CI runs (only those that surfaced via our reporter or our findAndFetchFailingTest ingest)
  • Read the failing test file's source (one file at a time)
  • Read the inferred production file under test (heuristic — we walk import paths backward; one file)
  • Open pull requests with proposed fixes (write access required only for this; the PR is on a branch we name qlens/auto-fix/*)
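The "walk import paths backward" heuristic above can be sketched roughly like this. The regex and the exclusion list here are illustrative assumptions, not our actual implementation:

```typescript
// Illustrative sketch only: guess the production file under test by scanning
// the failing test's relative imports and skipping test-utility paths.
// The regex and exclusion patterns are assumptions, not the real heuristic.
function inferProductionFile(testSource: string): string | null {
  const importRe = /from\s+['"](\.{1,2}\/[^'"]+)['"]/g;
  for (const match of testSource.matchAll(importRe)) {
    const importPath = match[1];
    if (!/test|spec|mock|fixture/i.test(importPath)) return importPath;
  }
  return null; // nothing inferable: only the test file itself gets sent
}

const sample = `
import { render } from './test-utils';
import { parseConfig } from '../src/config';
`;
console.log(inferProductionFile(sample)); // → "../src/config"
```

If no candidate is found, only the test file is read, per the one-file-at-a-time rule above.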

We do NOT:

  • Index your whole repository
  • Send your README, docs, or non-code files to OpenAI
  • Train any model on your code
  • Read private CI environment variables or secrets

OAuth scopes we request

  • repo — full repository access (read code + open PRs); required for the auto-fix flow
  • read:user — your GitHub username + email, for account lookup
  • workflow — read CI workflow runs to surface recent failures (only requested if you opt into auto-scan)

We do not request: admin:org, admin:repo_hook, delete_repo, gist, or any payment/billing scopes.

AI / LLM

Failing test code is sent to OpenAI's gpt-4o-mini API to generate fix proposals.

  • OpenAI's API terms (as of April 2026) state the API does not train on your data.
  • We send: the test file source + (optionally) the inferred production source + a short failure trace. We do NOT send your full repo contents.
  • Average payload: ~3K tokens in, ~1K tokens out per fix. Token counts are logged for cost accounting only.
  • Prompts may be cached by OpenAI for up to 24 hours, per their policy.
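A minimal sketch of what that payload might look like. The field names are assumptions for illustration, not our actual request schema:

```typescript
// Illustrative payload shape: only these three pieces ever leave your repo.
// Field names are assumptions, not QualityPilot's actual schema.
interface FixRequest {
  testSource: string;        // the failing test file, verbatim
  productionSource?: string; // inferred file under test, when one is found
  failureTrace: string;      // short excerpt of the CI failure output
}

const req: FixRequest = {
  testSource: "test('adds', () => { expect(add(1, 2)).toBe(3); });",
  failureTrace: "Expected: 3, Received: 4",
};

// Rough input-size estimate (~4 characters per token) for cost accounting.
const approxTokensIn = Math.ceil(
  (req.testSource.length +
    (req.productionSource?.length ?? 0) +
    req.failureTrace.length) / 4,
);
console.log(approxTokensIn);
```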

Data we store

  • Account info (GitHub id, email, plan) — Supabase Postgres (us-east); kept until account deletion
  • Scan results (grades, flaky pattern matches, file names) — Supabase Postgres; 12 months rolling
  • Auto-fix records (proposed source, PR URL, accept/reject decision, AI token counts) — Supabase Postgres; 12 months rolling
  • Raw test file source we fetched — NOT stored (held in memory only for the duration of the fix request)
  • GitHub OAuth tokens — Supabase Postgres, encrypted at rest; kept until disconnect
  • API keys — Supabase Postgres, SHA-256 hashed (plaintext shown only at creation); kept until revocation
  • Webhook secrets (per-repo HMAC-SHA256) — Supabase Postgres, encrypted at rest; kept until rotation
  • Stripe customer IDs — Supabase Postgres; kept until subscription deletion
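The per-repo HMAC-SHA256 webhook secrets above follow the standard signature-verification pattern. This is a hedged sketch of that general pattern — function names and handling details are illustrative, not our actual code:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Sketch of standard HMAC-SHA256 webhook verification. Names are
// illustrative; this shows the pattern, not QualityPilot's implementation.
function verifySignature(secret: string, body: string, signatureHex: string): boolean {
  const expected = createHmac("sha256", secret).update(body).digest();
  const received = Buffer.from(signatureHex, "hex");
  // timingSafeEqual throws on length mismatch, so check lengths first.
  return received.length === expected.length && timingSafeEqual(expected, received);
}

const secret = "example-per-repo-secret";
const body = '{"event":"test_failed"}';
const signature = createHmac("sha256", secret).update(body).digest("hex");
console.log(verifySignature(secret, body, signature));       // → true
console.log(verifySignature(secret, body + "x", signature)); // → false
```

The timing-safe comparison avoids leaking signature bytes through response-time differences.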

Sub-processors

Vendors we send your data to:

  • Vercel (US/global) — hosts the app + serverless functions (your code never lands here, just session metadata). Web Analytics + Speed Insights metrics (page paths + Web Vitals; no cookies, no PII).
  • Supabase (US-East, AWS) — Postgres + authentication; SOC 2 Type II + HIPAA-eligible
  • OpenAI (US) — LLM provider for fix generation (gpt-4o-mini); data not used for training (per their API agreement)
  • Stripe (US) — payments + billing; PCI-DSS Level 1
  • Resend (US) — transactional email (welcome, fix-proposed, trial reminders)
  • GitHub (US) — your repo provider; we use their API on your behalf
  • Cloudflare (global) — DNS only; does not see your data

Encryption

  • In transit: TLS 1.3 (HTTPS everywhere; no HTTP fallback).
  • At rest: AES-256 via Supabase (Postgres) and Vercel.
  • OAuth tokens, webhook secrets, API key plaintext: never logged.

Compliance posture

  • We are not currently SOC 2 certified. We're early-stage and pursuing SOC 2 Type II once recurring revenue justifies the audit cost. If you need SOC 2 or ISO 27001 today, please email support@qlens.dev — we'll work with you.
  • GDPR: yes, we honor data export and deletion requests within 30 days. Email privacy@qlens.dev.
  • DPA available on request for paid plans.

Vulnerability reporting

Found a security issue? Email security@qlens.dev (PGP key fingerprint: TBD — we'll publish before public launch). We respond within 48 hours.

We don't run a paid bug bounty yet, but we acknowledge contributors publicly.

Incident history

April 2026 — None to disclose. We commit to publishing post-mortems for any production incident affecting customer data.