Nowl Resume · BYOK · ATS 98+ · Indian startup hiring

Tailor your resume.
Land the interview.

Bring your own key — Anthropic, OpenAI, Gemini, Bedrock, OpenRouter, or your own Ollama / Jan / OpenAI-compatible endpoint. Four specialised agents — JD analyst, resume builder, ATS auditor, output packager — return a tailored resume, cover letter, LinkedIn DM, cold email, and a filename. Loops until the ATS score hits 98.

Drop your resume.

PDF, DOCX, LaTeX, plain text, or markdown. Up to 5 MB. We extract your name, email, and phone, convert the document to markdown for the agents, and store everything in MongoDB. No fabrication: gaps between your experience and the JD are flagged honestly.
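The contact-extraction pass could look like this minimal sketch. The real parser isn't shown here; the patterns and the function name are simplified, illustrative stand-ins:

```python
import re

# Illustrative sketch of the contact-extraction step described above.
# These patterns and names are simplified stand-ins, not production code.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,13}\d")

def extract_contacts(text: str) -> dict:
    """Pull the first email- and phone-looking tokens out of resume text."""
    email = EMAIL_RE.search(text)
    phone = PHONE_RE.search(text)
    return {
        "email": email.group() if email else None,
        "phone": phone.group() if phone else None,
    }
```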

  • JD analyst reads the JD, identifies the ATS (Keka, Darwinbox, greytHR, Lever, Ashby, Greenhouse, Workday, TurboHire, SAP), and extracts every keyword.
  • Resume builder rewrites your resume, injecting MUST_HAVE keywords only where your real experience supports it.
  • ATS auditor scores the rewrite. If the score is below 98, it sends a fix brief back to the builder. Loops up to 5 rounds.
  • Output packager appends the white-text keyword layer (suppressed for Workday), writes the cover letter, LinkedIn DM, cold email, and filename.

Step 1 / 4

Resume file

Drop your resume here, or browse

PDF, DOCX, TXT, MD, TEX · max 5 MB

Step 2 / 4

Job description

Add notice period, salary, location, visa context (optional)

These details help Agent 4 tailor the cover letter and email. Skip if irrelevant.

Step 3 / 4

Bring your own key

Pick a cloud provider — Anthropic, OpenAI, Google, AWS Bedrock, or OpenRouter — or point the app at your own local endpoint. Your key powers all four agents and is sent only to that provider, never persisted on our servers.

CLOUD

LOCAL · BRING YOUR OWN ENDPOINT

Sent only to Anthropic when you validate. Never logged or stored on our servers.

Get an API key

Opens console.anthropic.com in a new tab.

Enter your API key to continue.

Step 4 / 4

Run the pipeline

Targets ATS score ≥ 98. Up to 5 revision rounds. Live status on the next screen.

How it works.

Behind the cream canvas: a 4-agent Python pipeline talking to your chosen model provider, with a Next.js upload layer on Vercel and QStash routing jobs to a FastAPI worker.

CLI · existing surface

Same engine, two faces.

The web app does not replace the CLI. It wraps the same agents.orchestrator.run_pipeline() you've been using from the terminal. Skill files, brain memory, and tracker updates all keep working.

# CLI (still works)
python main.py --resume r.tex --jd jd.txt --company "Razorpay"

# OR via the web app
curl -F "resume=@r.pdf" -F "jd=The job description text..." \
     https://agent-cv.vercel.app/api/upload
# -> { "jobId": "...", "status": "queued" }