Python quickstart
Capture AI API calls from any Python application using the built-in MITM proxy—no code changes required.
1. Download and launch the desktop app
Download the LupisLabs desktop app for macOS or Windows.
The MITM proxy starts automatically on port 9010 when you launch the app.
2. Open a proxy-configured terminal
In the LupisLabs desktop app, click "Open Terminal" (or use the keyboard shortcut). This opens a new terminal window with:
- ✅ Proxy environment variables automatically set (HTTP_PROXY, HTTPS_PROXY)
- ✅ SSL certificates automatically trusted (no manual installation needed)
- ✅ Works with Python, Node.js, curl, and other HTTP clients
You'll see a confirmation message like this:
================================================
Lupis Labs - Proxy Environment Configured
================================================
HTTP_PROXY: http://127.0.0.1:9010
HTTPS_PROXY: http://127.0.0.1:9010
CA Certificate: /path/to/ca.pem
NODE_EXTRA_CA_CERTS: Set (Node.js)
REQUESTS_CA_BUNDLE: Set (Python)
SSL_CERT_FILE: Set (Python/curl)
All requests (Node.js, Python) will use the proxy.
================================================
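To double-check that the terminal is proxy-configured before running anything, you can print the relevant variables yourself. This is a minimal sketch; the variable names match the banner above, and the `not set` fallback is just for illustration:

```shell
# Confirm the proxy variables are present in this terminal.
# In a proxy-configured terminal these match the banner above;
# in a plain terminal they fall back to "not set".
echo "HTTP_PROXY:         ${HTTP_PROXY:-not set}"
echo "HTTPS_PROXY:        ${HTTPS_PROXY:-not set}"
echo "REQUESTS_CA_BUNDLE: ${REQUESTS_CA_BUNDLE:-not set}"
```

If any value prints as `not set`, you are likely in a regular terminal rather than one opened from the desktop app.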
3. Run your Python application
In the proxy-configured terminal, run your Python application normally—no code changes needed:
# Run your script
python app.py
# Or start your server
uvicorn main:app --reload
# Or run in a notebook
jupyter notebook
Any AI SDK calls (OpenAI, Anthropic, etc.) will be automatically captured:
# Your existing code works as-is
from openai import OpenAI
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
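Capture works without code changes because Python HTTP clients (the standard library, `requests`, and the `httpx` transport used by the OpenAI and Anthropic SDKs) honor the `HTTP_PROXY`/`HTTPS_PROXY` environment variables. A minimal sketch of that mechanism, setting the variables in-process purely for illustration (the proxy-configured terminal sets them for you):

```python
import os
import urllib.request

# Simulated here for illustration; "Open Terminal" exports these for you.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:9010"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:9010"

# The standard library resolves proxies from the environment, which is
# why unmodified application code routes through the MITM proxy.
proxies = urllib.request.getproxies()
print(proxies)
# e.g. {'http': 'http://127.0.0.1:9010', 'https': 'http://127.0.0.1:9010'}
```

This is also why the certificate variables (`REQUESTS_CA_BUNDLE`, `SSL_CERT_FILE`) matter: they let these clients trust the proxy's CA when it terminates TLS.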
4. View captured requests in the app
Switch to the Proxy Monitor tab in the LupisLabs desktop app. You'll see:
- 📋 Complete request and response payloads
- 💰 Token usage and cost analytics
- ⏱️ Response times and status codes
- 💾 Saved requests for later replay