Fiddler Proxy Debugging: Complete Web Traffic Analysis Guide
Fiddler Everywhere is a modern HTTP debugging proxy from Telerik that captures, inspects, and modifies web traffic. Originally Windows-only (Fiddler Classic), Fiddler Everywhere runs on macOS, Windows, and Linux with a clean modern UI. It is particularly strong for .NET developers and teams that need collaborative debugging features.
This guide covers how to use Fiddler for web scraping debugging and API analysis.
Installation
# Download from telerik.com/fiddler/fiddler-everywhere
# Available for macOS, Windows, Linux
# Fiddler Classic (Windows only, free)
# Download from telerik.com/fiddler/fiddler-classic

| Version | Platform | Price | Best For |
|---|---|---|---|
| Fiddler Everywhere | All | $12/mo or free trial | Cross-platform debugging |
| Fiddler Classic | Windows only | Free | Windows power users |
| Fiddler Jam | Browser extension | Free tier | Quick captures |
Core Setup
Enable HTTPS Decryption
- Go to Settings → HTTPS
- Toggle Capture HTTPS traffic
- Click Trust root certificate (system prompt will appear)
- Restart browser if needed
Configure as System Proxy
Fiddler registers itself as the system proxy when capturing is enabled, so browsers are picked up automatically. To route specific applications through it:
# Python requests through Fiddler
import requests

proxies = {
    'http': 'http://127.0.0.1:8866',   # Fiddler Everywhere default port
    'https': 'http://127.0.0.1:8866',
}

# Disable SSL verification or point verify= at Fiddler's root certificate
response = requests.get(
    'https://api.example.com/data',
    proxies=proxies,
    verify=False  # Or path to Fiddler cert
)

# cURL through Fiddler
curl -x http://127.0.0.1:8866 -k https://httpbin.org/get

# Node.js
HTTP_PROXY=http://127.0.0.1:8866 HTTPS_PROXY=http://127.0.0.1:8866 \
NODE_TLS_REJECT_UNAUTHORIZED=0 node scraper.js

Key Features
1. Traffic Inspector
The main panel shows all captured requests with columns:
- # — Request number
- Result — HTTP status code (color-coded)
- Protocol — HTTP/HTTPS/WebSocket
- Host — Target domain
- URL — Request path
- Body — Response size
- Process — Application that made the request
Click any request to see detailed tabs:
- Headers — Request/response headers
- Body/Text — Response body (auto-formatted for JSON, XML, HTML)
- Cookies — Cookie details
- Raw — Raw request/response bytes
- Timeline — Timing breakdown (DNS, connect, TLS, transfer)
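The inspector data is also scriptable: Fiddler Everywhere can export a capture as a HAR archive, which is plain JSON. A minimal sketch (the inline entries below are illustrative stand-ins for a real export) that pulls the JSON API endpoints out of a capture:

```python
import json

# A tiny inline HAR 1.2 fragment standing in for a real Fiddler export
sample = json.loads('''{
  "log": {"entries": [
    {"request": {"url": "https://api.example.com/items"},
     "response": {"content": {"mimeType": "application/json"}}},
    {"request": {"url": "https://cdn.example.com/app.js"},
     "response": {"content": {"mimeType": "text/javascript"}}}
  ]}
}''')

def json_endpoints(har):
    """Return URLs of entries whose response body was JSON."""
    return [
        entry['request']['url']
        for entry in har.get('log', {}).get('entries', [])
        if 'json' in entry['response']['content'].get('mimeType', '')
    ]

print(json_endpoints(sample))  # ['https://api.example.com/items']
```

With a real export, load the file with `json.load(open('capture.har'))` and feed it to the same function.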
2. Filters
Reduce noise by filtering traffic:
# By domain
Host: api.target.com
# By status code
Result: 4xx (client errors only)
# By content type
Content-Type: application/json
# By process
Process: chrome.exe / python3 / node
# Combined
Host: *.target.com AND Result: 200 AND Content-Type: json

3. AutoResponder
Map URL patterns to local responses or actions:
Rule 1: Return local mock data
Match: regex:api\.target\.com/products/\d+
Action: Return file /mocks/product.json (200)
Rule 2: Block tracking scripts
Match: google-analytics.com
Action: Return empty (200)
Rule 3: Add delay to test timeouts
Match: api.target.com/slow-endpoint
Action: Delay 5000ms then passthrough
Rule 4: Return custom status
Match: api.target.com/auth/token
Action: Return 401 Unauthorized

4. Composer (Request Builder)
Build and send custom HTTP requests:
Method: POST
URL: https://api.target.com/v2/search
Headers:
    Content-Type: application/json
    Authorization: Bearer eyJ0eXAi...
    User-Agent: Mozilla/5.0 ...
Body:
{
    "query": "proxy servers",
    "page": 1,
    "limit": 20,
    "filters": {
        "category": "residential"
    }
}
[Execute]

5. Compare Sessions
Save two captures and diff them:
Workflow for debugging blocked requests:
- Capture browser traffic (successful requests) → Save as Session A
- Capture scraper traffic (blocked requests) → Save as Session B
- Compare the same endpoint between sessions
- Diff reveals missing headers, different cookie values, or timing differences
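Once both sessions are captured, the header diff can be scripted. A minimal sketch, assuming you have copied each session's request headers into a dict (the values below are illustrative):

```python
def diff_headers(browser, scraper):
    """Report headers the scraper is missing or sending with different values."""
    missing = {k: v for k, v in browser.items() if k not in scraper}
    changed = {k: (browser[k], scraper[k])
               for k in browser.keys() & scraper.keys()
               if browser[k] != scraper[k]}
    return {'missing': missing, 'changed': changed}

# Headers copied out of Session A (browser) and Session B (scraper)
browser = {'User-Agent': 'Mozilla/5.0 ...',
           'Accept-Language': 'en-US,en;q=0.9',
           'Sec-Fetch-Site': 'same-origin'}
scraper = {'User-Agent': 'python-requests/2.31.0'}

result = diff_headers(browser, scraper)
print(result['missing'])   # the two headers the scraper never sends
print(result['changed'])   # the User-Agent mismatch
```

A missing `Sec-Fetch-*` header or a `python-requests` User-Agent is often exactly what triggers the block.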
6. Rules (Fiddler Classic)
Fiddler Classic supports event scripting via FiddlerScript (JScript.NET syntax; C# is also available):
// CustomRules.js — Fiddler Classic
static function OnBeforeRequest(oSession: Session) {
    // Add header to all requests
    oSession.oRequest["User-Agent"] =
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...";

    // Log API calls
    if (oSession.HostnameIs("api.target.com")) {
        FiddlerApplication.Log.LogString(
            "API: " + oSession.fullUrl
        );
    }

    // Block specific domains
    if (oSession.HostnameIs("tracking.example.com")) {
        oSession.utilCreateResponseAndBypassServer();
        oSession.responseCode = 200;
    }
}

static function OnBeforeResponse(oSession: Session) {
    // Replace CORS headers for local testing
    oSession.oResponse.headers.Remove("Access-Control-Allow-Origin");
    oSession.oResponse.headers.Add(
        "Access-Control-Allow-Origin", "*"
    );

    // Log rate limit headers
    if (oSession.oResponse.headers.Exists("X-RateLimit-Remaining")) {
        var remaining = oSession.oResponse.headers["X-RateLimit-Remaining"];
        FiddlerApplication.Log.LogString(
            "Rate limit remaining: " + remaining
        );
    }
}

Scraping Debugging Workflow
Step 1: Capture the Browser Flow
Open Fiddler, browse the target site, and identify:
- Authentication endpoints (login, token refresh)
- Data API endpoints (the actual data you want)
- Required cookies and headers
- Request sequences (CSRF token fetch → login → data request)
Step 2: Reproduce in Composer
Copy a successful browser request to Composer and verify it works in isolation.
Step 3: Strip Headers One by One
Remove headers from the Composer request one at a time to find the minimum required set:
- Required: Host, Authorization/Cookie, User-Agent
- Often required: Accept, Accept-Language, Referer, Origin
- Sometimes required: Sec-Fetch-* headers, X-Requested-With
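The strip-one-at-a-time loop can itself be automated. A sketch, assuming a `send` callable that replays the request with a given header set and returns the status code (in practice it would wrap `requests` with the Fiddler proxy settings shown earlier; `fake_send` below is a stand-in endpoint for illustration):

```python
def minimal_headers(send, headers):
    """Greedily drop each header that the endpoint does not require."""
    needed = dict(headers)
    for name in list(needed):
        trial = {k: v for k, v in needed.items() if k != name}
        if send(trial) == 200:   # still succeeds without it -> drop it
            needed = trial
    return needed

# Demo against a fake endpoint that only checks two headers
def fake_send(hdrs):
    return 200 if {'Cookie', 'User-Agent'} <= hdrs.keys() else 403

full = {'Cookie': 'sid=abc', 'User-Agent': 'Mozilla/5.0 ...',
        'Accept': '*/*', 'Referer': 'https://example.com'}
print(minimal_headers(fake_send, full))  # only Cookie and User-Agent survive
```

Note the greedy approach assumes headers are independently required; if two headers only matter in combination, confirm the final set manually.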
Step 4: Implement in Code
Translate the minimal working request to your scraper:
import requests

session = requests.Session()

# Minimum headers discovered via Fiddler
session.headers.update({
    'User-Agent': 'Mozilla/5.0 ...',
    'Accept': 'application/json',
    'Accept-Language': 'en-US,en;q=0.9',
    'Referer': 'https://www.target.com/search',
})

# Step 1: Get CSRF token (discovered in Fiddler)
csrf_response = session.get('https://www.target.com/api/csrf')
csrf_token = csrf_response.json()['token']

# Step 2: Make API call with token
response = session.post(
    'https://api.target.com/v2/search',
    json={'query': 'residential proxy', 'page': 1},
    headers={'X-CSRF-Token': csrf_token},
)
data = response.json()

Fiddler vs Charles vs mitmproxy
| Feature | Fiddler Everywhere | Charles | mitmproxy |
|---|---|---|---|
| Price | $12/mo | $50 one-time | Free |
| Platform | All | All | All |
| UI | Modern | Classic | TUI/Web |
| Scripting | Limited | Rewrite rules | Full Python |
| Collaboration | Share sessions | Export | Export |
| .NET debugging | Excellent | Good | Good |
| Mobile debugging | Good | Excellent | Good |
Internal Links
- mitmproxy Tutorial — open-source alternative with full Python scripting
- Charles Proxy Guide — comparable GUI debugging proxy
- AJAX Request Interception — programmatic API discovery
- How Websites Detect Bots — understand what Fiddler reveals
- Building a Proxy Server from Scratch — understand proxy internals
FAQ
Is Fiddler Classic still free?
Yes, Fiddler Classic remains free for Windows users. Fiddler Everywhere (cross-platform) requires a subscription ($12/month) after the free trial. For Windows-only use, Fiddler Classic is fully functional and free.
Can I use Fiddler with mobile devices?
Yes. Configure your mobile device to use Fiddler as its proxy (your computer’s IP on port 8866), install the Fiddler root certificate on the device, and all mobile traffic will be captured by Fiddler.
Does Fiddler work with WebSocket traffic?
Yes, Fiddler Everywhere captures WebSocket connections and displays frames in the inspector. You can see both text and binary frames with timestamps, making it useful for debugging real-time data scraping.
How do I capture traffic from Docker containers?
Set the HTTP_PROXY and HTTPS_PROXY environment variables in your Docker container to point to your host machine’s Fiddler instance. Use the Docker host network IP (not localhost) since containers have their own network namespace.
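A sketch of the container invocation (`my-scraper-image` is a placeholder; `host.docker.internal` resolves to the host on Docker Desktop, while on Linux you would substitute the host's LAN IP or add `--add-host=host.docker.internal:host-gateway`):

```shell
# Route the container's HTTP(S) traffic through Fiddler on the host
docker run \
  -e HTTP_PROXY=http://host.docker.internal:8866 \
  -e HTTPS_PROXY=http://host.docker.internal:8866 \
  my-scraper-image
```

For HTTPS bodies to be readable, the Fiddler root certificate must also be trusted inside the container (or verification disabled, as in the Python example above).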
Can I automate Fiddler for CI/CD testing?
Fiddler Classic supports FiddlerCore (.NET library) for programmatic use. For CI/CD, mitmproxy is generally a better choice due to its CLI-first design and Python scripting capabilities.
Related Reading
- AJAX Request Interception: Scraping API Calls Directly
- Azure Functions for Serverless Web Scraping: the Complete Guide
- Build an Anti-Detection Test Suite: Verify Browser Stealth
- Build a News Crawler in Python: Step-by-Step Tutorial
- How to Configure Proxies on iPhone and Android
- How to Use Proxies in Node.js (Axios, Fetch, Puppeteer)