Building a Proxy Checker Tool: Python Tutorial
A proxy checker tests whether proxies are alive, measures their speed, determines anonymity level, and verifies geographic location. Whether you manage your own proxy pool or use free proxy lists, a checker saves hours of manual testing.
Features We Will Build
- Concurrent async checking (test 100+ proxies simultaneously)
- Speed measurement (connection time + response time)
- Anonymity detection (transparent, anonymous, elite)
- Geographic location via IP lookup
- Protocol detection (HTTP, HTTPS, SOCKS4, SOCKS5)
- Export results to CSV/JSON
- Continuous monitoring mode
Implementation
```python
import asyncio
import csv
import json
import time
from dataclasses import dataclass, asdict
from enum import Enum
from typing import List

import httpx


class AnonymityLevel(Enum):
    TRANSPARENT = "transparent"  # Target sees your real IP
    ANONYMOUS = "anonymous"      # Target knows it's a proxy
    ELITE = "elite"              # Target can't detect the proxy


@dataclass
class ProxyCheckResult:
    proxy: str
    protocol: str = "http"
    alive: bool = False
    latency_ms: float = 0
    anonymity: str = "unknown"
    country: str = "unknown"
    city: str = "unknown"
    isp: str = "unknown"
    supports_https: bool = False
    error: str = ""


class ProxyChecker:
    def __init__(self, timeout=10, test_url="https://httpbin.org/ip"):
        self.timeout = timeout
        self.test_url = test_url
        self.my_ip = None

    async def get_my_ip(self):
        """Fetch our real IP once, so transparent proxies can be detected."""
        async with httpx.AsyncClient(timeout=10) as client:
            response = await client.get("https://httpbin.org/ip")
            self.my_ip = response.json()["origin"]
            return self.my_ip

    async def check_proxy(self, proxy_url: str) -> ProxyCheckResult:
        result = ProxyCheckResult(proxy=proxy_url)

        # Determine protocol from the URL scheme
        if proxy_url.startswith("socks5://"):
            result.protocol = "socks5"
        elif proxy_url.startswith("socks4://"):
            result.protocol = "socks4"  # detected, but see the SOCKS note in the FAQ
        else:
            result.protocol = "http"

        try:
            start = time.monotonic()
            async with httpx.AsyncClient(
                proxy=proxy_url,  # `proxy=` requires httpx >= 0.26; older versions use `proxies=`
                timeout=self.timeout,
            ) as client:
                response = await client.get(self.test_url)
                latency = (time.monotonic() - start) * 1000

                if response.status_code == 200:
                    result.alive = True
                    result.latency_ms = round(latency)

                    # Check anonymity: compare the IP the target sees with ours.
                    # Inspecting *response* headers for Via/X-Forwarded-For is a
                    # heuristic; a stricter check inspects the headers the target
                    # actually received (e.g. via httpbin.org/headers).
                    data = response.json()
                    visible_ip = data.get("origin", "")
                    if self.my_ip and self.my_ip in visible_ip:
                        result.anonymity = AnonymityLevel.TRANSPARENT.value
                    elif "via" in response.headers or "x-forwarded-for" in response.headers:
                        result.anonymity = AnonymityLevel.ANONYMOUS.value
                    else:
                        result.anonymity = AnonymityLevel.ELITE.value

                    # Get geolocation for the proxy's exit IP
                    try:
                        exit_ip = visible_ip.split(",")[0].strip()
                        geo_response = await client.get(f"https://ipinfo.io/{exit_ip}/json")
                        geo = geo_response.json()
                        result.country = geo.get("country", "unknown")
                        result.city = geo.get("city", "unknown")
                        result.isp = geo.get("org", "unknown")
                    except Exception:
                        pass

                    # Test HTTPS support (CONNECT tunnelling through the proxy)
                    try:
                        await client.get("https://httpbin.org/ip")
                        result.supports_https = True
                    except Exception:
                        result.supports_https = False
        except Exception as e:
            result.error = str(e)[:100]

        return result

    async def check_batch(self, proxies: List[str], concurrency=50) -> List[ProxyCheckResult]:
        if not self.my_ip:
            await self.get_my_ip()

        semaphore = asyncio.Semaphore(concurrency)

        async def check_with_semaphore(proxy):
            async with semaphore:
                result = await self.check_proxy(proxy)
                status = "ALIVE" if result.alive else "DEAD"
                print(f"  [{status}] {proxy} — {result.latency_ms}ms {result.country}")
                return result

        tasks = [check_with_semaphore(p) for p in proxies]
        results = await asyncio.gather(*tasks)

        alive = sum(1 for r in results if r.alive)
        print(f"\nResults: {alive}/{len(results)} alive ({alive / len(results) * 100:.1f}%)")
        return results

    def export_csv(self, results: List[ProxyCheckResult], filename="proxy_results.csv"):
        alive_results = [r for r in results if r.alive]
        if not alive_results:
            print("No alive proxies to export")
            return
        with open(filename, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=asdict(alive_results[0]).keys())
            writer.writeheader()
            for r in alive_results:
                writer.writerow(asdict(r))
        print(f"Exported {len(alive_results)} alive proxies to {filename}")

    def export_json(self, results: List[ProxyCheckResult], filename="proxy_results.json"):
        with open(filename, "w") as f:
            json.dump([asdict(r) for r in results], f, indent=2)


# Usage
async def main():
    checker = ProxyChecker(timeout=10)
    proxies = [
        "http://proxy1.example.com:8080",
        "http://proxy2.example.com:8080",
        "socks5://proxy3.example.com:1080",
    ]
    results = await checker.check_batch(proxies, concurrency=20)
    checker.export_csv(results)
    checker.export_json(results)


asyncio.run(main())
```
CLI Interface
```python
import argparse


def main():
    # Usage: python proxy_checker.py proxies.txt -o results.json --format json
    parser = argparse.ArgumentParser(description="Proxy Checker Tool")
    parser.add_argument("input", help="File with proxy list (one per line)")
    parser.add_argument("-o", "--output", default="results.csv", help="Output file")
    parser.add_argument("-c", "--concurrency", type=int, default=50)
    parser.add_argument("-t", "--timeout", type=int, default=10)
    parser.add_argument("--format", choices=["csv", "json"], default="csv")
    args = parser.parse_args()

    with open(args.input) as f:
        proxies = [line.strip() for line in f if line.strip()]

    print(f"Checking {len(proxies)} proxies...")
    checker = ProxyChecker(timeout=args.timeout)
    results = asyncio.run(checker.check_batch(proxies, concurrency=args.concurrency))

    if args.format == "csv":
        checker.export_csv(results, args.output)
    else:
        checker.export_json(results, args.output)


if __name__ == "__main__":
    main()
```
Internal Links
- Building Your Own Rotating Proxy Pool — use checker results to build a pool
- Proxy Performance Benchmarks — benchmark methodology
- Self-Hosted Proxy Server — check your own proxies
- Free Proxy List: How to Use & Risks — test free proxies
- Best Proxy Checker Tools 2026 — compare with existing tools
FAQ
How fast can a proxy checker test proxies?
With async checking at 50 concurrent connections, you can test 1,000 proxies in about 30-60 seconds. Increase concurrency for faster results, but be aware that very high concurrency may trigger rate limits on the test endpoint.
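The 30-60 second figure above is simple arithmetic: with a fixed concurrency and an average per-check time, wall time is roughly total / concurrency × average. A tiny illustrative helper (the 2.5s average is an assumption, not a measurement):

```python
# Back-of-the-envelope throughput estimate for async proxy checking.
def estimated_seconds(total: int, concurrency: int, avg_check_s: float) -> float:
    # With `concurrency` checks in flight at once, the batch runs in
    # roughly total/concurrency "waves", each taking avg_check_s.
    return total / concurrency * avg_check_s

print(estimated_seconds(1000, 50, 2.5))  # → 50.0
```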
What test URL should I use for proxy checking?
httpbin.org/ip is the standard. It returns your visible IP in JSON format, enabling anonymity detection. For production use, host your own endpoint to avoid rate limits on httpbin.
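Hosting your own endpoint needs nothing beyond the standard library. The sketch below (handler name and port are illustrative choices, not part of the tutorial's checker) returns the caller's visible IP and echoes the request headers in httpbin-compatible JSON, which is exactly what anonymity detection needs:

```python
# Minimal self-hosted test endpoint: returns the caller's visible IP
# ("origin", like httpbin.org/ip) plus the request headers it received,
# so a checker can classify anonymity without relying on httpbin.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({
            "origin": self.client_address[0],       # IP the server actually sees
            "headers": dict(self.headers.items()),  # headers as received
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request console logging
        pass

# To run standalone:
#   HTTPServer(("0.0.0.0", 8000), EchoHandler).serve_forever()
```

Point the checker at it with `ProxyChecker(test_url="http://your-host:8000/")` and you sidestep httpbin's rate limits entirely.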
How do I determine proxy anonymity level?
Compare the IP visible to the target (from httpbin.org/ip) with your real IP. If your real IP is visible, it is transparent. If proxy headers (Via, X-Forwarded-For) are present, it is anonymous. If neither your IP nor proxy headers are visible, it is elite.
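The decision rule above can be sketched as a pure function. `origin` and `headers` stand for the values a target endpoint echoes back; the function name and header list are illustrative:

```python
# Classify proxy anonymity from what the target endpoint reports back.
def classify_anonymity(real_ip: str, origin: str, headers: dict) -> str:
    header_names = {h.lower() for h in headers}
    if real_ip and real_ip in origin:
        return "transparent"  # your real IP leaked through
    if header_names & {"via", "x-forwarded-for", "forwarded", "proxy-connection"}:
        return "anonymous"    # IP hidden, but the proxy announces itself
    return "elite"            # no trace of a proxy at all

print(classify_anonymity("1.2.3.4", "5.6.7.8", {"Via": "1.1 squid"}))  # anonymous
```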
Should I check proxies before every use?
For free proxies, yes — they die frequently. For paid proxy services, periodic checks (hourly) are sufficient. Build the checker into your proxy pool manager for automated health monitoring.
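The "continuous monitoring mode" from the feature list boils down to a periodic loop around batch checking. A minimal sketch, assuming `check_batch` is any async callable returning results with `proxy` and `alive` attributes (such as `ProxyChecker.check_batch` from this tutorial); the hourly default matches the suggestion above:

```python
# Periodic health-check loop: re-test the pool, drop dead proxies,
# fall back to the full list if everything died.
import asyncio

async def monitor(proxies, check_batch, interval=3600, rounds=None):
    healthy = list(proxies)
    n = 0
    while rounds is None or n < rounds:
        results = await check_batch(healthy if healthy else proxies)
        healthy = [r.proxy for r in results if r.alive]  # keep only live proxies
        print(f"round {n}: {len(healthy)}/{len(results)} healthy")
        n += 1
        if rounds is None or n < rounds:
            await asyncio.sleep(interval)
    return healthy
```

`rounds=None` runs forever; pass a number to stop after that many passes, which also makes the loop testable.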
Can I check SOCKS proxies with this tool?
Mostly. httpx supports SOCKS5 through the optional socksio dependency (install it with pip install "httpx[socks]") using the socks5://host:port URL format. SOCKS4 is not supported by httpx itself, so socks4:// entries will fail; use a SOCKS4-capable client library if you need them. For supported proxies, the checker tests connectivity and measures latency the same way as for HTTP proxies.
Related Reading
- Build an Anti-Detection Test Suite: Verify Browser Stealth
- Build a News Crawler in Python: Step-by-Step Tutorial
- AJAX Request Interception: Scraping API Calls Directly
- Azure Functions for Serverless Web Scraping: the Complete Guide
- How to Configure Proxies on iPhone and Android
- How to Use Proxies in Node.js (Axios, Fetch, Puppeteer)