IPv6 Proxy Support: Complete Guide to IPv6 Proxies
IPv4 addresses are exhausted — all 4.3 billion have been allocated. IPv6 provides 340 undecillion addresses (3.4 x 10^38), and proxy providers are leveraging this massive address space for virtually unlimited IP rotation. A single /48 IPv6 subnet gives you 1.2 septillion (2^80) unique addresses, making IPv6 proxies the most cost-effective option for large-scale operations.
This guide covers how IPv6 proxies work, when to use them, subnet rotation strategies, and practical implementation.
IPv4 vs IPv6 for Proxies
IPv4 Address: 192.168.1.1 (32 bits, ~4.3 billion total)
IPv6 Address: 2001:0db8:85a3:0000:0000:8a2e:0370:7334 (128 bits)
Available proxy IPs:
IPv4: ~4.3 billion total (most already allocated)
IPv6: ~340 undecillion (practically unlimited)
Typical allocation:
IPv4: Buy individual IPs ($1-5 each/month)
IPv6: Get a /48 subnet (1.2 septillion IPs for ~$50/month)

| Feature | IPv4 Proxy | IPv6 Proxy |
|---|---|---|
| Address pool | Limited, expensive | Virtually unlimited, cheap |
| Website support | Universal (100%) | Growing (~40% of top sites) |
| Cost per IP | $1-5/month | $0.000001/month |
| Detection risk | Higher (known proxy ranges) | Lower (vast address space) |
| Speed | Standard | Often faster (modern routing) |
| Rotation options | Limited by pool size | Rotate across /48 or /64 subnet |
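The scale difference in the table is easy to sanity-check with Python's standard `ipaddress` module. A quick sketch, using the `2001:db8::` documentation prefix purely as an example:

```python
import ipaddress

# Address pools behind common proxy allocations
ipv4_total = 2 ** 32  # the entire IPv4 address space
slash48 = ipaddress.IPv6Network("2001:db8:abcd::/48")
slash64 = ipaddress.IPv6Network("2001:db8:abcd::/64")

print(f"Entire IPv4 space: {ipv4_total:,}")           # 4,294,967,296
print(f"One /48 subnet:    {slash48.num_addresses:,}")  # 1,208,925,819,614,629,174,706,176
print(f"One /64 subnet:    {slash64.num_addresses:,}")  # 18,446,744,073,709,551,616
```

A single /48 holds roughly 280 trillion times more addresses than the whole of IPv4.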
How IPv6 Proxies Work
Subnet-Based Rotation
Instead of rotating through a fixed list of IPs, IPv6 proxies generate random addresses within an allocated subnet:
import random
import ipaddress
class IPv6SubnetRotator:
"""Generate random IPv6 addresses from an allocated subnet."""
def __init__(self, subnet: str):
"""
Args:
subnet: IPv6 subnet like '2001:db8::/48'
"""
self.network = ipaddress.IPv6Network(subnet)
self.prefix_len = self.network.prefixlen
def get_random_ip(self) -> str:
"""Generate a random IPv6 address within the subnet."""
# Get the network address as integer
network_int = int(self.network.network_address)
# Calculate host bits
host_bits = 128 - self.prefix_len
# Generate random host portion
random_host = random.randint(1, (2 ** host_bits) - 1)
# Combine network + random host
ip_int = network_int | random_host
return str(ipaddress.IPv6Address(ip_int))
def get_sequential_ips(self, count: int, start_offset: int = 1):
"""Generate sequential IPv6 addresses."""
network_int = int(self.network.network_address)
return [
str(ipaddress.IPv6Address(network_int + start_offset + i))
for i in range(count)
]
@property
def total_addresses(self) -> int:
return 2 ** (128 - self.prefix_len)
# Example usage
rotator = IPv6SubnetRotator('2001:db8:abcd::/48')
print(f"Total IPs available: {rotator.total_addresses:,}")
# Total IPs available: 1,208,925,819,614,629,174,706,176
for _ in range(5):
    print(f"Random IP: {rotator.get_random_ip()}")

Binding to IPv6 Addresses
On a server with an IPv6 /48 allocation, you can bind to any address in the range:
import requests
from requests.adapters import HTTPAdapter
class IPv6SourceAdapter(HTTPAdapter):
"""HTTP adapter that binds to a specific IPv6 source address."""
def __init__(self, source_address, **kwargs):
self.source_address = source_address
super().__init__(**kwargs)
def init_poolmanager(self, *args, **kwargs):
kwargs['source_address'] = (self.source_address, 0)
super().init_poolmanager(*args, **kwargs)
def make_request_from_ipv6(ipv6_address, target_url):
"""Make HTTP request from a specific IPv6 source address."""
session = requests.Session()
adapter = IPv6SourceAdapter(source_address=ipv6_address)
session.mount('http://', adapter)
session.mount('https://', adapter)
response = session.get(target_url)
return response
# Configure the server to accept connections on entire /48
# ip -6 addr add 2001:db8:abcd::/48 dev eth0
# Make requests from different IPs
rotator = IPv6SubnetRotator('2001:db8:abcd::/48')
for _ in range(10):
ip = rotator.get_random_ip()
response = make_request_from_ipv6(ip, 'https://ipv6.icanhazip.com')
    print(f"Request from {ip} → seen as {response.text.strip()}")

Setting Up IPv6 Proxy Infrastructure
Server Configuration (Linux)
# Check IPv6 support
sysctl net.ipv6.conf.all.disable_ipv6
# Should be 0 (enabled)
# Add entire /48 subnet to interface
sudo ip -6 addr add 2001:db8:abcd::/48 dev eth0
# Verify
ip -6 addr show dev eth0
# Enable IPv6 forwarding (for proxy server)
sudo sysctl -w net.ipv6.conf.all.forwarding=1
# Make persistent
echo "net.ipv6.conf.all.forwarding=1" | sudo tee -a /etc/sysctl.conf
# Add IPv6 default route
sudo ip -6 route add default via 2001:db8::1 dev eth0

Squid Proxy with IPv6
# squid.conf — IPv6 proxy configuration
# Listen on both IPv4 and IPv6
http_port 3128
http_port [::]:3128
# ACL for IPv6 clients
acl ipv6_clients src ipv6
# Outgoing IPv6 address rotation
# Use tcp_outgoing_address for IPv6 source selection
acl target1 dstdomain .google.com
tcp_outgoing_address 2001:db8:abcd::1 target1
tcp_outgoing_address 2001:db8:abcd::2 !target1
# DNS for IPv6
dns_nameservers 2001:4860:4860::8888 2001:4860:4860::8844

3proxy with IPv6 Rotation
# 3proxy is lightweight and supports IPv6 well
# Install
apt install 3proxy
# /etc/3proxy/3proxy.cfg
nscache 65536
timeouts 1 5 30 60 180 1800 15 60
# IPv6 outgoing addresses (rotate through these)
external 2001:db8:abcd::1
external 2001:db8:abcd::2
external 2001:db8:abcd::3
external 2001:db8:abcd::4
external 2001:db8:abcd::5
# Proxy with round-robin source rotation
proxy -6 -p3128 -e2001:db8:abcd::1
proxy -6 -p3129 -e2001:db8:abcd::2
proxy -6 -p3130 -e2001:db8:abcd::3

Automated IPv6 Proxy Generator Script
#!/bin/bash
# generate_ipv6_proxies.sh
# Generate thousands of IPv6 proxy ports from a /48 subnet
SUBNET="2001:db8:abcd"
INTERFACE="eth0"
START_PORT=10000
NUM_PROXIES=1000
echo "Generating $NUM_PROXIES IPv6 proxies..."
for i in $(seq 1 $NUM_PROXIES); do
    # Generate random IPv6 suffix (bash RANDOM is 0-32767, so the modulo
    # is a no-op and each group covers half its 16-bit range — still
    # more than enough uniqueness for this purpose)
    SUFFIX=$(printf '%x:%x:%x:%x:%x' \
        $((RANDOM % 65536)) $((RANDOM % 65536)) \
        $((RANDOM % 65536)) $((RANDOM % 65536)) \
        $((RANDOM % 65536)))
IPV6="${SUBNET}:${SUFFIX}"
PORT=$((START_PORT + i))
# Add IPv6 address to interface
ip -6 addr add "${IPV6}/128" dev $INTERFACE 2>/dev/null
# Add proxy entry to 3proxy config
echo "proxy -6 -p${PORT} -e${IPV6}" >> /etc/3proxy/ipv6_proxies.cfg
echo "${IPV6}:${PORT}"
done
echo "Done. Restart 3proxy to activate."

Using IPv6 Proxies for Scraping
Check Target IPv6 Support
Not all websites support IPv6. Always check first:
import socket
import httpx
def check_ipv6_support(domain):
"""Check if a domain supports IPv6 connections."""
try:
# Check for AAAA records
aaaa_records = socket.getaddrinfo(
domain, 443, socket.AF_INET6, socket.SOCK_STREAM
)
ipv6_addresses = [addr[4][0] for addr in aaaa_records]
        # Test actual connectivity (note: httpx may still connect over
        # IPv4 here unless the transport is bound to an IPv6 source)
        with httpx.Client(timeout=10) as client:
            response = client.get(f"https://{domain}")
return {
"domain": domain,
"ipv6_supported": True,
"ipv6_addresses": ipv6_addresses,
"status": response.status_code
}
except socket.gaierror:
return {"domain": domain, "ipv6_supported": False}
except Exception as e:
return {"domain": domain, "ipv6_supported": False, "error": str(e)}
# Check popular scraping targets
targets = [
'google.com', 'facebook.com', 'amazon.com',
'cloudflare.com', 'wikipedia.org', 'linkedin.com'
]
for target in targets:
result = check_ipv6_support(target)
status = "YES" if result['ipv6_supported'] else "NO"
    print(f"{target}: IPv6 = {status}")

Scraping with IPv6 Proxies
import httpx
import asyncio
async def scrape_with_ipv6_proxy(urls, proxy_url):
"""Scrape URLs through an IPv6 proxy."""
async with httpx.AsyncClient(
proxy=proxy_url,
timeout=30,
http2=True,
) as client:
tasks = [client.get(url) for url in urls]
responses = await asyncio.gather(*tasks, return_exceptions=True)
results = []
for url, response in zip(urls, responses):
if isinstance(response, Exception):
results.append({"url": url, "error": str(response)})
        else:
            # Note: response.url cannot reveal which IP version the
            # proxy used to reach the target (every URL contains ':'),
            # so report only the URL and status code
            results.append({
                "url": url,
                "status": response.status_code,
            })
return results
# IPv6 proxy format: http://user:pass@[2001:db8::1]:8080
proxy = "http://user:pass@[2001:db8:abcd::1]:8080"
urls = ["https://google.com", "https://cloudflare.com"]
results = asyncio.run(scrape_with_ipv6_proxy(urls, proxy))

IPv6 Proxy Advantages for Specific Use Cases
| Use Case | Why IPv6 Works |
|---|---|
| Social media scraping | Massive IP pool avoids rate limits |
| SEO rank checking | Different /64 subnets appear as different users |
| Ad verification | Simulate real IPv6 users (growing segment) |
| Price monitoring | Low-cost high-volume checks |
| Account creation | Each account from unique /64 subnet |
Limitations and Workarounds
Not All Sites Support IPv6
Solution: Use dual-stack proxies that fall back to IPv4:
class DualStackProxy:
"""Try IPv6 first, fall back to IPv4."""
def __init__(self, ipv6_proxy, ipv4_proxy):
self.ipv6_proxy = ipv6_proxy
self.ipv4_proxy = ipv4_proxy
async def get(self, url):
try:
async with httpx.AsyncClient(proxy=self.ipv6_proxy) as client:
return await client.get(url, timeout=10)
except Exception:
async with httpx.AsyncClient(proxy=self.ipv4_proxy) as client:
                return await client.get(url, timeout=10)

IPv6 Subnet Detection
Some sites detect and block entire /64 or /48 subnets:
# Spread requests across multiple /64 subnets
# A /48 contains 65,536 /64 subnets
import random

class SubnetAwareRotator:
    def __init__(self, subnet_48):
        self.base = subnet_48  # e.g., '2001:db8:abcd'
def get_ip_from_random_64(self):
"""Generate IP from random /64 within the /48."""
subnet_64 = random.randint(0, 65535)
host = random.randint(1, 2**64 - 1)
return f"{self.base}:{subnet_64:04x}:{host >> 48 & 0xFFFF:04x}:" \
f"{host >> 32 & 0xFFFF:04x}:{host >> 16 & 0xFFFF:04x}:" \
            f"{host & 0xFFFF:04x}"

Internal Links
- TCP/IP Proxy Internals — understand IPv6 at the network layer
- What Is a Datacenter Proxy? — IPv6 proxies are typically datacenter-based
- Proxy Cost Calculator — compare IPv6 vs IPv4 proxy costs
- Building Your Own Rotating Proxy Pool — set up IPv6 rotation infrastructure
- IP Lookup Tool — check your IPv6 proxy address
FAQ
Are IPv6 proxies as effective as IPv4 proxies for web scraping?
IPv6 proxies are highly effective for sites that support IPv6 (Google, Facebook, Cloudflare-fronted sites). They offer vastly more IPs at lower cost. However, about 60% of websites still lack IPv6 support, so you need IPv4 fallback for those targets.
How many IPv6 addresses can I get from a /48 subnet?
A /48 subnet contains 2^80 addresses, which is approximately 1.2 septillion IPs. Even a /64 subnet provides 18.4 quintillion addresses. This is effectively unlimited for any scraping operation.
Can websites detect that multiple IPv6 addresses belong to the same subnet?
Yes. Websites can check if IPs share the same /64 or /48 prefix, indicating they come from the same allocation. Sophisticated anti-bot systems track subnet patterns. To avoid this, spread requests across different /64 subnets within your /48.
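The prefix comparison a website might run can be reproduced with the standard `ipaddress` module. A minimal sketch (the addresses are illustrative documentation-prefix examples):

```python
import ipaddress

def same_prefix(ip_a: str, ip_b: str, prefix_len: int = 64) -> bool:
    """Return True if two IPv6 addresses fall within the same subnet."""
    net_a = ipaddress.IPv6Network(f"{ip_a}/{prefix_len}", strict=False)
    net_b = ipaddress.IPv6Network(f"{ip_b}/{prefix_len}", strict=False)
    return net_a == net_b

# Same /64 — trivially linkable by an anti-bot system
print(same_prefix("2001:db8:abcd:1::aa", "2001:db8:abcd:1::bb"))      # True
# Different /64s — not linked at /64, but still linked at /48
print(same_prefix("2001:db8:abcd:1::aa", "2001:db8:abcd:2::aa"))      # False
print(same_prefix("2001:db8:abcd:1::aa", "2001:db8:abcd:2::aa", 48))  # True
```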
Are IPv6 proxies cheaper than IPv4?
Significantly cheaper. IPv4 addresses cost $1-5 per IP per month due to scarcity. IPv6 proxies offer thousands or millions of IPs for the same price because IPv6 addresses are abundant. A /48 subnet with trillions of IPs can cost as little as $50-100/month.
Do I need to modify my scraping code for IPv6 proxies?
Minimal changes are needed. The main difference is the proxy URL format — IPv6 addresses use brackets: http://user:pass@[2001:db8::1]:8080. Most HTTP libraries (requests, httpx, aiohttp) handle IPv6 proxy addresses correctly.
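The bracketed form also parses cleanly with the standard library — a quick check (the credentials and prefix below are placeholder examples):

```python
from urllib.parse import urlsplit

# Brackets keep the address's own colons separate from the port delimiter
proxy_url = "http://user:pass@[2001:db8:abcd::1]:8080"
parts = urlsplit(proxy_url)

print(parts.hostname)  # 2001:db8:abcd::1 (brackets stripped)
print(parts.port)      # 8080
print(parts.username)  # user
```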
Related Reading
- AJAX Request Interception: Scraping API Calls Directly
- Azure Functions for Serverless Web Scraping: the Complete Guide
- Build an Anti-Detection Test Suite: Verify Browser Stealth
- Build a News Crawler in Python: Step-by-Step Tutorial
- How to Configure Proxies on iPhone and Android
- How to Use Proxies in Node.js (Axios, Fetch, Puppeteer)