Tracking Food Delivery Driver Availability and Wait Times
Behind every food delivery order is a complex logistics system matching restaurants, drivers, and customers. For F&B operators, understanding driver availability patterns, wait times, and delivery ETAs is critical for operational planning. Long delivery times hurt ratings, while driver shortages during peak hours mean lost orders.
This guide covers how to collect and analyze delivery driver and logistics data from food delivery platforms across Southeast Asia.
Why Delivery Operations Data Matters
Impact on Restaurant Performance
Delivery logistics directly affect restaurant success on food delivery platforms:
- Ratings: Late deliveries lead to negative reviews, even when food quality is excellent
- Order volume: Platforms deprioritize restaurants in areas with poor driver coverage
- Customer retention: Long wait times drive customers to competitors
- Platform ranking: Estimated delivery time is a key sorting factor on all platforms
Data You Can Track
| Metric | What It Reveals | Business Application |
|---|---|---|
| Estimated delivery time | Current logistics conditions | Peak hour identification |
| Driver availability | Supply-demand balance | Staffing and prep timing |
| Delivery fees | Dynamic pricing signals | Demand pattern recognition |
| Surge indicators | Peak demand periods | Menu preparation planning |
| Delivery radius | Platform coverage changes | Location strategy |
| Wait time at restaurant | Kitchen efficiency signals | Competitor benchmarking |
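To see how these raw metrics turn into signals, here is a minimal sketch that derives surge frequency, driver-shortage rate, and average ETA from a handful of tracked observations. The field names mirror the table above; the sample values are invented for illustration.

```python
# Minimal sketch: deriving supply-demand signals from tracked metrics.
# Sample observations are invented; field names follow the metrics table.
observations = [
    {"eta_minutes": 25, "surge_active": False, "driver_available": True},
    {"eta_minutes": 55, "surge_active": True,  "driver_available": False},
    {"eta_minutes": 40, "surge_active": True,  "driver_available": True},
    {"eta_minutes": 30, "surge_active": False, "driver_available": True},
]

total = len(observations)
surge_rate = sum(o["surge_active"] for o in observations) / total
no_driver_rate = sum(not o["driver_available"] for o in observations) / total
avg_eta = sum(o["eta_minutes"] for o in observations) / total

print(f"surge rate: {surge_rate:.0%}")           # 50%
print(f"driver shortage: {no_driver_rate:.0%}")  # 25%
print(f"avg ETA: {avg_eta:.1f} min")             # 37.5 min
```

A high surge rate paired with a high driver-shortage rate during the same hours is the clearest sign of a supply-demand imbalance worth planning around.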
Collecting Delivery Data
Monitoring ETAs Across Platforms
```python
import requests
import time
import random
from datetime import datetime, timedelta
from dataclasses import dataclass
from typing import Optional


@dataclass
class DeliverySnapshot:
    platform: str
    restaurant_id: str
    restaurant_name: str
    delivery_lat: float
    delivery_lng: float
    estimated_time_minutes: int
    delivery_fee: float
    surge_active: bool
    surge_multiplier: float
    timestamp: datetime
    distance_km: Optional[float] = None
    driver_available: Optional[bool] = None


class DeliveryTracker:
    def __init__(self, proxy_user, proxy_pass):
        self.proxy_user = proxy_user
        self.proxy_pass = proxy_pass
        self.snapshots = []

    def _get_session(self, country):
        session = requests.Session()
        proxy_host = f"{country.lower()}-mobile.dataresearchtools.com"
        session.proxies = {
            "http": f"http://{self.proxy_user}:{self.proxy_pass}@{proxy_host}:8080",
            "https": f"http://{self.proxy_user}:{self.proxy_pass}@{proxy_host}:8080"
        }
        session.headers.update({
            "User-Agent": "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36",
            "Accept": "application/json"
        })
        return session

    def capture_delivery_snapshot(self, country, restaurant_id, platform,
                                  delivery_lat, delivery_lng):
        """Capture a single delivery data snapshot."""
        session = self._get_session(country)
        if platform == "grabfood":
            data = self._get_grabfood_delivery(session, restaurant_id,
                                               delivery_lat, delivery_lng)
        elif platform == "foodpanda":
            data = self._get_foodpanda_delivery(session, restaurant_id,
                                                delivery_lat, delivery_lng)
        else:
            return None
        if data:
            snapshot = DeliverySnapshot(
                platform=platform,
                restaurant_id=restaurant_id,
                restaurant_name=data.get("restaurant_name", ""),
                delivery_lat=delivery_lat,
                delivery_lng=delivery_lng,
                estimated_time_minutes=data.get("eta_minutes", 0),
                delivery_fee=data.get("delivery_fee", 0),
                surge_active=data.get("surge_active", False),
                surge_multiplier=data.get("surge_multiplier", 1.0),
                timestamp=datetime.utcnow(),
                distance_km=data.get("distance_km"),
                driver_available=data.get("driver_available")
            )
            self.snapshots.append(snapshot)
            return snapshot
        return None

    def _get_grabfood_delivery(self, session, restaurant_id, lat, lng):
        """Fetch delivery info from GrabFood."""
        response = session.get(
            f"https://food.grab.com/api/v1/restaurants/{restaurant_id}/delivery",
            params={"latitude": lat, "longitude": lng},
            timeout=15
        )
        if response.status_code == 200:
            data = response.json()
            return {
                "restaurant_name": data.get("restaurant", {}).get("name", ""),
                "eta_minutes": data.get("estimatedDeliveryTime", 0),
                "delivery_fee": data.get("deliveryFee", {}).get("amount", 0),
                "surge_active": data.get("surge", {}).get("isActive", False),
                "surge_multiplier": data.get("surge", {}).get("multiplier", 1.0),
                "distance_km": data.get("distance", 0),
                "driver_available": data.get("driversAvailable", True)
            }
        return None

    def _get_foodpanda_delivery(self, session, vendor_code, lat, lng):
        """Fetch delivery info from Foodpanda."""
        response = session.get(
            f"https://www.foodpanda.sg/api/v5/vendors/{vendor_code}",
            params={"latitude": lat, "longitude": lng},
            timeout=15
        )
        if response.status_code == 200:
            data = response.json().get("data", {})
            return {
                "restaurant_name": data.get("name", ""),
                "eta_minutes": data.get("delivery_time", 0),
                "delivery_fee": data.get("delivery_fee", {}).get("value", 0),
                "surge_active": data.get("is_surge", False),
                "surge_multiplier": data.get("surge_fee_multiplier", 1.0),
                "distance_km": data.get("distance", 0),
                "driver_available": data.get("is_delivery_available", True)
            }
        return None
```

Continuous Monitoring
```python
    # DeliveryTracker method (continuation of the class above)
    def run_continuous_monitoring(self, monitoring_config, duration_hours=24,
                                  interval_minutes=15):
        """Run continuous delivery monitoring for a set of restaurants."""
        end_time = datetime.utcnow() + timedelta(hours=duration_hours)
        results = []
        while datetime.utcnow() < end_time:
            cycle_start = datetime.utcnow()
            for config in monitoring_config:
                snapshot = self.capture_delivery_snapshot(
                    country=config["country"],
                    restaurant_id=config["restaurant_id"],
                    platform=config["platform"],
                    delivery_lat=config["delivery_lat"],
                    delivery_lng=config["delivery_lng"]
                )
                if snapshot:
                    results.append(snapshot)
                # Randomized delay between requests avoids burst patterns
                time.sleep(random.uniform(2, 5))
            # Wait for the remainder of the interval
            elapsed = (datetime.utcnow() - cycle_start).total_seconds()
            wait_time = max(0, interval_minutes * 60 - elapsed)
            if wait_time > 0:
                time.sleep(wait_time)
        return results
```

Analyzing Delivery Patterns
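The analyses below consume the list of DeliverySnapshot objects the tracker collects. For offline analysis it often helps to flatten snapshots to plain dicts first; a sketch using the standard-library `dataclasses.asdict` (the trimmed `Snapshot` class here is a stand-in for the full one above):

```python
from dataclasses import dataclass, asdict
from datetime import datetime

# Trimmed-down snapshot for illustration; the real class has more fields.
@dataclass
class Snapshot:
    platform: str
    estimated_time_minutes: int
    surge_active: bool
    timestamp: datetime

snap = Snapshot("grabfood", 35, False, datetime(2024, 5, 1, 12, 30))
row = asdict(snap)  # plain dict, ready for csv.DictWriter or a DataFrame
row["timestamp"] = row["timestamp"].isoformat()  # JSON/CSV-friendly string
print(row)
```

Persisting rows like this between monitoring runs lets you accumulate the multi-day history that the time-of-day and day-of-week analyses need.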
Time-of-Day Analysis
```python
def analyze_time_patterns(snapshots):
    """Analyze delivery metrics by time of day."""
    hourly_data = {h: {
        "avg_eta": [],
        "avg_fee": [],
        "surge_count": 0,
        "total_observations": 0,
        "unavailable_count": 0
    } for h in range(24)}
    for s in snapshots:
        hour = s.timestamp.hour
        hourly_data[hour]["avg_eta"].append(s.estimated_time_minutes)
        hourly_data[hour]["avg_fee"].append(s.delivery_fee)
        hourly_data[hour]["total_observations"] += 1
        if s.surge_active:
            hourly_data[hour]["surge_count"] += 1
        if s.driver_available is False:
            hourly_data[hour]["unavailable_count"] += 1
    analysis = {}
    for hour, data in hourly_data.items():
        if data["total_observations"] > 0:
            analysis[f"{hour:02d}:00"] = {
                "avg_delivery_time": round(
                    sum(data["avg_eta"]) / len(data["avg_eta"]), 1
                ) if data["avg_eta"] else 0,
                "avg_delivery_fee": round(
                    sum(data["avg_fee"]) / len(data["avg_fee"]), 2
                ) if data["avg_fee"] else 0,
                "surge_probability": f"{data['surge_count'] / data['total_observations'] * 100:.1f}%",
                "driver_unavailability_rate": f"{data['unavailable_count'] / data['total_observations'] * 100:.1f}%",
                "observations": data["total_observations"]
            }
    return analysis
```
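The core of the function above is hour-of-day bucketing. A compressed, self-contained version of that bucketing logic, run on synthetic `(hour, eta)` readings, shows the shape of the output it produces:

```python
from collections import defaultdict

# Synthetic (hour, eta_minutes) observations; values are invented.
readings = [(11, 30), (12, 50), (12, 60), (19, 55), (19, 45), (3, 20)]

buckets = defaultdict(list)
for hour, eta in readings:
    buckets[hour].append(eta)

# Average ETA per observed hour, keyed like the full function's output
hourly_avg = {f"{h:02d}:00": sum(v) / len(v) for h, v in sorted(buckets.items())}
print(hourly_avg)
# {'03:00': 20.0, '11:00': 30.0, '12:00': 55.0, '19:00': 50.0}
```

With thresholding applied (ETA above 40 minutes), 12:00 and 19:00 would register as peak hours, which is what the next function automates.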
```python
def identify_peak_periods(time_analysis, eta_threshold=40):
    """Identify peak delivery periods based on ETA thresholds."""
    peaks = []
    current_peak = None
    for hour_str, data in sorted(time_analysis.items()):
        is_peak = data["avg_delivery_time"] > eta_threshold
        if is_peak and current_peak is None:
            current_peak = {"start": hour_str, "end": hour_str,
                            "max_eta": data["avg_delivery_time"]}
        elif is_peak and current_peak:
            current_peak["end"] = hour_str
            current_peak["max_eta"] = max(current_peak["max_eta"],
                                          data["avg_delivery_time"])
        elif not is_peak and current_peak:
            peaks.append(current_peak)
            current_peak = None
    if current_peak:
        peaks.append(current_peak)
    return peaks
```

Day-of-Week Patterns
```python
def analyze_weekly_patterns(snapshots):
    """Analyze delivery patterns across days of the week."""
    day_names = ["Monday", "Tuesday", "Wednesday", "Thursday",
                 "Friday", "Saturday", "Sunday"]
    daily_data = {day: {"etas": [], "fees": [], "surges": 0, "count": 0}
                  for day in day_names}
    for s in snapshots:
        day = day_names[s.timestamp.weekday()]
        daily_data[day]["etas"].append(s.estimated_time_minutes)
        daily_data[day]["fees"].append(s.delivery_fee)
        daily_data[day]["count"] += 1
        if s.surge_active:
            daily_data[day]["surges"] += 1
    weekly_analysis = {}
    for day, data in daily_data.items():
        if data["count"] > 0:
            weekly_analysis[day] = {
                "avg_eta": round(sum(data["etas"]) / len(data["etas"]), 1),
                "avg_fee": round(sum(data["fees"]) / len(data["fees"]), 2),
                "max_eta": max(data["etas"]) if data["etas"] else 0,
                "surge_rate": f"{data['surges'] / data['count'] * 100:.1f}%",
                "observations": data["count"]
            }
    return weekly_analysis
```

Geographic Analysis
```python
def analyze_delivery_zones(snapshots_by_location):
    """Compare delivery metrics across different delivery zones."""
    zone_analysis = {}
    for zone_name, snapshots in snapshots_by_location.items():
        if not snapshots:
            continue  # skip zones with no observations to avoid division by zero
        etas = [s.estimated_time_minutes for s in snapshots]
        fees = [s.delivery_fee for s in snapshots]
        surges = [s for s in snapshots if s.surge_active]
        zone_analysis[zone_name] = {
            "avg_eta": round(sum(etas) / len(etas), 1),
            "median_eta": sorted(etas)[len(etas) // 2],
            "p95_eta": sorted(etas)[int(len(etas) * 0.95)],
            "avg_fee": round(sum(fees) / len(fees), 2),
            "surge_frequency": f"{len(surges) / len(snapshots) * 100:.1f}%",
            "reliability_score": round(
                len([e for e in etas if e <= 45]) / len(etas) * 100, 1
            ),
            "data_points": len(snapshots)
        }
    return zone_analysis
```

Practical Applications
1. Kitchen Prep Optimization
Use delivery pattern data to optimize kitchen preparation schedules:
```python
def generate_prep_schedule(peak_periods, kitchen_capacity):
    """Generate kitchen preparation schedule based on delivery patterns."""
    schedule = []
    for peak in peak_periods:
        # Start prep 30 minutes before the peak window opens
        peak_hour = int(peak["start"].split(":")[0])
        prep_start = f"{(peak_hour - 1) % 24:02d}:30"
        # Scale staffing based on peak intensity
        if peak["max_eta"] > 60:
            staffing = "full_capacity"
            prep_note = "Pre-prepare high-volume items, batch cook popular dishes"
        elif peak["max_eta"] > 45:
            staffing = "enhanced"
            prep_note = "Pre-portion ingredients, prepare sauces in advance"
        else:
            staffing = "normal_plus"
            prep_note = "Standard prep with extra attention to popular items"
        schedule.append({
            "prep_start": prep_start,
            "peak_window": f"{peak['start']}-{peak['end']}",
            "staffing_level": staffing,
            "prep_notes": prep_note,
            "expected_peak_eta": f"{peak['max_eta']} minutes"
        })
    return schedule
```

2. Platform Performance Comparison
Compare delivery performance across platforms for the same restaurant:
```python
def compare_platform_performance(snapshots):
    """Compare delivery metrics across platforms."""
    by_platform = {}
    for s in snapshots:
        if s.platform not in by_platform:
            by_platform[s.platform] = {
                "etas": [], "fees": [], "surges": 0, "count": 0
            }
        by_platform[s.platform]["etas"].append(s.estimated_time_minutes)
        by_platform[s.platform]["fees"].append(s.delivery_fee)
        by_platform[s.platform]["count"] += 1
        if s.surge_active:
            by_platform[s.platform]["surges"] += 1
    comparison = {}
    for platform, data in by_platform.items():
        comparison[platform] = {
            "avg_eta_minutes": round(sum(data["etas"]) / len(data["etas"]), 1),
            "avg_delivery_fee": round(sum(data["fees"]) / len(data["fees"]), 2),
            "surge_frequency": f"{data['surges'] / data['count'] * 100:.1f}%",
            "consistency_score": round(
                (1 - (max(data["etas"]) - min(data["etas"])) / max(data["etas"])) * 100, 1
            ) if max(data["etas"]) > 0 else 100,
            "fastest_eta": min(data["etas"]),
            "slowest_eta": max(data["etas"])
        }
    return comparison
```

3. Delivery Fee Optimization Timing
Identify the cheapest delivery windows for cost-conscious customers:
```python
def find_optimal_order_windows(time_analysis):
    """Find time windows with lowest delivery fees and shortest ETAs."""
    scored_windows = []
    for hour, data in time_analysis.items():
        # Lower fee and ETA = better score
        fee_score = 100 - min(data["avg_delivery_fee"] * 10, 100)
        eta_score = 100 - min(data["avg_delivery_time"], 100)
        combined_score = fee_score * 0.4 + eta_score * 0.6
        scored_windows.append({
            "time": hour,
            "score": round(combined_score, 1),
            "avg_eta": data["avg_delivery_time"],
            "avg_fee": data["avg_delivery_fee"],
            "surge_probability": data["surge_probability"]
        })
    scored_windows.sort(key=lambda x: x["score"], reverse=True)
    return {
        "best_windows": scored_windows[:5],
        "worst_windows": scored_windows[-5:],
        "recommendation": f"Best ordering time: {scored_windows[0]['time']} "
                          f"(ETA: {scored_windows[0]['avg_eta']}min, "
                          f"Fee: {scored_windows[0]['avg_fee']})"
    }
```

Proxy Considerations
Delivery data monitoring requires frequent, sustained access to food delivery platform APIs. Key proxy requirements:
- High frequency access: Monitoring every 15-30 minutes requires proxies that handle sustained request patterns
- Location-specific results: Delivery ETAs depend on the customer’s location, requiring accurate geo-targeting
- Multi-platform support: Comparing performance across platforms needs reliable access to all targets
- Mobile carrier authenticity: Food delivery APIs validate that requests come from mobile devices
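The sustained-access requirement is easy to quantify. A back-of-envelope sketch for sizing proxy capacity, using illustrative numbers rather than any recommended plan:

```python
# Illustrative sizing: requests generated per day by a monitoring setup.
restaurants = 20         # restaurants being tracked (assumed)
platforms = 2            # e.g. GrabFood + Foodpanda
interval_minutes = 15    # polling interval from the monitoring loop above

cycles_per_day = 24 * 60 // interval_minutes          # 96 cycles
requests_per_day = restaurants * platforms * cycles_per_day
print(requests_per_day)  # 3840 requests/day
```

Even a modest setup generates thousands of geo-targeted requests per day, which is why short-lived or heavily rate-limited proxies tend to fail for this workload.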
DataResearchTools mobile proxies provide the sustained, geo-targeted access needed for delivery operations monitoring across Southeast Asia. Their mobile carrier IPs ensure that platform APIs respond with authentic delivery data, the same information real customers see when placing orders.
Conclusion
Delivery driver availability and wait time data is operational intelligence that directly impacts restaurant performance on food delivery platforms. By monitoring these metrics systematically with DataResearchTools mobile proxies, F&B operators can optimize kitchen prep schedules, compare platform performance, and identify the delivery patterns that affect their business.
Start by monitoring your own restaurant’s delivery metrics across platforms, then expand to track competitors and broader market patterns. The insights from delivery data help you anticipate demand, manage customer expectations, and make better decisions about which platforms to prioritize in Southeast Asia’s competitive food delivery landscape.
Related Reading
- Best Proxies for Food Delivery Platform Scraping
- How Cloud Kitchens Use Proxies for Competitive Menu Analysis
- aiohttp + BeautifulSoup: Async Python Scraping
- How to Scrape AliExpress Product Data Without Getting Blocked
- Amazon Buy Box Monitoring: Proxy Setup for Continuous Tracking
- How Anti-Bot Systems Detect Scrapers (Cloudflare, Akamai, PerimeterX)