Building a Food Trend Analysis Dashboard with Proxy-Powered Scraping
Food trends in Southeast Asia move fast. What is trending on GrabFood in Bangkok this week may sweep through Singapore and Jakarta next month. For F&B operators, investors, and market researchers, having a real-time view of these trends provides a decisive competitive advantage. This guide walks through building a food trend analysis dashboard powered by data scraped from food delivery platforms across Southeast Asia.
Dashboard Overview
What the Dashboard Tracks
A comprehensive food trend dashboard monitors:
| Trend Category | Data Points | Update Frequency |
|---|---|---|
| Cuisine popularity | New restaurants by cuisine, order volume proxies | Daily |
| Price movements | Average prices by cuisine and market | Weekly |
| Menu innovation | New menu items, seasonal offerings | Daily |
| Rating trends | Rating changes, review sentiment | Weekly |
| Delivery patterns | ETA trends, coverage expansion | Daily |
| Promotion intensity | Discount depth, voucher volume | Hourly |
| New entrants | New restaurant openings | Daily |
| Closures | Removed listings | Weekly |
Architecture
```
[Data Collection]     [Data Processing]     [Analytics]      [Presentation]

GrabFood     --->     ETL Pipeline    --->  Trend       ---> Dashboard
Foodpanda    --->     Data Cleaning   --->  Algorithms  ---> API
ShopeeFood   --->     Normalization   --->  Scoring     ---> Alerts
GoFood       --->     Storage         --->  Forecasting ---> Reports

           [Mobile Proxies via DataResearchTools]
```
Data Collection Layer
Multi-Platform Data Collector
```python
import requests
import time
import random
from datetime import datetime
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class MarketSnapshot:
    country: str
    city: str
    platform: str
    timestamp: datetime
    total_restaurants: int
    cuisine_breakdown: Dict[str, int]
    price_distribution: Dict[str, float]
    avg_rating: float
    new_restaurants: List[dict]
    promotional_count: int
    avg_delivery_time: int
    avg_delivery_fee: float


class TrendDataCollector:
    def __init__(self, proxy_user, proxy_pass):
        self.proxy_user = proxy_user
        self.proxy_pass = proxy_pass

    def _get_session(self, country):
        session = requests.Session()
        proxy_host = f"{country.lower()}-mobile.dataresearchtools.com"
        session.proxies = {
            "http": f"http://{self.proxy_user}:{self.proxy_pass}@{proxy_host}:8080",
            "https": f"http://{self.proxy_user}:{self.proxy_pass}@{proxy_host}:8080"
        }
        session.headers.update({
            "User-Agent": "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36",
            "Accept": "application/json"
        })
        return session

    def _fetch_restaurants(self, session, platform, lat, lng):
        """Platform-specific listing fetch -- endpoint details omitted here."""
        raise NotImplementedError

    def collect_market_snapshot(self, country, city, coordinates):
        """Collect a comprehensive market snapshot for a city."""
        session = self._get_session(country)
        lat, lng = coordinates
        # Collect from each platform
        platforms = ["grabfood", "foodpanda"]
        snapshots = []
        for platform in platforms:
            restaurants = self._fetch_restaurants(session, platform, lat, lng)
            if restaurants:
                snapshot = self._process_snapshot(
                    country, city, platform, restaurants
                )
                snapshots.append(snapshot)
            time.sleep(random.uniform(3, 6))
        return snapshots

    def _process_snapshot(self, country, city, platform, restaurants):
        """Process raw restaurant data into a market snapshot."""
        cuisine_count = {}
        ratings = []
        delivery_times = []
        delivery_fees = []
        promotional = 0
        for r in restaurants:
            # Cuisine tracking
            cuisine = r.get("cuisine_type", r.get("cuisines", ["Other"]))
            if isinstance(cuisine, list):
                for c in cuisine:
                    cuisine_count[c] = cuisine_count.get(c, 0) + 1
            else:
                cuisine_count[cuisine] = cuisine_count.get(cuisine, 0) + 1
            # Rating tracking
            if r.get("rating"):
                ratings.append(r["rating"])
            # Delivery metrics
            if r.get("delivery_time"):
                delivery_times.append(r["delivery_time"])
            if r.get("delivery_fee") is not None:
                delivery_fees.append(r["delivery_fee"])
            # Promotion tracking
            if r.get("promotions") or r.get("has_promotion"):
                promotional += 1
        return MarketSnapshot(
            country=country,
            city=city,
            platform=platform,
            timestamp=datetime.utcnow(),
            total_restaurants=len(restaurants),
            cuisine_breakdown=cuisine_count,
            price_distribution={},  # filled by the separate price pipeline
            avg_rating=round(sum(ratings) / len(ratings), 2) if ratings else 0,
            new_restaurants=[],
            promotional_count=promotional,
            avg_delivery_time=round(
                sum(delivery_times) / len(delivery_times)
            ) if delivery_times else 0,
            avg_delivery_fee=round(
                sum(delivery_fees) / len(delivery_fees), 2
            ) if delivery_fees else 0
        )
```
Scheduled Collection
```python
# Define collection targets
COLLECTION_TARGETS = {
    "SG": {
        "Singapore Central": (1.2900, 103.8500),
        "Singapore East": (1.3400, 103.9500),
        "Singapore West": (1.3400, 103.7400),
        "Singapore North": (1.4200, 103.8300)
    },
    "MY": {
        "KL Central": (3.1390, 101.6869),
        "Petaling Jaya": (3.1073, 101.6067),
        "Penang Georgetown": (5.4141, 100.3288)
    },
    "TH": {
        "Bangkok Central": (13.7563, 100.5018),
        "Bangkok Sukhumvit": (13.7310, 100.5673),
        "Chiang Mai": (18.7883, 98.9853)
    },
    "PH": {
        "Manila Makati": (14.5547, 121.0244),
        "Manila BGC": (14.5518, 121.0509),
        "Cebu": (10.3157, 123.8854)
    },
    "ID": {
        "Jakarta Central": (-6.2088, 106.8456),
        "Jakarta South": (-6.2615, 106.8106),
        "Bali Seminyak": (-8.6908, 115.1619)
    }
}


def run_daily_collection(collector):
    """Run daily market data collection across all targets."""
    daily_snapshots = []
    for country, cities in COLLECTION_TARGETS.items():
        for city_name, coords in cities.items():
            snapshots = collector.collect_market_snapshot(
                country, city_name, coords
            )
            daily_snapshots.extend(snapshots)
            print(f"Collected: {country}/{city_name} "
                  f"({len(snapshots)} platform snapshots)")
            time.sleep(random.uniform(5, 10))
    return daily_snapshots
```
Trend Detection Algorithms
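Every detector in this section follows the same pattern: compute a metric over a recent window, compute it over an earlier baseline window, and flag large relative changes. A minimal self-contained sketch of that comparison, using toy cuisine counts (all names and numbers here are hypothetical):

```python
# Toy cuisine counts for a baseline window and a recent window (hypothetical)
baseline_counts = {"Thai": 120, "Japanese": 80, "Korean": 20}
recent_counts = {"Thai": 118, "Japanese": 82, "Korean": 40}

def shares(counts):
    """Convert raw counts into market-share fractions."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

base, rec = shares(baseline_counts), shares(recent_counts)

# Relative change in market share, in percent, per cuisine
changes = {
    c: round((rec.get(c, 0) - base.get(c, 0)) / base[c] * 100, 1)
    for c in base
}
trending = [c for c, pct in changes.items() if pct > 10]
print(trending)  # → ['Korean']
```

Note that shares are compared rather than raw counts, so a cuisine can trend up even while every category grows in absolute terms.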
Cuisine Trend Detection
```python
def detect_cuisine_trends(snapshots_history, lookback_days=30, comparison_days=90):
    """Detect trending and declining cuisine categories."""
    now = datetime.utcnow()
    recent_cutoff = now - timedelta(days=lookback_days)
    baseline_cutoff = now - timedelta(days=comparison_days)
    # Separate recent and baseline data
    recent = [s for s in snapshots_history if s.timestamp >= recent_cutoff]
    baseline = [s for s in snapshots_history
                if baseline_cutoff <= s.timestamp < recent_cutoff]

    def avg_cuisine_share(snapshots):
        total_by_cuisine = {}
        total_restaurants = 0
        for s in snapshots:
            for cuisine, count in s.cuisine_breakdown.items():
                total_by_cuisine[cuisine] = total_by_cuisine.get(cuisine, 0) + count
                total_restaurants += count
        if total_restaurants == 0:
            return {}
        return {c: count / total_restaurants for c, count in total_by_cuisine.items()}

    recent_shares = avg_cuisine_share(recent)
    baseline_shares = avg_cuisine_share(baseline)
    trends = []
    all_cuisines = set(recent_shares) | set(baseline_shares)
    for cuisine in all_cuisines:
        recent_share = recent_shares.get(cuisine, 0)
        baseline_share = baseline_shares.get(cuisine, 0)
        if baseline_share > 0:
            change_pct = (recent_share - baseline_share) / baseline_share * 100
        elif recent_share > 0:
            change_pct = 100  # New cuisine category
        else:
            change_pct = 0
        trends.append({
            "cuisine": cuisine,
            "recent_share": f"{recent_share * 100:.1f}%",
            "baseline_share": f"{baseline_share * 100:.1f}%",
            "change_pct": round(change_pct, 1),
            "direction": "trending_up" if change_pct > 10 else
                         "trending_down" if change_pct < -10 else "stable",
            "is_new": baseline_share == 0 and recent_share > 0
        })
    return sorted(trends, key=lambda x: x["change_pct"], reverse=True)
```
New Restaurant Trend Detection
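The core of new-entrant detection is a pair of set differences over listing IDs from two scrape runs. A self-contained sketch with hypothetical IDs:

```python
# Listing IDs from two scrape runs a week apart (hypothetical)
previous = {"r1", "r2", "r3", "r4"}
current = {"r2", "r3", "r4", "r5", "r6"}

new_ids = current - previous     # appeared since the last run
closed_ids = previous - current  # disappeared since the last run
net_change = len(new_ids) - len(closed_ids)

print(sorted(new_ids), sorted(closed_ids), net_change)  # → ['r5', 'r6'] ['r1'] 1
```

Everything else in the function below this is enrichment: grouping the new IDs by cuisine and ranking the most notable entries and closures.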
```python
def detect_new_restaurant_trends(current_listings, previous_listings):
    """Identify new restaurants and categorize the trend."""
    current_ids = {r.get("id") for r in current_listings}
    previous_ids = {r.get("id") for r in previous_listings}
    new_ids = current_ids - previous_ids
    closed_ids = previous_ids - current_ids
    new_restaurants = [r for r in current_listings if r.get("id") in new_ids]
    closed_restaurants = [r for r in previous_listings if r.get("id") in closed_ids]
    # Analyze new restaurant patterns
    new_by_cuisine = {}
    for r in new_restaurants:
        cuisine = r.get("cuisine_type", "Other")
        if isinstance(cuisine, list):
            cuisine = cuisine[0] if cuisine else "Other"
        new_by_cuisine[cuisine] = new_by_cuisine.get(cuisine, 0) + 1
    return {
        "new_restaurants": len(new_restaurants),
        "closed_restaurants": len(closed_restaurants),
        "net_change": len(new_restaurants) - len(closed_restaurants),
        "new_by_cuisine": dict(sorted(
            new_by_cuisine.items(), key=lambda x: x[1], reverse=True
        )),
        "top_new_entries": [
            {
                "name": r.get("name"),
                "cuisine": r.get("cuisine_type"),
                "rating": r.get("rating"),
                "location": r.get("address")
            }
            for r in sorted(
                new_restaurants,
                key=lambda x: x.get("rating", 0),
                reverse=True
            )[:10]
        ],
        "notable_closures": [
            {
                "name": r.get("name"),
                "cuisine": r.get("cuisine_type"),
                "rating": r.get("rating"),
                "review_count": r.get("review_count")
            }
            for r in sorted(
                closed_restaurants,
                key=lambda x: x.get("review_count", 0),
                reverse=True
            )[:10]
        ]
    }
```
Price Trend Analysis
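Price trend analysis starts from a simple idea: bucket observed prices by calendar month, then compare monthly averages. A minimal sketch of the bucketing step (timestamps and prices are made up):

```python
from datetime import datetime

# Hypothetical price observations: (timestamp, price)
observations = [
    (datetime(2024, 1, 5), 8.50), (datetime(2024, 1, 20), 9.50),
    (datetime(2024, 2, 3), 9.00), (datetime(2024, 2, 18), 11.00),
]

# Bucket prices by a "YYYY-MM" month key
monthly = {}
for ts, price in observations:
    monthly.setdefault(ts.strftime("%Y-%m"), []).append(price)

averages = {m: round(sum(p) / len(p), 2) for m, p in sorted(monthly.items())}
print(averages)  # → {'2024-01': 9.0, '2024-02': 10.0}
```

The full function below adds medians, ranges, and a direction estimate on top of these monthly buckets.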
```python
import statistics


def analyze_price_trends(price_history, segment="all"):
    """Analyze price trends over time."""
    # Group prices by month
    monthly_prices = {}
    for entry in price_history:
        month_key = entry["timestamp"].strftime("%Y-%m")
        monthly_prices.setdefault(month_key, []).append(entry["price"])
    # Calculate monthly statistics
    trend_data = []
    for month, prices in sorted(monthly_prices.items()):
        trend_data.append({
            "month": month,
            "avg_price": round(sum(prices) / len(prices), 2),
            "median_price": statistics.median(prices),
            "min_price": min(prices),
            "max_price": max(prices),
            "data_points": len(prices)
        })
    # Calculate trend direction and rate
    if len(trend_data) >= 3:
        first_three_avg = sum(t["avg_price"] for t in trend_data[:3]) / 3
        last_three_avg = sum(t["avg_price"] for t in trend_data[-3:]) / 3
        trend_direction = ("increasing" if last_three_avg > first_three_avg * 1.02
                           else "decreasing" if last_three_avg < first_three_avg * 0.98
                           else "stable")
        monthly_change_rate = round(
            (last_three_avg - first_three_avg) / first_three_avg /
            len(trend_data) * 100, 2
        )
    else:
        trend_direction = "insufficient_data"
        monthly_change_rate = 0
    return {
        "segment": segment,
        "trend_data": trend_data,
        "trend_direction": trend_direction,
        "monthly_change_rate": f"{monthly_change_rate}%",
        "period": (f"{trend_data[0]['month']} to {trend_data[-1]['month']}"
                   if trend_data else "")
    }
```
Dashboard Components
Market Overview Panel
```python
def generate_market_overview(latest_snapshots):
    """Generate data for the market overview panel."""
    by_country = {}
    for snapshot in latest_snapshots:
        country = snapshot.country
        if country not in by_country:
            by_country[country] = {
                "total_restaurants": 0,
                "cities_covered": set(),
                "platforms": set(),
                "avg_rating": [],
                "avg_delivery_time": [],
                "promotion_rate": []
            }
        by_country[country]["total_restaurants"] += snapshot.total_restaurants
        by_country[country]["cities_covered"].add(snapshot.city)
        by_country[country]["platforms"].add(snapshot.platform)
        by_country[country]["avg_rating"].append(snapshot.avg_rating)
        by_country[country]["avg_delivery_time"].append(snapshot.avg_delivery_time)
        if snapshot.total_restaurants > 0:
            by_country[country]["promotion_rate"].append(
                snapshot.promotional_count / snapshot.total_restaurants
            )
    overview = {}
    for country, data in by_country.items():
        overview[country] = {
            "total_restaurants": data["total_restaurants"],
            "cities": len(data["cities_covered"]),
            "platforms": len(data["platforms"]),
            "avg_rating": round(
                sum(data["avg_rating"]) / len(data["avg_rating"]), 2
            ) if data["avg_rating"] else 0,
            "avg_delivery_time_min": round(
                sum(data["avg_delivery_time"]) / len(data["avg_delivery_time"])
            ) if data["avg_delivery_time"] else 0,
            "promotion_rate": (
                f"{sum(data['promotion_rate']) / len(data['promotion_rate']) * 100:.1f}%"
                if data["promotion_rate"] else "0%"
            )
        }
    return overview
```
Trend Alerts Panel
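Alert generation is threshold-based: a change percentage maps to an alert type and severity. A self-contained sketch of that mapping (the ±25% cutoffs match those used in this section; the function name is illustrative):

```python
def classify_change(change_pct, threshold=25):
    """Map a market-share change percentage to an alert type and severity."""
    if change_pct > threshold:
        return ("trending_cuisine", "info")
    if change_pct < -threshold:
        return ("declining_cuisine", "warning")
    return (None, None)  # no alert for small movements

print(classify_change(40))   # → ('trending_cuisine', 'info')
print(classify_change(-30))  # → ('declining_cuisine', 'warning')
print(classify_change(5))    # → (None, None)
```

Keeping the thresholds in one place like this makes them easy to tune as you learn how noisy each market's data is.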
```python
def generate_trend_alerts(cuisine_trends, price_trends, new_entries):
    """Generate alert notifications for significant trends."""
    alerts = []
    # Cuisine trend alerts
    for trend in cuisine_trends:
        if trend["change_pct"] > 25:
            alerts.append({
                "type": "trending_cuisine",
                "severity": "info",
                "title": f"{trend['cuisine']} is trending up",
                "message": f"{trend['cuisine']} cuisine grew {trend['change_pct']}% "
                           f"in market share over the past month",
                "action": f"Consider adding {trend['cuisine']} options to your menu"
            })
        elif trend["change_pct"] < -25:
            alerts.append({
                "type": "declining_cuisine",
                "severity": "warning",
                "title": f"{trend['cuisine']} is declining",
                "message": f"{trend['cuisine']} cuisine dropped {abs(trend['change_pct'])}% "
                           f"in market share",
                "action": "Review your menu items in this category"
            })
    # New cuisine category alerts
    for trend in cuisine_trends:
        if trend["is_new"]:
            alerts.append({
                "type": "new_cuisine",
                "severity": "info",
                "title": f"New cuisine category: {trend['cuisine']}",
                "message": f"{trend['cuisine']} cuisine has appeared in the market",
                "action": "Monitor for growth potential"
            })
    # Market growth/contraction alerts
    net_change = new_entries.get("net_change", 0)
    if net_change > 50:
        alerts.append({
            "type": "market_growth",
            "severity": "info",
            "title": "Significant market growth detected",
            "message": f"{net_change} net new restaurants added",
            "action": "Market is expanding - consider accelerating growth plans"
        })
    elif net_change < -20:
        alerts.append({
            "type": "market_contraction",
            "severity": "warning",
            "title": "Market contraction detected",
            "message": f"{abs(net_change)} net restaurants lost",
            "action": "Market may be saturated - focus on differentiation"
        })
    # Surface warnings before informational alerts
    return sorted(alerts, key=lambda x: {"warning": 0, "info": 1}[x["severity"]])
```
Cross-Market Comparison Panel
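Cross-market rankings reduce to `max`/`min` over a per-market metric dictionary. A small sketch with made-up delivery times:

```python
# Hypothetical average delivery times per market, in minutes
delivery_time = {"SG": 28, "MY": 33, "TH": 31, "PH": 38, "ID": 35}

fastest = min(delivery_time, key=delivery_time.get)
slowest = max(delivery_time, key=delivery_time.get)
print(fastest, slowest)  # → SG PH
```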
```python
def generate_cross_market_comparison(overview_data):
    """Generate cross-market comparison data for the dashboard."""
    markets = list(overview_data.keys())
    comparison = {
        "markets": markets,
        "metrics": {
            "restaurant_density": {
                m: overview_data[m]["total_restaurants"] for m in markets
            },
            "avg_rating": {
                m: overview_data[m]["avg_rating"] for m in markets
            },
            "delivery_speed": {
                m: overview_data[m]["avg_delivery_time_min"] for m in markets
            },
            "promotion_intensity": {
                m: overview_data[m]["promotion_rate"] for m in markets
            }
        },
        "rankings": {
            "most_restaurants": max(
                markets, key=lambda m: overview_data[m]["total_restaurants"]
            ),
            "highest_rated": max(
                markets, key=lambda m: overview_data[m]["avg_rating"]
            ),
            "fastest_delivery": min(
                markets, key=lambda m: overview_data[m]["avg_delivery_time_min"]
            ),
            "most_promotions": max(
                markets,
                key=lambda m: float(overview_data[m]["promotion_rate"].rstrip("%"))
            )
        }
    }
    return comparison
```
Building the Dashboard Frontend
API Endpoints
```python
from flask import Flask, jsonify, request

# Storage helpers (get_latest_snapshots, get_snapshot_history, get_price_history,
# get_current_listings, get_previous_listings) are assumed to be implemented
# against the database schema described in the Data Storage section.

app = Flask(__name__)


@app.route("/api/v1/overview")
def market_overview():
    """Market overview endpoint."""
    latest = get_latest_snapshots()
    return jsonify(generate_market_overview(latest))


@app.route("/api/v1/trends/cuisine")
def cuisine_trends():
    """Cuisine trend endpoint."""
    days = request.args.get("days", 30, type=int)
    history = get_snapshot_history(days=days + 90)
    trends = detect_cuisine_trends(history, lookback_days=days)
    return jsonify(trends)


@app.route("/api/v1/trends/price")
def price_trends():
    """Price trend endpoint."""
    country = request.args.get("country", "SG")
    cuisine = request.args.get("cuisine", "all")
    history = get_price_history(country=country, cuisine=cuisine)
    return jsonify(analyze_price_trends(history, segment=cuisine))


@app.route("/api/v1/trends/new-restaurants")
def new_restaurants():
    """New restaurant entries endpoint."""
    country = request.args.get("country", "SG")
    current = get_current_listings(country)
    previous = get_previous_listings(country, days_ago=7)
    return jsonify(detect_new_restaurant_trends(current, previous))


@app.route("/api/v1/alerts")
def trend_alerts():
    """Trend alerts endpoint."""
    cuisine = detect_cuisine_trends(get_snapshot_history())
    prices = analyze_price_trends(get_price_history())
    entries = detect_new_restaurant_trends(
        get_current_listings(), get_previous_listings()
    )
    return jsonify(generate_trend_alerts(cuisine, prices, entries))


@app.route("/api/v1/compare")
def cross_market():
    """Cross-market comparison endpoint."""
    latest = get_latest_snapshots()
    overview = generate_market_overview(latest)
    return jsonify(generate_cross_market_comparison(overview))
```
Dashboard Layout
A practical dashboard layout for food trend analysis:
```
+------------------------------------------+
|              MARKET OVERVIEW             |
|      [SG]  [MY]  [TH]  [PH]  [ID]        |
| Restaurants | Avg Rating | Delivery Time |
+------------------------------------------+
|                     |                    |
|  CUISINE TRENDS     |  PRICE TRENDS      |
|  [Rising]           |  [Chart: 6 months] |
|  [Declining]        |  [By Cuisine]      |
|  [New Categories]   |  [By Country]      |
|                     |                    |
+------------------------------------------+
|                     |                    |
|  NEW ENTRANTS       |  ALERTS            |
|  [Top 10 New]       |  [Trending alerts] |
|  [Notable Closures] |  [Warning alerts]  |
|  [By Cuisine]       |  [Action items]    |
|                     |                    |
+------------------------------------------+
|          CROSS-MARKET COMPARISON         |
| [Density] [Ratings] [Speed] [Promotions] |
+------------------------------------------+
```
Data Storage
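The schema in this section targets PostgreSQL (`SERIAL`, `DECIMAL`). For local prototyping, the same shape works in SQLite via Python's built-in `sqlite3` module. A hedged sketch with hypothetical snapshot rows:

```python
import sqlite3

# In-memory SQLite stand-in for the PostgreSQL snapshots table
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE market_snapshots (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        country TEXT, city TEXT, platform TEXT,
        ts TEXT, total_restaurants INTEGER, avg_rating REAL
    )
""")
rows = [
    ("SG", "Singapore Central", "grabfood", "2024-06-01", 1450, 4.31),
    ("SG", "Singapore Central", "grabfood", "2024-06-02", 1462, 4.30),
]
conn.executemany(
    "INSERT INTO market_snapshots "
    "(country, city, platform, ts, total_restaurants, avg_rating) "
    "VALUES (?, ?, ?, ?, ?, ?)", rows
)

# Daily restaurant counts for one market -- the query a trend chart would run
counts = conn.execute(
    "SELECT ts, total_restaurants FROM market_snapshots "
    "WHERE country = 'SG' ORDER BY ts"
).fetchall()
print(counts)
```

Once the queries stabilize, porting them to the PostgreSQL schema below is mostly a matter of swapping column types.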
Time-Series Database Design
```sql
CREATE TABLE market_snapshots (
    id SERIAL PRIMARY KEY,
    country VARCHAR(2),
    city VARCHAR(100),
    platform VARCHAR(50),
    timestamp TIMESTAMP,
    total_restaurants INTEGER,
    avg_rating DECIMAL(3,2),
    avg_delivery_time INTEGER,
    avg_delivery_fee DECIMAL(10,2),
    promotional_count INTEGER,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE cuisine_snapshots (
    id SERIAL PRIMARY KEY,
    snapshot_id INTEGER REFERENCES market_snapshots(id),
    cuisine VARCHAR(100),
    restaurant_count INTEGER,
    share_pct DECIMAL(5,2)
);

CREATE TABLE price_history (
    id SERIAL PRIMARY KEY,
    country VARCHAR(2),
    city VARCHAR(100),
    platform VARCHAR(50),
    cuisine VARCHAR(100),
    item_name VARCHAR(255),
    price DECIMAL(10,2),
    currency VARCHAR(3),
    timestamp TIMESTAMP
);

CREATE INDEX idx_snapshots_country_date ON market_snapshots(country, timestamp);
CREATE INDEX idx_prices_country_cuisine ON price_history(country, cuisine, timestamp);
```
Proxy Infrastructure for Trend Monitoring
A food trend dashboard requires sustained, reliable data collection across multiple countries and platforms. Key proxy requirements:
- Multi-country coverage: Simultaneous collection from SG, MY, TH, PH, and ID
- Daily reliability: Trend detection depends on consistent daily snapshots
- High success rates: Missing data creates gaps in trend analysis
- Scalability: As you add cities and platforms, proxy capacity must scale
DataResearchTools mobile proxies are designed for exactly this kind of sustained, multi-market data collection. Their Southeast Asian mobile carrier IPs provide the geographic coverage and platform trust needed to maintain continuous food delivery data pipelines across the region.
Conclusion
A food trend analysis dashboard transforms raw food delivery platform data into strategic intelligence. By combining proxy-powered scraping with trend detection algorithms and clear visualization, F&B operators and analysts gain real-time visibility into Southeast Asia’s dynamic food market.
The foundation of any trend dashboard is reliable data collection. DataResearchTools mobile proxies provide the infrastructure needed to collect consistent, accurate data from food delivery platforms across all major SEA markets. Build your collection pipeline first, add trend detection logic, and then layer on the dashboard and alerting components to create a comprehensive food intelligence system.
Start with one country and one platform, validate your trend detection algorithms against known market movements, and expand coverage as your dashboard proves its value. Over time, the historical data you accumulate becomes increasingly valuable for identifying long-term trends and predicting future market movements.
Related Reading
- Best Proxies for Food Delivery Platform Scraping
- How Cloud Kitchens Use Proxies for Competitive Menu Analysis
- aiohttp + BeautifulSoup: Async Python Scraping
- How to Scrape AliExpress Product Data Without Getting Blocked
- Amazon Buy Box Monitoring: Proxy Setup for Continuous Tracking
- How Anti-Bot Systems Detect Scrapers (Cloudflare, Akamai, PerimeterX)