Menu Engineering with Web Scraping: What Competitors Charge and Why
Menu engineering is the science of designing a menu to maximize profitability. Traditionally, it relied on internal data: food costs, contribution margins, and sales volume. But in the age of food delivery, where your menu sits alongside dozens of competitors on the same screen, external competitive data is equally important.
By scraping competitor menus from food delivery platforms, restaurants can understand pricing norms, identify profitable positioning opportunities, and design menus that outperform the competition. This guide covers how to collect and use competitor menu data for strategic menu engineering.
What Is Menu Engineering?
The Classic Framework
Menu engineering categorizes items into four quadrants based on popularity and profitability:
| Category | Popularity | Profitability | Strategy |
|---|---|---|---|
| Stars | High | High | Promote heavily |
| Plowhorses | High | Low | Increase price or reduce cost |
| Puzzles | Low | High | Improve visibility |
| Dogs | Low | Low | Consider removing |
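The quadrant split above can be sketched in a few lines. This is a minimal illustration, assuming internal sales data in the form of order counts and contribution margins per item (the field names are illustrative, not from any particular POS export); median thresholds are a common way to draw the popularity and profitability cut-offs:

```python
def classify_items(items):
    """Assign each item to a menu-engineering quadrant.

    items: list of dicts with "name", "orders" (popularity proxy),
    and "margin" (contribution margin per unit).
    """
    orders = sorted(i["orders"] for i in items)
    margins = sorted(i["margin"] for i in items)
    med_orders = orders[len(orders) // 2]
    med_margin = margins[len(margins) // 2]
    result = {}
    for i in items:
        popular = i["orders"] >= med_orders
        profitable = i["margin"] >= med_margin
        result[i["name"]] = (
            "Star" if popular and profitable
            else "Plowhorse" if popular
            else "Puzzle" if profitable
            else "Dog"
        )
    return result
```

In practice you would weight the thresholds by category and menu size, but median splits are a reasonable starting point.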
The Missing Data: Competitive Context
Traditional menu engineering only uses internal data. But in food delivery, context matters:
- Is your “Star” item priced 30% higher than competitors? It might not stay popular
- Is your “Puzzle” actually priced too high compared to market? That is why nobody orders it
- Are competitors making their “Plowhorses” work by using different portion strategies?
- Could your “Dogs” be transformed by observing what competitors do with similar items?
Competitive menu data fills these gaps.
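One minimal way to join the two data sources is to attach a market-average price to each internally classified item and flag the combinations called out above. The field names and thresholds here are hypothetical, chosen only to illustrate the idea:

```python
def flag_mispriced(items):
    """Flag quadrant/price-index combinations worth reviewing.

    items: dicts with "name", "quadrant" (from internal analysis),
    "price", and "market_avg" (from scraped competitor data).
    """
    flags = []
    for it in items:
        # Price index: 100 means priced exactly at market average
        idx = it["price"] / it["market_avg"] * 100
        if it["quadrant"] == "Star" and idx > 130:
            flags.append((it["name"], "Star priced 30%+ above market; popularity at risk"))
        elif it["quadrant"] == "Puzzle" and idx > 115:
            flags.append((it["name"], "Puzzle may be overpriced vs market"))
    return flags
```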
Collecting Competitor Menu Data
What to Collect
For each competitor, gather:
```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CompetitorMenuItem:
    restaurant_name: str
    restaurant_id: str
    platform: str
    category: str
    item_name: str
    description: str
    price: float
    original_price: float      # Price before discounts
    currency: str
    has_photo: bool
    photo_quality: str         # "professional", "amateur", or "none"
    position_in_category: int  # 1 = first item shown
    modifiers_available: bool
    modifier_groups: list
    is_bestseller_tagged: bool
    is_promoted: bool
    portion_indicators: list   # Words suggesting portion size
    scraped_at: datetime
```
Multi-Platform Collection
Collect from all platforms where your competitors are present:
```python
import random
import time

import requests


class MenuDataCollector:
    def __init__(self, proxy_user, proxy_pass):
        self.proxy_user = proxy_user
        self.proxy_pass = proxy_pass

    def collect_competitor_menus(self, competitor_ids, country="SG"):
        """Collect menus for a list of competitors across platforms."""
        all_menus = {}
        session = self._get_session(country)
        platforms = {
            "grabfood": self._scrape_grabfood_menu,
            "foodpanda": self._scrape_foodpanda_menu,
            "shopeefood": self._scrape_shopeefood_menu,
        }
        for comp_id, comp_info in competitor_ids.items():
            all_menus[comp_id] = {"name": comp_info["name"], "platforms": {}}
            for platform, scrape_func in platforms.items():
                platform_id = comp_info.get(f"{platform}_id")
                if platform_id:
                    menu = scrape_func(session, platform_id)
                    if menu:
                        all_menus[comp_id]["platforms"][platform] = menu
                    time.sleep(random.uniform(2, 4))  # Polite delay between requests
        return all_menus

    def _get_session(self, country):
        session = requests.Session()
        proxy_host = f"{country.lower()}-mobile.dataresearchtools.com"
        proxy_url = f"http://{self.proxy_user}:{self.proxy_pass}@{proxy_host}:8080"
        session.proxies = {"http": proxy_url, "https": proxy_url}
        session.headers.update({
            "User-Agent": "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36",
            "Accept": "application/json",
        })
        return session
```
Analyzing Competitor Menu Structure
Category Structure Analysis
How competitors organize their menus reveals strategic thinking:
```python
def analyze_menu_structure(competitor_menus):
    """Analyze how competitors structure their menu categories."""
    structures = []
    for comp_id, data in competitor_menus.items():
        for platform, menu_items in data["platforms"].items():
            categories = {}
            for item in menu_items:
                cat = item.get("category", "Uncategorized")
                if cat not in categories:
                    categories[cat] = {
                        "item_count": 0,
                        "prices": [],
                        "has_photos": 0,
                        "promoted_items": 0,
                    }
                categories[cat]["item_count"] += 1
                categories[cat]["prices"].append(item.get("price", 0))
                if item.get("has_photo"):
                    categories[cat]["has_photos"] += 1
                if item.get("is_promoted"):
                    categories[cat]["promoted_items"] += 1
            structures.append({
                "restaurant": data["name"],
                "platform": platform,
                "total_categories": len(categories),
                "total_items": sum(c["item_count"] for c in categories.values()),
                "categories": {
                    name: {
                        "items": c["item_count"],
                        "avg_price": round(sum(c["prices"]) / len(c["prices"]), 2),
                        "photo_rate": round(c["has_photos"] / c["item_count"] * 100, 1),
                        "promoted_rate": round(c["promoted_items"] / c["item_count"] * 100, 1),
                    }
                    for name, c in categories.items()
                },
            })
    return structures
```
First-Item Analysis
The first item in each category gets the most visibility. Analyze what competitors put first:
```python
def analyze_first_items(competitor_menus):
    """Analyze the first item in each category across competitors."""
    first_items = []
    for comp_id, data in competitor_menus.items():
        for platform, menu_items in data["platforms"].items():
            # The first item encountered per category is the first one listed
            categories_seen = set()
            for item in menu_items:
                cat = item.get("category", "")
                if cat not in categories_seen:
                    categories_seen.add(cat)
                    first_items.append({
                        "restaurant": data["name"],
                        "category": cat,
                        "first_item": item["item_name"],
                        "price": item["price"],
                        "has_photo": item.get("has_photo", False),
                        "is_bestseller": item.get("is_bestseller_tagged", False),
                        "description_length": len(item.get("description", "")),
                    })
    # Patterns in first-item selection
    total = len(first_items)
    patterns = {
        "pct_with_photos": round(
            sum(1 for i in first_items if i["has_photo"]) / total * 100, 1),
        "pct_bestsellers": round(
            sum(1 for i in first_items if i["is_bestseller"]) / total * 100, 1),
        "avg_price": round(sum(i["price"] for i in first_items) / total, 2),
        "avg_description_length": round(
            sum(i["description_length"] for i in first_items) / total),
    }
    return {"first_items": first_items, "patterns": patterns}
```
Price Architecture Analysis
Price Point Distribution
```python
def analyze_price_architecture(competitor_menus, your_menu):
    """Compare your price architecture against competitors."""
    your_prices = [item["price"] for item in your_menu]
    competitor_prices = [
        item["price"]
        for data in competitor_menus.values()
        for items in data["platforms"].values()
        for item in items
    ]

    def price_distribution(prices):
        sorted_prices = sorted(prices)
        n = len(sorted_prices)
        return {
            "min": sorted_prices[0],
            "q1": sorted_prices[n // 4],
            "median": sorted_prices[n // 2],
            "q3": sorted_prices[3 * n // 4],
            "max": sorted_prices[-1],
            "mean": round(sum(sorted_prices) / n, 2),
        }

    your_avg = sum(your_prices) / len(your_prices)
    market_avg = sum(competitor_prices) / len(competitor_prices)
    return {
        "your_distribution": price_distribution(your_prices),
        "market_distribution": price_distribution(competitor_prices),
        "price_index": round(your_avg / market_avg * 100, 1),  # 100 = at market
        "interpretation": "Your prices are "
        + ("above" if your_avg > market_avg else "below")
        + " market average",
    }
```
Anchor Pricing Detection
Identify whether competitors use pricing psychology:
```python
def detect_pricing_psychology(menu_items):
    """Detect pricing psychology tactics in competitor menus."""
    tactics = {
        "charm_pricing": 0,     # $9.90, $14.99
        "round_pricing": 0,     # $10, $15
        "prestige_pricing": 0,  # $12, $18 (whole dollars, not multiples of 5)
        "decoy_items": [],      # Unusually expensive items
    }
    prices = [item["price"] for item in menu_items]
    for price in prices:
        decimal_part = round(price - int(price), 2)  # Round to avoid float drift
        if decimal_part >= 0.89:  # .90, .95, .99
            tactics["charm_pricing"] += 1
        elif decimal_part == 0:
            if price % 5 == 0:
                tactics["round_pricing"] += 1
            else:
                tactics["prestige_pricing"] += 1
    # Detect decoy pricing (items priced 2x+ above category average)
    by_category = {}
    for item in menu_items:
        by_category.setdefault(item.get("category", ""), []).append(item)
    for cat, items in by_category.items():
        avg = sum(i["price"] for i in items) / len(items)
        for item in items:
            if item["price"] > avg * 2:
                tactics["decoy_items"].append({
                    "item": item["item_name"],
                    "price": item["price"],
                    "category_avg": round(avg, 2),
                    "multiple": round(item["price"] / avg, 1),
                })
    total = len(prices)
    tactics["charm_pricing_pct"] = round(tactics["charm_pricing"] / total * 100, 1)
    tactics["round_pricing_pct"] = round(tactics["round_pricing"] / total * 100, 1)
    return tactics
```
Item-Level Competitive Analysis
Similar Item Price Comparison
```python
from difflib import SequenceMatcher


def compare_similar_items(your_items, competitor_items, category=None):
    """Find and compare similarly named items across restaurants."""
    comparisons = []
    target_items = your_items if category is None else [
        i for i in your_items if i.get("category") == category
    ]
    for your_item in target_items:
        matches = []
        for comp_item in competitor_items:
            similarity = SequenceMatcher(
                None,
                your_item["item_name"].lower(),
                comp_item["item_name"].lower(),
            ).ratio()
            if similarity >= 0.6:
                matches.append({
                    "competitor": comp_item.get("restaurant_name"),
                    "item_name": comp_item["item_name"],
                    "price": comp_item["price"],
                    "similarity": round(similarity, 2),
                    "has_photo": comp_item.get("has_photo", False),
                    "description_length": len(comp_item.get("description", "")),
                })
        if matches:
            match_prices = [m["price"] for m in matches]
            market_avg = sum(match_prices) / len(match_prices)
            comparisons.append({
                "your_item": your_item["item_name"],
                "your_price": your_item["price"],
                "competitor_matches": sorted(matches, key=lambda m: m["price"]),
                "market_avg": round(market_avg, 2),
                "market_min": min(match_prices),
                "market_max": max(match_prices),
                "your_position": (
                    "cheapest" if your_item["price"] <= min(match_prices)
                    else "most expensive" if your_item["price"] >= max(match_prices)
                    else "mid-range"
                ),
                "price_vs_avg": f"{round((your_item['price'] / market_avg - 1) * 100, 1)}%",
            })
    return comparisons
```
Description Quality Analysis
```python
def analyze_descriptions(menu_items):
    """Analyze quality and patterns in menu item descriptions."""
    stats = {
        "total_items": len(menu_items),
        "items_with_description": 0,
        "avg_description_length": 0,
        "uses_cooking_method": 0,
        "uses_origin": 0,
        "uses_sensory_words": 0,
    }
    sensory_words = ["crispy", "tender", "juicy", "creamy", "smoky", "tangy",
                     "spicy", "rich", "fresh", "aromatic", "crunchy", "silky",
                     "fluffy", "zesty", "savory"]
    cooking_methods = ["grilled", "fried", "steamed", "baked", "roasted",
                       "braised", "stir-fried", "smoked", "slow-cooked",
                       "pan-seared", "charcoal"]
    origin_words = ["traditional", "authentic", "homemade", "signature",
                    "classic", "artisan", "local", "imported", "premium"]
    lengths = []
    for item in menu_items:
        desc = item.get("description", "")
        if desc:
            stats["items_with_description"] += 1
            lengths.append(len(desc))
            desc_lower = desc.lower()
            if any(w in desc_lower for w in sensory_words):
                stats["uses_sensory_words"] += 1
            if any(w in desc_lower for w in cooking_methods):
                stats["uses_cooking_method"] += 1
            if any(w in desc_lower for w in origin_words):
                stats["uses_origin"] += 1
    if lengths:
        stats["avg_description_length"] = round(sum(lengths) / len(lengths))
    stats["description_rate"] = (
        f"{stats['items_with_description'] / len(menu_items) * 100:.1f}%"
    )
    return stats
```
Actionable Menu Engineering Recommendations
Generating Recommendations
```python
def generate_menu_recommendations(your_menu, competitive_analysis, sales_data=None):
    """Generate actionable menu engineering recommendations."""
    recommendations = []
    for comparison in competitive_analysis:
        your_price = comparison["your_price"]
        market_avg = comparison["market_avg"]
        # Price adjustment recommendations
        if your_price > market_avg * 1.2:
            recommendations.append({
                "item": comparison["your_item"],
                "type": "price_review",
                "priority": "high",
                "finding": f"Priced {round((your_price / market_avg - 1) * 100)}% above market average",
                "suggestion": f"Consider reducing to {round(market_avg * 1.05, 2)}-{round(market_avg * 1.15, 2)}, "
                              f"or enhance perceived value with better photos and descriptions",
            })
        elif your_price < market_avg * 0.8:
            recommendations.append({
                "item": comparison["your_item"],
                "type": "price_opportunity",
                "priority": "medium",
                "finding": f"Priced {round((1 - your_price / market_avg) * 100)}% below market average",
                "suggestion": f"Potential to increase price to {round(market_avg * 0.95, 2)} "
                              f"without losing competitiveness",
            })
        # Photo recommendations
        matches = comparison["competitor_matches"]
        competitor_photo_rate = (
            sum(1 for m in matches if m.get("has_photo")) / len(matches)
            if matches else 0
        )
        your_has_photo = next(
            (item.get("has_photo") for item in your_menu
             if item["item_name"] == comparison["your_item"]),
            False,
        )
        if not your_has_photo and competitor_photo_rate > 0.5:
            recommendations.append({
                "item": comparison["your_item"],
                "type": "add_photo",
                "priority": "high",
                "finding": f"{round(competitor_photo_rate * 100)}% of competitors have photos for similar items",
                "suggestion": "Add a professional food photo to increase click-through rate",
            })
    priority_order = {"high": 0, "medium": 1, "low": 2}
    return sorted(recommendations, key=lambda r: priority_order[r["priority"]])
```
Proxy Requirements for Menu Engineering
Menu engineering data collection requires:
- Accurate location targeting: Menu pricing varies by location, requiring country-specific mobile proxies
- Multi-platform access: Compare menus across GrabFood, Foodpanda, and ShopeeFood
- Consistent access: Regular menu snapshots for trend tracking
- High success rate: Missing data leads to incomplete competitive analysis
DataResearchTools mobile proxies provide the reliable, geo-targeted access needed to collect comprehensive competitor menu data across all major SEA food delivery platforms.
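The "regular menu snapshots" requirement is straightforward to operationalize: keep a dated price map per competitor and diff consecutive snapshots. A minimal sketch, assuming each snapshot is stored as a simple item-name-to-price dict:

```python
def diff_menu_snapshots(old, new):
    """Compare two menu snapshots taken on different days.

    old/new: {item_name: price}. Returns a dict of changes keyed
    by item name, tagged as added, removed, or price_change.
    """
    changes = {}
    for name, new_price in new.items():
        old_price = old.get(name)
        if old_price is None:
            changes[name] = ("added", new_price)
        elif old_price != new_price:
            changes[name] = ("price_change", old_price, new_price)
    for name, old_price in old.items():
        if name not in new:
            changes[name] = ("removed", old_price)
    return changes
```

Run daily or weekly, this turns one-off scrapes into a trend series: you see when a competitor raises prices, retires a dish, or launches a new category.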
Conclusion
Menu engineering powered by competitive web scraping data transforms menu design from guesswork into science. By collecting and analyzing competitor menus using DataResearchTools mobile proxies, restaurants can optimize pricing, improve item positioning, and make data-driven decisions about menu changes.
The most successful F&B operators in Southeast Asia treat their menu as a living document, continuously refined based on competitive intelligence. Start by scraping your top five competitors, run the analysis, and identify the quick wins: pricing adjustments, missing photos, and description improvements that can boost revenue without changing a single recipe.
Related Reading
- Best Proxies for Food Delivery Platform Scraping
- How Cloud Kitchens Use Proxies for Competitive Menu Analysis
- aiohttp + BeautifulSoup: Async Python Scraping
- How to Scrape AliExpress Product Data Without Getting Blocked
- Amazon Buy Box Monitoring: Proxy Setup for Continuous Tracking
- How Anti-Bot Systems Detect Scrapers (Cloudflare, Akamai, PerimeterX)