---
name: local-market-research
slug: local-market-research
version: 1.0.0
description: Keyword research and competitor analysis for local service businesses using real data sources Mary actually has access to - Google Keyword Planner (via Ads API), Google Search Console, Google Ads Auction Insights, and SERP scraping. Covers PPC keyword discovery, local pack competitor analysis, search volume and CPC data, what competitors are bidding on, and identifying gaps. Designed for home services (tree care, HVAC, contracting) where the goal is more leads, not more blog traffic.
---
# Local Market Research
Keyword research and competitor analysis using real data. Not generic SEO - this is for finding what Tulsa homeowners actually search for when they need a tree service, what competitors are doing in paid and organic, and where Ironwood has gaps or opportunities.
## When to Use
- Finding new keywords to target in Google Ads
- Identifying negative keywords from search term reports
- Analyzing what competitors are bidding on (auction insights)
- Discovering which organic keywords Ironwood already ranks for (GSC)
- Understanding the local pack competitive landscape for key terms
- Seasonal keyword planning (storm season, spring cleanup, etc.)
- Researching a new service line or geographic expansion
## Data Sources Available
| Source | What It Gives You | Access |
|--------|------------------|--------|
| Google Keyword Planner | Search volume, CPC estimates, keyword ideas | Google Ads API (`mary-google-ads.yaml`) |
| Google Search Console | Ironwood's actual impressions, clicks, position, CTR by query | Service account `gsc-service-account.json` |
| Google Ads Search Terms | Exact queries that triggered Ironwood's ads | Google Ads API |
| Google Ads Auction Insights | Who else is bidding on the same keywords, their impression share | Google Ads API |
| SERP scraping | Who's in the Local Pack, what organic results look like | WebFetch on target queries |
**No Ahrefs, no SEMrush.** Work with what's real and available.
## Workflow
### Keyword Research Workflow
1. **Pull GSC data** - what queries is Ironwood already getting impressions for? High impressions / low CTR = priority to fix. Position 11-20 (page 2) = quick-win opportunity.
2. **Pull search term report** from Google Ads - what are people actually typing that triggered ads? This is real demand data.
3. **Run Keyword Planner** on seed terms - get volume and CPC estimates for terms we're NOT targeting yet.
4. **Identify gaps** - queries with volume and intent that Ironwood isn't running ads or ranking for.
5. **Categorize by intent** - emergency (highest urgency/value), routine (trimming, pruning), seasonal, informational.
6. **Prioritize** - by volume × intent value ÷ estimated competition.
7. **Write report** - save to `~/.openclaw/workspace-mary/reports/TASK-<num>-<slug>.md`.
8. **Create staged tasks** - one per keyword cluster to add to campaigns; stage for approval.
### Competitor Analysis Workflow
1. **Pull Auction Insights** from Google Ads - who is bidding on the same keywords as Ironwood? What's their impression share vs ours?
2. **SERP check** - `curl` or WebFetch key queries ("tree service tulsa", "emergency tree removal tulsa", "tree trimming tulsa ok") to see:
- Who's in the Local Pack (top 3 map results)?
- Who's running Google Ads?
- Who's in organic top 5?
3. **Profile top competitors** - for each competitor: Are they running LSA? What ad copy are they using? What's their GBP review count?
4. **Identify their gaps** - what keywords are they NOT targeting that we could own?
5. **Report** - competitive landscape summary with specific action items.
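The SERP-check step can be partially automated. This is a minimal sketch that greps fetched SERP HTML for feature markers; the marker strings are assumptions (Google's markup changes frequently and rendered HTML varies by fetcher), so treat results as hints to verify manually, not ground truth:

```python
import re

# Assumed text markers for SERP features -- verify against live HTML before relying on them.
MARKERS = {
    "ads": re.compile(r"\bSponsored\b", re.IGNORECASE),          # paid text ads
    "local_pack": re.compile(r"\bMore (businesses|places)\b"),   # map pack footer link
    "lsa": re.compile(r"Google Guaranteed", re.IGNORECASE),      # LSA badge text
}

def serp_signals(html: str) -> dict:
    """Return which competitive SERP features appear in raw HTML."""
    return {name: bool(rx.search(html)) for name, rx in MARKERS.items()}
```

Run it on the HTML returned by WebFetch/`curl` for each key query and record the booleans in the competitor profile.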
## Google Ads API - Keyword Planner
```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage(
    "/Users/dr.zoidberg/.openclaw/credentials/mary-google-ads.yaml"
)

# Get keyword ideas from seed terms
keyword_plan_idea_service = client.get_service("KeywordPlanIdeaService")
googleads_service = client.get_service("GoogleAdsService")

request = client.get_type("GenerateKeywordIdeasRequest")
request.customer_id = "<customer_id>"  # from google-ads.yaml
request.language = googleads_service.language_constant_path("1000")  # English
request.geo_target_constants.append(
    googleads_service.geo_target_constant_path("1026648")  # Tulsa, OK
)
request.keyword_seed.keywords.extend(
    ["tree removal tulsa", "tree trimming", "stump grinding"]
)

response = keyword_plan_idea_service.generate_keyword_ideas(request=request)
for idea in response:
    metrics = idea.keyword_idea_metrics
    # average_cpc_micros exists in recent API versions; older versions only
    # expose low/high_top_of_page_bid_micros.
    print(
        f"{idea.text}: vol={metrics.avg_monthly_searches}, "
        f"cpc={metrics.average_cpc_micros / 1_000_000:.2f}, "
        f"comp={metrics.competition.name}"
    )
```
## Google Search Console - Query Pull
```python
from google.oauth2 import service_account
from googleapiclient.discovery import build
creds = service_account.Credentials.from_service_account_file(
"/Users/dr.zoidberg/.openclaw/credentials/gsc-service-account.json",
scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)
response = service.searchanalytics().query(
siteUrl="https://ironwoodtreeco.com",
body={
"startDate": "2026-01-01",
"endDate": "2026-03-25",
"dimensions": ["query"],
"rowLimit": 500
}
).execute()
for row in response.get("rows", []):
print(f"{row['keys'][0]}: clicks={row['clicks']}, impr={row['impressions']}, pos={row['position']:.1f}, ctr={row['ctr']*100:.1f}%")
```
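Step 1 of the workflow (triage GSC queries into "fix the CTR" vs "page-2 quick win") can be sketched as a filter over the rows returned above. The thresholds here are illustrative assumptions, not GSC defaults:

```python
def triage_gsc_rows(rows, min_impressions=100):
    """Split GSC query rows into CTR-fix candidates and page-2 quick wins.

    rows: list of dicts as returned by searchanalytics().query()
          (keys, clicks, impressions, ctr, position).
    """
    fix_ctr, quick_wins = [], []
    for row in rows:
        query = row["keys"][0]
        pos, impr, ctr = row["position"], row["impressions"], row["ctr"]
        if impr >= min_impressions and pos <= 10 and ctr < 0.02:
            fix_ctr.append(query)      # ranking fine, title/snippet weak
        elif 11 <= pos <= 20:
            quick_wins.append(query)   # page 2 -> small push gets page 1
    return fix_ctr, quick_wins
```

Feed `response.get("rows", [])` straight in; both lists go into the report's opportunity table.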
## Google Ads - Search Terms Report
```python
# Pull actual search terms that triggered ads in the last 30 days
query = """
SELECT
search_term_view.search_term,
metrics.impressions,
metrics.clicks,
metrics.cost_micros,
metrics.conversions
FROM search_term_view
WHERE segments.date DURING LAST_30_DAYS
ORDER BY metrics.impressions DESC
LIMIT 200
"""
ga_service = client.get_service("GoogleAdsService")
response = ga_service.search(customer_id="<customer_id>", query=query)
for row in response:
print(f"{row.search_term_view.search_term}: {row.metrics.clicks} clicks, {row.metrics.conversions:.1f} conv")
```
## Google Ads - Auction Insights
```python
# Auction insights are exposed in recent Google Ads API versions as metrics
# segmented by competing domain (segments.auction_insight_domain). If these
# fields are rejected, fall back to the Auction Insights report in the Ads UI.
query = """
SELECT
  campaign.name,
  segments.auction_insight_domain,
  metrics.auction_insight_search_impression_share,
  metrics.auction_insight_search_overlap_rate,
  metrics.auction_insight_search_position_above_rate,
  metrics.auction_insight_search_top_impression_percentage
FROM campaign
WHERE segments.date DURING LAST_30_DAYS
"""
response = ga_service.search(customer_id="<customer_id>", query=query)
for row in response:
    print(f"{row.segments.auction_insight_domain}: "
          f"IS={row.metrics.auction_insight_search_impression_share:.1%}")
```
## Keyword Categories for Home Services
Organize discoveries into these buckets:
### Emergency / High-Intent (Highest value - bid aggressively)
- "emergency tree removal tulsa"
- "tree fell on house tulsa"
- "storm damage tree service"
- "hazard tree removal"
- "tree fell on fence" / "tree fell on car"
### Routine Service (High volume, strong intent)
- "tree trimming tulsa ok"
- "tree pruning tulsa"
- "tree removal cost tulsa"
- "stump grinding tulsa"
- "dead tree removal"
### Seasonal (Volume spikes - prepare in advance)
- Spring: "tree trimming near me", "overgrown tree", "spring tree service"
- Summer: "tree health check", "heat stressed tree"
- Fall: "dead limb removal", "tree inspection"
- Storm season: "tree damage insurance claim", "fallen tree removal"
### Local Variants (Expand service area)
- "[service] + [city]" for each city in service area: Broken Arrow, Owasso, Bixby, Sand Springs, Claremore, Jenks, Sapulpa
- "tree service near [landmark]" patterns
### Negative Keywords to Mine
From search term report, flag these patterns:
- "DIY", "how to", "yourself"
- "free" (unless running a free estimate offer)
- "jobs", "career", "hire", "employment"
- "school", "training", "certification", "class"
- "rental", "rent a"
- Other cities/states outside service area
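These patterns can be applied to the search term report automatically. A sketch with illustrative defaults; note that `free` will also catch "free estimate" queries, so whitelist those if that offer is running:

```python
import re

# Word boundaries keep "free" from matching "freeman", etc.
NEGATIVE_PATTERNS = re.compile(
    r"\b(diy|how to|yourself|free|jobs?|careers?|hire|employment|"
    r"school|training|certification|class|rentals?|rent a)\b",
    re.IGNORECASE,
)

def mine_negatives(search_terms):
    """Return search terms matching a known wasted-spend pattern."""
    return [t for t in search_terms if NEGATIVE_PATTERNS.search(t)]
```

Out-of-area cities/states still need a manual pass, since that list is account-specific.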
## Competitor Profiling Template
For each competitor found in SERP/auction insights:
```
Competitor: [Business Name]
Website: [URL]
GBP Reviews: [count] at [score] stars
LSA: [Yes/No - Google Guaranteed badge visible?]
Running Ads: [Yes/No]
Ad Copy Sample: "[headline] / [description]"
Organic Rank: [position for key terms]
Strengths: [what they're doing well]
Gaps: [what they're missing that we can own]
```
## Priority Scoring
For each keyword opportunity, score it:
```
Priority = (Monthly Search Volume × Intent Score) ÷ Estimated CPC
Intent Scores:
- Emergency/hazard: 5
- Service + location: 4
- Service generic: 3
- Cost/price queries: 3
- Informational: 1
High priority: Score > 500
Medium: 100-500
Low / long-term: < 100
```
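The rubric above translates directly into a small helper. A sketch; the intent-label keys and the divide-by-zero guard are my additions:

```python
# Intent scores from the rubric above.
INTENT_SCORES = {
    "emergency": 5,
    "service_location": 4,
    "service_generic": 3,
    "cost": 3,
    "informational": 1,
}

def priority(volume, intent, cpc_usd):
    """Score = (monthly volume x intent) / estimated CPC, bucketed into tiers."""
    score = volume * INTENT_SCORES[intent] / max(cpc_usd, 0.01)  # guard $0 CPC
    if score > 500:
        return score, "high"
    if score >= 100:
        return score, "medium"
    return score, "low"
```

Example: 480 searches/month for an emergency term at a $4.00 CPC scores 480 × 5 ÷ 4 = 600, so it lands in the high tier.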
## Output Format
Every research task produces a report with:
1. **Executive Summary** - top 3 findings, top 3 recommended actions
2. **Keyword Opportunities** - prioritized table with volume, CPC, intent, priority score
3. **Negative Keywords to Add** - from search term analysis
4. **Competitor Landscape** - who's competing, their strengths/gaps, our opportunities
5. **Seasonal Calendar** - when to run what (if applicable)
6. **Staged Tasks** - one per action item, ready for Nick approval
## Related Skills
- `google-ads-manager` - Implement the keyword findings into campaigns
- `local-services-ads` - LSA keyword strategy (categories, service selection)
- `gbp-manager` - GBP keyword usage (description, posts, services)
- `storybrand-copywriter` - Turn keyword intent data into landing page and ad copy