
Building SEO Tools with SERP APIs: Lessons from ByteDance

Former ByteDance engineer shares how to build professional SEO tools using SERP APIs. Step-by-step guide to rank trackers, keyword research, and competitor analysis tools.

Alex Zhang, Former ByteDance Senior Engineer

I spent two years at ByteDance building SEO tools for international market expansion. We tried every approach—custom scrapers, third-party tools, headless browsers.

What worked? SERP APIs. Everything else was too slow or too expensive.

Here’s what I learned building tools that tracked millions of keywords across 40 countries.

What Nobody Tells You About SEO Tools

Most developers think building an SEO tool is hard. It’s not. The hard part is building one that:

  • Doesn’t break every week
  • Scales beyond 100 keywords
  • Provides accurate data
  • Actually makes money

SERP APIs solve the first three. The fourth one? That’s on you.

The Three Core SEO Tools Everyone Needs

1. Rank Tracker

The most common request. Also the easiest to build.

Here’s a production-ready rank tracker I built in 4 hours:

class RankTracker {
  constructor(serpAPI) {
    this.api = serpAPI;
    this.cache = new Map();
  }
  
  async trackKeyword(keyword, domain, engines = ['google', 'bing']) {
    const results = {};
    
    for (const engine of engines) {
      // Check cache first (save API credits)
      const cacheKey = `${engine}:${keyword}`;
      
      if (this.cache.has(cacheKey)) {
        results[engine] = this.cache.get(cacheKey);
        continue;
      }
      
      // Fetch fresh results
      const data = await this.api.search({
        s: keyword,
        t: engine,
        num: 100 // Track top 100 positions
      });
      
      // Find domain position
      const position = data.organic.findIndex(
        r => r.url.includes(domain)
      );
      
      results[engine] = {
        position: position === -1 ? null : position + 1,
        url: position === -1 ? null : data.organic[position].url,
        snippet: position === -1 ? null : data.organic[position].snippet,
        features: this.extractSERPFeatures(data),
        topCompetitors: data.organic.slice(0, 3).map((r, i) => ({
          domain: this.extractDomain(r.url),
          position: i + 1
        }))
      };
      
      // Cache for 6 hours
      this.cache.set(cacheKey, results[engine]);
      setTimeout(() => this.cache.delete(cacheKey), 6 * 60 * 60 * 1000);
    }
    
    return results;
  }
  
  extractSERPFeatures(data) {
    // Check what SERP features appear
    return {
      hasFeaturedSnippet: !!data.featured_snippet,
      hasKnowledgeGraph: !!data.knowledge_graph,
      hasPeopleAlsoAsk: (data.people_also_ask || []).length > 0,
      hasLocalPack: !!data.local_results,
      hasShoppingResults: (data.shopping_results || []).length > 0
    };
  }
  
  extractDomain(url) {
    // Strip only a leading "www." so subdomains are preserved
    return new URL(url).hostname.replace(/^www\./, '');
  }
}

// Usage
const tracker = new RankTracker(serppostAPI);

const rankings = await tracker.trackKeyword(
  'project management software',
  'yourcompany.com',
  ['google', 'bing']
);

console.log(rankings);
// {
//   google: { position: 7, url: '...', ... },
//   bing: { position: 12, url: '...', ... }
// }

This tracks:

  • Position on Google and Bing
  • Competing URLs
  • SERP features
  • Top 3 competitors

Add a cron job and PostgreSQL for storage, and you have a production rank tracker.
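
If it helps, here is a minimal sketch of the storage half, assuming the pg client and a hypothetical rankings table (the table name and columns are illustrative, not part of any SERP API):

const { Pool } = require('pg');
const pool = new Pool(); // connection settings come from the standard PG* env vars

// One row per keyword/engine/check; NULL position = not ranking in the tracked range
const CREATE_TABLE = `
  CREATE TABLE IF NOT EXISTS rankings (
    id         SERIAL PRIMARY KEY,
    keyword    TEXT NOT NULL,
    domain     TEXT NOT NULL,
    engine     TEXT NOT NULL,
    position   INTEGER,
    checked_at TIMESTAMPTZ NOT NULL DEFAULT now()
  )`;

async function saveRankings(keyword, domain, rankings) {
  await pool.query(CREATE_TABLE); // fine for a sketch; use real migrations in production

  for (const [engine, data] of Object.entries(rankings)) {
    await pool.query(
      'INSERT INTO rankings (keyword, domain, engine, position) VALUES ($1, $2, $3, $4)',
      [keyword, domain, engine, data.position]
    );
  }
}

// e.g. await saveRankings('project management software', 'yourcompany.com', rankings);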

2. Keyword Research Tool

Second most requested. Also surprisingly simple.

class KeywordResearcher:
    def __init__(self, api_key):
        self.serp = SERPpost(api_key)
    
    def find_related_keywords(self, seed_keyword):
        """Get related keywords from search results"""
        
        # Get initial search
        results = self.serp.search(seed_keyword)
        
        related_keywords = set([seed_keyword])
        
        # Extract from People Also Ask
        paa = results.get('people_also_ask', [])
        for question in paa:
            # Extract keywords from questions
            keywords = self.extract_keywords_from_question(question['question'])
            related_keywords.update(keywords)
        
        # Extract from Related Searches
        related = results.get('related_searches', [])
        for item in related:
            related_keywords.add(item['query'])
        
        # Get auto-suggestions
        suggestions = self.get_autocomplete_suggestions(seed_keyword)
        related_keywords.update(suggestions)
        
        return list(related_keywords)
    
    def extract_keywords_from_question(self, question):
        """Strip common question words, e.g. 'what is the best seo tool' -> 'best seo tool'"""
        stop_words = {'what', 'is', 'are', 'how', 'to', 'do', 'does',
                      'the', 'a', 'an', 'why', 'which', 'can', 'you'}
        words = [w for w in question.lower().split() if w not in stop_words]
        return [' '.join(words)] if words else []
    
    def get_autocomplete_suggestions(self, keyword):
        """Expand the seed keyword with common modifiers and their related searches"""
        suggestions = []
        
        # Try common modifiers
        modifiers = ['best', 'how to', 'what is', 'top', 
                    'cheap', 'free', 'online', 'near me']
        
        for modifier in modifiers:
            query = f"{modifier} {keyword}"
            suggestions.append(query)
            
            # Some SERP APIs expose autocomplete directly; here we search the
            # modified query and harvest its related searches instead
            results = self.serp.search(query)
            for item in results.get('related_searches', []):
                suggestions.append(item['query'])
        
        return suggestions
    
    def analyze_difficulty(self, keyword):
        """Estimate keyword difficulty"""
        results = self.serp.search(keyword)
        
        # Simple difficulty score based on:
        # - Number of results
        # - Domain authority of top 10
        # - SERP features present
        
        top_domains = [r['domain'] for r in results['organic'][:10]]
        
        # Count high-authority domains
        authority_domains = [
            'wikipedia.org', 'youtube.com', 'amazon.com',
            'reddit.com', 'medium.com', 'forbes.com'
        ]
        
        authority_count = sum(
            1 for d in top_domains 
            if any(auth in d for auth in authority_domains)
        )
        
        # Calculate difficulty (0-100)
        difficulty = min(100, (
            (authority_count * 10) +  # Authority domains
            (len(results.get('ads', [])) * 5) +  # Paid competition
            (20 if results.get('featured_snippet') else 0)  # Featured snippet
        ))
        
        return {
            'keyword': keyword,
            'difficulty': difficulty,
            'search_volume_estimate': results['search_information']['total_results'],  # rough proxy, not true search volume
            'serp_features': self.count_serp_features(results),
            'top_domains': top_domains[:5]
        }
    
    def count_serp_features(self, results):
        features = 0
        if results.get('featured_snippet'): features += 1
        if results.get('knowledge_graph'): features += 1
        if results.get('people_also_ask'): features += 1
        if results.get('local_results'): features += 1
        return features

# Usage
researcher = KeywordResearcher('your_api_key')

# Find related keywords
related = researcher.find_related_keywords('seo tools')
print(f"Found {len(related)} related keywords")

# Analyze difficulty
for keyword in related[:10]:
    analysis = researcher.analyze_difficulty(keyword)
    print(f"{keyword}: Difficulty {analysis['difficulty']}/100")

This gives you:

  • Related keyword discovery
  • Difficulty estimation
  • Competition analysis
  • SERP feature tracking

3. Competitor Analysis Dashboard

The most valuable tool. Also the most complex.

class CompetitorAnalyzer {
  constructor(serpAPI, keywords) {
    this.api = serpAPI;
    this.keywords = keywords;
  }
  
  async analyzeCompetitors() {
    const competitorData = {};
    
    // Track all keywords
    for (const keyword of this.keywords) {
      const results = await this.api.search({
        s: keyword,
        t: 'google'
      });
      
      // Collect all domains in top 20
      const domains = results.organic
        .slice(0, 20)
        .map(r => this.extractDomain(r.url));
      
      domains.forEach((domain, index) => {
        if (!competitorData[domain]) {
          competitorData[domain] = {
            keywords: [],
            avgPosition: 0,
            topPositions: 0,
            totalAppearances: 0
          };
        }
        
        competitorData[domain].keywords.push({
          keyword,
          position: index + 1
        });
        competitorData[domain].totalAppearances++;
        
        if (index < 3) {
          competitorData[domain].topPositions++;
        }
      });
    }
    
    // Calculate averages
    Object.keys(competitorData).forEach(domain => {
      const data = competitorData[domain];
      data.avgPosition = 
        data.keywords.reduce((sum, k) => sum + k.position, 0) / 
        data.keywords.length;
      
      data.visibility = 
        (data.totalAppearances / this.keywords.length) * 100;
    });
    
    // Sort by visibility
    return Object.entries(competitorData)
      .map(([domain, data]) => ({ domain, ...data }))
      .sort((a, b) => b.visibility - a.visibility);
  }
  
  async findContentGaps(yourDomain, competitors) {
    // Find keywords competitors rank for but you don't
    const gaps = [];
    
    for (const keyword of this.keywords) {
      const results = await this.api.search({
        s: keyword,
        t: 'google'
      });
      
      const yourRanking = results.organic.findIndex(
        r => r.url.includes(yourDomain)
      );
      
      const competitorRankings = competitors.map(comp => ({
        domain: comp,
        position: results.organic.findIndex(r => r.url.includes(comp)) + 1
      })).filter(c => c.position > 0); // position 0 means the competitor was not found
      
      // If competitors rank but you don't
      if (yourRanking === -1 && competitorRankings.length > 0) {
        gaps.push({
          keyword,
          competitors: competitorRankings,
          opportunity: this.calculateOpportunity(results)
        });
      }
    }
    
    return gaps.sort((a, b) => b.opportunity - a.opportunity);
  }
  
  calculateOpportunity(results) {
    // Higher score = better opportunity
    let score = 50; // Base score
    
    // Fewer SERP features = easier to rank
    const features = [
      results.featured_snippet,
      results.knowledge_graph,
      results.local_results
    ].filter(Boolean).length;
    score -= features * 10;
    
    // Check domain authority of top results
    const topDomains = results.organic
      .slice(0, 5)
      .map(r => this.extractDomain(r.url));
    
    const lowAuthDomains = topDomains.filter(
      d => !this.isHighAuthority(d)
    ).length;
    score += lowAuthDomains * 10;
    
    return Math.max(0, Math.min(100, score));
  }
  
  isHighAuthority(domain) {
    const highAuth = [
      'wikipedia.org', 'youtube.com', 'amazon.com',
      'reddit.com', 'forbes.com', 'nytimes.com'
    ];
    return highAuth.some(auth => domain.includes(auth));
  }
  
  extractDomain(url) {
    return new URL(url).hostname.replace(/^www\./, '');
  }
}

// Usage
const keywords = [
  'seo tools',
  'keyword research',
  'rank tracking',
  'backlink checker'
];

const analyzer = new CompetitorAnalyzer(serppostAPI, keywords);

// Get competitor landscape
const competitors = await analyzer.analyzeCompetitors();
console.log('Top competitors:', competitors.slice(0, 5));

// Find content gaps
const gaps = await analyzer.findContentGaps(
  'yourcompany.com',
  competitors.slice(0, 5).map(c => c.domain)
);
console.log('Keyword opportunities:', gaps.slice(0, 10));

This shows:

  • Who your real competitors are (by visibility)
  • What keywords they rank for
  • Content gaps you can exploit
  • Opportunity scores for each keyword

The ByteDance Production Stack

At ByteDance, we built SEO tools that handled:

  • 2M+ tracked keywords
  • 50+ countries
  • 100+ clients
  • Real-time updates

Our stack:

Frontend: React + TypeScript
Backend: Node.js + Express
Database: PostgreSQL + Redis
Queue: Bull (for job processing)
SERP Data: SERPpost API
Hosting: AWS (EC2 + RDS)

Architecture:

// Job queue for rank tracking (Bull + node-schedule; serpAPI, db and the
// helper functions below are assumed to be wired up elsewhere)
queue.process('track-rankings', async (job) => {
  const { keywords, domain, client_id } = job.data;
  
  for (const keyword of keywords) {
    const results = await serpAPI.search({
      s: keyword,
      t: 'google'
    });
    
    const position = findPosition(results, domain);
    
    await db.rankings.insert({
      client_id,
      keyword,
      domain,
      position,
      serp_features: extractFeatures(results),
      checked_at: new Date()
    });
  }
});

// Schedule daily checks
schedule.scheduleJob('0 2 * * *', async () => {
  const clients = await db.clients.findAll();
  
  for (const client of clients) {
    queue.add('track-rankings', {
      keywords: client.keywords,
      domain: client.domain,
      client_id: client.id
    });
  }
});

Cost Optimization Strategies

Running SEO tools at scale gets expensive. Here’s how we kept costs down:

1. Smart Caching

from datetime import datetime

class CachedSERPAPI:
    def __init__(self, api, cache_duration=6):
        self.api = api
        self.cache_hours = cache_duration
        self.cache = {}
    
    def search(self, keyword, engine='google'):
        cache_key = f"{engine}:{keyword}"
        
        # Check cache
        if cache_key in self.cache:
            cached_data, cached_time = self.cache[cache_key]
            age_hours = (datetime.now() - cached_time).total_seconds() / 3600
            
            if age_hours < self.cache_hours:
                return cached_data
        
        # Fetch fresh data
        data = self.api.search(keyword, engine=engine)
        self.cache[cache_key] = (data, datetime.now())
        
        return data

This reduced API calls by 60% for keywords checked multiple times per day.

2. Batch Processing

// Instead of tracking each keyword immediately, process in batches
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function batchTrackKeywords(keywords) {
  const batchSize = 100;
  
  for (let i = 0; i < keywords.length; i += batchSize) {
    const batch = keywords.slice(i, i + batchSize);
    
    // Process the batch in parallel
    await Promise.all(
      batch.map(kw => trackKeyword(kw))
    );
    
    // Rate limiting: pause between batches
    await sleep(1000);
  }
}

3. Tiered Checking Frequency

# Important keywords: check daily
# Medium keywords: check weekly  
# Low priority: check monthly

def get_check_frequency(keyword_data):
    if keyword_data['priority'] == 'high':
        return 'daily'
    elif keyword_data['position'] <= 10:
        return 'daily'  # Monitor top 10 closely
    elif keyword_data['position'] <= 50:
        return 'weekly'
    else:
        return 'monthly'

This cut our API usage by 40% with minimal data staleness.

Monetization Model

Built the tool. Now what? Here’s what works:

Pricing Tiers

Starter: $29/mo
- 100 keywords
- Daily checks
- Google only
- Basic reports

Pro: $99/mo
- 1,000 keywords
- Hourly checks
- Google + Bing
- Advanced reports
- API access

Enterprise: $499/mo
- 10,000 keywords
- Real-time checks
- All search engines
- White-label
- Priority support

Key insight: Most customers stay in the middle tier. Price accordingly.

API Cost vs Revenue

At $99/mo with 1,000 keywords checked daily:

Monthly API calls: 1,000 keywords × 30 days = 30,000
API cost (SERPpost): ~$90
Your revenue: $99
Profit: $9

BUT, customers don't check all keywords daily.
Actual usage: ~10,000 calls/month
Actual API cost: ~$30
Real profit: $69

With caching and smart scheduling, margins improve significantly.
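
If you want to sanity-check your own pricing, here is a rough back-of-the-envelope helper. The per-call cost is simply the ~$90 / 30,000 calls implied above, and the cache hit rate mirrors the ~60% savings from the Smart Caching section; swap in your own figures:

// All inputs are assumptions to replace with your own numbers
function estimateMargin({ keywords, checksPerMonth, cacheHitRate, costPerCall, pricePerMonth }) {
  const rawCalls = keywords * checksPerMonth;
  const billedCalls = rawCalls * (1 - cacheHitRate);
  const apiCost = billedCalls * costPerCall;
  return { billedCalls, apiCost, profit: pricePerMonth - apiCost };
}

console.log(estimateMargin({
  keywords: 1000,
  checksPerMonth: 30,
  cacheHitRate: 0.6,   // roughly the 60% cache savings from the Smart Caching section
  costPerCall: 0.003,  // ≈ $90 / 30,000 calls from the example above
  pricePerMonth: 99
}));
// => { billedCalls: 12000, apiCost: 36, profit: 63 }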

Common Pitfalls

Pitfall 1: Over-engineering

Don’t build features nobody uses. Start with:

  • Rank tracking
  • Basic reports
  • Email alerts

Add more later if customers ask.
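
One item from that starter list worth sketching: email alerts take very little code. A minimal example, assuming nodemailer with SMTP credentials in environment variables; call it whenever a tracked position changes:

const nodemailer = require('nodemailer');

// Hypothetical transport; replace with your own SMTP provider settings
const transporter = nodemailer.createTransport({
  host: process.env.SMTP_HOST,
  port: 587,
  auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS }
});

// Assumes numeric positions; handle null (not ranking) upstream
async function sendRankAlert(to, keyword, oldPosition, newPosition) {
  // Only alert on meaningful moves: crossing page one, or dropping 5+ spots
  const crossedPageOne = (oldPosition > 10) !== (newPosition > 10);
  const bigDrop = newPosition - oldPosition >= 5;
  if (!crossedPageOne && !bigDrop) return;

  await transporter.sendMail({
    from: 'alerts@yourtool.example',
    to,
    subject: `"${keyword}" moved from #${oldPosition} to #${newPosition}`,
    text: `Your ranking for "${keyword}" changed from position ${oldPosition} to ${newPosition}.`
  });
}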

Pitfall 2: Ignoring Bing

30% of enterprise searches happen on Bing. Including Bing support:

  • Differentiates you from competitors
  • Attracts enterprise customers
  • Costs almost nothing extra (use dual-engine API like SERPpost)

Pitfall 3: Real-time Everything

Users don’t need real-time rankings. Daily is fine. Hourly for enterprise.

Save API credits by not over-fetching.

Quick Start Template

Here’s a minimal SEO tool you can build this weekend:

// Express API
app.post('/api/track', async (req, res) => {
  const { keyword, domain } = req.body;
  
  // Get search results
  const results = await serppost.search({
    s: keyword,
    t: 'google'
  });
  
  // Find position
  const position = results.organic.findIndex(
    r => r.url.includes(domain)
  );
  
  // Save to DB
  await db.rankings.create({
    keyword,
    domain,
    position: position === -1 ? null : position + 1,
    checked_at: new Date()
  });
  
  res.json({ position: position === -1 ? null : position + 1 });
});

// React frontend
function RankTracker() {
  const [keyword, setKeyword] = useState('');
  const [domain, setDomain] = useState('');
  const [results, setResults] = useState(null);
  
  const track = async () => {
    const res = await fetch('/api/track', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ keyword, domain })
    });
    setResults(await res.json());
  };
  
  return (
    <div>
      <input value={keyword} onChange={e => setKeyword(e.target.value)} />
      <input value={domain} onChange={e => setDomain(e.target.value)} />
      <button onClick={track}>Track</button>
      {results && <p>Position: {results.position || 'Not ranking'}</p>}
    </div>
  );
}

Ship this in a weekend. Iterate based on feedback.

Alternative SERP Providers

We tested multiple providers at ByteDance. SERPpost worked best for our use case (dual engine, good pricing), but SearchCans is also solid if you want alternatives.

The key is picking an API with:

  • Good uptime (99%+)
  • Both Google and Bing
  • Reasonable pricing
  • Fast response times

Real Talk

Building SEO tools isn’t technically difficult. The challenge is:

  • Making them reliable
  • Pricing them right
  • Getting customers
  • Supporting them

SERP APIs handle the first problem. The rest is business.

Start small. One feature. Ten customers. Iterate from there.

The tools that win aren’t the most feature-rich. They’re the ones that solve one problem really well.


About the author: Alex Zhang was a Senior Engineer at ByteDance for 2 years, building SEO and content optimization tools for TikTok and other properties. He now runs an SEO SaaS serving 500+ customers.

Related: Learn about SERP API pricing to optimize your tool’s costs.

Tags: #SERP API #SEO Tools #Rank Tracker #Keyword Research #Development

Ready to try SERPpost?

Get started with 100 free credits. No credit card required.