Errors & Rate Limits

How the API communicates errors, how rate limits are enforced, and which headers to monitor.

HTTP Status Codes

| Code | Meaning | When |
|------|---------|------|
| 200 | OK | Request succeeded |
| 400 | Bad Request | Invalid query parameters or malformed request |
| 401 | Unauthorized | Missing or invalid authentication token |
| 403 | Forbidden | Valid auth but insufficient permissions |
| 404 | Not Found | Resource does not exist |
| 429 | Too Many Requests | Rate limit exceeded |
| 500 | Internal Server Error | Unexpected server error |
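One way to route on these codes in a client is a small classifier. The function name `classifyStatus` and its category labels are illustrative, not part of the API:

```javascript
// Sketch: map the status codes above to broad handling categories.
function classifyStatus(status) {
  if (status === 200) return "ok";
  if (status === 429) return "rate_limited"; // back off and retry
  if (status === 401 || status === 403) return "auth"; // check credentials
  if (status >= 400 && status < 500) return "client_error"; // fix the request
  if (status >= 500) return "server_error"; // safe to retry later
  return "unknown";
}
```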

Error Response Format

All error responses return a JSON object with an `error` field:

```json
{
  "error": "Rate limit exceeded",
  "retryAfter": 42
}
```
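A client can normalize this payload before surfacing it. This is a sketch assuming the shape shown above; `parseApiError` is a hypothetical helper, and `retryAfter` is treated as optional since it only appears on rate-limit errors:

```javascript
// Normalize an error payload of the documented { error, retryAfter } shape.
function parseApiError(body) {
  const message = typeof body.error === "string" ? body.error : "Unknown error";
  // retryAfter is only present on some errors (e.g. 429); default to null.
  const retryAfter = Number.isInteger(body.retryAfter) ? body.retryAfter : null;
  return { message, retryAfter };
}
```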

Rate Limits

Rate limits are enforced per IP address (for unauthenticated requests) or per API key (for authenticated requests).

| Tier | Per Minute | Per Day |
|------|-----------|---------|
| Developer | 170 | 10,000 |
| Enterprise | 1,700 | 100,000 |
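To stay under the per-minute quota, a client can throttle itself before sending requests. A minimal sliding-window limiter sketch, using the Developer tier's 170 requests/minute as the default (`createLimiter` is a hypothetical helper, not part of the API):

```javascript
// Client-side sliding-window limiter. Call allow() before each request;
// if it returns false, wait instead of sending.
function createLimiter(maxPerMinute = 170) {
  const timestamps = []; // send times within the last 60s, oldest first
  return {
    allow(now = Date.now()) {
      const cutoff = now - 60_000;
      while (timestamps.length && timestamps[0] <= cutoff) timestamps.shift();
      if (timestamps.length >= maxPerMinute) return false;
      timestamps.push(now);
      return true;
    },
  };
}
```

Note this only tracks the per-minute window; the per-day cap would need separate accounting.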

Rate Limit Headers

Every API response includes rate limit information in the headers:

| Header | Description |
|--------|-------------|
| `X-RateLimit-Limit` | Maximum requests per window |
| `X-RateLimit-Remaining` | Requests remaining in the current window |
| `X-RateLimit-Reset` | Unix timestamp when the window resets |
| `Retry-After` | Seconds to wait before retrying (only on 429) |
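These headers can be read into numbers after every response. A sketch, where `headers` is anything with a `.get(name)` method (such as the `Headers` object on a fetch `Response`); `readRateLimit` is an illustrative helper name:

```javascript
// Extract the rate-limit headers as numbers; missing headers become null.
function readRateLimit(headers) {
  const num = (name) => {
    const v = headers.get(name);
    return v == null ? null : Number(v);
  };
  return {
    limit: num("X-RateLimit-Limit"),
    remaining: num("X-RateLimit-Remaining"),
    reset: num("X-RateLimit-Reset"), // Unix timestamp (seconds)
  };
}
```

Watching `remaining` lets a client slow down before it ever receives a 429.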

Handling 429 Responses

When you hit the rate limit, back off using the Retry-After header:

Retry Logic Example
```javascript
async function fetchWithRetry(url, options = {}, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    const res = await fetch(url, options);

    if (res.status === 429) {
      // Fall back to 60s if Retry-After is missing or unparsable.
      const parsed = parseInt(res.headers.get("Retry-After") || "60", 10);
      const retryAfter = Number.isNaN(parsed) ? 60 : parsed;
      console.log(`Rate limited. Retrying in ${retryAfter}s...`);
      await new Promise((r) => setTimeout(r, retryAfter * 1000));
      continue;
    }

    // Don't try to parse a body on non-2xx errors; surface the status instead.
    if (!res.ok) throw new Error(`Request failed with status ${res.status}`);

    return res.json();
  }

  throw new Error("Max retries exceeded");
}
```
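When `Retry-After` is absent, a common refinement is to fall back to exponential backoff with jitter so many clients don't retry in lockstep. This is a general pattern, not something the API requires; `backoffDelayMs` is a hypothetical helper:

```javascript
// Delay before retry attempt `attempt` (0-based), in milliseconds.
// Honors a numeric Retry-After header value when one is provided.
function backoffDelayMs(retryAfterHeader, attempt, baseMs = 1000) {
  if (retryAfterHeader != null) {
    const parsed = Number(retryAfterHeader);
    if (Number.isFinite(parsed) && parsed >= 0) return parsed * 1000;
  }
  const exp = baseMs * 2 ** attempt; // 1s, 2s, 4s, ...
  return exp + Math.random() * baseMs; // jitter avoids synchronized retries
}
```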

Bot Detection

The API blocks requests with User-Agent strings containing `bot`, `crawler`, `spider`, or `scrape` (case insensitive). Set a descriptive User-Agent for your application (e.g., `MyApp/1.0`) to avoid false positives. Search engine bots (Googlebot, Bingbot) and social platform crawlers are whitelisted.
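A client can sanity-check its own User-Agent against the documented substrings before deploying. This sketch mirrors the rule as described; the server's actual matching (including its whitelist for search and social crawlers) may differ:

```javascript
// Documented blocked substrings, matched case-insensitively.
const BLOCKED_UA = /bot|crawler|spider|scrape/i;

// Returns true if a User-Agent would trip the documented filter.
// Note: whitelisted crawlers like Googlebot would match here too, so this
// check is only useful for validating your own application's UA string.
function wouldBeBlocked(userAgent) {
  return BLOCKED_UA.test(userAgent);
}
```

For server-side clients (e.g. Node), pass the User-Agent via request headers; browsers do not allow overriding it in `fetch`.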