Pagination
How to paginate through large result sets
Several telco.dev API endpoints return paginated results. This guide explains how to work with pagination effectively.
Paginated Endpoints
The following endpoints support pagination:
- GET /v1/npa/:npa - Area code exchanges
- GET /v1/search - Search results
- GET /v1/carriers - Carrier list
Pagination Parameters
| Parameter | Type | Default | Max | Description |
|---|---|---|---|---|
| limit | integer | 50 | 100 | Number of results per page |
| offset | integer | 0 | - | Number of results to skip |
Response Format
Paginated responses include a pagination object:
{
  "results": [...],
  "pagination": {
    "limit": 50,
    "offset": 0,
    "total": 847
  }
}
| Field | Description |
|---|---|
| limit | Number of results requested |
| offset | Number of results skipped |
| total | Total number of matching results |
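With total and limit, you can work out up front how many requests a full fetch will take. Using the sample response above (total of 847 at the default limit of 50), that is ceil(847 / 50) = 17 requests:

import math

# Request count from the sample response above: ceil(847 / 50) = 17
total = 847
limit = 50
num_requests = math.ceil(total / limit)
print(f"Fetching all {total} results takes {num_requests} requests")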
Basic Pagination
First Page
curl -H "X-API-Key: your-key" \
"https://api.telco.dev/v1/npa/415?limit=20"
Second Page
curl -H "X-API-Key: your-key" \
"https://api.telco.dev/v1/npa/415?limit=20&offset=20"
Third Page
curl -H "X-API-Key: your-key" \
"https://api.telco.dev/v1/npa/415?limit=20&offset=40"
Iterating Through All Pages
JavaScript
async function getAllResults(baseUrl, apiKey) {
  const allResults = [];
  const limit = 100; // Max allowed
  let offset = 0;
  let total = null;

  while (total === null || offset < total) {
    const url = `${baseUrl}?limit=${limit}&offset=${offset}`;
    const response = await fetch(url, {
      headers: { "X-API-Key": apiKey }
    });
    if (!response.ok) {
      throw new Error(`Request failed with status ${response.status}`);
    }
    const data = await response.json();

    // Handle different response shapes
    const results = data.results || data.exchanges || data.carriers;
    allResults.push(...results);

    total = data.pagination.total;
    offset += limit;

    // Optional: add a delay to avoid rate limits
    if (offset < total) {
      await new Promise(r => setTimeout(r, 100));
    }
  }

  return allResults;
}

// Usage
const allExchanges = await getAllResults(
  "https://api.telco.dev/v1/npa/415",
  process.env.TELCO_API_KEY
);
console.log(`Fetched ${allExchanges.length} exchanges`);
Python
import requests
import time
import os

def get_all_results(base_url, api_key):
    all_results = []
    limit = 100
    offset = 0
    total = None

    while total is None or offset < total:
        response = requests.get(
            base_url,
            params={"limit": limit, "offset": offset},
            headers={"X-API-Key": api_key},
        )
        response.raise_for_status()
        data = response.json()

        # Handle different response shapes
        results = data.get("results") or data.get("exchanges") or data.get("carriers")
        all_results.extend(results)

        total = data["pagination"]["total"]
        offset += limit

        # Avoid rate limits
        if offset < total:
            time.sleep(0.1)

    return all_results

# Usage
all_exchanges = get_all_results(
    "https://api.telco.dev/v1/npa/415",
    os.environ["TELCO_API_KEY"],
)
print(f"Fetched {len(all_exchanges)} exchanges")
Generator Pattern (Memory Efficient)
For very large datasets, use a generator to avoid loading everything into memory:
JavaScript
async function* paginatedResults(baseUrl, apiKey) {
  const limit = 100;
  let offset = 0;
  let total = null;

  while (total === null || offset < total) {
    const url = `${baseUrl}?limit=${limit}&offset=${offset}`;
    const response = await fetch(url, {
      headers: { "X-API-Key": apiKey }
    });
    if (!response.ok) {
      throw new Error(`Request failed with status ${response.status}`);
    }
    const data = await response.json();

    const results = data.results || data.exchanges || data.carriers;
    for (const item of results) {
      yield item;
    }

    total = data.pagination.total;
    offset += limit;
  }
}
// Usage
for await (const exchange of paginatedResults(
  "https://api.telco.dev/v1/npa/415",
  process.env.TELCO_API_KEY
)) {
  console.log(exchange.nxx); // Process one at a time
}
Python
import requests
import os

def paginated_results(base_url, api_key):
    limit = 100
    offset = 0
    total = None

    while total is None or offset < total:
        response = requests.get(
            base_url,
            params={"limit": limit, "offset": offset},
            headers={"X-API-Key": api_key},
        )
        response.raise_for_status()
        data = response.json()

        results = data.get("results") or data.get("exchanges") or data.get("carriers")
        for item in results:
            yield item

        total = data["pagination"]["total"]
        offset += limit

# Usage
for exchange in paginated_results(
    "https://api.telco.dev/v1/npa/415",
    os.environ["TELCO_API_KEY"],
):
    print(exchange["nxx"])  # Process one at a time
Best Practices
- Use the maximum limit (100) when fetching all data to minimize API calls
- Check the total field to know when you've fetched everything
- Add small delays between requests to avoid hitting rate limits
- Use generators for large datasets to minimize memory usage
- Cache results if you'll need the same data multiple times
- Handle errors at each page to avoid losing progress (see the sketch below)
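For the last point, here is a minimal sketch of fetching a single page with retries, which you could drop into any of the loops above in place of the bare requests.get call. The fetch_page helper, retry count, and backoff delays are illustrative choices, not values prescribed by the telco.dev API:

import time
import requests

def fetch_page(url, api_key, params, retries=3):
    """Fetch one page, retrying on transient failures.

    The retry count and backoff delays are illustrative defaults,
    not part of the telco.dev API.
    """
    for attempt in range(retries):
        try:
            response = requests.get(
                url,
                params=params,
                headers={"X-API-Key": api_key},
                timeout=10,
            )
            response.raise_for_status()
            return response.json()
        except requests.RequestException:
            if attempt == retries - 1:
                raise  # Out of retries: surface the error
            time.sleep(2 ** attempt)  # Exponential backoff: 1s, 2s, ...

Because each page is fetched independently, a transient failure only retries that one request instead of forcing you to restart the whole iteration from offset 0.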