# Pagination

Many endpoints return large datasets. Use pagination to fetch results efficiently.

## Pagination Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| limit | integer | 20 | Results per page (max 100) |
| offset | integer | 0 | Number of results to skip |

## Basic Usage

```bash
# First page (results 1-20)
curl "https://api.web3identity.com/api/wallet/0x.../transactions"

# Second page (results 21-40)
curl "https://api.web3identity.com/api/wallet/0x.../transactions?limit=20&offset=20"

# Third page (results 41-60)
curl "https://api.web3identity.com/api/wallet/0x.../transactions?limit=20&offset=40"
```

## Response Format

Paginated responses include metadata:

```json
{
  "data": [...],
  "pagination": {
    "limit": 20,
    "offset": 0,
    "total": 1547,
    "hasMore": true
  }
}
```

| Field | Description |
|---|---|
| limit | Results per page |
| offset | Current offset |
| total | Total results available |
| hasMore | Whether more pages exist |

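The metadata tells you whether another page exists and which offset to request next. A minimal sketch, assuming the response shape shown above:

```javascript
// Inspect the pagination metadata returned with the first page.
const response = await fetch(
  'https://api.web3identity.com/api/wallet/0x.../transactions?limit=20&offset=0'
);
const { data, pagination } = await response.json();

console.log(`Fetched ${data.length} of ${pagination.total} results`);

if (pagination.hasMore) {
  // The next page starts where this one ended.
  const nextOffset = pagination.offset + pagination.limit;
  console.log(`Request the next page with ?limit=${pagination.limit}&offset=${nextOffset}`);
}
```
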
## Paginated Endpoints

These endpoints support pagination:

| Endpoint | Default Limit | Max Limit |
|---|---|---|
| /api/wallet/{address}/transactions | 20 | 100 |
| /api/wallet/{address}/nfts | 20 | 100 |
| /api/farcaster/user/{fid}/casts | 20 | 100 |
| /api/farcaster/user/{fid}/followers | 20 | 100 |
| /api/ens/domains/{address} | 50 | 100 |
| /api/defi/protocols | 50 | 100 |
| /api/yields | 50 | 100 |

## Iterating Through All Results

### JavaScript

```javascript
async function fetchAllTransactions(address) {
  const allResults = [];
  let offset = 0;
  const limit = 100;

  while (true) {
    const response = await fetch(
      `https://api.web3identity.com/api/wallet/${address}/transactions?limit=${limit}&offset=${offset}`
    );
    const data = await response.json();

    allResults.push(...data.data);

    if (!data.pagination.hasMore) break;
    offset += limit;

    // Respect rate limits
    await new Promise(r => setTimeout(r, 100));
  }

  return allResults;
}
```

### Python

```python
import requests
import time

def fetch_all_transactions(address):
    all_results = []
    offset = 0
    limit = 100

    while True:
        response = requests.get(
            f"https://api.web3identity.com/api/wallet/{address}/transactions",
            params={"limit": limit, "offset": offset},
        )
        data = response.json()

        all_results.extend(data["data"])

        if not data["pagination"]["hasMore"]:
            break
        offset += limit

        # Respect rate limits
        time.sleep(0.1)

    return all_results
```

## Best Practices

### Use Reasonable Page Sizes

```javascript
// ✅ Good: a reasonable limit balances request count and response size
const response = await fetch(`${API}/transactions?limit=50`);

// ❌ Bad: too small (many requests against your rate limit)
const tooSmall = await fetch(`${API}/transactions?limit=5`);

// ❌ Bad: max limit every time (large, slow responses)
const tooLarge = await fetch(`${API}/transactions?limit=100`);
```

### Handle Empty Results

```javascript
const data = await response.json();

if (data.data.length === 0) {
  console.log('No results found');
  return;
}
```

### Implement Rate Limit Awareness

```javascript
async function paginateWithRateLimit(baseUrl, limit = 50) {
  const results = [];
  let offset = 0;
  let requestCount = 0;

  while (true) {
    // Check rate limit (30 req/min for free tier)
    if (requestCount >= 25) {
      console.log('Approaching rate limit, pausing...');
      await new Promise(r => setTimeout(r, 60000));
      requestCount = 0;
    }

    const response = await fetch(`${baseUrl}?limit=${limit}&offset=${offset}`);
    requestCount++;

    if (response.status === 429) {
      const retryAfter = response.headers.get('Retry-After') || 60;
      await new Promise(r => setTimeout(r, retryAfter * 1000));
      continue;
    }

    const data = await response.json();
    results.push(...data.data);

    if (!data.pagination.hasMore) break;
    offset += limit;
  }

  return results;
}
```

## Cursor-Based Pagination

Some endpoints use cursor-based pagination for better performance with large datasets:

```bash
# First request
curl "https://api.web3identity.com/api/farcaster/feed?limit=20"

# Response includes a cursor
{
  "data": [...],
  "cursor": "eyJpZCI6MTIzNDV9"
}

# Next page using the cursor
curl "https://api.web3identity.com/api/farcaster/feed?limit=20&cursor=eyJpZCI6MTIzNDV9"
```

Cursor advantages:

- Consistent results even when new data is added
- Better performance for deep pagination
- No skipped or duplicated results

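Iterating with a cursor works much like the offset loop above. A minimal sketch, assuming the `{ "data": [...], "cursor": "..." }` response shape shown here and that the cursor is omitted once the last page has been reached:

```javascript
// Cursor-based iteration over the Farcaster feed.
// Assumes the cursor field is absent or null on the final page.
async function fetchAllFeedItems(limit = 20) {
  const results = [];
  let cursor = null;

  while (true) {
    const url = new URL('https://api.web3identity.com/api/farcaster/feed');
    url.searchParams.set('limit', limit);
    if (cursor) url.searchParams.set('cursor', cursor);

    const response = await fetch(url);
    const data = await response.json();
    results.push(...data.data);

    if (!data.cursor) break; // no cursor returned: nothing left to fetch
    cursor = data.cursor;

    // Respect rate limits, as in the offset examples above
    await new Promise(r => setTimeout(r, 100));
  }

  return results;
}
```
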
## Rate Limit Considerations

Each paginated request counts against your rate limit:

| Scenario | Requests Needed | Time Required |
|---|---|---|
| 100 results, limit=20 | 5 requests | ~5 seconds |
| 1000 results, limit=100 | 10 requests | ~10 seconds |
| 10000 results, limit=100 | 100 requests | ~4 minutes* |

*At the free-tier rate limit of 30 requests per minute

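The request counts above are simply `ceil(total / limit)`; the time estimates follow from dividing by your requests-per-minute allowance, ignoring per-request latency and any pauses your client adds:

```javascript
// Rough lower-bound estimate of requests and time for offset pagination.
// Ignores per-request latency and client-side pauses.
function estimatePagination(totalResults, limit = 100, requestsPerMinute = 30) {
  const requests = Math.ceil(totalResults / limit);
  const minutes = requests / requestsPerMinute;
  return { requests, minutes };
}

console.log(estimatePagination(10000, 100)); // { requests: 100, minutes: 3.33... }
```
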
### Batch When Possible

If you need data for multiple addresses, use batch endpoints instead of paginating each address individually.

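As a rough illustration of why this saves requests, compare a per-address pagination loop with a single batch call. The batch route and payload below are hypothetical placeholders, not documented endpoints of this API; check the API reference for the actual batch routes:

```javascript
const addresses = ['0xabc...', '0xdef...', '0x123...'];

// Per-address pagination: at least one request per address, more when hasMore is true.
// fetchAllTransactions is the helper defined earlier on this page.
const perAddress = await Promise.all(addresses.map(fetchAllTransactions));

// Batch request: a single call covering all addresses.
// NOTE: this URL and body are placeholders, not a documented route.
const response = await fetch('https://api.web3identity.com/api/wallet/batch/transactions', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ addresses }),
});
const batched = await response.json();
```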