Pagination

Many endpoints return large datasets. Use pagination to fetch results efficiently.

Pagination Parameters

| Parameter | Type    | Default | Description                |
| --------- | ------- | ------- | -------------------------- |
| limit     | integer | 20      | Results per page (max 100) |
| offset    | integer | 0       | Number of results to skip  |

Basic Usage

# First page (results 1-20)
curl "https://api.web3identity.com/api/wallet/0x.../transactions"

# Second page (results 21-40)
curl "https://api.web3identity.com/api/wallet/0x.../transactions?limit=20&offset=20"

# Third page (results 41-60)
curl "https://api.web3identity.com/api/wallet/0x.../transactions?limit=20&offset=40"
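The offset for any page follows directly from the page number. As a small sketch (the helper name `pageParams` is illustrative, not part of the API):

```javascript
// Convert a 1-based page number into limit/offset query parameters.
function pageParams(page, limit = 20) {
  return { limit, offset: (page - 1) * limit };
}

// pageParams(3) → { limit: 20, offset: 40 } — the third page, results 41-60,
// matching the third curl example above.
```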

Response Format

Paginated responses include metadata:

{
  "data": [...],
  "pagination": {
    "limit": 20,
    "offset": 0,
    "total": 1547,
    "hasMore": true
  }
}

| Field   | Description              |
| ------- | ------------------------ |
| limit   | Results per page         |
| offset  | Current offset           |
| total   | Total results available  |
| hasMore | Whether more pages exist |
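The metadata is enough to derive the page count and current page client-side. A minimal sketch, assuming `pagination` is the object shown above (the helper name is illustrative):

```javascript
// Derive paging info from the pagination metadata in a response.
function pageInfo(pagination) {
  const totalPages = Math.ceil(pagination.total / pagination.limit);
  const currentPage = Math.floor(pagination.offset / pagination.limit) + 1;
  return { totalPages, currentPage, hasMore: pagination.hasMore };
}

// With the example response ({ limit: 20, offset: 0, total: 1547 }):
// 1547 / 20 rounds up to 78 pages, and offset 0 is page 1.
```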

Paginated Endpoints

These endpoints support pagination:

| Endpoint                            | Default Limit | Max Limit |
| ----------------------------------- | ------------- | --------- |
| /api/wallet/{address}/transactions  | 20            | 100       |
| /api/wallet/{address}/nfts          | 20            | 100       |
| /api/farcaster/user/{fid}/casts     | 20            | 100       |
| /api/farcaster/user/{fid}/followers | 20            | 100       |
| /api/ens/domains/{address}          | 50            | 100       |
| /api/defi/protocols                 | 50            | 100       |
| /api/yields                         | 50            | 100       |

Iterating Through All Results

JavaScript

async function fetchAllTransactions(address) {
  const allResults = [];
  let offset = 0;
  const limit = 100;

  while (true) {
    const response = await fetch(
      `https://api.web3identity.com/api/wallet/${address}/transactions?limit=${limit}&offset=${offset}`
    );
    const data = await response.json();

    allResults.push(...data.data);

    if (!data.pagination.hasMore) break;
    offset += limit;

    // Respect rate limits
    await new Promise(r => setTimeout(r, 100));
  }

  return allResults;
}

Python

import requests
import time

def fetch_all_transactions(address):
    all_results = []
    offset = 0
    limit = 100

    while True:
        response = requests.get(
            f"https://api.web3identity.com/api/wallet/{address}/transactions",
            params={"limit": limit, "offset": offset}
        )
        data = response.json()

        all_results.extend(data["data"])

        if not data["pagination"]["hasMore"]:
            break
        offset += limit

        # Respect rate limits
        time.sleep(0.1)

    return all_results

Best Practices

Use Reasonable Page Sizes

// ✅ Good: Reasonable limit
const response = await fetch(`${API}/transactions?limit=50`);

// ❌ Bad: Too small (many requests)
const response = await fetch(`${API}/transactions?limit=5`);

// ❌ Bad: Max limit every time (may be slow)
const response = await fetch(`${API}/transactions?limit=100`);

Handle Empty Results

const data = await response.json();

if (data.data.length === 0) {
  console.log('No results found');
  return;
}

Implement Rate Limit Awareness

async function paginateWithRateLimit(baseUrl, limit = 50) {
  const results = [];
  let offset = 0;
  let requestCount = 0;

  while (true) {
    // Check rate limit (30 req/min for free tier)
    if (requestCount >= 25) {
      console.log('Approaching rate limit, pausing...');
      await new Promise(r => setTimeout(r, 60000));
      requestCount = 0;
    }

    const response = await fetch(`${baseUrl}?limit=${limit}&offset=${offset}`);
    requestCount++;

    if (response.status === 429) {
      const retryAfter = response.headers.get('Retry-After') || 60;
      await new Promise(r => setTimeout(r, retryAfter * 1000));
      continue;
    }

    const data = await response.json();
    results.push(...data.data);

    if (!data.pagination.hasMore) break;
    offset += limit;
  }

  return results;
}

Cursor-Based Pagination

Some endpoints use cursor-based pagination for better performance with large datasets:

# First request
curl "https://api.web3identity.com/api/farcaster/feed?limit=20"

# Response includes cursor
{
  "data": [...],
  "cursor": "eyJpZCI6MTIzNDV9"
}

# Next page using cursor
curl "https://api.web3identity.com/api/farcaster/feed?limit=20&cursor=eyJpZCI6MTIzNDV9"

Cursor advantages:

  • Consistent results even when new data is added
  • Better performance for deep pagination
  • No skipped/duplicate results
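
The curl examples above translate to a simple loop: pass the cursor from each response into the next request. A sketch, assuming the final page returns a null (or absent) `cursor` — adjust the termination check to the API's actual behavior:

```javascript
// Iterate a cursor-paginated endpoint until no cursor is returned.
// Assumption: the last page sets `cursor` to null or omits it.
async function fetchAllCasts(baseUrl, limit = 20) {
  const results = [];
  let cursor = null;

  do {
    const url = new URL(baseUrl);
    url.searchParams.set('limit', limit);
    if (cursor) url.searchParams.set('cursor', cursor);

    const response = await fetch(url);
    const data = await response.json();

    results.push(...data.data);
    cursor = data.cursor; // opaque token; pass it back unchanged
  } while (cursor);

  return results;
}
```

Treat the cursor as opaque: store and resend it as-is rather than decoding or constructing it yourself.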

Rate Limit Considerations

Each paginated request counts against your rate limit:

| Scenario                  | Requests Needed | Time Required |
| ------------------------- | --------------- | ------------- |
| 100 results, limit=20     | 5 requests      | ~5 seconds    |
| 1000 results, limit=100   | 10 requests     | ~10 seconds   |
| 10000 results, limit=100  | 100 requests    | ~4 minutes*   |

*At 30 req/min rate limit for free tier
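
The table's arithmetic generalizes: requests = ceil(total / limit), and minimum time = requests / rate limit. A small sketch (the helper name is illustrative):

```javascript
// Estimate request count and minimum wall-clock time for a full paginated
// fetch. rateLimit is requests per minute (30 on the free tier).
function estimatePagination(total, limit, rateLimit = 30) {
  const requests = Math.ceil(total / limit);
  const minutes = requests / rateLimit;
  return { requests, minutes };
}

// estimatePagination(10000, 100) → 100 requests, ~3.3 minutes at 30 req/min
```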

Batch When Possible

If you need data for multiple addresses, use batch endpoints instead of paginating each individually.
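
As a hypothetical sketch only — the real batch endpoint path and request shape may differ, so check the batch endpoints documentation — the idea is that one POST for N addresses replaces N separate paginated GET loops:

```javascript
// HYPOTHETICAL: '/api/batch/wallets' and the { addresses } payload are
// assumptions for illustration, not confirmed API details.
async function fetchWalletsBatch(addresses) {
  const response = await fetch('https://api.web3identity.com/api/batch/wallets', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ addresses }),
  });
  return response.json();
}
```

For 10 addresses this is 1 request against the rate limit instead of 10 or more.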