Review data loading operations to ensure they are properly scoped, filtered, and batched to prevent performance issues. Large datasets should be handled with appropriate pagination, filtered by relevant criteria (such as date ranges), and never loaded into memory in unbounded quantities.
Key areas to check:
- Queries that load entire collections without pagination or limits
- Missing filters (e.g. date-range cutoffs) on potentially large result sets
- Operations that accumulate unbounded data in memory before processing
Example of problematic pattern:
// Loads ALL jobs for ALL sources without filtering
const allJobs = await Promise.all(
  dataSources.map(async (source) => {
    // This could return 283,914 items for large teams
    return await api.externalDataSources.jobs(source.id, null, null)
  })
)
Better approach:
// Apply filtering and reasonable limits upfront
const recentJobs = await Promise.all(
  dataSources.map(async (source) => {
    return await api.externalDataSources.jobs(
      source.id,
      cutoffDate,       // Filter by date
      REASONABLE_LIMIT  // Appropriate batch size
    )
  })
)
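When a single capped request is not enough, pagination keeps memory bounded per page. The sketch below assumes a hypothetical cursor-based endpoint `jobsPage` (not necessarily the real API) that returns `{ items, nextCursor }`; the hard cap guards against runaway loads even if the cursor never ends:

```javascript
// Sketch of cursor-based pagination with a hard cap.
// `jobsPage` is a hypothetical endpoint: it returns { items, nextCursor },
// where nextCursor is null once the last page has been served.
async function loadRecentJobs(api, sourceId, cutoffDate,
                              { pageSize = 500, maxItems = 5000 } = {}) {
  const jobs = []
  let cursor = null
  do {
    // Each request is bounded, so memory grows one page at a time
    const page = await api.externalDataSources.jobsPage(
      sourceId, cutoffDate, pageSize, cursor
    )
    jobs.push(...page.items)
    cursor = page.nextCursor
  } while (cursor !== null && jobs.length < maxItems)
  return jobs.slice(0, maxItems) // never hand back more than the cap
}
```

The cap (`maxItems`) is the important part to review for: without it, a paginating loop can still drain the entire collection for a large team.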