How to Efficiently Handle Large Datasets with JSData in a Node.js App?
8 months ago by Nathaniel Quinn
Hi everyone,
I'm building a Node.js app using JSData with a REST adapter to manage a dataset of ~50,000 records (product data with fields like ID, name, price, etc.). I'm running into performance issues when querying and filtering large datasets, especially when doing client-side filtering with DS.filter. The app feels sluggish, and memory usage spikes.

What's the best approach to optimize JSData for handling large datasets? Should I lean more on server-side filtering with the REST API, or are there specific JSData configs (like caching or query optimization) I should tweak? Any tips or examples for improving performance would be awesome! Thanks!
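For context, here's roughly the pattern I'm hitting. As I understand it, DS.filter evaluates the query against records already injected into the in-memory store, so every call rescans the whole collection. This is a plain-JavaScript stand-in to show the shape of the problem (synthetic data; `filterInMemory` is my own simplified helper, not JSData's actual API):

```javascript
// Synthetic stand-in for the ~50k product records sitting in the store.
const products = Array.from({ length: 50000 }, (_, i) => ({
  id: i,
  name: `Product ${i}`,
  price: (i % 1000) / 10,
}));

// Simplified version of what a DS.filter-style query does: a full scan
// of the in-memory collection on every call. With 50k records and
// frequent queries, this is where my sluggishness seems to come from.
function filterInMemory(items, where) {
  return items.filter((item) =>
    Object.entries(where).every(([field, cond]) =>
      typeof cond === 'object' && '>=' in cond
        ? item[field] >= cond['>=']
        : item[field] === cond
    )
  );
}

const expensive = filterInMemory(products, { price: { '>=': 99 } });
console.log(expensive.length); // scans all 50k records to find matches
```

Each query like this walks all 50,000 objects, which also means the full dataset has to stay resident in memory, so I suspect I should be pushing the `where` clause through to the REST adapter instead of injecting everything up front.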
