Problem fetching a big JSON file

First of all, I am completely new to Wix, and I hope you can help me with this problem.

I want to import articles to my website. I have a JSON file with 10,000 articles; the file is roughly 50 MB in size.

The JSON file looks like this:

{
    "root": {
        "item": [
            {
                "id": "1",
                "mpn": "",
                "text": "test",
                "man": "",
                "ean": "",
                "price": "33",
                "stock": "0",
                "ordered": "",
                "eta": "",
                "indate": "",
                "weight": "0.14",
                "eol": "0",
                "cat_id": "152",
                "cat1": "Kabel & Adapter",
                "cat2": "Computer",
                "cat3": "",
                "cat4": "",
                "cat5": "",
                "cat6": "",
                "title":"test article",
                "short_desc":"",
                "short_summary":"",
                "long_summary":"",
                "marketing_text":"",
                "specs":"",
                "pdf":"",
                "pdf_manual":"",
                "images_s":"",
                "images_m":"",
                "images_l":"",
                "images_xl":""
            },
            ...

I use the getJSON function from wix-fetch to fetch the JSON.
In the frontend I have a button that calls this backend function:
import { getJSON } from 'wix-fetch';

export function getJsonProducts() {
    // "link" stands in for the actual URL of the JSON file
    return getJSON("link")
        .then(json => console.log(json.someKey))
        .catch(err => console.log(err));
}
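
For reference, the frontend side of that wiring might look something like this; a minimal sketch, assuming the backend function lives in a web module such as backend/products.jsw and the button's ID is #importButton (both names are assumptions):

import { getJsonProducts } from 'backend/products.jsw';

$w.onReady(function () {
    // #importButton is a placeholder ID for the button on the page
    $w('#importButton').onClick(() => {
        getJsonProducts();
    });
});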

It works perfectly for smaller JSON files, but every time I try to fetch the big JSON file I get nothing in the console (no output, no errors), which is why I think the request is timing out.

Is there a way to fetch only pieces of the file?
Or am I missing something obvious, and there is a better way to fetch the file / import it into the database?

I found a better way, but it's also not working 100%; I still can't fetch the 50 MB file.
New code:

import { getJSON } from 'wix-fetch';

export async function getJsonProducts() {
    console.log("fetching");
    try {
        // "link" stands in for the actual URL of the JSON file
        const json = await getJSON("link");
        console.log(json.root.item[0]);
        console.log("end");
    } catch (err) {
        console.log(err);
    }
}

Did you get any answers to this? I am trying to load a 5 MB file into a JSON structure, which I can do, but when I try to process it (looping through each item) it just seems to stop with no errors, so I assume I'm hitting some memory limit?

Are you trying to load all 10,000 items for each user? If so, that will cause massive performance overhead on your site. You could instead make the request once, save the returned information to a database, and then render the articles with a repeater and a dynamic page, for instance.
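
A minimal backend sketch of that one-time import, assuming a collection named "Articles" whose field keys match the JSON keys (the collection name is an assumption; wix-data's bulkInsert accepts at most 1,000 items per call):

import { getJSON } from 'wix-fetch';
import wixData from 'wix-data';

export async function importProducts() {
    // "link" stands in for the feed URL from the question above
    const json = await getJSON("link");
    const items = json.root.item;
    const CHUNK_SIZE = 1000; // bulkInsert is limited to 1,000 items per call

    for (let i = 0; i < items.length; i += CHUNK_SIZE) {
        const chunk = items.slice(i, i + CHUNK_SIZE);
        await wixData.bulkInsert("Articles", chunk);
    }
}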

Regarding size limits, some APIs do have a size limit for each request. This is because it is very time-consuming and computationally heavy (especially with relational databases) to query, join, format, and send large amounts of information in a single request. More than likely they also have a maximum amount of time that a request can be open before being timed out. I would reach out to whoever is hosting the API you are querying to see if they have a way of delivering the full dataset in smaller chunks. That would let you fetch and render smaller pieces of data depending on where the user navigates on the site, improving performance for both your site and their API.
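
If the provider does support chunked delivery, the backend loop could look roughly like this; a sketch only, since the offset and limit query parameters are hypothetical and the real parameter names would depend on the API:

import { getJSON } from 'wix-fetch';

export async function fetchAllPages() {
    const PAGE_SIZE = 500; // hypothetical page size
    let offset = 0;
    let allItems = [];
    let page;

    do {
        // offset/limit are placeholder parameter names; check the API's docs
        page = await getJSON(`link?offset=${offset}&limit=${PAGE_SIZE}`);
        allItems = allItems.concat(page.root.item);
        offset += PAGE_SIZE;
    } while (page.root.item.length === PAGE_SIZE);

    return allItems;
}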