CSV build issue - HELP!

OK…this is really bizarre.

I have code to build a CSV from an array of data that runs fine in preview in the editor but crashes in a browser: Request failed with status code 413

I would have expected this to be a file size related error but the file size is tiny - about 250KB when it completes in preview mode.

So here’s the code. I am confident that there’s nothing wrong with it: it runs perfectly in preview and, as long as I limit the number of records I send to it, it runs fine in a live browser too. I can send 615 records through it in a browser, but at 620 records it gives me that error. It’s not a data problem either: when I slice the array to start at the 600th record, it runs fine well beyond the 615th.

This is the code:

import { Permissions, webMethod } from "wix-web-module";
import { mediaManager } from "wix-media-backend";

export const generateMembershipCsvFromContacts = webMethod(Permissions.Anyone, async (contactsArray) => {
   const headers = [
       "Title",
       "First Name",
       "Last Name",
       "Member Number",
       "Email",
       "Phone",
       "Number of Journals",
       "Member Type",
       "Family Member Number",
       "Addr1",
       "Addr2",
       "Addr3",
       "Addr4",
       "Addr5",
       "Addr6",
       "Addr7",
       "Date Address Changed",
       "Date Email Changed",
       "Life Time Sub?",
       "Amount Last Paid",
       "Payment Method",
       "Payment Status",
       "Journal Type",
       "Last Payment Date",
       "Gift Aid?",
       "Renewal Date",
       "Payment Comments",
       "Joined Date",
       "Deceased Date",
       "Notes"
   ];
   console.log("Starting CSV generation");

   let csvContent = headers.join(",") + "\n";

   contactsArray.forEach(contact => {

       //console.log("Processing contact with ID:", contact._id); 

       let email = contact.info.emails && contact.info.emails.length > 0 ? contact.info.emails[0].email : "";
       let phone = contact.info.phones && contact.info.phones.length > 0 ? contact.info.phones[0].phone : "";

       let row = [
           contact.info.extendedFields["custom.title"] || "",
           contact.info.name.first || "",
           contact.info.name.last || "",
           contact.info.extendedFields["custom.membernumber"] || "",
           email,
           phone,
           contact.info.extendedFields["custom.numberofjournals"] || "",
           contact.info.extendedFields["custom.type"] || "",
           contact.info.extendedFields["custom.otherfamilynumber"] || "",
           contact.info.extendedFields["custom.addr-1-house-name"] || "",
           contact.info.extendedFields["custom.addr2"] || "",
           contact.info.extendedFields["custom.addr3"] || "",
           contact.info.extendedFields["custom.addr4city"] || "",
           contact.info.extendedFields["custom.addr5statecounty"] || "",
           contact.info.extendedFields["custom.addr6postcodezip"] || "",
           contact.info.extendedFields["custom.addr7country"] || "",
           contact.info.extendedFields["custom.dateaddresschanged"] || "",
           contact.info.extendedFields["custom.dateemailchanged"] || "",
           contact.info.extendedFields["custom.lifetime-subscription"] || "",
           contact.info.extendedFields["custom.amountlastpaid"] || "",
           contact.info.extendedFields["custom.paymentmethod"] || "",
           contact.info.extendedFields["custom.paymentstatus"] || "",
           contact.info.extendedFields["custom.journaltype"] || "",
           contact.info.extendedFields["custom.lastpaymentdate"] || "",
           contact.info.extendedFields["custom.giftaid"] || "",
           contact.info.extendedFields["custom.renewaldate"] || "",
           contact.info.extendedFields["custom.paymentcomments"] || "",
           contact.info.extendedFields["custom.joineddate"] || "",
           contact.info.extendedFields["custom.deceaseddate"] || "",
           contact.info.extendedFields["custom.notes"] || ""
       ].join(",");

       csvContent += row + "\n";
   });

   console.log("CSV content prepared, starting upload...");

   const csvBuffer = Buffer.from(csvContent);

   try {
       let uploadResult = await uploadCSV(csvBuffer);
       console.log("Upload successful, file URL:", uploadResult.fileUrl);
       return uploadResult.fileUrl;
   } catch (error) {
       console.error("Error during CSV upload:", error.message);
       throw error; // re-throw the error to ensure it gets caught by any caller
   }
});

async function uploadCSV(buffer) {
   console.log("Preparing to upload CSV file. Buffer size:", buffer.length, "bytes");

   try {
       let result = await mediaManager.upload(
           "/CSVdownloads",
           buffer,
           "Contacts.csv", {
               "mediaOptions": {
                   "mimeType": "text/csv",
                   "mediaType": "document"
               },
               "metadataOptions": {
                   "isPrivate": false,
                   "isVisitorUpload": false
               }
           }
       );
       console.log("CSV upload successful. File URL:", result.fileUrl);
       return result;
   } catch (error) {
       console.error("CSV upload failed:", error.message);
       throw error; // Re-throw the error to ensure it gets caught by the caller.
   }
}
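One aside on the row-building code above: join(",") doesn’t quote fields, so a comma, quote, or line break inside any value (names, notes, payment comments) will shift the columns in the output. A minimal escaper, as a sketch (escapeCsvField is a hypothetical helper, not part of the original code):

```javascript
// Minimal RFC 4180-style CSV field escaper: wrap a field in double quotes
// when it contains a comma, quote, or line break, and double any embedded quotes.
function escapeCsvField(value) {
  const s = String(value ?? "");
  return /[",\r\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

// In the row-building code above, each value would pass through it, e.g.:
//   escapeCsvField(contact.info.extendedFields["custom.notes"] || "")
```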

Does anyone have any idea where I should look to troubleshoot? I am stumped.

I am logged in on the same account in preview and a chrome browser…so I can’t see that it would be permissions related…it’s just plain WEIRD.

Thanks,
Simon.

What’s the size of the buffer for 620 entries? (buffer.length will give you the size in bytes). I see from our docs that the limit should be 1,000MB which I doubt your file is reaching but perhaps it’s hitting some other hard limit.

Also interesting how it works in Preview mode and not production. Often when we encounter differences between Preview and production it’s a permissions issue where the Preview site is running as an administrator and production is not. I’m not sure in this case but something worth looking at. Does this issue still happen on a live site even if you’re logged in to the site as an admin user?

Does the site have enough storage space as well?

Hi Anthony,

Thanks for looking into this.

The buffer.length is 285633 bytes. So basically it’s tiny. I can’t imagine why there’d be a limit on a CSV file of about 1350 rows…and yet, here we are…

The site has 120GB media storage. Most of it is unused at this point.

I already tried switching users around and changed the permission on the back-end function to Anyone to try to uncover the problem, but to no avail. Where I am today is logged into the site with the owner credentials, identical to how I am logged in to preview mode in the editor. And, as I say in my original post, I can run this function in both preview and production; the only difference is that in production it throws this 413 error once the file gets over about 620 rows in the CSV. I am including the error here in case someone smarter than me can decipher it:

Error generating CSV: Error: Request failed with status code 413
    at Object.<anonymous> (https://static.parastorage.com/services/wix-thunderbolt/dist/feature-telemetry-wix-code-sdk.612c9cae.chunk.min.js:1:10608)
    at Object.eval [as error] (https://static.parastorage.com/services/wix-code-viewer-app/1.1479.714/app.js:2:39805)
    at eval (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:12307)
    at l (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:3045)
    at Generator.eval [as _invoke] (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:2798)
    at Generator.eval [as throw] (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:3408)
    at f (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:8280)
    at c (https://bundler.wix-code.com/80739fe0-d4bc-4b66-83d5-6f8d06d91328/031c2c86-4d17-4914-b976-733eea1fdde9/dcb07c5f-1687-47ba-afde-ae7cc2ce4ac8/pages/h12wq.js?analyze-imported-namespaces=true&init-platform-api-provider=true&get-app-def-id-from-package-name=false&disable-yarn-pnp-mode=false&bundler-traffic-to-aws=false&bundler-typescript-analysis=false&dependencies-token=3938&cache-buster=cachebuster2:2:8519)

I’ve opened a support ticket, which I am hoping will find its way to the media manager team eventually, as this looks awfully like a bug to me but I’d love to be wrong and hear I am doing something dumb (again).

Simon.

Support is definitely the way to go for this. From your description and the code above I don’t think there’s anything wrong with the code.

I’ll also ping internally to see if we can get someone to look at this.

Edit: Can you also share your support ticket ID here so support can reference this thread?

Hi again Anthony.

The support ticket is Request 125598472.

They’re asking for a screencast too, which is a challenge for me at the moment as I am on holiday, but I am hoping they can manage without that…they may need some encouragement though, so I’ll take any help I can get, thanks.

Simon.

Got it. The support team is aware and looking at it. Not sure if the screencast is a hard requirement. You’ll need to follow up with them if it is.

Well, after more than 2 weeks of waiting, the dev team have invented a “request payload limit” of 512KB on web.js operations. It’s not documented anywhere that I can find, but that is what is causing this…and presumably it’s not applied in Preview Mode.

This is their reply in full:

After a very thorough investigation by our development team, it appears the request payload (for web methods - web.js files) is exceeding the limit of 512 KB.

The code should be refactored with consideration of the request payload limit.

For example, you can consider:

  • on frontend - convert file to base64, split it to chunks and save these chunks to Wix collection
  • on backend - download from Wix collections chunks, join them back to 1 string, then base64 to buffer, then call mediaManager.upload()
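For what it’s worth, the 413 is triggered by the JSON payload of the contacts array sent to the web method, so 620 contact objects with all their extended fields can plausibly exceed 512KB even though the finished CSV is only ~285KB. The chunking flow support describes might be sketched like this. The encode/decode helpers are hypothetical and run anywhere; the wixData and mediaManager.upload() calls are shown as comments because they only run inside a Wix environment:

```javascript
// Frontend side: split the records into chunks and base64-encode each one.
// (In the browser this would be btoa(JSON.stringify(slice)); Buffer is used
// here so the sketch runs anywhere.)
function encodeChunks(records, chunkSize) {
  const chunks = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    const slice = records.slice(i, i + chunkSize);
    chunks.push(Buffer.from(JSON.stringify(slice)).toString("base64"));
  }
  // Each chunk would then be saved as one collection item, e.g.:
  //   wixData.bulkInsert("CSVUpload", chunks.map((data, chunkIndex) => ({ chunkIndex, data })));
  return chunks;
}

// Backend side: load the chunks (ordered by chunkIndex), decode, and rejoin.
// The reassembled array then feeds the CSV builder, and the resulting Buffer
// goes to mediaManager.upload() as before.
function decodeChunks(base64Chunks) {
  return base64Chunks.flatMap((chunk) =>
    JSON.parse(Buffer.from(chunk, "base64").toString("utf8"))
  );
}
```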

I’ve asked for clearer guidance from them but thought I’d post this here as well. I do not understand quite what they’re asking me to do, or how it will help. Can anyone throw any light on how I should approach this?

Simon.

I understand the frustration.

Another option you have is to use generateFileUploadUrl - Velo API Reference - Wix.com and then use fetch/axios or another library to HTTP PUT your CSV file to the generated URL.

So:

  • Generate a file upload URL in a web method and return it to the frontend
  • On the frontend do axios.put(that_url, file, options)

There’s also a resumable upload equivalent: generateFileResumableUploadUrl - Velo API Reference - Wix.com
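A rough sketch of that flow, with the Wix-specific calls left as comments since the exact generateFileUploadUrl signature should be checked against the Velo reference (buildPutRequest and getCsvUploadUrl are hypothetical names):

```javascript
// Backend web method (.web.js), assumed API shape — verify in the Velo docs:
//   import { files } from 'wix-media.v2';
//   export const getCsvUploadUrl = webMethod(Permissions.Anyone, async () => {
//     const { uploadUrl } = await files.generateFileUploadUrl('text/csv', { fileName: 'Contacts.csv' });
//     return uploadUrl;
//   });

// Frontend: PUT the CSV text straight to the signed URL, so the web-method
// request stays tiny (only the URL travels through web.js, not the payload).
function buildPutRequest(csvString) {
  return {
    method: "PUT",
    headers: { "Content-Type": "text/csv" },
    body: csvString,
  };
}

// Usage on the page:
//   const uploadUrl = await getCsvUploadUrl();
//   await fetch(uploadUrl, buildPutRequest(csvContent));
```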

Thanks Anthony. Will take a look later. This job has already taken me way longer than expected, without hiccups like this.

So it turns out that the failure isn’t in the back-end at all: the back-end call fails when you try to send this much data to it. The limit is either 512kb or 512KB; it’s very confusingly spec’d and communicated by Wix.

So QA suggested that I write the content to a collection as base64-encoded blocks that can be retrieved in the back-end. Seems like a pain, but I need to get this working, so I am giving it a try…but I can’t write this to a collection without getting a WDE0109 error (Payload is too large). Once again, this error is ONLY seen on the live site; it runs fine in preview. It doesn’t matter if I break the data into 15 or 250 pieces, I still get this error, which once again is mentioned nowhere in the API docs or, for that matter, anywhere Google can crawl.

This is my code:

import wixData from 'wix-data';
// Note: allContacts and id are populated elsewhere in the page code, and
// generateMembershipCsvFromContacts is imported from the backend .web.js file.

const CHUNK_SIZE = 5; // Configurable chunk size

// Main function to process and save contacts
export async function exportCSV_click(event) {
  try {
    console.log("exportCSV Running");

    if (allContacts.length > 0) {
      let chunks = chunkArray(allContacts, CHUNK_SIZE);
      console.log(`Created ${chunks.length} chunks`);

      let base64Chunks = chunks.map(arrayToBase64);

      await saveChunks(base64Chunks);

      console.log('All chunks saved successfully.');
    } else {
      console.log("No contacts to process.");
    }

  } catch (err) {
    console.error("Error processing contacts:", err);
  }
  console.log("Sending id to back-end for retrieval", id);
  await generateMembershipCsvFromContacts(id);
}

// Function to split the array into chunks
function chunkArray(array, chunkSize) {
  let result = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    result.push(array.slice(i, i + chunkSize));
  }
  return result;
}

// Function to convert array to base64
function arrayToBase64(array) {
  try {
    return btoa(JSON.stringify(array));
  } catch (error) {
    console.error('Failed to convert array to base64:', error);
    throw error;
  }
}

// Function to save chunks to the collection
async function saveChunks(chunks) {
  let items = chunks.map((chunk, index) => ({
    chunkIndex: index,
    data: chunk,
  }));

  console.log('Items to insert:', items); // Log items before inserting

  try {
    let result = await wixData.bulkInsert('CSVUpload', items);
    console.log(`bulkInsert result: ${result.inserted}`);
  } catch (error) {
    console.error('Failed to save chunks:', error);
  }
}

I can’t find any limits on a bulkInsert() beyond the following:

* The maximum size of an item that you can add to a collection is 500kb.
* Bulk operations are limited to 1000 items per function call.

…and I am not violating either of those.

This seems quite odd to me again…any pointers would be appreciated.

Simon.

I broke the number of chunks in each bulkInsert down to a smaller number and it is working now. Can’t say I am happy with how I reached this point but at least my client gets what he wants.
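For anyone landing here with the same WDE0109, the workaround Simon describes (capping how many items go into each bulkInsert call) might look like this sketch. batchItems is a hypothetical helper, and BATCH_SIZE is a number to tune empirically since the live-site payload limit is undocumented; the wixData call is a comment because it needs the Wix runtime:

```javascript
const BATCH_SIZE = 10; // tune empirically; the live-site payload limit is undocumented

// Split an items array into batches of at most batchSize items.
function batchItems(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Then insert each batch with its own bulkInsert call, so no single request
// carries the whole payload:
//   for (const batch of batchItems(items, BATCH_SIZE)) {
//     await wixData.bulkInsert("CSVUpload", batch);
//   }
```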

Simon, are you trying to just export a query into a CSV? I have done this on a few sites by using an iFrame on the frontend. You can use your web module to query and map your data, but the building of the CSV and the exporting can be done in the iFrame.

Here’s a brief example of how to do this:

Page Code


// Export CMS data into a CSV-compatible file for Excel
import wixData from 'wix-data';
import { Permissions, webMethod } from 'wix-web-module';

let dataToUpload = null; // We will add our data collection to this buffer

function sendMessageToComponent(message) {
    // send message to the HTML element
    $w("#htmlDownload").postMessage(message);
}

async function readDataCollection() {
    // I like to use an item[0] to help ensure formatting => not required, but if your
    // data has commas it will not be in the right column after exporting. Item 0
    // ensures your headers will match.
    let firstItem = await wixData.query('Collection Name').eq('_id', '12345').find();
    let item0 = firstItem.items[0];

    // you can map the item and change the values if that's preferred, like so:
    item0 = Object.keys(item0).map((item) => ({
        [item]: 'USED FOR EXPORT'
    }));

    // get your data
    const dataToExportQuery = await getAllUnfilteredDataByDateRange({ startDate: '', endDate: '' });

    dataToUpload = [item0].concat(dataToExportQuery);
    // Prep for download
    console.log("Create payload for download");
    let jsonPayload = JSON.stringify(dataToUpload);
    sendMessageToComponent({ action: "data", payload: jsonPayload });
}

// Note: in Studio this webMethod belongs in a backend .web.js file and is
// imported into the page code; it is shown inline here for brevity.
// collectionName is defined elsewhere.
const getAllUnfilteredDataByDateRange = webMethod(Permissions.Anyone, async (dateRange) => {
    const { startDate, endDate } = dateRange;
    let query = await wixData.query(collectionName)
        .ge('_createdDate', new Date(startDate))
        .lt('_createdDate', new Date(endDate))
        .limit(1000)
        .find();
    let allItems = query.items;
    while (query.hasNext()) {
        query = await query.next();
        allItems = allItems.concat(query.items);
    }
    return allItems;
});

iFrame Code

<html>
<body>
    <p id="noticeBoard">Document Loading...</p>
    <!-- <button onclick='download_file("my_file.txt", dynamic_text())'>Download</button> -->
    <script>
    
    document.onreadystatechange = function () {
        if (document.readyState === "complete") {
            console.log("Document Loaded!");
            updateNoticeBoard("Document Loaded!");
        }
    }

    function updateNoticeBoard(message) {
        let messageArea = document.getElementById('noticeBoard');
        messageArea.innerHTML = message;
    }

    window.onmessage = (event) => {
        console.log("Message Received: "+(event.data.action ? event.data.action : "Bad Message"));
        if (event.data) {
            messageReceived(event.data);
        }
    };

    function sendMessageToPage(message) {
        window.parent.postMessage(message, "*");
    }

    function messageReceived(message) {
        if (message && message.action === 'data' && message.payload) {
            // Convert JSON to csv
            download_file("data.csv", csvFromJson(message.payload));
        } else if (message && message.payload) {
            updateNoticeBoard(message.payload);
        }
    }

    function dynamic_text() {
        return "create your dynamic text here";
    }

    // Courtesy https://stackoverflow.com/questions/18249290/generate-csv-for-excel-via-javascript-with-unicode-characters
    function download_file(name, contents, mime_type) {
        mime_type = mime_type || "text/csv;charset=utf-8";

        var blob = new Blob(['\uFEFF', contents], {type: mime_type});

        var dlink = document.createElement('a');
        dlink.download = name;
        dlink.href = window.URL.createObjectURL(blob);
        dlink.onclick = function(e) {
            // revokeObjectURL needs a delay to work properly
            var that = this;
            setTimeout(function() {
                window.URL.revokeObjectURL(that.href);
            }, 1500);
        };

        dlink.click();
        dlink.remove();
    }

    function csvFromJson(jsonData) {
        const items = JSON.parse(jsonData);
        const replacer = (key, value) => value === null ? '' : value // specify how you want to handle null values here
        const header = Object.keys(items[0])
        let csv = items.map(row => header.map(fieldName => JSON.stringify(row[fieldName], replacer)).join(','))
        csv.unshift(header.join(','))
        csv = csv.join('\r\n')

        console.log(csv)
        return csv
    }
    </script>
</body>
</html>

You can also use a CSV NPM if you want to do it from the backend.

My problem with the back-end approach is that Wix have limited what you can send to the back-end in Studio. This idea of using an iFrame to do it all in the front end, obviating the need to put the file in the media manager, is masterful though. Much better than what I’ve done in the end.
