How to Resolve “WDE0116: request entity too large” with Google Cloud SQL Integration?

When making requests from Wix to my external Google Cloud SQL (PostgreSQL) database, I'm receiving the error "WDE0116: request entity too large" in Wix:

To investigate further, I checked the logs in Google Cloud, where I found this specific error:

```
PayloadTooLargeError: request entity too large
    at readStream (/usr/lib/app/node_modules/raw-body/index.js:156:17)
    ...
```

The data I’m sending is around 250 KB in total, with individual fields that might contain objects up to 150 KB. These sizes seem relatively small, and I wouldn’t expect them to exceed any reasonable limits. I find it unusual to encounter this error, as both Wix and Google Cloud Run should be able to handle payloads of this size comfortably.

Could anyone clarify what might be causing this issue? Are there specific configuration steps I need to take in Wix or Google Cloud to increase the request size limit? Any guidance on debugging or resolving this would be greatly appreciated.

The error you’re encountering, PayloadTooLargeError, indicates that the server or middleware processing your request is rejecting it due to exceeding a defined payload size limit. Here’s how to address the issue for both Wix and Google Cloud SQL (PostgreSQL):


Step 1: Check Wix Backend Code

In Wix, the payload size limit may be related to how HTTP requests are made from backend code. The Fetch API used in Wix backend code does not impose a payload size limit itself, but requests can still fail when the receiving server rejects large bodies. Ensure the following:

  1. Compress Your Payload: Reduce the size of your payload, for example by gzip-compressing it or by stripping unnecessary fields before serializing with JSON.stringify.
  2. Stream Data: If possible, divide large payloads into smaller chunks and process them incrementally.
  3. Log the Actual Payload Size: Add logging to track how large the payload is before sending the request:

```javascript
// Note: .length counts UTF-16 code units; Buffer.byteLength gives actual bytes
console.log(`Payload size: ${Buffer.byteLength(JSON.stringify(data), 'utf8')} bytes`);
```

Step 2: Google Cloud Run/Cloud SQL Configuration

By default, Google Cloud Run has a maximum HTTP request payload size limit of 32 MB, so your payload of ~250 KB should not trigger the error. However, middleware or other configurations may set lower limits.

  1. Check Middleware in Your Cloud Run Service:
  • If you’re using libraries like express to handle incoming requests, ensure the payload limit is properly configured.
  • Example for express:

```javascript
const express = require('express');
const app = express();

app.use(express.json({ limit: '10mb' })); // Set a reasonable limit
app.use(express.urlencoded({ limit: '10mb', extended: true }));
```
  2. Inspect the raw-body Package:
  • The error is thrown by the raw-body package, which body-parsing middleware uses internally. The limit it enforces comes from that middleware: body-parser (and express.json) defaults to 100kb, which would reject a ~250 KB payload. Raise it in your server code:

```javascript
const bodyParser = require('body-parser');
app.use(bodyParser.json({ limit: '10mb' }));
```

Step 3: Verify Network Configuration

If you’re connecting to Cloud SQL through an HTTP service (Cloud SQL itself does not accept HTTP requests):

  1. Ensure HTTPS: For secure communication, use HTTPS for the API endpoint.
  2. Use an API Gateway:
  • Implement a Google Cloud API Gateway or another proxy to handle large payloads efficiently.
  • This can also offload compression or streaming.

Step 4: Wix API Gateway Limits

Wix’s backend services have limits that might restrict payload sizes. For example:

  • Payload size limits for fetch() or http-functions in Wix may be smaller than expected.

To resolve this:

  1. Use External HTTP Proxy: Proxy your requests through a third-party service like Google Cloud API Gateway, AWS API Gateway, or a custom proxy.
  2. Adjust Payload on the Client Side: Break down large payloads and reassemble them server-side.

Example Fix: Splitting Payloads

If you cannot adjust the middleware limits, divide the payload:

```javascript
const MAX_SIZE = 100000; // 100 KB per chunk
const chunks = [];
let startIndex = 0;

// Split the serialized payload (a string here) into fixed-size chunks
while (startIndex < data.length) {
    chunks.push(data.slice(startIndex, startIndex + MAX_SIZE));
    startIndex += MAX_SIZE;
}

// Note: forEach fires all requests in parallel, so chunks may arrive out of
// order; send an index with each chunk (or await them in sequence) if the
// server needs to reassemble the original payload.
chunks.forEach((chunk) => {
    fetch('https://your-endpoint.com', {
        method: 'POST',
        body: JSON.stringify(chunk),
        headers: { 'Content-Type': 'application/json' }
    })
    .then(response => response.json())
    .then(result => console.log('Chunk processed', result))
    .catch(error => console.error('Error processing chunk', error));
});
```

Debugging Steps

  1. Log in Wix:
  • Check the exact payload size and content.
  2. Check Google Cloud Logs:
  • Verify the request size and confirm the error is caused by payload size, not a misconfiguration.
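For the Google Cloud side of the debugging steps above, a tiny middleware (hypothetical; register it before any body-parsing middleware in the Cloud Run service) makes the declared request size visible in Cloud Logging even for requests that body parsing rejects:

```javascript
// Logs the Content-Length header before body parsing can reject the request.
function logRequestSize(req, res, next) {
    const declared = req.headers['content-length'];
    console.log(`Incoming ${req.method} ${req.url}: Content-Length = ${declared || 'unknown'} bytes`);
    next();
}

// Usage with express (before express.json so oversized requests still get logged):
// app.use(logRequestSize);
// app.use(express.json({ limit: '10mb' }));
```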

By addressing these areas, you can prevent the PayloadTooLargeError and ensure smooth communication between Wix and Google Cloud SQL.

Your reply is an AI-generated reply and doesn't make sense a lot of the time. And I have looked into most of it.

The answer is :slight_smile:

This is the limiting factor: Wix Velo when using an external database…

Thanks for the link.