How can I modify the Corvid code depending on whether the page is viewed in a browser or crawled by Google's robots?

You can use the site: operator in a Google search; it shows all the indexed pages for that URL.

No, that does not suit me.

I would like to be able to modify the content of the page (for example, applying no filter to the data) depending on whether it is viewed by an end user or by Googlebot.

You can check the Page Settings, under the SEO tab and the Advanced SEO tab. There you can change the meta info and set your own custom meta/link info.

That is not what I want.
I want to know, directly from the Velo/Corvid code in developer mode, whether it is a standard user or a search engine that is browsing the page.

Gets the page's SEO-related meta tags: https://www.wix.com/velo/reference/wix-seo/metatags

import wixSeo from 'wix-seo';

// Look for a 'robots: noindex' meta tag on the page.
const index = wixSeo.metaTags.findIndex((i) => {
  return i.name === 'robots' && i.content === 'noindex';
});

if (index > -1) {
  // page is not indexed
} else {
  // page is indexed
}
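For completeness, the same check can be factored into a small helper you can test on its own (hasNoindexTag is an illustrative name, not a Wix API; on a Wix page, wixSeo.metaTags supplies the live array):

```javascript
// Illustrative helper: true if a 'robots: noindex' meta tag is present.
function hasNoindexTag(metaTags) {
  return metaTags.findIndex(
    (tag) => tag.name === 'robots' && tag.content === 'noindex'
  ) > -1;
}

// On a Wix page you would feed it the live tags:
// import wixSeo from 'wix-seo';
// const indexed = !hasNoindexTag(wixSeo.metaTags);
```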

I tried with this code:


let NumParPage = 10;

var botPattern = "(googlebot\/|bot|google|baidu|bing|msn|slurp|yande)";
var re = new RegExp(botPattern, 'i');

let userAgent = '';
export function myRouter_Router(request) {
  userAgent = request.userAgent;
}

console.log(userAgent);

if (re.test(userAgent)) {
  NumParPage = 100;
  console.log('crawler');
} else {
  NumParPage = 10;
  console.log('browser');
}



but the userAgent value is always empty ???

I don't understand this:

export function myRouter_Router(request) {
  userAgent = request.userAgent;
}
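userAgent stays empty because myRouter_Router only runs on the backend (in routers.js) when a request actually hits the router, while the console.log and the if statement at module level run once, at load time, before any request arrives. The check has to move inside the router itself. A minimal sketch, assuming a router prefix named myRouter; isCrawler, numPerPage, and 'my-page' are illustrative names, and ok is the standard wix-router response:

```javascript
// Pure helper so the bot detection can be tested on its own
// (same pattern as the question; 'yande' still matches Yandex
// user agents because the regex does a substring match).
function isCrawler(userAgent) {
  const botPattern = /(googlebot\/|bot|google|baidu|bing|msn|slurp|yande)/i;
  return botPattern.test(userAgent || '');
}

// backend/routers.js -- sketch of the router itself:
//
// import { ok } from 'wix-router';
//
// export function myRouter_Router(request) {
//   // request.userAgent only has a value here, per request.
//   const numPerPage = isCrawler(request.userAgent) ? 100 : 10;
//   // Pass the page size to the routed page via the router data.
//   return ok('my-page', { numPerPage });
// }
```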