To chain multiple URLs for scraping in Puppeteer, put the URLs in an array and iterate over them with a loop, reusing a single page for each navigation. Here's an example:
const puppeteer = require('puppeteer');

(async () => {
  const urls = [
    'https://www.example.com/page1',
    'https://www.example.com/page2',
    'https://www.example.com/page3',
  ];

  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  try {
    for (const url of urls) {
      await page.goto(url);
      // Add your scraping logic here
      const title = await page.title();
      console.log(`Title of ${url}: ${title}`);
    }
  } finally {
    // Close the browser even if a navigation or scrape throws
    await browser.close();
  }
})();
In this example, we first define an array urls containing the URLs we want to scrape. We then launch a Puppeteer browser, open a single page, and loop over each URL in the array. Inside the loop, we navigate to the current URL with page.goto(url) and run our scraping logic; in this case, we simply read the page title and log it to the console. The try...finally block ensures the browser is closed even if a navigation fails partway through.
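If you need more than the title, the same loop can collect structured results. The sketch below is one way to do it; the h1 selector is a hypothetical placeholder for whatever element your target pages actually have:

const results = [];
for (const url of urls) {
  await page.goto(url, { waitUntil: 'networkidle2' }); // wait for the page to settle
  const heading = await page
    .$eval('h1', (el) => el.textContent.trim())
    .catch(() => null); // tolerate pages that lack the selector
  results.push({ url, heading });
}
console.log(results);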
By chaining multiple URLs this way, you can scrape data from many pages in a single Puppeteer script, reusing one browser and one page rather than paying the cost of launching a new browser per URL.
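One caveat: as written, a single failed navigation throws and ends the run (the finally block still closes the browser). If you want the script to keep going past bad URLs, a per-URL try...catch is a common variation; here is a sketch, passing goto's timeout option explicitly:

for (const url of urls) {
  try {
    await page.goto(url, { timeout: 30000 }); // fail fast on unreachable pages
    const title = await page.title();
    console.log(`Title of ${url}: ${title}`);
  } catch (err) {
    // Log and skip this URL instead of aborting the whole run
    console.warn(`Failed to scrape ${url}: ${err.message}`);
  }
}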