.info.catchedAjax
of the ScrapeNinja response to retrieve the dumped request. This is a global handler; you can also specify an XHR request catcher per step (see the "Interact with page" section below).
.info.log
events array of the returned ScrapeNinja response (find this particular event by filtering: let xhr = body.info.log.find(e => e.type == 'xhr' && e.stepIdx == {{idx+1}})).
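The lookup above can be sketched against a mock response. The shape of the info.log entries here (type, stepIdx, and a dumped request object) is an assumption for illustration; check an actual response for the exact field set:

```javascript
// Mock of a ScrapeNinja response body; the log entry shape is illustrative.
const body = {
  info: {
    log: [
      { type: 'click', stepIdx: 1 },
      { type: 'xhr', stepIdx: 2, request: { url: 'https://example.com/api', method: 'GET' } },
    ],
  },
};

// Find the XHR event captured during a particular step (here, step 2).
const xhr = body.info.log.find(e => e.type == 'xhr' && e.stepIdx == 2);
console.log(xhr.request.url);
```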
Examples of valid payloads:
- JSON payload:
{"fefe":"few"}
- www-encoded payload:
key1=val1&key2=val2
Add Content-Type: application/x-www-form-urlencoded to headers in case of a www-encoded POST, and Content-Type: application/json in case of a JSON POST.
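Both payload styles can be produced with built-ins; this is a generic encoding sketch, not ScrapeNinja-specific code:

```javascript
// JSON payload: serialize an object and pair it with the matching header.
const jsonBody = JSON.stringify({ fefe: 'few' });
const jsonHeaders = { 'Content-Type': 'application/json' };

// www-encoded payload: URLSearchParams produces the key1=val1&key2=val2 form.
const formBody = new URLSearchParams({ key1: 'val1', key2: 'val2' }).toString();
const formHeaders = { 'Content-Type': 'application/x-www-form-urlencoded' };

console.log(jsonBody);  // {"fefe":"few"}
console.log(formBody);  // key1=val1&key2=val2
```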
function (input, cheerio) {
    // Load the raw HTML and extract the trimmed #title text.
    let $ = cheerio.load(input);
    return { title: $('#title').text().trim() };
}
Grab extracted results from the .extractor JSON property of the ScrapeNinja response. Leave it empty to parse everything on your side.
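Reading the extractor output can look like the following. The surrounding response is a mock and the .extractor.result nesting is an assumption based on the description above; verify against a real response:

```javascript
// Mock ScrapeNinja response; only the .extractor property matters here,
// and its exact nesting is assumed for illustration.
const response = {
  body: '<html>...</html>',
  extractor: { result: { title: 'Hello world' } },
};

// Grab the extracted results instead of re-parsing response.body yourself.
const extracted = response.extractor;
console.log(extracted.result.title);
```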
Raw ScrapeNinja Response:
Latency: {{ responseLatency }}ms HTTP Status: {{ responseBody.info.statusCode }}{{ responseBodyFormatted }}
Unescaped target website response
responseJson.body
property.
Use {{cmdKey}}+F for a quick search in the response body. Copy & paste this body into the Cheerio Sandbox to develop your extractor.
Quick video overview of the sandbox:
Code Generator
Launching the scraper in your own Node.js environment:
(async () => { [CODE GOES HERE] } )();
Check your Node.js version: node -v
Step 1. Create project folder, initialize empty npm project, and install node-fetch
mkdir your-project-folder && \
cd "$_" && \
npm i -g create-esnext && \
npm init esnext && \
npm i node-fetch -y
Step 2. Copy&paste the code above
Create a new empty file, e.g. scraper.js, and paste the code into it.
Step 3. Launch
node ./scraper.js
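The overall shape of scraper.js is sketched below. The endpoint URL, header names, and the YOUR_KEY placeholder are assumptions, not confirmed values; use the exact code produced by the Code Generator. Node 18+ ships a global fetch, while the generated code imports node-fetch instead:

```javascript
// Assumed request shape (RapidAPI-style auth header); copy the exact
// values from the Code Generator output rather than relying on these.
const options = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-RapidAPI-Key': 'YOUR_KEY', // placeholder, not a real key
  },
  body: JSON.stringify({ url: 'https://example.com' }),
};

(async () => {
  // Guard so the sketch runs offline until a real key is supplied.
  if (options.headers['X-RapidAPI-Key'] === 'YOUR_KEY') {
    console.log('Set your API key first');
    return;
  }
  const res = await fetch('https://scrapeninja.p.rapidapi.com/scrape', options);
  const data = await res.json();
  console.log(data.info.statusCode);
})();
```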
Launching the scraper in your own Python environment:
Step 1. Create project folder, setup venv virtual environment
mkdir your-project-folder && \
cd "$_" && \
python3 -m venv venv && \
source venv/bin/activate
Step 2. Install the requests library in your virtual environment
python3 -m pip install requests
Step 3. Copy&paste the code above
Create a new empty file, e.g. scraper.py, and paste the code into it.
Step 4. Launch
python3 ./scraper.py