I had a very specific need: finding the current state of the Airbnb market in a specific location. Yeah, tools for this exist, but they cost money. Here is a more interesting and indirect route. The result is a weird mash-up of AI with a simple Tampermonkey script. Yeah, it took twice as long as it needed to, but I had fun and learned a bit more about userscripts for the browser.
Part one: Get that Data
Full disclosure: I asked the AI to write this section, based on the Tampermonkey code I wrote here. Prompt: "describe this code from the perspective of a gonzo journalist gopher". Comments in the code are my own.
Hold on to your burrows, fellow mammals, because this gopher has got some JavaScript code to analyze! This script is a browser extension that digs deep into an Airbnb listing to download data automatically.
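Quick aside from me before the gopher digs in: this runs as a Tampermonkey userscript, so the file opens with a metadata block telling the extension which pages to run on, and the body sits inside an async wrapper so it can use await. A minimal sketch of that scaffolding (the name and @match pattern here are stand-ins, not my exact header):
// ==UserScript==
// @name         Airbnb listing grabber
// @match        https://www.airbnb.com/rooms/*
// @grant        none
// ==/UserScript==
(async function () {
    'use strict';
    // everything described below lives inside this async wrapper,
    // which is what lets the script use await at the "top level"
})();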
To start, there's a function called waitForElementToExist, which is like a gopher waiting to pounce on some tasty roots. This function keeps burrowing through the page until it finds an element that matches the provided CSS selector. When it finds it, it sends a signal back to the main script that the root is ready to be harvested.
function waitForElementToExist(selector) {
return new Promise((resolve, reject) => {
// An element might not be on the page, so we retry every 100 ms to find the element.
// Yea, it might block forever but it's good enough!
let intervalId = setInterval(() => {
let element = document.querySelector(selector);
if (element) {
clearInterval(intervalId);
resolve(element);
}
}, 100);
})
}
Once all the juicy elements have been harvested, the script goes into full tunnel mode with Promise.all. It waits for all of the promises returned by waitForElementToExist to be fulfilled, then snatches the innerText from each element and stores it like a gopher storing food for the winter.
// get at the data we want
const [cost, title, descriptionText] = await Promise.all([
// Airbnb uses these "lovely", "semantic" class names (not!).
waitForElementToExist('._tyxjp1').then(e => e.innerText),
// I think we can thank a css in js tool for this :grin:
waitForElementToExist('.hghzvl1').then(e => e.innerText),
waitForElementToExist('.ll4r2nl').then(e => e.innerText)]);
After that, the script builds a burrow of data by creating a JavaScript object with all the data it has gathered, including the cost, title, description text, URL, and number of rooms. Then it uses its keen eyesight to log that data to the console like a gopher signaling its friends.
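That step isn't in the snippets above, so here is a rough sketch of what it looks like inside the async wrapper (the rooms selector is illustrative only; Airbnb's markup shifts around a lot):
// the page URL identifies the listing
const url = window.location.href;
// the "2 guests · 1 bedroom · 1 bath" overview line; this selector is a guess, adjust as needed
const rooms = (await waitForElementToExist('[data-section-id="OVERVIEW_DEFAULT"]')).innerText;
// signal the other gophers
console.log({ cost, title, descriptionText, url, rooms });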
Here's where things get really interesting. The script creates a Blob object from the data as a JSON string, and then creates a tunnel-like URL for the Blob object. What is a Blob object, you ask? It's like a burrow of soil, only it's a binary large object that can be used to store and transfer data. And that URL? It's like a tunnel that leads straight to a JSON file where the data resides.
Finally, the script surfaces like a gopher to create a link element, sets some attributes like download and href to the juicy data-filled Blob URL, and then simulates a click on the link. Just like that, the JSON file containing all the data is downloaded faster than a gopher can dig a new burrow.
// Create a Blob object from the data as a JSON string
var blob = new Blob([JSON.stringify( {cost, title, descriptionText, url, rooms}, null, 2)], {type: 'application/json'});
// Create a URL for the Blob object
var url2 = URL.createObjectURL(blob);
// Create a link element and click it to download the file
var link = document.createElement('a');
link.download = title.toLowerCase().split('').filter(c => c.match(/[a-z]/i)).join("") + '.json';
link.href = url2;
link.click();
// the end result here is that when you visit a listing page, it will automagically download a json file with the scraped data
So, my fellow gophers, that's how this script works. It's like the gopher of browser extensions, digging deep to gather data and bringing it back to the surface for all to see.
Part two: Use OpenAI to classify and find the listings most relevant to your property
For the next part, I load all the data I scraped from Airbnb into a database, then analyze each row and pull some unique AI insights out of each listing.
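I haven't published that part yet, but the shape of it is roughly this. A sketch only, not my actual pipeline: the SQLite table, the folder name, the model, and the prompt are all stand-ins.
// Node.js sketch: load the downloaded JSON files into SQLite, then ask OpenAI about each row
import fs from 'node:fs';
import Database from 'better-sqlite3';
import OpenAI from 'openai';

const db = new Database('listings.db');
db.prepare(`CREATE TABLE IF NOT EXISTS listings
  (title TEXT, cost TEXT, description TEXT, url TEXT, rooms TEXT, insight TEXT)`).run();

// every *.json file in ./downloads came from the userscript above
const insert = db.prepare(
  'INSERT INTO listings (title, cost, description, url, rooms) VALUES (?, ?, ?, ?, ?)');
for (const file of fs.readdirSync('./downloads').filter(f => f.endsWith('.json'))) {
  const l = JSON.parse(fs.readFileSync(`./downloads/${file}`, 'utf8'));
  insert.run(l.title, l.cost, l.descriptionText, l.url, l.rooms);
}

// ask the model for a short take on each listing and store it next to the row
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const update = db.prepare('UPDATE listings SET insight = ? WHERE url = ?');
for (const row of db.prepare('SELECT * FROM listings').all()) {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini', // stand-in model name
    messages: [{
      role: 'user',
      content: `In two sentences, how does this Airbnb listing compare to a typical one?\n${JSON.stringify(row)}`,
    }],
  });
  update.run(completion.choices[0].message.content, row.url);
}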
Interested? Let me know.