In this guide, we’ll show you how to use a Make.com workflow to scrape data from websites efficiently. You can either search the web for relevant pages or upload your own list of URLs, making it easy to gather the information you need.
Here are four ways this tool can help boost your organization’s profits:
Supplier and Partner Research: Quickly find and evaluate potential partners by scraping data from supplier websites, making your procurement process smoother and more cost-effective.
Lead Generation: Automatically gather contact information from targeted sites to fill your sales pipeline with qualified leads, increasing your chances of closing deals.
Market Trends: Stay updated on industry changes and customer preferences by collecting real-time data, allowing you to make informed decisions that drive higher sales.
Automated Blog Management: Keep your blog fresh with the latest news and insights by scraping relevant content, engaging your audience and establishing your brand as an industry leader.
By integrating HARPA AI with the Make.com workflow, you can further streamline your processes and achieve measurable results.
Complete the basic HARPA & Make.com setup:
Sign up for Make.com and install the official HARPA AI Web Browser Agent app.
Set up an active HARPA GRID Node.
Configure the HARPA API connection in Make.com.
An alternative way to extract data from a webpage is by using AI processing. To start, import the Convert Page to JSON or Database Enrichment command into your HARPA. This will convert webpage content into a structured and customizable JSON format.
You can find an example of this alternative approach in this guide.
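To give a sense of what this alternative produces, here is a minimal sketch of the kind of structured record the AI command might return for a single contact page. The field names (companyName, contactEmail, and so on) are illustrative assumptions, not a fixed HARPA schema:

{
  "companyName": "Example Auto Repair",
  "contactEmail": "service@example.com",
  "phone": "+1 512 555 0100",
  "services": ["oil change", "brake repair"],
  "sourceUrl": "https://example.com/contact"
}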
Instead of the Web Searching module, you can connect a different module (such as your CRM, Google Sheets, or another resource that already contains the list of links you need to scrape).
Let's say you don't have such a list and want to find links based on a search query instead. For example, the query "Contact Email Car service Austin, Texas" can surface contact information for prospects you want to sell your products to.
TIP: When creating your search query, include the words "contact" or "email" directly in the query. This helps surface pages that contain contact information.
We now have a list of links where we can search for contact emails using either the Scraping Module or an AI Command.
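For reference, each search result typically carries a page title, URL, and short description, which is exactly what we will later write into Google Sheets. Here is a sketch of one result item; the exact field names in the Web Searching module's output may differ:

{
  "title": "ABC Car Service Austin - Contact Us",
  "url": "https://example-carservice.com/contact",
  "description": "Get in touch with ABC Car Service in Austin, TX for repairs and maintenance."
}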
The "Scrape Web Page" module will help us process the links one by one, whether they were found via search or supplied by you, parsing information from each website.
It converts webpage content to Markdown format using your browser. You can select specific elements using CSS, XPath, or Text selectors.
If you don't specify elements to grab, you'll get the full webpage text. In this example, we'll focus on extracting contact emails.
Don't worry if you're not familiar with CSS, XPath, or Text selectors - you can always ask AI for help.
Here is the selector configuration we'll use to grab mailto: links. The selectorType is set to "css" in this example, but "auto" also works:
{
  "selector": "a[href^='mailto:']",
  "selectorType": "css",
  "at": "all",
  "take": "innerText",
  "label": "emails"
}
It's fine if no emails are found on some websites: they might simply not be listed there, or you may need to try two different selectors so that at least one of them works. For example:
{
  "selector": "//*[contains(text(),'@')]",
  "selectorType": "xpath",
  "at": "all",
  "take": "innerText",
  "label": "emails2"
}
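If you prefer to run both selectors in a single pass, and assuming your version of the module accepts a list of grab rules (worth verifying in your HARPA setup), the combined configuration could look like this:

[
  {
    "selector": "a[href^='mailto:']",
    "selectorType": "css",
    "at": "all",
    "take": "innerText",
    "label": "emails"
  },
  {
    "selector": "//*[contains(text(),'@')]",
    "selectorType": "xpath",
    "at": "all",
    "take": "innerText",
    "label": "emails2"
  }
]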
We have created a scenario that collects emails based on search terms targeted at your potential customers. You can send this information to your database, CRM, Google Sheets, or other storage system. You can also use more HARPA AI Commands to create custom emails for each contact.
Create a Google Sheets file and add these column headers in row 1: Title, Website, Description, Email, Page Content (optional - for creating personalized emails later)
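Here is a rough sketch of how the scenario output maps onto those columns. The double-brace placeholders are conceptual rather than exact Make.com mappings, so substitute whatever fields your Web Searching and scraping modules actually expose:

{
  "Title": "{{title}}",
  "Website": "{{url}}",
  "Description": "{{description}}",
  "Email": "{{emails}}",
  "Page Content": "{{markdown}}"
}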
You can add a filter to the link between modules. For example, you can filter out websites where the email field is empty, if those sites don't interest you.
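Conceptually, that filter just checks that the scraper's email label came back non-empty. The sketch below is illustrative only, since in practice you configure this condition in Make.com's filter dialog rather than as JSON:

{
  "field": "emails",
  "operator": "not_empty"
}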
Your Scenario is ready - you can now save and test it.