- type: say
message: >-
**Tip:** This command performs a thorough, in-depth online search for a given
query or topic and is best suited for research. It takes time and is
token-hungry (~20k tokens on GPT-4o-mini per search depth).
- param: userQuery
message: '![icon](/img/commands/general-search-refraction.svg) Type your search query:'
type: ask
options: null
default: ''
- param: depth
message: >-
**Set the search depth.**
Search depth determines how many search result pages will be analyzed. Each depth
level adds an extra sub-query, improving search quality.
options:
- label: 🔍 SEARCH
value: 1
- label: 2 SUB-QUERIES
value: 2
- label: 3 SUB-QUERIES
value: 3
- label: 4 SUB-QUERIES
value: 4
optionsInvalid: false
type: ask
default: ''
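# When depth > 1: GPT splits the query into sub-queries, then each sub-query's search results are fetched and summarized in a nested loop.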
- steps:
- prompt: >-
You are a Research Agent. I need you to break down my search query into
{{depth}} clarifying queries in {{language}}, the answers to which are
necessary to form a detailed response to my query: "{{userQuery}}"
Follow the instructions:
- The first sub-query should be as close as possible to My Query in
wording. If My Query is short and simple, repeat it as the first
sub-query.
- Subsequent sub-queries should clarify the first one by focusing on
different aspects or using alternative wordings.
- Each sub-query should include keywords from [my search query]. For
example, if asking about a product, include its name, brand, or key
features in all sub-queries to find relevant info.
- Use Google search operators like "OR", "AND", "site:", "-",
"filetype:", "intitle:", "intext:", "inurl:", and parentheses to refine
the sub-queries and get the best search results. For example:
- Use "OR" to include synonyms or related terms
- Use quotation marks for exact phrases
- Use parentheses to group search terms for complex queries
- Use "site:" or "-site:" to include or exclude specific domains (sparingly and only when relevant)
- Be cautious with the "-" and "-site:" operator to avoid excluding with the minus operator potentially relevant results
- If My Query says to exclude sites or information, then USE the
minus operator like "-site:" or "-term" IN EVERY query. If it says
something like "No site X and/or site Y", then use "-site:" for both
sites in every query.
- Sub-queries should be brief, like a short Google query typed to get
a precise answer.
- Do not repeat my command and instructions.
- Respond with a JSON array of {{depth}} objects (queries).
- Write nothing other than the JSON.
Example input: "I want to buy a cordless vacuum cleaner under $500,
bagless, only from brand XY, and only handheld models that are long and
don't require bending down."
Example JSON response:
[
{
"query": "cordless vacuum cleaner (under OR less than OR up to) $500 (from OR by) brand \"XY\""
},
{
"query": "handheld vacuum \"brand XY\" (long OR \"no bending\")"
},
{
"query": "vacuum cleaner \"bagless\" (under OR up to) $500 (brand OR manufacturer) XY"
},
{
"query": "\"XY handheld vacuum\" (bagless OR \"without bags\") (long OR ergonomic OR \"no bending\") (price OR cost) \"under 500\""
}
]
JSON response:
param: subqueries
condition: '{{depth}} > 1'
type: gpt
silent: true
- type: calc
func: extract-json
to: subqueries
param: subqueries
index: ''
- message: >-
🧩 To give a comprehensive response, your question was split into
**{{subqueries.length}}** sub-queries.
type: say
- steps:
- value: '{{item.query}}'
func: set
param: query
format: text
type: calc
- message: '⏳ Scanning information for sub-query: "**{{query}}**".'
type: say
- param: information
value: '{{serp {{query}}}}'
type: calc
func: set
format: ''
- func: serp.extract-links
to: links
type: calc
from: information
- param: total
value: '{{links.length}}'
format: number
type: calc
func: set
- steps:
- value: '{{page {{item.url}}}}'
param: content
type: calc
func: set
format: ''
- type: js
code: |-
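// Estimate the size of the fetched page content: ~4 characters per token, ~0.75 words per token (rough heuristic, not a tokenizer).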
const content = args['content'];
return {
chars: content.length,
estimatedTokens: Math.ceil(content.length / 4),
estimatedWords: Math.ceil(content.length / 4 * 0.75)
};
param: count
timeout: 15000
onFailure: SAY STATUS
label: COUNT
args: content
silent: true
- param: pageUrl
value: '{{item.url}}'
type: calc
func: set
format: text
- code: |-
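// Extract the hostname from pageUrl by stripping the protocol and a leading "www.".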
const regex = /^(?:https?:\/\/)?(?:www\.)?([^\/]+)/;
const testString = pageUrl;
const matches = testString.match(regex);
if (matches) {
const hostname = matches[1];
return hostname;
}
param: hostname
label: HOSTNAME
type: js
args: pageUrl
timeout: 15000
silent: true
- func: increment
param: index
type: calc
delta: 1
- args: index, total
code: |-
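// Convert the loop counter and total link count into a completion percentage string, e.g. "42.9%".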
function calculatePercentage(index, total) {
index = Number(index);
total = Number(total);
if (isNaN(index) || isNaN(total)) {
return "Error";
}
return (index / total * 100).toFixed(1) + "%";
}
let result = calculatePercentage(index, total);
return result;
param: percentage
label: PERCENTAGE INDEX TOTAL
type: js
timeout: 15000
silent: true
- type: group
steps:
- steps:
- prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize
useful information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an
answer to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in
answering my question, including context, numbers, dates
and concrete details.
- Extract as much useful information as possible from
the page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant
information found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{content}}.
JSON response:
param: data
type: gpt
isolated: true
silent: true
- index: first
type: calc
func: extract-json
to: data
param: data
- param: data.url
format: auto
type: calc
func: set
value: '{{item.url}}'
- param: data.title
value: '{{item.title}}'
type: calc
func: set
format: auto
- list: array
func: list-add
index: last
type: calc
item: data
- type: jump
to: SAY STATUS
condition: '{{content}} =~ ^[\s\S]{1000,}$'
label: FETCHED
type: group
condition: >-
{{hostname}} =~
^(?!.*(reddit\.com|redd\.it)).*(?:https?:\/\/|www\.)?[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}.*$
label: IS IT REDDIT?
- steps:
- args: item.url
code: |-
// Get URL from args parameter
const url = args['item.url'];
if (!url) {
return false;
}
const fullUrl = url.startsWith('http') ? url : 'https://' + url;
window.location.href = fullUrl;
return true;
param: navigate.boolean
onFailure: ABORT NAVIGATE
label: NAVIGATE
type: js
timeout: 15000
silent: true
- type: wait
for: custom-delay
delay: '1500'
silent: true
label: ABORT NAVIGATE
- steps:
- param: setThread
value: '{{thread}}'
type: calc
func: set
format: ''
- code: |-
// For setThread
const text = args['setThread'];
const tokenCount = Math.ceil(text.length / 4);
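// ~4 characters per token; words approximated as 0.75 per token.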
return {
chars: text.length,
estimatedTokens: tokenCount,
estimatedWords: Math.ceil(tokenCount * 0.75)
};
onFailure: SOCIAL-MEDIA PROMPT
type: js
args: setThread
param: count
timeout: 15000
silent: true
label: COUNT
- prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize
useful information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an
answer to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in
answering my question, including context, numbers, dates
and concrete details.
- Extract as much useful information as possible from
the page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant
information found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{setThread}}.
JSON response:
type: gpt
isolated: true
param: data
silent: true
label: SOCIAL-MEDIA PROMPT
label: SOCIAL-MEDIA
condition: >-
{{hostname}} =~
^(?:reddit|facebook|twitter|telegram|discord|whatsapp)\.com$
type: group
- steps:
- param: setPageThread
value: '{{page}} {{thread}}'
type: calc
func: set
format: ''
- code: |-
// For setPageThread
const text = args['setPageThread'];
const tokenCount = Math.ceil(text.length / 4);
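// ~4 characters per token; words approximated as 0.75 per token.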
return {
chars: text.length,
estimatedTokens: tokenCount,
estimatedWords: Math.ceil(tokenCount * 0.75)
};
onFailure: NOT-SOCIAL-MEDIA PROMPT
type: js
args: setPageThread
param: count
timeout: 15000
silent: true
label: COUNT
- prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize
useful information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an
answer to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in
answering my question, including context, numbers, dates
and concrete details.
- Extract as much useful information as possible from
the page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant
information found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{setPageThread}}.
JSON response:
type: gpt
isolated: true
param: data
silent: true
label: NOT-SOCIAL-MEDIA PROMPT
condition: >-
{{hostname}} =~
^(?!(?:reddit|facebook|twitter|telegram|discord|whatsapp)\.com)[a-zA-Z0-9-]+\.[a-zA-Z]+$
label: NOT-SOCIAL-MEDIA
type: group
- type: calc
func: extract-json
to: data
param: data
index: first
- type: calc
func: set
param: data.url
format: auto
value: '{{item.url}}'
- type: calc
func: set
param: data.title
format: auto
value: '{{item.title}}'
- type: calc
func: list-add
index: last
list: array
item: data
label: NOT FETCHED
type: group
- message: |-
📊 Analyzed **{{index}} / {{total}}**, [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
label: ✔️SAY STATUS
condition: '{{index}} = {{total1}}'
type: say
- message: |-
📊 Analyzed **{{percentage}}**, [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
condition: '{{index}} != {{total}}'
type: say
label: SAY STATUS
- code: |-
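// Measure the accumulated array of per-page summaries to report its approximate token footprint.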
const content = args['array'];
const stringContent = JSON.stringify(content);
return {
chars: stringContent.length,
estimatedTokens: Math.ceil(stringContent.length / 4),
estimatedWords: Math.ceil(stringContent.length / 4 * 0.75)
};
param: arrayLength
label: ARRAY LENGTH
type: js
args: array
timeout: 15000
onFailure: SAY STATUS
silent: true
- message: |-
✅
**{{percentage}}** pages scanned for "**{{query}}**".
Last checked page: [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
- Array Token length: **{{arrayLength.estimatedTokens}}**
condition: '{{index}} = {{total}}'
type: say
label: SAY STATUS
type: loop
list: links
type: loop
list: subqueries
- message: >-
🤖 Analyzed **{{array.length}}** pages from **{{subqueries.length}}**
Google searches.
Now preparing a comprehensive answer to your question.
type: say
label: DEPTH > 1
type: group
condition: '{{depth}} > 1'
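# When depth = 1: skip sub-query generation and fetch/summarize the search results for the raw user query.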
- steps:
- message: ⏳ Give me a minute to search through the web for you...
type: say
- value: '{{userQuery}}'
type: calc
func: set
param: query
format: text
- type: calc
func: set
param: information
format: ''
value: '{{serp {{query}}}}'
- type: calc
func: serp.extract-links
from: information
to: links
- type: calc
func: set
param: total
format: number
value: '{{links.length}}'
- steps:
- label: TRYING TO FETCH
type: calc
func: set
param: content
format: ''
value: '{{page {{item.url}}}}'
- onFailure: FAILED COUNT
type: js
args: content
code: |-
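// Estimate the size of the fetched page content: ~4 characters per token, ~0.75 words per token (rough heuristic, not a tokenizer).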
const content = args['content'];
return {
chars: content.length,
estimatedTokens: Math.ceil(content.length / 4),
estimatedWords: Math.ceil(content.length / 4 * 0.75)
};
param: count
timeout: 15000
label: COUNT
silent: true
- type: calc
func: set
param: pageUrl
format: text
value: '{{item.url}}'
label: FAILED COUNT
- type: js
args: pageUrl
code: |-
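// Extract the hostname from pageUrl by stripping the protocol and a leading "www.".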
const regex = /^(?:https?:\/\/)?(?:www\.)?([^\/]+)/;
const testString = pageUrl;
const matches = testString.match(regex);
if (matches) {
const hostname = matches[1];
return hostname;
}
param: hostname
timeout: 15000
silent: true
label: HOSTNAME
- type: calc
func: increment
param: index
delta: 1
- type: js
args: index, total
code: |-
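// Convert the loop counter and total link count into a completion percentage string, e.g. "42.9%".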
function calculatePercentage(index, total) {
index = Number(index);
total = Number(total);
if (isNaN(index) || isNaN(total)) {
return "Error";
}
return (index / total * 100).toFixed(1) + "%";
}
let result = calculatePercentage(index, total);
return result;
param: percentage
timeout: 15000
silent: true
label: PERCENTAGE INDEX TOTAL
- type: group
steps:
- steps:
- type: gpt
prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize useful
information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an answer
to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in answering my
question, including context, numbers, dates and concrete
details.
- Extract as much useful information as possible from the
page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant information
found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{content}}.
JSON response:
isolated: true
param: data
silent: true
- type: calc
func: extract-json
to: data
param: data
index: first
- type: calc
func: set
param: data.url
format: auto
value: '{{item.url}}'
- type: calc
func: set
param: data.title
format: auto
value: '{{item.title}}'
- type: calc
func: list-add
index: last
list: array
item: data
- type: jump
to: SAY STATUS
condition: '{{content}} =~ ^[\s\S]{1000,}$'
type: group
label: FETCHED
condition: >-
{{hostname}} =~
^(?!.*(reddit\.com|redd\.it)).*(?:https?:\/\/|www\.)?[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}.*$
label: IS IT REDDIT?
- steps:
- type: js
args: item.url
code: |-
// Get URL from args parameter
const url = args['item.url'];
if (!url) {
return false;
}
const fullUrl = url.startsWith('http') ? url : 'https://' + url;
window.location.href = fullUrl;
return true;
param: navigate.boolean
timeout: 15000
onFailure: ABORT NAVIGATE
label: NAVIGATE
silent: true
- type: wait
for: custom-delay
silent: true
label: ABORT NAVIGATE
delay: '1500'
- steps:
- type: calc
func: set
param: setThread
format: ''
value: '{{thread}}'
- type: js
args: setThread
code: |-
// For setThread
const text = args['setThread'];
const tokenCount = Math.ceil(text.length / 4);
return {
chars: text.length,
estimatedTokens: tokenCount,
estimatedWords: Math.ceil(tokenCount * 0.75)
};
param: count
timeout: 15000
onFailure: SOCIAL-MEDIA PROMPT
label: COUNT
silent: true
- type: gpt
prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize useful
information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an answer
to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in answering my
question, including context, numbers, dates and concrete
details.
- Extract as much useful information as possible from the
page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant information
found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{setThread}}.
JSON response:
isolated: true
param: data
silent: true
label: SOCIAL-MEDIA PROMPT
type: group
condition: >-
{{hostname}} =~
^(?:reddit|facebook|twitter|telegram|discord|whatsapp)\.com$
label: SOCIAL-MEDIA
- steps:
- label: SET CONTENT
type: calc
func: set
param: setPageThread
format: ''
value: '{{page}} {{thread}}'
- type: js
args: setPageThread
code: |-
// For setPageThread
const text = args['setPageThread'];
const tokenCount = Math.ceil(text.length / 4);
return {
chars: text.length,
estimatedTokens: tokenCount,
estimatedWords: Math.ceil(tokenCount * 0.75)
};
param: count
timeout: 15000
onFailure: NOT-SOCIAL-MEDIA PROMPT
silent: true
label: COUNT
- type: gpt
prompt: >-
Act as a professional researcher.
Analyze the given [web page content] and summarize useful
information to answer my query: "{{query}}"
Instructions:
- Imagine you are a Google Search engine, collecting
relevant information from various websites to form an answer
to my query in {{language}}.
- Focus on providing detailed factual information,
statistics, and specific examples when available.
- Gather all information that will be useful in answering my
question, including context, numbers, dates and concrete
details.
- Extract as much useful information as possible from the
page content.
- Avoid general phrases.
- When referring to someone's opinion about something,
mention the author. Quote if appropriate.
- If you haven't found anything useful, don't make up
information or guess - simply say "No relevant information
found" in {{language}}.
- Do not echo my prompt in your response.
- Respond with a JSON object containing a single field:
"info".
- Write nothing other than the JSON.
Example JSON response:
{
"info": "Write here textual information that would help answer my query. Write as much as you think is necessary, focusing on providing useful information for further processing."
}
[Web page content]: {{setPageThread}}.
JSON response:
isolated: true
param: data
silent: true
label: NOT-SOCIAL-MEDIA PROMPT
type: group
label: NOT-SOCIAL-MEDIA
condition: >-
{{hostname}} =~
^(?!(?:reddit|facebook|twitter|telegram|discord|whatsapp)\.com)[a-zA-Z0-9-]+\.[a-zA-Z]+$
- type: calc
func: extract-json
to: data
param: data
index: first
- type: calc
func: set
param: data.url
format: auto
value: '{{item.url}}'
- type: calc
func: set
param: data.title
format: auto
value: '{{item.title}}'
- type: calc
func: list-add
index: last
list: array
item: data
type: group
label: NOT FETCHED
- type: say
message: |-
📊 Analyzed **{{index}} / {{total}}**, [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
label: ✔️SAY STATUS
condition: '{{index}} = {{total1}}'
- type: say
message: |-
📊 Analyzed **{{percentage}}**, [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
label: SAY STATUS
condition: '{{index}} != {{total}}'
- type: js
args: array
code: |-
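// Measure the accumulated array of per-page summaries to report its approximate token footprint.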
const content = args['array'];
const stringContent = JSON.stringify(content);
return {
chars: stringContent.length,
estimatedTokens: Math.ceil(stringContent.length / 4),
estimatedWords: Math.ceil(stringContent.length / 4 * 0.75)
};
param: arrayLength
timeout: 15000
onFailure: SAY STATUS
label: ARRAY LENGTH
silent: true
- message: |-
✅
**{{percentage}}** pages scanned for "**{{query}}**".
Last checked page: [{{hostname}}]({{pageUrl}})
- Chars: {{count.chars}}
- Estimated Tokens: {{count.estimatedTokens}}
- Estimated Words: {{count.estimatedWords}}
- Array Token length: **{{arrayLength.estimatedTokens}}**
type: say
label: SAY STATUS
condition: '{{index}} = {{total}}'
type: loop
list: links
condition: '{{depth}} = 1'
label: DEPTH = 1
type: group
- prompt: >-
Please ignore all previous instructions. I want you to only respond in
{{language}}.
You are a Research AI Agent.
I would like you to create a comprehensive answer to [MY QUESTION], adhering to
a specific format I provide:
- Utilize [Information from the Web], provided below.
- Extract as much useful information as possible from the web search results.
- Integrate insights from multiple web search sources, and ensure inclusion
of relevant links in a fully rendered markdown format: [①](URL), [②](URL),
etc., to each bullet point.
- Every URL should correspond to only one symbol: ① ② ③ ④ ⑤ ⑥ ⑦ ⑧ ⑨ ⑩.
- If multiple sources repeat the same information, describe it and cite all
sources at once.
- If you know the answer, add your own knowledge to make it more complete.
- Do not hallucinate facts or information.
- Do not use any other tools.
- Avoid general phrases and be more specific and detailed.
Follow the [RESPONSE FORMAT]:
## Key takeaway:
Provide a single, most important takeaway from the web search results in
{{language}}.
## Detailed answer:
Analyze and present detailed information that helps answer my question.
There is no limit on words or bullet points for the report. Ensure
that all ideas, facts, and relevant information are concisely reported, and
the answer is comprehensive. Incorporate the maximum number of source links
within the text.
[MY QUESTION]: {{userQuery}}
[Information from the Web]:
{{array}}
[COMPREHENSIVE RESPONSE WITH SOURCE LINKS ([①](URL))]:
type: gpt
isolated: true
param: gpt
dumb: false