How to implement a "deep research" mode for a Pickaxe?

Curious if there is a way to structure a Pickaxe so that it operates in a deep research-type mode, similar to the mode option ChatGPT offers in its tool? I'm not sure if it's just a matter of choosing a different OpenAI model (e.g. 4.1 vs. 4.1 Mini) for my Pickaxe, or if there is a way to let the user choose such a mode. I look forward to any thoughts folks have.

@danimal there is no deep research toggle in Pickaxe.

One (easy) option is to connect the sequential thinking MCP server to your Pickaxe.
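
For reference, connecting it usually just means pointing Pickaxe at the server's launch command. Here is a minimal config sketch, assuming the standard @modelcontextprotocol/server-sequential-thinking npm package (double-check the package name and whatever fields your Pickaxe MCP setup expects):

{
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-sequential-thinking"
  ]
}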

A more complex option is to build an action or an MCP server that lets you run deep research.

Here's one I found:

{
  "command": "npx",
  "args": [
    "-y",
    "@smithery/cli@latest",
    "run",
    "@ameeralns/DeepResearchMCP",
    "--key",
    "e7caa989-8b74-430f-9a05-198338579528",
    "--profile",
    "continued-cat-JxW5Pk"
  ]
}

It's not as detailed as OpenAI's deep research, but it goes deeper than a normal web search.

@stephenbdiaz,

The easiest way to implement deep research in your Pickaxe is to:

  1. Run an MCP server using a token from Make.com.
  2. Go to Google AI Studio and get a Gemini 2.5 Pro API key.
  3. Create a scenario in Make.com with an HTTP module and add your Gemini API key (a rough sketch of the request is shown after this list).
  4. Tell your Pickaxe in its system prompt when it should trigger the MCP server.
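
For step 3, the HTTP module is essentially POSTing to Google's generateContent endpoint, e.g. https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-pro:generateContent?key=YOUR_API_KEY. A rough sketch of the request body, following Google's public Gemini API (the prompt text is only a placeholder you would map from your scenario):

{
  "contents": [
    {
      "role": "user",
      "parts": [
        { "text": "Do a deep-dive research report on: <user question>" }
      ]
    }
  ]
}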

An alternative route:

  1. Create a Make.com scenario running a Perplexity Search module (Model: Sonar or Sonar Pro); see the sketch of the underlying request below.
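
Under the hood that module wraps Perplexity's chat completions API (POST https://api.perplexity.ai/chat/completions with a Bearer key), so you could also call it from a plain HTTP module. A rough sketch of the request body, with placeholder prompts (verify the model ids against Perplexity's current docs):

{
  "model": "sonar-pro",
  "messages": [
    { "role": "system", "content": "Research thoroughly and cite your sources." },
    { "role": "user", "content": "<research question passed in from the Pickaxe>" }
  ]
}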

Hope that helps!

Why not use DeepSeek as the backend LLM?

It has deep research and the API calls are cheap.
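
If you call DeepSeek directly, its API is OpenAI-compatible (POST https://api.deepseek.com/chat/completions with a Bearer key). A rough sketch of the request body, assuming deepseek-reasoner is the reasoning (R1) model id per DeepSeek's docs (worth double-checking):

{
  "model": "deepseek-reasoner",
  "messages": [
    { "role": "user", "content": "<deep research prompt>" }
  ]
}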

@netmstr you can choose DeepSeek R1 from the Perplexity module in Make.com.

Thank you Ned! I will look into this!
