Curious if there is a way to structure a Pickaxe so that it operates in a deep research type mode, similar to the mode option ChatGPT offers in their tool? Not sure if it’s just a matter of choosing a different OpenAI model (like 4.1 vs. 4.1 Mini) for my Pickaxe, or if there is a way to let the user choose such a mode. I look forward to any thoughts folks have.
@danimal there is no deep research toggle in Pickaxe.
One (easy) option is to connect the sequential thinking MCP server to your Pickaxe (a minimal config sketch is below).
A more complex option is to build an action or an MCP server that lets you run deep research.
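For the first option, a minimal config sketch, assuming the npx-runnable @modelcontextprotocol/server-sequential-thinking package (swap in whichever sequential-thinking server you actually use):
{
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-sequential-thinking"
  ]
}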
Here’s one I found:
{
  "command": "npx",
  "args": [
    "-y",
    "@smithery/cli@latest",
    "run",
    "@ameeralns/DeepResearchMCP",
    "--key",
    "e7caa989-8b74-430f-9a05-198338579528",
    "--profile",
    "continued-cat-JxW5Pk"
  ]
}
Not as detailed as OpenAI’s deep research, but deeper than a normal web search.
The easiest way to implement deep research in your Pickaxe is to:
- Run an MCP server using a token from Make.com.
- Go to Google AI Studio and get a Gemini 2.5 Pro API key.
- Create a scenario in Make.com with an HTTP module and add your Gemini API key (a sketch of the request is below).
- Tell your Pickaxe in its system prompt when it should trigger the MCP server.
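For the HTTP module, the request would look roughly like this. This is a sketch based on Google’s public generateContent endpoint, not taken from the original post; the model ID, key, and prompt text are placeholders to adjust:
POST https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-pro:generateContent
Header: x-goog-api-key: YOUR_GEMINI_API_KEY
{
  "contents": [
    {
      "role": "user",
      "parts": [
        { "text": "YOUR_RESEARCH_QUESTION" }
      ]
    }
  ]
}
The Make.com HTTP module maps onto this directly: one URL, one header, and a JSON body carrying the research question your Pickaxe passes in.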
An alternative route:
- Create a Make.com scenario running a Perplexity Search module with the model set to Sonar or Sonar-Pro (a sketch of the underlying request is below).
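The Perplexity module handles the API call for you, but the underlying request is roughly this (a sketch against Perplexity’s public chat completions endpoint; the model and question are placeholders):
POST https://api.perplexity.ai/chat/completions
Header: Authorization: Bearer YOUR_PERPLEXITY_API_KEY
{
  "model": "sonar-pro",
  "messages": [
    { "role": "user", "content": "YOUR_RESEARCH_QUESTION" }
  ]
}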
Hope that helps!
Why not use DeepSeek as the backend LLM?
It has deep research and the API call is cheap.
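For reference, DeepSeek exposes an OpenAI-compatible chat completions endpoint, so the call would look roughly like this (a sketch; the model name and question are placeholders):
POST https://api.deepseek.com/chat/completions
Header: Authorization: Bearer YOUR_DEEPSEEK_API_KEY
{
  "model": "deepseek-reasoner",
  "messages": [
    { "role": "user", "content": "YOUR_RESEARCH_QUESTION" }
  ]
}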
Thank you Ned! I will look into this!