I have multiple Pickaxes, and many of them reference the same information. I would love a central knowledge base that I can build up over time and then load into each Pickaxe. Eventually it would be amazing to just update the knowledge base in one place.
For me as a thought leader, if I had a wiki or something similar that I could keep updating and filling with information, I could have a killer GPT. Maybe an integration with Notion or Guru?
A centralized shared knowledge base is part of the new redesign. You will be able to upload documents into a shared knowledge base and then toggle them on/off within individual tools.
That’s very helpful to know. Just to be clear, are you saying the new Studio knowledge file can handle a link to a Notion database without having to export a CSV and upload it every time?
Like, could I provide a URL to a Notion database and have it dynamically update my knowledge files each time I use the Pickaxe?
Actually, I just tried it. I don’t think it scraped my Notion database at all…
I set my database to public and added the URL to the website knowledge file. Pickaxe said “Scraping,” but it only showed one chunked item, which looked like just the page’s SEO meta description. I don’t think it received any data from my actual database rows and columns.
Here’s a link to the database I want to use.
My goal is to keep a table of terminology, definitions, and five output examples with guidelines for each output, so I can reuse this same table setup across multiple Pickaxe builds.
Previously, I had to export a CSV file every time I updated my Pickaxe, which gets tedious. I’d like Pickaxe to update dynamically from the Notion database I give it.
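In the meantime, here’s a rough sketch of a workaround I could script myself: pull the rows straight from the Notion API and regenerate the CSV with one command, so re-uploading is at least a one-step job. This assumes I’ve created a Notion integration and shared the database with it; `NOTION_TOKEN` and `NOTION_DATABASE_ID` below are placeholders, not anything Pickaxe provides.

```python
# Sketch: export a Notion database to CSV via the official Notion API.
# (Pagination is omitted for brevity; this grabs the first page of results.)
import csv
import os
import requests

NOTION_TOKEN = os.environ["NOTION_TOKEN"]        # integration secret (placeholder)
DATABASE_ID = os.environ["NOTION_DATABASE_ID"]   # database ID (placeholder)

resp = requests.post(
    f"https://api.notion.com/v1/databases/{DATABASE_ID}/query",
    headers={
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
    },
    timeout=30,
)
resp.raise_for_status()
rows = resp.json()["results"]

def cell_text(prop):
    """Flatten the most common Notion property types to plain text."""
    kind = prop["type"]
    if kind in ("title", "rich_text"):
        return "".join(part["plain_text"] for part in prop[kind])
    return str(prop.get(kind, ""))

with open("knowledge.csv", "w", newline="") as f:
    writer = None
    for row in rows:
        props = row["properties"]
        if writer is None:
            writer = csv.DictWriter(f, fieldnames=list(props))
            writer.writeheader()
        writer.writerow({name: cell_text(p) for name, p in props.items()})
```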
I read on another forum that you mentioned the Pickaxe knowledge web scraper doesn’t scrape cloud-based doc editors like Google Docs and Notion because they aren’t real webpages: their content is rendered client-side by JavaScript, so a plain HTTP fetch returns almost nothing.
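You can see the problem with a quick sketch (not Pickaxe’s actual scraper, just an illustration): fetching a public Notion page’s raw HTML yields little more than meta tags. The URL here is a placeholder.

```python
# Demonstrates why a plain fetch of a Notion public page yields almost
# nothing: the page body is rendered client-side by JavaScript.
import requests
from bs4 import BeautifulSoup

url = "https://www.notion.so/your-workspace/your-database-id"  # placeholder

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# The visible text is nearly empty; the SEO meta description is often
# all a scraper can recover, which matches the single chunk I saw.
print("visible text:", soup.get_text(strip=True)[:200])
desc = soup.find("meta", attrs={"name": "description"})
print("meta description:", desc["content"] if desc else None)
```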
That gave me an idea: publish my Notion database on Super.so as a “real webpage” and see whether your web scraper would recognize it. Super.so uses Notion databases as a CMS (content management system) and publishes live updates as a real webpage.
Sure enough, after signing up for a free Super.so account and publishing my Notion database there, your Pickaxe web scraper worked. At least it went from one chunk with the Notion public link to 13 chunks with the Super.so Notion → website publishing tool.
However, I read that you created your own specialized, really powerful way of chunking CSV files. My Notion table has 28 rows, but I only got 13 chunks, which tells me it didn’t use that CSV chunking. It appears to have treated the Super.so site as a regular webpage, not a spreadsheet. That’s still progress! I could probably use this for a blog article on Notion or a document page that I’d like to dynamically update as a knowledge file. Tables seem out of the question for now.
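For comparison, here’s my rough guess at what row-wise CSV chunking looks like (not Pickaxe’s actual implementation): pair every data row with the header so each chunk is self-contained, which would give one chunk per row, i.e. 28 chunks for my 28 rows.

```python
# Sketch of row-wise CSV chunking: each chunk carries the column names
# alongside that row's values, so every chunk stands on its own.
import csv

def chunk_csv(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for row in reader:
            yield "\n".join(f"{col}: {val}" for col, val in zip(header, row))

for i, chunk in enumerate(chunk_csv("knowledge.csv"), start=1):
    print(f"--- chunk {i} ---\n{chunk}")
```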
Unless your team is open to teaching your web scraper to recognize a table on a webpage and chunk it like a CSV file? Here’s an example of a simple webpage I created: https://useful-rat.super.site/. This would be incredibly convenient for my Pickaxe knowledge files if I didn’t have to export a CSV file every time I want to tweak or test a new prompt. A sketch of what I mean is below.
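Something like this, perhaps, assuming the published table is plain HTML. `pandas.read_html` is just one way to do it; I’m not suggesting this is how your scraper works.

```python
# Sketch: treat a webpage's <table> elements like CSVs and emit one
# chunk per row, header names included.
import pandas as pd

url = "https://useful-rat.super.site/"  # the example page linked above

# read_html finds <table> elements and returns them as DataFrames;
# it needs lxml or html5lib installed, and raises ValueError if no
# table is found (e.g. if the table is rendered by JavaScript).
tables = pd.read_html(url)

for table in tables:
    for _, row in table.iterrows():
        chunk = "\n".join(f"{col}: {row[col]}" for col in table.columns)
        print("---\n" + chunk)
```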
I love the new Studio design. Thank you, Pickaxe team, for all your hard work!