What are the specs on the Llama 3 Model option?

What are the parameters etc. for the new Llama 3 model we can now use?

  1. Context window size
  2. Do we need an API key?
  3. Multimodal?
  4. Cost?
  5. Credits used when using the Pickaxe version?
    Thanks,
    Gerd

Hey @devsdaccount

A1) You can see the maximum context window size in the bottom right corner of your Pickaxe.
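As a rough illustration of what that context window limit means in practice, here's a small sketch. It uses the common ~4 characters per token heuristic and an example 8,192-token window — these are assumptions for illustration, not Pickaxe's actual tokenizer or limits:

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate using the ~4 characters per token heuristic.

    Real tokenizers (e.g. Llama's) will give different counts.
    """
    return max(1, len(text) // 4)


def fits_in_context(text: str, context_window: int, reply_budget: int = 512) -> bool:
    """Check whether a prompt, plus room reserved for the model's reply,
    fits inside the model's context window."""
    return estimate_tokens(text) + reply_budget <= context_window


# Example: a long prompt against a hypothetical 8,192-token window.
prompt = "Summarize the following report section by section. " * 200
print(estimate_tokens(prompt))
print(fits_in_context(prompt, context_window=8192))
```

The point is just that the number shown in the corner of your Pickaxe is a hard budget shared between your prompt (including any documents or chat history) and the model's reply.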

A2) You get credits to spend per use from Pickaxe (the amount varies depending on your subscription type).
-If you have a monetized studio and active users, it’s advised that you create an account with the LLM provider and get an API key for each Pickaxe you publish.

A3) Based on the LLM provider’s dedicated webpage, multimodal support is on their roadmap.

A4) Check this out!
https://www.perplexity.ai/search/meta-lama-3-api-pricing-zB4qmVXcRaWRgKOOmiNAtQ#0

A5) Check out this community post regarding using Pickaxe credits.

@admin_mike to confirm, which Llama 3 version is currently accessible on Pickaxe?


This is all very correct!

The model is a special version of Llama 3.

You cannot use your own API key for the Llama model. We host it in the cloud and offer you access to it directly.

The advantage of this model over others, which we don’t particularly want to advertise, is that it is uncensored.


Thank you, this helped a lot in understanding the Llama option.
