# Models

Which models are available, what they cost, and when to pick each one.
ContentAI exposes a curated list of models per provider. You can change the selection at any time in Settings, and the choice is stored per provider, so switching providers doesn't lose your preferred model.
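A minimal sketch of what "stored per provider" implies: each provider keeps its own selection, and an unset provider falls back to a default. The `getModel` helper and the exact type shapes here are hypothetical, not ContentAI's actual code.

```typescript
type AIProvider = "groq" | "google" | "openai" | "anthropic";

// Illustrative defaults (the real ones live in lib/store.ts).
const defaultModels: Record<AIProvider, string> = {
  groq: "llama-3.3-70b-versatile",
  google: "gemini-2.5-flash",
  openai: "gpt-4o-mini",
  anthropic: "claude-3-5-sonnet-20241022",
};

// Hypothetical helper: return the user's stored choice for a provider,
// falling back to the default when nothing has been selected yet.
function getModel(
  provider: AIProvider,
  selected: Partial<Record<AIProvider, string>>
): string {
  return selected[provider] ?? defaultModels[provider];
}

console.log(getModel("openai", { openai: "gpt-4o" })); // gpt-4o
console.log(getModel("groq", {})); // llama-3.3-70b-versatile
```

Because the stored choice is keyed by provider, switching from Groq to OpenAI and back leaves the Groq selection untouched.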
## Groq models

| Model | Context | Speed | Best for |
|---|---|---|---|
| `llama-3.3-70b-versatile` | 128k | Very fast | ⭐ Recommended default |
| `llama-3.1-8b-instant` | 128k | Fastest | Short outputs, bulk runs |
| `mixtral-8x7b-32768` | 32k | Fast | Coding, structured output |
| `gemma2-9b-it` | 8k | Very fast | Simple rewrites |
All Groq models are free at the time of writing. See the live model list at console.groq.com/docs/models.
## Google Gemini models

| Model | Context | Best for |
|---|---|---|
| `gemini-2.5-flash` | 1M | ⭐ Recommended default |
| `gemini-2.0-flash-exp` | 1M | Experimental, fastest |
| `gemini-2.5-pro` | 2M | Complex reasoning |
| `gemini-1.5-flash` | 1M | Cheap, reliable legacy |

Free tier limits vary by model. `gemini-2.5-flash` typically offers the most daily requests on the free tier.
## OpenAI models

| Model | Context | Pricing (approx) | Best for |
|---|---|---|---|
| `gpt-4o-mini` | 128k | $ | ⭐ Budget default |
| `gpt-4o` | 128k | $$$ | Best quality for most tasks |
| `gpt-4-turbo` | 128k | $$$ | Long-form reasoning |
| `gpt-3.5-turbo` | 16k | $ | Very cheap, short outputs |
Check openai.com/pricing for current rates.
## Anthropic (Claude) models

| Model | Context | Best for |
|---|---|---|
| `claude-3-5-sonnet-20241022` | 200k | ⭐ Best for long-form writing |
| `claude-3-5-haiku-20241022` | 200k | Fast + cheap |
| `claude-3-opus-20240229` | 200k | Highest reasoning quality |
## Choosing the right model
- Blog posts, articles, long-form → Claude Sonnet, GPT-4o, or Llama 3.3 70B
- Short social media, taglines → Llama 3.1 8B Instant or GPT-4o-mini
- SEO meta descriptions, keywords → Any fast model is fine
- Bulk runs for cheap → Groq (free) or GPT-4o-mini
- When you need structure / JSON → GPT-4o or Claude Sonnet
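The guidance above can be sketched as a small lookup. The `Task` categories and the `pickModel` helper are illustrative only, not part of ContentAI; the model IDs come from the tables on this page, and the comments note the alternatives each bullet mentions.

```typescript
// Hypothetical mapping from task category to a suggested model ID.
type Task = "long-form" | "social" | "seo" | "bulk" | "structured";

const suggestedModel: Record<Task, string> = {
  "long-form": "claude-3-5-sonnet-20241022", // or gpt-4o / llama-3.3-70b-versatile
  social: "llama-3.1-8b-instant",            // or gpt-4o-mini
  seo: "gemini-2.5-flash",                   // any fast model is fine
  bulk: "llama-3.3-70b-versatile",           // Groq is free; gpt-4o-mini is also cheap
  structured: "gpt-4o",                      // or claude-3-5-sonnet-20241022
};

function pickModel(task: Task): string {
  return suggestedModel[task];
}

console.log(pickModel("long-form")); // claude-3-5-sonnet-20241022
```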
## Adding a model

To add a model that's not in the dropdown, edit `lib/store.ts`:

```ts
export const providerModels: Record<AIProvider, string[]> = {
  groq: [
    "llama-3.3-70b-versatile",
    "llama-3.1-8b-instant",
    "mixtral-8x7b-32768",
    "your-new-model-here",
  ],
  // ...
};
```

The model ID must match the one the provider's API expects.
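Since a mistyped ID only fails once a request reaches the provider, it can help to check a selection against the curated list first. This guard is a hypothetical sketch, not part of ContentAI; the `providerModels` shape mirrors the snippet above with an abbreviated list.

```typescript
// Abbreviated copy of the providerModels shape for illustration.
const providerModels: Record<string, string[]> = {
  groq: ["llama-3.3-70b-versatile", "llama-3.1-8b-instant", "mixtral-8x7b-32768"],
  openai: ["gpt-4o-mini", "gpt-4o"],
};

// Hypothetical guard: is this model actually offered for this provider?
function isKnownModel(provider: string, model: string): boolean {
  return providerModels[provider]?.includes(model) ?? false;
}

console.log(isKnownModel("groq", "llama-3.1-8b-instant")); // true
console.log(isKnownModel("groq", "gpt-4o")); // false
```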
## Changing the default model

The defaults are set in `lib/store.ts`, in the `selectedModels` initial state. Update the string for any provider:

```ts
selectedModels: {
  groq: "llama-3.3-70b-versatile", // default
  google: "gemini-2.5-flash",
  openai: "gpt-4o-mini",
  anthropic: "claude-3-5-sonnet-20241022",
}
```

Next: Theming →