# Getting Started with Askimo CLI
After installing Askimo, choose a provider and a model, then start chatting. Askimo saves your settings locally, so you won’t need to repeat these steps next time.
👉 If you don’t choose a model, Askimo uses the provider’s default. Ollama is the exception: it has no default, so you must set a model yourself.
## Quick start (works the same for any provider)

```
askimo> :set-provider <ollama|openai|gemini|xai|anthropic|docker|localai|lmstudio>
askimo> :models                      # list models available for that provider
askimo> :set-param model <model-id>  # optional if the provider has a default
askimo> "Hello! Summarize this text."
```
## Using Ollama (local models)

- Install Ollama (see ollama.com)
- Pull a model, for example:

  ```
  ollama pull <model-name>
  ```

- In Askimo:

  ```
  askimo> :set-provider ollama
  askimo> :models
  askimo> :set-param model <model-name>
  askimo> "Explain Redis caching in simple terms."
  ```

If `:models` is empty, pull one with `ollama pull <name>` and try again.
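Before pointing Askimo at Ollama, it can help to confirm the Ollama daemon is actually running. A minimal sketch, assuming `curl` is installed and Ollama is on its default port 11434 (`/api/tags` is Ollama's REST endpoint that lists pulled models):

```shell
# Check that the Ollama daemon is reachable on its default port.
# If this fails, start Ollama (or `ollama serve`) and retry.
if curl -fsS --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="ollama: up"
else
  status="ollama: not reachable on localhost:11434"
fi
echo "$status"
```

If the check fails, `:models` inside Askimo will come back empty for the same reason.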
## Using OpenAI

- Get an API key → https://platform.openai.com/api-keys
- Configure Askimo and chat:

  ```
  askimo> :set-provider openai
  askimo> :set-param api_key sk-...
  askimo> :models
  askimo> "Explain Redis caching in simple terms."
  ```
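If you want to rule out a bad key before handing it to Askimo, you can hit OpenAI's `GET /v1/models` endpoint directly. This sketch assumes `curl` is installed and that you have exported the key yourself as `OPENAI_API_KEY` (Askimo itself takes the key via `:set-param api_key`):

```shell
# Sanity-check an OpenAI API key against the models endpoint.
# A valid key returns HTTP 200; a bad key returns 401, which -f treats as failure.
if curl -fsS --max-time 5 https://api.openai.com/v1/models \
     -H "Authorization: Bearer ${OPENAI_API_KEY:-}" >/dev/null 2>&1; then
  key_check="openai key: valid"
else
  key_check="openai key: rejected or network unreachable"
fi
echo "$key_check"
```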
## Using Gemini (Google)

- Get an API key → https://aistudio.google.com
- Configure and chat:

  ```
  askimo> :set-provider gemini
  askimo> :set-param api_key <your-gemini-key>
  askimo> :models
  askimo> "Give me five CLI productivity tips."
  ```
## Using xAI (Grok)

- Get an API key → https://x.ai
- Configure and chat:

  ```
  askimo> :set-provider xai
  askimo> :set-param api_key <your-xai-key>
  askimo> :models
  askimo> :set-param model <model-id>  # optional
  askimo> "What's new in Java 21?"
  ```
## Using Anthropic Claude

- Get an API key → https://console.anthropic.com/
- Configure and chat:

  ```
  askimo> :set-provider anthropic
  askimo> :set-param api_key <your-anthropic-key>
  askimo> :models
  askimo> "Analyze this code for potential improvements."
  ```
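Anthropic's API authenticates differently from the Bearer-token style used by OpenAI-compatible providers: it takes an `x-api-key` header plus a required `anthropic-version` header. A hedged sketch for checking a key before configuring Askimo, assuming `curl` is installed and you exported the key yourself as `ANTHROPIC_API_KEY`:

```shell
# Sanity-check an Anthropic API key against the models endpoint.
# Note the x-api-key header and the mandatory anthropic-version header.
if curl -fsS --max-time 5 https://api.anthropic.com/v1/models \
     -H "x-api-key: ${ANTHROPIC_API_KEY:-}" \
     -H "anthropic-version: 2023-06-01" >/dev/null 2>&1; then
  claude_check="anthropic key: valid"
else
  claude_check="anthropic key: rejected or network unreachable"
fi
echo "$claude_check"
```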
## Using Docker AI

- Enable the Docker AI model runner:

  ```
  docker desktop enable model-runner --tcp 12434
  ```

- Pull a model:

  ```
  docker model pull <model-name>
  ```

- Configure Askimo and chat:

  ```
  askimo> :set-provider docker
  askimo> :models
  askimo> :set-param model <model-name>
  askimo> "Explain containerization concepts."
  ```

📌 Default endpoint: http://localhost:12434
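To confirm the Model Runner is actually listening on its TCP port before configuring Askimo, you can probe its OpenAI-compatible API. This sketch assumes `curl` is installed, the `--tcp 12434` setting from the step above, and Docker Model Runner's `/engines/v1` path prefix:

```shell
# Check that Docker's model runner answers on the TCP endpoint enabled above.
if curl -fsS --max-time 2 http://localhost:12434/engines/v1/models >/dev/null 2>&1; then
  dmr_status="model runner: up"
else
  dmr_status="model runner: not reachable on localhost:12434"
fi
echo "$dmr_status"
```

You can also list pulled models from the Docker side with `docker model ls`.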
## Using LocalAI

- Install LocalAI (see localai.io)
- Configure and chat:

  ```
  askimo> :set-provider localai
  askimo> :set-param base_url http://localhost:8080  # your LocalAI endpoint
  askimo> :models
  askimo> :set-param model <model-name>
  askimo> "Help me debug this function."
  ```
## Using LM Studio

- Install LM Studio (see lmstudio.ai)
- Start the local server in LM Studio
- Configure Askimo and chat:

  ```
  askimo> :set-provider lmstudio
  askimo> :set-param base_url http://localhost:1234  # default LM Studio port
  askimo> :models
  askimo> :set-param model <model-name>
  askimo> "Generate a regex pattern for email validation."
  ```
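LM Studio and LocalAI both expose an OpenAI-compatible API, so the same check works for either: a plain GET to `/v1/models` on whatever `base_url` you gave Askimo. A sketch assuming `curl` is installed and LM Studio's default port (swap in `http://localhost:8080` for LocalAI):

```shell
# Confirm the local OpenAI-compatible server is up before chatting.
base_url="http://localhost:1234"   # LM Studio default; LocalAI defaults to :8080
if curl -fsS --max-time 2 "$base_url/v1/models" >/dev/null 2>&1; then
  server_status="local server: up at $base_url"
else
  server_status="local server: not reachable at $base_url"
fi
echo "$server_status"
```

If this fails, start (or restart) the local server and make sure its port matches the `base_url` you set in Askimo.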
## Switch any time

You can switch providers and models on the fly; Askimo remembers your last choices.

```
askimo> :set-provider ollama
askimo> :set-param model <model-name>
askimo> :set-provider openai
```