AI Providers
Cyberstrike supports multiple AI providers, giving you flexibility in choosing the best model for your security assessments.
📸 SCREENSHOT: s05-provider-dialog.png
Provider selection dialog
📊 DIAGRAM: provider-selection-flow.mermaid
Provider selection flow diagram
Supported Providers
| Provider | Models | Best For |
|---|---|---|
| Anthropic | Claude Opus 4.5, Sonnet 4, Haiku | Complex reasoning, long context |
| OpenAI | GPT-4o, o1, o1-preview | General purpose, fast responses |
| Google | Gemini 2.0 Flash, Pro | Multimodal, large context |
| AWS Bedrock | Claude, Llama, Titan | Enterprise, on-prem |
| Azure OpenAI | GPT-4, GPT-4o | Enterprise compliance |
| Google Vertex AI | Gemini, Claude | Enterprise GCP |
| OpenRouter | Many models | Free tier, model variety |
| Groq | Llama 3.3, Mixtral | Speed, cost efficiency |
| Ollama | Local models | Privacy, offline use |
Adding Credentials
Interactive Login
```
cyberstrike auth login
```
Select your provider and enter your API key.
Environment Variables
Set API keys in your environment:
```
# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# OpenAI
export OPENAI_API_KEY="sk-..."

# Google
export GOOGLE_API_KEY="AI..."

# OpenRouter
export OPENROUTER_API_KEY="sk-or-..."

# Groq
export GROQ_API_KEY="gsk_..."
```
Add these to your shell profile (~/.bashrc or ~/.zshrc) for persistence.
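For example, to persist the Anthropic key in zsh (a minimal sketch; substitute your own key and shell profile):
```
# Append the export to your profile and reload it in the current shell
echo 'export ANTHROPIC_API_KEY="sk-ant-..."' >> ~/.zshrc
source ~/.zshrc
```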
Configuration File
Store credentials in ~/.cyberstrike/config.json:
```
{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}
```
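Because this file can hold provider credentials, it is worth restricting its permissions. This is plain POSIX tooling, not a Cyberstrike feature:
```
# Make the config readable and writable only by your user
chmod 600 ~/.cyberstrike/config.json
```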
Anthropic
Claude models are recommended for complex security analysis.
Authentication
```
cyberstrike auth login
# Select: Anthropic
# Enter API key from console.anthropic.com
```
Or via environment:
```
export ANTHROPIC_API_KEY="sk-ant-..."
```
Available Models
| Model | ID | Context | Best For |
|---|---|---|---|
| Claude Opus 4.5 | claude-opus-4-5-20250514 | 200K | Complex analysis |
| Claude Sonnet 4 | claude-sonnet-4-20250514 | 200K | Balanced performance |
| Claude Haiku 4.5 | claude-haiku-4-5-20250514 | 200K | Quick tasks |
Usage
```
# Via command line
cyberstrike --model anthropic/claude-sonnet-4-20250514

# Via config
{
  "model": "anthropic/claude-sonnet-4-20250514"
}
```
Configuration Options
```
{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}",
        "timeout": 300000
      }
    }
  }
}
```
OpenAI
GPT-4 models provide fast, reliable responses.
Authentication
```
cyberstrike auth login
# Select: OpenAI
# Enter API key from platform.openai.com
```
Or via environment:
```
export OPENAI_API_KEY="sk-..."
```
Available Models
| Model | ID | Context | Best For |
|---|---|---|---|
| GPT-4o | gpt-4o | 128K | General purpose |
| GPT-4o Mini | gpt-4o-mini | 128K | Cost efficiency |
| o1 | o1 | 200K | Complex reasoning |
| o1 Preview | o1-preview | 128K | Advanced tasks |
Usage
```
cyberstrike --model openai/gpt-4o
```
Configuration Options
```
{
  "provider": {
    "openai": {
      "options": {
        "apiKey": "{env:OPENAI_API_KEY}",
        "baseURL": "https://api.openai.com/v1"
      }
    }
  }
}
```
Google
Gemini models excel at multimodal tasks.
Authentication
```
export GOOGLE_API_KEY="AI..."
```
Or via Google Cloud:
```
gcloud auth application-default login
```
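To confirm that Application Default Credentials are in place before pointing Cyberstrike at Gemini, you can ask gcloud for a token. This is standard gcloud behavior, not a Cyberstrike command:
```
# Prints an access token if ADC is configured correctly
gcloud auth application-default print-access-token
```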
Available Models
| Model | ID | Context | Best For |
|---|---|---|---|
| Gemini 2.0 Flash | gemini-2.0-flash-exp | 1M | Fast, experimental |
| Gemini 1.5 Pro | gemini-1.5-pro | 2M | Large context |
| Gemini 1.5 Flash | gemini-1.5-flash | 1M | Speed |
Usage
```
cyberstrike --model google/gemini-2.0-flash-exp
```
OpenRouter
Access many models through a single API with a free tier.
Authentication
```
cyberstrike auth login
# Select: OpenRouter
# Get API key from openrouter.ai/keys
```
Or via environment:
```
export OPENROUTER_API_KEY="sk-or-..."
```
Free Tier Models
| Model | ID |
|---|---|
| Llama 4 Scout | meta-llama/llama-4-scout:free |
| Gemini 2.0 Flash | google/gemini-2.0-flash-exp:free |
| Mistral Small | mistralai/mistral-small-3.2-24b-instruct:free |
Usage
```
cyberstrike --model openrouter/meta-llama/llama-4-scout:free
```
Configuration
```
{
  "model": "openrouter/meta-llama/llama-4-scout:free",
  "provider": {
    "openrouter": {
      "options": {
        "apiKey": "{env:OPENROUTER_API_KEY}"
      }
    }
  }
}
```
AWS Bedrock
Use Claude and other models through AWS.
Authentication
Configure AWS credentials:
```
# Via AWS CLI
aws configure

# Or via environment
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"
```
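Before using Bedrock, it can help to confirm that the credentials resolve and that models are visible in your region. These are standard AWS CLI calls, and model access may also need to be enabled in the Bedrock console:
```
# Check which identity the credentials resolve to
aws sts get-caller-identity

# List foundation models available in the configured region
aws bedrock list-foundation-models --region us-east-1
```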
Available Models
| Model | ID |
|---|---|
| Claude Sonnet | anthropic.claude-sonnet-4-20250514-v1:0 |
| Claude Haiku | anthropic.claude-haiku-4-5-20250514-v1:0 |
| Llama 3.3 | meta.llama3-3-70b-instruct-v1:0 |
Configuration
```
{
  "model": "amazon-bedrock/anthropic.claude-sonnet-4-20250514-v1:0",
  "provider": {
    "amazon-bedrock": {
      "options": {
        "region": "us-east-1"
      }
    }
  }
}
```
Azure OpenAI
Enterprise Azure deployment of OpenAI models.
Authentication
```
export AZURE_API_KEY="..."
export AZURE_RESOURCE_NAME="your-resource"
```
Configuration
```
{
  "provider": {
    "azure": {
      "options": {
        "apiKey": "{env:AZURE_API_KEY}",
        "resourceName": "your-resource",
        "deploymentId": "gpt-4o"
      }
    }
  }
}
```
Google Vertex AI
Enterprise Google Cloud AI.
Authentication
```
gcloud auth application-default login
```
Configuration
```
{
  "provider": {
    "google-vertex": {
      "options": {
        "project": "your-project-id",
        "location": "us-central1"
      }
    }
  }
}
```
Ollama (Local)
Run models locally for privacy and offline use.
Setup
- Install Ollama:

```
curl -fsSL https://ollama.com/install.sh | sh
```

- Pull a model:

```
ollama pull llama3.3
```

- Start Ollama:

```
ollama serve
```
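To check that the local server is up before running Cyberstrike against it, a quick sketch (Ollama listens on port 11434 by default, matching the baseURL in the configuration below):
```
# List locally pulled models
ollama list

# Confirm the HTTP API responds on the default port
curl http://localhost:11434/api/tags
```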
Usage
```
# No API key needed
cyberstrike --model ollama/llama3.3
```
Available Models
| Model | Pull Command |
|---|---|
| Llama 3.3 | ollama pull llama3.3 |
| CodeLlama | ollama pull codellama |
| Mistral | ollama pull mistral |
| Qwen | ollama pull qwen2.5 |
Configuration
```
{
  "model": "ollama/llama3.3",
  "provider": {
    "ollama": {
      "options": {
        "baseURL": "http://localhost:11434"
      }
    }
  }
}
```
Groq
Very low-latency inference for supported open models.
Authentication
```
export GROQ_API_KEY="gsk_..."
```
Available Models
| Model | ID |
|---|---|
| Llama 3.3 70B | llama-3.3-70b-versatile |
| Mixtral 8x7B | mixtral-8x7b-32768 |
Usage
```
cyberstrike --model groq/llama-3.3-70b-versatile
```
Custom OpenAI-Compatible
Connect to any OpenAI-compatible API.
Configuration
```
{
  "provider": {
    "custom-api": {
      "id": "custom-api",
      "name": "Custom API",
      "options": {
        "baseURL": "https://api.custom.example.com/v1",
        "apiKey": "{env:CUSTOM_API_KEY}"
      },
      "models": {
        "custom-model": {
          "name": "Custom Model",
          "contextLength": 32000
        }
      }
    }
  }
}
```
Usage
```
cyberstrike --model custom-api/custom-model
```
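A quick way to verify that an endpoint really speaks the OpenAI-compatible API is to request its model list. The URL and environment variable below are the placeholders from the example configuration above:
```
# Should return a JSON list of models on an OpenAI-compatible endpoint
curl -H "Authorization: Bearer $CUSTOM_API_KEY" \
  https://api.custom.example.com/v1/models
```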
Provider Selection Tips
For Complex Security Analysis
Use Claude Opus 4.5 or Sonnet 4:
```
{
  "model": "anthropic/claude-opus-4-5-20250514"
}
```
For Cost Efficiency
Use OpenRouter free tier or Groq:
```
{
  "model": "openrouter/meta-llama/llama-4-scout:free"
}
```
For Privacy
Use Ollama with local models:
```
{
  "model": "ollama/llama3.3"
}
```
For Enterprise
Use AWS Bedrock or Azure OpenAI with your organization’s credentials.
Listing Credentials
View stored credentials:
```
cyberstrike auth list
```
Output:
```
Credentials ~/.cyberstrike/auth.json
  Anthropic   api
  OpenRouter  api

Environment
  ANTHROPIC_API_KEY
```
Removing Credentials
```
cyberstrike auth logout
# Select provider to remove
```
Tip
For penetration testing tasks requiring complex reasoning, we recommend Claude Opus 4.5 or Claude Sonnet 4.