
AI Providers

Cyberstrike supports multiple AI providers, giving you flexibility in choosing the best model for your security assessments.

📸 SCREENSHOT: s05-provider-dialog.png

Provider selection dialog

📊 DIAGRAM: provider-selection-flow.mermaid

Provider selection flow diagram

Supported Providers

| Provider | Models | Best For |
| --- | --- | --- |
| Anthropic | Claude Opus 4.5, Sonnet 4, Haiku | Complex reasoning, long context |
| OpenAI | GPT-4o, o1, o1-preview | General purpose, fast responses |
| Google | Gemini 2.0 Flash, Pro | Multimodal, large context |
| AWS Bedrock | Claude, Llama, Titan | Enterprise, on-prem |
| Azure OpenAI | GPT-4, GPT-4o | Enterprise compliance |
| Google Vertex AI | Gemini, Claude | Enterprise GCP |
| OpenRouter | Many models | Free tier, model variety |
| Groq | Llama 3.3, Mixtral | Speed, cost efficiency |
| Ollama | Local models | Privacy, offline use |

Adding Credentials

Interactive Login

Terminal window
cyberstrike auth login

Select your provider and enter your API key.

Environment Variables

Set API keys in your environment:

Terminal window
# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
# OpenAI
export OPENAI_API_KEY="sk-..."
# Google
export GOOGLE_API_KEY="AI..."
# OpenRouter
export OPENROUTER_API_KEY="sk-or-..."
# Groq
export GROQ_API_KEY="gsk_..."

Add to your shell profile (~/.bashrc or ~/.zshrc) for persistence.
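
As an illustrative sketch of the persistence step, the snippet below appends the export line to a profile file and confirms it was written; a temporary file stands in for your real `~/.bashrc` or `~/.zshrc`:

```shell
# Illustrative only: a temp file stands in for ~/.bashrc or ~/.zshrc
profile=$(mktemp)

# Append the export line; new shells that source this profile inherit the key
echo 'export ANTHROPIC_API_KEY="sk-ant-..."' >> "$profile"

# Confirm the line landed in the profile
grep -q 'ANTHROPIC_API_KEY' "$profile" && echo "key persisted"
```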

Configuration File

Store credentials in ~/.cyberstrike/config.json:

{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}
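
The `{env:ANTHROPIC_API_KEY}` placeholder is resolved from the environment when the config is loaded, so the key itself never has to live in the file. A rough sketch of that substitution (illustrative only; Cyberstrike performs this internally):

```shell
# Sketch of {env:VAR} resolution: swap the placeholder for the variable's value.
# The key below is a made-up example, not a real credential.
export ANTHROPIC_API_KEY="sk-ant-example"
template='{"apiKey": "{env:ANTHROPIC_API_KEY}"}'
resolved=$(printf '%s' "$template" | sed "s/{env:ANTHROPIC_API_KEY}/$ANTHROPIC_API_KEY/")
echo "$resolved"
# → {"apiKey": "sk-ant-example"}
```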

Anthropic

Claude models are recommended for complex security analysis.

Authentication

Terminal window
cyberstrike auth login
# Select: Anthropic
# Enter API key from console.anthropic.com

Or via environment:

Terminal window
export ANTHROPIC_API_KEY="sk-ant-..."

Available Models

| Model | ID | Context | Best For |
| --- | --- | --- | --- |
| Claude Opus 4.5 | claude-opus-4-5-20250514 | 200K | Complex analysis |
| Claude Sonnet 4 | claude-sonnet-4-20250514 | 200K | Balanced performance |
| Claude Haiku 4.5 | claude-haiku-4-5-20250514 | 200K | Quick tasks |

Usage

Terminal window
# Via command line
cyberstrike --model anthropic/claude-sonnet-4-20250514

Or via config:

{
  "model": "anthropic/claude-sonnet-4-20250514"
}

Configuration Options

{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}",
        "timeout": 300000
      }
    }
  }
}

OpenAI

GPT-4o and o1 models provide fast, reliable responses.

Authentication

Terminal window
cyberstrike auth login
# Select: OpenAI
# Enter API key from platform.openai.com

Or via environment:

Terminal window
export OPENAI_API_KEY="sk-..."

Available Models

| Model | ID | Context | Best For |
| --- | --- | --- | --- |
| GPT-4o | gpt-4o | 128K | General purpose |
| GPT-4o Mini | gpt-4o-mini | 128K | Cost efficiency |
| o1 | o1 | 200K | Complex reasoning |
| o1 Preview | o1-preview | 128K | Advanced tasks |

Usage

Terminal window
cyberstrike --model openai/gpt-4o

Configuration Options

{
  "provider": {
    "openai": {
      "options": {
        "apiKey": "{env:OPENAI_API_KEY}",
        "baseURL": "https://api.openai.com/v1"
      }
    }
  }
}

Google

Gemini models excel at multimodal tasks.

Authentication

Terminal window
export GOOGLE_API_KEY="AI..."

Or via Google Cloud:

Terminal window
gcloud auth application-default login

Available Models

| Model | ID | Context | Best For |
| --- | --- | --- | --- |
| Gemini 2.0 Flash | gemini-2.0-flash-exp | 1M | Fast, experimental |
| Gemini 1.5 Pro | gemini-1.5-pro | 2M | Large context |
| Gemini 1.5 Flash | gemini-1.5-flash | 1M | Speed |

Usage

Terminal window
cyberstrike --model google/gemini-2.0-flash-exp

OpenRouter

Access many models through a single API with a free tier.

Authentication

Terminal window
cyberstrike auth login
# Select: OpenRouter
# Get API key from openrouter.ai/keys

Or via environment:

Terminal window
export OPENROUTER_API_KEY="sk-or-..."

Free Tier Models

| Model | ID |
| --- | --- |
| Llama 4 Scout | meta-llama/llama-4-scout:free |
| Gemini 2.0 Flash | google/gemini-2.0-flash-exp:free |
| Mistral Small | mistralai/mistral-small-3.2-24b-instruct:free |

Usage

Terminal window
cyberstrike --model openrouter/meta-llama/llama-4-scout:free

Configuration

{
  "model": "openrouter/meta-llama/llama-4-scout:free",
  "provider": {
    "openrouter": {
      "options": {
        "apiKey": "{env:OPENROUTER_API_KEY}"
      }
    }
  }
}

AWS Bedrock

Use Claude and other models through AWS.

Authentication

Configure AWS credentials:

Terminal window
# Via AWS CLI
aws configure
# Or via environment
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"
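
Before selecting a Bedrock model, it can help to confirm the credentials actually resolve to an identity. `aws sts get-caller-identity` is the standard check; the sketch below tolerates environments where the AWS CLI is not installed:

```shell
# Confirm the active AWS identity before using Bedrock models.
if ! command -v aws >/dev/null 2>&1; then
  creds_status="aws cli not installed"
elif aws sts get-caller-identity >/dev/null 2>&1; then
  creds_status="valid"
else
  creds_status="missing or invalid"
fi
echo "aws credentials: $creds_status"
```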

Available Models

| Model | ID |
| --- | --- |
| Claude Sonnet | anthropic.claude-sonnet-4-20250514-v1:0 |
| Claude Haiku | anthropic.claude-haiku-4-5-20250514-v1:0 |
| Llama 3.3 | meta.llama3-3-70b-instruct-v1:0 |

Configuration

{
  "model": "amazon-bedrock/anthropic.claude-sonnet-4-20250514-v1:0",
  "provider": {
    "amazon-bedrock": {
      "options": {
        "region": "us-east-1"
      }
    }
  }
}

Azure OpenAI

Enterprise Azure deployment of OpenAI models.

Authentication

Terminal window
export AZURE_API_KEY="..."
export AZURE_RESOURCE_NAME="your-resource"

Configuration

{
  "provider": {
    "azure": {
      "options": {
        "apiKey": "{env:AZURE_API_KEY}",
        "resourceName": "your-resource",
        "deploymentId": "gpt-4o"
      }
    }
  }
}

Google Vertex AI

Enterprise Google Cloud AI.

Authentication

Terminal window
gcloud auth application-default login

Configuration

{
  "provider": {
    "google-vertex": {
      "options": {
        "project": "your-project-id",
        "location": "us-central1"
      }
    }
  }
}

Ollama (Local)

Run models locally for privacy and offline use.

Setup

  1. Install Ollama:

Terminal window
curl -fsSL https://ollama.com/install.sh | sh

  2. Pull a model:

Terminal window
ollama pull llama3.3

  3. Start Ollama:

Terminal window
ollama serve
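
Before pointing Cyberstrike at Ollama, a quick sanity check can confirm both the binary and the local server. The port (11434) and the `/api/tags` endpoint (which lists pulled models) are Ollama's documented defaults:

```shell
# Check that the ollama binary is installed
if command -v ollama >/dev/null 2>&1; then
  binary_status="installed"
else
  binary_status="not found"
fi

# Check that the local server is answering; 11434 is Ollama's default port
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  server_status="running"
else
  server_status="not running"
fi

echo "ollama binary: $binary_status"
echo "ollama server: $server_status"
```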

Usage

Terminal window
# No API key needed
cyberstrike --model ollama/llama3.3

Available Models

| Model | Pull Command |
| --- | --- |
| Llama 3.3 | ollama pull llama3.3 |
| CodeLlama | ollama pull codellama |
| Mistral | ollama pull mistral |
| Qwen | ollama pull qwen2.5 |

Configuration

{
  "model": "ollama/llama3.3",
  "provider": {
    "ollama": {
      "options": {
        "baseURL": "http://localhost:11434"
      }
    }
  }
}

Groq

Groq offers extremely fast inference for the models it hosts.

Authentication

Terminal window
export GROQ_API_KEY="gsk_..."

Available Models

| Model | ID |
| --- | --- |
| Llama 3.3 70B | llama-3.3-70b-versatile |
| Mixtral 8x7B | mixtral-8x7b-32768 |

Usage

Terminal window
cyberstrike --model groq/llama-3.3-70b-versatile

Custom OpenAI-Compatible

Connect to any OpenAI-compatible API.

Configuration

{
  "provider": {
    "custom-api": {
      "id": "custom-api",
      "name": "Custom API",
      "options": {
        "baseURL": "https://api.custom.example.com/v1",
        "apiKey": "{env:CUSTOM_API_KEY}"
      },
      "models": {
        "custom-model": {
          "name": "Custom Model",
          "contextLength": 32000
        }
      }
    }
  }
}

Usage

Terminal window
cyberstrike --model custom-api/custom-model
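
To verify the endpoint before wiring it into the config, you can probe the OpenAI-compatible `/models` route; the base URL and `CUSTOM_API_KEY` below are the placeholders from the example above, not real values:

```shell
# Probe an OpenAI-compatible endpoint; /models is part of the standard API surface.
# The URL and CUSTOM_API_KEY are placeholders from the config example.
base_url="https://api.custom.example.com/v1"
if curl -sf -H "Authorization: Bearer ${CUSTOM_API_KEY:-}" "$base_url/models" >/dev/null 2>&1; then
  endpoint_status="reachable"
else
  endpoint_status="unreachable (check baseURL and key)"
fi
echo "endpoint: $endpoint_status"
```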

Provider Selection Tips

For Complex Security Analysis

Use Claude Opus 4.5 or Sonnet 4:

{
  "model": "anthropic/claude-opus-4-5-20250514"
}

For Cost Efficiency

Use OpenRouter free tier or Groq:

{
  "model": "openrouter/meta-llama/llama-4-scout:free"
}

For Privacy

Use Ollama with local models:

{
  "model": "ollama/llama3.3"
}

For Enterprise

Use AWS Bedrock or Azure OpenAI with your organization’s credentials.


Listing Credentials

View stored credentials:

Terminal window
cyberstrike auth list

Output:

Credentials ~/.cyberstrike/auth.json
Anthropic api
OpenRouter api
Environment
ANTHROPIC_API_KEY

Removing Credentials

Terminal window
cyberstrike auth logout
# Select provider to remove

Tip

For penetration testing tasks requiring complex reasoning, we recommend Claude Opus 4.5 or Claude Sonnet 4.