Package ai

v1.0.0
Published: Dec 3, 2025 License: Apache-2.0 Imports: 17 Imported by: 0

Documentation

Overview

Package ai provides AI-powered test case generation for gotests.
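
A minimal end-to-end sketch of how the pieces fit together. The import paths below are assumptions (substitute the module's real path), and the *models.Function is constructed by hand only for illustration; in gotests it comes from the source parser:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/cweill/gotests/internal/ai"     // assumed import path
	"github.com/cweill/gotests/internal/models" // assumed import path
)

func main() {
	cfg := ai.DefaultConfig() // defaults documented under Config below
	provider, err := ai.NewProvider(cfg)
	if err != nil {
		log.Fatal(err)
	}
	if !provider.IsAvailable() {
		log.Fatalf("provider %q is not available", provider.Name())
	}

	// A real *models.Function carries parameters, results, receiver, etc.;
	// only the name is filled in for this sketch.
	fn := &models.Function{Name: "Add"}

	cases, err := provider.GenerateTestCases(context.Background(), fn)
	if err != nil {
		log.Fatal(err)
	}
	for _, tc := range cases {
		fmt.Printf("%s (wantErr=%v)\n", tc.Name, tc.WantErr)
	}
}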

Constants

const (
	// MaxResponseSize is the maximum size of HTTP responses (1MB)
	MaxResponseSize = 1 * 1024 * 1024
	// MaxFunctionBodySize is the maximum size of function body in prompts (100KB)
	MaxFunctionBodySize = 100 * 1024
)

const (
	ProviderOllama     = "ollama"     // Ollama native API
	ProviderOpenAI     = "openai"     // OpenAI-compatible API (works with many providers)
	ProviderClaude     = "claude"     // Anthropic Claude native API
	ProviderGemini     = "gemini"     // Google Gemini native API
	ProviderCompatible = "compatible" // Alias for openai (OpenAI-compatible)
)

Provider name constants.

const (
	DefaultOllamaEndpoint = "http://localhost:11434"
	DefaultOpenAIEndpoint = "https://api.openai.com/v1"
	DefaultClaudeEndpoint = "https://api.anthropic.com/v1"
	DefaultGeminiEndpoint = "https://generativelanguage.googleapis.com/v1beta"
)

Default endpoints for providers.

const (
	DefaultOllamaModel = "qwen2.5-coder:0.5b"
	DefaultOpenAIModel = "gpt-4o-mini"
	DefaultClaudeModel = "claude-3-haiku-20240307"
	DefaultGeminiModel = "gemini-1.5-flash"
)

Default models for each provider.

Variables

var CommonEndpoints = map[string]string{
	"openai":     "https://api.openai.com/v1",
	"azure":      "https://{resource}.openai.azure.com/openai/deployments/{deployment}",
	"together":   "https://api.together.xyz/v1",
	"groq":       "https://api.groq.com/openai/v1",
	"anyscale":   "https://api.endpoints.anyscale.com/v1",
	"fireworks":  "https://api.fireworks.ai/inference/v1",
	"deepseek":   "https://api.deepseek.com/v1",
	"mistral":    "https://api.mistral.ai/v1",
	"perplexity": "https://api.perplexity.ai",
	"openrouter": "https://openrouter.ai/api/v1",
	"lmstudio":   "http://localhost:1234/v1",
}

Common OpenAI-compatible endpoints for reference. Users can use -ai-provider openai with any of these endpoints.

Functions

func GetCommonEndpoint

func GetCommonEndpoint(name string) string

GetCommonEndpoint returns the endpoint URL for a well-known provider name. Returns empty string if the provider is not in the common endpoints list.
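
A sketch of resolving a well-known endpoint and wiring it into a Config. The field names are those documented under Config below; the model name and environment variable are illustrative, and the ai and os packages are assumed to be imported:

endpoint := ai.GetCommonEndpoint("groq") // "https://api.groq.com/openai/v1"
if endpoint == "" {
	endpoint = ai.DefaultOpenAIEndpoint // fall back to OpenAI itself
}

cfg := ai.DefaultConfig()
cfg.Provider = ai.ProviderOpenAI
cfg.Endpoint = endpoint
cfg.Model = "llama-3.1-8b-instant" // illustrative model name
cfg.APIKey = os.Getenv("GROQ_API_KEY")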

func ListSupportedProviders

func ListSupportedProviders() []string

ListSupportedProviders returns a list of supported provider names.

func ValidateGeneratedTest

func ValidateGeneratedTest(testCode, pkgName string) error

ValidateGeneratedTest checks if the generated test code compiles. Uses in-memory parsing and type-checking without writing files.
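
A sketch of the call shape, assuming the ai and log packages are imported. The test snippet is illustrative; what the in-memory type checker accepts depends on the package's internal configuration:

testCode := `package calc

import "testing"

func TestAdd(t *testing.T) {
	if got := Add(1, 2); got != 3 {
		t.Errorf("Add(1, 2) = %d, want 3", got)
	}
}`

// The second argument must match the package the test was generated for.
if err := ai.ValidateGeneratedTest(testCode, "calc"); err != nil {
	log.Printf("discarding generated test: %v", err)
}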

Types

type ClaudeProvider

type ClaudeProvider struct {
	// contains filtered or unexported fields
}

ClaudeProvider implements the Provider interface for Anthropic Claude API.

func NewClaudeProvider

func NewClaudeProvider(cfg *Config) (*ClaudeProvider, error)

NewClaudeProvider creates a new Claude provider with the given config. Returns an error if the API key is missing or endpoint URL is invalid.

func (*ClaudeProvider) GenerateTestCases

func (c *ClaudeProvider) GenerateTestCases(ctx context.Context, fn *models.Function) ([]TestCase, error)

GenerateTestCases generates test cases using Claude API.

func (*ClaudeProvider) IsAvailable

func (c *ClaudeProvider) IsAvailable() bool

IsAvailable checks if the Claude API is accessible with the configured API key. Claude doesn't have a simple health check endpoint, so we verify the API key format.

func (*ClaudeProvider) Name

func (c *ClaudeProvider) Name() string

Name returns the provider name.

type Config

type Config struct {
	Provider       string // Provider name: "ollama", "openai", "claude", "gemini"
	Model          string // Model name (e.g., "qwen2.5-coder:0.5b", "gpt-4o", "claude-3-sonnet")
	Endpoint       string // API endpoint URL
	APIKey         string // API key (for cloud providers)
	MinCases       int    // Minimum number of test cases to generate (default: 3)
	MaxCases       int    // Maximum number of test cases to generate (default: 10)
	MaxRetries     int    // Maximum number of retry attempts (default: 3)
	RequestTimeout int    // HTTP request timeout in seconds (default: 60)
	HealthTimeout  int    // Health check timeout in seconds (default: 2)
}

Config holds configuration for AI providers.

func DefaultConfig

func DefaultConfig() *Config

DefaultConfig returns the default AI configuration.
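
A sketch of starting from the defaults and overriding selected fields for Claude; the environment variable name is illustrative, and the ai and os packages are assumed to be imported:

cfg := ai.DefaultConfig()
cfg.Provider = ai.ProviderClaude
cfg.Model = ai.DefaultClaudeModel       // "claude-3-haiku-20240307"
cfg.Endpoint = ai.DefaultClaudeEndpoint // "https://api.anthropic.com/v1"
cfg.APIKey = os.Getenv("ANTHROPIC_API_KEY")
cfg.MaxCases = 5        // cap generated cases below the default of 10
cfg.RequestTimeout = 30 // seconds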

type GeminiProvider

type GeminiProvider struct {
	// contains filtered or unexported fields
}

GeminiProvider implements the Provider interface for Google Gemini API.

func NewGeminiProvider

func NewGeminiProvider(cfg *Config) (*GeminiProvider, error)

NewGeminiProvider creates a new Gemini provider with the given config. Returns an error if the API key is missing or endpoint URL is invalid.

func (*GeminiProvider) GenerateTestCases

func (g *GeminiProvider) GenerateTestCases(ctx context.Context, fn *models.Function) ([]TestCase, error)

GenerateTestCases generates test cases using Gemini API.

func (*GeminiProvider) IsAvailable

func (g *GeminiProvider) IsAvailable() bool

IsAvailable checks if the Gemini API is accessible with the configured API key.

func (*GeminiProvider) Name

func (g *GeminiProvider) Name() string

Name returns the provider name.

type OllamaProvider

type OllamaProvider struct {
	// contains filtered or unexported fields
}

OllamaProvider implements the Provider interface for Ollama.

func NewOllamaProvider

func NewOllamaProvider(cfg *Config) (*OllamaProvider, error)

NewOllamaProvider creates a new Ollama provider with the given config. Returns an error if the endpoint URL is invalid or unsafe.
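
A sketch of constructing the Ollama provider directly instead of going through NewProvider; a local Ollama instance needs no API key (assumes the ai and log packages are imported):

cfg := ai.DefaultConfig()
cfg.Provider = ai.ProviderOllama
cfg.Endpoint = ai.DefaultOllamaEndpoint // "http://localhost:11434"
cfg.Model = ai.DefaultOllamaModel       // "qwen2.5-coder:0.5b"

p, err := ai.NewOllamaProvider(cfg)
if err != nil {
	log.Fatal(err)
}
if !p.IsAvailable() {
	log.Fatalf("no Ollama instance reachable at %s", cfg.Endpoint)
}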

func (*OllamaProvider) GenerateTestCases

func (o *OllamaProvider) GenerateTestCases(ctx context.Context, fn *models.Function) ([]TestCase, error)

GenerateTestCases generates test cases using Ollama. It uses Go code generation instead of JSON output for better compatibility with small models.

func (*OllamaProvider) IsAvailable

func (o *OllamaProvider) IsAvailable() bool

IsAvailable checks if Ollama is running and accessible.

func (*OllamaProvider) Name

func (o *OllamaProvider) Name() string

Name returns the provider name.

type OpenAIProvider

type OpenAIProvider struct {
	// contains filtered or unexported fields
}

OpenAIProvider implements the Provider interface for OpenAI-compatible APIs. This provider works with any service that implements the OpenAI Chat Completions API format:

  • OpenAI (api.openai.com)
  • Azure OpenAI
  • Together AI (api.together.xyz)
  • Groq (api.groq.com)
  • Anyscale (api.endpoints.anyscale.com)
  • Fireworks AI (api.fireworks.ai)
  • DeepSeek (api.deepseek.com)
  • Mistral AI (api.mistral.ai)
  • Perplexity (api.perplexity.ai)
  • OpenRouter (openrouter.ai)
  • LM Studio (localhost:1234)
  • And many more...

func NewOpenAIProvider

func NewOpenAIProvider(cfg *Config) (*OpenAIProvider, error)

NewOpenAIProvider creates a new OpenAI-compatible provider with the given config. This works with any service implementing the OpenAI Chat Completions API. Returns an error if the API key is missing or endpoint URL is invalid.
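
A sketch of pointing the OpenAI-compatible provider at one of the CommonEndpoints entries; the model name and environment variable are illustrative (assumes the ai, log, and os packages are imported):

cfg := ai.DefaultConfig()
cfg.Provider = ai.ProviderOpenAI
cfg.Endpoint = ai.CommonEndpoints["deepseek"] // any OpenAI-compatible endpoint
cfg.Model = "deepseek-chat"                   // illustrative model name
cfg.APIKey = os.Getenv("DEEPSEEK_API_KEY")

p, err := ai.NewOpenAIProvider(cfg)
if err != nil {
	log.Fatal(err)
}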

func (*OpenAIProvider) GenerateTestCases

func (o *OpenAIProvider) GenerateTestCases(ctx context.Context, fn *models.Function) ([]TestCase, error)

GenerateTestCases generates test cases using OpenAI API.

func (*OpenAIProvider) IsAvailable

func (o *OpenAIProvider) IsAvailable() bool

IsAvailable checks if the API is accessible. For cloud providers, it verifies the API key by making a lightweight request. For local providers, it just checks connectivity.

func (*OpenAIProvider) Name

func (o *OpenAIProvider) Name() string

Name returns the provider name.

type Provider

type Provider interface {
	// GenerateTestCases generates test cases for the given function.
	GenerateTestCases(ctx context.Context, fn *models.Function) ([]TestCase, error)

	// IsAvailable checks if the provider is available and configured.
	IsAvailable() bool

	// Name returns the provider name for logging/debugging.
	Name() string
}

Provider is the interface for AI test case generation backends.

func NewProvider

func NewProvider(cfg *Config) (Provider, error)

NewProvider creates a new AI provider based on the configuration. It returns an error if the provider is unknown or configuration is invalid.

Supported providers:

  • "ollama": Ollama native API (local, free)
  • "openai" or "compatible": OpenAI-compatible API (works with OpenAI, Azure, Together, Groq, etc.)
  • "claude": Anthropic Claude native API
  • "gemini": Google Gemini native API

For OpenAI-compatible providers, you can use any endpoint that follows the OpenAI API format.
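
A sketch of handling the two failure modes separately; the fallback to a local Ollama instance is a caller-side policy, not something NewProvider does itself (assumes the ai and log packages are imported and cfg is a *Config):

provider, err := ai.NewProvider(cfg)
if err != nil {
	log.Fatalf("bad AI config (supported providers: %v): %v",
		ai.ListSupportedProviders(), err)
}
if !provider.IsAvailable() {
	// Caller-chosen fallback: retry with a local Ollama instance.
	fb := ai.DefaultConfig()
	fb.Provider = ai.ProviderOllama
	if provider, err = ai.NewProvider(fb); err != nil || !provider.IsAvailable() {
		log.Fatal("no usable AI provider")
	}
}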

type TestCase

type TestCase struct {
	Name        string            // Test case name (e.g., "positive_numbers")
	Description string            // Optional description
	Args        map[string]string // Parameter name -> Go code value
	Want        map[string]string // Return value name -> Go code value
	WantErr     bool              // Whether an error is expected
}

TestCase represents a single generated test case.
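
A sketch of what a generated case for func Div(a, b int) (int, error) might look like; the exact key names and how gotests splices the values into a table-driven test are assumptions based on the field comments above:

tc := ai.TestCase{
	Name:        "divide_by_zero",
	Description: "dividing by zero should return an error",
	Args:        map[string]string{"a": "10", "b": "0"},
	Want:        map[string]string{"want": "0"},
	WantErr:     true,
}
// Args and Want hold Go source snippets, so this case roughly corresponds to a
// table entry like {name: "divide_by_zero", a: 10, b: 0, want: 0, wantErr: true}.
_ = tc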
