anthropic

package
v0.1.14 Latest
Published: Oct 20, 2025 License: MIT Imports: 12 Imported by: 64

Documentation

Index

Constants

const (
	RoleUser      = "user"
	RoleAssistant = "assistant"
	RoleSystem    = "system"
)
const MaxTokensAnthropicSonnet35 = "max-tokens-3-5-sonnet-2024-07-15" //nolint:gosec // This is not a sensitive value.

MaxTokensAnthropicSonnet35 is the header value for specifying the maximum number of tokens when using the Anthropic Sonnet 3.5 model.

Variables

var (
	ErrEmptyResponse            = errors.New("no response")
	ErrMissingToken             = errors.New("missing the Anthropic API key, set it in the ANTHROPIC_API_KEY environment variable")
	ErrUnexpectedResponseLength = errors.New("unexpected length of response")
	ErrInvalidContentType       = errors.New("invalid content type")
	ErrUnsupportedMessageType   = errors.New("unsupported message type")
	ErrUnsupportedContentType   = errors.New("unsupported content type")
)

Functions

func EphemeralCache added in v0.1.14

func EphemeralCache() *llms.CacheControl

EphemeralCache creates a standard ephemeral cache control for Anthropic with a 5-minute duration.

func EphemeralCacheOneHour added in v0.1.14

func EphemeralCacheOneHour() *llms.CacheControl

EphemeralCacheOneHour creates a 1-hour ephemeral cache control for Anthropic.

func MapError added in v0.1.14

func MapError(err error) error

MapError maps Anthropic-specific errors to standardized error codes.

func WithBetaHeader added in v0.1.14

func WithBetaHeader(header string) llms.CallOption

WithBetaHeader adds a custom beta header for accessing Anthropic's experimental features. This is useful for testing new features before dedicated support is added.

Usage:

llm.GenerateContent(ctx, messages,
    anthropic.WithBetaHeader("new-feature-2025-01-01"),
)

func WithExtendedOutput added in v0.1.14

func WithExtendedOutput() llms.CallOption

WithExtendedOutput enables 128K token output for Claude 3.7+. Standard models are limited to 8K tokens, but this beta feature allows generating much longer responses.

Usage:

llm.GenerateContent(ctx, messages,
    llms.WithMaxTokens(50000),
    anthropic.WithExtendedOutput(),
)

func WithInterleavedThinking added in v0.1.14

func WithInterleavedThinking() llms.CallOption

WithInterleavedThinking enables thinking between tool calls for Claude 3.7+. This allows the model to use reasoning tokens to plan tool usage and interpret results.

Usage:

llm.GenerateContent(ctx, messages,
    llms.WithTools(tools),
    llms.WithThinkingMode(llms.ThinkingModeMedium),
    anthropic.WithInterleavedThinking(),
)

func WithPromptCaching added in v0.1.14

func WithPromptCaching() llms.CallOption

WithPromptCaching enables Anthropic's prompt caching feature. This allows frequently used prompts and system messages to be cached for improved performance and reduced costs.

Usage:

llm.GenerateContent(ctx, messages,
    anthropic.WithPromptCaching(),
)

Types

type LLM

type LLM struct {
	CallbacksHandler callbacks.Handler
	// contains filtered or unexported fields
}

func New

func New(opts ...Option) (*LLM, error)

New returns a new Anthropic LLM.

func (*LLM) Call

func (o *LLM) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)

Call requests a completion for the given prompt.

func (*LLM) GenerateContent added in v0.1.4

func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error)

GenerateContent implements the Model interface.

func (*LLM) SupportsReasoning added in v0.1.14

func (o *LLM) SupportsReasoning() bool

SupportsReasoning implements the ReasoningModel interface. Returns true if the current model supports extended thinking capabilities.

type Option

type Option func(*options)

func WithAnthropicBetaHeader added in v0.1.13

func WithAnthropicBetaHeader(value string) Option

WithAnthropicBetaHeader adds the Anthropic Beta header to support extended options.

func WithBaseURL added in v0.1.6

func WithBaseURL(baseURL string) Option

WithBaseURL passes the Anthropic base URL to the client. If not set, the default base URL is used.

func WithHTTPClient added in v0.1.6

func WithHTTPClient(client anthropicclient.Doer) Option

WithHTTPClient allows setting a custom HTTP client. If not set, the default value is http.DefaultClient.

func WithLegacyTextCompletionsAPI added in v0.1.8

func WithLegacyTextCompletionsAPI() Option

WithLegacyTextCompletionsAPI enables the use of the legacy text completions API.

func WithModel

func WithModel(model string) Option

WithModel passes the Anthropic model to the client.

func WithToken

func WithToken(token string) Option

WithToken passes the Anthropic API token to the client. If not set, the token is read from the ANTHROPIC_API_KEY environment variable.
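These constructors follow Go's functional options pattern: each returns a closure that mutates a private options struct inside New, with defaults applied first. A minimal self-contained sketch of the pattern (field names and defaults are illustrative, not the package's actual internals):

```go
package main

import "fmt"

// options holds client configuration; fields are illustrative.
type options struct {
	model   string
	baseURL string
}

// Option mutates the configuration, mirroring `type Option func(*options)`.
type Option func(*options)

func WithModel(model string) Option {
	return func(o *options) { o.model = model }
}

func WithBaseURL(baseURL string) Option {
	return func(o *options) { o.baseURL = baseURL }
}

// newOptions applies defaults first, then each option in order.
func newOptions(opts ...Option) options {
	o := options{baseURL: "https://api.anthropic.com"} // illustrative default
	for _, opt := range opts {
		opt(&o)
	}
	return o
}

func main() {
	o := newOptions(WithModel("some-model"), WithBaseURL("https://example.test"))
	fmt.Println(o.model, o.baseURL)
}
```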

type ToolResult added in v0.1.11

type ToolResult struct {
	Type      string `json:"type"`
	ToolUseID string `json:"tool_use_id"`
	Content   string `json:"content"`
}

Directories

Path Synopsis
internal
