Documentation ¶
Index ¶
- Constants
- Variables
- func EphemeralCache() *llms.CacheControl
- func EphemeralCacheOneHour() *llms.CacheControl
- func MapError(err error) error
- func WithBetaHeader(header string) llms.CallOption
- func WithExtendedOutput() llms.CallOption
- func WithInterleavedThinking() llms.CallOption
- func WithPromptCaching() llms.CallOption
- type LLM
- type Option
- type ToolResult
Constants ¶
const (
	RoleUser      = "user"
	RoleAssistant = "assistant"
	RoleSystem    = "system"
)
const MaxTokensAnthropicSonnet35 = "max-tokens-3-5-sonnet-2024-07-15" //nolint:gosec // This is not a sensitive value.
MaxTokensAnthropicSonnet35 is the header value for specifying the maximum number of tokens when using the Anthropic Sonnet 3.5 model.
Variables ¶
var (
	ErrEmptyResponse            = errors.New("no response")
	ErrMissingToken             = errors.New("missing the Anthropic API key, set it in the ANTHROPIC_API_KEY environment variable")
	ErrUnexpectedResponseLength = errors.New("unexpected length of response")
	ErrInvalidContentType       = errors.New("invalid content type")
	ErrUnsupportedMessageType   = errors.New("unsupported message type")
	ErrUnsupportedContentType   = errors.New("unsupported content type")
)
Functions ¶
func EphemeralCache ¶ added in v0.1.14
func EphemeralCache() *llms.CacheControl
EphemeralCache creates a standard ephemeral cache control for Anthropic with 5-minute duration.
func EphemeralCacheOneHour ¶ added in v0.1.14
func EphemeralCacheOneHour() *llms.CacheControl
EphemeralCacheOneHour creates a 1-hour ephemeral cache control for Anthropic.
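A cache control returned by these constructors is attached to individual content parts. The sketch below assumes this package exposes a CacheControl field on llms.TextContent for that purpose; the exact field name and placement are assumptions, not confirmed by this page.

```go
// Mark a large, stable system prompt as cacheable so repeated calls
// can hit Anthropic's prompt cache instead of reprocessing it.
sysPart := llms.TextContent{
	Text:         largeSystemPrompt,
	CacheControl: anthropic.EphemeralCache(), // 5-minute TTL; use EphemeralCacheOneHour() for 1 hour
}
messages := []llms.MessageContent{
	{Role: llms.ChatMessageTypeSystem, Parts: []llms.ContentPart{sysPart}},
	llms.TextParts(llms.ChatMessageTypeHuman, "Summarize the policy above."),
}
resp, err := llm.GenerateContent(ctx, messages, anthropic.WithPromptCaching())
```

Cached prefixes must be byte-identical between calls, so this is most useful for prompts that do not change between requests.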
func MapError ¶ added in v0.1.14
func MapError(err error) error
MapError maps Anthropic-specific errors to standardized error codes.
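A typical use is normalizing provider errors at the call site so callers can match on standardized sentinels with errors.Is. The sentinel name llms.ErrRateLimit below is hypothetical; substitute whichever standardized codes this package defines.

```go
// Wrap a GenerateContent call and return a provider-agnostic error.
resp, err := llm.GenerateContent(ctx, messages)
if err != nil {
	mapped := anthropic.MapError(err)
	if errors.Is(mapped, llms.ErrRateLimit) { // hypothetical sentinel
		// back off and retry rather than failing outright
	}
	return nil, mapped
}
```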
func WithBetaHeader ¶ added in v0.1.14
func WithBetaHeader(header string) llms.CallOption
WithBetaHeader adds a custom beta header for accessing Anthropic's experimental features. This is useful for testing new features before dedicated support is added.
Usage:
llm.GenerateContent(ctx, messages,
    anthropic.WithBetaHeader("new-feature-2025-01-01"),
)
func WithExtendedOutput ¶ added in v0.1.14
func WithExtendedOutput() llms.CallOption
WithExtendedOutput enables 128K token output for Claude 3.7+. Standard models are limited to 8K tokens, but this beta feature allows generating much longer responses.
Usage:
llm.GenerateContent(ctx, messages,
    llms.WithMaxTokens(50000),
    anthropic.WithExtendedOutput(),
)
func WithInterleavedThinking ¶ added in v0.1.14
func WithInterleavedThinking() llms.CallOption
WithInterleavedThinking enables thinking between tool calls for Claude 3.7+. This allows the model to use reasoning tokens to plan tool usage and interpret results.
Usage:
llm.GenerateContent(ctx, messages,
    llms.WithTools(tools),
    llms.WithThinkingMode(llms.ThinkingModeMedium),
    anthropic.WithInterleavedThinking(),
)
func WithPromptCaching ¶ added in v0.1.14
func WithPromptCaching() llms.CallOption
WithPromptCaching enables Anthropic's prompt caching feature. This allows frequently-used prompts and system messages to be cached for improved performance and reduced costs.
Usage:
llm.GenerateContent(ctx, messages,
    anthropic.WithPromptCaching(),
)
Types ¶
type LLM ¶
func (*LLM) GenerateContent ¶ added in v0.1.4
func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error)
GenerateContent implements the Model interface.
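A minimal end-to-end sketch of the call. The import path follows the upstream langchaingo layout and is an assumption; adjust it to wherever this package actually lives.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/anthropic"
)

func main() {
	ctx := context.Background()

	// New reads the API key from the ANTHROPIC_API_KEY environment
	// variable; ErrMissingToken is returned if it is unset.
	llm, err := anthropic.New()
	if err != nil {
		log.Fatal(err)
	}

	messages := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeHuman, "Name three Go concurrency primitives."),
	}

	resp, err := llm.GenerateContent(ctx, messages, llms.WithMaxTokens(256))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Content)
}
```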
func (*LLM) SupportsReasoning ¶ added in v0.1.14
SupportsReasoning implements the ReasoningModel interface. Returns true if the current model supports extended thinking capabilities.
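When working against the generic llms.Model interface, reasoning support can be probed with a type assertion before enabling thinking-related options. The anonymous interface below avoids assuming the exact name of the ReasoningModel interface:

```go
// Only request interleaved thinking if the underlying model reports
// support for extended thinking.
opts := []llms.CallOption{llms.WithTools(tools)}
if rm, ok := model.(interface{ SupportsReasoning() bool }); ok && rm.SupportsReasoning() {
	opts = append(opts, anthropic.WithInterleavedThinking())
}
resp, err := model.GenerateContent(ctx, messages, opts...)
```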
type Option ¶
type Option func(*options)
func WithAnthropicBetaHeader ¶ added in v0.1.13
WithAnthropicBetaHeader adds the Anthropic Beta header to support extended options.
func WithBaseURL ¶ added in v0.1.6
WithBaseURL passes the Anthropic base URL to the client. If not set, the default base URL is used.
func WithHTTPClient ¶ added in v0.1.6
func WithHTTPClient(client anthropicclient.Doer) Option
WithHTTPClient allows setting a custom HTTP client. If not set, the default value is http.DefaultClient.
func WithLegacyTextCompletionsAPI ¶ added in v0.1.8
func WithLegacyTextCompletionsAPI() Option
WithLegacyTextCompletionsAPI enables the use of the legacy text completions API.
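The Option constructors above compose at construction time. A sketch, assuming the same package import path as before; the base URL and beta header values are illustrative only:

```go
// A plain *http.Client satisfies anthropicclient.Doer, so a custom
// timeout or transport can be injected this way.
httpClient := &http.Client{Timeout: 30 * time.Second}

llm, err := anthropic.New(
	anthropic.WithBaseURL("https://anthropic.example.internal/v1"), // e.g. a corporate proxy
	anthropic.WithHTTPClient(httpClient),
	anthropic.WithAnthropicBetaHeader("prompt-caching-2024-07-31"),
)
if err != nil {
	log.Fatal(err)
}
_ = llm
```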