LLM Token Counter

Estimate token counts and costs for GPT-4, Claude, Llama, and other LLM models in real time

Input Text

Live statistics for the pasted text: Characters, Words, Lines, and a GPT-4 token estimate (~4 chars/token).
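A minimal sketch of how these statistics could be computed; the function name, the word/line splitting rules, and the rounding are illustrative assumptions, not taken from the tool's source.

```python
def input_stats(text: str) -> dict:
    """Character, word, and line counts plus a rough GPT-4 token estimate.

    The ~4 chars/token ratio mirrors the heuristic quoted in the panel;
    it is an approximation, not an exact tokenizer.
    """
    return {
        "characters": len(text),
        "words": len(text.split()),
        "lines": text.count("\n") + 1 if text else 0,
        "gpt4_est_tokens": round(len(text) / 4),
    }

print(input_stats("Hello, world!\nSecond line."))
# {'characters': 26, 'words': 4, 'lines': 2, 'gpt4_est_tokens': 6}
```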
Model Comparison

Click a price to edit it. Prices are per 1M tokens in USD.

Columns: Model, Provider, Tokens, Context, Input $/1M, Output $/1M, Input cost, Output cost. Token counts and costs are computed from the current input and the editable per-1M prices.

Models compared:
GPT-4o             OpenAI
GPT-4 Turbo        OpenAI
GPT-3.5 Turbo      OpenAI
Claude 3.5 Sonnet  Anthropic
Claude 3 Opus      Anthropic
Claude 3 Haiku     Anthropic
Llama 3 70B        Meta
Llama 3 8B         Meta
Gemini 1.5 Pro     Google
Gemini 1.5 Flash   Google
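The Input cost and Output cost columns presumably follow from the token count and the per-1M price; a hedged sketch of that arithmetic, with placeholder numbers rather than the tool's default prices:

```python
def token_cost_usd(tokens: int, price_per_million_usd: float) -> float:
    """Cost in USD for a given token count at a per-1M-token price."""
    return tokens / 1_000_000 * price_per_million_usd

# Hypothetical figures: 12,000 input tokens at a placeholder $2.50 per 1M tokens.
print(f"${token_cost_usd(12_000, 2.50):.4f}")  # $0.0300
```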
Estimation Modes

GPT-style (BPE): byte-pair encoding approximation, ~4 chars/token. Used for OpenAI and Google models.
Claude-style: Anthropic tokenizer approximation, ~3.5 chars/token, with a slightly more efficient vocabulary.
Llama (SentencePiece): SentencePiece BPE approximation, ~3.8 chars/token. Used for Meta Llama models.
Whitespace: simple split on whitespace, counting words only. Useful as a lower-bound baseline.
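A sketch of how these four modes could be implemented as simple character-ratio heuristics; the divisors come from the descriptions above, while the function and mode names are assumptions for illustration.

```python
import math

# Approximate characters-per-token ratios quoted in the mode descriptions above.
CHARS_PER_TOKEN = {
    "gpt_bpe": 4.0,      # OpenAI and Google models
    "claude": 3.5,       # Anthropic models
    "llama_spm": 3.8,    # Meta Llama models
}

def estimate_tokens(text: str, mode: str) -> int:
    """Approximate token count under one of the estimation modes."""
    if mode == "whitespace":
        # Lower-bound baseline: one token per whitespace-separated word.
        return len(text.split())
    return math.ceil(len(text) / CHARS_PER_TOKEN[mode])

sample = "Estimate token counts and costs in real time."
for mode in ("gpt_bpe", "claude", "llama_spm", "whitespace"):
    print(f"{mode}: {estimate_tokens(sample, mode)}")
```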

Note: All counts are estimates. Exact token counts require the model's official tokenizer library (e.g., tiktoken for OpenAI).
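For OpenAI models specifically, an exact count can be obtained with the tiktoken library (pip install tiktoken); resolving the "gpt-4o" name to its encoding assumes a reasonably recent tiktoken release.

```python
import tiktoken

# Exact token count using OpenAI's official tokenizer.
enc = tiktoken.encoding_for_model("gpt-4o")  # raises KeyError on tiktoken versions that predate gpt-4o
tokens = enc.encode("Estimate token counts and costs in real time.")
print(len(tokens))
```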