PulseAugur

Developer uses over 4 billion tokens with Claude Code and Codex in two months

A developer documented two months of heavy use of Anthropic's Claude Code and OpenAI's Codex, consuming over 4 billion tokens in total. April alone accounted for 3.77 billion tokens, a surge driven largely by parallel agent execution and the adoption of newer models such as Opus-4-7 and GPT-5.5. The analysis showed both tools leaning heavily on caching, with Claude Code relying primarily on cache reads and Codex on cached inputs, which kept costs manageable despite the higher per-token prices of the newer models.
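The cost effect of caching mentioned above can be sketched with placeholder numbers. Everything below is an assumption for illustration: the per-million-token prices, the 90% cache-hit split, and the tenth-of-list-price cached rate are invented, not the article's or the vendors' actual figures.

```python
# Hedged sketch: how cache-aware pricing changes effective token cost.
# All prices and ratios here are illustrative placeholders.

def blended_cost(tokens_uncached, tokens_cached,
                 price_uncached_per_m, price_cached_per_m):
    """Total cost in dollars, given per-million-token prices."""
    return (tokens_uncached * price_uncached_per_m
            + tokens_cached * price_cached_per_m) / 1_000_000

# Assumed: 3.77B tokens in a month, 90% served from cache,
# cached reads priced at a tenth of uncached input.
total = 3_770_000_000
cached = int(total * 0.9)
uncached = total - cached

with_cache = blended_cost(uncached, cached,
                          price_uncached_per_m=3.00,
                          price_cached_per_m=0.30)
no_cache = blended_cost(total, 0, 3.00, 0.30)
print(f"with cache: ${with_cache:,.2f}  without: ${no_cache:,.2f}")
# with cache: $2,148.90  without: $11,310.00
```

Under these assumed numbers the cached-heavy workload costs roughly a fifth of the uncached one, which is the shape of the effect the article describes.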

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Highlights the practical implications of large-scale AI tool usage and the cost-efficiency of caching mechanisms for developers.

RANK_REASON The article details a user's experience and usage patterns with existing AI coding tools, rather than a new release or significant industry event.

Read on dev.to — Anthropic tag →

COVERAGE [1]

  1. dev.to — Anthropic tag TIER_1 · naoki_JPN

    I used Claude Code + Codex for 2 months and hit 3.77 billion tokens in a single month

    > **Note:** Information in this article is current as of May 2026.

    I've been using both Claude Code and OpenAI Codex for personal development for two months. I wanted to get a clearer picture of my actual usage, so I tracked token c…