Time travel debugging with Claude Code’s conversation history


Author(s): Vikas Tiwari

Originally published on Towards AI.

A few weeks ago, I was working on a legacy project with over 100 microservices. I hit a bug in one particular backend service that looked really familiar, but I couldn’t remember where or how I had fixed it before. Then I realized Claude Code would remember the conversation — but I couldn’t remember which session to look in.

Source: Image by author

I started wondering where all those conversations are stored, and after some reading discovered that Claude Code keeps everything as JSONL files under ~/.claude/projects/(folder-name)/(uuid).jsonl.

For example, for one of my apps at ~/ws/nq/news-bulletin/app, the conversation history is stored at ~/.claude/projects/-Users-vikas-t-ws-nq-news-bulletin-app/bf8ecb66-fc60-4187-9c92-cded3ea68f58.jsonl. The folder name is the absolute project path with the slashes replaced by dashes, and each session gets its own UUID-named file.
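If you want to jump straight to the right folder, you can derive its name from the project path. A minimal sketch, assuming the mapping is simply '/' (and '.') replaced with '-', which matches what I see on my machine; claude_dir is a hypothetical helper name:

```shell
# Hypothetical helper: derive the ~/.claude/projects folder for a project path.
# Assumption: Claude Code replaces '/' and '.' in the absolute path with '-'.
claude_dir() {
  printf '%s/.claude/projects/%s\n' "$HOME" "$(printf '%s' "$1" | tr '/.' '--')"
}

claude_dir /Users/vikas-t/ws/nq/news-bulletin/app
```

From there, ls -t *.jsonl shows the most recent session first.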

what’s really there

Each file is JSONL (JSON Lines): one JSON object per line, and each line is a turn in the conversation.

{
  "type": "assistant",
  "uuid": "abc-123-def",
  "timestamp": "2026-01-14T04:51:53.996Z",
  "sessionId": "fe82a754-0606-4bcf-b79a-f7b6f2a72bc8",
  "message": {
    "id": "msg_01ABC123",
    "type": "message",
    "role": "assistant",
    "model": "claude-opus-4-5-20251101",
    "content": [
      {
        "type": "tool_use",
        "id": "toolu_xyz",
        "name": "Read",
        "input": {"file_path": "/path/to/file.py"}
      }
    ],
    "stop_reason": "tool_use",
    "usage": {
      "input_tokens": 1234,
      "output_tokens": 567,
      "cache_creation_input_tokens": 5000,
      "cache_read_input_tokens": 0
    }
  }
}

Your prompts are in there along with Claude’s responses, every tool call it made (Read, Write, Edit, Bash), token counts for each turn, which model answered each message (Opus 4.5, Sonnet 4.5), and basically everything that happened during that session.
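For a quick sense of what a session contains before reaching for jq, a rough tally of turn types works. A sketch, assuming the compact one-object-per-line format (no space after the colon) that the files are written in; note it counts every "type" field it sees, including nested ones like the inner "message":

```shell
# Quick-and-dirty tally of "type" fields in a session file (no jq needed).
tally() { grep -o '"type":"[a-z_]*"' "$1" | sort | uniq -c | sort -rn; }

# Demo against a tiny sample file:
printf '%s\n' '{"type":"user"}' '{"type":"assistant"}' '{"type":"assistant"}' \
  > /tmp/sample.jsonl
tally /tmp/sample.jsonl
```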

what can you actually do with it

time travel debugging
You shipped a feature 2 weeks ago, and now there’s a bug. You remember that Claude mentioned an edge-case workaround, but you don’t remember what it was.

grep -A5 -B5 "edge case" ~/.claude/projects/my-feature/*.jsonl

And there it is.

Recover lost work

The terminal crashed in the middle of a session. You didn’t commit. Normally you’d be out of luck. Not with the JSONL. Extract all Write/Edit tool calls and copy the code back:

cat crashed-session.jsonl | jq -r '.message.content[]? | select(.name == "Write" or .name == "Edit") | .input'

find that one command

You ran a complex Bash pipeline through Claude 3 months ago. It worked perfectly. What was it?

cat old-session.jsonl | jq -r '.message.content[]? | select(.name == "Bash") | .input.command' | grep "docker"

token cost forensics

You’ve hit your weekly rate limit and want to know which session caused it. Find out which session was the greediest:

for f in ~/.claude/projects/*/*.jsonl; do
  echo "$f: $(jq '.message.usage.input_tokens // 0' "$f" | paste -s -d+ - | bc) tokens"
done | sort -t: -k2 -n

That 50K-line refactoring session shows up right at the bottom.
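If jq isn’t installed, an awk version of the same sum works too. A sketch, assuming the compact format with no space after the colon (the pretty-printed example earlier is just for display; adjust the field separator if yours differ):

```shell
# jq-free: sum input_tokens across one session file with awk.
sum_tokens() {
  awk -F'"input_tokens":' 'NF>1 {split($2, a, /[,}]/); s += a[1]} END {print s+0}' "$1"
}

# Demo on a two-line sample:
printf '%s\n' '{"message":{"usage":{"input_tokens":100}}}' \
              '{"message":{"usage":{"input_tokens":200}}}' > /tmp/tokens.jsonl
sum_tokens /tmp/tokens.jsonl
# prints 300
```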

Learning what really worked

You tried 3 different approaches to fix a display issue. Two failed. Which one worked, and why? Read the session. See where Claude changed strategy, what failed, and why it failed. Better than Stack Overflow, because it’s your exact crisis.

code review audit trail

When someone asks why you implemented something a certain way, pull up the session and show your team the actual conversation where Claude explained the tradeoffs, the alternatives considered, and the reasoning behind the decision. You get the entire thought process, not just a “because Claude said so” answer.

extract documentation

Claude explained something complex really well during one session. Save that explanation:

cat session.jsonl | jq -r '.message.content[]? | select(.type == "text") | .text' > explanation.md

Now this is documentation.

Files expire after one month by default

Claude Code has a default setting that automatically clears local conversation history after 30 days. It’s like losing your entire coding memory. You can change this behavior with the cleanupPeriodDays setting in your settings.json. To apply it globally for your user, create or edit ~/.claude/settings.json.

If you ask me, keep it forever; it doesn’t take up much disk space. On my machine, it’s a few hundred MB.

To effectively disable auto-delete, you can set the cleanup period to a very large number. For example, add the following to your ~/.claude/settings.json file:

{
"cleanupPeriodDays": 99999
}

Another solution would be to simply take a backup.
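A minimal sketch of that backup: a dated tarball of the whole projects directory. Shown here on a stand-in directory so it’s easy to try; point src at "$HOME/.claude/projects" for the real thing:

```shell
# Dated tarball backup of a conversation-history directory.
src=/tmp/claude-demo/projects            # stand-in; use "$HOME/.claude/projects"
mkdir -p "$src" && echo '{}' > "$src/session.jsonl"

backup="/tmp/claude-history-$(date +%Y%m%d).tar.gz"
tar czf "$backup" -C "$(dirname "$src")" "$(basename "$src")"
ls -lh "$backup"
```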

things to keep in mind

These files get large during long debugging sessions because there’s a lot of back-and-forth, lots of references being cached and re-read, which is completely normal for complex work.
They contain everything you exposed to Claude during the session: your code, any secrets you pasted for debugging, file contents, command output, and more. Don’t push them to GitHub unless you’ve carefully checked what’s in there.

how i actually use it

I have a bash function in my .zshrc:

ccsearch() {
  grep -r "$1" ~/.claude/projects/*/
}

So ccsearch "authentication fix" searches all my sessions ever. I’ve found solutions I had completely forgotten about.
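A companion helper I sometimes want: list only the files (i.e. the sessions) that contain a match, so I can open them one by one. The second argument is an optional search root, added here purely so the sketch is easy to try on a throwaway directory; ccfiles is a hypothetical name:

```shell
# List just the session files containing a match.
ccfiles() { grep -rl "$1" "${2:-$HOME/.claude/projects}" 2>/dev/null; }

# Demo on a throwaway directory:
mkdir -p /tmp/ccdemo && echo 'authentication fix applied' > /tmp/ccdemo/a.jsonl
ccfiles "authentication fix" /tmp/ccdemo
```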
Most people never look at them. But if you’re debugging something tricky or trying to understand what exactly happened in a session, they’re useful.

So, back to our original story. I searched for the problem in the JSONL files, and it returned thousands of lines of text. It took me some time to narrow down what I wanted. Turns out I had been trying the correct set of commands all along — there was just a typo in one of them. 😀

Follow for more details!
