I Tracked My Focus for 30 Days — Here's What I Learned
Building Focus Meter required me to be honest with myself about how I spend my day, and it turns out I wasn't. Not in any dramatic way — no hidden addiction to Reddit, no secret Netflix marathons. Just the usual knowledge-worker self-delusion: I thought I was producing more than I was, I thought my distractions were smaller than they were, and I thought my most productive hours were different from what they actually are.
This post covers the most interesting findings from 30 days of continuous, on-device tracking of my own Mac. It's the same setup you'd run if you installed the app today.
The setup
- Focus Meter running continuously on my main Mac.
- URL tracking enabled for Arc, Safari, and Chrome.
- Idle threshold at 3 minutes.
- My category definitions, which matter for the numbers below:
- Productive: editor (Cursor + Xcode), terminal, GitHub, docs, localhost, Figma.
- Neutral: email, calendar, Finder, password manager, system utilities.
- Distracting: Slack (yes), Twitter, HN, Reddit, YouTube (except technical talks).
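For concreteness, the category definitions above can be sketched as a simple lookup. This is not Focus Meter's actual rule format — just a hypothetical illustration of mapping an app name or URL domain to a category, with anything unmatched falling back to neutral:

```python
# Hypothetical category rules, roughly mirroring the list above.
# Focus Meter's real configuration format is not shown here.
CATEGORIES = {
    "productive": {"Cursor", "Xcode", "Terminal", "github.com", "localhost", "Figma"},
    "neutral": {"Mail", "Calendar", "Finder", "1Password"},
    "distracting": {"Slack", "twitter.com", "news.ycombinator.com", "reddit.com", "youtube.com"},
}

def categorize(app_or_domain: str) -> str:
    """Return the category for an app name or URL domain; unknown items default to neutral."""
    for category, members in CATEGORIES.items():
        if app_or_domain in members:
            return category
    return "neutral"
```

Defaulting unknown apps to neutral is a design choice worth noting: it's also why an uncategorized first week (see the setup notes at the end) skews the productive/distracting split toward the middle.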
30 days of data, covering March 10 to April 8, 2026. 22 working days. I didn't change my behavior during the period — the point was to baseline, not to improve.
Finding 1: I overestimated my coding time by about 2x
My self-estimate for active coding, before seeing data: "probably 5 hours a day."
The data: 2h 17m per working day average. Highest day: 4h 50m. Lowest: 12 minutes (a meetings-dominated Thursday).
The 12-minute day jumped out. It didn't feel like a 12-minute coding day. It felt like a normal busy day — 5 meetings, a handful of code reviews, a lot of Slack. But in terms of actually writing or reading code, 12 minutes was accurate.
The gap between estimate and reality comes from perceived effort compressing time. Coding is demanding, so an hour feels like two. Slack is easy, so 45 minutes feels like 15. Calibrating against the data resets this, at least temporarily. After two weeks of watching the data daily, my estimates got much closer to reality. But I suspect they'd drift back within a month of not looking.
Finding 2: Slack ate more than I thought, in smaller pieces than I thought
My self-estimate for Slack: "30-45 minutes a day."
The data: 1h 49m per working day average. And more interesting: the median session length in Slack was 2.4 minutes. I was in Slack for about 45 separate sessions per day.
That second number is the story. It's not that Slack was a long chunk of time — it's that Slack was constantly interrupting other chunks of time. Every Slack check was a context switch in and a context switch out, which is the expensive part. The 1h 49m of actual Slack time was the smaller cost; the 45 context switches were much bigger. (See the post on context switching cost for why.)
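The fragmentation numbers above come straight out of the session log. A minimal sketch of the computation, using invented session data (the real values would come from the tracker's event log):

```python
from statistics import median

# Hypothetical Slack sessions for one day, as (start, end) minute offsets.
# These values are made up for illustration.
slack_sessions = [(540, 543), (551, 552), (560, 565), (590, 592), (601, 610)]

durations = [end - start for start, end in slack_sessions]

session_count = len(durations)      # how many separate Slack visits
total_minutes = sum(durations)      # total time in Slack
median_minutes = median(durations)  # typical session length

print(session_count, total_minutes, median_minutes)  # -> 5 20 3
```

The median, not the mean, is the revealing statistic here: a couple of long Slack calls can inflate the mean, while the median shows what a typical check actually looks like — a couple of minutes, dozens of times a day.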
This is the finding I'd most expect to be universal. If you install any tracker and look at Slack, I'd bet on you finding roughly the same pattern — under-estimated total time, plus a lot of fragmenting short sessions.
Finding 3: My "most productive hours" weren't where I thought
Before tracking, if you'd asked me, I would have said: "I do my best work from 9am to noon."
The data, after 22 days:
- 9-11am: actually quite fragmented. Lots of Slack catch-up, some email, a pile of smaller tasks. Medium Focus Score contribution.
- 11am-1pm: where most of my peak focus sessions actually happened. Longer sessions, higher switch-free time.
- 1-3pm: post-lunch dip is real. Lower focus, more distraction-tab-switching.
- 3-6pm: a second, smaller peak. Shorter than the 11-1 peak but relatively clean.
My self-image of "morning person" was wrong — or, more charitably, directionally right but specifically wrong. I'm more like a "late morning" person with a "mid-afternoon" second wind. Early morning is lost to comms.
The practical fix, which I've since adopted: I now decline anything before 11am except cross-timezone meetings. That's a small calendar change that cost me almost nothing and measurably added about 45 minutes of deep work to my typical day.
Finding 4: Distracted days had a specific signature
Looking at the five worst-scoring days of the month (Focus Score in the 30s) vs. the five best (scores in the 70s), a clear pattern emerged:
Worst days: high switch count per hour (25-35); short average session length (under 15 minutes); Slack/email above 50% of tracked time; at least one 30+ minute YouTube or HN session.
Best days: lower switch count (10-18); longer average session length (35+ minutes); more than one 45-minute-plus deep work block; Slack/email under 25%.
None of those individually was shocking. What was interesting was that the best days didn't involve working more hours. Total tracked time was roughly the same on good and bad days (6-8 hours of active Mac use). The difference was entirely in the structure of that time. The same hours, reorganized, produced dramatically different outputs.
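The signature metrics in that comparison are all derivable from one day's session log. A hedged sketch, using invented sessions and a simplified definition that counts each new session as one switch (the real Focus Score computation isn't shown here):

```python
# Apps treated as distracting for this illustration only.
DISTRACTING = {"Slack", "Mail"}

def day_signature(sessions):
    """Summarize one day of (app, start_minute, end_minute) sessions."""
    durations = [(app, end - start) for app, start, end in sessions]
    tracked = sum(d for _, d in durations)
    switches_per_hour = len(sessions) / (tracked / 60)  # each new session counted as a switch
    avg_session = tracked / len(sessions)
    distracting_share = sum(d for app, d in durations if app in DISTRACTING) / tracked
    deep_blocks = sum(1 for app, d in durations if app not in DISTRACTING and d >= 45)
    return switches_per_hour, avg_session, distracting_share, deep_blocks

# Invented example day: two long Cursor blocks split by short comms checks.
day = [("Cursor", 0, 60), ("Slack", 60, 65), ("Cursor", 65, 120), ("Mail", 120, 130)]
print(day_signature(day))
```

Note that total tracked time appears only in the denominators — exactly the point above: the metrics that separate good days from bad ones are all about structure, not volume.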
This is the main reason I think focus tracking beats time tracking for knowledge workers. Total time isn't really the lever. Structure is.
What I changed after 30 days
Three experiments, based on the data:
1. No Slack before 11am. Directly addresses finding 3. Single biggest win.
2. Notifications off for everything except calendar and phone. Addresses finding 2 and the context-switching signature from finding 4. Dropped my Slack session count from ~45/day to ~12/day without reducing actual time spent on the platform.
3. One "meetings-free morning" per week. Blocked Wednesday 9am-noon. I was skeptical this would feel different from a regular day. It does — those mornings are reliably 4+ hour deep work blocks, well above my normal ceiling.
All three are boring advice. The difference is I wasn't sure they'd matter for me specifically until I saw the data that said they would.
What I'd change about the setup
Three things I'd tune if starting over:
- Categorize faster. I was slow to tag my top 30 apps. The first week of data was noisier than it needed to be because half my usage was in "Uncategorized."
- Enable URL tracking on day one. I turned it on on day 4. Before that, Chrome was one undifferentiated blob, which made the productive/distracting split wildly wrong for a browser-heavy job.
- Don't look daily for the first week. I checked the score every evening for the first 7 days. It doesn't mean much until you have a week of stable data. I'd recommend a day-one setup, then look at the dashboard the following Monday with a week of data in hand.
What 30 days of tracking actually accomplishes
The data isn't the point. The data is the permission to take specific interventions seriously. "Turn off notifications" sounds like generic productivity advice; "turn off notifications because my Slack session count is 45/day and dropping it to 12 adds 60 minutes of daily deep work" is specific and therefore actually works.
Without the measurement, you're taking someone else's general advice on faith. With it, you're running a small experiment on yourself with a clear signal to measure against. That's the part worth the $19.
