How a UX Writer Reclaimed 11 Hours a Week Using AI and the Friction Ladder

A case study of one knowledge worker's distraction audit, Friction Ladder implementation, and the surprising patterns her AI check-ins revealed over six weeks.

Lena Vasquez had tried three site blockers, a browser extension that replaced social media feeds with task lists, and a no-phone-before-noon rule. None held past three weeks.

The problem, she concluded, was not discipline. She was a disciplined person — she delivered work on time, managed complex projects, and ran five days a week. The problem was something about how the systems she was using were designed.

When she decided to document a six-week experiment with a different approach, she kept careful notes. What follows is a reconstructed account of that experiment, drawn from her logs.


Baseline: What the Audit Actually Found

Lena works as a UX writer at a mid-sized software company. Most of her work involves writing interface copy, user onboarding flows, and error messages — work that requires sustained, precise language decisions and is easily disrupted by cognitive interruption.

She had a clear intuition about her worst distractions: Instagram and Twitter. She was wrong.

For the three-day baseline audit, she logged every attention break she took that was not scheduled or task-relevant. At the end of day three, she ran her log through Claude with a structured analysis prompt.

The output ranked her distraction categories by total estimated time cost (including recovery windows). The top three:

  1. Excessive Slack monitoring — checking channels and DMs proactively, without incoming notifications, roughly every eight to ten minutes: ~3.5 hours per week estimated
  2. Semi-work browsing — reading articles about UX, design, content strategy, technology: ~2.5 hours per week
  3. Instagram — the category she had assumed was the problem: ~1.5 hours per week

Slack and semi-work browsing together accounted for more than three times the distraction cost of Instagram. Neither had appeared in her mental model of the problem.
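A log in this shape can be pre-aggregated before (or instead of) pasting it into an AI assistant. Here is a minimal sketch of that ranking step, assuming a hypothetical log with `category`, `minutes`, and `recovery_minutes` fields; the field names and sample rows are illustrative, not Lena's actual format:

```python
from collections import defaultdict

def rank_distractions(rows):
    """Sum direct time plus estimated recovery time per category,
    then rank categories by total cost, highest first."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["category"]] += float(row["minutes"]) + float(row["recovery_minutes"])
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative entries from a few hours of logging (minutes per event).
sample = [
    {"category": "slack", "minutes": "4", "recovery_minutes": "10"},
    {"category": "slack", "minutes": "3", "recovery_minutes": "10"},
    {"category": "semi-work browsing", "minutes": "12", "recovery_minutes": "8"},
    {"category": "instagram", "minutes": "6", "recovery_minutes": "5"},
]

for category, minutes in rank_distractions(sample):
    print(f"{category}: {minutes:.0f} min")
```

The point of including recovery time in the sum is the same one the audit made: a category with many short checks can outrank a category with fewer, longer sessions once refocusing cost is counted.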

The audit also flagged a pattern she had not noticed: her self-interruptions clustered heavily in the late morning, between 10am and noon, and in the mid-afternoon around 3pm. The trigger pattern was consistent: both windows corresponded to cognitively demanding copywriting phases, not shallow or transitional work.

The implication was uncomfortable. Her highest-cost distractions were not recreational escapism. They were avoidance behaviors triggered by difficult work — and dressed up as productivity in the case of semi-work browsing.


Version 1: The Wrong Fix

Lena’s first response to the audit was to address the obvious items. She installed a site blocker for semi-work browsing sites, turned off Slack notifications, and moved Instagram to a nested folder on her phone.

Week one results: measurable improvement. She estimated recovering about four hours compared to her baseline week. The Slack impulse diminished somewhat with notifications off. Semi-work browsing dropped.

Week two: the system started to drift. The Slack behavior did not need notifications to persist — she was checking manually, the same as before, just without the notification as the trigger. The semi-work browsing block was circumvented by reading the same content in her RSS reader, which she had not blocked. Instagram dropped to near zero, but she noticed a new pattern: LinkedIn browsing, which she had never classified as a distraction because it felt professional, was filling roughly the same time window.

By week three, she had a full set of blocking tools running and roughly the same distraction load she had started with.

She described this phase later as “treating the outlet as if it were the source.”


Redesign: Addressing the Trigger

The breakthrough came from a check-in prompt at the end of week three.

I'm reviewing my Friction Ladder results. My biggest override pattern is Slack checking — I check it manually every 8-10 minutes even with notifications off. The pattern is strongest between 10am and noon and 3pm-4pm. Both windows are when I'm doing hard copywriting work. What would you suggest addressing?

The response reframed the problem. The Slack checking was not a Slack problem. It was a task-difficulty avoidance behavior that happened to express itself through Slack. The intervention needed to be at the task level, not the platform level.

Three specific suggestions came back:

  1. Before each copywriting session, write a three-sentence brief defining the specific output and the “done” criterion. Ambiguity about what to produce is one of the strongest task-aversion triggers.

  2. Use a “parking lot” — a running document where you capture the half-formed ideas and tangential questions that arise mid-session, removing the need to act on them immediately.

  3. Schedule a Slack processing window at 10am and 2pm — fixed times when checking is the designated task, not a distraction from another task.

Lena implemented all three. The result was not that the urge to check Slack disappeared. It was that when the urge arose mid-session, there was a direct response available: write the partial thought in the parking lot, continue the task. The urge did not need to be suppressed; it needed to be routed.


The Friction Ladder at Full Implementation

By week four, Lena’s Friction Ladder looked like this:

  • Slack: Rung 3 — app on phone logged out after each use; desktop app set to check-only mode during defined 10am and 2pm windows via scheduled Do Not Disturb; other times checked only on break
  • Semi-work browsing: Rung 3 — RSS reader moved off toolbar, login-gated; blocked domain list expanded to include RSS reader during deep work blocks
  • Instagram: Rung 3 — app deleted, accessible only via mobile Safari (login required each session)
  • LinkedIn: Rung 2 — removed from browser toolbar, added to blocked list during morning deep work block
  • News sites: Rung 2 — bookmarks removed, sites accessible but not quick-access

She also implemented two environmental changes that came from the AI check-in process: phone moved to a separate room during deep work blocks (consistent with Adrian Ward’s finding on mere-presence costs), and Slack desktop closed between the scheduled processing windows rather than minimized.


Weeks Five and Six: Stable State

By week five, the system had reached what Lena described as “the point where I stopped thinking about it every day.” The friction was in place; the task-structure interventions were habits; the scheduled Slack windows handled legitimate communication needs without opening a compulsive monitoring loop.

Her week-six time estimate, based on her continued logging: roughly 11 hours per week recovered relative to her baseline. The breakdown:

  • Slack monitoring reduced from ~3.5 hours to ~45 minutes (scheduled windows only): ~2.75 hours recovered
  • Semi-work browsing reduced from ~2.5 hours to ~30 minutes: ~2 hours recovered
  • Instagram effectively eliminated during work hours: ~1.5 hours recovered
  • LinkedIn and news sites reduced substantially: ~1.5 hours recovered
  • Secondary effect: fewer recovery windows needed due to fewer interruptions: ~3 additional hours of more effective work

The 11-hour figure includes both direct distraction time and the recovery windows that previously followed each interruption. The Gloria Mark figure of roughly 23 minutes average recovery per significant interruption makes these secondary gains substantial.
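The breakdown above can be checked directly. A quick sketch, using the article's own per-category estimates (all figures in hours per week):

```python
# Per-category weekly recovery estimates from the week-six breakdown.
recovered = {
    "slack monitoring": 3.5 - 0.75,   # ~3.5h down to ~45 min of scheduled windows
    "semi-work browsing": 2.5 - 0.5,  # ~2.5h down to ~30 min
    "instagram": 1.5,                 # effectively eliminated during work hours
    "linkedin and news": 1.5,         # reduced substantially
    "fewer recovery windows": 3.0,    # secondary effect of fewer interruptions
}

total = sum(recovered.values())
print(f"{total:.2f} hours/week")  # 10.75, reported as "roughly 11"
```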


What the Six Weeks Revealed

Three things surprised Lena about the process.

The category misdiagnosis. Her mental model of her distraction problem was built around recreational social media. The audit showed the real problem was semi-work avoidance and compulsive monitoring behaviors she did not classify as distractions. AI pattern analysis of actual log data, rather than intuition, was the only way to surface this.

The trigger was the problem, not the platform. Increasing friction on Slack without addressing the task-aversion trigger produced limited and temporary results. Once the trigger was diagnosed and addressed at the task level, the platform-level friction became sustainable. The two interventions are not independent — they work together.

Overrides were diagnostic, not failures. Every time she overrode her friction system, the check-in prompt treated it as information rather than backsliding. This changed how she responded to override events: instead of self-criticism followed by renewed resolve, she asked what the trigger had been and whether the system needed adjustment. The system became self-correcting because overrides fed back into calibration.

She continues using Beyond Time’s weekly review workflow for her Friday check-ins, which include a distraction audit component alongside her weekly planning session — so the recalibration is built into an existing ritual rather than requiring a separate commitment. You can see how that workflow is structured at beyondtime.ai.


Lessons for Your Own Experiment

Lena’s experience is specific to her work type and distraction profile. Your pattern will differ. But three transferable principles emerged from the six weeks.

First, audit before you intervene. Your intuition about your worst distractions is probably wrong in important ways. Three days of logged data is more reliable than years of self-observation.

Second, classify triggers before assigning friction. External triggers respond to access management. Internal triggers require behavioral responses at the task level. Distinguishing them early prevents Version 1’s failure mode.

Third, treat overrides as calibration data. The check-in that asks “what was the trigger for this override?” will teach you more about your distraction system than any amount of renewed determination to follow the rules.


Run your own three-day distraction audit this week, logging trigger type alongside platform and duration — then paste the log into Claude and ask it to identify your highest-pull category before you decide anything else.
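If you want a lightweight way to capture that log, here is a sketch of an append-only logger; the file name and field names are illustrative, and `trigger` is whatever external/internal classification you choose:

```python
import csv
from datetime import datetime
from pathlib import Path

LOG = Path("distraction_log.csv")
FIELDS = ["timestamp", "platform", "trigger", "minutes"]

def log_break(platform: str, trigger: str, minutes: float) -> None:
    """Append one attention break; trigger is 'external' or 'internal'."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="minutes"),
            "platform": platform,
            "trigger": trigger,
            "minutes": minutes,
        })

log_break("slack", "internal", 4)
```

Three days of rows in this format is enough for the kind of category ranking and trigger-pattern analysis described above.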


Tags: distraction elimination case study, Friction Ladder, UX writer productivity, AI distraction management, attention reclaimed

Frequently Asked Questions

  • How much time can someone realistically recover using a friction-based distraction system?

    It varies significantly by baseline distraction load. In this case study, the subject estimated recovering 11 hours per week, a figure that includes both direct distraction time and estimated recovery windows. Recoveries in the 3–6 hour range are more typical for knowledge workers with moderate distraction patterns.
  • How long did it take before the system showed results?

    The initial audit took about three days. Measurable change in distraction frequency appeared in the first week. The deeper behavioral shifts — particularly around internal trigger management — took three to four weeks to stabilize.
  • What was the most surprising finding from the AI check-ins?

    The biggest surprise was that the subject's most costly distraction was not social media — which she had assumed — but a mix of excessive Slack monitoring and semi-work browsing that she had not classified as 'distraction' at all. The audit reframed what counted as a distraction.