The Timesheet Problem
Time tracking in engineering teams usually means one of two things: manually logging hours against tasks at the end of each day, or running a timer while working.
Both approaches share a fatal flaw: they measure input, not output. Knowing that an engineer spent 6 hours on a task tells you nothing about whether the task is 50% done or 95% done, whether the approach is correct, or whether the remaining work will take 1 hour or 10 hours.
Worse, manual time tracking is consistently inaccurate: people can't reliably recall how they spent their time even a few hours later, so end-of-day timesheet entries are fiction — well-intentioned fiction, but fiction nonetheless.
What You Actually Want to Know
When managers ask for time tracking, they usually want answers to one of these questions:
- “Are we on track for the deadline?” — This is answered by progress tracking (how much work is done vs. remaining), not time tracking.
- “How much does this project cost?” — This is answered by team allocation (which people are assigned to which project), not hour-by-hour logging.
- “Why did this take so long?” — This is answered by lead time analysis and blocker identification, not timesheets.
- “Is anyone overloaded?” — This is answered by WIP tracking and assignment distribution, not billable hours.
None of these questions require individual hour logging. They all have better answers through work-level metrics.
Lead Time: The Better Metric
Lead time — the elapsed time from when a task starts to when it’s completed — captures everything that matters about delivery speed. It includes active work time, waiting time, review time, and blocked time. It’s objective (based on status transitions, not self-reporting). And it’s automatically captured when the team uses the board correctly.
A team with an average lead time of 3 days for standard tasks and 8 days for complex tasks has useful predictive data. When a stakeholder asks “when will this ship?”, you can answer probabilistically: “based on our lead time distribution, there’s an 85% chance it’s done within 5 days.”
This is more honest and more useful than “the engineer estimates 16 hours, and they have 6 hours of availability per day, so… maybe Thursday?”
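The probabilistic answer above can be sketched as a nearest-rank percentile over historical lead times. The sample data below is illustrative, not real FlowEra output:

```python
import math

def lead_time_percentile(lead_times_days, percentile):
    """Lead time (in days) within which `percentile`% of past tasks finished."""
    ordered = sorted(lead_times_days)
    # Nearest-rank method: take the k-th smallest value.
    k = math.ceil(percentile / 100 * len(ordered))
    return ordered[k - 1]

# Historical lead times for completed tasks, in days (illustrative).
history = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5, 5, 6, 7, 8, 12]

# "Based on our lead time distribution, there's an 85% chance
# it's done within N days."
print(lead_time_percentile(history, 85))  # → 7
```

The forecast comes from what the team actually shipped, so it self-corrects as the distribution shifts — no estimation meeting required.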
Cycle Time for Process Improvement
Cycle time — the time from when active work starts to completion, excluding backlog wait time — is the metric for process improvement. If your average cycle time is 4 days but tasks spend 2 of those days waiting for code review, you’ve identified a specific bottleneck worth addressing.
FlowEra calculates cycle time from the first status transition into an “in progress” category to the first transition into a “done” category. This is automatic — no manual time logging required. The team just uses the board normally, and the metrics emerge from the data.
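The calculation described above — first transition into "in progress" to first transition into "done" — reduces to a scan over the task's status-transition log. The record shape and status-category names here are assumptions for illustration, not FlowEra's actual data model:

```python
from datetime import datetime

# Hypothetical status-transition log for one task:
# (timestamp, status category the task entered).
transitions = [
    (datetime(2024, 3, 1, 9, 0),  "backlog"),
    (datetime(2024, 3, 4, 10, 0), "in_progress"),
    (datetime(2024, 3, 5, 14, 0), "review"),
    (datetime(2024, 3, 8, 16, 0), "done"),
]

def cycle_time(transitions):
    """Elapsed time from the first 'in_progress' entry to the first 'done' entry."""
    start = next(ts for ts, category in transitions if category == "in_progress")
    end = next(ts for ts, category in transitions if category == "done")
    return end - start

print(cycle_time(transitions))  # → 4 days, 6:00:00
```

Because the backlog entry is ignored, wait time before work started doesn't inflate the number — which is exactly what makes cycle time useful for spotting in-process bottlenecks like the code-review wait.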
When Time Tracking Is Legitimate
There are valid reasons to track time in engineering:
Client billing. If you’re billing clients by the hour, you need timesheets. This is a business requirement, not a productivity tool. Accept that the data is approximate and build a buffer into your billing.
Compliance. Some industries require time logging for regulatory reasons. Again, this is a compliance requirement, not a management tool.
Personal productivity. Some individual engineers find that tracking their own time helps them understand where their day goes. This is a personal choice and should remain voluntary.
For all other purposes — delivery prediction, process improvement, workload management — task-level metrics (lead time, cycle time, throughput, WIP) are more accurate and require zero manual effort.
FlowEra’s Approach
FlowEra focuses on automatic task-level metrics rather than manual time logging:
- Lead time distribution per flow and iteration
- Cycle time calculated from status transitions
- Throughput — tasks completed per time period
- Burndown — remaining work vs. time in iteration
These metrics are generated automatically from the team’s normal workflow. No timers to start, no timesheets to fill, no end-of-week compliance nagging. The board is the input. The analytics are the output.
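As a sketch of how one of these metrics falls out of normal board usage, throughput is just a count of completion dates grouped by period. The completion dates below are hypothetical:

```python
from collections import Counter
from datetime import date

# Hypothetical completion dates pulled from status-transition data.
completed = [
    date(2024, 3, 4), date(2024, 3, 5), date(2024, 3, 5),
    date(2024, 3, 12), date(2024, 3, 13), date(2024, 3, 14),
    date(2024, 3, 14),
]

def weekly_throughput(completion_dates):
    """Tasks completed per ISO week number."""
    weeks = Counter(d.isocalendar()[1] for d in completion_dates)
    return dict(sorted(weeks.items()))

print(weekly_throughput(completed))  # → {10: 3, 11: 4}
```

No one logged anything to produce this; moving cards to "done" was enough.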