Apr 17, 2014 | Updated Apr 25, 2021

Holistically Measure Projects in Just One Page: Part 2

In my last post, I wrote about narrowing my focus as I sought to achieve my goal of increasing project transparency at Blink. I wanted each person to know where we “ended up” on a project, a goal widely shared among both large and small consulting firms.

In aiming to make a digestible but comprehensive summary, I knew I needed it to be visually clean yet detailed. I wanted to use the functionality of Excel, but I didn’t want it to look like a spreadsheet when it was printed. I wanted that supporting data to be easy to trace – so other people could figure out where all the information came from on each project – but not overwhelming to the casual reader.

The Summary Sheet needed to focus on the bigger picture, not “just” the financials. With that in mind, I began with the high-level learnings: What went well, and what could go better? In the template, I created a bullet point list that would be filled in during a wider team discussion.

Below those high-level takeaways, I put a mad-libs-style sentence where we tied the project's outcomes to the Blink vision: how did we change the world, and for whom?

After the wrap-up, I created a “Project Stats” section, where I really let myself nerd out. I thought about the metrics that would matter to our delivery teams and focused on duration and scope. How long did this project take? How long did we think it would take? I listed the deliverables according to the SOW and made an area where the project manager (who would fill in the sheet for each of her recently wrapped projects) could record what we actually delivered.

Budgets are nothing more than a plan – sometimes a better-conceptualized plan than other times. My goal was to look at what we thought would happen and what actually happened, and to make it easy to compare, understand, and explain any variances. I also included some Blink-specific and CFO-favorite metrics, like Rate per Hour. Our company has a target Rate per Hour, and while it is a metric that doesn’t always resonate with, say, a researcher, it really makes sense to Karen, our CEO, or Lauren on our Biz Dev team.

Then there’s a metric that our CFO doesn’t really care about, but our Biz Dev & PM teams are really excited by: the “Next Time SOW” amount. If we did this project again, what would we charge? What’s realistic? If we knew a certain rate per hour was appropriate (usually the same rate we planned the original project to hit), we knew the hours the project would take (because we just DID the project), and we now knew our expenses (hindsight is 20/20 and all!), then we had a gut check of what this project “should” cost on our next scope of work. Maybe it’s too high for our client to pay. Maybe we have learned efficiencies so we can bring that number down. It’s all good – let’s write it down and talk about it. This number gives us a place to begin that conversation.
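For readers who like to see the arithmetic spelled out, here is a minimal sketch of the two metrics just described. The function names and the numbers are purely illustrative assumptions, not Blink’s actual figures or formulas:

```python
# Hypothetical sketch of two wrap-up metrics; all names and numbers are
# illustrative, not actual Blink figures.

def rate_per_hour(revenue, hours):
    """Effective rate the project earned: total revenue / total hours."""
    return revenue / hours

def next_time_sow(actual_hours, target_rate, actual_expenses):
    """A 'Next Time SOW' gut check: the hours the project actually took,
    priced at the target rate, plus the expenses we now know it incurs."""
    return actual_hours * target_rate + actual_expenses

# Example: a project that billed $50,000 over 400 hours, with $3,000
# in expenses and a $150/hr target rate.
print(rate_per_hour(50_000, 400))      # 125.0
print(next_time_sow(400, 150, 3_000))  # 63000
```

In this toy example, the project earned well under its target rate, so the Next Time SOW number ($63,000) starts the conversation at a noticeably higher price than the original $50,000 engagement.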

Below the project stats and Next Time SOW costs, I broke out hours by role so we could see where we were under- or over-scoped, and where we could possibly be more efficient in the future. Our teams have found this extremely helpful as a way to dig deeper than a simple overage in hours. Was PM time way over? Why? Did the client need some extra attention? Was research time under? Was that because we thought it would be a more complicated study than it ended up being? All of these data points were really ways to enrich the conversation that would occur when the team got together to discuss the project.

The financials were included to show how the project ended up from that perspective, but also as a tool to deepen the conversation around what the team thought of the project. Were we under-scoped from the beginning? Wait, do we always under-scope hardware prototype research projects? Let’s change it!

Now comes the absolutely critical phase: we take the team out for coffee or lunch (leaving the building is key!) and have an honest and positive conversation about how our work went. Through these conversations, we populate the more qualitative “what went right” and “what we can do better” summaries. We bring everyone’s perspective to the table (literally) and take the time to discuss the ins and outs of the project, making sure to capture each person’s individual feedback.

After the PMs have collected several datasheets, we’ll get together and discuss overarching themes and bring those forward in our company meetings. We’re learning from each project – the good, the bad, and the ugly – and making sure those learnings are allowing us to have even better project and client relationships in the future.

In the end, these Summary Sheets did more than measure each project. The small exercise of filling in an Excel spreadsheet and taking an hour outside the office with the team has had far-reaching implications. We now have a dedicated opportunity to speak with each team member and solicit their viewpoints. We are diligent in asking clients for feedback. We can more easily and accurately identify trends, core competencies, and areas for improvement across the company. It’s a simple reminder of how powerful a thoughtful, flexible, adaptable process can be in supporting the day-to-day operations of a busy, growing firm.

[Screenshot: client-agnostic Project Wrap Up sheet]