How our team “does Agile”

Super quick background: our project started in January 2015. To kick things off with Agile, our Scrum Master ran a brief training for the team (less than half a day), and we’ve been playing it by ear ever since.

Over the course of many sprints, retros, and releases, we’ve made adjustments to how we’re doing Agile. I’m not sure whether there are Agile purists out there who would frown and shake their heads at the variations we’ve made. But despite our possibly non-canon approach, what still matters most is the team working closely together to deliver working software.

This post might be a bit TL;DR. But in case you’re still interested in a peek at how our little team does Agile Scrum, read on…

Daily Scrum

The first noticeable difference is in how we go about our daily scrum. Many references describe it as a 15-minute meeting where each person answers 3 guide questions:

  1. What did you do yesterday?
  2. What will you do today?
  3. Are there any impediments in your way?

Ours is scheduled for 30 minutes, but on most days we finish in 15. The extra time usually goes to connectivity issues, or to announcements and issues that need to be raised with the whole team. At the start, we tried the usual format of answering the 3 questions. I’m not sure if it was just because we were new to Agile then (or we just didn’t know how to give good updates), but we ended up shifting from person-by-person to story-by-story: we go over each user story to get updates and surface any impediments. This also helps us see whether the higher-priority user stories really are being worked on first.

Bottom line: The team is on the same page about what’s to be done, and impediments are brought up and discussed.

Sprint Reviews

These are typically described as taking place at the end of the sprint, when the team demos to the product owner (PO). The demo is a chance to gather feedback on the user stories implemented during the sprint, and that feedback can then be added as new user stories in the product backlog.

For our team, we have a PO who is very involved in the project and willing to look at the application under test even before the sprint ends. When a user story is already “demo-able”, we assign it to him on the board for review. In some cases, our PO requests a demo and we schedule it within the sprint. In most other cases, he accesses the application directly and reviews the user story himself. If we get any feedback, we evaluate whether we can address it within the sprint or whether it needs to go into the product backlog for another sprint. If our PO accepts the user story, he marks it as done on the board.

Bottom line: PO gets to give very timely feedback on the user stories.

Product backlog grooming

Apparently, from what I’ve read, “product backlog grooming is not yet an official Scrum meeting”.

For our project, the PO, the UX designer, and I (as BA) have regular meetings to discuss possible items for the next sprints. The UX designer prepares mock-ups as needed, and I write up the user stories in our product backlog. This way, by the time we do sprint planning with the whole team, the user stories already capture the PO’s requirements; they’re fleshed out, so we don’t need to spend sprint planning doing that. We do still groom further as we size and clarify the user stories.

Bottom line: Team participates in the grooming.

User story definitions

Typically user stories follow the template:

As a <role>, I want <some goal> so that <some reason>.

But lately, since I’m the one fleshing out the user stories, I’ve been using a different format: I list the expected feature changes, and those items double as the acceptance criteria. I asked the team whether they had any concerns about how I was writing things up, and they were fine with it. One advantage I see is that there’s just one field to look at and update for the details, with no need to repeat the same info in the Description field and the Acceptance Criteria field.

I did google a bit on whether our format is “acceptable” for Agile user stories. As-a-I-want-So-that seems so standard that doing it any other way might be frowned upon. But I did stumble upon a concept that was new to me: FDD, or Feature-Driven Development, which writes user stories in this format:

<action> the <result> <by|for|of|to> <object>

For example: “Calculate the total of a sale”.

And I also found a July 2015 post by Mike Cohn entitled Not Everything Needs to Be a User Story: Using FDD Features. He happens to be the same guy who wrote Advantages of the “As a user, I want” user story template back in April 2008.

Bottom line: Template may vary but what’s most important is to capture requirements in such a way that the team gets it.

Handling bugs

I don’t know the official Agile way of handling bugs, but what we’ve been doing so far seems consistent with what I’ve read other teams do. When testing an implemented user story, if we come across behavior that causes the acceptance criteria (or the detailed description) not to be met, we log that item as a bug. If the behavior isn’t quite captured in the user story description, or if it’s something we’re on the fence about, the involved team members discuss it, with 3 possible outcomes:

  • Log it as a bug – This’ll need to be fixed for the user story to be done.
  • Log it as an enhancement – We add this in the backlog and decide if it’s something we need to or can include in the current sprint. Like other user stories, we size enhancements.
  • Not fix it – This is still a viable option.

I’ve also found some online discussions about not logging bugs at all and just collaborating directly with the devs to get them fixed. I’m partial to logging, though, because:

  1. I don’t really trust committing things to memory alone; people forget stuff.
  2. In our team, a different tester might end up verifying the bug fix, so having a bug report to refer to is helpful.
  3. It gives a nice paper trail showing that any related bugs have been addressed before we send the user story out for review or mark it as done.

Bottom line: Bugs that deter the acceptance of a user story are fixed.

Handling unfinished work

As much as we’d like to stick to our commitments, we don’t always complete the user stories we committed to in a sprint. The reasons vary, but what matters most is that this is shared openly within the team so we can make the needed adjustments.

During our sprint planning, what we do is:

  • We size what’s been done and what still needs to be done for the unfinished user story.
  • We credit the size of what’s been done to the previous sprint, so it counts toward that sprint’s completed work.
  • We move the unfinished user story (with its updated size) into the next sprint (i.e., the one we’re planning for).
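The steps above boil down to a bit of simple bookkeeping. Here’s a minimal sketch in Python; the names (`carry_over`, `completed_points`, the example story) are made up for illustration and aren’t from any real tracking tool:

```python
# A rough sketch of how an unfinished story's points get split between
# sprints. All names here are illustrative, not any real tool's API.

def carry_over(story, done_size, remaining_size, previous_sprint, next_sprint):
    """Credit the re-sized completed portion to the previous sprint,
    then move the story, with its updated size, into the sprint
    currently being planned."""
    previous_sprint["completed_points"] += done_size  # counts toward that sprint
    story["size"] = remaining_size                    # updated size going forward
    next_sprint["stories"].append(story)

# Example: the team judges ~5 points of the work done, ~3 still remaining.
sprint_done = {"completed_points": 30}
sprint_planned = {"stories": []}
story = {"title": "Export report as PDF", "size": 8}

carry_over(story, done_size=5, remaining_size=3,
           previous_sprint=sprint_done, next_sprint=sprint_planned)
print(sprint_done["completed_points"])       # 35
print(sprint_planned["stories"][0]["size"])  # 3
```

The point of re-sizing both halves (rather than just subtracting) is that the team gets a fresh estimate of the remaining work instead of inheriting a possibly stale one.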

Bottom line: We process unfinished work and plan for it.

Bottom bottom line

I think this is the most I’ve ever used the words “bottom line”. In each instance above, I just meant to share the things we should try our best to accomplish. They may seem like obvious truths, but I’ve seen so often how easily the obvious can be overlooked. So there. I guess there’s no exact formula for how a team should “do Agile”; what’s essential is working with the team to find what works for the team.

Much thanks to whoever has read this far. 🙂

3 thoughts on “How our team ‘does Agile’”

  1. We have been doing Agile for years and follow a very similar process. Kudos to you all for applying this with very little formal training. One thing to consider if you are not already doing it is Retrospectives to identify improvements. Keep up the good work.

    • Thanks! 🙂 Retrospectives are indeed something I appreciate in Agile. In other projects, they tend to do retros at the end of project phases but by then a lot of the learning has already been forgotten.
