Purpose of Scrum
An agile framework allowing the delivery team to focus on delivering the highest possible value to the business in the shortest possible time.
Scrum vs Agile
Agile is a holistic approach that encompasses frameworks including Scrum, Kanban and others.
Traditional waterfall project methodologies require all requirements to be locked in prior to task scheduling.
Scrum also permits up-front requirements, but generally relies on much smaller development cycles. Each sprint contains a number of tasks which are defined, developed, tested and handed over ('emergent'). This way, the project can react dynamically to changes and learnings along the way.
Waterfall project methodology is very front-heavy: there can be weeks or months at the start where little to no tangible value is delivered to the business while requirements are gathered, design is performed and testing is conducted.
With Scrum, very small (but simpler) outcomes are achieved regularly. For instance, instead of a full set of 50 reports, the first sprint may focus on the 2 most valuable reports, deliver these, then use the learnings from them to help shape the remaining report builds. This not only helps the business realise value quickly; the remaining reports are also re-evaluated regularly, and quite often culled before time is wasted developing them.
Product Backlog
- Aim to use a single list, even if there are multiple products. This keeps prioritisation correct and unambiguous.
- The Product Owner is responsible for populating the product backlog and can have input on its priorities.
- The delivery team is responsible for determining how much work they can deliver within a sprint and how to produce the outcomes.
- Cover only up to 85 items in the backlog. This doesn't include items in a wishlist/icebox (more than 4 - 6 weeks away, or unrelated to the current projects/stories).
- Review the backlog order regularly (ideally at least weekly)
User Stories
Designed to get the conversation started, these are either:
- large (epic) items - spanning more than one sprint; or
- small items - completed within a single sprint.
A user story covers who (the user is), what (they require), why (the business value) and the acceptance criteria (specifics such as "must be web based" or "need to see all departments on a dashboard").
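The who/what/why/acceptance-criteria structure can be sketched as a simple data type. This is a hypothetical illustration; the `UserStory` class, its fields and the sample story are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    who: str                        # the user
    what: str                       # what they require
    why: str                        # the business value
    acceptance_criteria: list[str]  # specific, testable exit conditions

    def as_sentence(self) -> str:
        # Render in the common "As a..., I want..., so that..." form.
        return f"As a {self.who}, I want {self.what}, so that {self.why}."

story = UserStory(
    who="department manager",
    what="a web-based dashboard showing all departments",
    why="I can compare performance at a glance",
    acceptance_criteria=[
        "must be web based",
        "need to see all departments on a dashboard",
    ],
)
print(story.as_sentence())
```

Keeping the acceptance criteria as explicit, testable strings makes it easier to turn each one into an automated check later.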
Product Owner
- Expertise: What to build
- Focus: Return on investment
They interface with stakeholders, customers/users, as well as the Scrum Master and developers; it is a central role connecting many groups. Their main four responsibilities:
- Domain Expert
- Great stakeholder manager
- Available for every sprint
- Empowered to make decisions (4 - 6 week horizon)
Scrum Master
- Expertise: How to work together
- Focus: Process and people
The Scrum Master owns the Scrum process. They are a coach, facilitator and impediment remover, and they help the Scrum Team develop high-performance traits.
A Scrum Master should ideally focus on one team (i.e. not be Scrum Master for 3 or 4 teams) and should avoid holding multiple roles (e.g. Scrum Master plus Developer or Product Owner).
Developers
- Expertise: How to build
- Focus: Delivery
The developer team is cross-functional. This doesn't demand that each team member be cross-functional, but the team needs to cover the following functions:
- design / architect
- build engineer
- test / QA
Ideally each engineer should have some experience in all three. For instance, if you're designing a chair but have never sat in one, you will not be very effective.
Sprint Planning
Objective: to work out how to deliver a 'done' increment as a team.
Attendees include:
- Product Owner
- Scrum Master
Benefits of sprints:
- Increased focus
- Bite-size pieces of work, time-boxed
- Allows for regular inspection, review, and adaptation
- Creates a reliable work rate (velocity)
Length of Sprint
- For a software team, 2 weeks is appropriate.
- For other teams, 1 week or 4 weeks may be fine.
- Aim for between 1 and 4 weeks.
- Ensure each sprint is a consistent duration.
Use the retrospective to review the first few sprints and determine if the duration is appropriate.
These are the limits you should strive for:
- 2 hours per week - Sprint planning (e.g. a 3 week sprint might need 6 hours of planning)
- 15 minutes per day (1 hour 15 minutes per week) - daily scrum
- 4 hours per week - Product backlog refinement
- 1 hour per week - sprint review
- 45 minutes per week - Sprint Retrospective
So, up to 9 hours per week goes to these ceremonies rather than to delivery work itself.
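As a sanity check, the timeboxes above can be totalled and scaled to a sprint length. A small sketch using the figures listed above (the function name is invented for illustration):

```python
# Weekly ceremony timeboxes in hours, taken from the list above.
TIMEBOXES = {
    "sprint planning": 2.0,
    "daily scrum": 1.25,          # 15 minutes x 5 days
    "backlog refinement": 4.0,
    "sprint review": 1.0,
    "sprint retrospective": 0.75,
}

def ceremony_hours(sprint_weeks: int) -> float:
    """Total ceremony hours for a sprint of the given length in weeks."""
    return sum(TIMEBOXES.values()) * sprint_weeks

print(ceremony_hours(1))  # 9.0 - matches the 'up to 9 hours per week' above
print(ceremony_hours(2))  # 18.0 over a 2-week sprint
```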
There should be a limit placed on work in progress (WIP); otherwise teams may start too many Product Backlog Items without closing any out.
Definition Of Done
A clear set of exit criteria determined by the whole Scrum team. Ensure it is technically feasible (and, for tasks, achievable within a single sprint). Be conservative and realistic too.
A clear DoD builds continual confidence in where the product is up to.
Test Automation
Mainly applicable to software development.
Tests need to be driven by the specifications provided in User Stories, rather than interpretations by the developers.
Automated testing needs to be included within the Definition Of Done.
Ideally, look for a way to generate 'living documentation' from the automated tests, rather than relying on wiki-style documentation, which inevitably is rarely updated.
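One way living documentation can work is to generate the docs directly from the automated tests themselves. The sketch below is a hypothetical illustration; the dashboard function, the test and the acceptance-criterion string are invented:

```python
def departments_on_dashboard(rows):
    """Return the distinct departments shown on the dashboard."""
    return sorted({row["department"] for row in rows})

def test_all_departments_visible():
    """Acceptance: 'need to see all departments on a dashboard'."""
    rows = [{"department": "Sales"}, {"department": "HR"}, {"department": "Sales"}]
    assert departments_on_dashboard(rows) == ["HR", "Sales"]

def living_documentation(tests):
    """Build human-readable documentation straight from test docstrings."""
    return [t.__doc__ for t in tests]

test_all_departments_visible()
print(living_documentation([test_all_departments_visible]))
```

Because the documentation is extracted from the tests, it can only describe behaviour that is actually verified, so it cannot silently drift out of date the way a wiki page can.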
Estimation
Estimate at both the short term (sprint level) and the long term (release level).
Story points are less useful for sprint planning, and more useful for forecasting how long a story/epic will take to complete.
The relative points approach compares the effort required for similar tasks previously completed. For instance, climbing the stairs on a 200m tower would be around twice as much effort as the stairs on a 100m tower, so the points (respectively) might be 2 and 1, or 8 and 4. The points aren't time, the points aren't number of stairs, they are just a relative measurement.
If you don't have any point of reference (e.g. if you've never done the task or have never done any sprints), start with your best estimate, review in the sprint retrospective and adjust for future sprints.
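At the release level, those relative points feed a simple velocity forecast. A minimal sketch with invented numbers (the function is illustrative, not standard Scrum tooling):

```python
import math

def forecast_sprints(remaining_points: int, past_velocities: list) -> int:
    """Estimate the sprints needed to finish, from average historical velocity."""
    average_velocity = sum(past_velocities) / len(past_velocities)
    return math.ceil(remaining_points / average_velocity)

# 40 points remaining, team has averaged 10 points/sprint -> 4 more sprints
print(forecast_sprints(40, [8, 10, 12]))
```

Review the forecast each sprint as new velocity data arrives; the average stabilises after a few sprints.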
Dealing with Defects
Mistake
Generally an omission by the developer. For example, releasing a new feature but forgetting to set the colour of the page correctly as per the specifications.
Bug (non urgent)
This is an unexpected problem, perhaps introduced by a new feature. For example, the colour is correctly applied to the page (as per specifications), but the text of the menu items is now the same colour. The user doesn't mind, as they rely on the icon images, but it still needs to be fixed.
- Remediation: create a story in the backlog
Bug (urgent)
Also an unexpected problem. For example, the colour is correctly applied to the page (as per specifications), but when automated reports are printed, the whole page comes out black because the text colour is very close to the page background colour.
There are several ways to handle urgent bugs:
- Dedicated fixer: a person dedicated to fixing problems as they arise (they do no project work), acting as the first point of contact for project bugs/issues.
- Sprint interference buffer: allocate a percentage of capacity to fixing problems (e.g. 80% project work, 20% unexpected fault remediation).
- Support team: involve a support team in the Scrum process to remediate these problems. The difference is that they don't plan longer term (e.g. a 2-week sprint); instead they might run a daily mini-sprint to address issues as they arise.
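The interference buffer option is simple arithmetic; a sketch with illustrative figures:

```python
def split_capacity(sprint_hours: int, buffer_percent: int) -> tuple:
    """Split sprint capacity into project work and an interference buffer."""
    buffer_hours = sprint_hours * buffer_percent // 100
    return sprint_hours - buffer_hours, buffer_hours

# e.g. the 80/20 split mentioned above, applied to a 200-hour sprint
project, buffer = split_capacity(200, 20)
print(project, buffer)  # 160 40
```

If the buffer is consistently exhausted (or untouched), adjust the percentage in the next retrospective.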
Tooling - Kanban Task Boards
This is a good visual aid to show where various tasks are up to, especially
- Who is working on what
- Which tasks are blocked
- What has been started or is yet to be started
You can use many small methods to help with visualisation, including
- Columns for each status
- Use post-it notes; spin a note 45 degrees if it is blocked
- Use colours for categorisation (person, project, or some other grouping)
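A task board can be modelled very simply, and the model can also enforce the WIP limit discussed earlier. The board contents, statuses and limit below are hypothetical:

```python
# Hypothetical board: columns map a status to the task names in that column.
board = {
    "To Do": ["report 3", "report 4"],
    "In Progress": ["report 2"],
    "Done": ["report 1"],
}
WIP_LIMIT = 2  # illustrative limit on concurrent work

def start_task(board, task, wip_limit=WIP_LIMIT):
    """Move a task into 'In Progress' only if the WIP limit allows it."""
    if len(board["In Progress"]) >= wip_limit:
        return False  # blocked: finish something before starting more
    board["To Do"].remove(task)
    board["In Progress"].append(task)
    return True

print(start_task(board, "report 3"))  # True
print(start_task(board, "report 4"))  # False - WIP limit reached
```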
Perform verifications early and frequently between developers, Product Owners and other stakeholders. This avoids wasted time.
Sprint Review
This is a showcase of what the team has achieved, and should include some demonstrations.
During a review, run through:
- what was achieved in the prior sprint, with demonstrations
- what we are looking to achieve in the coming sprint period
- the burn-down chart, etc.
- lessons learnt
This is different to the sprint planning session.
A Sprint Review should take place even if there is no finished product to 'showcase'.
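The burn-down shown in a review is just running subtraction of completed points. A minimal sketch with invented numbers:

```python
def burn_down(total_points: int, completed_per_day: list) -> list:
    """Remaining story points at sprint start and after each day."""
    remaining = [total_points]
    for done in completed_per_day:
        remaining.append(remaining[-1] - done)
    return remaining

# A 20-point sprint where the team completes 3, 5, 0 and 4 points per day.
print(burn_down(20, [3, 5, 0, 4]))  # [20, 17, 12, 12, 8]
```

Plotting this list against an ideal straight line from 20 to 0 gives the familiar burn-down chart.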
Sprint Retrospective
This needs to be a non-threatening review where people can freely discuss what worked well and what can be improved. There is less focus on the product, and more on the delivery process.
- What was done well?
- What could be done better?
- What issues need to be escalated?
- What changed since last retrospective?
- What can we try for the next sprint?
It will sometimes be tempting to skip the retros. Try to ensure you still hold them, even if only for 10 minutes.
Some of the common improvement themes:
- Product Owners and Developers not talking
- Stakeholders bypassing the Product Owner
- Unstructured intra-team communication
- Tweaks become major scope changes
- Scope was not clear / immature in sprint planning. Perhaps Acceptance Criteria not clear enough
- Inconsistent adherence to user-story format
- Definition Of Done - over-baked or not clear enough
- Too much manual regression testing
- Waterfall sprints - testing left until the very end instead of continuous
- Attitudes toward continuous improvement
- Fear of experimentation
- Balancing people's qualifications with their qualities