Wednesday, June 6, 2012

24 Hours of Pure Agile



 

A few weeks back, my company conducted an experiment. It was intended to be a motivating, fun, team-building exercise fostering creativity and innovation. And it was that…but it ended up being much more.

 

The Concept

It was our "FedEx" day ("when it absolutely, positively, has to be [done] overnight"). The framework was simple:
  • Post up project ideas. Ideas could be anything (a product, a service, an internal process, a marketing piece). The main requirement was that they had to benefit the organization in some way.
  • Review the list, and sign up for whatever idea interests you.
  • Each team should produce something in 24 hours.
  • Everyone participates.
That was it. Beyond that, there were no rules. No formal processes, no defined roles, no checkpoints, no required documentation or technologies. Simply starting Thursday afternoon and ending Friday afternoon. From my perspective as the Agile specialist, I was keenly interested in it as a laboratory experiment on the Possible, but I told no one.
Teams formed and sectioned off naturally into their own spaces. Team sizes ranged from ten or twelve down to just two. Everyone was involved, from our firm founder to the newest employee. I joined a team intent on making a video highlighting what it was like to work at our firm. White boards were engaged, post-its began appearing (though not for every team, because they were self-norming) and the work commenced. People enjoyed themselves…the atmosphere was light…but work WAS progressing, you could see that.
When we first discussed the concept, I was asking whether people were staying overnight. I have a long commute, and was ready to stick around to work all night. I mean, it was a 24 hour project, right? Maybe ancient recollections of my time in college were creeping back in, and subconsciously, I thought that would be "fun." But nobody else was planning on doing that, so the staff went home for the evening at later, but reasonable, hours.

 

The Process

What was it like working on the teams? I can tell you from my own experience. We were producing the video, so our backlog looked something like this:
  • Develop video concept
  • Script video
  • Select people for speaking parts
  • Enable technology assets (camera, microphones, video editing software, etc.)
  • Shoot video
  • Edit video
Our team consisted of four people from various departments in the company: myself from project delivery, along with one of our recruiters (Robin), our Operations Manager (Peggy), and one of our software developers (Mani). Most of those backgrounds had no direct bearing on this kind of project, but that didn't matter. We all just pitched in and picked up whatever needed to be done, whatever we were most equipped to help with in terms of talent, experience, resources, or personal contacts. We needed a camera and mics: Robin's contact had professional cameras. Microphones were tough, so we shot with the camera's mic. Peggy came up with a great concept and we ran with it. Mani has a penchant for technology, so he became our cameraman/video editor. Since we have a lot of public material about what we do as a firm, we wanted to show more of who we were as a group of people. To do that, we enlisted employees from other teams for the speaking parts (our thanks for their generous cooperation). And in 24 hours, we had scripted, shot, and edited a 2-3 minute video, reaching our goal. Did we finish at 23 hours, 50-some minutes? Yes, but this was a 24-hour project, after all.

 

The Results

As the teams demo'ed their completed work to a panel of judges, I was pleasantly shocked. Team after team turned out really great product. And it was varied product. Features were working, plans and processes were well thought-out and documented, etc. While some of the ideas are proprietary and will be developed into product or service offerings, I can tell you the projects were as varied as:
  • New technology products
  • New technology service offerings
  • Fresh new-employee onboarding processes
  • New comprehensive technology development standards
  • A comprehensive employee outing plan & process
  • Our promotional video
From an Agile laboratory concept, I couldn't be more pleased. How did these people accomplish these goals, and do it so well?
  • They formed self-organized teams. And not just self-organizing once they were on a team; they even chose WHICH team they wanted to join.
  • They worked without formal requirements that had to be fleshed out before they could do anything.
  • They prioritized their own work without executive direction, knowing they only had so much time to complete something.
  • They FOUND the skills on their own teams to fill needs they discovered.
  • Each team developed their own best way to work, but with the same basic framework idea.
  • They improvised, adapted, and overcame impediments (certainly on my team I saw that).
  • They did it WITHOUT OVERWORKING. Nobody I know of came close to pulling an all-nighter, no sleepless heroics.
  • They came and CO-presented demos of WORKING PRODUCT. And it was high quality.
  • They all had fun. No fights or 23rd hour death marches.
  • And the process WORKED, NOT ONLY for software.

 

The Challenge

So you might be thinking: "yeah, that's great, for a quick overnight fun exercise. But we run a business at my place, and that can't work in the real world." Oh really? I challenge that. Better yet, how close is your current development process, your current project hierarchy, your current organizational culture, and your current results, to what I outline here? Can you get closer to this than you are now? And can you gain these real benefits today, in terms of productivity, organizational culture, and team morale?
If you never try, how will you ever know?


As always, I welcome your comments.

Wednesday, March 14, 2012

Put the “User” Back in User Acceptance Testing


I think it's about time we really focus on making the lives of the users better in our User Acceptance Testing (UAT).



Here's how I propose to do that, and gain the advantages within an iterative Agile framework.

First off, let's settle on definitions. How exactly is UAT defined, and what is its purpose? Most definitions of UAT on the web offer a variant of "having actual users test the application just prior to delivery, to ensure it meets requirements." Some organizations hand users the detailed requirements to test against. Some provide the users with actual test cases or scripts. To these definitions and methods, I would say, respectfully…hogwash.

In UAT, I am interested in whether or not the product makes the users' lives better. Does it make their jobs easier? Does it accurately reflect and support the way they actually work, and not whether what we captured in requirements reflects the way they work? Testing the product against requirements, or test cases, is the job of the team…not the users. What if the requirements are wrong? Maybe we missed some need, or some use cases. Or what if, even though the requirements reflect what the users asked for, once they get their hands on the product, they realize it's not what they need?

Here is my proposed solution to the issue.

First, my Utopian scenario. In an Agile framework, my ideal scenario for UAT is to have direct access to users during the iteration, to gather their feedback as the feature is actually being developed. Show them a screen; let them spend some time with an isolated module. If we have that luxury, the cost of making changes based on their feedback is far less.

Absent that, here's my process framework for incorporating UAT into an iterative development process. NOTE: this process is tailored for a multi-iteration release cycle:

  1. Iteration review/demo: We have finished iteration 1. We have the Product Owner identify and invite interested users (our UAT group) to the demo at the end of the iteration. Let them meet the team, and use the demo of the features developed in that iteration as an introduction for the UAT group to what they will be testing. Gather their initial feedback from the demo and review it with the Product Owner to see if the P.O. wants it in the Product Backlog.
  2. UAT Cycle Start: Use the product demo as a launching pad for a UAT cycle. The UAT group has seen the features in action, they've been able to ask questions and get an initial impression. Now, during iteration 2, it's time to get their hands on it. Structure this in such a way that:
    1. They are using the product in a way actually reflective of the way they work,
    2. It has minimal impact on their busy schedules,
    3. They aren't necessarily testing the same areas,
    4. You get periodic feedback from them in a group setting.
  3. UAT standup: While the team is developing in iteration 2, your Agile team lead (or Scrum Master or business analyst), maybe with the Product Owner, is having quick feedback-gathering sessions with the UAT team…maybe 2x/week. Do this as a group, so members can hear each other's observations. Ask questions such as "How does this help make your job easier?" or "What would you change about these features?" Notice these are NOT 'yes' or 'no' questions. Open-ended questions elicit discussion and feedback…and that's what you want here.
  4. Cycle feedback into the Product Backlog: The Scrum Master or business analyst can now review that feedback with the Product Owner, for possible inclusion in the backlog for a future iteration. Create user stories from the feedback, and have the P.O. prioritize them.
  5. Bring feedback stories into future sprints: So now, let's say iteration 3 is beginning. And feedback from the UAT held during iteration 2 has made it to the top of the backlog and is brought into iteration 3. At the demo for this iteration, your UAT group will see their feedback brought in as actual working product before the release.
  6. Lather. Rinse. Repeat.
Iterative UAT
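Step 4 of the cycle, turning UAT feedback into prioritized backlog items, can be sketched as a minimal data model. This is purely illustrative; the class and field names are my own inventions, not part of any Agile tool:

```python
# Minimal sketch of step 4: UAT feedback becomes ordinary,
# prioritized user stories in the Product Backlog.
# All class and field names here are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class Story:
    title: str
    priority: int         # lower number = higher priority
    source: str = "team"  # e.g. "uat-feedback"


@dataclass
class ProductBacklog:
    stories: list = field(default_factory=list)

    def add_feedback(self, feedback, priority):
        """Product Owner accepts a UAT feedback item as a story."""
        self.stories.append(Story(feedback, priority, source="uat-feedback"))
        self.stories.sort(key=lambda s: s.priority)

    def top_for_sprint(self, n):
        """Highest-priority stories to bring into the next iteration."""
        return self.stories[:n]


backlog = ProductBacklog([Story("Export report", 2)])
backlog.add_feedback("Simplify the search filters", 1)
print([s.title for s in backlog.top_for_sprint(2)])
# ['Simplify the search filters', 'Export report']
```

In practice this bookkeeping lives in your backlog tool or on the task board; the point is simply that accepted feedback enters the backlog as normal, prioritized user stories, competing with everything else for a slot in the next iteration.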

What are the benefits we're likely to see from this process?

  1. UAT users become invested parties: They are providing valuable feedback, helping shape the product as it's being developed;
  2. Product better aligned to user needs: By having UAT testers interact with the product according to their actual work patterns, instead of a script, we are more likely to get realistic, usable feedback;
  3. UAT users become community advocates: Imagine your reaction to this process if you are one of the UAT group. "I saw the product in development, tried it, gave feedback, and they actually listened and made the changes I suggested before they released the product!" These users will become strong SMEs on the product, and advocates for your process within the user community. And THAT is a big win in an Agile adoption.

 

I would be very interested to hear your thoughts and observations on UAT in iterative development. So as always, I welcome your comments.

Sunday, January 29, 2012

How Really Simple is Simplicity?


Why this Agile principle is not as simple as it seems…and why it is a must


 
"Simplicity, simplicity, simplicity! I say, let your affairs be as two or three, and not a hundred or a thousand; instead of a million count half a dozen…and reduce all other things in proportion."
Henry David Thoreau




"Simplicity – the art of maximizing the amount of work not done – is essential." This is one of the twelve principles behind the Agile Manifesto, and I think it is one of the most critical and challenging for people and organizations attempting Agile.


Why stress simplicity?

Just what makes this principle so critical, anyway? I mean, when we say "maximize the amount of work not done", are we advocating laziness? Of course not, but I would say we are advocating efficacy. Increased efficacy – maximizing output for a given input of effort – means we are producing the most value for a customer with our expenditure of effort. And simplifying what we do…our processes, our code, our communication, any use of our time…will maximize the amount of effort we can focus towards producing quality product for the customer above anything else. And that is one of the value propositions of well-practiced Agile…maximum output for a given input. Simplifying software can also make it more flexible to extend and adapt, easier to maintain, and easier to learn for new team members. But simplicity itself can be hard at first.


Natural or unnatural? Science or art?

Our lives are getting more complicated every day. It seems to be a natural tendency for us to allow, or even enable, that to happen. Think about it…do you find it more of an effort to simplify, or to complicate? A quick search on the term "simplify" on Amazon's book list yielded more than 1,300 books on the topic at the time of this writing.

Want more proof that we crave help in simplifying? Think of the Staples "Easy" button. It was one of the more successful ad campaigns in recent memory precisely because, I contend, it speaks to such a broad cross-section of our society that struggles with an overly complex business and personal life. And every generation says that, in their youth, life was simpler (I'm even doing it now, with my own kids). It has actually become "outside the box" thinking to focus on simplifying.

Simplifying isn't natural, so it isn't easy…it takes focus and effort. And it is an art rather than a science, because the answers aren't prescriptive. Ultimately, the answer to maximizing simplicity is between your ears – it's how you think and how you approach situations. You have to actively look for the simple answer, the simple solution. If you don't, when you finally step back and take stock of what you've built – system or process – you may have that "Rube Goldberg moment"…the realization that you've built something very complex to accomplish the very simple.




Recently, I was watching the Discovery Channel series "One Man Army." It's a reality show where contestants with backgrounds in the military, law enforcement, and extreme sports compete in various challenges for the title. In one challenge, they had to breach a series of barriers using only the tools provided. The barriers got progressively more challenging, including a cinder block wall they had to breach with a sledgehammer while being sprayed with water. By the time they got to barrier 5, the contestants were adrenaline-charged and bent on destroying anything in their paths. At this point, the host said, "What they don't know is that the door on barrier 5 is unlocked, and they only have to try the handle to move on." How often do we assume in our work that the answer to a challenge has to be complex…it couldn't possibly be that simple…could it? Yes, it could.



How to do it

While, as I said, the answers aren't prescriptive, here are a few pointers and concepts to keep in mind as ways to simplify your practices.

  • Focus on simplicity…continuously. The importance of diligence can't be overstated. As I noted previously, our natural tendency is to make things more complex. Things will consistently drift in this direction – you'll see it if you watch for it – so keeping this a continual daily focus is important.
  • Challenge your assumptions. Don't assume complexity is a foregone conclusion. Established processes, legacy code, tools and procedures…all are likely to remain static and unchanging unless questioned. One note of caution: I'm not telling you to go raise Cain across your organization and change the world overnight. Start with your own practices and see if there are opportunities to simplify. You might find you end up leading others towards this way of thinking by your example.
  • Framework, not process. Agile frameworks like Scrum are indeed simple in structure…even deceptively so. Resist the temptation to extend and augment them with more process. Again, some people will tend to add on additional processes to their Agile practices over time, assuming something so simple in structure can't be effective. Don't be one of those people.
  • User stories. User stories are so powerful in capturing users' needs precisely because they are so straightforward and focus on what the user wants to accomplish, not what "the system should do" or the technical implementation should be. Use them as they were initially intended…as a starting point for a conversation between customer and implementation team. Try them, trust them, and practice face-to-face communication for requirements, and you will be amazed at the rapid rate of sound understanding that develops.
  • "POW" modeling. I won't name them, but there are one or two modeling software programs out there I just can't stand to work with. They are complex, awkward, and by the time you've built the model being discussed in a meeting, creativity, flow, and some aspects of the design may be gone. Instead, I like to use Scott Ambler's "POW" modeling…the Plain Old Whiteboard. Quick, simple, dynamic…and just pull out a smart phone, snap a picture and email it to the team and/or stakeholders and *poof* you have model documentation.
  • Manual task boards. My experience is that this is consistently the first Agile practice that nay-sayers change to something more complex. "Seriously…we build software. We can come up with something slicker than post-its, can't we?" "This looks like a children's classroom…can't we clean it up a little and use something electronic?" Now, I like electronic ALM tools, and there are several terrific ones out there. They will do a great many valuable things for you and make many aspects of your Agile life easier. But replace the good old post-it task board? Sorry…no. The effectiveness of a manual task board is in itself worthy of a separate post, but for now, trust me. I have lived it personally. Buy stock in your favorite office supply store or sticky note manufacturer. Then, go buy a ton of them and use them, and promote their use with others.

These are just a few tips on practicing simplicity. I would love to hear and share yours, and your views on the subject, as well. So as always, I welcome your comments and questions.


Editor's note: In my Agile meetup group, we're exploring the twelve Agile principles, one per month, over the course of 2012. If you are in town, please join us.

Wednesday, October 12, 2011

Team Reward Systems in Agile


"You Get What You Measure."


It's football season again, and as a longtime player, it's my favorite time of the year. But this year is extra special for me, because my son is playing his first year. I was interested to see that his coach gave out helmet stickers as awards for doing well, but the players were not allowed to put them on their helmets! In my son's soccer league, they each get a star if the team wins, and extra stars for special on-field performance on defense and for showing teamwork, but not for scoring goals. It all makes me think back to my last two years playing in high school, and how really true it is that "you get what you measure."

In those two years, we had two very different head coaches. My junior year, the head coach gave out helmet stickers for everything. If you scored a touchdown, you got a sticker. If you scored the extra point, you got a sticker. If you made the "hardest hit" of the week, you got a sticker. If the long snapper and holder didn't fumble any snaps during the game, they each got a sticker (but wait, wasn't that just their job?). It went on and on and on. And as the season wore on, some helmets became just covered with stickers, while others were pretty bare.

My senior year, the new head coach (an assistant the previous year) changed all that. Nobody got any stickers unless we won the game. When we won, every player on the team got one sticker, period. The only other stickers were given out by the head coach: two or three each week, to seniors that exhibited effort above and beyond the call and served as examples for others to follow. It came the night before the game, along with a handwritten note from the coach about how great you were doing. It was very special to get one…I still have mine in a scrapbook.

The interesting point was that these two very different rewards systems, with essentially the same team, yielded two very different behavior systems. In the individual reward model, the game-within-the-game became how many stickers you had. It spawned individual focus, jealousy, backbiting, and hero behavior. All of these were counter to teamwork, pulling on the same end of the rope, etc. "I'm about to get tackled, but why should I pitch to my teammate? If I do, he might score and I want the glory myself." With the team reward system, it became about the team winning, not about individual accolades.

In Agile, we want to foster this same team-based cohesiveness. We want the team pulling together, looking for what they can do to make the iteration succeed. There should be no such thing within an Agile team as "I succeeded in my iteration, but you failed in yours." But especially here in the United States, we have built compensation models based on individual performance: reviews, raises, bonuses are all based on individual achievement. Cultural analysis also reveals this: in Geert Hofstede's Five-Dimension analysis of national cultures, the United States rates extremely high (91%) in orientation towards individual achievement. Certain European cultures show an affinity for group achievement, and Agile coaches working in Europe report it being much easier to foster a team-oriented mindset.

When your Agile team members are under individual reward models, those same issues I saw on our football team will show up. Hero developers can put the project at risk trying to crank out lines of code without communicating and coordinating, working on tasks that aren't visible to make powerful stakeholders happy. Struggling team members can hide their challenges, fearing being called out and losing out in their performance review rather than asking for help. Backbiting and jealousy among team members can cause them to avoid helping another team member to "save themselves" (watch the scene in Titanic when the ship first sinks). Team members will not have the spirit of chipping in to do whatever is needed to succeed, but instead you may see the "it's not my job" behavior. (If you're looking for behavior patterns we want in Agile team members, I wrote about that in my blog "Are Your Teams 'Wired for Agile?'").

Changing your rewards system to a team performance-based system will be an organizational challenge, and not one that can be handled at the team level. As an Agile champion in your organization, you will need to engage management and Human Resources to structure a model that fosters the kind of team behavior we want. Here are some suggestions to consider:


  • Iteration Success Rates: Team members all receive the same bonus based on successful iteration completion rates. You might set a standard that if a team successfully completes X% of their iterations in a given timeframe, each team member gets a bonus.
  • Non-monetary compensation: When money is the only motivator, you will keep cohesion and loyalty only as long as the pay rate stays "high enough." But people are motivated by more than just money: consider offering paid training or conference attendance to advance desired skill sets, or free time, or team outings to increase camaraderie.
  • Team-based reviews: Reporting managers are likely not on the ground on a daily basis with their reports, when those reports are on an Agile team (in fact, it's a dysfunction if they are). As such, they aren't the best positioned to provide feedback from the trenches. But fellow team members are. Gathering feedback on every team member from every other team member can give managers a good, holistic view of how each person is contributing to team success. The large sample size can also help smooth out the backbiting that can occur between team members who don't get along and might snipe at each other in reviews.
  • Team-awarded "MVP" Awards: Much as my senior football coach did, award exemplary performance awards to team members that go above and beyond the call. But instead of the manager deciding who gets these, have that decision solely in the hands of the team. Being voted an award by your peers will often mean a lot more than by a committee or manager.
These are but a few steps you can take to foster team cohesion and a teamwork mindset. Just remember to stay away from tons of helmet stickers.

As always, I welcome your comments.

Friday, July 15, 2011

Story points: Who gets the Credit?


How getting caught up in the accounting can make us miss the point







My cell phone rang and it was my Agile coaching compadre on the other end of the line, exasperated. Uh oh, time for an Agile intervention.

He had a team that took on 10 points of user stories in a sprint, but only finished 3 points. Now they were planning for the second sprint, and they were starting in a hole: work remaining on the 7 unfinished story points, plus another 10 points of new work they were taking on. They were trying to keep their average velocity at 10 points per sprint. I asked him what they were going to do about the credit for the 7 story points that spanned the sprints.

"They're going to get credit for them in sprint 2, since that's when they completed the work." Well, he couldn't give them credit for the points in sprint 1, since they weren't completed. That was the right answer there. I asked how much of the work was left over for the stories. "About 10% left, all testing" was his answer. Hmmm…OK. Something was bothering me about that. The majority of the work on these points was done in sprint 1, but they weren't completed in sprint 1, so the credit couldn't go there. But to get full credit for 7 story points in sprint 2, when only 10% of the work had to be completed, wasn't striking me as kosher either. What about partial credit in each sprint…90% credit (6.3 points) in sprint 1, and 10% credit (0.7 points) in sprint 2? Well that seemed like ridiculous mathematical gymnastics, and besides, it would send the wrong message to the team: that it's OK to NOT finish work in a sprint. 


This team was indeed in a box. They had already completed most of the work (and wound new code deep into the existing code base) for those 7 story points. They had also started all of them at the same time in sprint 1 (the Scrum Master should have advised against this), which limited their flexibility to move some back to the product backlog. So now, the work had to be completed.

We continued debating the issue and trying to decide where the points should go. They had to go somewhere, didn't they? And then it hit me.

Those 7 points go nowhere. 

No credit. But they DO have to complete the work. "Well, that's an answer that everyone will hate" was his response. They probably will, but it's tough love for the Agile team. And here is why I think it's the right thing to do.

What are we measuring with story points and level of effort? First off, it's NOT "time tracking." It's not meant to be a measure of how the team spent their time. What we are measuring is business value delivered. Stories are worth nothing to the business if they are not completed. We want teams to start a sprint entering into a commitment with the Product Owner that they are taking on work, and will finish it in the sprint. Of course, there are times during a sprint when work takes longer than expected and teams can't finish everything they committed to, but that's when the Scrum Master facilitates communication with the Product Owner to move the lowest priority not started items back to the product backlog, ensuring that whatever stays in the sprint can get to "done." I covered this in a previous blog post.

The team still honors the commitment, but story point credit and velocity have predictive value only as a measure of how much work the team starts and completes. Since that was not the case with this work, no credit is given. Would everyone (team, Scrum Master, Product Owner) hate this solution? Sure, but it makes for a great retrospective topic, and a lesson learned for the future.

So how do you, as a Scrum Master or Agile team lead, avoid having this happen in your teams? Here are a couple of tips:


  • Use solid velocity measurements in sound planning: Scrum Masters should use velocity as a rolling average, not a single number, and use it in concert with good capacity planning (vacations, holidays, other commitments, other drag factors) to help keep the team from "biting off more than they can chew" in their sprint planning commitment. This is precisely why I feel you don't get credit for partially completed stories at all…partial credit provides no predictive value.
  • Minimize WIP: The fewer number of stories a team has in flight at any one time, the greater their flexibility to adjust and move not started stories around without having wasted effort and having to unwind code from the codebase.
  • Inspect and adapt daily: No matter what, keep the daily standup sacrosanct and effective. It remains the team and Scrum Master's primary channel for surfacing and mitigating potential problems before they become real issues.
  • Practice "tough love" servant leadership: Remember, a Scrum Master is not a manager of the team; they are the coach and protector of the process framework. The coach does not take the field; he/she observes, advises, supports, and puts the team in a position to succeed. And by practicing tough love as in the example here, a Scrum Master helps the team, and themselves, learn a valuable lesson from a mistake they are unlikely to repeat in the future.
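The rolling-average idea in the first tip above can be sketched in a few lines. This is a hypothetical illustration; the function names, the three-sprint window, and the capacity factor are my own assumptions, not a standard formula:

```python
# Sketch of rolling-average velocity with a capacity adjustment
# for sprint planning. Names and the 3-sprint window are
# illustrative assumptions, not part of any specific tool.

def rolling_velocity(completed_points, window=3):
    """Average of the last `window` sprints' completed points."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

def capacity_adjusted_target(completed_points, capacity_factor=1.0):
    """Scale the rolling average by available capacity.

    capacity_factor: fraction of a normal sprint's person-days
    available (e.g. 0.8 if holidays remove 20% of capacity).
    """
    return rolling_velocity(completed_points) * capacity_factor

history = [10, 3, 12, 9]   # points completed per past sprint
target = capacity_adjusted_target(history, capacity_factor=0.8)
print(round(target, 1))    # (3 + 12 + 9) / 3 * 0.8 = 6.4
```

Note that the sprint where the team finished only 3 points drags the average down, which is exactly the point: the no-credit outcome shows up in future planning numbers instead of being papered over.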
As always, I welcome your comments.






 

Thursday, April 21, 2011

Effort Estimation is Child’s Play

Our schools are teaching Agile techniques to 2nd graders.

My 2nd grader asked my wife for help with his math homework. As soon as she heard the word "math," she said "go ask your father." He brought it over to me. They are learning about weights and measures and how to use scales. He showed me his homework and said "dad, I don't get it. What is this?" I looked at it and smiled. "Well," I said, "that's what dad teaches grown-ups to do every day at work. You're learning about effort estimation."

Here is his actual homework sheet:

 

 

Each example gives you choices of absolute measurements (pounds), BUT you don't have the tool to weigh each object. Instead, you're relying on past experience and knowledge to estimate the weight. Have you ever held a baby? Probably, and even if not, you know they are a lot closer to 8 pounds than 20 or 70 pounds. And while you've certainly never weighed an elephant, you know from your visits to zoos and circuses that they are huge, and your experience with objects that weigh 100 pounds or 500 pounds tell you that an elephant is certainly much more than that, so given the choices, 11,000 pounds is easily the right answer.

This is essentially what we do with effort estimation in Agile product development. We are asking team members to measure how much work is required to complete the product feature, not how long the work will take. And the measurements are not on an absolute scale but a relative one: "how much work is this item, relative to the simplest item I can think of?" When we start using effort estimation (a.k.a. "story points" or "effort points"), we have teams baseline on a common simple activity in their domain, and use that as the base measure. After they agree on what that is, all other features are measured relative to that baseline.

Many, if not most, Agile teams use a numerical scale called the Fibonacci scale to measure effort. The Fibonacci scale is named after Leonardo of Pisa, known as "Fibonacci" (son of Bonaccio), who introduced the sequence to the West in the early 13th century. These are the typical numbers used for effort estimation (a slightly modified Fibonacci sequence, with the largest values rounded to 40 and 100):

1, 2, 3, 5, 8, 13, 21, 40, 100

Without getting into a doctoral dissertation in mathematics, let's just say it gives you a scale whose steps grow larger as you go up, so when you have to decide between two of the bigger numbers, the choice isn't so fine-grained. Seeing it graphically really helps. Here is a "Fibonacci spiral," which you get by tiling squares whose side lengths are the Fibonacci numbers and drawing quarter-circle arcs corner to corner through each square:

And where else have you seen that shape in nature?


The increase in step values forces you to choose the closer "bucket" rather than argue over minutiae. And that's why it works so well in effort estimation…it forces you to "pick one!" and not get bogged down in the details. That's the point of effort estimation: the right amount of detail at the right time. We use it early in the process, before you can know all the detail you would need to get down to estimates in hours.
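The "pick one!" idea can be sketched in a few lines (the function name is my own, not standard Agile tooling): snap a gut-feel number to the nearest value on the modified Fibonacci scale.

```python
# Modified Fibonacci scale commonly used in planning poker.
SCALE = [1, 2, 3, 5, 8, 13, 21, 40, 100]

def nearest_bucket(raw_estimate: float) -> int:
    """Snap a raw gut-feel estimate to the closest scale value,
    forcing a single bucket choice instead of a debate over minutiae."""
    return min(SCALE, key=lambda bucket: abs(bucket - raw_estimate))

print(nearest_bucket(6))   # 6 lands in the 5 bucket
print(nearest_bucket(35))  # 35 lands in the 40 bucket
```

The widening gaps do the work: near the bottom, 6 still matters; near the top, anything in the mid-30s is just "a 40," which is exactly the coarseness you want that early.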

Pick Your Measure

Here's one of the great things about effort estimation in Agile…you don't have to use Fibonacci. It's hard enough, when we're used to a professional lifetime of giving estimates in hours or days, to mentally switch to measuring effort. If your team is having a hard time with Fibonacci, and divorcing a numeric measure from absolute time measurements, don't use it. It's only important that your chosen measure has relative sizing that the whole team can relate to; it doesn't have to be numeric.

Teams have used T-shirt sizing for effort estimation (Small, Medium, Large, eXtra Large). I'm coaching a team right now that uses sea life: guppy, salmon, dolphin, shark, whale, Kraken. ("Release the Kraken, Jim!"…props to you). I even heard of a team that measured stories in "Frito bags." Why? They baselined a 1-Frito-bag story on the amount of work one developer (who ate Fritos all day, every day) could complete while eating a single bag of Fritos. You really can use any scaling method like that.

But in the end, remember to focus on simplicity. Effort estimation doesn't have to be difficult; they're teaching the concept to 2nd graders.

As always, I welcome your comments.

Saturday, April 2, 2011

Why People Don’t Read Your Tweets


Look out: you may have lots of followers on Twitter, but c'mere, I'll tell you a secret: people may not be reading anything you post.

It's no revelation that social media has exploded in popularity. According to recent research, Social Networking will be a more popular communication channel than voice by 2015. While Facebook is huge, for my money, the number one player in the world of the quick update, especially on mobile devices, is still Twitter.

Recently reported to be valued at $4.5 billion (with a "B"!), Twitter isn't going away any time soon. One personal open note to Twitter, though: please remember what you do best, and don't try to become a one-stop shop. I'm a big fan of apps that focus on one thing and do it well.

When I first downloaded the Twitter app I use on my phone, I checked it every time the alert sounded that I had new tweets. But pretty quickly, I noticed patterns of frivolous tweets that started irritating me…so much so that I now usually ignore the notification. And that's no good for anyone: people with relevant tweets are not getting their message out to me, and I'm missing out on potentially great information. Instead of giving up, though, I decided to tell you about some tweeting practices that bother me and will cause me to ignore or even "un-follow" people. These are patterns I try to avoid in my own tweets, and I think you should as well.

Patterns that will get you un-followed on Twitter:

  1. Failure to understand your audience - Is your Twitter account a professional account, a personal account, a marketing account, or news account? Your followers started following you presumably for a reason. Keep your posts relevant to that reason. For a great success story, look at the account "s**tmydadsays." His name is Justin, and he does just one thing with his account. He focuses on the funny. His Twitter bio says it all:
    "I'm 29. I live with my 74-year-old dad. He is awesome. I just write down s**t that he says"
    And he's parlayed that into a book and television show deal. At the time of this writing, he has over 2 million followers. And what I find even funnier is that HE is only following ONE person...LeVar Burton. Even that is just funny. He understands his audience.
  2. Frivolous Personal Updates - This is related to pattern #1, but deserves its own mention. This has even been lampooned in TV commercials:

    People following your professional Twitter account likely don't care that you're "sitting on the patio." I pulled up a frightening number of results just searching the phrases "at the laundromat" and, even worse, "thinking about nothing" on Twitter.
  3. Personal Conversations - Holding a lot of conversations with another user, that your followers aren't a part of, gives them only half the conversation. You're leaving them out and cluttering their tweets. Use Twitter's Direct Message function instead. Then, it stays between you and the recipient.
  4. Running Commentary – Tweet the important moments, but avoid a long string of tweets on the same topic. Sometimes, people will sit at an event and continually tweet about what they're watching or doing. I'm not complaining about pearls of wisdom from the speaker (which I would like to hear too, assuming I'm not there myself), or pictures of the action. Those are great. I'm talking about commentary without substance, like this:
    1. 5:32 p.m. "Boarding the plane to L.A. now"
    2. 5:35 p.m. "A guy just sat next to me"
    3. 5:38 p.m. "Closing the airplane door now."
    4. 5:39 p.m. "I hope my flight will be on time."
    5. 5:40 p.m. etc., etc., etc…

These are my biggest pet peeves that will make me un-follow someone on Twitter. What are yours?