If you don’t want to click each download link for the 12 files that make up the VSTS 2008 and VS 2010 Beta 2 VPC images, you can use a free download manager along with these two text files (download them from here) that contain all the URLs; import them to make the process a bit easier.
Thursday, December 24, 2009
New Virtual PC Images for VSTS 2008 and Visual Studio 2010 Beta 2 Released
Microsoft has released a series of virtual images with Visual Studio 2008 SP1 and 2010 Beta 2 for educational purposes. These images contain everything you need to get started and are great for learning and demos (which is what I use them for). The VSTS 2008 versions expire in January 2011, so you have a whole year of Visual Studio goodness. The Visual Studio 2010 Beta 2 images expire much sooner (April 2010) since they are beta software. There are versions for both Virtual PC and Hyper-V, with some containing just TFS and others containing TFS plus all of the Visual Studio components.
Here are the links for the 2008 images:
VSTS “all-up” Virtual PC/Virtual Server image (7.5 GB download, expands to 17.5 GB)
TFS “only” Virtual PC/Virtual Server image (3.31 GB download, expands to 8 GB)
VSTS “all-up” Hyper-V image (7.5 GB download, expands to 17.5 GB)
TFS “only” Hyper-V image (3.31 GB download, expands to 8 GB)
And here are the 2010 Beta 2 images:
Virtual PC 2007 SP1/Virtual Server 2005 SP1 virtual machine (7.2 GB download, expands to 21.5 GB)
Windows Virtual PC virtual machine (7.2 GB download, expands to 21.5 GB)
Hyper-V/Hyper-V R2 virtual machine (7.2 GB download, expands to 21.5 GB)
For more information about the release of these images check out Brian Randell’s blog post.
Tuesday, December 22, 2009
Free Microsoft Event in Nashville with Keynote by Steve Ballmer
I’ve been lucky enough to be invited to speak at the Lap Around PDC event here in Nashville on Jan. 20, 2010. Steve Ballmer, CEO of Microsoft, will be delivering the keynote. I will be doing a 70-minute session on Visual Studio 2010. The event is free but seating is limited, so click here to register.
Here is some more info about the event:
Did you miss the Professional Developers Conference in November? If so, join us as we bring the “highlights” from the PDC09 conference to you! At this special FREE event we will cover the latest Microsoft technologies and exciting announcements from PDC09 and deliver over 16 sessions presented by Microsoft, Partners, MVPs and Community Leaders. The keynote will be delivered by Microsoft CEO Steve Ballmer, and will cover Microsoft’s three screens and the cloud strategy.
Microsoft is releasing more than 25 products and frameworks over the next six months, more than at any other time. Our goal is to get you up to speed on what is coming, and how it can help you do more with what you have. The sessions will cover multiple technologies such as Windows Azure, Visual Studio 2010, .NET 4, Silverlight 4, SharePoint 2010, SQL Server 2008 R2, Windows® 7, and Windows Server® 2008 R2.
During the event, we encourage you to network with your peers and chat one-on-one with the speakers at the “Ask the Experts” area to get all of your questions answered. You can also visit the Customer Showcase area to see how customers are using the latest technologies in real world applications, and the amazing results they are seeing. We will also showcase the latest Windows powered gadgets and mobile devices in the Customer Showcase area.
If seeing isn't enough, we will also have a fully staffed Hands-On-Lab area packed with PCs, labs you can practice with, and expert staff to help you learn the new tools.
We will be hosting the event at Vanderbilt University Student Life Center in Nashville, TN.
For your convenience, please find specific details below on complimentary event parking and shuttle services, which will be emailed to you as well.
Breakfast and lunch will be provided.
Every attendee will receive a Visual Studio 2010 Beta 2 DVD, a t-shirt, and an event bag filled with a collection of product information, case studies, and other great resources. We will also be raffling off some great prizes at the end of the day.
Come spend the day with us and learn about the future of developer technology!
We look forward to your attendance!
Tuesday, December 01, 2009
Scrum for Team System V3.x Beta 2 Released
Conchango has released the second beta of their Team Foundation Server process template for Scrum. This version works with the current Beta 2 release of TFS 2010. Click here to download.
Monday, October 26, 2009
Visual Studio 2010 Webcasts
Visual Studio 2010 and MSDN are BIG – and they are getting bigger! In this session, we’ll explore all the changes to the Visual Studio family of products, including Team Foundation Server as well as recent upgrades and changes to our MSDN line-up. Please join us for a 60 minute “CAN’T MISS” – high level overview, where we will discuss the many changes to our offerings which will likely impact current customers and future customers. We promise you won’t be disappointed!
Speakers: Tim Adams, Tony Jimenez, and Randy Pagels, Microsoft Corporation
Join the Meeting
Audio Call Only
Conference Call ID
Session times (all CST):
- 9:00 AM - 10:30 AM
- 2:00 PM - 3:30 PM
- 9:00 AM - 10:30 AM
- 2:00 PM - 3:30 PM
Wednesday, October 14, 2009
The Agile Adoption Mixing Board
When I talk to clients about adopting the basic values from the Agile Manifesto, I tell them to imagine sliders between the four entries on the left and right. A good adoption is like mixing a band: as the band plays, you adjust the mix constantly to get the best sound. Purists may claim you must adopt each value to its fullest, which is like sliding every slider on the mixing board to one side, and that never sounds very good.
I had a client who had recently invested in offshore resources and a new remote office for their development teams. They could not immediately get the development team together, so they could not fully embrace the “Individuals & Interactions” value or some of the practices attributed to it. So we used tools like Skype, WebEx, and Team Foundation Server and put a little process in place around communications between the sites to help retain some of the collaborative aspects of the team. I see this as moving the slider on my Agile mixing board a little over to the right, toward the “Process & Tools” side. This client does see the value of having everyone together, so there is a plan that once they can feasibly move development to one location, they will do so. And when they do, we can remove some of the processes put in place and stop using the tools, so the slider moves back to the left a little more.
Claims that you must adopt Agile in its entirety and never change your adoption to meet your current environment are not very practical for some companies. I definitely think you should always evaluate the impediments you perceive as preventing you from transitioning to Agile practices, to ensure you are not just reverting to muscle memory, but there are real-world scenarios where some adoptions could be detrimental to the business if immediately (and blindly) adopted. I like this blog post from Ryan Cooper about how conflict can arise from these situations.
So I think good Agile coaches are like good sound engineers who are constantly listening to the band and, as the environment changes, adjust the mix appropriately. We listen to the team and the stakeholders to determine how much to adopt and what may have to wait for later. Anything that you determine should wait needs to be logged with the following details:
- Exactly what you are not going to adopt.
- What the benefit of adopting it would be.
- The cost of not adopting the practice.
- “Smells” that will indicate you should definitely adopt the practice.
This way we acknowledge that we are not adopting one of the best practices prescribed by Agile (or whatever flavor we are adopting) and then outline the cost and the return for justification later if the smells start to occur. This adoption log can also be great fodder for retrospectives.
So listen to your team and be vigilant when the environment changes so that you adjust your mix of Agile adoption to best fit the situation.
Tuesday, October 13, 2009
Visit with the Team System 2010 Team in Raleigh, NC
Last week I had the opportunity to visit with the team working on Team System in Raleigh, NC. I contacted Jason Barile via Twitter a few weeks before and asked if I could stop by, and he went all out to accommodate me and a few coworkers. I was pleasantly surprised when the meeting started and most of the Product Managers and Test Leads for the build, test, source control, and install/administration parts of Team Foundation Server were in attendance.
We talked about many aspects of VSTS 2010 during the meeting:
- How TFS Basic can be a "gateway drug" for those wanting to move from Visual SourceSafe but who are not going to use all the features of the Team Foundation Server platform. The team talked about how easy it will be to upgrade from Basic to the full version, and how all your history will immediately be dumped into the warehouse as soon as you upgrade, so your history will look as if the warehouse had been installed from day one.
- There was plenty of discussion around implementing Agile practices with Team System and how many more tools will be available in 2010 to support those practices. There are plenty of new additions coming from Conchango to continue their support for Scrum in Team System.
- We spent some time talking about all the new testing offerings, which I was especially interested in since my company sold our QA product suite, leaving a gap in our offerings around automated feature testing that VSTS 2010 will fill very nicely.
- My team was very excited that the upcoming VSTS 2010 Beta 2 release will include a go live license so some of our clients interested in early adoption can go ahead and implement this release with some support from Microsoft and a direct upgrade path to the final release.
After the meeting we went to lunch with Jason, Buck Hodges, and Adam Barr and continued our conversation. It was interesting during lunch when Jason brought up the views on exclusive checkout I expressed in my interview with David Starr on the Elegant Code Cast. I explained that as a consultant I am dropped into teams of various makeups, and not all of them have good practices around source control implemented, even when they are using TFS. I have frequently had issues with developers making changes to solution and project files with shared checkout and not being diligent when merging changes into source control. This has resulted in broken builds, missing files, etc., and me pulling my hair out. It has also been an issue on projects that have not implemented good development practices like the SOLID principles, where I have had entire data access layers in one class and people constantly stepping on each other's toes. I've had to adapt my source control practices to work around situations like these that, as a contractor, I had little control over and no ability to change.
Below is a short (and poorly shot) video of the Microsoft team in Raleigh and a quick, walking interview with Buck, Adam, and Jason on their favorite features in VSTS 2010.
Friday, September 11, 2009
What's Wrong with this Code?
So I'd like to post this little demo project for people to download and then tell me what is wrong with it, and refactor it into something more adherent to the principles we all know we should follow but sometimes don't get the chance to.
Here is a little background on the code. The Order class is used to hold data for an order (for what does not really matter to the task at hand) for a given Customer. We assign the Order to a specific Customer by assigning it a Customer number, which is a 3-digit alphanumeric identifier. We serialize these Orders and drop them into a directory for storage and processing. The Orders are given a sequential 3-digit order number based on the last Order in the drop directory.
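If you can't grab the download, here is a rough sketch of the shape described above. The names and the XML serialization are my guesses to illustrate the setup, not the actual demo code:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical sketch of the demo's Order class, not the real download.
public class Order
{
    public string OrderNumber { get; set; }     // sequential 3-digit number
    public string CustomerNumber { get; set; }  // 3-digit alphanumeric identifier
    public DateTime OrderDate { get; set; }
}

public class OrderDropper
{
    private readonly string dropDirectory;

    public OrderDropper(string dropDirectory)
    {
        this.dropDirectory = dropDirectory;
    }

    // Serializes the order into the drop directory, numbering it
    // after the last order file already present.
    public void Drop(Order order)
    {
        var existing = Directory.GetFiles(dropDirectory, "*.xml").Length;
        order.OrderNumber = (existing + 1).ToString("000");

        var serializer = new XmlSerializer(typeof(Order));
        using (var stream = File.Create(
            Path.Combine(dropDirectory, order.OrderNumber + ".xml")))
        {
            serializer.Serialize(stream, order);
        }
    }
}
```

That should be enough context to start spotting the smells when you look at the real project.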
Sunday, September 06, 2009
Agile 2009 Wrap Up
Agile 2009 is in the books. This was my first year to attend, and I was lucky enough to present. Unfortunately duty called and I was only able to attend the first two days, but I still had a great experience.
This year's host city was Chicago, which I had never visited before. We were right downtown with easy access to everything, and my wife and kids went with me and had a blast at Navy Pier, the local parks, and the Field Museum. The conference was set up at the Hyatt hotel. The hotel was nice, but wireless access was only available in the open area and not in most of the breakout rooms. Plus, I don't think anyone's cell phone worked, as we were many levels underground. I also wished there had been a pool for my girls, but it was still a very nice place to stay.
I've helped put on a conference before (http://www.devlink.net/), so I know how much work goes into getting everything set up. The staff was very well organized and they all did a great job. I loved the open jam area. I spent many hours sitting in a bean bag checking email or chatting with attendees. The session rooms were large, well laid out, and equipped with very nice projectors and large screens.
The first night I spent a few hours in the Music Masti room jamming with a handful of other attendees. There were some seriously good musicians playing some very cool jazz. I only wish I could have spent more time there.
I presented "Implementing Scrum and XP using Team System" Monday morning. There were many other big hitters presenting at the same time, and with .NET being the minority representation I was pleasantly surprised by the number of people who attended. Everything went well and the crowd was very engaged. Plenty of conversations sprang up afterwards, and someone even recorded the audio and should be posting it soon. The main criticism was that many wanted to see VSTS 2010 rather than 2008. Next year!
Since I was only there for two days and several long chats with colleagues captured a large amount of my time, I was only able to attend two full sessions. Although I had seen most of the presentation before, I went to Robert Martin's Software Craftsmanship session because he is always an inspirational speaker. My table had a few people new to the concepts and a few zealots. Our resulting conversation afterwards inspired me to start writing a blog post on Software Craftsmanship for the Working Class Developer, to be posted next week. The idea is that most shops do not initially have the experience or commitment to implement many of the practices from this movement. I think there is a middle ground for these shops to strive for as a gateway to more in-depth adherence in the future.
I went to two other sessions, one on TDD and one on refactoring, that were not introducing any new concepts to me and were in Java and Ruby, so I was not as in tune with the examples. While these were both good presentations, I left early to connect with some people I only get to see every now and then at gatherings like this (more on this later).
The second full session I attended was the keynote on the second day by Alistair Cockburn entitled "I Come Not to Praise Agile, But to Bury It." Of course everyone will talk about his grand entrance complete with bagpipes and his own twist on the matching soliloquy from Shakespeare's Julius Caesar. I really liked his presentation as he talked about how Agile is not dead, but ready to evolve into something new. He did not say what that something would be but talked about some of the concepts that would be involved.
One of my main goals for this conference was to make some good connections, and I was not disappointed. I ran into Corey Haines, who I met at devLink 2008. I did not get to spend any time in his infamous debates because he was volunteering the days I was there, but there is always next year. I finally got to meet David Starr in person after talking to him online for years. We had some great chats in the Open Jam bean bags covering Scrum certifications, software craftsmanship, and Team System. David and I recorded an episode for his podcast at http://www.elegantcode.com/ that should be out next month.
I got to meet several members of the Team System product team, including the main man Sam Guckenheimer. Sam was great. As we talked, if I mentioned any resource I needed or contact I wanted to make, he whipped out his laptop and, boom, sent the email to get me what I needed. I also ran into the guys from Conchango, Colin Bird and Simon Bennett. Colin and I dove into the beta version of their Scrum process template for VSTS 2010. It was nice to see that many of the common extensions I made to the 2008 version made it into the new one. Simon and I talked about the changes coming to the Sprint Task Board and some other new applications coming out with the release of 2010. They also had a very cool Microsoft Surface app for planning poker that can be yours for the low, low price of $10K!
It was also very cool to be able to meet some of the biggest thought leaders in Agile. Early the first day they had not yet set up all the signs for the conference, and as I wandered around I asked a nice lady for directions, only to find out she was Mary Poppendieck, who was so great. Later that same day I was having a great conversation with some of the guys from ThoughtWorks when Martin Fowler joined us. I also ran into Jeff Sutherland, whose CSM course I was lucky enough to attend last year. I would love to name-drop some more, but those were the main people I met.
While there were tons of great sessions, I absolutely love just sitting with other Agile enthusiasts talking shop. Day one I was sitting with some guys from CarFax who were describing a great-sounding XP shop where Ron Jeffries had come in and worked with management as well as the developers to get them set up. We had a very interesting conversation around technical debt.
There were many conversations about using Team System for Agile and implementing practices from the Software Craftsmanship movement. It was nice to see many people from Microsoft there being so involved in the Agile community. It was a bit sad that so many people still make cracks about .NET (and Microsoft) being second-class citizens in the Agile movement. For a community supposedly open to embracing everyone, they can sometimes be a bit elitist.
The one conversation I was disappointed with was the one around Scrum certifications. I totally understand that the CSM and CSP are benign as a measure of anyone's ability. I currently hold both of them, and they really only signify that you have been exposed to a certain amount of training on the subject. My main concern is that the Scrum framework itself gets the brunt of the rancor from the community. Scrum is not hurting Agile; poorly trained people are hurting Agile. Even with the paper-tiger certifications, the Scrum Alliance marketed the process like no one else and helped get Agile more widely accepted in the mainstream. If our worry is that the core concepts are going to be diluted by the mainstream adoption of Scrum's mechanics, then let's not bash Scrum; let's find ways to maintain the ideals.
I came away from this year's conference reinvigorated and ready to get back out there even more in the Agile community. Next year's conference is going to be in Nashville, and I am already trying to get our local Agile community amped up. Can't wait to see everyone here!
Thursday, July 23, 2009
VersionOne's State of Agile Survey 2009 Open
Thursday, July 09, 2009
Evolution of an Agile .NET Developer in Books
I've been re-reading a few books lately, and in the process it made me think about how differently I approach software development now than when I first started. It seemed that each time I reached another plateau of understanding, one or more books helped me get started on the next level. So in this post I want to chronicle my growth (which is still happening!) as a developer using .NET, along with the key books that helped me. While I did development before .NET (VB and some Java), I am going to stick to .NET for brevity.
Phase One: Programmer
When you first learn how to write software, you learn the syntax of the language (C# in my case) and how to construct programs using that language. I myself did not go to school to be a developer (I was actually on my way to being a professional pilot), so I had very little pretense of theory or approaches. So I usually wrote code in a very functional way.
When I first started in C#, I bought Programming C# from O'Reilly. This link is to the fifth edition, which covers C# 3.0, but I started with the first edition on 1.0, which is now out of print. I always loved the O'Reilly books because they were very direct and to the point. This book was a great help in getting started with the language.
As I started to do more with .NET, I bought C# in a Nutshell, also by O'Reilly. This is not a "how to" book but rather a reference for the C# language and the .NET Framework. I turned to this book over and over again to figure out which class to use and how to use it. It was a great companion to the Programming C# book.
Phase Two: Developer
Eventually I was sure enough of my abilities that I did not have to concentrate so hard on the coding part of developing software. I had a decent grasp of the language and the framework, so I started wanting to write software in a more consistent and productive way. Another consultant told me to pick up Code Complete, and it started me on my way to adopting some best practices that I still adhere to today. While it had nothing to do with C#, its content was applicable to any development language.
Also during this time I read (or sometimes skimmed) quite a few books from WROX on several areas of programming in .NET, which all helped me start to address specific problems in my applications.
Phase Three: Architect (or so I thought)
As I started to get better at writing applications using C#, I was asked more and more to help design these applications, and then systems of multiple applications. I was lucky to work with many people who were much smarter than I was, and they turned me on to quite a few good resources.
During one design session I was complaining about finding a solution to a particular problem, and a colleague shook his head at me and asked, "Haven't you read the Gang of Four Design Patterns book?" I had to sheepishly admit I had never heard of it, and then immediately went out and read it that week. Afterwards I realized I had been describing a picture-perfect scenario for an Abstract Factory pattern. I then bought two books on design patterns for C#: Design Patterns in C# and C# Design Patterns: A Tutorial. Using all three books, I really started to delve into learning how to write more solid, reusable software.
Phase Four: Team Lead
Once I had been the architect or lead developer on a few projects, I was asked to lead development teams on bigger projects, and that's when I started to dabble more in the project management and business analysis side of the software development lifecycle. Either I was frustrated by the people in these positions, or the project was small enough that the development team took on these duties. Either way, I felt there were better ways to do both.
One of the project managers I worked with, who was very good at his job, loaned me a copy of The Mythical Man-Month, and it helped me start to think about projects in a wider scope than just the development effort. It was a big help in understanding how to interface with customers and managers and how to measure the progress of a project.
I was also given a copy of Software Requirements, which gave me a new perspective on gathering requirements and mapping features to coding efforts. At this time I was also introduced to Rational's RequisitePro product, which implemented a good deal of the concepts behind the book.
Phase Five: The Reluctant Agilist
By this time I had been leading software development projects for a few years with a good degree of success. I will admit I thought I had it all pretty well under control, and then a project that went bad started me rethinking how I developed software. Methods that had been tried and true for so long had failed, and I was looking for some new ideas. At a new company I was re-introduced to the Scrum process for Agile development. I had heard of it before, but to be honest I had dismissed it as the inmates running the asylum. Some of the engineering practices were appealing, but overall I was fairly skeptical.
We did a pretty good job of adopting Scrum in all the wrong ways, so I turned to Ken Schwaber's two definitive books on the topic: Agile Project Management with Scrum and The Enterprise and Scrum. To be honest, I did not come away with much. Not that these are not good books, but I tend to learn backwards in that I sometimes need the mechanics of something before I can relate it to the underlying concept. I then downloaded Henrik Kniberg's Scrum and XP from the Trenches, and this book helped tie the concepts and mechanics together for me.
Phase Six: Agile Developer
On my next project I took all the things I had learned from our previous rough adoption of Scrum and applied them to our approach. I was lucky in that the management at my new client was very Agile friendly. I now really saw Scrum working, and I started to concentrate on how to write my software in a more Agile way.
I attended a local Agile user group and won Practices of an Agile Developer. It was great in two ways: it not only introduced some new concepts to me, it also reaffirmed some of the lessons I had learned over the past few projects. It was a very easy-to-read book that provided a lot of information. At the next meeting I won The Art of Agile Development, which built upon everything I had just learned. Both of these books made me want to learn Test-Driven Development (TDD) better, so I then grabbed Test-Driven Development in Microsoft .NET.
I recently went back and reread Agile Principles, Patterns, and Practices in C#, because when I was first given the book I was still not convinced about Agile and had really only skimmed the material. Going over it now had a much greater impact, and there is so much great content that can be applied even outside of Agile that I suggest it to any developer, regardless of their approach.
Phase Seven: The Empty Vessel
As I write this, I have to say I am a little ashamed that at some of these levels I thought I really knew everything I needed to know. It has made me really open myself up to learning as much as I can and figuring out the best way to apply that knowledge in my day-to-day activities. It was humbling to look at the publication dates on some of these books and think, "This has been around for a while and I am just now learning it!"
There were more books along the way, but I wanted to point out the ones that really pushed me to the next level of my development career, and I hope to keep adding to this list for as long as I write software.
Thought Leadership is Good for Business
I have been lucky to work for some great consulting companies over the years that recognize the importance of their consultants actively participating in thought leadership and IT community activities. I was a technical trainer for a large part of my career and really enjoyed teaching. When I went into consulting, it was great to practice what I had preached for so long, but I began to miss being in front of a class. When .NET first appeared on the horizon, I, along with several of my colleagues, jumped on it, and before long we were asked to start presenting what we had done at various technical organizations in my area. After my very first presentation, an attendee contacted my company to bring us in to help them with their .NET adoption. The value was immediately obvious.
Since then I have strived to be as active as I can in local, regional, and national technology organizations as a participant, speaker, and organizer, and I have seen firsthand how this has helped my respective companies. Here are a few of the ways these activities have helped with our business:
- Attracting New Employees: I've often had people come up and ask me about working with some cool technology. I always make it a point to only present on technology I have actually worked with and I've been lucky to see a wide range of solutions. When other developers hear about the projects I have worked on they too get excited and interested in being involved.
- Advertising Your Solutions: Sometimes your company may have expertise in a certain technology and it is not widely known in your area. Presenting for multiple organizations helps get the word out that you have value to offer in a certain space.
- Positioning Yourself as an Expert: It is easy to look like an expert for 60 minutes when you have plenty of time up front to plan, but when you really know the material it is obvious to your attendees. This does well to market your own abilities and thereby your company's abilities. While there may be other people with the same skill set in a certain technology, effectively being able to communicate and explain it is just as valuable.
- Improving Employees' Communication Skills: Some developers are great at developing, but not so great at communicating. As a consultant, communication is a skill that is almost as valuable as your coding skills. Companies that encourage their employees to present are helping them grow these skills as well.
- Differentiating Yourself from Your Competitors: There may be many companies that offer the same services as you do, but being seen as a thought leader in a certain space can really set you apart from your competition.
As you can see, there are some very real benefits to participating in IT communities as a thought leader. Companies that encourage their employees to pursue these activities and invest in them can see some very tangible gains. Even attending and supporting local IT groups can help your business, so get out there and be active!
Thursday, June 18, 2009
Virtual Sprint Task Board Using a Wii Remote
While I love implementing Scrum in Team System, nothing can quite compare to a physical story wall or sprint task board. The intimacy and tactile nature of moving cards around on the board cannot be replaced. However, a good many of my clients who are implementing Scrum shy away from using note cards and sticky notes. Conchango provides a very cool WPF app that simulates a sprint task board, but even when you have it projected on a large screen, someone still has to sit at the computer to move items around. It is either a logistical nightmare having each person sit down to do this, or one person does it while everyone tells them what they did, and you start to lose the attributes that make the task board such a great tool.
You could go out and buy a large touch-screen monitor and mount it in the Daily Scrum room, if you have $3,000 to $5,000 just lying around. Or you could buy an interactive "smart" whiteboard for about the same cost. If your company is so invested in Scrum that they will approve such an expense, please let me send you my resume! The rest of us do not have that kind of money to spend, so I was very intrigued when someone sent me this link to a YouTube video featuring an interactive story wall using Mingle. Upon further investigation, this setup can be easily achieved for less than $100.
Johnny Chung Lee from Carnegie Mellon University came up with an ingenious way to use a Nintendo Wii Remote to create an interactive whiteboard. Here is what you need:
- Bluetooth receiver (~$20 or free if already on your computer)
- Nintendo Wii Remote ($35 at Wal-Mart)
- Infrared Pen ($18 at Wiiteachers.com)
- Johnny's Wiimote Whiteboard Software (free)
Getting Connected with Bluetooth
The first laptop I tried this on was a Sony Vaio, and it came with the standard Windows XP Bluetooth stack, which will see the Wii Remote but does not really know what to do with it even once it has been connected. I had to download the stack from BlueSoleil to get it to connect and be recognized by the Wiimote Whiteboard software. The download is only a trial; purchasing the product costs $30 USD. My work laptop did not have Bluetooth, so I decided to buy a Bluetooth receiver so I could set everything up on any computer with a USB port. The website WiiBrew has a list of Bluetooth drivers and receivers that are known to work with the Wiimote. I bought the Cirago BTA-3210 USB 2.0 Micro V2.0+EDR Bluetooth Dongle on TigerDirect for around $20, and it comes with the Toshiba Bluetooth drivers.
I inserted the Cirago receiver into my USB port and installed the drivers from the included CD. After a reboot I opened the Bluetooth Settings window and clicked New Connection, pressed the 1 and 2 buttons on the Wii Remote to put it in discovery mode, and it was recognized without a hitch.
The IR Pen
I mentioned this project to my wife early on and she ordered me several LED pens from Amazon, which was very sweet of her, but none of them worked. You need an IR pen with certain specifications. There are many sites with the specs on how to build one, but I just purchased one from Wiiteachers.com for around $18. It is a standard Expo dry erase marker that they gutted and then fitted with the IR light, a switch, and a place for a single AAA battery. There are a few fancier pens out there, but this one was cheap and worked well.
Setting it all Up
Once I had everything (I already had a few Wii Remotes at home, although the kids did complain when I took one away for work), I set up a projector connected to my laptop, connected the Wii Remote via my Bluetooth receiver, ran the Wiimote Whiteboard application, calibrated the IR pen, and was using the pen as a mouse.
A few notes on the setup:
Set the Wii Remote off to the side with a clear view of the entire projected screen. The IR sensor has about a 45-degree viewing angle, and several times the calibration worked fine but the sensor could not see the outer edges of my screen. I finally mounted the Wii Remote (with a rubber band) to my camera tripod and set it off to the side of the wall I was projecting on.
When you use the pen you have to make sure your body is not blocking it from the IR sensor on the Wii Remote. This was not too much of an issue because I had to stand to the side of where I was using the pen anyway, so as not to cast a shadow over the projected area.
If you bump the Wii Remote or the projector, or resize your desktop, you will need to recalibrate the pen. This is very easy to do with the Wiimote Whiteboard application. Also, if you use it with a Virtual PC image, the desktop resolution on the Virtual PC needs to be the same as the host desktop or the pointer and the pen will be off a bit.
The Virtual Sprint Task Board
Once everything was working I opened up my Virtual PC with Team System 2008 and the Conchango Sprint Task Board application. The setup works great and I can easily move tasks from one state to another as well as tap on an item to get the detail pop up window. The scale slider at the top of the board allows you to zoom in and out so you can see the entire board or zoom into a specific set of User Stories.
Windows comes with a virtual keyboard that you can use to type with the IR pen. This is not optimal, but it works for small things like updating a task's remaining hours. The Windows version is fairly rudimentary, so I downloaded the freeware version of Touch-It's virtual keyboard, which has added features like docking, customizable keyboard layouts, etc.
While it is still not the same as an actual physical board with note cards, for those using an Agile management tool already this is a cheap way to get close to it. Here is a short video of me setting this up and using the task board.
Thursday, June 11, 2009
Farewell eScrum, We Barely Knew Ya!
Saturday, June 06, 2009
Upcoming Microsoft Events in Nashville
nPlus1 Summer Summit
nPlus1.org is hosting its third Architecture Summit on June 10th at the Microsoft office in Nashville, TN (Franklin). The topic of this summit will be Patterns and Principles.
Session One: Software Patterns
Patterns are an important tool to use as architects and developers. They provide a common vocabulary for us to design with, as well as a common approach to a common problem. Come learn about useful patterns, and how to use them in your everyday code.
Session Two: How I Learned To Love Dependency Injection
Dependency Injection is one of those scary topics that most developers avoid. It sounds all ‘high-falootin’ and complex. It’s not. Really. We wouldn’t lie. It’s a great way to manage complexity in your system, and a great way to make your system so much more testable. And isn’t that what we all want?
Each session will be followed by an open discussion period. A catered lunch will be provided starting at noon, when the welcome time begins.
When & Where
Wednesday, June 10, 2009 12:00 PM - 5:00 PM
Microsoft Office - Franklin
2555 Meridian Blvd, Suite 300
Franklin, TN 37067
Click here to register for this event.
Microsoft ArcReady: Architecting for the Client Tier
The client (or presentation) tier of our applications is taking on an increasingly important role. Users are expecting more compelling user interfaces, but they also want more functionality from their applications. In this ArcReady we examine how to design and deliver well architected client applications that will be easy to maintain and extend.
Session 1: Trends and patterns on the client tier
In our first session we will take a vendor and platform neutral look at some of the trends and emerging technologies that can be used on the client tier. We will look at techniques like Mashups, technologies like Natural User Interfaces (NUI) and the increasing importance of the mobile platform. We will also look at some common patterns that can be used in the architecture of the client tier.
Session 2: Applying Microsoft technology on the client tier
In our second session we will take a look at how we can use Microsoft technologies to create well-architected and compelling client applications. We will look at technologies like Silverlight and WPF that can be used to create compelling clients, as well as technologies that can make your applications more extensible for future development. We will also examine some architectural guidance developed by the Microsoft Patterns and Practices group.
When & Where
Friday, June 12, 2009 9:00 AM - 11:45 AM
Microsoft Office - Franklin
2555 Meridian Blvd, Suite 300
Franklin, TN 37067
Click here to register for this event.
MSDN Events Unleashed
Internet Explorer 8 for Developers
The Windows Internet Explorer 8 browser makes it easier to explore and interact with the web. Did you know that there will be a new standards mode enabled by default? While it can easily be turned off, it is a good idea to know the potential impact this may have on the many sites that have been crafted around various features in past versions. Attend this session to learn which current practices you need to change and how the new standards mode affects your development techniques and your existing sites. We review tools that are built into Internet Explorer 8 to help developers debug and create Web pages. We also discuss Web Slices, Accelerators, and Search Suggestions – all of which are key new features in this latest release.
In this session we’ll show you:
- Developer Tools
- Selectors API
- Use and create accelerators
- Use and create web slices
- Use and create search suggestions
- AJAX Navigation enhancements with Virtual Earth demo on history, AJAX Cross Domain Calls
- CSS improvements (printing, counters, new pseudo classes)
Developing on Microsoft Windows 7
Building applications that are easy to use, visually appealing, and offer high performance is a challenge that developers face every day. Innovative applications can greatly improve the user experience, empowering companies to differentiate their services and solutions. However, developers are increasingly asked to do more in less time, while also optimizing the power and performance requirements of their applications. The Windows 7 platform makes it easy for developers to create engaging, user-friendly applications by providing familiar tools and rich development features that allow them to take advantage of the latest PC capabilities. In this session we will explore the new Taskbar and Jump Lists, the Scenic Ribbon, file management with Libraries, and Windows Web Services among many other enhancements to the new operating system.
In this session we’ll show you:
- New Features in Windows 7
- Use and create jump lists
- Scenic Ribbon API
- Use and create Libraries
- Windows Web Services API
- Additional enhancements that support development
When & Where
Friday, June 12, 2009 1:00 PM - 2:50 PM
Microsoft Office - Franklin
2555 Meridian Blvd, Suite 300
Franklin, TN 37067
Click here to register for this event.
TechNet Events Unleashed
Session 1: Windows Server 2008 R2 – Optimize Your Time
Get a jump start on your peers with Windows Server 2008 R2 by joining us for this no-nonsense technical session, where we will discuss the critical improvements in the next version of Windows Server. Some have argued that, given the impressive scope of new functionality in Windows Server 2008 R2, it should not be an “R2” release, but rather should be given a completely new name. They point to features such as Direct Access, Branch Caching, Live Migration, PowerShell 2.0, and VDI that will help you cut down on the amount of time you spend doing mundane tasks, reduce end user frustration and support headaches, and give your mobile workforce a competitive edge through ubiquitous access to company data. There’s a good chance that by attending this session you’ll find a way to save time and money with Windows Server 2008 R2, and so you’ll get the time you spend with us back many times over.
Session 2: Windows 7 – Maximize Your Potential
In designing Windows 7, the engineering team had a clear focus on what we call ‘the fundamentals': performance, application compatibility, device compatibility, reliability, security and battery life. Early reviews of Windows 7 seem to indicate that the choice to focus on the fundamentals is resonating well with many users and professionals. And, IT professionals will further benefit from the enhancements to manageability and security. You’ll also learn how your investments in testing and evaluating Windows Vista will pay off in the transition to Windows 7. Come see firsthand what all the buzz is about in this demo-intensive session where we explore the UI improvements, performance gains, and manageability enhancements in the next client operating system from Microsoft.
Session 3: Internet Explorer 8 – Get Excited About the Browser Again
Internet Explorer is currently the most widely used browser in business, and while we’ve seen nice incremental improvements to IE in recent history, we haven’t seen as many truly earth shattering changes. Well, many agree that the next release of IE, Internet Explorer 8, provides the most compelling upgrade to IE in many years. In a typical day, users spend 2 hours or more per day in the browser, so significant improvements here can have tremendous impact on productivity and the way we work. The browser needs to be thought of in the same terms as an operating system—it has to be rich, robust, interoperable, easy to use and secure. Internet Explorer 8 is a browser that meets these needs and more for users, enterprises, IT professionals and developers alike. Come learn how the new improvements, too numerous to mention here, are driving many to get excited about the browser again.
When & Where
Friday, June 12, 2009 3:10 PM - 5:00 PM
Microsoft Office - Franklin
2555 Meridian Blvd, Suite 300
Franklin, TN 37067
Click here to register for this event.
CMA Music Festival - Microsoft Booth
If you are going to the CMA Music Festival June 12th - 14th, stop by the Microsoft booth to see demos of IE8, Surface, cool Silverlight apps, and more. They will also have Xboxes and Guitar Hero set up for you to play. I am manning the booth on Sunday, so drop by and say hi!
Monday, May 04, 2009
Choosing the Right Scrum Management Tool (Slight Return)
Release planning is a very useful exercise to help forecast when features will be ready in the future (as much as possible, given the ability to rearrange the entire Product Backlog after every Sprint). This often puts management more at ease, since they have convinced themselves that the Gantt charts they used to see in project plans were more predictive (which is usually not the case). Here are some things to look for in a management tool around this:
- Drag and Drop: Some of the better tools out there allow you to view your Product Backlog and drag items from it into Releases and Sprints. This side by side view provides a great way to plan and try "what if" scenarios.
- Velocity Indicators: Along with the drag and drop mentioned above, the ability to enter velocity for the Sprints and have the interface visibly show you how many Story Points are used/free in each Sprint (and a roll-up into the Release) really helps with planning. The better tools show this both by the numbers (like 4/10 for used/total) and with a bar that is green when you have plenty of free Story Points left and shifts from yellow to red as you fill up the Sprint.
- Capacity Planning: Being able to calculate the capacity for each Sprint by taking team members' availability into consideration is key for less cross-functional teams. You should be able to set each member's individual capacity per Sprint and include days off and holidays.
- Velocity Averaging: Mike Cohn prescribes taking the average of the three highest velocities from the last six months and the average of the three lowest velocities for the same period, and using them as a range for Release Planning. It would be great if your management tool could do that too!
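If your tool does not do this for you, the calculation itself is simple enough to script. Here is a rough Python sketch of the averaging described above (the function name and the sample velocities are invented for illustration, not from any actual tool):

```python
def velocity_range(velocities, n=3):
    """Average the n lowest and n highest recent velocities to get a
    (low, high) range for Release Planning."""
    if len(velocities) < n:
        raise ValueError("need at least n Sprints of history")
    ordered = sorted(velocities)
    low = sum(ordered[:n]) / float(n)
    high = sum(ordered[-n:]) / float(n)
    return low, high

# Say the last six Sprints completed these Story Points:
low, high = velocity_range([14, 20, 17, 23, 16, 19])
# low averages the three slowest Sprints (14, 16, 17),
# high averages the three fastest (19, 20, 23)
```

You would then plan Releases against that range rather than a single number, which keeps the forecast honest.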
Sorry I left this off from my original post.
Also, plenty of people have been telling me about tools I missed. I apologize, but the intent of my post was not to list all the options but rather to concentrate on the desired features. My list at the end of the post was just some of the top vendors according to various surveys I have seen. Below is a chart from the VersionOne 2008 survey. Please feel free to keep posting comments on other tools, but do not be offended if I did not mention your weapon of choice.
Sunday, May 03, 2009
Choosing the Right Scrum Management Tool
I constantly see postings on various Agile forums asking what the best tool to use is. There are even several blog posts out there outlining the pros and cons of the various packages currently on the market. I find the choice of which tool to use is very contextual, and while you do need to do your research on all the different vendors and offerings, you should really get a good idea of what you want from a tool before you go out looking for one.
For instance: If you are a smaller shop that has been previously using the note card and sticky note approach, you would probably not want to jump into a larger, all encompassing tool at first. A larger shop that is moving from a much more structured approach would tend to want something with more features to give them some of the same abilities they had with their previous toolset. Also, various people (Product Owners, Scrum Masters, developers, testers, stakeholders, etc.) have different expectations of how they will interact with these tools and what they will get out of them.
So I thought I would talk about what functions you would perform in these tools and what to look for while evaluating them:
Product Backlog Maintenance
Argue with me all you want, but managing the Product Backlog is one of the more important jobs in Scrum. If you have a crappy Product Backlog, you will get a crappy product no matter how well the actual practices in the Sprint are performed. So what should we look for in this area?
- User Story Support: User stories are basically the standard way to capture requirements in the Agile world, and any tool claiming to be an Agile management tool should allow us to easily record and manage them. This means containing areas for the user story, conditions of acceptance (or How To Demo), acceptance tests, story points, business value, and delivery order at a minimum.
- Other Product Backlog Types: While I just stated that User Stories are the primary mechanism for capturing requirements, I do think there should be support for capturing things like technical requirements in the Product Backlog too. Even if I record these non-functional requirements in the same User Story format, I like to have a way to define it as a technical requirement for reporting purposes. It also helps with testing since these types of requirements sometimes do not change the user interaction so there would be no new user acceptance tests, but they often require extensive regression testing of existing functionality. Having a way to separate these types of requirements is helpful to the testers to know what will require new tests to be written.
- Batch Entry/Editing: From experience I know that a tool that requires me to open various windows to enter or edit multiple Product Backlog items makes planning meetings a laborious process. The best support I have seen for this functionality is the ability to export to and import from Excel.
- Epic/Theme/Feature Hierarchy Support: When planning a large-scale project it is important to see the smaller user stories in their larger context by grouping them into feature sets, which relate to a theme, which originates from an Epic. I’ve heard management complain that Product Backlogs full of small User Stories are too tactical and that they want to see things at a strategic level. By rolling the smaller stories up to these various levels you provide that type of view to them.
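To make the rollup idea concrete, here is a tiny Python sketch (the backlog contents and the dictionary structure are invented for illustration; a real tool would store this hierarchy in its database):

```python
# Hypothetical backlog hierarchy: an Epic contains Themes, and Themes
# contain User Stories with Story Point estimates.
backlog = {
    "Epic: Self-Service Portal": {
        "Theme: Account Management": {"Change password": 3, "Update profile": 5},
        "Theme: Billing": {"View invoices": 8, "Pay online": 13},
    }
}

def rollup(node):
    """Sum Story Points for a story (a leaf int) or any grouping level."""
    if isinstance(node, int):
        return node
    return sum(rollup(child) for child in node.values())

for epic, themes in backlog.items():
    print(epic, rollup(themes))  # epic total = 3 + 5 + 8 + 13 = 29
```

Management gets the strategic number (29 points for the Epic) while the team still works from the individual stories.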
Sprint Backlog Maintenance
Transparency is a key element of Scrum and keeping up with the tasks in the Sprint Backlog accurately is one of the keys to providing that visibility. While the Product Backlog is managed by a single person normally (the Product Owner) and is not usually touched every day, the Sprint Backlog is managed by the entire Scrum team and should be updated daily (if not several times a day). This makes the nuances to interacting with the Sprint Backlog a little different. Here’s what I look for:
- Easy Access to ‘My’ Tasks: Since we want the team to update their tasks frequently, I need to make it as easy as possible for them to view their open tasks, update a task, and grab another unassigned task. While Virtual Task Boards are great for this in the Daily Scrum, I just want to talk about editing the Sprint Backlog items during the rest of the day. The best way to ensure this is to integrate with the tool I use most often, which for a .NET developer like me is Visual Studio. But we cannot forget other team members, like testers, who may or may not be using the same tool.
- Batch Entry/Editing: Once again from experience, when you are adding multiple Sprint Backlog items in a Sprint Planning meeting, having to do it one at a time in a new window for each one is a time suck. Excel is once again a great way to do this so look for export and import from the Sprint Backlog as well as the Product Backlog.
- Custom Views: As a Scrum Master you will frequently be looking at the Sprint Backlog for various reasons. Because of this you may want to sort or filter the view differently. Having the ability to create your own views of the Sprint Backlog (and this goes for the Product Backlog too) is essential. For example I use a view that shows me what Sprint Backlog tasks were edited today, yesterday, and three days ago (for Mondays to see what was edited on Friday). Scrum team members can also use this for the ‘My tasks’ functionality I mentioned above.
- Product Backlog Linking: Just as a User Story links back to a feature, theme, and/or epic, each Sprint Backlog item should link back to an item in the Product Backlog. This is essential for reporting purposes. On the opposite side of that rule there will be unplanned work that may not tie back to the Product Backlog and I should be able to easily identify these orphaned tasks.
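The ‘edited recently’ custom view I described above is easy to express in code. Here is a hedged Python sketch of that filter (the task fields are hypothetical, not any tool's actual schema):

```python
from datetime import date, timedelta

def recently_edited(tasks, today=None):
    """Return tasks edited today, yesterday, or three days ago.
    The three-day check is what picks up Friday's edits on a Monday."""
    today = today or date.today()
    window = {today, today - timedelta(days=1), today - timedelta(days=3)}
    return [t for t in tasks if t["last_edited"] in window]

tasks = [
    {"id": 1, "title": "Write acceptance tests", "last_edited": date(2009, 5, 1)},
    {"id": 2, "title": "Fix build script", "last_edited": date(2009, 4, 29)},
]
# Viewed on Monday, May 4th 2009, only the task edited Friday shows up:
monday = date(2009, 5, 4)
print([t["id"] for t in recently_edited(tasks, monday)])
```

A good tool lets you save a view like this once and reuse it, rather than rebuilding the filter every morning.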
Information Radiators
Alistair Cockburn coined the term ‘Information Radiator’, which he described as a type of display that relays critical and frequently updated project information. Furthermore, the display must relay the information in a way that requires very little effort from the viewer to gauge its intent. In other words, I should be able to walk by and, with a glance, get an idea of the status of some facet of the project. These are normally displayed in the open where the entire team can see them, but with distributed teams that becomes difficult. Here are some of the features and specific types of Information Radiators to look for:
Types of Information Radiators
- Sprint Burndown Chart: The most useful and common of the Information Radiators is a must have in any tool. Some additional features I have seen in some tools that I liked:
Track Done Burndown: Most burndowns are based on “work hours remaining,” which sometimes does not tell the whole story. If you have only 20 hours left across 4 Product Backlog items it would look like you are doing well, but if there are 5 hours remaining on each item and none of them are done, that can be misleading. A track done report simply shows the number of Product Backlog items that have been marked Done versus Not Done (and sometimes it will show In Progress ones differently).
Capacity Trend Line: This is the line that shows you where you will end up based on your team’s current capacity, i.e. the number of hours they have left to work. It is a good indicator of whether you have bitten off more than you can chew from the start.
Moving Average Trend Line: This one always threw me for a loop until I understood what it was supposed to show. There are several flavors of this (the most complicated being linear regression), but what it boils down to is this: based on the average rate of burndown (positive or negative) over a set portion of history, if you continue at that rate the line shows where you will be at the end of the Sprint. So if you are entering more Sprint Backlog hours at the beginning of the Sprint than you are burning, this line will slope up, because if you kept adding more hours than you took away you would never reach 0 hours. It usually looks this way at the beginning, but once we really get cooking and have stopped adding new tasks, the trend line tips over and we get an idea, based on the average of our last few days, of where we would wind up if we kept that average up. If that does not make sense then read this, which will probably make even less sense!
- Product Backlog Burndown Chart: If you have multiple projects each with their own Product Backlog, then the Product Backlog Burndown is a good indicator of how close you are to finishing the entire project. It shows the number of Product Backlog items remaining either over time or by Sprint. If you have a companywide Product Backlog then this has less immediate relevance since if it ever went to 0 then you either reached perfection or went out of business!
- Hours/Tasks Breakdown by Team Member: There are several versions of these types of reports, but they basically show each team member and how many Sprint Backlog items (or tasks) they have closed, are currently working on, and the hours remaining. It is a quick snapshot to see if someone is falling behind. This should be something you naturally discover during the Daily Scrum if you are doing it right, but with larger teams it may not be as obvious.
- Bug/Fix Trends: Management loves metrics about bugs: How many bugs did we fix last sprint? How fast are we fixing bugs? What is the breakdown of bugs by testing/user impact? Having some nice graphs and pie charts to show the answers to these questions in an easy to view display makes management happy. It can also help a team quickly identify if they have a code quality issue.
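The moving average trend line I tried to describe above is easier to see in code. Here is a minimal Python sketch under some assumptions of my own: a three-day averaging window and made-up hours, where real tools may use fancier math like linear regression:

```python
def moving_average_projection(hours_remaining, sprint_days, k=3):
    """Project hours remaining at Sprint end from the average daily
    change over the last k observed days. One entry per day, starting
    with the Sprint's opening total."""
    if len(hours_remaining) < k + 1:
        raise ValueError("need at least k+1 daily data points")
    recent = hours_remaining[-(k + 1):]
    avg_burn = (recent[-1] - recent[0]) / float(k)  # change per day
    days_left = sprint_days - (len(hours_remaining) - 1)
    return hours_remaining[-1] + avg_burn * days_left

# Early in a 10-day Sprint, tasks are still being added faster than
# they are burned, so the projection slopes UP (140 > 112):
print(moving_average_projection([100, 104, 110, 112], 10))
# Once the team gets cooking, the same math projects downward:
print(moving_average_projection([112, 100, 88, 76], 10))
```

The first call projects 140 hours remaining at Sprint end (you would never finish), the second projects hitting zero early, which is exactly the tipping-over behavior the chart shows.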
Display Options for Information Radiators
As I said before, it is important that Information Radiators be as visible as possible to the entire team. So there are other factors to consider from a tool in this area:
- Multiple Means of Access: If the tool only lets you look at these charts and reports through the tool itself, you will have limited options when trying to display them for your entire team to see or to offer management a way to view them on demand. Most tools provide these reports via the web, which works for most situations, but the better tools I have seen offer them through something like SQL Server Reporting Services, which has many delivery options including web, email, and exports to Word/PDF/Excel. This is especially crucial for giving a distributed team access to this information.
- Rotating Views: Frequently I see companies putting large LCD screens in various places around their buildings to convey information to their employees. This is a great idea for Information Radiators also. A good tool will allow me to pick and choose different reports to be displayed on a rotating basis on an external monitor, making it a very effective way to show everyone the status of our projects.
Customization
One of the reasons the Agile world tends to stay away from management tools is that they may force us to adopt the tool's process over the process that is best for our team. So for a Scrum management tool to be effective, it must allow us to customize it to fit our needs. Here are some common customizations:
- Ability to Edit Backlog Fields: There have been many times when implementing Scrum for a client that they have needed to record additional information in the Product and Sprint backlogs. This would require the management tool to allow us to add, edit, and remove fields for backlog items.
- Ability to Edit Backlog Workflow: We are used to the standard ‘Not Done’, ‘In Progress’, and ‘Done’ type workflows for Product and Sprint Backlogs, but often teams will want to modify these for things like ‘Ready For Test’, ‘Deferred’, ‘Code Reviewed’, etc. Not only is it nice to be able to change the states, but changing the workflow rules is a nice feature along with it.
- Ad Hoc Reporting: When you start using a management tool with Scrum, a ton of data can be captured. You can easily start to track trends like how fast the team closes an ‘In Progress’ task, how often new tasks are added in the middle of the Sprint, etc. This is very useful information, and the ability to pull that data out of the tool and report on it can give your team much more insight into how it is performing.
Tool Integration
Managing backlogs is only one part of the Scrum process. During the Sprint there are tons of activities going on that may use other tools, such as unit testing, source control, feature testing, automated builds, etc. The ability to integrate with these tools can add even more power to your management tool in a few ways:
- Linking Source Code to Backlog Items: When you add code to your source control repository, the ability to link it to the appropriate Sprint Backlog task (and through that the corresponding Product Backlog item) gives you even more reporting options. You can quantify the exact code changes that took place for any given requirement.
- Automated Work Item Creation: A common example of this is creating a Sprint Backlog task after a broken build. If you have set up a continuous integration tool that builds whenever code is checked in or on a schedule, and that build breaks, you can have a Sprint Backlog task automatically created to fix the build. You may already have internal software tools for tracking bugs or high-level requirements, and having some type of synchronization between those tools and your Scrum management tool can help with adoption, since it builds upon tools your company already has an investment in.
- Linking Tests to Backlog Items: Just as with source code, the ability to link unit tests or automated feature tests to Sprint/Product Backlog items helps with things like traceability matrices. If management asks about the impact of changing a piece of functionality, you can make the change locally and then run all the linked tests to see what breaks, or to ensure nothing does. It will also help identify which tests to run for regression testing when changes are made to existing code.
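The broken-build example can be sketched in a few lines of Python. Everything here is hypothetical: the build fields and the create_task callback stand in for whatever API your CI server and Scrum management tool actually expose:

```python
def on_build_completed(build, create_task):
    """Hypothetical CI hook: when a build breaks, automatically add a
    'fix the build' task to the Sprint Backlog. create_task stands in
    for the management tool's real work-item API."""
    if build["status"] != "failed":
        return None  # nothing to do for a good build
    return create_task(
        title="Fix broken build %s" % build["number"],
        description="Triggered by check-in %s" % build["changeset"],
        unplanned=True,  # flag it as unplanned work for reporting
    )

# Wiring it up with a fake create_task to show the shape of the result:
created = []
def fake_create_task(**fields):
    created.append(fields)
    return fields

on_build_completed({"status": "failed", "number": 42, "changeset": "C1234"},
                   fake_create_task)
print(created[0]["title"])  # Fix broken build 42
```

Flagging the task as unplanned work also feeds back into the orphaned-task reporting I mentioned under Sprint Backlog Maintenance.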
Making the Choice
The functionality above covers the biggest feature sets I can think of, but there are probably others. While most of the mainstream tools out there cover the basics fairly well, the other functionality mentioned above could be the deciding factor over one or the other.
My advice is to think about the most important set of features for your team and then examine each of your choices to see which ones fit your criteria best. To help get you started, below are links to some of the top tools on the market. Happy hunting!
Tuesday, April 28, 2009
Brian Harry Presenting on Team System 2010 on Live Meeting
Brian Harry, a Product Unit Manager for TFS from Microsoft, will be presenting on the new features of TFS 2010 (codename Rosario) and what benefits it will bring to software development and project management. You can read more about Brian on his blog.
When? Thursday, April 30th, 2009
What time? 3:30 to 4:30pm EST
Who should attend? Technical Decision Makers, Lead Developers, and Software Architects who want to see what new innovations are coming from Microsoft and the impact they will make on business
Sunday, April 19, 2009
The Importance of Failure
I find that people who claim to have never failed usually fit one of these scenarios:
They are really that good. It is possible that this person in their vast experience has made all the right choices and risen above all external factors to truly have succeeded in every project they have undertaken. Warning: I find that highly unlikely.
They haven't done much. If you have been on one project that succeeded, you would technically be batting 1.000. But who would you want on your team: the guy who hit a home run in his single at-bat, or the guy who has batted .366 over 1,000 at-bats?
Your exposure to the project's success was limited. When you are a developer you tend to think that if you deliver decent code on time then you have done your part. And if your team delivers on time and on budget, you can count the project as a success. But there are many other factors for a project to be considered successful that project teams are sometimes shielded from. I was on a project where my team delivered quality code, on time, within budget, and afterwards the client shelved the entire package and never used it.
Your projects have been small or less critical in nature. I once met someone who really had been on several successful projects. But upon further investigation, he worked in a large company on applications that were not mission critical, and his projects had laxer success factors. Although he was technically a solid developer, he was very unprepared for more serious endeavors.
So now you think I want to hire people who fail all the time, right? Not exactly. Consistent failure is not good, but small, intermittent failures provide you with opportunities to learn in a way that success does not. When I hear someone say they have never failed, I worry that they most likely will not see it coming when they do. And the laws of probability dictate that it will inevitably happen.
Learning from Failure
So if (and I really mean when) you fail, it can never be a total loss if you learn from it. I speak from my own experience.
I was brought onto a project for a Fortune 100 company specifically because I had been successful before with implementing new technologies for Sales Force Automation. And at the time, in my limited view of what I deemed a successful project, I had always been on the winning side.
I was the Architect/Team Lead for the team and was also given a good deal of influence in other areas of the project. I have to admit that it fed my ego, and I started to think our chances for success were bulletproof. But the project failed, and even though there were many contributing factors, a decent amount of the blame rested on my shoulders. One of the worst things about it was that even when things started to go bad, I ignored the signs and thought that through pure force of will I could pull it out in the end.
It was a very humbling experience, and in the months immediately afterwards I spent quite a bit of time reviewing what had gone wrong. There were some obvious technical missteps, but other factors had more impact: requirements, project management, team management, customer expectations, and the intangible political environment. I had been so focused on fixing the technical issues that I (and other team members) failed to address these other issues. Even after the initial meltdown, when all the technical issues were fixed, the other factors were such that the application never saw the light of day.
The insight I gained from that failure makes a good deal of what I have learned from my successes pale in comparison. I took those lessons and applied them to every project after that, determined not to repeat the same mistakes. I am happy to say I have been fairly successful at that, and I have had to work hard at finding new and more inventive ways to fail! I have had defining moments in subsequent projects where I have seen similar warning signs arise and have been able to try a new approach (some successful, some not so much) to mitigate those kinds of issues. And each time I find a successful way of turning an old issue around, I feel I become that much stronger a developer. And when I am asked about my past failures, I dutifully speak of them and convey how each one taught me a lesson so that I won't be caught unawares again.
So now one of my initial interview questions for new candidates is "Tell me about a past failure." If they respond that they have never failed, then they had better really knock my socks off for the rest of the interview to have even a ghost of a chance. Likewise, if they tell me about a failure and do not follow it up with what they learned from it, then in my eyes they just had their first failure.
Wednesday, April 01, 2009
Team System MVP
I was very stoked to get my confirmation email for my Team System MVP today. I was a little worried it might be an April Fools' joke, but apparently it is the real deal. I am lucky to be in a group with such talented people and really look forward to getting even more involved in the VSTS community this next year. Congrats to all my fellow MVPs announced today!
Tuesday, March 31, 2009
My Latest Favorite Feature of VSTS 2010
Thursday, March 26, 2009
New Product Backlog Estimation History Report for Conchango Scrum TFS Template
Paul's team is just starting with Scrum, and we have had many conversations about estimating User Stories in the Product Backlog using Story Points. Most of the groups I coach have to get used to the idea of estimating this way. I always tell them that over time their team will become more accurate at gauging the story point value of individual stories. We talked about being able to look back over time at how many hours were actually worked against individual stories, and using that data to get an idea of how much variation there is in the actual hours worked across multiple stories assigned the same story point value.
Paul created a great report we are calling "Product Backlog Estimation History". You choose a story point value and a date range, and the report shows you a graph of how many Product Backlog Items had each amount of actual hours worked.
Over time you can see trends in how accurately your team is estimating with story points by watching the variation in hours actually worked. If you see a large variation, you might want to look at it during a retrospective to understand why the estimates are not more accurate. Some degree of variation is natural, but large amounts can be a symptom of a deeper issue.
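The idea behind the report can be sketched in a few lines of code. This is not the actual report (which queries the TFS warehouse); it is a minimal illustration, with made-up `(story_points, actual_hours)` data, of how you might measure the spread of actual hours for items that were all given the same story point estimate:

```python
from statistics import mean, stdev

# Hypothetical (story_points, actual_hours) pairs from completed
# Product Backlog Items -- in practice these would come from a TFS query.
completed_items = [
    (3, 10), (3, 14), (3, 12),
    (5, 18), (5, 40), (5, 22),
]

def hours_by_story_points(items):
    """Group actual hours worked by story-point estimate."""
    groups = {}
    for points, hours in items:
        groups.setdefault(points, []).append(hours)
    return groups

def estimation_spread(items):
    """For each story-point value, return (mean hours, standard deviation).

    A large deviation relative to the mean suggests the team's estimates
    for that point value are inconsistent -- worth raising in a retrospective.
    """
    return {
        points: (mean(hours), stdev(hours))
        for points, hours in hours_by_story_points(items).items()
        if len(hours) >= 2  # stdev needs at least two samples
    }

for points, (avg, sd) in sorted(estimation_spread(completed_items).items()):
    print(f"{points} pts: mean {avg:.1f}h, stdev {sd:.1f}h")
```

With the sample data above, the 3-point items cluster tightly while the 5-point items spread widely, which is exactly the kind of signal the report surfaces as a graph instead of numbers.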