
Using Peer Code Reviews as a Professional Development Tool


Over the years, I have had the privilege of working with hundreds of developers.  Overall, I have found them to be intelligent, witty, focused, incredibly funny… and also pretty isolated.

Developers are most effective when they have long, uninterrupted periods of time to concentrate on their work. The most common distractions come from other people: people wanting to talk, people walking by, people causing a ruckus in the hallway.

Consequently, most developers isolate themselves, either by working odd hours or by wearing hoodies and headphones all day (with or without music playing). Although this isolation helps cut down on unnecessary interruptions, it also cuts developers off from the collaboration and professional development opportunities that other departments typically enjoy.

Rather than leaving individuals to struggle alone, workplace collaboration ensures that more experienced team members can help out less experienced colleagues in a positive way. One way developers can gain this kind of collaboration, without sacrificing focus, is by implementing peer code reviews.

Beyond fixing defects, a code review spreads important knowledge across the team, such as which patterns make sense and how features work, as a natural part of the development process. By learning from other members of the team, developers improve their individual skills, and the team as a whole becomes more efficient, consistent, and interconnected, creating a better (and more fun) working environment.

To implement an effective peer code review process, you need the right tools. The nice thing about using a tool like Collaborator is that it automatically tracks each review and creates a record of it, so the process can be verified and others can learn from your review. Bottom line: Collaborator is designed to make the review process an easy and effective way to learn from your peers. Here’s how it works.

First, the developer who wrote the code, also known as the author, opens the Collaborator Client and specifies the files or changelists they want to have reviewed. The Collaborator Web UI automatically opens and creates a new review. The Review Summary page lets everyone involved see the name of the review and any explanatory information the author wants to include.
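If your team prefers to script this step, Collaborator also ships a command-line client (ccollab) that can create a review without opening the GUI. The sketch below is a rough illustration rather than an official recipe: the subcommands shown (login, addchangelist, addfiles) reflect common ccollab usage and should be checked against your Collaborator version, and the server URL, username, changelist number, and file paths are all placeholders.

```python
# A minimal sketch: driving the Collaborator command-line client (ccollab)
# from Python. Subcommand names and arguments are assumptions based on
# typical ccollab usage; verify them against your version's documentation.
import subprocess

def ccollab(*args):
    """Run a ccollab command, raising an error if it exits non-zero."""
    subprocess.run(["ccollab", *args], check=True)

# Point the client at your Collaborator server (placeholder URL and user);
# depending on configuration it may prompt for a password.
ccollab("login", "https://collaborator.example.com", "jane.author")

# Option 1: create a new review from a pending changelist (placeholder ID),
# for SCMs that track changelists, such as Perforce.
ccollab("addchangelist", "new", "12345")

# Option 2: attach individual local files to a separate new review instead.
ccollab("addfiles", "new", "src/billing/invoice.py", "tests/test_invoice.py")
```

Either way, the new review should then appear in the Web UI just as described above, so the rest of the workflow is unchanged.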

Now the author can attach custom information to the review. For example, you can customize fields to show what product you are working on or what department the code is from. The Web portal also lets you customize the defect names. At SmartBear we call them ladybugs, but our clients call them bugs, defects, work items, inquiries, things that keep me up at night, etc. You can call them whatever makes the most sense for your organization.
