Performance Insights using GitLab?

Godwin Pinto
4 min read · Mar 22, 2023


In the world of big data, how can organizations use GitLab / Jira / GitHub data to measure the performance (hard skills) of development employees, rather than relying on a traditional task-tracking / SLA measurement system?

Image by jcomp on Freepik

Recommended: an overview-level familiarity with Git management tools (GitLab/Jira/GitHub) is suggested.

Productivity tracking is an important aspect of assessing performance for dev teams. Tools like Employee Monitoring, Time Doctor, Time Tracker, etc. exist, but they come at a cost. I also personally find them too rigid, and it is unsettling to have someone closely watching my minute-by-minute status.

Therefore, in most SMEs, productivity tracking for SDEs is done with independent (non-integrated) performance management tools, where inputs are fed in manually from memory and according to the manager's priorities.

Most tech companies have already adopted Git and issue management tools like GitLab, Jira, or GitHub. This article focuses on how restructuring your GitLab labels can produce meaningful performance insights for your dev teams, at no extra cost and in a way that is less intimidating for developers.

The objectives can be achieved by adopting the proposed methodology and restructuring your existing labels.

Proposed Methodology:

  1. All tasks for the day are recorded in GitLab (at the latest by end of day or the next day). This works like a clock-in system with freedom, since GitLab is already in use.
  2. The 8 hours of daily work are accounted for as separate tasks (either against assigned tasks, or by creating new ones if they were self-initiated).
  3. Every task carries a mandatory “work” label and “state” label. Examples follow below.

Note on handling repetitive tasks (for simplicity): a recurring 1-hour daily meeting can be created as a single task, with an estimate covering the week (i.e. 5 hours for 5 days) and the 1 hour of time spent logged against that same task each day.
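
GitLab's built-in /estimate and /spend quick actions cover this pattern directly. On the recurring task, set the weekly estimate once in the description or a comment:

```
/estimate 5h
```

Then, each day the meeting takes place, add a comment on the same task:

```
/spend 1h
```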

How to use Prefix Labels

Work label prefix:
A mandatory prefix that defines which area the time was spent on. For example, an enhancement task would be renamed from “enhancement” to “work: enhancement”, indicating that a new feature is to be developed. As a rule, only one work label is assigned to a task. Other labels that could fall under the work prefix include the following (a sketch for creating them in bulk via the API follows the list):

  1. bugFix: A bug has been identified and needs to be fixed
  2. enhancement: A new feature development
  3. analysis: Investigation efforts
  4. available: No specific work has been assigned
  5. call: Unplanned work calls
  6. discussion: Group discussion within the internal team
  7. documentation: Any documentation-related work
  8. devOps: Maintenance-related work
  9. interview: Time spent conducting interviews
  10. knowledgeTransfer: Knowledge transfer activities
  11. learning: Assigned tasks to learn a topic, a tech language, etc.
  12. meeting: Client meetings
  13. release: Getting production sources ready
  14. suggestion: Time spent thinking of improvements on any task
  15. support: Supporting or assisting with any activities
  16. deployment: Any deployment-related activities
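
As a minimal sketch (not the tooling described in this article), the prefixed labels could be created in bulk via GitLab's REST Labels API. The instance URL, project ID, token, and colour below are placeholders to replace with your own values:

```python
# Sketch: create "work:" prefix labels in bulk via the GitLab Labels API.
import requests

GITLAB_URL = "https://gitlab.example.com"      # your GitLab instance (placeholder)
PROJECT_ID = 123                               # hypothetical project ID
HEADERS = {"PRIVATE-TOKEN": "<your-access-token>"}

WORK_LABELS = [
    "bugFix", "enhancement", "analysis", "available", "call", "discussion",
    "documentation", "devOps", "interview", "knowledgeTransfer", "learning",
    "meeting", "release", "suggestion", "support", "deployment",
]

for name in WORK_LABELS:
    resp = requests.post(
        f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/labels",
        headers=HEADERS,
        data={"name": f"work: {name}", "color": "#428BCA"},  # colour is arbitrary
    )
    resp.raise_for_status()
```

The same loop, with a different prefix and label set, would cover the state labels described next.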

State prefix:
A mandatory prefix label that defines the status of the work. For example, “state: doing” indicates that the task is in progress. As a rule, only one state label can be assigned to a task. Labels that may fall under this prefix include the following (a sketch for checking label usage via the API follows the list):

  1. cannotFix: Task is not resolvable; this is a dead end
  2. doing: Task is in progress
  3. notResolved: A task that was declared readyForTesting is not resolved, or has been reopened
  4. onHold: Task has been temporarily suspended
  5. parked: Task has been suspended indefinitely
  6. readyForTesting: Task has been assigned to the verification team
  7. scrapped: Task has been called off
  8. signedOff: Task has been closed by the verification team
  9. toDo: Task is in the bucket list
  10. unitTestPending: Task is pending self-verification
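
Since the methodology relies on every task carrying exactly one work label and one state label, a small check against the Issues API can flag tasks that break the rule. This is a sketch of an assumed housekeeping script, not a GitLab feature; the URL, project ID, and token are placeholders:

```python
# Sketch: flag issues that do not carry exactly one "work:" and one "state:" label.
import requests

GITLAB_URL = "https://gitlab.example.com"      # your GitLab instance (placeholder)
PROJECT_ID = 123                               # hypothetical project ID
HEADERS = {"PRIVATE-TOKEN": "<your-access-token>"}

resp = requests.get(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/issues",
    headers=HEADERS,
    params={"per_page": 100},                  # paginate further for larger projects
)
resp.raise_for_status()

for issue in resp.json():
    labels = issue["labels"]
    work = [l for l in labels if l.startswith("work:")]
    state = [l for l in labels if l.startswith("state:")]
    if len(work) != 1 or len(state) != 1:
        print(f"Issue #{issue['iid']}: expected one work and one state label, got {labels}")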

These are just indicative labels; you can create or remove them as per your organization's needs and segregation.

The work / state prefixes help you understand:

  1. Whether an individual's efforts are being allocated to the right area
  2. Whether work has been allocated or not (a manager's management skill)
  3. Non-billed efforts
  4. Where the project process needs restructuring

Further, with other GitLab features like time estimates, time spent, the shrug quick action, and emoji reactions, additional summaries can be derived, such as (a sketch of one such summary follows the list):

  1. Ratio of on-time vs delayed deliverables
  2. Quality of deliverables
  3. Time spent in helping others or help needed
  4. Exceptional deliverables (beyond on-time)
  5. Cost overruns on the project
  6. Overworked / overloaded employee
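
As an illustration of the first item, here is a minimal sketch that derives the on-time vs. delayed ratio for closed issues from GitLab's time tracking fields. Only issues that actually carry an estimate are counted; the "spent ≤ estimated means on-time" rule and the filters are assumptions to adapt:

```python
# Sketch: on-time vs. delayed ratio from GitLab time tracking data.
import requests

GITLAB_URL = "https://gitlab.example.com"      # your GitLab instance (placeholder)
PROJECT_ID = 123                               # hypothetical project ID
HEADERS = {"PRIVATE-TOKEN": "<your-access-token>"}

resp = requests.get(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/issues",
    headers=HEADERS,
    params={"state": "closed", "per_page": 100},
)
resp.raise_for_status()

on_time = delayed = 0
for issue in resp.json():
    stats = issue["time_stats"]                # time_estimate / total_time_spent in seconds
    if stats["time_estimate"]:                 # skip issues with no estimate
        if stats["total_time_spent"] <= stats["time_estimate"]:
            on_time += 1
        else:
            delayed += 1

total = on_time + delayed
if total:
    print(f"On-time: {on_time}/{total} ({100 * on_time / total:.0f}%)")
```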

Once you have adopted the process and labelling described above, the next step is to extract the data from GitLab, which can be done with an open-source utility like https://github.com/godwinpinto/gitlab-etl (developed by me 😄). With appropriate queries, you can then surface these results in BI / reporting tools and push them to your performance tools.
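
To give a flavour of the kind of summary that comes out downstream, here is a sketch that totals hours per "work:" label from a flat extract. The export file and its column names (label, time_spent_hours) are hypothetical; adapt them to whatever schema your extraction produces:

```python
# Sketch: total time spent per "work:" label from a hypothetical flat extract.
import pandas as pd

extract = pd.read_csv("gitlab_issues_extract.csv")   # hypothetical export file

work_time = (
    extract[extract["label"].str.startswith("work:")]
    .groupby("label")["time_spent_hours"]
    .sum()
    .sort_values(ascending=False)
)
print(work_time)
```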

Reference queries for developers to derive some of the insights mentioned above will follow in a subsequent article.

Now you not only have performance insights from a single tool, but the data is also “continuous”, or ongoing. If you come from the HR (Human Resources) domain, you will be familiar with the benefits of feeding continuous feedback data into performance management reviews.

Isn't this a better approach to evaluating / reviewing hard skills: actual work data rather than recency-biased, non-systematic data points?



Written by Godwin Pinto

Principal engineer by profession | Business software application ideation and development enthusiast
