Marketplace Musings: How to Drive Effective Technology Pilots
We’re constantly exploring the latest trends in the staffing and recruiting industry, but one has so far gone unnoticed: technology leaders’ infatuation with pilots. And I’m not referring to the kind folks who shuttle me to and from Bullhorn headquarters—I’m referring to the 30- to 90-day trial period on new software, or technology pilots.
Pilot studies actually began in the social sciences as a way to test whether a study design would produce meaningful results before additional funding was committed. Later, companies began applying the same idea to new software, and as agile methodologies were increasingly adopted, the concepts of Minimum Viable Product and trial became synonymous across many organizations. Pilots are growing in both number and duration, and there are three steps companies should take before piloting any new software.
Define Success Metrics
Your first priority should be to define what success looks like when using this product. If success means bringing in more candidates, look at how many more candidates it brought in than a control group of similar demographics. If success means an increase in redeployment, test that your redeployment rate actually went up, even if you have to wait to see that result for a month or two. You may see additional success along the way, but stick to what that software should be doing and do not let ancillary results move you off the primary metric.
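To make the control-group comparison concrete, here is a minimal sketch in Python. All of the numbers and the per-recruiter framing are invented for illustration; the point is simply to measure the pilot group against a control group of similar makeup rather than in isolation.

```python
# Hypothetical sketch: compare candidate volume for the pilot group against
# a control group of similar demographics. All figures below are made up.

def candidate_lift(pilot_counts, control_counts):
    """Return the percent lift in average candidates sourced per recruiter."""
    pilot_avg = sum(pilot_counts) / len(pilot_counts)
    control_avg = sum(control_counts) / len(control_counts)
    return (pilot_avg - control_avg) / control_avg * 100

# Candidates sourced per recruiter over the trial period (invented data)
pilot = [42, 38, 45, 40]
control = [35, 33, 37, 36]
print(f"Lift vs. control: {candidate_lift(pilot, control):.1f}%")
```

If the lift doesn’t clear the bar you agreed on before the pilot, that is your answer, regardless of how much the team liked the tool.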
Work with the vendor to set a timeline for when they believe you should see positive results, and hold firm to that deadline. If the results aren’t there, it’s most likely not the right fit. It shouldn’t be enough that your reps and other users liked the tool—they always like the tool, because they want their job to be easier. Make sure all stakeholders agree, before the technology pilot begins, on what the software is supposed to do and what outcome counts as success. If you can’t agree, hold off until you can nail down the goals. Use your existing KPIs to project what an improvement in that area would do for ROI—simple 3, 5, and 10 percent increases can serve as guidelines.
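The 3/5/10 percent guideline is easy to turn into a back-of-the-envelope projection. The sketch below is illustrative only: the placement volume, average fee, and software cost are hypothetical assumptions, not figures from this article. Plug in your own KPIs.

```python
# Illustrative ROI sketch. All inputs (placements, fee, cost) are
# hypothetical assumptions; substitute your own KPI figures.

def projected_roi(annual_placements, avg_fee, uplift_pct, annual_software_cost):
    """Project yearly net ROI if the tool lifts placements by uplift_pct percent."""
    added_revenue = annual_placements * (uplift_pct / 100) * avg_fee
    return added_revenue - annual_software_cost

for pct in (3, 5, 10):  # the simple 3/5/10% guideline
    roi = projected_roi(annual_placements=400, avg_fee=5000,
                        uplift_pct=pct, annual_software_cost=30000)
    print(f"{pct}% uplift -> projected net ROI: ${roi:,.0f}")
```

Even a rough table like this, agreed on up front, gives everyone a shared definition of what “worth it” means when the pilot ends.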
Understand Where the Tool Fits
Where does this software fit in your current recruiting process, and which audiences will be most affected? Is it going to replace something that doesn’t work as well, or will it be an additional step people have to learn? Solid operational understanding leads to quality technology pilots and testing. Know where the tool fits first, and find the part of your process that has a bottleneck. A well-designed process is how you run efficiently, so make sure not to disrupt the parts that are working well.
Plan the Rollout and Track Adoption
Whenever you roll out anything new, you need to have training planned, usage tracking in place, and regular follow-ups scheduled over the next quarter. This will help ensure the new process and the success you planned for are not only a best practice but have been fully adopted. Change management will be the most difficult part of this process, because people are naturally averse to change. Make sure your new software is being used as intended, so the results from your trial can be scaled when you release the tool to your wider audience.
Many of you may be stuck in technology pilots that feel like Groundhog Day, but you can stop that never-ending cycle and move to a place where you’re more comfortable saying no. By having results to point to, you can stop unnecessary pilots from moving forward until they match up with your process and needs. With the success criteria in place, along with buy-in on the definition of that success, explaining to your team and the vendor why you’re deciding not to move forward with the technology will be a much easier conversation. Your life will be better off as a result: you’ll have fewer pilots to roll out to your team, but a greater impact from the technology you do choose to move forward with.