As Covid-19 rampages across the globe and is dubbed by the IMF as “way worse” than the global financial crisis for some parts of the world, there is no question that businesses across almost all industries are feeling the heat.
This increased pressure to survive will undoubtedly be felt more keenly by data science teams in smaller businesses, which tend to be small to begin with. How can organizations building a data science team during this trying period maximize its output?
Consider generalists
Though it is ideal to have the entire team staffed with specialists, one option for firms that find themselves constrained by budget is to consider hiring generalists, too. Compared with attempting to hire a specialist for every position and coming up short, this approach lets the firm fill the positions first and gradually train the generalists up.
The idea isn’t to understaff key positions, however, but to build the team with a good mix of specialists and generalists. One advantage of generalists is their ability to offer broader insights that can improve the efficiency of tasks and the flow of certain work processes. As generalists become versed in more than one role, job satisfaction should also trend higher over time.
Prioritize easy wins
A common strategy in IT is to go for the low-hanging fruit for a quick win that builds confidence and establishes the value of the team. The situation is no different for a data science team, which should put some thought into small yet impactful projects that can be completed relatively quickly. Though this doesn’t directly boost the team’s output in the short term, the increased cooperation across the organization will make itself felt in the mid to long term.
Another advantage of taking on some “easy” projects is the ability to quickly learn about the infrastructure and processes the team needs. It will also allow team members to figure out each other’s strengths and weaknesses, and to bond as each successful project is completed.
Ultimately, a dozen easy wins with a few that “stick” and make a significant difference within the organization will be far more valuable than a single massive, technically challenging project that takes many months to complete.
Plan to scale
It pays to plan for scalability right from the start. Part of this entails giving each project group the autonomy to do its work once the project parameters and deliverables are identified and agreed upon. Of course, this assumes that each project team has the capabilities required to complete it – it won’t work if a data scientist who is unfamiliar with infrastructure has to keep requesting assistance to set up a Hadoop cluster.
One final consideration is to rely on automation as much as possible. The traditional data science process can be manual and time-consuming; as noted by one data science expert, tools such as AutoML systems can significantly accelerate it by automating the mundane aspects of the work.
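As a rough sketch of what that automation can look like in practice (the library, dataset, and search settings here are illustrative assumptions, not tools named in the article), an open-source AutoML package such as TPOT can search over candidate modeling pipelines with only a few lines of code:

```python
# Illustrative sketch only: TPOT is one example of an open-source AutoML
# library; the dataset and search settings below are placeholder choices.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

# Load a small example dataset and hold out a test split.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Let the AutoML search explore preprocessing steps, models, and
# hyperparameters instead of hand-tuning each pipeline.
automl = TPOTClassifier(generations=5, population_size=20,
                        random_state=42, verbosity=2)
automl.fit(X_train, y_train)

print("Held-out accuracy:", automl.score(X_test, y_test))

# Export the best pipeline found as plain scikit-learn code for review and reuse.
automl.export("best_pipeline.py")
```

Because the exported pipeline is ordinary scikit-learn code, other team members can review, adapt, and deploy it without having hand-built the search themselves.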
Depending on the tools used, democratizing the data science process in this way also opens the door to new classes of developers, offering businesses a competitive advantage with minimal investment.