Machine Learning in 2022: TensorFlow or PyTorch?

Data scientists or AI researchers working on deep learning will probably turn to PyTorch or TensorFlow, two popular open-source frameworks designed for AI. But how exactly do they differ, and which is the correct choice for building new ML models in 2022?

To help beginner data scientists or those looking to get into AI, Ryan O'Connor of AssemblyAI wrote a 5,000-word deep dive about this rapidly evolving topic. He undertook a multi-faceted comparison to address this complex topic, evaluating considerations such as model availability, deployment ease, and the strengths of both ecosystems.

“Outdated or incomplete information [about PyTorch and TensorFlow] is abundant, and further obfuscates the complex discussion of which framework has the upper hand in a given domain,” O’Connor explained.

Industry or research use?

While TensorFlow has a reputation for being a framework that is focused on industry use cases and PyTorch is favored by researchers, O’Connor says this perception stems partially from outdated information and that the discussion today is far more nuanced.

For instance, the 2019 release of TensorFlow 2 addressed multiple issues to make TensorFlow easier for researchers to use. Moreover, the introduction of tools such as TorchServe and the mobile-centric PyTorch Live runtime has made it easier to roll out native ML deployments with PyTorch.
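A concrete example of that TensorFlow 2 shift: operations now run eagerly by default, and gradients come from `tf.GradientTape` rather than a precompiled graph and session. The toy computation below is illustrative, not from the article.

```python
import tensorflow as tf

# TensorFlow 2 executes eagerly by default: each op runs immediately,
# so research code can be written and debugged like ordinary Python.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x  # computed on the spot, no session or static graph required
grad = tape.gradient(y, x)  # dy/dx = 2x, so 6.0 at x = 3.0
print(float(grad))
```

This eager, tape-based style is much closer to the workflow researchers were already used to from PyTorch, which is part of why TensorFlow 2 narrowed the usability gap.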

Having said that, researchers today will almost certainly be using PyTorch given its dominance in the research landscape. Indeed, most state-of-the-art (SOTA) models and publications of AI papers are in PyTorch, and this is unlikely to change anytime soon.
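Part of PyTorch's appeal to researchers is its define-by-run autograd: the computation graph is built as plain Python executes, so arbitrary control flow just works. A minimal sketch (the values here are illustrative):

```python
import torch

# PyTorch builds the autograd graph as the code runs ("define-by-run"),
# which makes prototyping new model ideas feel like writing ordinary Python.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2      # ordinary Python expression, tracked by autograd
y.backward()    # populates x.grad with dy/dx = 2x
print(x.grad.item())
```

The same flexibility extends to loops and conditionals inside a model's forward pass, which is one reason most new SOTA architectures appear in PyTorch first.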

Similarly, TensorFlow’s robust deployment framework and the TensorFlow Extended platform are invaluable and hard to beat when it comes to deploying models for commercial use, says O’Connor. “[TensorFlow’s] easy deployment on a gRPC [Google Remote Procedure Call] server along with model monitoring and artifact tracking are critical tools for industry use.”
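The deployment story O'Connor describes starts with TensorFlow's SavedModel format, the on-disk artifact that TensorFlow Serving picks up and exposes over gRPC or REST. The sketch below exports and reloads a deliberately trivial model; the `Doubler` module is a made-up example, not something from the article.

```python
import tempfile

import tensorflow as tf


class Doubler(tf.Module):
    """Trivial illustrative model: returns twice its input."""

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x


# Export in the SavedModel format that TensorFlow Serving loads from disk.
export_dir = tempfile.mkdtemp()
tf.saved_model.save(Doubler(), export_dir)

# Reload to confirm the exported signature behaves as expected.
reloaded = tf.saved_model.load(export_dir)
print(reloaded(tf.constant([1.0, 3.0])).numpy())  # [2. 6.]
```

In production, the exported directory would be pointed at by a TensorFlow Serving instance (or fed into a TFX pipeline) rather than reloaded in-process, but the export step is the same.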

The state of model availability

Given the rapid release of new models and their growing complexity over time, training SOTA models from scratch is increasingly infeasible, observes O’Connor. As we previously noted, GPT-3 has around 175 billion parameters and the upcoming GPT-4 is rumored to have more than 100 trillion parameters. It hence makes sense to utilize publicly available models where possible.

Model availability diverges sharply between PyTorch and TensorFlow, however. Because AI practitioners will likely want to utilize models from sources beyond the official model repositories, O’Connor took a quantitative look at model availability for each framework.

This is where PyTorch wins hands down. According to information from the Hugging Face repository for AI models, the number of models available for use exclusively on PyTorch “blows the competition out of the water”.

“Almost 85% of models are PyTorch exclusive, and even those that are not exclusive have about a 50% chance of being available in PyTorch as well. In contrast, only about 16% of all models are available for TensorFlow, with only about 8% being TensorFlow-exclusive,” he wrote.

The right choice for learners

While one framework might be superior to the other in specific use cases, there is no single correct answer as to which is optimal, O’Connor concludes. And though each framework has distinct merits, a learner looking to dip their toes into AI today won’t go wrong picking up either TensorFlow or PyTorch.

But when push comes to shove, a practitioner headed for industry would probably do well to choose TensorFlow. Similarly, a researcher would probably opt for PyTorch given that it is the de facto framework for research work, though those dabbling in reinforcement learning should also consider TensorFlow due to the availability of certain agents and frameworks.

Finally, those looking for a career change would do well to go with TensorFlow if they want to get a foot in the door and become versed in end-to-end AI implementation, says O’Connor. If the objective is to implement SOTA models, focus on cutting-edge research, or simply develop a deeper understanding of deep learning, then PyTorch would be ideal.

“In 2022, both PyTorch and TensorFlow are very mature frameworks, and their core deep learning features overlap significantly… both have good documentation, many learning resources, and active communities,” he summed up.

Paul Mah is the editor of DSAITrends. A former system administrator, programmer, and IT lecturer, he enjoys writing both code and prose. You can reach him at [email protected].​

Image credit: iStockphoto/IvelinRadkov