How to speed up a TensorFlow model by 200%? Machine learning models nowadays require ever more compute power. According to a study from OpenAI, the compute power needed to...

I wrote an article on how you can improve neural network inference performance by switching from TensorFlow to ONNX Runtime. But now UbiOps also supports GPU inference. We all know...

Some time ago I wrote an article comparing the performance of the TensorFlow runtime with the ONNX Runtime. In that article I showed how to greatly improve inference performance...