Tag: CPU

Categories: Functionality, Technology, UbiOps

Make your model faster on CPU using ONNX

January 12, 2023 (updated January 8, 2024)

How do you speed up a TensorFlow model by 200%? Machine learning models require ever more compute power. According to a study from OpenAI, the compute needed to train AI models has been rising since the field's beginnings in the 1960s, with the required compute power doubling every two years up until […]

Read more »

Tagged: CPU, ONNX, ONNX runtime, tensorflow

Categories: Blog, Functionality, Technology

ONNX CPU vs GPU

March 28, 2022 (updated July 26, 2023)

I wrote an article on how you can improve neural network inference performance by switching from TensorFlow to ONNX Runtime. But now UbiOps also supports GPU inference. We all know GPUs can improve performance a lot, but how do you get your ONNX model running on a GPU? And should you run all of your neural […]

Read more »

Tagged: CPU, GPU, GPUs, MLOps, ONNX