New deployment types FAQ


We have recently released the express and batch deployment types. From now on, every deployment will be either express or batch. We made this document to answer your questions and to explain why these new types are a great addition to our platform.


What are express and batch deployments?


Express deployments can only handle direct (synchronous) requests and have a maximum timeout of 1 hour. Batch deployments can only handle batch (asynchronous) requests and can
handle requests that take up to 48 hours.


Why did we introduce the two types of deployments?


In previous versions of UbiOps, requests to deployments could take no longer than an hour. We felt that this limited the use cases you could run on UbiOps. That’s why we have split deployments into two types, where the batch type can run much longer than an hour if needed, up to 48 hours!


What can I use the new batch (long running) requests for?


Many jobs, like optimization and training jobs, can take over an hour to process. The batch deployment type is perfect for this: it can run for up to 48 hours, allowing many new types of
workloads to run on UbiOps.


What is the difference between batch and express deployments?


Use express deployments if you need or expect a direct result from your deployment and expect it to run for a short period of time. Use batch requests if your
deployment can take a longer time to run (up to 48 hours) and you want to collect the results later. Express requests are synchronous and batch requests are asynchronous: if you make a call to an express deployment, you get the result back in the same HTTPS request once processing is finished. For batch requests, you make one request to send the data and a second one to retrieve the result, as in the sketch below.
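To make the difference concrete, here is a minimal Python sketch of the two request patterns using the requests library against the UbiOps API. The endpoint paths, input fields and response shapes below are assumptions for illustration only; please check the API reference for the exact routes.

    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}                 # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"              # hypothetical project name

    # Express: synchronous, the output comes back in the same HTTPS response.
    express = requests.post(
        f"{BASE}/deployments/my-express-deployment/requests",             # assumed endpoint path
        headers=HEADERS,
        json={"input_field": "some value"},                               # hypothetical input field
    )
    print(express.json())                                                 # contains the deployment output directly

    # Batch: asynchronous, the first call only returns a request id.
    batch = requests.post(
        f"{BASE}/deployments/my-batch-deployment/requests/batch",         # assumed endpoint path
        headers=HEADERS,
        json=[{"input_field": "some value"}],                             # a list of request payloads
    )
    request_id = batch.json()[0]["id"]                                    # assumed response shape

    # Later, a second call retrieves the status and (once finished) the result.
    result = requests.get(
        f"{BASE}/deployments/my-batch-deployment/requests/{request_id}",  # assumed endpoint path
        headers=HEADERS,
    )
    print(result.json())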


What is the maximum duration of a batch request?


A batch request is allowed to take up to 48 hours to process.


Can I use both types (express+batch) of deployments in a single pipeline?


Yes, you can! This is great for building model (re-)training pipelines, polling connections to databases, and more. The only thing to take into account is that when a pipeline
contains batch deployments, you can only make batch requests to that pipeline (see the sketch below).
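As an illustration, a batch request to a pipeline follows the same submit-and-collect pattern as a batch request to a single deployment. The endpoint path and input field below are assumptions for illustration only.

    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}           # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"        # hypothetical project name

    # Submit a batch request to a pipeline that mixes express and batch deployments.
    response = requests.post(
        f"{BASE}/pipelines/my-training-pipeline/requests/batch",    # assumed endpoint path
        headers=HEADERS,
        json=[{"training_data": "path/to/dataset"}],                # hypothetical input field
    )
    print(response.json())                                          # the created pipeline request, including its id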

How do I know if a request/job is finished?


You can check the status of a request through our API. It will provide you with a ‘pending’, ‘processing’, ‘completed’ or ‘failed’ status. When the status is ‘completed’ or ‘failed’, the request
is finished.
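A simple way to wait for a batch request is to poll its status until it is no longer ‘pending’ or ‘processing’. The sketch below assumes a hypothetical endpoint path and a ‘status’ field in the response, matching the statuses described above.

    import time
    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}                          # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"                       # hypothetical project name
    REQUEST_URL = f"{BASE}/deployments/my-batch-deployment/requests/<request_id>"  # assumed endpoint path

    # Poll until the request reaches a terminal status.
    while True:
        status = requests.get(REQUEST_URL, headers=HEADERS).json()["status"]       # assumed response field
        if status in ("completed", "failed"):
            break
        time.sleep(30)  # a request may run for up to 48 hours, so poll modestly

    print(f"Request finished with status: {status}")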


Can I use the same deployment package for batch and express deployments?


Yes, the deployment package stays exactly the same for both types.


Is batch meant for multiple requests at the same time?


In this context, “batch” means “running in the background”. That said, a batch request does let you send multiple requests at the same time, as in the sketch below.
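For example, a batch request endpoint typically accepts a list of payloads, so several requests can be created in one API call. The endpoint path and input field below are assumptions for illustration only.

    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}             # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"          # hypothetical project name

    # Create three batch requests in a single call; each runs in the background.
    payloads = [{"input_field": value} for value in ("a", "b", "c")]  # hypothetical input field
    response = requests.post(
        f"{BASE}/deployments/my-batch-deployment/requests/batch",     # assumed endpoint path
        headers=HEADERS,
        json=payloads,
    )
    request_ids = [item["id"] for item in response.json()]            # assumed response shape
    print(request_ids)                                                # one id per submitted request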


How do I know if my request failed?


You can retrieve the request status through the API and in the UI. Also, as for all deployment and pipeline versions, you can enable email alerts for failed requests, so you get notified
immediately when something goes wrong.


Can I cancel a batch request?


Yes, you can. You can cancel deployment requests through the API or in the UI.
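As a sketch, cancelling a request could look like the call below. The route and HTTP method are assumptions, so check the API reference before relying on them.

    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}                          # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"                       # hypothetical project name
    REQUEST_URL = f"{BASE}/deployments/my-batch-deployment/requests/<request_id>"  # assumed endpoint path

    # Assumption: a pending or running request is cancelled by deleting it via the API.
    response = requests.delete(REQUEST_URL, headers=HEADERS)
    print(response.status_code)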


Can I run short jobs as a batch job as well?


Yes, you can, if you prefer an asynchronous request pattern.


How do I know which deployment type to choose?


If you expect or need fast responses with low overhead, choose the express type. If you expect your models to run for 30 minutes or more, or if you require an asynchronous implementation, a batch deployment is more suitable.


Is there any disadvantage to long-running/batch requests compared to express
deployments? Longer cold start for example?


The cold start time is the same. The main difference is that batch requests are asynchronous: you make one API call to create the request and another API call to collect the result later. This is already how batch requests work today. It might add a little extra overhead to the whole request-processing-response cycle. Other than that, there are no disadvantages.


Why can’t I run express requests that take longer than 1 hour to process?


For express (and current direct) requests, the HTTPS connection with the UbiOps platform stays open during the request. The longer it takes, the higher the chance of interruptions and a failed request. Therefore, we do not allow express requests to take longer than 1 hour.


Can I have a deployment that has multiple versions in which some are express and some
are batch?


Yes, this is possible. The deployment mode is set at the version level.
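Conceptually, creating two versions of the same deployment with different modes might look like the sketch below. The ‘deployment_mode’ field name, the endpoint path and the omission of other required version settings are assumptions for illustration only.

    import requests

    HEADERS = {"Authorization": "Token <YOUR_API_TOKEN>"}        # assumed auth header format
    BASE = "https://api.ubiops.com/v2.1/projects/my-project"     # hypothetical project name
    VERSIONS_URL = f"{BASE}/deployments/my-deployment/versions"  # assumed endpoint path

    # One express version for fast, synchronous requests...
    requests.post(VERSIONS_URL, headers=HEADERS,
                  json={"version": "v1-express", "deployment_mode": "express"})  # assumed field name

    # ...and one batch version of the same deployment for long-running jobs.
    requests.post(VERSIONS_URL, headers=HEADERS,
                  json={"version": "v1-batch", "deployment_mode": "batch"})      # assumed field name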

Can I change an express deployment to batch and vice-versa?


Unfortunately, this is not possible at the moment. You will need to create a new deployment for
this.


Will my current deployments continue to function?


Yes, they will. We will automatically migrate your deployments to the right type so you can continue using them in the same way. The way a batch or direct request is created, and all the
API endpoints for it, will remain the same.