Inference

This page explains how to work with Synativ Inferences. Once you have fine-tuned a model, you can use it to run a batch inference job and download the results for further evaluation.

Start Inference

You can start an Inference by calling start_inference:

synativ_api.start_inference(
    model_id='synativ-model-8a8c4ae5-6d29-4741-90d1-3987d1dfc79b',
    dataset_id='synativ-dataset-95bdf8f9-3e92-4070-b13d-74a0ada3a46b',
    metadata={}
)

The call above runs inference with model synativ-model-8a8c4ae5-6d29-4741-90d1-3987d1dfc79b and dataset synativ-dataset-95bdf8f9-3e92-4070-b13d-74a0ada3a46b. You can create a new Dataset just for inference, or reuse the Dataset from fine-tuning if it already included the data you want to run inference on.

metadata can contain user-set hyperparameters, depending on the use case; it is stored on the Inference object as a JSON string. If left empty, inference is performed with Synativ's default hyperparameters.

You will receive an Inference object as response:

Inference(
    creation_time='2023-08-09 17:37:39.153481',
    metadata='{"scale":8.0}',
    dataset_id='synativ-dataset-95bdf8f9-3e92-4070-b13d-74a0ada3a46b',
    id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
    model_id='synativ-model-8a8c4ae5-6d29-4741-90d1-3987d1dfc79b'
)

List Inferences

You can list your existing Inferences by calling list_inferences:

synativ_api.list_inferences()

You will receive a list of Inferences you have started:

ListInferencesResponse(
    inferences=[
        Inference(
            creation_time='2023-08-09 17:37:39.153481',
            metadata='{"scale":8.0}',
            dataset_id='synativ-dataset-95bdf8f9-3e92-4070-b13d-74a0ada3a46b',
            id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
            model_id='synativ-model-8a8c4ae5-6d29-4741-90d1-3987d1dfc79b'
        ),
        Inference(
            creation_time='2023-08-09 17:32:05.337222',
            metadata='{"scale":9.0}',
            dataset_id='synativ-dataset-f7d47a84-662f-4f9e-8ae3-d25a0ad8ca49',
            id='synativ-inference-10b6f6df-71f5-452c-9cb7-766bc6893d4e',
            model_id='synativ-model-eb65782a-3d6b-4bb1-ad47-c64160249354'
        )
    ]
)

Get Inference Details

You can get the details of a specific Inference by calling get_inference with the respective InferenceId:

synativ_api.get_inference(
    inference_id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a'
)

You will receive an Inference object as response:

Inference(
    creation_time='2023-08-09 17:37:39.153481',
    metadata='{"scale":8.0}',
    dataset_id='synativ-dataset-95bdf8f9-3e92-4070-b13d-74a0ada3a46b',
    id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
    model_id='synativ-model-8a8c4ae5-6d29-4741-90d1-3987d1dfc79b'
)

Get Inference Status

You can check the status of your Inference by calling get_inference_status with the respective InferenceId:

synativ_api.get_inference_status(
    inference_id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
)

This will return a Status object with one of the following statuses:

Status(status='NOT_FOUND')          # Wrong inference id
Status(status='QUEUED')             # Job is queued
Status(status='SETTING_UP')         # Job is setting up
Status(status='DOWNLOADING_DATA')   # Downloading data and fine-tuned model
Status(status='RUNNING_INFERENCE')  # Inference in progress
Status(status='SAVING_RESULTS')     # Saving inference results
Status(status='COMPLETED')          # Inference has completed
Status(status='FAILED')             # Inference has failed
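Since the job runs asynchronously, you will typically poll until a terminal status is reached before downloading results. A minimal polling sketch, assuming only that get_inference_status behaves as documented above; the helper takes the status-fetching call as a parameter so it is not tied to the SDK object:

```python
import time

# Statuses after which the job will no longer change (per the list above).
TERMINAL = {'COMPLETED', 'FAILED', 'NOT_FOUND'}

def wait_for_inference(get_status, inference_id, poll_seconds=30.0):
    """Poll until the Inference reaches a terminal status and return it.

    get_status: a callable mimicking synativ_api.get_inference_status,
    returning an object with a .status attribute as documented above.
    """
    while True:
        status = get_status(inference_id=inference_id).status
        if status in TERMINAL:
            return status
        time.sleep(poll_seconds)

# Usage (assuming the synativ_api object from this guide):
# final = wait_for_inference(
#     synativ_api.get_inference_status,
#     'synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
# )
```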

Download Inference Results

You can download the inference results to your local disk by calling download_inference_results with the respective InferenceId:

synativ_api.download_inference_results(
    inference_id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a',
    local_dir='<local_save_dir>'
)

Delete Inference

You can always delete an Inference by calling delete_inference with the respective InferenceId:

synativ_api.delete_inference(
    inference_id='synativ-inference-86ef4838-3b4a-4426-a606-5d225393867a'
)

Deleting an Inference permanently removes all records and files of that inference from our servers. We will have no way whatsoever to retrieve it.
