ModelOperationsCatalog Class

Definition

Class used by MLContext to save and load trained models.

C#
public sealed class ModelOperationsCatalog

F#
type ModelOperationsCatalog = class

VB
Public NotInheritable Class ModelOperationsCatalog
Inheritance: Object → ModelOperationsCatalog
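
The catalog is not constructed directly; MLContext exposes an instance through its Model property. A minimal sketch of obtaining it (the variable names here, like those in the later sketches, are illustrative):

using Microsoft.ML;

var mlContext = new MLContext();

// All save/load and prediction-engine operations hang off mlContext.Model.
ModelOperationsCatalog modelOps = mlContext.Model;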

Methods

CreatePredictionEngine<TSrc,TDst>(ITransformer, Boolean, SchemaDefinition, SchemaDefinition)

Create a prediction engine for one-time prediction (default usage).

CreatePredictionEngine<TSrc,TDst>(ITransformer, DataViewSchema)

Create a prediction engine for one-time prediction. It is mainly used in conjunction with Load(Stream, DataViewSchema), where the input schema is extracted while loading the model.

CreatePredictionEngine<TSrc,TDst>(ITransformer, PredictionEngineOptions)

Create a prediction engine for one-time prediction. It is mainly used in conjunction with Load(Stream, DataViewSchema), where the input schema is extracted while loading the model.
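
A hedged sketch of single-example prediction with a loaded model; the HouseData and HousePrediction types and the "model.zip" path are illustrative assumptions, not part of this API:

using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// Load a model saved earlier (see Load below) together with its input schema.
ITransformer model = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);

// Create a strongly typed engine for one-at-a-time predictions.
PredictionEngine<HouseData, HousePrediction> engine =
    mlContext.Model.CreatePredictionEngine<HouseData, HousePrediction>(model);

HousePrediction prediction = engine.Predict(new HouseData { Size = 2.5f });

// Hypothetical input and output types; real types must match the model's schema.
public class HouseData { public float Size { get; set; } }
public class HousePrediction { [ColumnName("Score")] public float Price { get; set; } }

Note that a PredictionEngine is not thread-safe; for concurrent scoring, the PredictionEnginePool from the Microsoft.Extensions.ML package is the usual alternative.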

Load(Stream, DataViewSchema)

Load the model and its input schema from a stream.

Load(String, DataViewSchema)

Load the model and its input schema from a file.
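
A sketch of the save/load round trip, assuming a trivially fitted pipeline stands in for a real trained model and "model.zip" is the chosen file name:

using Microsoft.ML;

var mlContext = new MLContext();
IDataView trainingData = mlContext.Data.LoadFromEnumerable(
    new[] { new HouseData { Size = 1.1f }, new HouseData { Size = 1.9f } });

// Any fitted transformer will do for illustration; here a simple column copy.
ITransformer model = mlContext.Transforms.CopyColumns("SizeCopy", "Size").Fit(trainingData);

// Persist the model together with the schema of the data it was trained on ...
mlContext.Model.Save(model, trainingData.Schema, "model.zip");

// ... and load both back later, for example in a different process.
ITransformer loadedModel = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);

// Hypothetical row type used only for this sketch.
public class HouseData { public float Size { get; set; } }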

LoadWithDataLoader(Stream, IDataLoader<IMultiStreamSource>)

Load a transformer model and a data loader model from a stream.

LoadWithDataLoader(String, IDataLoader<IMultiStreamSource>)

Load a transformer model and a data loader model from a file.
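
A sketch of recovering both the transformer and its data loader from a single file; "pipeline.zip" and "new-data.csv" are assumed names, and the file is expected to have been written by the Save<TSource> overloads below:

using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// Recover the transformer and the loader that defines how raw files are read.
ITransformer model = mlContext.Model.LoadWithDataLoader(
    "pipeline.zip", out IDataLoader<IMultiStreamSource> loader);

// The loader re-creates an IDataView directly from files in the original format.
IDataView data = loader.Load(new MultiFileSource("new-data.csv"));
IDataView transformed = model.Transform(data);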

Save(ITransformer, DataViewSchema, Stream)

Save a transformer model and the schema of the data that was used to train it to the stream.

Save(ITransformer, DataViewSchema, String)

Save a transformer model and the schema of the data that was used to train it to the file.

Save<TSource>(ITransformer, IDataLoader<TSource>, Stream)

Save a transformer model and the loader used to create its input data to the stream.

Save<TSource>(ITransformer, IDataLoader<TSource>, String)

Save a transformer model and the loader used to create its input data to the file.
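
A sketch of saving a fitted transformer together with the text loader that produced its input, so the pair can later be recovered with LoadWithDataLoader; the column layout and file names are illustrative assumptions:

using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// A loader that knows how to read the raw training file.
IDataLoader<IMultiStreamSource> loader = mlContext.Data.CreateTextLoader(
    new[] { new TextLoader.Column("Size", DataKind.Single, 0) },
    separatorChar: ',',
    hasHeader: true);

IDataView trainingData = loader.Load(new MultiFileSource("houses.csv"));

// Any fitted pipeline would do; a column copy stands in for a real model here.
ITransformer model = mlContext.Transforms.CopyColumns("SizeCopy", "Size").Fit(trainingData);

// Persist both the loader and the model in one file.
mlContext.Model.Save(model, loader, "pipeline.zip");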

Extension Methods

LoadTensorFlowModel(ModelOperationsCatalog, String, Boolean)

Load a TensorFlow model into memory. This is a convenience method that allows the model to be loaded once and then reused to query its schema and to create a TensorFlowEstimator via ScoreTensorFlowModel(String, String, Boolean). Using this API requires additional NuGet dependencies on the TensorFlow redist; see the linked document for more information. TensorFlowModel also holds references to unmanaged resources that must be freed, either with an explicit call to Dispose() or implicitly by declaring the variable with the "using" syntax.

LoadTensorFlowModel(ModelOperationsCatalog, String)

Load a TensorFlow model into memory. This is a convenience method that allows the model to be loaded once and then reused to query its schema and to create a TensorFlowEstimator via ScoreTensorFlowModel(String, String, Boolean). Using this API requires additional NuGet dependencies on the TensorFlow redist; see the linked document for more information. TensorFlowModel also holds references to unmanaged resources that must be freed, either with an explicit call to Dispose() or implicitly by declaring the variable with the "using" syntax.
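
A hedged sketch of these extensions, assuming the Microsoft.ML.TensorFlow package (plus a TensorFlow redist package) is referenced and that a saved model exists at "tf_model_dir" with tensors named "input" and "output" — all assumptions for illustration:

using Microsoft.ML;
using Microsoft.ML.Transforms;

var mlContext = new MLContext();

// Dispose the TensorFlowModel when done, since it wraps unmanaged resources.
using TensorFlowModel tfModel = mlContext.Model.LoadTensorFlowModel("tf_model_dir");

// Inspect the model's schema, then build an estimator that scores with the model.
DataViewSchema tfSchema = tfModel.GetModelSchema();
var estimator = tfModel.ScoreTensorFlowModel(
    outputColumnName: "output", inputColumnName: "input");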

Applies to