EstimatorChain<TLastTransformer>.AppendCacheCheckpoint Method
Definition
Appends a 'caching checkpoint' to the estimator chain, ensuring that the downstream estimators are trained against cached data. A caching checkpoint is helpful before trainers or feature-engineering steps that take multiple passes over the data. It is also helpful after a slow operation, for example after loading the dataset from a slow source or after feature engineering whose apply phase is slow, if the downstream estimators make multiple passes over that operation's output. Adding a cache checkpoint at the beginning or end of an EstimatorChain<TLastTransformer> has no effect and should be avoided. Remove cache checkpoints if disk thrashing or OutOfMemoryException errors occur, which can happen when the featurized dataset immediately preceding the checkpoint is larger than the available RAM.
C#
public Microsoft.ML.Data.EstimatorChain<TLastTransformer> AppendCacheCheckpoint (Microsoft.ML.Runtime.IHostEnvironment env);
F#
member this.AppendCacheCheckpoint : Microsoft.ML.Runtime.IHostEnvironment -> Microsoft.ML.Data.EstimatorChain<'LastTransformer (requires 'LastTransformer : null and 'LastTransformer :> Microsoft.ML.ITransformer)>
VB
Public Function AppendCacheCheckpoint (env As IHostEnvironment) As EstimatorChain(Of TLastTransformer)
Parameters
- env IHostEnvironment
  The host environment to use for caching.
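Example
The following is a minimal usage sketch, not taken from this page: it assumes an existing training pipeline built from an MLContext named mlContext, and the column names ("Size", "Rooms", "Label") are placeholders. MLContext can be passed as the IHostEnvironment argument, as in the ML.NET samples.

// Build a pipeline and cache the transformed data right before a multi-pass trainer (SDCA).
var mlContext = new Microsoft.ML.MLContext();

var pipeline = mlContext.Transforms.Concatenate("Features", "Size", "Rooms")
    .Append(mlContext.Transforms.NormalizeMinMax("Features"))
    // The chain is an EstimatorChain at this point; cache its output so the
    // trainer's repeated passes read cached rows instead of re-running the
    // transforms above on every pass.
    .AppendCacheCheckpoint(mlContext)
    .Append(mlContext.Regression.Trainers.Sdca(labelColumnName: "Label"));

// var model = pipeline.Fit(trainingDataView);

Placing the checkpoint immediately before the trainer, rather than at the start or end of the chain, is what makes it useful: only the estimators appended after the checkpoint benefit from the cached data.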