TensorFlow Serving C++ API Documentation
tensorflow::serving::SavedModelBundleFactory Class Reference

#include <saved_model_bundle_factory.h>

Public Member Functions

Status CreateSavedModelBundle (const string &path, std::unique_ptr< SavedModelBundle > *bundle)
Status CreateSavedModelBundleWithMetadata (const Loader::Metadata &metadata, const string &path, std::unique_ptr< SavedModelBundle > *bundle)
Status EstimateResourceRequirement (const string &path, ResourceAllocation *estimate) const
const SessionBundleConfig & config () const
SessionBundleConfig & mutable_config ()

Static Public Member Functions

static Status Create (const SessionBundleConfig &config, std::unique_ptr< SavedModelBundleFactory > *factory)

Detailed Description

A factory that creates SavedModelBundles from SavedModel or SessionBundle export paths.

The emitted sessions support only Run(), and, although this is not enforced, clients are expected to make only non-mutating Run() calls. (If this restriction, added as a safety measure, is problematic for your use case, please contact the TensorFlow Serving team to discuss disabling it.)

If the config calls for batching, the emitted sessions automatically batch Run() calls behind the scenes, using a SharedBatchScheduler owned by the factory. The 'config.num_batch_threads' threads are shared across all session instances created by this factory. However, each session has its own dedicated queue of size 'config.max_enqueued_batches'.

The factory can also estimate the resource (e.g. RAM) requirements of a SavedModelBundle based on the SavedModel (i.e. prior to loading the session).

This class is thread-safe.

Definition at line 52 of file saved_model_bundle_factory.h.
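
A minimal end-to-end sketch (not part of the generated reference) is shown below. It assumes the TensorFlow Serving include paths used here and the wrapped-value batching fields declared in session_bundle_config.proto; adjust both to your build.

    #include <memory>

    #include "tensorflow_serving/servables/tensorflow/saved_model_bundle_factory.h"
    #include "tensorflow_serving/servables/tensorflow/session_bundle_config.pb.h"

    namespace tfs = tensorflow::serving;

    // Builds a factory whose emitted sessions batch Run() calls behind the scenes.
    tensorflow::Status MakeFactory(
        std::unique_ptr<tfs::SavedModelBundleFactory>* factory) {
      tfs::SessionBundleConfig config;

      // Optional batching; the wrapped-value setters are assumed from the
      // BatchingParameters message in session_bundle_config.proto.
      auto* batching = config.mutable_batching_parameters();
      batching->mutable_max_batch_size()->set_value(32);
      batching->mutable_num_batch_threads()->set_value(4);      // shared across all sessions
      batching->mutable_max_enqueued_batches()->set_value(64);  // per-session queue depth

      // The factory is thread-safe and can be reused to create many bundles.
      return tfs::SavedModelBundleFactory::Create(config, factory);
    }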

Member Function Documentation

◆ Create()

static Status tensorflow::serving::SavedModelBundleFactory::Create (const SessionBundleConfig &config, std::unique_ptr< SavedModelBundleFactory > *factory)

Instantiates a SavedModelBundleFactory using a config.

Parameters
    config     Config with initialization options.
    factory    Newly created factory if the returned Status is OK.

Definition at line 88 of file saved_model_bundle_factory.cc.
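
A minimal call sketch for this method, assuming a populated SessionBundleConfig named config and the includes from the sketch above; LOG comes from tensorflow/core/platform/logging.h.

    std::unique_ptr<tensorflow::serving::SavedModelBundleFactory> factory;
    const tensorflow::Status status =
        tensorflow::serving::SavedModelBundleFactory::Create(config, &factory);
    if (!status.ok()) {
      // On failure the out-parameter is left unset.
      LOG(ERROR) << "SavedModelBundleFactory::Create failed: " << status;
    }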

◆ CreateSavedModelBundle()

Status tensorflow::serving::SavedModelBundleFactory::CreateSavedModelBundle (const string &path, std::unique_ptr< SavedModelBundle > *bundle)

Instantiates a bundle from a given export or SavedModel path.

Parameters
    path      Path to the model.
    bundle    Newly created SavedModelBundle if the returned Status is OK.

Definition at line 112 of file saved_model_bundle_factory.cc.
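
A usage sketch continuing from the factory above; the model path is a placeholder, and the session and meta_graph_def members come from tensorflow::SavedModelBundle in tensorflow/cc/saved_model/loader.h.

    std::unique_ptr<tensorflow::SavedModelBundle> bundle;
    const tensorflow::Status status =
        factory->CreateSavedModelBundle("/path/to/saved_model", &bundle);
    if (status.ok()) {
      // The emitted session supports Run() only; callers are expected to make
      // non-mutating Run() calls (see the class description above).
      LOG(INFO) << "Loaded SavedModel with "
                << bundle->meta_graph_def.signature_def().size()
                << " signature def(s).";
      // Inference then goes through bundle->session->Run(...).
    }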

◆ CreateSavedModelBundleWithMetadata()

Status tensorflow::serving::SavedModelBundleFactory::CreateSavedModelBundleWithMetadata (const Loader::Metadata &metadata, const string &path, std::unique_ptr< SavedModelBundle > *bundle)

Instantiates a bundle from a given export or SavedModel path and the given metadata.

Parameters
    metadata    Metadata to be associated with the bundle.
    path        Path to the model.
    bundle      Newly created SavedModelBundle if the returned Status is OK.

Definition at line 106 of file saved_model_bundle_factory.cc.
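
A sketch of the metadata variant. It assumes Loader::Metadata carries a ServableId, as declared in tensorflow_serving/core/loader.h; the servable name, version, and path below are placeholders.

    tensorflow::serving::Loader::Metadata metadata;
    metadata.servable_id = {/*name=*/"my_model", /*version=*/1};

    std::unique_ptr<tensorflow::SavedModelBundle> bundle;
    const tensorflow::Status status = factory->CreateSavedModelBundleWithMetadata(
        metadata, "/path/to/saved_model", &bundle);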

◆ EstimateResourceRequirement()

Status tensorflow::serving::SavedModelBundleFactory::EstimateResourceRequirement (const string &path, ResourceAllocation *estimate) const

Estimates the resources a SavedModel bundle will use once loaded, from its export path.

Parameters
    path        Path to the model.
    estimate    Output resource usage estimate. Different kinds of resources (e.g. CPU and RAM) may be populated.

Definition at line 100 of file saved_model_bundle_factory.cc.
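
A sketch of an estimation call; the resource_quantities, resource, and quantity fields are assumed from the ResourceAllocation message in tensorflow_serving/resources/resources.proto, and the path is again a placeholder.

    tensorflow::serving::ResourceAllocation estimate;
    const tensorflow::Status status =
        factory->EstimateResourceRequirement("/path/to/saved_model", &estimate);
    if (status.ok()) {
      for (const auto& entry : estimate.resource_quantities()) {
        // Each entry pairs a resource (device and kind) with an estimated quantity.
        LOG(INFO) << entry.resource().device() << "/" << entry.resource().kind()
                  << ": " << entry.quantity();
      }
    }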


The documentation for this class was generated from the following files:

    saved_model_bundle_factory.h
    saved_model_bundle_factory.cc