The `tf.saved_model.save()` function expects a trackable object for export, such as a `tf.Module` (or another subclass of `Trackable`). However, you are passing an object of type `tensorflow.lite.python.interpreter.Interpreter`, which is not trackable.

To resolve this, wrap the interpreter in a `tf.Module` subclass. Here's an example of how you can do this:

```python
import tensorflow as tf

class MyModule(tf.Module):
    def __init__(self, interpreter):
        super().__init__()  # required so tf.Module can set up attribute tracking
        self.interpreter = interpreter
        # Look up the tensor indices once instead of on every call
        self.input_index = interpreter.get_input_details()[0]['index']
        self.output_index = interpreter.get_output_details()[0]['index']

    @tf.function(input_signature=[tf.TensorSpec(shape=[None, 224, 224, 3], dtype=tf.float32)])
    def predict(self, inputs):
        # Run inference through the wrapped TFLite interpreter
        self.interpreter.set_tensor(self.input_index, inputs)
        self.interpreter.invoke()
        return self.interpreter.get_tensor(self.output_index)

# Create an instance of MyModule wrapping the existing interpreter
my_module = MyModule(interpreter)

# Save the module
tf.saved_model.save(my_module, saved_model_path)
```

In this example, `MyModule` is a subclass of `tf.Module` that wraps the interpreter object and defines a `predict` method that performs inference through it. You can then create an instance of `MyModule` and save it using `tf.saved_model.save()`.
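The save/reload mechanics can be checked in isolation with a toy module: any `tf.Module` subclass is trackable, so it can be passed to `tf.saved_model.save()` and its `tf.function`-decorated methods survive a reload. This is a minimal sketch (the `Doubler` class and the temporary export path are illustrative assumptions, standing in for the interpreter-wrapping `MyModule` above):

```python
import tempfile
import tensorflow as tf

# Toy stand-in for the interpreter-wrapping module: any tf.Module
# subclass is Trackable and therefore acceptable to tf.saved_model.save().
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def predict(self, x):
        return x * 2.0

# Export to a temporary directory (illustrative path)
path = tempfile.mkdtemp()
tf.saved_model.save(Doubler(), path)

# Reload the SavedModel and call the exported method
restored = tf.saved_model.load(path)
result = restored.predict(tf.constant([1.0, 2.0])).numpy()
print(result)  # [2. 4.]
```

Because `predict` was traced with an `input_signature`, the reloaded object exposes it as a callable with a fixed signature, which is exactly what the wrapper pattern above relies on.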

Solved

Original source: https://www.cveoy.top/t/topic/pRxM — copyright belongs to the author. Do not reproduce or scrape.
