
All Questions

0 votes
0 answers
86 views

How can I extract specific data from a function?

I printed a variable with print(detection_result), which I got from detection_result = detector.detect(input_tensor), and it printed DetectionResult(detections=[Detection(bounding_box=BoundingBox(origin_x=...
Michael Dixon
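A minimal sketch of pulling individual fields out of a result shaped like the one printed above; it assumes the task-library style structure visible in the excerpt (detections, bounding_box, categories), and any attribute names beyond those shown there are assumptions.

```python
# `detector` and `input_tensor` come from the question's own code.
detection_result = detector.detect(input_tensor)

for detection in detection_result.detections:
    box = detection.bounding_box
    # Fields visible in the printed repr.
    print(box.origin_x, box.origin_y, box.width, box.height)
    for category in detection.categories:
        # Assumed field names for the per-detection class/score pair.
        print(category.category_name, category.score)
```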
0 votes
1 answer
2k views

'Regular tensorflow ops not supported by the interpreter' error during inference

I am currently trying to convert a saved model trained using Tensorflow (v2.7.0) and Keras to a Tensorflow Lite model. The structure of the model is the following: model_lstm = Sequential() model_lstm....
D.Sbetti
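The usual way around this error is to let the converter fall back to regular TensorFlow ops for anything that has no TFLite builtin. A hedged sketch, assuming the Keras model from the question (model_lstm):

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_keras_model(model_lstm)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,   # prefer native TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,     # fall back to TensorFlow ops where needed
]
tflite_model = converter.convert()

with open("model_lstm.tflite", "wb") as f:
    f.write(tflite_model)
```

The trade-off is that the resulting model needs the Select TF Ops (Flex) delegate available at inference time.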
1 vote
1 answer
3k views

TensorFlow TypeError: 'generator' object is not callable

Edit: Possible answer at the end of the post. Hi, I am trying to convert an LSTM into a tflite model and I am running into a TypeError: 'generator' object is not callable error. My code worked before with ...
Florida Man • 2,177
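This error typically means the generator object itself was assigned where the converter expects a callable. A minimal sketch, assuming `converter` is the TFLiteConverter from the question and `calibration_ds` is a placeholder dataset:

```python
import tensorflow as tf

def representative_data_gen():
    # `calibration_ds` is a placeholder for whatever dataset feeds calibration.
    for sample in calibration_ds.take(100):
        yield [tf.cast(sample, tf.float32)]

# Pass the function itself, NOT representative_data_gen() -- calling it hands the
# converter a generator object, which triggers "'generator' object is not callable".
converter.representative_dataset = representative_data_gen
```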
1 vote
0 answers
400 views

'Invalid input shapes: expected 1 items got 16 items' when trying to quantize a model for tflite

I was trying to quantise a TF model into a TFLite model to deploy it on my ESP32, loading the dataset through tf.keras.preprocessing.image_dataset_from_directory() and using images_batch and ...
Pruthvi B.
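The "expected 1 items got 16 items" message usually points at a representative dataset that yields whole batches instead of single samples. A hedged sketch, assuming `converter` and a batched dataset `train_ds` (both placeholders standing in for the question's code):

```python
import tensorflow as tf

def representative_data_gen():
    # `train_ds` is batched (e.g. batch_size=16), but calibration wants one
    # sample per yield -- hence the "expected 1 items got 16 items" error.
    for images, _ in train_ds.take(10):
        for image in images:
            yield [tf.expand_dims(tf.cast(image, tf.float32), axis=0)]  # shape (1, H, W, C)

converter.representative_dataset = representative_data_gen
```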
0 votes
2 answers
2k views

'EndVector() takes 1 positional argument but 2 were given' while trying to quantize a tensorflow model

I was trying to quantize a TF model into a TFLite model to deploy it on my ESP32, loading the dataset through tf.keras.preprocessing.image_dataset_from_directory() and using images_batch and ...
Pruthvi B.
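This particular EndVector() error is commonly attributed to a flatbuffers package mismatch (the method lost its positional size argument in flatbuffers 2.x), so checking which version the converter is importing is a reasonable first step; treating that as the cause here is an assumption based only on the error text.

```python
# Check the installed flatbuffers release; if it is 2.x, pinning an older one
# (e.g. `pip install flatbuffers==1.12`) has been the usual workaround for the
# changed EndVector() signature.
import importlib.metadata
print(importlib.metadata.version("flatbuffers"))
```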
0 votes
1 answer
556 views

Is it possible to apply Grad-CAM to a TF Lite model?

I have been studying Grad-CAM and I noticed most examples use a Keras/TensorFlow model. However, I have a TensorFlow Lite model that has been compiled to the .tflite format. I am not sure if it's ...
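Grad-CAM needs gradients through the network, and the TFLite interpreter only runs forward passes, so the common workaround is to compute the heatmap on the original Keras model before conversion. A minimal sketch under that assumption, with hypothetical names `keras_model`, "last_conv" (its final convolutional layer), and `image_batch` (a (1, H, W, C) input):

```python
import tensorflow as tf

# Model that exposes both the last conv feature maps and the predictions.
grad_model = tf.keras.Model(
    keras_model.inputs,
    [keras_model.get_layer("last_conv").output, keras_model.output],
)

with tf.GradientTape() as tape:
    conv_out, preds = grad_model(image_batch)
    top_class = tf.argmax(preds[0])
    class_score = preds[:, top_class]

grads = tape.gradient(class_score, conv_out)          # d(score)/d(feature maps)
weights = tf.reduce_mean(grads, axis=(1, 2))          # global-average-pool the gradients
cam = tf.reduce_sum(conv_out[0] * weights[0], axis=-1)
cam = tf.nn.relu(cam) / (tf.reduce_max(cam) + 1e-8)   # normalised heatmap in [0, 1]
```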
0 votes
1 answer
3k views

'Use fn_output_signature instead' warning and training stops

My input: python model_main_tf2.py --model_dir=models\my_ssd_mobilenet_v2_fpnlite --pipeline_config_path=models\my_ssd_mobilenet_v2_fpnlite\pipeline.config Output: prnt.sc/104ecjv My learning stopped ...
2 votes
1 answer
369 views

Tensorflow 1.15 is not loading a frozen graph, and has given me the same error for the past week

Okay, so I'm working on a large project in Google Colab, where I have to detect a certain object from all the others. Now, for the better part of the past week, I've been working tirelessly trying to ...
Faraz Naseem
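For reference when comparing against the failing code, this is the canonical TF 1.15 pattern for loading a frozen graph; "frozen_inference_graph.pb" is a placeholder path, and the sketch does not claim to reproduce the asker's specific error.

```python
import tensorflow as tf  # assumed to be 1.15, as in the question

with tf.io.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name="")

# Tensors can then be looked up by name on `graph` and run in a tf.compat.v1.Session.
```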
0 votes
1 answer
238 views

How to get the names of all detected objects from an existing TensorFlow Lite instance?

I'm looking to build a system that alerts me when there's a package at my front door. I already have a solution for detecting when there's a package (tflite), but I don't know how to get the array of ...
name_here
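A hedged sketch of turning a detection model's class indices into human-readable names via the label file that usually ships with the model. File names, the score threshold, the preprocessed `input_image`, and the output-tensor ordering are all assumptions (ordering varies between models).

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="detect.tflite")  # placeholder path
interpreter.allocate_tensors()

# `input_image` is a placeholder preprocessed array matching the input shape.
interpreter.set_tensor(interpreter.get_input_details()[0]["index"], input_image)
interpreter.invoke()

outputs = interpreter.get_output_details()
classes = interpreter.get_tensor(outputs[1]["index"])[0]  # assumed ordering
scores = interpreter.get_tensor(outputs[2]["index"])[0]

with open("labelmap.txt") as f:                            # placeholder label file
    labels = [line.strip() for line in f]

detected_names = [labels[int(c)] for c, s in zip(classes, scores) if s > 0.5]
print(detected_names)
```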
1 vote
2 answers
1k views

Tensorflow - Train.py - ValueError: ('%s is not decorated with @add_arg_scope', ('nets.mobilenet.mobilenet', 'depth_multiplier'))

I'm trying to make a custom object-detection model following this tutorial: https://towardsdatascience.com/custom-object-detection-using-tensorflow-from-scratch-e61da2e10087 Everything worked fine ...
SolArabehety • 8,616
2 votes
1 answer
619 views

TFLite Converter: RandomStandardNormal implemented for keras model, but not for pure TensorFlow model

Task: I have two models which should be equivalent. The first one is built with Keras, the second one with TensorFlow. Both variational autoencoders use the tf.random.normal method in their model. ...
DocDriven • 4,014
0 votes
2 answers
1k views

Is there an equivalent of tf.lite.Interpreter.get_input_details for C++?

In TensorFlow lite's Python API, there are methods to retrieve details concerning the input and output tensors, called tf.lite.Interpreter.get_input_details and tf.lite.Interpreter.get_output_details. ...
DocDriven • 4,014
31 votes
1 answer
2k views

Description of TF Lite's Toco converter args for quantization aware training

These days I am trying to track down an error concerning the deployment of a TF model with TPU support. I can get a model without TPU support running, but as soon as I enable quantization, I get ...
DocDriven • 4,014
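As a reference point, these are the TF 1.x converter attributes that the Toco quantization flags map to; the graph/tensor names and calibration numbers below are placeholders, and the exact values depend on how the model was trained.

```python
import tensorflow as tf  # TF 1.x style API

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    "frozen_graph.pb", input_arrays=["input"], output_arrays=["output"])

converter.inference_type = tf.uint8
# {input name: (mean, std_dev)} -- rescales the uint8 input to real values.
converter.quantized_input_stats = {"input": (128.0, 127.0)}
# Fallback (min, max) range for ops that carry no recorded quantization ranges.
converter.default_ranges_stats = (0.0, 6.0)

tflite_model = converter.convert()
```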
1 vote
0 answers
662 views

Cannot convert frozen graph to tflite model

I am trying to convert a frozen graph into a tflite model using the provided tflite_converter. I am reconstructing how I created the .pb file to make sure I did not mess up something on the way there. ...
DocDriven • 4,014
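For comparison with the CLI route, this is the Python equivalent of converting a frozen graph; the file and tensor names are placeholders, and under TF 2.x the frozen-graph path lives in the compat.v1 namespace.

```python
import tensorflow as tf

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="frozen_graph.pb",   # placeholder path
    input_arrays=["input"],             # placeholder input tensor name
    output_arrays=["output"],           # placeholder output tensor name
)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```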
1 vote
1 answer
588 views

MobileNetV2 tflite model does not produce the expected output size with Python 3

I am running into an issue with my MobileNetV2 SSD model. I converted it using the steps detailed here, except that I use the CLI tool tflite_convert for the related step. This works fine ...
Romzie • 457
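When the output size is not what you expect, printing the shapes the interpreter actually exposes is a quick way to locate the mismatch; "model.tflite" below is a placeholder path.

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Dump the declared shape and dtype of every input and output tensor.
for detail in interpreter.get_input_details():
    print("input :", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])
```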
