RuntimeError when invoking TFLite INT8 model with tile operation #67789
Comments
Hi @ShabbirMarfatiya, I will replicate your issue and get back to you.
Hey @sawantkumar, thanks for looking into this issue. If you need any more details, feel free to let me know.
Hi @sawantkumar, have you found anything related to my issue that might be helpful?
Hi @ShabbirMarfatiya, I am in the process of replicating your issue. Meanwhile, did you try the latest TensorFlow version, 2.16.1?
Hi @sawantkumar, no, I didn't try that, but I'll try it and let you know the outcome.
Hi @ShabbirMarfatiya, I see that you have already filed a bug report for this, right? Anyway, just in case, can you also provide all the other assets, like the model file and dataset?
Hi @sawantkumar, I have uploaded all the necessary folders and files to Google Drive. Here's the link for you: https://drive.google.com/drive/folders/1Ov1qVyreDWq0q3VQbAq0BIvrxRSB4dRs?usp=sharing If you need any more details, feel free to let me know.
Hi @ShabbirMarfatiya, I am getting a lot of protobuf library issues. It will take me some time to resolve them and replicate your issue. I will let you know once I am done. Thank you.
Hi @ShabbirMarfatiya, we're wondering if you may be able to resolve your issue by using AI-Edge-Torch; you can find more information here: googleblog. I have created a simple script for converting a MobileNet model here:

```python
import torch
import torchvision
import ai_edge_torch

# Convert a torchvision MobileNetV3-Small model to a TFLite flatbuffer via AI-Edge-Torch.
mobilenet_model = torchvision.models.mobilenet_v3_small()
sample_inputs = (torch.randn(1, 3, 224, 224),)
edge_model = ai_edge_torch.convert(mobilenet_model.eval(), sample_inputs)
edge_model.export("mobilenet_v3_small.tflite")
```

If you want to, you can also try visualizing the result in model-explorer. Please try them out and let us know if this resolves your issue. If you still need further help, feel free to open a new issue at the respective repo.
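As a follow-up, here is a minimal sketch of invoking the exported file with the standard `tf.lite.Interpreter`; the model path and the zero-filled dummy input below are placeholders tied to the example above, not the actual CenterNet model from this issue.

```python
import numpy as np
import tensorflow as tf

# Load the TFLite model exported above and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="mobilenet_v3_small.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input matching the model's expected shape and dtype,
# then run a single inference and read back the first output tensor.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```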
I'm facing an issue while trying to run inference with a TensorFlow Lite model quantized to INT8 precision. The model was trained using CenterNet MobileNet for hand keypoint detection, and I'm getting a RuntimeError when invoking the interpreter, with the following error message:

Environment:
Steps to Reproduce:
Expected Behavior:
The TensorFlow Lite INT8 model should run inference successfully without any errors.
Actual Behavior:
The RuntimeError is raised when invoking the interpreter, indicating that the tile operation is not supported in INT8 precision.
Additional Information:
I've followed the recommended steps for INT8 quantization, including setting the optimization flags, providing a representative dataset, and setting the target specs for INT8 operations. However, the issue persists.
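For reference, below is a rough sketch of the full-integer quantization setup described above (optimization flag, representative dataset, INT8 target spec). The `saved_model_dir` path, the synthetic calibration images, and the 320x320 input size are hypothetical placeholders, not the actual assets or settings from this issue.

```python
import numpy as np
import tensorflow as tf

# Hypothetical placeholders: path to the exported CenterNet SavedModel and an
# iterable of preprocessed calibration images (not the real assets from this issue).
saved_model_dir = "exported_centernet_savedmodel"
representative_images = [np.random.rand(1, 320, 320, 3).astype(np.float32)
                         for _ in range(10)]

def representative_dataset():
    # Yield one calibration sample at a time, wrapped in a list as the converter expects.
    for image in representative_images:
        yield [image]

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()

with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

If useful, `tf.lite.experimental.Analyzer.analyze(model_path="model_int8.tflite")` should print the operators in the converted model, which can help confirm whether a TILE op is present in the quantized graph.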
@sushreebarsa @sachinprasadhs @mohantym @pkarimib, could you please assist me in resolving this issue? I would greatly appreciate any guidance or suggestions.