Support for FIFOQueue #1099
Hi, I am using a model provided by https://github.com/davidsandberg/facenet/ with the Inception-Resnet-v1 model for facial recognition. It uses the FIFOQueueV2 and QueueDequeueUpToV2 ops, which produce an Unsupported Ops error when converting to TensorFlow JS/Lite. Also, I'm not able to understand the alternative 1 that you provided. Can you explain it briefly?
@dsmilkov I think this falls into a much bigger category of supporting tf.data ops. Here is a summary of an approach for supporting the Queue and other tf.data stream types using a global state:

Note: This requires that the user understands which tf.data ops are in the model and is able to use the tf.data API. It also adds a package dependency on tfjs-data to tfjs-converter. Let me know if these make sense; I can put them into a design doc for further discussion.
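A minimal sketch of what the global-state idea could look like on the tfjs side. `streamRegistry`, `bindQueue`, and `executeDequeue` are hypothetical names that do not exist in tfjs today; only `Dataset.iterator()` is real tfjs-data API. This only illustrates the shape of the design, not an actual implementation:

```ts
import * as tf from '@tensorflow/tfjs';
import * as tfd from '@tensorflow/tfjs-data';

// Global registry mapping queue node names to user-provided data streams.
const streamRegistry = new Map<string, AsyncIterator<tf.Tensor>>();

// Hypothetical user-facing API: bind a tfjs-data dataset to a FIFOQueue
// node before running the model.
async function bindQueue(nodeName: string, dataset: tfd.Dataset<tf.Tensor>):
    Promise<void> {
  streamRegistry.set(nodeName, await dataset.iterator());
}

// Hypothetical executor hook: when the graph executor reaches a
// QueueDequeue* op, it pulls the next element from the bound stream
// instead of failing with an unsupported-op error.
async function executeDequeue(nodeName: string): Promise<tf.Tensor> {
  const iterator = streamRegistry.get(nodeName);
  if (!iterator) {
    throw new Error(`No dataset bound for queue node '${nodeName}'`);
  }
  const {value, done} = await iterator.next();
  if (done || value == null) {
    throw new Error(`Queue node '${nodeName}' is exhausted`);
  }
  return value;
}
```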
I think the fundamental problem is that the data during inference in the browser will arrive in a different way than during training. For example, during training it makes sense to feed a dataset/queue of training examples.

It's too early to try to offer a solution to close that gap. As a first step, the converter should detect whenever the graph has tf.data/queue ops, warn the user, and recommend that the user feed the earliest node in the graph after the data feeding when calling model.execute(). This is option 2A in my previous comment.
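To make that recommendation concrete for the facenet case above, here is a hedged sketch of the user-side call. The node names 'input_after_dequeue' and 'embeddings', the model path, and the input shape are all illustrative assumptions, not taken from the actual model; `tf.loadGraphModel` and `model.execute` are real tfjs API:

```ts
import * as tf from '@tensorflow/tfjs';

async function run(): Promise<void> {
  // Load the converted graph model (path is a placeholder).
  const model = await tf.loadGraphModel('web_model/model.json');

  // Stand-in for a real preprocessed face crop.
  const image = tf.zeros([1, 160, 160, 3]);

  // Feed the tensor directly at the post-dequeue node, bypassing the
  // unsupported FIFOQueue/QueueDequeueUpTo ops entirely.
  const embeddings = model.execute(
      {'input_after_dequeue': image}, 'embeddings') as tf.Tensor;
  embeddings.print();
}
```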
Hi @dsmilkov, thank you for opening this issue for tracking purposes. Since this issue has been open for a long time, the code/debug information here may no longer be relevant to the current state of the code base. The TFJS team is constantly improving the framework by fixing bugs and adding new features. We suggest you try the latest TFJS version with the latest compatible hardware configuration, which could potentially resolve the issue. We can keep the issue open if it is still relevant. Please confirm whether we need to keep the issue open. Thank you for your support and cooperation.
Currently we fail to convert a TF model if the model has a `FIFOQueue`. We should investigate which models use `FIFOQueue` and whether queues make sense during inference time, or only for training.

Two paths for graphs with `FIFOQueue`:

1) Implement a `FIFOQueue` and make the model stateful, so that executing an "enqueue" or "dequeue" node modifies the state of the graph. This solution is technically feasible, but challenging.

2) In the meantime, another option is to treat this as a UX problem and provide information to the user at conversion time.

2A) The simplest solution is to detect that a graph has a "dequeue" op, followed by a synchronous subgraph, and warn the user to consider feeding data right after that op (showing that op's name and shape) when calling model.execute(). A sketch of this detection pass follows below.

2B) Additionally, it would be great to ask the user if they are OK with the converter dropping any nodes before the "dequeue" op, which would also make the graph smaller.
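A minimal sketch of the detection step in 2A. The real converter is a Python tool; TypeScript is used here only for consistency with the other snippets in this thread, and `GraphNode` is a simplified stand-in for a parsed GraphDef node:

```ts
// Node record as it appears in a parsed GraphDef-like structure.
interface GraphNode {
  name: string;
  op: string;
  input: string[];
}

// Queue-related op types the converter should flag.
const QUEUE_OPS = new Set(
    ['FIFOQueue', 'FIFOQueueV2', 'QueueDequeueV2', 'QueueDequeueUpToV2']);

// Report nodes that consume a dequeue's output: these are the candidate
// feed points to suggest to the user for model.execute().
function findFeedPointsAfterDequeue(nodes: GraphNode[]): string[] {
  const dequeueNames = new Set(
      nodes.filter(n => n.op.startsWith('QueueDequeue')).map(n => n.name));
  return nodes
      .filter(n => n.input.some(
          // Strip control-dep '^' prefixes and ':port' suffixes.
          i => dequeueNames.has(i.replace(/^\^/, '').split(':')[0])))
      .map(n => n.name);
}

function warnAboutQueues(nodes: GraphNode[]): void {
  if (nodes.some(n => QUEUE_OPS.has(n.op))) {
    console.warn(
        'Graph contains queue ops; consider feeding data at: ' +
        findFeedPointsAfterDequeue(nodes).join(', '));
  }
}
```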