Convert a tfjs model for use with TensorFlow in C++ - javascript

I'm new to TensorFlow and I have a question. My project has two major parts. The first, written in Node.js, trains my model from a dataset and saves it to local storage, so I have two files:
model.json
weights.bin
The second part is written in C++. After a couple of days I was able to build TensorFlow with Bazel and add it to my OpenCV project, so here is my question:
I want to train my model in the Node.js part and use it in the C++ part. Is this possible?
I also saw tfjs-converter, but it converts models for use in Node.js, not vice versa.
Update:
After searching a lot I figured out that I should convert my model to a protobuf file, but tfjs-converter does not support this type of conversion. Another point is that I want to use my model with the OpenCV library.
Update 2:
Finally I was able to convert my model to a .pb file: first I used tfjs-converter to convert it to a Keras model (an .h5 file), and then used this Python script to convert that to a .pb file. OpenCV can successfully load the model, but I get this error when using it:
libc++abi.dylib: terminating with uncaught exception of type
cv::Exception: OpenCV(4.1.0)
/tmp/opencv-20190505-12101-14vk1fh/opencv-4.1.0/modules/dnn/src/dnn.cpp:524:
error: (-2:Unspecified error) Can't create layer
"flatten_Flatten1/Shape" of type "Shape" in function
'getLayerInstance'
Any help?
Thanks.

Finally I solved my own problem.
Here are the steps I took:
1. Convert the tfjs model to a Keras model using tfjs-converter.
2. Use this Python script to change the Keras (.h5) model to a frozen model (.pb).
3. Use this tutorial to optimize the .pb model.
Finally, everything works great!

Related

How to work with JSON lines GPT-2 database?

I downloaded all the files, and all of them are just random answers in JSON format. I want to train my own tensorflow.js model using this dataset, but I don't have a question database here, so what do I need to do? I want to train my model to have an offline version of GPT-2, because I didn't find an already pre-trained model of it!
And yes, I want to use JavaScript with the TensorFlow.js library. For reference, here are some of the files I downloaded using the download_dataset.py script: xl-1542M-k40.valid.jsonl, xl-1542M.test.jsonl, xl-1542M.train.jsonl, xl-1542M.valid.jsonl
I was using this repo: https://github.com/openai/gpt-2-output-dataset

Why is the loadGraphModel function from tensorflow.js not working?

I am working on deploying an ML model that I trained using TensorFlow (in Python). The model is saved as an .h5 file, which I converted using the tensorflowjs_converter --input_format=keras ./model/myFile.h5 /JS_model/ command.
I imported the tensorflow library using the following:
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"></script>
After this, I was able to load the model using the loadLayersModel() function. However, loadGraphModel() does not work; it throws an error in the browser.
I also tried using the tf.models.save_model.save() function in Python, which outputs the variables and assets folders as well as the .pb file. However, using the code above and changing only the path to 'THE_classifier' (the name of the folder where assets, variables and the .pb file are located), an error still occurs.
I want to use the loadGraphModel() function because, according to various sources, it provides faster inference.
Layers models and graph models have different internal layouts; they are not compatible or interchangeable. If it's a layers model, it must be loaded with tf.loadLayersModel, and if it's a graph model, it must be loaded with tf.loadGraphModel.
Graph models are frozen models, so if you want to convert a Keras model to a graph model, you need to freeze it first; otherwise it can only be converted to a layers model.
(And that's where the difference in inference time comes from: it's faster to evaluate a frozen model than one that is still using variables.)
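One quick way to tell the two apart before picking a loader: the model.json that tfjs-converter writes carries a top-level format field ("layers-model" vs "graph-model"). A minimal sketch, assuming that field is present; the helper name below is mine, not part of the tfjs API:

```javascript
// Sketch: choose the right tfjs loader from model.json's "format" field.
// tfjs-converter writes "layers-model" or "graph-model" there.
// pickLoader is a hypothetical helper, not part of tfjs itself.
function pickLoader(modelJson) {
  switch (modelJson.format) {
    case 'layers-model': return 'tf.loadLayersModel';
    case 'graph-model':  return 'tf.loadGraphModel';
    default:             return 'unknown';
  }
}

// e.g. const modelJson = await (await fetch('/JS_model/model.json')).json();
// pickLoader(modelJson) then tells you which loader will succeed.
```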

Forge Model Properties Excel

Currently I am following these steps to get the Forge Viewer into Power BI. I was able to get that working, but now I am attempting to get the properties out of the Revit models.
I am following the steps found here
by editing the forge-model-properties-excel package.
I am putting in a Revit model that shows up as a 3D model in the Forge Viewer, and I know it has the material properties that I need. But when I run it I get an error that reads:
this model has no {3D} view
I'm not sure why I am receiving this error when the model does have a 3D view.
Many thanks in advance.
The sample application is looking for a 3D view that is actually called "{3D}": https://github.com/xiaodongliang/forgeviewer_embed_in_powerbi_report/blob/70c58ab3217cdf4b10c475abf2721871984ed4e0/forge-model-properties-excel/index.js#L27.
Is it possible that the 3D views in your Revit model are named differently? If they are, make sure to update that line of code with the expected name.
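The failing check amounts to a plain lookup over the model's viewables by name. A sketch of that lookup; the object shapes here are simplified stand-ins for the real Forge SDK types, and '{3D}' is the hard-coded name the sample expects:

```javascript
// Hypothetical sketch: find a 3D viewable by its exact name, the way
// the sample's index.js does. Viewables are simplified plain objects,
// not the real Forge Model Derivative types.
function find3DView(viewables, name = '{3D}') {
  return viewables.find(v => v.role === '3d' && v.name === name) || null;
}

// Example: a Revit model whose 3D view was renamed.
const viewables = [
  { role: '2d', name: 'Level 1' },
  { role: '3d', name: 'Coordination View' },
];

find3DView(viewables);                       // null → "no {3D} view" error
find3DView(viewables, 'Coordination View');  // finds the renamed view
```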

Three.js - "Unknown format" Error when using .FBX Models

I'm fairly new to Three.js. I'm trying to load an .fbx object via the FBXLoader located in three/examples/jsm/loaders/FBXLoader.
Also, I'm using this in React.
Now, the page loads, but the model isn't there. The error in the console says: An error happened: Error: THREE.FBXLoader: Unknown format.
My FBX file seems to be in binary format, not ASCII format.
I really don't know what to do.
My Code:
//deleted
I also tried moving the models folder in and out of the public folder.
Thanks in advance.
I found a solution.
My FBX file was an old version and used binary encoding.
Switching to ASCII and a newer version fixed everything.
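Loaders decide binary vs ASCII from the file's first bytes: binary FBX files commonly start with the 21-byte magic "Kaydara FBX Binary  " plus a NUL, and when neither layout is recognized the loader reports an unknown format. A rough sketch, independent of three.js, for checking what you actually have (the ASCII check is only a heuristic):

```javascript
// Rough sketch: sniff whether an FBX byte array is binary or ASCII.
// Binary FBX files start with the magic "Kaydara FBX Binary  \0";
// the ASCII check below is a heuristic, not a full tokenizer.
const FBX_BINARY_MAGIC = 'Kaydara FBX Binary  \0'; // two spaces + NUL

function sniffFbxFormat(bytes) {
  const head = String.fromCharCode(...bytes.slice(0, FBX_BINARY_MAGIC.length));
  if (head === FBX_BINARY_MAGIC) return 'binary';
  const text = String.fromCharCode(...bytes.slice(0, 1024));
  if (text.includes('FBXHeaderExtension')) return 'ascii';
  return 'unknown'; // neither layout recognized → "Unknown format"
}
```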

How to download models and weights from tensorflow.js

I'm trying to download a pretrained tensorflow.js model, including weights, to be used offline in Python in the standard version of TensorFlow, as part of a project that is not at an early stage by any means, so switching to tensorflow.js is not a possibility.
But I can't figure out how to download those models, or whether it's necessary to do some conversion on the model.
I'm aware that in JavaScript I can access the models and use them by including them like this:
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.13.3"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/posenet@0.2.3"></script>
but how do I actually get the .ckpt files, or the frozen model if that's the case?
My final objective is to get the frozen model files and get the outputs as is done in the normal version of TensorFlow.
Also, this will be used in an offline environment, so any online reference would not be useful.
Thanks for your replies.
It is possible to save the model topology and its weights by calling the model's save method.
const model = tf.sequential();
model.add(tf.layers.dense({units: 1, inputShape: [10], activation: 'sigmoid'}));
const saveResult = await model.save('downloads://mymodel');
// This will trigger downloading of two files:
// 'mymodel.json' and 'mymodel.weights.bin'.
console.log(saveResult);
There are different scheme strings depending on where you want to save the model and its weights (localStorage, IndexedDB, ...); see the docs.
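For completeness, the scheme strings look like URL prefixes on the save target. A sketch, assuming a browser environment for the first three schemes (file:// needs tfjs-node); the helper below is a hypothetical convenience, not part of tfjs:

```javascript
// Scheme strings accepted by model.save() / tf.loadLayersModel():
//   'downloads://my-model'    → triggers a browser download
//   'localstorage://my-model' → window.localStorage
//   'indexeddb://my-model'    → browser IndexedDB
//   'file://./my-model'       → filesystem (Node.js with tfjs-node)
// saveTarget is a hypothetical helper that just builds such a string.
const SCHEMES = ['downloads', 'localstorage', 'indexeddb', 'file'];

function saveTarget(scheme, name) {
  if (!SCHEMES.includes(scheme)) {
    throw new Error(`unknown tfjs save scheme: ${scheme}`);
  }
  return `${scheme}://${name}`;
}

// e.g. await model.save(saveTarget('indexeddb', 'mymodel'));
```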
I went to https://storage.googleapis.com/tfjs-models/ and found the directory listing of all the files. I found the relevant files (I wanted the float MobileNet variants, as opposed to the quantized MobileNet) and populated this file_uris list.
base_uri = "https://storage.googleapis.com/tfjs-models/"
file_uris = [
"savedmodel/posenet/mobilenet/float/050/group1-shard1of1.bin",
"savedmodel/posenet/mobilenet/float/050/model-stride16.json",
"savedmodel/posenet/mobilenet/float/050/model-stride8.json",
"savedmodel/posenet/mobilenet/float/075/group1-shard1of2.bin",
"savedmodel/posenet/mobilenet/float/075/group1-shard2of2.bin",
"savedmodel/posenet/mobilenet/float/075/model-stride16.json",
"savedmodel/posenet/mobilenet/float/075/model-stride8.json",
"savedmodel/posenet/mobilenet/float/100/group1-shard1of4.bin",
"savedmodel/posenet/mobilenet/float/100/group1-shard2of4.bin",
"savedmodel/posenet/mobilenet/float/100/group1-shard3of4.bin",
"savedmodel/posenet/mobilenet/float/100/model-stride16.json",
"savedmodel/posenet/mobilenet/float/100/model-stride8.json"
]
Then I used Python to iteratively download the files into the same folder structure.
from urllib.request import urlretrieve
from pathlib import Path

for file_uri in file_uris:
    uri = base_uri + file_uri
    # Recreate the remote folder structure locally before downloading.
    save_path = "/".join(file_uri.split("/")[:-1])
    Path(save_path).mkdir(parents=True, exist_ok=True)
    urlretrieve(uri, file_uri)
    print(save_path, file_uri)
I enjoyed using JupyterLab (Jupyter Notebook is also good) when experimenting with this code.
With this, you'll get a folder with the .bin files (the weights) and the JSON files (the graph models). Unfortunately, these are graph models, so they cannot be converted into SavedModels, and so they are absolutely useless to you. Let me know if someone finds a way of running these tfjs graph model files in regular TensorFlow (preferably 2.0+).
You can also download zip files with the 'entire' model from TFHub; for example, a 2-byte quantized ResNet PoseNet is available here.
