expo init myLittleProject
cd myLittleProject
npm install @tensorflow/tfjs
npm install @tensorflow/tfjs-react-native
npm install --save react-native-fs
npm install --save @react-native-community/async-storage
To test that everything is working, temporarily replace the code in App.js with this:
import React from 'react'
import { Text, View } from 'react-native'
import * as tf from '@tensorflow/tfjs'

class App extends React.Component {
  state = {
    isTfReady: false,
  }

  async componentDidMount() {
    await tf.ready()
    // setState is asynchronous, so read the updated value in its callback
    this.setState({ isTfReady: true }, () =>
      console.log(this.state.isTfReady)
    )
  }

  render() {
    return (
      <View style={{
        flex: 1,
        justifyContent: 'center',
        alignItems: 'center'
      }}>
        <Text>
          {this.state.isTfReady ? 'Ready' : 'Waiting'}
        </Text>
      </View>
    )
  }
}

export default App
https://github.com/tensorflow/tfjs/tree/master/tfjs-react-native
Using Your Own Model
Create metro.config.js
in the root folder and add the following content (you might need to add more assetExts or other options, but check the errors first):
const blacklist = require('metro-config/src/defaults/blacklist');

module.exports = {
  transformer: {
    getTransformOptions: async () => ({
      transform: {
        experimentalImportSupport: false,
        inlineRequires: false,
      },
    }),
  },
  resolver: {
    assetExts: ['bin', 'txt', 'jpg', 'ttf', 'png'],
    sourceExts: ['js', 'json', 'ts', 'tsx', 'jsx'],
    blacklistRE: blacklist([/platform_node/]),
  },
};
The model is trained somewhere else and saved with model.save('file://model').
This creates a directory containing model.json
and weights.bin.
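For context, here is a minimal sketch of what that training step could look like in Node.js. It assumes @tensorflow/tfjs-node is installed; the layer sizes and the random training data are placeholders for illustration, not the model used in this post:

```javascript
// Sketch: train a tiny model in Node.js and save it in tfjs format.
// Assumes: npm install @tensorflow/tfjs-node
const tf = require('@tensorflow/tfjs-node')

async function trainAndSave() {
  const model = tf.sequential()
  model.add(tf.layers.dense({ units: 8, activation: 'relu', inputShape: [4] }))
  model.add(tf.layers.dense({ units: 1 }))
  model.compile({ optimizer: 'adam', loss: 'meanSquaredError' })

  // Placeholder data; replace with your real dataset
  const xs = tf.randomNormal([32, 4])
  const ys = tf.randomNormal([32, 1])
  await model.fit(xs, ys, { epochs: 3 })

  // Writes ./model/model.json and ./model/weights.bin
  await model.save('file://model')
}

trainAndSave()
```

Copy the resulting model directory into your Expo project's assets folder (e.g. assets/model/) so Metro can bundle it.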
Then the model can be used in a React Native (Expo) project in the following way:
import React from 'react'
import { Text, View } from 'react-native'
import * as tf from '@tensorflow/tfjs'
import { bundleResourceIO } from '@tensorflow/tfjs-react-native'

class App extends React.Component {
  state = {
    isTfReady: false,
    model: null,
  }

  async componentDidMount() {
    await tf.ready()
    this.setState({ isTfReady: true })

    // bundleResourceIO reads the model files that Metro bundled as assets
    const modelJSON = require('./assets/model/model.json')
    const modelWeights = require('./assets/model/weights.bin')
    const model = await tf.loadLayersModel(bundleResourceIO(modelJSON, modelWeights))
    model.summary()
    this.setState({ model })
  }

  render() {
    return (
      <View style={{
        flex: 1,
        justifyContent: 'center',
        alignItems: 'center'
      }}>
        <Text>
          TF: {this.state.isTfReady ? 'Ready' : 'Waiting'}
        </Text>
        <Text>
          MODEL: {this.state.model ? 'Ready' : 'Waiting'}
        </Text>
      </View>
    )
  }
}

export default App
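Once the model is in state you can run inference with it. A minimal sketch of such a method inside the component above; the input shape [1, 4] and the values are placeholders, so replace them with whatever your model actually expects:

```javascript
// Sketch: run a prediction with the loaded LayersModel.
// Assumes this.state.model was set as in the component above.
predict = () => {
  const { model } = this.state
  if (!model) return

  // Placeholder input of shape [1, 4]; use your model's real input shape
  const input = tf.tensor2d([[0.1, 0.2, 0.3, 0.4]])
  const output = model.predict(input)

  console.log('prediction:', output.dataSync())

  // Free tensor memory when done
  tf.dispose([input, output])
}
```

Disposing tensors explicitly matters on mobile, where WebGL memory is not garbage-collected automatically.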
Top comments (6)
This is really cool.
I have tried to do the same and integrate the MobileNet model into React Native.
But getting some errors.
dev-to-uploads.s3.amazonaws.com/i/...
Could you please help me with this?
I actually started to use this for a different model and got the same error. MobileNet actually works when you use it a certain way (if I remember correctly). Try using this code:
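The snippet referenced in the comment is missing from the thread; a plausible sketch using the official @tensorflow-models/mobilenet package, which loads its weights over the network instead of from the app bundle:

```javascript
import * as tf from '@tensorflow/tfjs'
import * as mobilenet from '@tensorflow-models/mobilenet'

// Sketch: load MobileNet from its hosted weights rather than bundling them.
// imageTensor is assumed to be a 3D tensor [height, width, 3] produced
// elsewhere (e.g. with decodeJpeg from @tensorflow/tfjs-react-native).
async function classifyExample(imageTensor) {
  await tf.ready()
  const model = await mobilenet.load()
  const predictions = await model.classify(imageTensor)
  console.log(predictions) // array of { className, probability }
}
```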
I just created a new issue github.com/tensorflow/tfjs/issues/... which hopefully gets noticed by someone from the TF team and not deleted, as I'm always bad at following issue templates. I started looking into this problem on my own and tried to fix it without success. I'll try to make more time in the near future. Please let everyone know if you solve it and I will do the same.
Hi, thanks for the quick reply. I just tried to use MobileNet as you mentioned above, but it still gives me the same error.
If you have any working code related to this, could you please share it with me so I can get an idea? I really appreciate your help, since this is really helpful for continuing my research. Thanks again.
I believe bundleResourceIO cannot be used in managed Expo apps. See the API docs here:
js.tensorflow.org/api_react_native...
@subodha Pathiraja - this might explain the error you highlighted.
Try this: heartbeat.fritz.ai/image-classific...
My models are in .pb and .txt format.
How can I convert them to generate the .json and .bin files?
Thank you
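The usual route for that conversion is the tensorflowjs converter, a Python CLI. A hedged sketch of the common invocation; the exact --input_format value depends on how your .pb was produced (SavedModel vs. frozen graph) and on the converter version, so check tensorflowjs_converter --help first:

```shell
# Install the converter (Python)
pip install tensorflowjs

# Convert a TensorFlow SavedModel directory into tfjs format
# (produces model.json plus one or more .bin weight shards)
tensorflowjs_converter \
  --input_format=tf_saved_model \
  --output_format=tfjs_graph_model \
  ./saved_model_dir ./web_model
```

Note that a converted graph model is loaded with tf.loadGraphModel, not tf.loadLayersModel as in the example above.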