#Classify images with Origami Studio 3 + TensorFlow Lite

  • You can install this repo anywhere on your computer.
  • For this tutorial, I recommend following the steps below exactly and installing it in your Documents directory.
  • You can simply copy and paste the code snippets into your terminal.

Adventure Class

#1. Download and install Origami Studio 3.

#2. Open a terminal window:

Just press command + space, search for Terminal.app, and hit return.

#3. Download example code:

git clone https://git.sr.ht/~jochen/Origami-Classify ~/Documents/Origami-Classify

This creates a folder named Origami-Classify and downloads the following files into it:

  • classify.origami
  • classify.py
  • README.md
  • LICENSE

#4. Install virtualenv:

pip3 install virtualenv

#5. Change directory to your Documents folder.

cd ~/Documents

#6. Create virtual environment:

virtualenv -p python3 OS-classify

This will create a new folder in your Documents folder named OS-classify. You can choose another name, but in this tutorial we'll use OS-classify.

#7. Change directory to virtual environment:

cd OS-classify

#8. Activate virtual environment:

source bin/activate

#9. Move example code into your virtual environment folder:

cp ~/Documents/Origami-Classify/* ~/Documents/OS-classify/

This will copy all the files from the Origami-Classify folder to your OS-classify project folder.

#10. Install Flask:

pip3 install Flask

#11. Install PIL:

pip3 install Pillow

#12. Install Numpy:

pip3 install -U numpy

#13. Install TFlite:

pip3 install --index-url https://google-coral.github.io/py-repo/ tflite_runtime

#14. Test TFlite installation:

python3
>>> from tflite_runtime.interpreter import Interpreter
  • If the installation was successful, the import prints nothing and the next line shows only the >>> prompt. This may take a couple of seconds.
  • Quit Python3 with exit().

#15. Prepare your model and labels:

curl -o mobilenet_v1_1.0_224_quant_and_labels.zip 'https://storage.googleapis.com/download.tensorflow.org/models/tflite/mobilenet_v1_1.0_224_quant_and_labels.zip'
unzip mobilenet_v1_1.0_224_quant_and_labels.zip
rm mobilenet_v1_1.0_224_quant_and_labels.zip
  • The curl command downloads a .zip file containing the pre-trained MobileNet model and its labels.
  • The unzip command extracts the .zip file into your current directory (OS-classify).
  • The rm command deletes the now unnecessary .zip file. A short interactive check of the downloaded model follows below.
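
If you want to confirm from Python that the downloaded model really expects 224 x 224 images, you can run the short interactive check below inside the activated virtual environment. It only assumes the file names produced by the unzip command above.

python3
>>> from tflite_runtime.interpreter import Interpreter
>>> interpreter = Interpreter(model_path="mobilenet_v1_1.0_224_quant.tflite")
>>> interpreter.get_input_details()[0]["shape"]   # should show a 1 x 224 x 224 x 3 input
>>> labels = [line.strip() for line in open("labels_mobilenet_quant_v1_224.txt")]
>>> len(labels)   # one label per class the model can predict
>>> exit()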

#16. Your OS-classify folder should now contain the following files:

ls
  • This command lists the contents of your OS-classify folder.
  • At this point, at least the following files should be listed in your terminal:
OS-classify folder
bin
lib
pyvenv.cfg
mobilenet_v1_1.0_224_quant.tflite
labels_mobilenet_quant_v1_224.txt
classify.py
classify.origami

#17. Setup and run Flask:

export FLASK_APP=classify.py
flask run --host=0.0.0.0
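
classify.py already contains the Flask app these two commands start, so there is nothing you need to write yourself. Purely to illustrate how the installed pieces (Flask, Pillow, NumPy, tflite_runtime) fit together, a minimal /message endpoint could look roughly like the sketch below. The request format and all names are assumptions for illustration, not the repository's actual code.

# Illustrative sketch only; not the repository's classify.py.
# Assumption: the client POSTs raw image bytes to /message.
import io

import numpy as np
from flask import Flask, request, jsonify
from PIL import Image
from tflite_runtime.interpreter import Interpreter

app = Flask(__name__)

# Load the quantized MobileNet and its labels once at startup.
interpreter = Interpreter(model_path="mobilenet_v1_1.0_224_quant.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
labels = [line.strip() for line in open("labels_mobilenet_quant_v1_224.txt")]

@app.route("/message", methods=["POST"])
def message():
    # Resize the incoming image to the model's 224 x 224 input size.
    image = Image.open(io.BytesIO(request.get_data())).convert("RGB")
    _, height, width, _ = input_details[0]["shape"]
    image = image.resize((int(width), int(height)))

    # The quantized MobileNet takes uint8 pixels with shape (1, 224, 224, 3).
    input_data = np.expand_dims(np.asarray(image, dtype=np.uint8), axis=0)
    interpreter.set_tensor(input_details[0]["index"], input_data)
    interpreter.invoke()

    # Return the highest-scoring label as JSON.
    scores = interpreter.get_tensor(output_details[0]["index"])[0]
    top = int(np.argmax(scores))
    return jsonify({"label": labels[top], "score": int(scores[top])})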

#18. Copy Flask IP address:

  • Flask will print multiple lines to the terminal.
  • The last line shows the IP address the local Flask server is running on. (might take a couple of seconds)
  • Copy this address.

#19. Open classify.origami:

  • In the Patch Graph you can find a patch called "Network Request"
  • Paste the copied address into its URL field and append "message" to it.
  • Example: http://192.168.0.27:5000/message

#20. Adventure Time !!! (❍ᴥ❍ʋ)

#Training and using your own model:

  • Visit teachablemachine.withgoogle.com/train/image (Standard Image Model, 224px x 224px color images)
  • Create and train your own classifier model
  • Export and download your model as TensorFlow Lite (Quantized)
  • Unzip the downloaded file (converted_tflite_quantized.zip)
  • The unzipped folder contains two files:
converted_tflite_quantized folder
model.tflite
labels.txt
  • Put these two files into your project folder (OS-classify in this tutorial).
  • Open classify.py in the text editor of your choice.
  • Edit model_path to point to your model.tflite (line 54).
  • Edit label_path to point to your labels.txt.
  • Example: model_path = "model.tflite", label_path = "labels.txt" (see the sketch after this list).
  • Perform steps 17 to 20 again to test your own model in your Origami sketch.
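
As a point of reference, the two edited lines and the way they are typically used might look like the hedged sketch below; the real classify.py may structure this differently.

# Hypothetical excerpt; adjust to match your classify.py.
from tflite_runtime.interpreter import Interpreter

model_path = "model.tflite"   # Teachable Machine export copied in the step above
label_path = "labels.txt"

interpreter = Interpreter(model_path=model_path)
interpreter.allocate_tensors()

labels = [line.strip() for line in open(label_path)]
# Teachable Machine usually prefixes each label with its index (e.g. "0 Cat");
# strip that prefix if you only want the bare class names.
labels = [label.split(" ", 1)[-1] for label in labels]

If your Origami sketch now returns your own class names, the swap worked.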