Thursday, March 31, 2016

Example for Deploying a TensorFlow Model

http://stackoverflow.com/questions/34036689/example-for-deploying-a-tensorflow-model-via-a-restful-api

http://stackoverflow.com/questions/35414824/tensorflow-in-production-for-real-time-predictions-in-high-traffic-app-how-to

http://stackoverflow.com/questions/35037360/how-to-deploy-and-serve-prediction-using-tensorflow-from-api


http://stackoverflow.com/questions/33759623/tensorflow-how-to-restore-a-previously-saved-model-python

http://stackoverflow.com/questions/34500052/tensorflow-saving-and-restoring-session


TensorFlow Serving is a high-performance, open-source serving system for machine learning models, designed for production environments and optimized for TensorFlow. The initial release contains examples built with gRPC, but you can easily replace the front-end (the "client" in TensorFlow Serving's architecture diagram) with a RESTful API to suit your needs.
To get started quickly, check out the tutorial.
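As a rough illustration of swapping the gRPC front-end for a RESTful one, here is a minimal sketch using only Python's standard library. The `predict` function is a stand-in for the real call into TensorFlow Serving (the request shape, field names, and scoring logic are assumptions for illustration, not TensorFlow Serving APIs):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    """Stand-in for the real model call (e.g. a gRPC request to a
    TensorFlow Serving backend). Here it just returns a dummy score."""
    return {"score": sum(features) / max(len(features), 1)}


class PredictHandler(BaseHTTPRequestHandler):
    """Tiny RESTful shim: accepts a JSON POST, returns a JSON prediction."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        result = predict(body.get("features", []))
        payload = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass
```

To serve, run `HTTPServer(("localhost", 8080), PredictHandler).serve_forever()`; a real deployment would swap `predict` for the gRPC call into the serving backend.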


Build TensorFlow Serving with GPU support

https://github.com/tensorflow/serving/issues/17


Batching Inference

https://github.com/tensorflow/serving/tree/master/tensorflow_serving/batching
https://github.com/tensorflow/serving/pull/57
https://github.com/tensorflow/serving/issues/10
https://github.com/tensorflow/serving/issues/60
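The links above cover TensorFlow Serving's server-side batching library. As a toy illustration of the underlying idea (collect individual requests into a batch, then invoke the model once per batch), here is a pure-Python sketch; the `Batcher` class and `run_model` function are made up for illustration and are not the TensorFlow Serving API:

```python
import queue
import threading


def run_model(batch):
    """Stand-in for one model invocation over a whole batch
    (e.g. a single Session.run over stacked inputs)."""
    return [x * 2 for x in batch]


class Batcher:
    """Collects individual requests and processes them in groups,
    amortizing per-invocation overhead across the batch."""

    def __init__(self, max_batch_size=4, timeout=0.05):
        self.max_batch_size = max_batch_size
        self.timeout = timeout
        self.requests = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def submit(self, x):
        """Enqueue one input; returns a slot whose 'done' event fires
        once 'output' has been filled in by the batch thread."""
        slot = {"input": x, "done": threading.Event()}
        self.requests.put(slot)
        return slot

    def _loop(self):
        while True:
            # Block for the first request, then greedily fill the batch
            # until it is full or the timeout elapses.
            batch = [self.requests.get()]
            while len(batch) < self.max_batch_size:
                try:
                    batch.append(self.requests.get(timeout=self.timeout))
                except queue.Empty:
                    break
            outputs = run_model([s["input"] for s in batch])
            for slot, out in zip(batch, outputs):
                slot["output"] = out
                slot["done"].set()
```

Usage: `slot = batcher.submit(21); slot["done"].wait(); slot["output"]` yields `42`. The real batching library adds the part this sketch omits: a bounded pool of batch threads, which is exactly what the `StatusCode.UNAVAILABLE` "all batch threads are busy" error in the comments below refers to.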


Using Docker

https://blog.giantswarm.io/moving-docker-container-images-around/

Using Save

Alternatively, Bob could simply have done a docker save on the Ubuntu image to give Alice a copy of the container's image, which she could use without Bob's modifications:
$ docker save ubuntu | gzip > ubuntu-golden.tar.gz
Alice would then take that copy and load it, instead of doing an import:
$ gzcat ubuntu-golden.tar.gz | docker load
Now Alice runs the container and notes that Bob's file is not there:
$ docker run -i -t ubuntu /bin/bash
root@e11b3abb67de:/# ls  
bin  boot  dev  etc  home  lib  lib64  media  mnt  opt  proc  root  run  sbin  srv  sys  tmp  usr  var  
Note that if Alice does a docker import - ubuntu instead of a docker load, Docker will store the image with zero complaints. Docker will even try to start an instance imported from an image that was saved with docker export; it will just fail to run anything in the container when it does so.



http://stackoverflow.com/questions/23935141/how-to-copy-docker-images-from-one-host-to-another-without-via-repository

You will need to save the Docker image as a tar file:
docker save -o <save image to path> <image name>
Then copy the image to the new system with regular file-transfer tools such as cp or scp, and load it into Docker there:
docker load -i <path to image tar file>
PS: You may need to run all of these commands with sudo.


http://stackoverflow.com/questions/20932357/docker-enter-running-container-with-new-tty

docker exec -it [container-id] bash



5 comments:

  1. http://media.bemyapp.com/practical-methodology-deploying-machine-learning/

  2. http://blog.kubernetes.io/2016/03/scaling-neural-network-image-classification-using-Kubernetes-with-TensorFlow-Serving.html

  3. NetworkError: NetworkError(code=StatusCode.UNAVAILABLE, details="This task would start a fresh batch, but all batch threads are busy, so at present there is no processing capacity available for this task")

     https://github.com/tensorflow/serving/issues/10

  4. CUDA for TF serving docker image

     https://github.com/tensorflow/serving/issues/71

  5. https://devblogs.nvidia.com/parallelforall/nvidia-docker-gpu-server-application-deployment-made-easy/