Below you will find pages that utilize the taxonomy term “Docker”
Using Kubernetes Jobs for one-off ingestion of CSVs
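As a rough sketch of the idea (not the manifest from the post itself), a one-off ingestion can be submitted as a Kubernetes Job through the official `kubernetes` Python client; the image name, command, and file path below are placeholders.

```python
# Sketch: submit a one-off CSV-ingestion Job with the official `kubernetes`
# Python client. The image and command are placeholders, not the post's.
from kubernetes import client, config


def create_csv_ingest_job(namespace="default"):
    config.load_kube_config()  # uses whatever context kubectl currently points at

    container = client.V1Container(
        name="csv-ingest",
        image="my-registry/csv-ingest:latest",      # placeholder image
        command=["python", "ingest.py", "--file", "/data/input.csv"],
    )
    pod_spec = client.V1PodSpec(
        containers=[container],
        restart_policy="Never",  # a Job pod should finish, not restart forever
    )
    job = client.V1Job(
        api_version="batch/v1",
        kind="Job",
        metadata=client.V1ObjectMeta(name="csv-ingest"),
        spec=client.V1JobSpec(
            template=client.V1PodTemplateSpec(spec=pod_spec),
            backoff_limit=3,  # retry a failed ingestion a few times, then give up
        ),
    )
    return client.BatchV1Api().create_namespaced_job(namespace=namespace, body=job)


if __name__ == "__main__":
    create_csv_ingest_job()
```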
Running Postgres on Kubernetes locally
While this may be overkill, it's better than configuring a Kubernetes cluster on gcloud or elsewhere, and if done correctly it will translate to a cloud service we can later use in a production system, while letting us focus on each microservice individually.
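To give a feel for what this looks like, here is a minimal sketch (not the configuration from the post) that stands up a single Postgres pod plus a ClusterIP Service through the official `kubernetes` Python client; the image tag, password, and labels are placeholders, and there is no persistent volume, so the data is throwaway.

```python
# Sketch: a single-replica Postgres Deployment and a Service on the local
# cluster, via the official `kubernetes` Python client instead of YAML.
from kubernetes import client, config

config.load_kube_config()

postgres_container = client.V1Container(
    name="postgres",
    image="postgres:10",  # placeholder tag; pin whichever version you target
    env=[client.V1EnvVar(name="POSTGRES_PASSWORD", value="devpassword")],  # dev only
    ports=[client.V1ContainerPort(container_port=5432)],
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="postgres"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "postgres"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "postgres"}),
            # No volume here, so data disappears with the pod -- fine for local work.
            spec=client.V1PodSpec(containers=[postgres_container]),
        ),
    ),
)

service = client.V1Service(
    metadata=client.V1ObjectMeta(name="postgres"),
    spec=client.V1ServiceSpec(
        selector={"app": "postgres"},
        ports=[client.V1ServicePort(port=5432, target_port=5432)],
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
client.CoreV1Api().create_namespaced_service(namespace="default", body=service)
```

For anything beyond throwaway local work you would want a PersistentVolumeClaim (or a StatefulSet) so the data survives pod restarts.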
To start Kubernetes locally
I am using Docker for Mac, which ships with Kubernetes v1.9.8 at the time of writing. While it may not perfectly replicate a development/staging/production environment, I find it much more straightforward to develop this way thanks to the newer Kubernetes tooling. We could use Docker Swarm or Compose instead, but Kubernetes is more versatile for what I'm after: not just running an app, but building pipelines to ingest data and maintaining a stable configuration that could be reused in the cloud (it much more closely resembles developing locally and then rolling out to a hosted Kubernetes system).
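As a quick sanity check that the bundled cluster is actually up, something like the sketch below works; it assumes kubectl's current context already points at the local Docker for Mac cluster (typically named docker-for-desktop on this version).

```python
# Sketch: confirm the local Kubernetes bundled with Docker for Mac is reachable
# by listing its nodes and namespaces through the `kubernetes` Python client.
from kubernetes import client, config

config.load_kube_config()  # picks up the context kubectl is currently using

v1 = client.CoreV1Api()
for node in v1.list_node().items:
    print("node:", node.metadata.name, "-", node.status.node_info.kubelet_version)
for ns in v1.list_namespace().items:
    print("namespace:", ns.metadata.name)
```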
TensorFlow 0.7.0 Dockerfile with Python 3
edit: everything has since been updated to TensorFlow 0.7.0, built on top of my base CUDA Dockerfile so it can be used with either TensorFlow or Theano (depending on my goals; Keras makes it easy to switch backends when trading off training speed against compile times).
TensorFlow
In 2015 Google released a new deep learning framework/tensor library that is similar in many ways to Theano, and I enjoy using it a lot more, mainly because of Theano's long compile times when used with Keras, and because of TensorBoard. This post will not go into detail on Theano, TensorFlow, or Keras; instead it covers how I built a Docker image for a slightly older NVIDIA card (which, for my purposes, can use multiple GPUs in isolation, so exiting a model on one card does not affect the other).
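The post itself is about the Dockerfile, but as a sketch of the GPU-isolation idea mentioned above (not the post's actual training code), restricting each process to one card with CUDA_VISIBLE_DEVICES keeps a crash or exit on one GPU from touching the other; the GPU index and the tiny graph below are placeholders, using the TF 0.x-era session API.

```python
# Sketch: keep two training runs isolated on separate GPUs by restricting each
# process to a single card before the framework initialises CUDA.
import os

os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # this process only ever sees GPU 0

import tensorflow as tf  # TF 0.x-era API, matching the Dockerfile's version

# Inside the process, the one visible card shows up as /gpu:0.
with tf.device("/gpu:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    c = tf.matmul(a, b)

# log_device_placement prints which device each op actually landed on.
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    print(sess.run(c))
```

A second process launched with CUDA_VISIBLE_DEVICES="1" then only ever sees the other card, which is one way to get the isolation described above.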