downloading files from kaggle

Issue: Often there is no simple way to get files from Kaggle onto a remote server. Previously I had used either a cookies extension or a Python command-line module that let me specify the competition, but neither works efficiently, or at all, for various reasons. I had been meaning to write a script to do this for some time but had never gotten around to it. »
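A minimal sketch of the cookies-extension approach described above: reuse the session cookies from a logged-in browser to fetch a competition file from a remote box. The URL pattern and the raw `Cookie` header value are illustrative assumptions, not Kaggle's documented API.

```python
# Hedged sketch: download a Kaggle competition file on a remote server
# by reusing browser session cookies. URL pattern is a guess, not an API.
import urllib.request

def competition_file_url(competition, filename):
    """Build a hypothetical competition download URL."""
    return f"https://www.kaggle.com/c/{competition}/download/{filename}"

def download_with_cookies(url, cookie_header, dest, chunk=1 << 20):
    """Stream a large file to disk, sending a raw Cookie header."""
    req = urllib.request.Request(url, headers={"Cookie": cookie_header})
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as f:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            f.write(block)
    return dest
```

The cookie header would come from something like an "export cookies" browser extension; streaming in chunks keeps memory flat for multi-gigabyte datasets.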

telegram bot

Setting up a Telegram Bot in Python and Docker: This post is my personal introduction to using Telegram and Bluemix, and while it is incredibly simple, it is useful to see how to do the basics before integrating peripheral APIs or extraneous processes. First, set up a Telegram account however you'd like. I personally used the desktop client and it worked great. Once you have an account, you need to interact with @BotFather to set up a bot. »
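Once @BotFather hands you a token, the basics can be sketched with plain HTTP calls against the Telegram Bot API (endpoints have the form `https://api.telegram.org/bot<token>/<method>`); the token and chat id below are placeholders.

```python
# Minimal Telegram Bot API sketch; token/chat_id are placeholders
# you would get from @BotFather and from a real chat.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.telegram.org"

def api_url(token, method):
    """Telegram Bot API endpoints look like /bot<token>/<method>."""
    return f"{API_BASE}/bot{token}/{method}"

def send_message(token, chat_id, text):
    """POST a sendMessage call and return the decoded JSON reply."""
    data = urllib.parse.urlencode({"chat_id": chat_id, "text": text}).encode()
    with urllib.request.urlopen(api_url(token, "sendMessage"), data=data) as resp:
        return json.load(resp)
```

Wrapper libraries such as python-telegram-bot hide this plumbing, but seeing the raw endpoint shape makes their abstractions easier to follow.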

TensorFlow 0.7.0 dockerfile with Python 3

edit: everything has since been updated to TensorFlow 0.7.0, which I based off of my base CUDA dockerfile for use with TensorFlow or Theano (depending on my goals; Keras allows great flexibility in switching between them for training vs. compiling). TensorFlow: In 2015 Google came out with a new deep learning framework/tensor library similar in many ways to Theano, and I enjoy using it a lot more than Theano, simply due to Theano's long compile times when using Keras, plus TensorBoard. »

using generative neural nets in keras to create ‘on-the-fly’ dialogue

Note 4/10/17: almost all the Python modules have changed quite a bit since this original post, and there are issues with youtube-dl and Keras; if you would like to work on an updated version, or have one, please let me know! Introduction: There have been a few cool things done with generative neural nets so far, but to my knowledge very few generative neural nets have found a useful application in any publicly discussed business setting. »

DeepQA with Customer Service Reps

Introduction: The dataset we will be using is quite special in the sense that this type of data is usually quite proprietary, and for good reason. While it may include a multitude of different … Preparing the data: While we are not going to do much besides run the data through a preliminary DeepQA repo to see what happens, we need to format the data. This will give us a grasp of how mediocre the data is off the bat (lots of inaudible/random cut-offs, not much information). »