If you’re new to machine learning, please refer to this post to install H2O and set up your system. Now that you have everything installed, let’s get to the main point: we will work through the basic H2O API step by step.
Step 1 : Import the library. code – import h2o
Step 2 : Run the init code. code – h2o.init()
We ran this code earlier; don’t worry, running it again will do no harm.
I will go through this in detail shortly, but I want to emphasize up front that this is the complete script: it downloads the data, prepares it, creates a multilayer neural network model (i.e. deep learning) that is competitive with the state of the art on this data set, and runs predictions on it.
Before any of this, I want to introduce you to the most common data set in machine learning: the Iris data set. If you haven’t heard of the Iris data set before, this must be your first machine learning experience. The Iris data set is a set of 150 observations of iris plants. It records the petal and sepal measurements of three different species of iris (Setosa, Versicolour, and Virginica), stored in a 150×4 numpy.ndarray.

The rows are the samples and the columns are: Sepal Length, Sepal Width, Petal Length, and Petal Width. It is a very popular data set for machine learning experiments because it is small enough to be usefully viewed in a chart, big enough to be interesting, and nontrivial: none of the four measurements neatly divides the data up.
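If you want to inspect that 150×4 array yourself before touching H2O, here is a quick sketch using scikit-learn, which ships a copy of the same data. Note that scikit-learn is an extra assumption on my part; it is not required for the H2O example below.

```python
from sklearn.datasets import load_iris

iris = load_iris()
print(iris.data.shape)      # the 150x4 numpy.ndarray of measurements
print(iris.feature_names)   # sepal/petal length and width, in cm
print(iris.target_names)    # the three species: setosa, versicolor, virginica
```

Each row is one flower; the target array holds the species label that the deep learning model will try to predict.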
Example 3-1: Deep learning on the Iris data set in Python.
import h2o
h2o.init()

# Download the Iris data (with a header row) and load it into the cluster
datasets = "https://raw.githubusercontent.com/DarrenCook/h2obook/master/datasets/"
data = h2o.import_file(datasets + "iris_wheader.csv")

# "class" is the column to predict; everything else is an input feature
y = "class"
x = data.names
x.remove(y)

# Roughly 80% of rows for training, the rest for testing
train, test = data.split_frame([0.8])

# Build a deep learning model and run predictions on the held-out rows
m = h2o.estimators.deeplearning.H2ODeepLearningEstimator()
m.train(x, y, train)
p = m.predict(test)
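One detail worth calling out: data.split_frame([0.8]) gives an approximate 80/20 split, not an exact one, because each row is assigned to the training set independently at random. The idea can be sketched with plain NumPy (a hypothetical illustration of the concept, not H2O’s actual implementation):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_rows = 150                  # same size as the Iris data set
coin = rng.random(n_rows)     # one uniform draw in [0, 1) per row
train_mask = coin < 0.8       # ~80% of rows land in the training set

n_train = int(train_mask.sum())
n_test = n_rows - n_train
print(n_train, n_test)        # roughly 120/30; the exact counts vary with the seed
```

This is why two runs of the script can produce slightly different train/test sizes, and why results on such a small data set can vary a little from run to run.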