Computer Vision News - January 2019

Tip - Train Your Network

Using Hyperas, you need nothing more than double curly braces to define the range of values or options you want evaluated for each hyperparameter or network configuration element, and Hyperas will tune them for you. For our example, we will evaluate Dropout values drawn uniformly between 0 and 1, compare two different activation functions (relu and sigmoid), and three possible values for the number of neurons in the dense layer: 256, 512 and 1024. Three optimization algorithms will be evaluated: adam, rmsprop and sgd. All of this code is written inside a create_model function, which both compiles the model and trains it. This function is then passed to optim.minimize, which returns the best performing model.

For reference, here is the plain Keras model before any Hyperas annotations are added:

model = Sequential()
model.add(Dense(512, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(10))
model.add(Activation('softmax'))

And here is the same model inside create_model, with the search space for each tuned element written in double curly braces:

def create_model(x_train, y_train, x_test, y_test):
    model = Sequential()
    model.add(Dense(512, input_shape=(784,)))
    model.add(Activation('relu'))
    model.add(Dropout({{uniform(0, 1)}}))
    model.add(Dense({{choice([256, 512, 1024])}}))
    model.add(Activation({{choice(['relu', 'sigmoid'])}}))
    model.add(Dropout({{uniform(0, 1)}}))

    # If we choose 'four', add an additional fourth layer
    if {{choice(['three', 'four'])}} == 'four':
        model.add(Dense(100))

        # We can also choose between complete sets of layers
        model.add({{choice([Dropout(0.5), Activation('linear')])}})
        model.add(Activation('relu'))

    model.add(Dense(10))
    model.add(Activation('softmax'))

    model.compile(loss='categorical_crossentropy',
                  metrics=['accuracy'],
                  optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})
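The listing above stops at the compile step, but as noted, create_model must also train the model and return a result that hyperopt can minimize. A minimal sketch of that missing tail is below; the batch size options, epoch count, and validation split are illustrative assumptions, not values from the article (np and STATUS_OK come from numpy and hyperopt, imported at the top of the script):

    # Inside create_model, after model.compile(...):
    result = model.fit(x_train, y_train,
                       batch_size={{choice([64, 128])}},  # illustrative search choice
                       epochs=2,                          # illustrative epoch count
                       verbose=2,
                       validation_split=0.1)              # illustrative split

    # hyperopt minimizes 'loss', so we negate the best validation accuracy
    validation_acc = np.amax(result.history['val_acc'])
    return {'loss': -validation_acc, 'status': STATUS_OK, 'model': model}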
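To run the search, create_model is handed to optim.minimize together with a function that returns the training and test data. A minimal sketch of the call, assuming our MNIST example (the data() helper and the max_evals budget of 10 here are illustrative):

from keras.datasets import mnist
from keras.utils import to_categorical
from hyperas import optim
from hyperas.distributions import choice, uniform
from hyperopt import Trials, STATUS_OK, tpe
import numpy as np

def data():
    # Flatten MNIST images to 784-dim vectors, matching the model's input_shape
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    x_train = x_train.reshape(60000, 784).astype('float32') / 255
    x_test = x_test.reshape(10000, 784).astype('float32') / 255
    y_train = to_categorical(y_train, 10)
    y_test = to_categorical(y_test, 10)
    return x_train, y_train, x_test, y_test

best_run, best_model = optim.minimize(model=create_model,
                                      data=data,
                                      algo=tpe.suggest,  # Tree-of-Parzen-Estimators search
                                      max_evals=10,      # illustrative trial budget
                                      trials=Trials())
print('Best hyperparameter combination:', best_run)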
