Keras is a Python-based deep learning library
Keras is a high-level neural network API, written in pure Python and running on top of a backend such as TensorFlow, Theano, or CNTK.
Installation steps and pitfalls encountered:
(1) Install TensorFlow: in a CMD window, run pip install --upgrade tensorflow
(2) Install Keras: pip install keras -U --pre
(3) Verify TensorFlow
Enter the following code in Jupyter Notebook or Spyder:
import tensorflow as tf
hello = tf.constant("hello,tensorflow")
sess = tf.Session()
print(sess.run(hello))
If "hello,tensorflow" is displayed, the installation succeeded.
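Note that the snippet above targets the TensorFlow 1.x API. If pip install --upgrade tensorflow pulled in a 2.x release, tf.Session no longer exists at the top level; a minimal sketch of an equivalent check under that assumption (not part of the original walkthrough):

import tensorflow as tf

print(tf.__version__)                    # confirm which major version was installed
hello = tf.constant("hello,tensorflow")
print(hello.numpy())                     # TF 2.x runs eagerly, so no Session is needed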
(4) Verify Keras
Test with the MNIST example shipped with Keras. Download the Keras source and run the following commands in the command line:
>>> conda install git                                  # install the git tool
>>> git clone https://github.com/fchollet/keras.git    # download the keras project
>>> cd keras/examples/                                  # go to the directory containing the test scripts
>>> python mnist_mlp.py                                 # run the test script
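Before running the example, a quick sanity check that Keras itself imports and finds the TensorFlow backend can save time; a minimal sketch, added here as a suggestion rather than one of the original steps:

import keras                  # older standalone Keras versions typically print "Using TensorFlow backend." here
print(keras.__version__)      # a version string confirms the pip install worked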
Two pitfalls came up while verifying Keras; the problems and their solutions are described below:
(1) The conda update failed: installing the git tool hit CondaHTTPError: HTTP 000 CONNECTION FAILED for url <https://repo.anaconda.com/pkgs/main/win-64/git-2. The fix is to switch to a domestic mirror, in this case the Tsinghua (TUNA) mirror:
>>> conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
>>> conda config --set show_channel_urls yes    # generates the config file
Then edit the generated config file C:\Users\<your username>\.condarc:
# before:
channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
  - default
ssl_verify: true
show_channel_urls: true

# after:
channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
ssl_verify: true
show_channel_urls: true
Check the configuration with >>> conda info; once the change is confirmed, >>> conda install git completes the download and update.
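To double-check the mirror change without opening .condarc, the active channel list can also be printed directly (an optional extra check, not one of the original steps):

>>> conda config --show channels    # should list the Tsinghua mirror URL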
(2) The MNIST dataset used by the Keras examples cannot be downloaded
Cause: the Keras source downloads MNIST via path = get_file(path, origin='https://s3.amazonaws.com/img-datasets/mnist.npz'), i.e. the data is fetched from url = https://s3.amazonaws.com/img-datasets/mnist.npz. That URL is blocked from within China, so every MNIST-related example hangs at the data-download step.
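For context on why a local copy works: get_file caches downloads under the Keras cache directory, which by default is ~/.keras/datasets (C:\Users\<your username>\.keras\datasets on Windows). Depending on the Keras version, copying mnist.npz into that cache may also let mnist.load_data() skip the download without editing the example; the following is a small sketch to locate that directory, an assumption on my part rather than the fix used below:

import os

# default cache location used by keras.utils.get_file (assumed: cache_dir=~/.keras, cache_subdir='datasets')
cache = os.path.join(os.path.expanduser("~"), ".keras", "datasets")
print(cache)
print(os.path.exists(os.path.join(cache, "mnist.npz")))   # True once mnist.npz has been copied there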
Solution:
(a) Download the mnist.npz dataset and place it in the .\keras\examples directory
(b) Modify mnist_mlp.py as follows:
'''Trains a simple deep NN on the MNIST dataset.
Gets to 98.40% test accuracy after 20 epochs
(there is *a lot* of margin for parameter tuning).
2 seconds per epoch on a K520 GPU.
'''
from __future__ import print_function

import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import RMSprop

batch_size = 128
num_classes = 10
epochs = 20

# load data from local
import numpy as np
path = "./mnist.npz"
f = np.load(path)
x_train, y_train = f["x_train"], f["y_train"]
x_test, y_test = f["x_test"], f["y_test"]
f.close()

# the data, split between train and test sets
# (x_train, y_train), (x_test, y_test) = mnist.load_data()

x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(num_classes, activation='softmax'))

model.summary()

model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])

history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_data=(x_test, y_test))
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
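Before running the modified script, it may help to confirm that the locally downloaded mnist.npz is intact; a minimal sketch, assuming the same relative path the modified script uses:

import numpy as np

with np.load("./mnist.npz") as f:
    print(f["x_train"].shape, f["y_train"].shape)   # expected: (60000, 28, 28) (60000,)
    print(f["x_test"].shape, f["y_test"].shape)     # expected: (10000, 28, 28) (10000,)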