This is my ninth article, written for day 4 of the Juejin "Daily New Plan · December Writing Challenge".
The Fully Connected (Dense) Layer
A dense layer with 5 inputs and 100 outputs multiplies its input by a 5x100 weight matrix, so a 10x5 input matrix produces a 10x100 output. First, import the packages needed for this article:
%matplotlib inline
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import sklearn
import pandas as pd
import os
import sys
import time
import tensorflow as tf
from tensorflow import keras

print(tf.__version__)
print(sys.version_info)
for module in mpl, np, pd, sklearn, tf, keras:
    print(module.__name__, module.__version__)
Create a dense layer to experiment with. We usually pass input_shape only to the first layer; with 100 units, a 10x5 input produces a 10x100 output:
layer = keras.layers.Dense(100, input_shape=(None, 5))
print(type(layer))
print('-'*50)
layer(tf.ones([10, 5])) # calling the layer on a 10x5 input produces its output
The corresponding output is as follows:
layer.variables prints all of the parameters contained in the layer:
# x * w + b: w is the layer's weight matrix (the kernel), and b is the bias
print(layer.variables)
print('-'*50)
layer.trainable_variables
# get all the trainable variables; here the count matches layer.variables above
The output is as follows:
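As a quick check on those variables: the kernel is 5x100 and the bias has 100 entries, so the layer holds 5*100 + 100 = 600 trainable parameters in total. A small sketch verifying this (the layer is built on its first call, which is when the kernel and bias are created):

```python
import tensorflow as tf
from tensorflow import keras

# a Dense layer mapping 5 inputs to 100 outputs, as in the text above
layer = keras.layers.Dense(100)
layer(tf.ones([10, 5]))  # first call builds the layer's variables

kernel, bias = layer.variables
print(kernel.shape)  # (5, 100)
print(bias.shape)    # (100,)

# total trainable parameters: 5*100 + 100 = 600
total = sum(int(tf.size(v)) for v in layer.trainable_variables)
print(total)  # 600
```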
Now let's put layers to work on a real dataset:
from sklearn.datasets import fetch_california_housing
housing = fetch_california_housing()
print(housing.DESCR)
print(housing.data.shape)
print(housing.target.shape)
The code above loads sklearn's California housing dataset. Next, split the data into training, validation, and test sets:
from sklearn.model_selection import train_test_split
x_train_all, x_test, y_train_all, y_test = train_test_split(
housing.data, housing.target, random_state = 7)
x_train, x_valid, y_train, y_valid = train_test_split(
x_train_all, y_train_all, random_state = 11)
print(x_train.shape, y_train.shape)
print(x_valid.shape, y_valid.shape)
print(x_test.shape, y_test.shape)
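By default train_test_split holds out 25% of the samples, so splitting twice leaves roughly 56% / 19% / 25% for train / validation / test. A sketch of the same splitting logic on dummy data with the housing dataset's dimensions (20640 samples, 8 features):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# dummy stand-in with the same shape as California housing
data = np.random.rand(20640, 8)
target = np.random.rand(20640)

# first split: 25% held out as the test set
x_train_all, x_test, y_train_all, y_test = train_test_split(
    data, target, random_state=7)
# second split: 25% of the remainder becomes the validation set
x_train, x_valid, y_train, y_valid = train_test_split(
    x_train_all, y_train_all, random_state=11)

print(x_train.shape)  # (11610, 8)
print(x_valid.shape)  # (3870, 8)
print(x_test.shape)   # (5160, 8)
```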
class CustomizedDenseLayer(keras.layers.Layer):
    def __init__(self, units, activation=None, **kwargs):
        self.units = units
        self.activation = keras.layers.Activation(activation) # reuse the activation layer tf provides
        super(CustomizedDenseLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        """Build the parameters the layer needs, i.e. the kernel and the bias."""
        # x * w + b. input_shape: [None, a]; w: [a, b]; output_shape: [None, b]
        print('-'*50)
        print(input_shape)
        self.kernel = self.add_weight(name = 'kernel',
                                      shape = (input_shape[1], self.units),
                                      initializer = 'uniform', # initialize the kernel from a uniform distribution
                                      trainable = True)
        self.bias = self.add_weight(name = 'bias',
                                    shape = (self.units, ),
                                    initializer = 'zeros',
                                    trainable = True)
        # finally, call the parent class's build
        super(CustomizedDenseLayer, self).build(input_shape)

    def call(self, x):
        """The forward pass: x * w + b, followed by the activation."""
        return self.activation(x @ self.kernel + self.bias)
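Before wiring the custom layer into a model, it can be sanity-checked in isolation. A self-contained sketch (it repeats the class, including the forward call method, so it runs on its own):

```python
import tensorflow as tf
from tensorflow import keras

class CustomizedDenseLayer(keras.layers.Layer):
    def __init__(self, units, activation=None, **kwargs):
        super(CustomizedDenseLayer, self).__init__(**kwargs)
        self.units = units
        self.activation = keras.layers.Activation(activation)

    def build(self, input_shape):
        # kernel: [input_dim, units], bias: [units]
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.units),
                                      initializer='uniform',
                                      trainable=True)
        self.bias = self.add_weight(name='bias',
                                    shape=(self.units,),
                                    initializer='zeros',
                                    trainable=True)
        super(CustomizedDenseLayer, self).build(input_shape)

    def call(self, x):
        # forward pass: x * w + b, then the activation
        return self.activation(x @ self.kernel + self.bias)

layer = CustomizedDenseLayer(30, activation='relu')
out = layer(tf.ones([10, 5]))
print(out.shape)  # (10, 30)
```

A 10x5 input through a 30-unit layer yields a (10, 30) output, and with relu every entry is non-negative, mirroring the Dense behaviour described earlier.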
Define a softplus activation as a Lambda layer (equivalent to the built-in "softplus" activation), then build the network with the custom layer:
customized_softplus = keras.layers.Lambda(lambda x: tf.nn.softplus(x))

model = keras.models.Sequential([
    CustomizedDenseLayer(30, activation='relu',
                         input_shape=x_train.shape[1:]), # pass in the number of features
    CustomizedDenseLayer(1, activation=customized_softplus),
    # the line above is equivalent to either of the following:
    # keras.layers.Dense(1, activation="softplus"), # as one layer
    # keras.layers.Dense(1), keras.layers.Activation('softplus'), # as two layers
])
The output is as follows:
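The section stops at building the model; to actually train it, a compile-and-fit step would typically come next. A hedged sketch on dummy data with the housing dataset's 8 features (the loss, optimizer, epoch count, and plain Dense layers standing in for the custom layer are my assumptions, not from the original):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# dummy stand-in for the housing features: 100 samples, 8 features
x_train = np.random.rand(100, 8).astype(np.float32)
y_train = np.random.rand(100).astype(np.float32)

# softplus as a Lambda layer, as in the equivalences noted above
customized_softplus = keras.layers.Lambda(lambda x: tf.nn.softplus(x))

model = keras.models.Sequential([
    keras.layers.Dense(30, activation='relu', input_shape=x_train.shape[1:]),
    keras.layers.Dense(1, activation=customized_softplus),
])
# mean squared error is the usual choice for this regression task
model.compile(loss='mean_squared_error', optimizer='sgd')
history = model.fit(x_train, y_train, epochs=2, verbose=0)
print(len(history.history['loss']))  # 2
```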