
On CNN Optimization

import numpy as np

創(chuàng)新互聯(lián)公司是一家集網(wǎng)站建設(shè),白城企業(yè)網(wǎng)站建設(shè),白城品牌網(wǎng)站建設(shè),網(wǎng)站定制,白城網(wǎng)站建設(shè)報(bào)價(jià),網(wǎng)絡(luò)營(yíng)銷,網(wǎng)絡(luò)優(yōu)化,白城網(wǎng)站推廣為一體的創(chuàng)新建站企業(yè),幫助傳統(tǒng)企業(yè)提升企業(yè)形象加強(qiáng)企業(yè)競(jìng)爭(zhēng)力??沙浞譂M足這一群體相比中小企業(yè)更為豐富、高端、多元的互聯(lián)網(wǎng)需求。同時(shí)我們時(shí)刻保持專業(yè)、時(shí)尚、前沿,時(shí)刻以成就客戶成長(zhǎng)自我,堅(jiān)持不斷學(xué)習(xí)、思考、沉淀、凈化自己,讓我們?yōu)楦嗟钠髽I(yè)打造出實(shí)用型網(wǎng)站。

# serialization and deserialization
import pickle
from sklearn.preprocessing import OneHotEncoder

import warnings
warnings.filterwarnings('ignore')

import tensorflow as tf

Data loading (with pickle)

def unpickle(file):
    # load one CIFAR-10 batch file (a pickled dict)
    with open(file, 'rb') as fo:
        data_dict = pickle.load(fo, encoding='ISO-8859-1')
    return data_dict
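For reference, each of these batch files should unpickle to a dict whose keys include 'data' (a uint8 array of shape (10000, 3072)) and 'labels' (a list of 10000 integers in 0-9), which is what the loading loop below relies on. A quick sanity check, assuming the dataset sits at the path used throughout this post:

# inspect one training batch (hypothetical check, not part of the original code)
batch = unpickle('./cifar-10-batches-py/data_batch_1')
print(batch['data'].shape, len(batch['labels']))   # expected: (10000, 3072) 10000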

labels = []
X_train = []
for i in range(1, 6):
    data = unpickle('./cifar-10-batches-py/data_batch_%d' % (i))
    labels.append(data['labels'])
    X_train.append(data['data'])

# convert the lists to ndarrays
X_train = np.array(X_train)
y_train = np.array(labels).reshape(-1)

# flatten the five batches into rows of 3072 values each
X_train = X_train.reshape(-1, 3072)

# one-hot encode the target values
one_hot = OneHotEncoder()
y_train = one_hot.fit_transform(y_train.reshape(-1, 1)).toarray()

# load the test data
test = unpickle('./cifar-10-batches-py/test_batch')
X_test = test['data']
y_test = one_hot.transform(np.array(test['labels']).reshape(-1, 1)).toarray()

# fetch one mini-batch of 128 samples from the full training set
index = 0
def next_batch(X, y):
    global index
    batch_X = X[index * 128:(index + 1) * 128]
    batch_y = y[index * 128:(index + 1) * 128]
    index += 1
    if index == 390:        # 390 * 128 = 49920 samples, then wrap around
        index = 0
    return batch_X, batch_y
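One more detail about the data layout: in the CIFAR-10 python batches each 3072-value row stores the image plane by plane (the first 1024 values are the red channel, the next 1024 green, the last 1024 blue). The direct reshape to [-1, 32, 32, 3] inside net_work below therefore does not yield conventional channels-last images; if that layout were wanted, a transpose along the following lines is typically used. to_hwc is a hypothetical helper and is not called by the original code:

def to_hwc(rows):
    # rows: ndarray of shape (N, 3072) in CIFAR-10's plane-major (R, G, B) layout
    # returns shape (N, 32, 32, 3) with the last axis ordered R, G, B
    return rows.reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)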

Building the neural network

Each convolutional block is built from five steps (the resulting feature-map sizes are sketched right after this list):

1. generate the convolution kernel (weights)
2. run the convolution with tf.nn.conv2d
3. batch normalization with tf.layers.batch_normalization
4. activation function (relu)
5. max pooling
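Since every convolution uses SAME padding with stride 1 (so it preserves the spatial size) and every pooling step uses stride 2 with SAME padding (so it halves it, rounding up), the feature map shrinks 32 -> 16 -> 8 -> 4 across the three blocks, which is where the 4*4*256 flatten size in the fully-connected layer comes from. A small sketch of that bookkeeping:

# spatial size after the three conv + pool blocks
size = 32
for _ in range(3):
    size = (size + 1) // 2   # SAME pooling with stride 2 -> ceil(size / 2)
print(size)                  # 4, so the flattened feature map has 4 * 4 * 256 = 4096 values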

X = tf.placeholder(dtype=tf.float32, shape=[None, 3072])
y = tf.placeholder(dtype=tf.float32, shape=[None, 10])
kp = tf.placeholder(dtype=tf.float32)   # dropout keep probability

def gen_v(shape, std=5e-2):
    # create a weight/bias variable initialized from a truncated normal
    return tf.Variable(tf.truncated_normal(shape=shape, stddev=std))

def conv(input_, filter_, b):
    # convolution -> batch norm -> relu -> 3x3 max pooling with stride 2
    conv = tf.nn.conv2d(input_, filter_, strides=[1, 1, 1, 1], padding='SAME') + b
    conv = tf.layers.batch_normalization(conv, training=True)
    conv = tf.nn.relu(conv)
    return tf.nn.max_pool(conv, [1, 3, 3, 1], [1, 2, 2, 1], 'SAME')

def net_work(X, kp):
    # reshape the flat input to 4-D: [batch, height, width, channels]
    input_ = tf.reshape(X, shape=[-1, 32, 32, 3])

    # block 1
    filter1 = gen_v(shape=[3, 3, 3, 64])
    b1 = gen_v(shape=[64])
    pool1 = conv(input_, filter1, b1)

    # block 2
    filter2 = gen_v([3, 3, 64, 128])
    b2 = gen_v(shape=[128])
    pool2 = conv(pool1, filter2, b2)

    # block 3
    filter3 = gen_v([3, 3, 128, 256])
    b3 = gen_v([256])
    pool3 = conv(pool2, filter3, b3)

    # first fully-connected layer
    dense = tf.reshape(pool3, shape=[-1, 4 * 4 * 256])
    fc1_w = gen_v(shape=[4 * 4 * 256, 1024])
    fc1_b = gen_v([1024])
    bn_fc_1 = tf.layers.batch_normalization(tf.matmul(dense, fc1_w) + fc1_b, training=True)
    relu_fc_1 = tf.nn.relu(bn_fc_1)   # shape: [-1, 1024]

    # dropout
    dp = tf.nn.dropout(relu_fc_1, keep_prob=kp)

    # fc2: output layer
    out_w = gen_v(shape=[1024, 10])
    out_b = gen_v(shape=[10])
    out = tf.matmul(dp, out_w) + out_b
    return out
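A note on the kp placeholder: tf.nn.dropout keeps each activation with probability keep_prob and rescales the kept values by 1/keep_prob, so feeding kp=0.5 during training enables dropout, while feeding kp=1.0 during evaluation (as the sessions below do) effectively turns it into an identity op.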

Loss function, accuracy & optimization

out = net_work(X, kp)
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=out))

# accuracy
y_ = tf.nn.softmax(out)
# tf.equal compares the true and predicted class indices element-wise (like ==)
equal = tf.equal(tf.argmax(y, axis=-1), tf.argmax(y_, axis=-1))
accuracy = tf.reduce_mean(tf.cast(equal, tf.float32))

opt = tf.train.AdamOptimizer(0.01).minimize(loss)

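One caveat with tf.layers.batch_normalization(training=True): in TF1 its moving-average update ops are placed in the tf.GraphKeys.UPDATE_OPS collection and are not run automatically. The original code does not handle this; a common pattern, shown here only as a hedged sketch that would replace the minimize call above, is to tie those updates to the training op:

# run the batch-norm moving-average updates together with every training step
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    opt = tf.train.AdamOptimizer(0.01).minimize(loss)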

Training

saver = tf.train.Saver()
epoches = 100
score_test = 0.0   # not in the original code; added so the first prints do not fail
                   # (the original notebook apparently had a value left over from an earlier run)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(epoches):
        batch_X, batch_y = next_batch(X_train, y_train)
        opt_, loss_, score_train = sess.run([opt, loss, accuracy],
                                            feed_dict={X: batch_X, y: batch_y, kp: 0.5})
        print('iter count:%d. mini_batch loss:%0.4f. train accuracy:%0.4f. test accuracy:%0.4f' %
              (i + 1, loss_, score_train, score_test))
        if score_train > 0.6:
            saver.save(sess, './model/estimator', i + 1)
    saver.save(sess, './model/estimator', i + 1)
    score_test = sess.run(accuracy, feed_dict={X: X_test, y: y_test, kp: 1.0})
    print('test accuracy:', score_test)

iter count:1. mini_batch loss:3.1455. train accuracy:0.0938. test accuracy:0.2853
iter count:2. mini_batch loss:3.9139. train accuracy:0.2891. test accuracy:0.2853
iter count:3. mini_batch loss:5.1961. train accuracy:0.1562. test accuracy:0.2853
iter count:4. mini_batch loss:3.9102. train accuracy:0.2344. test accuracy:0.2853
iter count:5. mini_batch loss:4.1278. train accuracy:0.1719. test accuracy:0.2853
.....
iter count:97. mini_batch loss:1.5752. train accuracy:0.4844. test accuracy:0.2853
iter count:98. mini_batch loss:1.8480. train accuracy:0.3906. test accuracy:0.2853
iter count:99. mini_batch loss:1.5662. train accuracy:0.5391. test accuracy:0.2853
iter count:100. mini_batch loss:1.7489. train accuracy:0.4141. test accuracy:0.2853
test accuracy: 0.4711

epoches = 1100
with tf.Session() as sess:
    # resume from the checkpoint written after the first 100 iterations
    saver.restore(sess, './model/estimator-100')
    for i in range(100, epoches):
        batch_X, batch_y = next_batch(X_train, y_train)
        opt_, loss_, score_train = sess.run([opt, loss, accuracy],
                                            feed_dict={X: batch_X, y: batch_y, kp: 0.5})
        print('iter count:%d. mini_batch loss:%0.4f. train accuracy:%0.4f. test accuracy:%0.4f' %
              (i + 1, loss_, score_train, score_test))
        if score_train > 0.6:
            saver.save(sess, './model/estimator', i + 1)
        # every 100 iterations (skipping the restored step 100) re-evaluate on the test set
        if (i % 100 == 0) and (i != 100):
            score_test = sess.run(accuracy, feed_dict={X: X_test, y: y_test, kp: 1.0})
            print('---------------- test accuracy: ---------------', score_test)
    saver.save(sess, './model/estimator', i + 1)

iter count:101. mini_batch loss:1.4157. train accuracy:0.5234. test accuracy:0.4711
iter count:102. mini_batch loss:1.6045. train accuracy:0.4375. test accuracy:0.4711
....
iter count:748. mini_batch loss:0.6842. train accuracy:0.7734. test accuracy:0.4711
iter count:749. mini_batch loss:0.6560. train accuracy:0.8203. test accuracy:0.4711
iter count:750. mini_batch loss:0.7151. train accuracy:0.7578. test accuracy:0.4711
iter count:751. mini_batch loss:0.8092. train accuracy:0.7344. test accuracy:0.4711
iter count:752. mini_batch loss:0.7394. train accuracy:0.7422. test accuracy:0.4711
iter count:753. mini_batch loss:0.8732. train accuracy:0.7188. test accuracy:0.4711
iter count:754. mini_batch loss:0.8762. train accuracy:0.6953. test accuracy:0.4711
