
Simplest Linear Regression on Keras Framework

Kross / 672 reads

It's a regression problem with a single input feature.

I wrote this script for fun and as preparation for the upcoming mathematical modeling contest (and, admittedly, simply to complete my daily blog task). It didn't take long, which means I can still get some sleep...

I wrote it entirely on my own, without referring to any code on GitHub. Great progress!

I committed it to my own GitHub repository, which is not well organized.

Import Packages

import numpy as np
from keras.models import Sequential 
from keras.layers import Dense 
import matplotlib.pyplot as plt 
print ("Import finished")

Because importing Keras takes a little while, I want a hint that the packages have been imported successfully.

Generating Data

Shuffle the points so that the train/test split is random, and add some noise:

X = np.linspace(0, 2, 300) 
np.random.shuffle(X)
Y = 3 * X + np.random.randn(*X.shape) * 0.33

Data visualization

plt.scatter(X,Y)
plt.show()
print(X[:10], "\n", Y[:10])

Define Train and Test Data

X_train,Y_train = X[:260],Y[:260]
X_test,Y_test = X[260:],Y[260:]
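The slicing above only yields a random split because X was shuffled beforehand. An equivalent, self-contained sketch (with a hypothetical `shuffle_split` helper) that shuffles the indices jointly and then slices:

```python
import numpy as np

def shuffle_split(X, Y, n_train, seed=42):
    """Hypothetical helper: shuffle X and Y jointly, then split into train/test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))       # one permutation keeps X/Y pairs aligned
    X, Y = X[idx], Y[idx]
    return X[:n_train], Y[:n_train], X[n_train:], Y[n_train:]

X = np.linspace(0, 2, 300)
Y = 3 * X + np.random.randn(*X.shape) * 0.33
X_train, Y_train, X_test, Y_test = shuffle_split(X, Y, 260)
print(X_train.shape, X_test.shape)  # (260,) (40,)
```

Shuffling a single index permutation, rather than each array separately, is what keeps every feature matched with its label.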

Establish LR Model
The input and output dimensions are both set to 1:

model = Sequential()
model.add(Dense(units=1, kernel_initializer="uniform", activation="linear", input_dim=1))
weights = model.layers[0].get_weights() 
w_init = weights[0][0][0] 
b_init = weights[1][0] 
print("Linear regression model is initialized with weights w: %.2f, b: %.2f" % (w_init, b_init)) 

This shows the default (initial) coefficients.

Choose Loss-Function and Optimizer
Define the loss as mean squared error, and choose stochastic gradient descent (SGD) as the optimizer:

model.compile(loss="mse", optimizer="sgd")
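For reference, `"mse"` stands for the mean of the squared residuals; a small NumPy sketch of the quantity Keras minimizes:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average of squared residuals,
    # the same quantity selected by loss="mse" in compile().
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])
print(mse(y_true, y_pred))  # (0.25 + 0.0 + 1.0) / 3 ≈ 0.4167
```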

Train Model
Run 500 epochs of SGD iterations.

model.fit(X_train, Y_train, epochs=500, verbose=1)
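Under the hood, each step of (full-batch) gradient descent nudges w and b along the negative gradient of the MSE. A minimal NumPy sketch on noiseless data — the learning rate of 0.1 is an illustrative choice here, not necessarily what Keras's SGD uses by default:

```python
import numpy as np

def sgd_step(w, b, x, y, lr=0.1):
    """One full-batch gradient-descent step on MSE for y ≈ w*x + b."""
    err = w * x + b - y
    grad_w = 2 * np.mean(err * x)   # d(MSE)/dw
    grad_b = 2 * np.mean(err)       # d(MSE)/db
    return w - lr * grad_w, b - lr * grad_b

x = np.linspace(0, 2, 300)
y = 3 * x                           # noiseless version of the data above
w, b = 0.0, 0.0
for _ in range(500):
    w, b = sgd_step(w, b, x, y)
print(round(w, 2), round(b, 2))     # converges to w ≈ 3, b ≈ 0
```

Keras's `fit` does the same kind of update, typically on mini-batches rather than the full data set.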

The loss eventually stabilizes at around 0.0976:

Test Model

Y_pred = model.predict(X_test)
plt.scatter(X_test,Y_test)
plt.plot(X_test,Y_pred)
plt.show()
weights = model.layers[0].get_weights()
w_trained = weights[0][0][0]
b_trained = weights[1][0]
print("Linear regression model is trained with weights w: %.2f, b: %.2f" % (w_trained, b_trained))

The final weights are w = 3.00 and b = 0.03, very close to the true values (w = 3, b = 0); the small error of 0.03 in the bias is likely caused by the noise.
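As a sanity check, ordinary least squares on one feature has a closed-form solution, so the trained weights can be compared against `np.polyfit` (a fixed seed is added here for reproducibility; the exact values will differ slightly from the run above):

```python
import numpy as np

np.random.seed(0)  # fixed seed so the check is reproducible
X = np.linspace(0, 2, 300)
Y = 3 * X + np.random.randn(*X.shape) * 0.33
w_ols, b_ols = np.polyfit(X, Y, 1)  # degree-1 least-squares fit: Y ≈ w*X + b
print("OLS fit: w = %.2f, b = %.2f" % (w_ols, b_ols))
```

If the network has converged, its weights should agree with the closed-form fit to within the noise.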

Use model

Input 1.66 as feature:

a = np.array([[1.66]])  # shape (1, 1): one sample, one feature
pred = model.predict(a)
print(pred)

Tomorrow I will extend this script into a multi-dimensional regression model that can handle multi-feature regression problems.
