In the field of artificial neural networks, the term perceptron also refers to a single-layer artificial neural network, to distinguish it from the more complex multilayer perceptron. As a linear classifier, the (single-layer) perceptron is arguably the simplest form of feedforward artificial neural network. Despite its simple structure, the perceptron can learn to solve fairly complex problems. Its essential limitation is that it cannot handle problems that are not linearly separable.
The perceptron is a feedforward artificial neural network that represents its input as a feature vector. It is a binary classifier that maps a real-valued input vector x to a single binary output value f(x):

f(x) = 1 if w·x + b > 0, and 0 otherwise

Here w is the vector of real-valued weights, w·x is the dot product, and b is the bias, a constant that does not depend on any input value. The bias can be thought of as an offset of the activation function, or as giving the neuron a base level of activity.

The value of f(x) (0 or 1) classifies x as a positive or a negative instance, in the setting of a binary classification problem. If b is negative, the weighted sum of the inputs must produce a positive value greater than |b| to push the classifying neuron over the 0 threshold. Spatially, the bias shifts the position (though not the orientation) of the decision boundary.

Since the inputs are transformed directly into the output through the weights, the perceptron can be regarded as the simplest form of feedforward artificial neural network.
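As an illustrative sketch of this decision rule (not toolbox code; NumPy stands in for MATLAB, and `hardlim` mimics MATLAB's hard-limit activation, which outputs 1 for inputs >= 0):

```python
import numpy as np

def hardlim(n):
    # Hard-limit activation, as in MATLAB's hardlim: 1 when n >= 0, else 0.
    return 1 if n >= 0 else 0

def perceptron_predict(w, b, x):
    # Single-layer perceptron: weighted sum plus bias through the hard limiter.
    return hardlim(np.dot(w, x) + b)

# Hand-set weights and bias, the same values used in the MATLAB session below.
w = np.array([1.0, 1.0])
b = -2.0
print(perceptron_predict(w, b, [1, 1]))  # 1: only fires when both inputs are 1
print(perceptron_predict(w, b, [1, 0]))  # 0
```

With w = [1, 1] and b = -2, the weighted sum reaches the 0 threshold only when both inputs are 1, which is exactly the AND function built by hand later in this post.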
>> P=[0 1 0 1 1;1 1 1 0 0]
P =
     0     1     0     1     1
     1     1     1     0     0
>>
>> T=[0 1 0 0 0]
T =
     0     1     0     0     0
>> net = newp(minmax(P),1)
net =
    Neural Network object:
    architecture:
         numInputs: 1
         numLayers: 1
       biasConnect: [1]
      inputConnect: [1]
      layerConnect: [0]
     outputConnect: [1]
        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)
    subobject structures:
            inputs: {1x1 cell} of inputs
            layers: {1x1 cell} of layers
           outputs: {1x1 cell} containing 1 output
            biases: {1x1 cell} containing 1 bias
      inputWeights: {1x1 cell} containing 1 input weight
      layerWeights: {1x1 cell} containing no layer weights
    functions:
          adaptFcn: 'trains'
         divideFcn: (none)
       gradientFcn: 'calcgrad'
           initFcn: 'initlay'
        performFcn: 'mae'
          plotFcns: {'plotperform','plottrainstate'}
          trainFcn: 'trainc'
    parameters:
        adaptParam: .passes
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: .show, .showWindow, .showCommandLine, .epochs,
                    .goal, .time
    weight and bias values:
                IW: {1x1 cell} containing 1 input weight matrix
                LW: {1x1 cell} containing no layer weight matrices
                 b: {1x1 cell} containing 1 bias vector
    other:
              name: ''
          userdata: (user information)
>> net.iw{1,1}
ans =
     0     0
>> net.iw{1,1}=[1 1]
net =
    (the full Neural Network object listing is printed again; omitted here)
>> net.b{1}
ans =
     0
>> net.b{1}=-2
net =
    (the full Neural Network object listing is printed again; omitted here)
The task of this perceptron is to implement the AND operation: the output is 1 only when both input elements are 1.
sim is the simulation function; it runs the neural network on given inputs, which can be understood as testing it.
>> sim(net,[0;1])
ans =
     0
>> sim(net,[1;1])
ans =
     1
>> sim(net,[1;0])
ans =
     0
>> y=sim(net,[1;0])
y =
     0
mae computes the mean absolute error, and e is the error matrix. Because we set the correct weights by hand, there is no error. t is the correct output and y is the perceptron's actual output.
>> y=sim(net,[1 0 1;0 1 1])
y =
     0     0     1
>> t=[0 0 1]
t =
     0     0     1
>> e=t-y
e =
     0     0     0
>> perf=mae(3)
perf =
     3
>> perf=mae(e)
perf =
     0
>>
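The simulation and mae check above can be mirrored in Python (an illustrative sketch, not the toolbox code; NumPy arrays stand in for MATLAB matrices):

```python
import numpy as np

# Each *column* of P is one two-input sample, matching MATLAB's convention.
P = np.array([[1, 0, 1],
              [0, 1, 1]])
t = np.array([0, 0, 1])          # correct outputs for the three samples

w = np.array([1.0, 1.0])         # hand-set AND weights
b = -2.0
# Simulate: weighted sum per column, then hard limit (1 when >= 0).
y = (w @ P + b >= 0).astype(int)

e = t - y                        # error vector, like e = t - y in MATLAB
perf = np.mean(np.abs(e))        # mean absolute error, like MATLAB's mae(e)
print(y, e, perf)                # [0 0 1] [0 0 0] 0.0
```

Because the weights are correct, the error vector is all zeros and the mean absolute error is 0, matching the MATLAB session.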
Note the input format above: [1;0] is one sample, [0;1] is another, and [1;1] is the last, for three samples in total; see the example below.
ans(1,1) and ans(2,1) form one input sample, i.e. each column is a sample.
>> [1 0 1;0 1 1]
ans =
     1     0     1
     0     1     1
Now let's change the weights to incorrect values and look at the resulting error matrix and mean error:
>> y=sim(net,[1 0 1;0 1 1])
y =
     0     0     0
>> e=t-y
e =
     0     0     1
>> perf=mae(e)
perf =
    0.3333
>> t
t =
     0     0     1
>>
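The session above sets the weights by hand and never runs training, even though the network object lists trainFcn: 'trainc'. As a hedged sketch (illustrative variable names, not toolbox code), the classic perceptron learning rule that learnp/trainc implement would recover working AND weights from a wrong starting point:

```python
import numpy as np

# Columns of P are samples (the same P and T used at the top of the post).
P = np.array([[0, 1, 0, 1, 1],
              [1, 1, 1, 0, 0]])
T = np.array([0, 1, 0, 0, 0])        # AND targets

w = np.array([1.0, 0.0])             # deliberately wrong initial weights
b = -2.0

for _ in range(10):                  # a few passes suffice for this tiny set
    for p, t in zip(P.T, T):
        y = 1 if w @ p + b >= 0 else 0
        e = t - y
        w += e * p                   # perceptron weight update: dW = e * p
        b += e                       # bias update: db = e

y = (w @ P + b >= 0).astype(int)
print(y, np.mean(np.abs(T - y)))     # outputs match T; mae is 0 after training
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero error; the linearly inseparable case (e.g. XOR) is exactly where the perceptron fails, as noted in the introduction.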