Squiggly Boi · image post by fossilesque@mander.xyz to Science Memes@mander.xyz · English · 3 days ago
BeeegScaaawyCripple@lemmy.world · 2 days ago:
ohhhh so that’s The model for neural networks, not A model for neural networks
Tamo240@programming.dev · 2 days ago:
It’s an abstraction for neural networks. Different individual networks might vary in the number of layers (columns), nodes (circles), or connection weights (lines), but the concept is consistent across all. See the sketch below for how those map onto code.
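To make the column/circle/line mapping concrete, here is a minimal sketch of such a dense network. It assumes PyTorch (the thread doesn't name a framework), and the layer sizes are made up purely for illustration.

```python
# Minimal dense network sketch, assuming PyTorch; sizes are illustrative only.
# Mapping to the diagram: columns -> layers, circles -> nodes, lines -> connection weights.
import torch
import torch.nn as nn

mlp = nn.Sequential(
    nn.Linear(4, 8),   # input column (4 circles) fully connected to a hidden column (8 circles)
    nn.ReLU(),
    nn.Linear(8, 3),   # hidden column connected to the output column (3 circles)
)

# Each Linear layer stores the "lines" as a weight matrix:
# one entry per connection between a circle in one column and a circle in the next.
print(mlp[0].weight.shape)  # torch.Size([8, 4])
```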
NotANumber@lemmy.dbzer0.com · 2 days ago (edited):
Kinda, but also no. That’s specifically a dense neural network, an MLP. It gets a lot more complicated than that in some cases.
NotANumber@lemmy.dbzer0.com · 2 days ago:
It’s only one type of neural network: a dense MLP. You have sparse neural networks, recurrent neural networks, convolutional neural networks, and more!
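For contrast with the dense MLP above, here is a hedged sketch of two of the other families mentioned (convolutional and recurrent), again assuming PyTorch; the layer choices and sizes are illustrative, not from the thread.

```python
# Illustrative sketch of non-dense architectures, assuming PyTorch; sizes are arbitrary.
import torch.nn as nn

conv_net = nn.Sequential(
    # Convolutional: the "lines" are small shared filters slid over the input,
    # not a full mesh of connections between every pair of nodes.
    nn.Conv2d(1, 16, kernel_size=3),
    nn.ReLU(),
)

# Recurrent: the same weights are reused at every time step of a sequence.
rnn = nn.RNN(input_size=10, hidden_size=32)
```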