This time I tried a few variations. The first one produced output like this:

[one](https://pic8.co/sh/U3v03H.png)
[two](https://pic8.co/sh/4Ioi74.png)
[three](https://pic8.co/sh/Yf7zPe.png)
[four](https://pic8.co/sh/y02duQ.png)

The second one, identical except for different random initial weights, produced this:

https://pic8.co/sh/uDQ04b.png https://pic8.co/sh/Y49JZH.png

For the third one I set the initial weights to zero. It fit the data less well, but it is smoother.

https://pic8.co/sh/GEnNAX.png https://pic8.co/sh/T8Drjq.png
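
To pin down what I mean by the two init styles, here's a minimal numpy sketch; the 2-2-1 layout and the bias columns are illustrative guesses, not necessarily the exact net behind these plots:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_weights(mode, n_in=2, n_hidden=2):
    """Two ways to start a tiny MLP (layer sizes are a guess)."""
    if mode == "random":
        # Small random weights: every run starts somewhere different.
        W1 = rng.normal(0.0, 0.5, (n_hidden, n_in + 1))  # +1 bias column
        W2 = rng.normal(0.0, 0.5, (1, n_hidden + 1))
    else:
        # All zeros: removes init variance, but every hidden unit then
        # receives identical gradients and they stay copies of each other.
        W1 = np.zeros((n_hidden, n_in + 1))
        W2 = np.zeros((1, n_hidden + 1))
    return W1, W2
```

That symmetry might be part of why the zero-init run comes out smoother but fits worse: the hidden units can never differentiate, so the net effectively has fewer distinct units.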

If you want to graph it, I have some data:

[one](https://jssocial.pw/ppkey/fget/x0x7/upload/andplot1.json)
[two](https://jssocial.pw/ppkey/fget/x0x7/upload/andplot2.json)
[three](https://jssocial.pw/ppkey/fget/x0x7/upload/andplot3.json)

I'm getting a large amount of variance in how well it fits depending on the initial weights and/or the way the training data is shuffled. But clearly starting the weights at zero to remove that variance isn't the answer, because the zero-init net had 10x the error on its second attempt and 100x on the first. The one you see there is the second run. I guess that means the training order is a source of variance too.
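
One way to tease those two sources apart is to seed them independently and vary one at a time. Here's a self-contained toy version of that experiment (the tiny sigmoid net, learning rate, and epoch count are all assumptions, not the actual code behind the plots):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The four AND examples the early plots are fit to.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

def run(init_seed, shuffle_seed, lr=0.5, epochs=5000, n_hidden=2):
    """Per-example SGD on a tiny sigmoid net, with the two randomness
    sources (init and training order) seeded separately."""
    init_rng = np.random.default_rng(init_seed)
    shuffle_rng = np.random.default_rng(shuffle_seed)
    W1 = init_rng.normal(0.0, 0.5, (n_hidden, 3))    # hidden weights (+bias)
    W2 = init_rng.normal(0.0, 0.5, (n_hidden + 1,))  # output weights (+bias)
    for _ in range(epochs):
        for i in shuffle_rng.permutation(len(X)):    # the training order
            x1 = np.append(X[i], 1.0)
            h = np.append(sigmoid(W1 @ x1), 1.0)
            out = sigmoid(W2 @ h)
            d_out = (out - y[i]) * out * (1.0 - out)        # squared error
            d_h = d_out * W2[:-1] * h[:-1] * (1.0 - h[:-1])
            W2 -= lr * d_out * h
            W1 -= lr * np.outer(d_h, x1)
    outs = [sigmoid(W2 @ np.append(sigmoid(W1 @ np.append(x, 1.0)), 1.0))
            for x in X]
    return float(np.mean((np.array(outs) - y) ** 2))

# Hold one seed fixed, vary the other, and compare the spread of errors:
print([round(run(init_seed=s, shuffle_seed=0), 5) for s in range(3)])  # init varies
print([round(run(init_seed=0, shuffle_seed=s), 5) for s in range(3)])  # order varies
```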

**Edit:** I discovered a horrible error.

[This is AND](https://pic8.co/sh/AJEcQT.png)
[data](https://jssocial.pw/ppkey/fget/x0x7/upload/gand1plot.json)

[This is XOR](https://pic8.co/sh/B7ajSu.png)
[data](https://jssocial.pw/ppkey/fget/x0x7/upload/gxor1plot.json)

[Playing with -1 and 1 as outputs](https://pic8.co/sh/i8x9wK.png)
[data](https://jssocial.pw/ppkey/fget/x0x7/upload/gxornpoutplot.json)

Some really curious stuff: playing with a per-weight learning rate based on whether a weight's delta oscillates or keeps moving in the same direction from one training step to the next (sketched in code at the end of the post).

[Xor when only reducing per weight learning rate](https://pic8.co/sh/qzDi1W.png)
[Xor when per weight learning rate can expand](https://pic8.co/sh/knVFjT.png)

The last one is weird. The whole thing is smoother (meaning less overfitting, which makes sense, because doing this is similar to pruning), but then there is one dot by itself at (0, 0), which is the kind of thing you see with severe overfitting. Not good, but interesting.
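
For reference, here is the per-weight rate rule as I'd sketch it in code. It's in the same family as delta-bar-delta / Rprop; the shrink/grow factors and the bounds are made-up numbers:

```python
import numpy as np

def adapt_lr(lrs, grad, prev_grad, shrink=0.7, grow=1.05,
             allow_expand=True, lr_min=1e-4, lr_max=1.0):
    """One learning rate per weight, updated from the sign of consecutive
    gradients. Same sign -> the weight keeps marching in one direction, so
    optionally grow its rate; sign flip -> it's oscillating, so shrink it."""
    same_dir = grad * prev_grad > 0
    flipped = grad * prev_grad < 0
    if allow_expand:
        lrs = np.where(same_dir, lrs * grow, lrs)
    lrs = np.where(flipped, lrs * shrink, lrs)
    return np.clip(lrs, lr_min, lr_max)

# In the training loop, for a weight array W with gradient g:
#   lrs = adapt_lr(lrs, g, prev_g, allow_expand=False)  # the "only reducing" run
#   W -= lrs * g                                        # per-weight step sizes
#   prev_g = g.copy()
```

Passing `allow_expand=False` corresponds to the first plot above; letting it expand corresponds to the second.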




The last result you posted, *Xor when per weight learning rate can expand*, is my favorite. It has such a nice flow. I see a wave hitting the shallows and washing gently up on shore. Idk what it's trying to say at (0, 0, -1). Ha

I have a neural network installed on a machine but have not attempted anything yet. I should try it out though.

Very neat work! I'll def keep an eye on your posts. Thank you for sharing. Plus I'll send bitcoin tips when Xor knows!