On Sunday, May 12, 2024, at 10:38 AM, Matt Mahoney wrote:
> All neural networks are trained by some variation of adjusting anything that 
> is adjustable in the direction that reduces error. The problem with KAN alone 
> is you have a lot fewer parameters to adjust, so you need a lot more neurons 
> to represent the same function space. That's even with 2 parameters per 
> neuron, threshold level and steepness. The human brain has another 7000 
> parameters per neuron in the synaptic weights.
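
(To make the "adjust everything adjustable in the direction that reduces error" point concrete, here is a toy sketch in Python, not anything from Matt's post: plain gradient descent on a single neuron with exactly the two parameters he mentions, a threshold and a steepness. The data, learning rate, and step count are made-up placeholders.)

import numpy as np

def neuron(x, threshold, steepness):
    # Sigmoid activation parameterized by a threshold (shift) and steepness (scale).
    return 1.0 / (1.0 + np.exp(-steepness * (x - threshold)))

# Toy target: output ~1 for x > 2, ~0 otherwise.
x = np.linspace(-5.0, 5.0, 200)
target = (x > 2.0).astype(float)

threshold, steepness = 0.0, 1.0   # arbitrary initial guesses
lr = 0.5                          # hypothetical learning rate

for step in range(2000):
    y = neuron(x, threshold, steepness)
    err = y - target
    dy = y * (1.0 - y)            # sigmoid derivative
    # Gradient of mean squared error w.r.t. each adjustable parameter.
    g_threshold = np.mean(err * dy * (-steepness))
    g_steepness = np.mean(err * dy * (x - threshold))
    # Move each parameter opposite its gradient, i.e. toward lower error.
    threshold -= lr * g_threshold
    steepness -= lr * g_steepness

print(threshold, steepness)       # threshold should end up near 2.0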

I bet that in some of these so-called "compressor" apps Matt always looks at, there is some serious NN structure tweaking going on. They're open source, right? Do people obfuscate the code when submitting?


Well, it's kind of obvious, but take transformations like this one:

(Universal Approximation Theorem) => (Kolmogorov-Arnold Representation Theorem)

There are going to be more of them.
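
(For reference, my own paraphrase of the standard statement, not anything from the thread: the Kolmogorov-Arnold representation says every continuous f : [0,1]^n -> R can be written as

    f(x_1, \ldots, x_n) = \sum_{q=0}^{2n} \Phi_q\Big( \sum_{p=1}^{n} \phi_{q,p}(x_p) \Big)

for continuous univariate functions \Phi_q and \phi_{q,p}, i.e. only sums and one-variable functions, which is the shape KANs generalize.)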

Automated or not, I'm sure researchers are on it.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T1af6c40307437a26-Md991f57050d37e51db0e68c5