Wisecrack said: Turns out you can treat a function mapping parameters to outputs as a product that acts as a *scaling* of continuous inputs to outputs, and that this sits somewhere between neural nets and regression trees.

Well, that's what I did, and the MAE (mean absolute error) of this works out to about 0.5%, half a percentage point. I did training and a little validation, but the training set is only 2.5k samples, so it may just be overfitting.
The idea is you have X, y, and z.

z is your parameters: for every row in y, you have an entry in z. You then try to find a set of z such that zi, multiplied by the value of yi, yields the corresponding value at Xi.
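A rough sketch of how that setup reads, in numpy. The per-row closed-form fit (z as the ratio X/y) is my assumption for illustration, not necessarily the original method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: y holds the inputs, X the targets, and z has one scaling
# parameter per row, so that z[i] * y[i] ≈ X[i].
y = rng.uniform(1.0, 2.0, size=2500)
true_z = rng.uniform(0.5, 1.5, size=2500)
X = true_z * y

# With one free parameter per row, the least-squares fit is just the
# per-row ratio, which recovers true_z exactly here:
z = X / y

# MAE of the reconstruction (zero up to float rounding on this toy data):
mae = np.mean(np.abs(z * y - X))
```

With one z per row the toy fit is trivially exact; the interesting (and overfitting-prone) part is whether z generalizes to held-out rows, which is where the 2.5k-sample caveat above bites.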
Naturally I gave it the ridiculous name of a 'zcombiner'.
Well, fucking turns out, this beautiful bastard of a paper just dropped in my lap, and it's been around since 2020:
https://mimuw.edu.pl/~bojan/papers/...
which does the exact goddamn thing.

I mean, they didn't realize it applies to ML, but it's the same fucking math I did.
z is the monoid that finds some identity creating an isomorphism between the elements of the rows of y and the elements at the corresponding indexes of X.
And I've just got to say, it feels good.

If you're using random in Python and need arbitrary precision, use mpmath.
If you're using it with the decimal module, just so you know: it doesn't automatically convert.
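For example, going through a string works (a sketch: mpmath's documented `rand()` stands in here for the `arb_uniform` mentioned below):

```python
from decimal import Decimal, getcontext
from mpmath import mp, rand

mp.dps = 50             # ask mpmath for 50 significant digits
getcontext().prec = 50  # match precision on the Decimal side

x = rand()              # arbitrary-precision uniform value in [0, 1)

# Decimal() won't accept an mpf directly; round-trip through str():
d = Decimal(str(x))
```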
Instead, convert the output of arb_uniform to a string before passing it to the decimal module.

Worth every second it took to read:
https://gwern.net/scaling-hypothesi...

I discussed using PageRank for ML a while back here - https://devrant.com/rants/11237909/...
I talk about something vaguely similar in "scoring the matches" here though - https://pastebin.com/YwjCMvRp
Incidentally, the machine learning community finally caught up and did something similar with RAG:
https://news.ycombinator.com/item/...