Fiber-Net ramble

I've had this idea for a while, but haven't built one. Essentially, the common version of the universal approximation theorem states that we can approximate any continuous function arbitrarily well, given unlimited linear operations and a nonlinearity of our choice.
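
For concreteness, here is the single-hidden-layer form of the theorem (Cybenko/Hornik style, stated informally): for a fixed non-polynomial activation, finite sums of shifted and scaled activations can get uniformly close to any continuous target on a compact set.

```latex
% Informal statement: for any continuous f on a compact K \subset \mathbb{R}^n
% and any \varepsilon > 0, there exist N, weights w_i, biases b_i and
% coefficients \alpha_i such that
\[
  \sup_{x \in K} \Bigl|\, f(x) - \sum_{i=1}^{N} \alpha_i \,
    \sigma\!\left(w_i^{\top} x + b_i\right) \Bigr| < \varepsilon ,
\]
% where \sigma is a fixed non-polynomial (e.g. sigmoidal) activation.
```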

Signals in optical fiber can be filtered, meaning their power is attenuated by an arbitrary percentage. They can also be split and merged, and nonlinear phenomena like the Kerr effect exist in practice (https://www.nature.com/articles/s41467-023-41377-5). My suspicion is that this is enough to approximate any function arbitrarily well up to a scalar constant, and that it can be proven by taking the standard neural-network construction and scaling each layer's output down by a constant factor, so that every coefficient has magnitude at most one and no component of the signal ever needs to be amplified on its way to the output.
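
As a sanity check of that scaling argument, here is a minimal numpy sketch. It is not fiber physics, just the linear-algebra claim, and it assumes a ReLU activation, which is positively homogeneous so rescaling a layer commutes with the nonlinearity; the Kerr effect is not, and the sketch also ignores how negative weights would be realized in fiber.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Reference ReLU MLP with arbitrary weights (some with magnitude > 1).
Ws = [rng.normal(size=(8, 4)), rng.normal(size=(8, 8)), rng.normal(size=(1, 8))]

def forward(weights, x):
    for W in weights[:-1]:
        x = relu(W @ x)
    return weights[-1] @ x

# "Attenuation-only" version: divide each layer by its largest |weight|,
# so every coefficient has magnitude <= 1 (each linear op only attenuates).
scales = [np.max(np.abs(W)) for W in Ws]
Ws_atten = [W / s for W, s in zip(Ws, scales)]

x = rng.normal(size=4)
y_ref = forward(Ws, x)
y_atten = forward(Ws_atten, x)

# ReLU is positively homogeneous, so the attenuated network computes the same
# function as the reference, divided by the product of the per-layer scales.
assert np.allclose(y_ref, y_atten * np.prod(scales))
```

The attenuated network's output is the original output divided by a known constant, which is exactly the "up to a scalar constant" caveat above.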

What you would get in practice is a neural network that evaluates essentially instantly, with almost unlimited throughput, at negligible marginal cost. Particularly for CNNs and other image-ingestion pipelines, I can see this working very well.

