Well, of course not. If you could fit any neural network to any function quickly, you would have superpowers. But in practice, local optima do seem to stop being a problem once networks get big.
This article also shows a method for constructing a lookup table from a neural network in linear time, so in the worst case you can simply memorize the input-to-output table quickly. In the best case, given unlimited time to search the parameter space, you can fit a simple, elegant model that matches the data perfectly with very few parameters. Real-world NNs fall somewhere between these two extremes.
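To make the worst case concrete, here's a sketch (not the article's exact construction) of the standard trick for scalar inputs: give each data point a triangular "hat" bump built from three ReLUs, so a one-hidden-layer network exactly memorizes the table using a number of parameters linear in the data size. All names here are illustrative.

```python
import numpy as np

def memorizing_net(xs, ys):
    """Build a one-hidden-layer ReLU net that exactly reproduces the
    lookup table {x_i -> y_i} for distinct scalar inputs, using three
    hidden units per data point -- linear-time construction."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    # half-width small enough that neighboring bumps don't overlap
    d = np.min(np.diff(xs)) / 2.0 if len(xs) > 1 else 1.0
    # hidden-layer kink points: for each x_i, ReLUs at x_i - d, x_i, x_i + d
    b = np.concatenate([xs - d, xs, xs + d])
    # output weights so that each hat
    #   hat_i(x) = (ReLU(x-(x_i-d)) - 2*ReLU(x-x_i) + ReLU(x-(x_i+d))) / d
    # equals 1 at x_i and 0 at every other table entry
    w = np.concatenate([ys, -2.0 * ys, ys]) / d

    def net(x):
        h = np.maximum(0.0, np.asarray(x, float)[..., None] - b)  # hidden ReLUs
        return h @ w
    return net

table_x = [0.0, 1.0, 2.5, 4.0]
table_y = [3.0, -1.0, 0.5, 2.0]
f = memorizing_net(table_x, table_y)
print(np.allclose(f(np.array(table_x)), table_y))  # True
```

Between table entries the net just interpolates linearly, which is exactly the "memorized lookup table" end of the spectrum: zero training error, no compression, parameter count proportional to the data.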