Introduction
- Basic neural network diagram
- multiply inputs by weights, add a bias, activate, repeat
- many discrete blocks
- stacked layers of simple computation nodes make up a neural network and work together to approximate a function
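The stacked-layer idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not a trained network: the layer sizes and random weights are arbitrary assumptions chosen only to show the multiply–bias–activate–repeat pattern.

```python
import numpy as np

def layer(x, W, b):
    # one discrete block: multiply by weights, add a bias, activate
    return np.tanh(W @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                      # inputs
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

h = layer(x, W1, b1)                        # first block
y = layer(h, W2, b2)                        # repeat: second block
```

Each call to `layer` is one of the discrete blocks in the diagram; stacking more calls deepens the approximation.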
- The basic concept of a Neural ODE (NODE)
- From discrete layers to continuous functions
- one continuous block
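The jump from discrete layers to one continuous block can be made concrete with a toy dynamics function. This is a sketch under assumed toy weights: a residual update `x + f(x)` is exactly one Euler step of the ODE `dx/dt = f(x)` with step size 1, and shrinking the step turns the stack of blocks into a single continuous block.

```python
import numpy as np

rng = np.random.default_rng(1)
W, b = 0.1 * rng.normal(size=(3, 3)), 0.1 * rng.normal(size=3)
x0 = rng.normal(size=3)

def f(x):
    # the dynamics the continuous block follows: dx/dt = f(x)
    return np.tanh(W @ x + b)

# discrete view: one residual block = one Euler step with step size 1
resnet_out = x0 + f(x0)

# continuous view: many small Euler steps integrate the same dynamics
def euler_integrate(x, t_end=1.0, n_steps=1000):
    h = t_end / n_steps
    for _ in range(n_steps):
        x = x + h * f(x)
    return x

ode_out = euler_integrate(x0)
```

With `n_steps=1` the integrator reproduces the residual block exactly; with many steps it approximates the continuous-time solution.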
- Why do we need continuousness?
  - Faster inference than recurrent networks, but slower training; well suited to low-power edge computing (a precision-vs-speed trade-off)
  - More accurate results for time-series prediction, i.e. continuous-time models
  - Opens up a whole new realm of mathematics for optimizing neural networks (differential-equation solvers, 100+ years of theory)
  - Compute gradients with constant memory cost (the adjoint sensitivity method)
References