This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that introducing $k$-many skip connections into network architectures, such as residual networks and additive dense networks, defines $k$th order dynamical equations on the layer-wise transformations. Closed-form solutions are found for the state-space representations of general $k$th order additive dense networks, in which the concatenation operation is replaced by addition, and of $k$th order smooth networks. This formulation endows deep neural networks with an algebraic structure. Furthermore, it is shown that imposing $k$th order smoothness on a network architecture with $d$-many nodes per layer increases the state-space dimension by a multiple of $k$, so the effective embedding dimension of the data manifold by the neural network is $k \cdot d$-many dimensions. It follows that network architectures of these types reduce the number of parameters needed to maintain the same embedding dimension by a factor of $k^2$ when compared to an equivalent first-order residual network. Numerical simulations and experiments on CIFAR10, SVHN, and MNIST are presented to illustrate the developed theory and the efficacy of the proposed concepts.
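As a minimal illustrative sketch of the difference-equation view (not the letter's exact construction), the residual update can be read as a first-order difference equation, and skip connections reaching back $k$ layers as a $k$th order one; the forward difference operator $\Delta$, the layer map $f_t$, the stacked state $y_t$, and the maps $A$, $F$ below are notation assumed for this sketch:
\begin{align*}
  &\text{Residual (first order):} && x_{t+1} = x_t + f_t(x_t) \;\Longleftrightarrow\; \Delta x_t = f_t(x_t),\\
  &\text{$k$th order smooth (e.g.\ $k=2$):} && \Delta^k x_t = f_t(x_t), \qquad x_{t+2} - 2x_{t+1} + x_t = f_t(x_t),\\
  &\text{State space:} && y_t = \bigl(x_t,\, \Delta x_t,\, \dots,\, \Delta^{k-1} x_t\bigr) \in \mathbb{R}^{k \cdot d}, \quad y_{t+1} = A\,y_t + F(y_t).
\end{align*}
Under these assumptions, the stacked state $y_t$ makes the embedding claim concrete: each of the $d$-many nodes contributes $k$-many difference coordinates to the state, so the $k$th order network carries a $k \cdot d$-dimensional state at the parameter cost of a single layer map $f_t$.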