In the past few decades, data-driven deep learning has been developed and successfully applied in many domains, including spectral signal analysis. Convolutional layers are regarded as efficient feature extractors, and the residual learning framework is widely used to train deep neural networks. In this paper, by replacing the identity blocks in a residual network with the proposed short-cut structures, we present a new residual learning framework coupled with multiscale convolutional layers, whose advantages are discussed from the perspectives of gradient descent during backpropagation and information theory. This study proves that the proposed short-cut structure passes the loss gradient directly from the last layer, mitigating the vanishing-gradient problem. The experiments also show that the multiscale design of our network extracts features from spectral signals more efficiently than traditional convolutional layers. The proposed residual framework was evaluated on four public datasets. Compared to the traditional residual network, the proposed method decreases RMSE by 36.9% on average, and R² reaches 0.9544 on average.
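To make the idea concrete, the following is a minimal sketch of a multiscale convolutional block with an identity shortcut for a 1-D spectral signal. All specifics here are assumptions for illustration (kernel widths of 3 and 5, averaging to fuse scales, zero "same" padding); they are not the authors' exact architecture, only the general pattern of parallel multiscale filters plus a short-cut addition that keeps the gradient path open.

```python
def conv1d(signal, kernel):
    """1-D convolution with zero 'same' padding (illustrative, not optimized)."""
    k = len(kernel)
    pad = k // 2
    x = [0.0] * pad + list(signal) + [0.0] * pad
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal))]

def multiscale_block(signal, kernels):
    """Apply several kernel scales in parallel, average the feature maps,
    then add the identity shortcut (element-wise) to the fused features."""
    feats = [conv1d(signal, k) for k in kernels]
    n = len(kernels)
    fused = [sum(vals) / n for vals in zip(*feats)]
    # short-cut: identity mapping added to the fused multiscale features
    return [f + s for f, s in zip(fused, signal)]

signal = [0.0, 1.0, 2.0, 1.0, 0.0]
kernels = [[1 / 3] * 3, [1 / 5] * 5]   # two scales: width-3 and width-5 smoothing
out = multiscale_block(signal, kernels)
```

Because the shortcut is a plain addition, its local derivative with respect to the input is the identity, which is the intuition behind the claim that the loss gradient from the last layer reaches earlier layers largely unattenuated.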