In this paper, we develop a finite-state model of the downlink wireless fading channel based on the configurations and system parameters specified by 3GPP LTE. By partitioning the range of the received signal-to-noise ratio (SNR) into a finite number of intervals, a finite-state Markov channel model can be constructed for the Rayleigh fading channel. Each state corresponds to a different channel quality, indicated by a particular modulation scheme. In LTE, the eNode B can transmit downlink frames with different modulation schemes (QPSK, 16QAM, 64QAM). The model provides realistic physical-layer input for evaluating the performance of upper-layer algorithms; for example, the MAC layer can use it to assess scheduling, admission control, and power control schemes. Computer simulations are performed to verify the accuracy of the proposed model.
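To illustrate the SNR-partitioning construction described above, the following is a minimal sketch in the style of the classic level-crossing-rate approximation for Rayleigh fading (Wang-Moayeri). The function name, thresholds, average SNR, Doppler frequency, and frame duration below are illustrative assumptions, not values taken from the paper or from the LTE specification:

```python
import math

def fsmc_rayleigh(thresholds, avg_snr, f_d, T):
    """Sketch of a finite-state Markov channel for Rayleigh fading,
    built by partitioning the received SNR range at `thresholds`.

    thresholds: increasing SNR boundaries [G_1, ..., G_{K-1}] (linear scale);
                state k covers [G_k, G_{k+1}) with G_0 = 0 and G_K = infinity.
    avg_snr:    average received SNR (linear scale) -- illustrative value.
    f_d:        maximum Doppler frequency in Hz.
    T:          frame (transmission) duration in seconds.
    Returns (pi, P): steady-state probabilities and transition matrix.
    """
    bounds = [0.0] + list(thresholds) + [math.inf]
    K = len(bounds) - 1

    # Steady-state probability of state k: integral of the exponential SNR
    # pdf p(g) = (1/avg_snr) * exp(-g/avg_snr) over the state's interval.
    pi = [
        math.exp(-bounds[k] / avg_snr)
        - (0.0 if math.isinf(bounds[k + 1]) else math.exp(-bounds[k + 1] / avg_snr))
        for k in range(K)
    ]

    # Level-crossing rate at threshold G for a Rayleigh fading envelope.
    def lcr(G):
        return math.sqrt(2.0 * math.pi * G / avg_snr) * f_d * math.exp(-G / avg_snr)

    # Slow-fading assumption: at most one state transition per frame, so
    # only adjacent-state transitions have nonzero probability.
    P = [[0.0] * K for _ in range(K)]
    for k in range(K):
        up = lcr(bounds[k + 1]) * T / pi[k] if k < K - 1 else 0.0
        down = lcr(bounds[k]) * T / pi[k] if k > 0 else 0.0
        if k < K - 1:
            P[k][k + 1] = up
        if k > 0:
            P[k][k - 1] = down
        P[k][k] = 1.0 - up - down
    return pi, P
```

In such a model, each state would be mapped to the modulation scheme (QPSK, 16QAM, or 64QAM) that its SNR interval supports; the thresholds themselves would be chosen from the link-level performance of those schemes.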