Echocardiography is a widely used, affordable, and high-throughput test to evaluate heart conditions. Two cardiac phases, known as end-systolic (ES) and end-diastolic (ED), are observed by cardiologists and technicians during echocardiography. These phases underpin critical calculations in echocardiography, such as measuring heart chamber size and computing ejection fraction. Typically, ED and ES phase detection is performed manually by a technician or cardiologist, which makes echocardiography interpretation time-consuming and error-prone. Therefore, it is crucial to develop an automated and efficient technique to detect the cardiac phases and minimize diagnostic errors. In this paper, we propose a deep learning model, namely DeepPhase, for accurate and automated interpretation of the ES and ED phases to assist cardiology personnel. Our proposed convolutional neural network (CNN) is designed to learn from echocardiography images and identify the cardiac phase from a given image without segmentation of the left ventricle or the use of electrocardiograms. We have performed an extensive evaluation on two real-world echocardiography image datasets: the benchmark Cardiac Acquisitions for Multi-structure Ultrasound Segmentation (CAMUS) dataset and a new dataset (referred to as CardiacPhase) that we collected from a cardiac hospital. The proposed model outperformed relevant state-of-the-art techniques, achieving area under the curve (AUC) values of 0.98 and 0.92 on the CAMUS and CardiacPhase datasets, respectively. We further demonstrate the generalizability of the proposed model by training and testing on different cardiac views.