Internet of Things (IoT) applications capture massive amounts of raw data from sensors and actuators and frequently transmit this data to cloud data centers for processing and analysis. However, variable and unpredictable data generation rates and network latency mean that sending data to a cloud data center can create a performance bottleneck. With the emergence of Fog and Edge computing, microservices hosted at the network's edge can instead process data close to where it is generated. Detecting and tracking objects in images, videos, and live streams are two of the fastest-growing computer vision applications, and they are increasingly deployed at the edge. You Only Look Once (YOLO) models are highly optimized deep learning methods for object detection. This paper analyzes the CPU usage of four YOLO models on an edge device, an Nvidia Jetson Nano, under two power budgets, 5 W and 10 W. Results show that the average CPU usage of the four YOLO models is lower in the 10 W power mode than in the 5 W power mode, except for YOLOv4-tiny. Furthermore, the number of frames per second processed by the four models remains roughly the same when switching from the 10 W to the 5 W power mode.
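The abstract does not detail how CPU usage and frames per second were measured; on a Jetson Nano this is typically done with tools such as tegrastats. As a minimal, stdlib-only illustration of the underlying idea, the sketch below derives process CPU utilization from CPU time versus wall-clock time while running a workload loop; the `fake_inference` function is a hypothetical stand-in for a single YOLO inference pass, not the paper's actual pipeline.

```python
import time


def measure_cpu_and_fps(workload, duration=0.5):
    """Run `workload` in a loop for `duration` seconds and report
    (cpu_percent, fps).

    CPU utilization is approximated as process CPU time (user + system)
    divided by elapsed wall-clock time. This is a simplified stand-in
    for sampling tools like tegrastats; it only sees this process.
    """
    t0 = time.monotonic()
    c0 = time.process_time()
    frames = 0
    while time.monotonic() - t0 < duration:
        workload()  # stand-in for one inference pass on one frame
        frames += 1
    elapsed = time.monotonic() - t0
    cpu_percent = 100.0 * (time.process_time() - c0) / elapsed
    return cpu_percent, frames / elapsed


def fake_inference():
    # Hypothetical CPU-bound placeholder for a YOLO forward pass.
    sum(i * i for i in range(10_000))


cpu_percent, fps = measure_cpu_and_fps(fake_inference)
```

In a real experiment, the workload would be the model's inference call, and the loop would run long enough to average out scheduler noise; comparing the two Jetson power modes then amounts to repeating the measurement after switching modes (e.g. via `nvpmodel`).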