In the video above, I show how the Amazon Fire TV 1, Fire TV 2, and Fire TV Stick 1 each handle playing video encoded with the h.264 and h.265 codecs. I run all three devices through several test videos at various bit rates. The purpose is to show that, as long as the video codec being used is supported by the device’s dedicated hardware decoder, the CPU of the device is nearly irrelevant when it comes to playing high quality video. This is why, even though the Fire TV Stick has a fairly weak CPU, it can still play video as well as the Fire TV boxes. Continue on if you’d like to read the transcript of the video.
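If you’re curious how an app can tell whether a device has a hardware decoder for a particular codec, here’s a minimal sketch using Android’s MediaCodecList API (Fire OS is Android-based). This is only an illustration of the general approach, not how Kodi detects decoders; the "OMX.google." name check is a common heuristic on older Android builds that don’t expose isHardwareAccelerated().

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch: list the decoders a Fire OS (Android 5.0+) device reports for a
// given MIME type, e.g. "video/avc" (h.264) or "video/hevc" (h.265).
public class DecoderCheck {
    static void listDecoders(String mimeType) {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.ALL_CODECS);
        for (MediaCodecInfo info : codecList.getCodecInfos()) {
            if (info.isEncoder()) continue;              // we only care about decoders
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase(mimeType)) {
                    // Heuristic: "OMX.google." decoders are software implementations,
                    // so they fall back to the CPU instead of the dedicated hardware.
                    boolean likelySoftware = info.getName().startsWith("OMX.google.");
                    System.out.println(mimeType + " -> " + info.getName()
                            + (likelySoftware ? " (software)" : " (likely hardware)"));
                }
            }
        }
    }
}
```

On a device like the 1st-gen Fire TV or Fire TV Stick, a query for "video/hevc" would only turn up a software decoder (if any), which is exactly the situation demonstrated in the video.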
Video Transcript
This is a quick video comparing the 1st-gen Fire TV Stick, 1st-gen Fire TV, and 2nd-gen Fire TV, and how each of them handles h.264-encoded video versus h.265-encoded video. I’m often asked if the Fire TV Stick is a good device for video playback, since it has a much weaker CPU than the Fire TV boxes, so I put together this video to show that, as long as you’re playing video using the right codec, the CPU is almost irrelevant for video playback, because all of these devices have dedicated hardware decoders.
On the left of the screen is the 1st-gen Fire TV Stick, in the middle is the 1st-gen Fire TV, and on the right is the 2nd-gen Fire TV. All videos are being played using the latest stable version of Kodi on stock devices running the latest Fire OS software version. At the top I’ve indicated the video codec being used, the bit rate of the video, and I’ve enabled Kodi’s codec overlay so you can see each device’s CPU usage and player bit rate.
What you’re watching right now is a 40 Mbps h.264 video. This is around the quality you can expect from a Blu-ray video and, as you can see, all three devices are playing the file smoothly without overwhelming the CPU. That’s because all three devices have dedicated hardware designed to decode h.264 video.
If we bump up the bit rate to 60 Mbps, you can see the playback is still smooth and the CPU usage stays pretty much unchanged. At this bit rate we’re already way above the quality of video streamed by services like Netflix and Amazon, which both stream 1080p at less than 10 Mbps. Notice that the bit rate reported by the player is right around 60 Mbps, which is where we expect it to be.
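To put those bit rates in perspective, here’s the rough storage math behind them. This is a back-of-the-envelope sketch that assumes a constant bit rate and ignores audio and container overhead:

```java
// Rough file-size math for the bit rates used in the test videos:
// gigabytes per hour = (megabits per second * 3600 seconds) / 8 bits-per-byte / 1000
public class BitrateMath {
    static double gbPerHour(double bitrateMbps) {
        return bitrateMbps * 3600 / 8 / 1000;
    }

    public static void main(String[] args) {
        System.out.printf("10 Mbps streaming-quality 1080p: %.1f GB/hour%n", gbPerHour(10)); // ~4.5
        System.out.printf("40 Mbps Blu-ray-class video:     %.1f GB/hour%n", gbPerHour(40)); // ~18.0
        System.out.printf("60 Mbps test file:               %.1f GB/hour%n", gbPerHour(60)); // ~27.0
    }
}
```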
Moving up to the extreme case of a 100 Mbps video, which nobody would realistically use, we start seeing the 1st-gen Fire TV Stick and Fire TV struggle. Notice that even though the devices can’t keep up with the video, their CPU usage hasn’t changed much. You might expect to see 100% CPU usage, but the dedicated video decoder is still the one handling the daunting task of decoding the video, so the CPU doesn’t have much to do. The bit rate reported by the player on the 1st-gen Fire TV Stick and Fire TV never reaches 100 Mbps, which indicates their hardware decoders simply can’t decode this much data fast enough. The player on the 2nd-gen Fire TV, however, does report the full bit rate, which is why its playback is still smooth even at a bit rate this high.
Now we’ll move to a video encoded with the h.265 codec and start things off at a measly 3 Mbps. Immediately you can see the 1st-gen Fire TV Stick and Fire TV are struggling. That’s because neither device has a hardware decoder capable of decoding h.265 video. The CPU on both devices is pretty much maxed out because it has to take on the task of decoding the video itself, since the dedicated hardware decoder does not support h.265. The 1st-gen Fire TV, with its more powerful 1.6 GHz quad-core CPU, is able to almost keep up with the 2nd-gen Fire TV’s dedicated h.265 hardware decoder, resulting in smoother playback than on the much weaker 1 GHz dual-core Fire TV Stick. Neither device was made for h.265-encoded video, so even though this is just a 3 Mbps video, the result is a relatively bad viewing experience.
The 2nd-gen Fire TV, on the other hand, does have an h.265-capable hardware decoder. That’s because it’s a 4K device, and h.265 is the preferred codec for 4K video streams from Netflix and Amazon Video. Just like with the h.264 videos from before, the 2nd-gen Fire TV’s CPU isn’t doing much work because all the heavy lifting is being done by the dedicated hardware decoder.
Moving up to a 15 Mbps h.265 video, which is around the quality used by Netflix and Amazon for 4K streaming, we can see the 2nd-gen Fire TV handles this without any issues because it was designed with dedicated hardware for this exact codec and bit rate. With the previous 3 Mbps h.265 video, the 1st-gen Fire TV was almost watchable on its CPU alone, but at 15 Mbps its CPU just can’t keep up anymore, and the Fire TV Stick has no hope of processing this much data using its CPU alone.
Lastly, we’ll bump up the bit rate to 100 Mbps, which, again, nobody would realistically use. The 1st-gen Fire TV Stick and Fire TV have no hope of processing this amount of data using their CPUs alone, so they’re practically displaying a slideshow of static images at this point. The 2nd-gen Fire TV handles this 100 Mbps h.265 video without breaking a sweat, thanks to its h.265-capable hardware decoder.
For comparison, let’s switch back to the h.264 video at the same 100 Mbps bit rate, as a reminder of how well even the relatively weak 1st-gen Fire TV Stick handled this extreme bit rate when the video uses a codec its hardware was designed for.
I hope this video was able to demonstrate that, when it comes to playing high quality video, the CPU of a device is not nearly as important as the capabilities of its dedicated hardware decoder. If you enjoyed this video, be sure to hit the like button on YouTube to let me know you’d like to see more videos like this one, and while you’re there, subscribe to my YouTube channel. I host a weekly series called the AFTVnewscast where I discuss the Fire TV and topics like this. And of course, keep it locked to AFTVnews.com for all things Fire TV.