Maybe, just maybe, you've heard the term '4K' kicked around recently, or over the past year when reading the latest tech news.
You've perhaps wondered what it is. Or maybe you already know. But if, like me, you've heard the term, or know what it means, you may also wonder why you don't see more of it. Why it isn't more mainstream. Why it isn't an industry standard.
Think back a decade, for a moment, to the very first time you heard 'HD', 'High Definition', '1080p' or 'Blu-ray'. One minute you'd heard the term; the next, it was everywhere.
Yes, Blu-ray, HD, 1080p, or however you wish to refer to it, was pricey at first, but it very quickly took over from DVD as the industry gold standard. Before long it was everywhere: Blu-ray discs sat in pride of place where DVDs once sat making fun of VHS tapes, Blu-ray players filled every tech shop's window, and David Attenborough box sets suddenly became even more awesome. (That, by the way, is how you test the quality of video playback: you stick a David Attenborough nature show on and judge accordingly.)
The industry and consumers alike adopted HD quickly and it is now on our living room big-screens, our media players, our PCs, our gaming consoles and our mobile phones. So here's a question: why is 4K not doing the same?
The answer is actually quite simple; it's a little too powerful for us just yet.
Allow me to explain. First of all, the kind of equipment required to shoot something in 4K resolution is expensive. Very expensive. Bear in mind that 4K has essentially four times the pixel count of 1080p. There are plenty of cheaper devices and high-end mobile phones that will "shoot 4K", that is, record at a resolution of 3840 × 2160 pixels, but these will generally shoot at low frame rates (as low as 15 frames per second), rendering the output video jumpy and barely worthwhile. For something truly capable of filming in 4K at an acceptable frame rate, you're looking at handing over thousands, if not tens of thousands of pounds.
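The "four times the pixels" claim is easy to verify with a little arithmetic, using the standard UHD and Full HD frame sizes:

```python
# Sanity-check the "four times the pixels" claim using the standard
# UHD (3840 x 2160) and Full HD (1920 x 1080) frame sizes.
uhd_pixels = 3840 * 2160      # pixels per 4K UHD frame
full_hd_pixels = 1920 * 1080  # pixels per 1080p frame

ratio = uhd_pixels / full_hd_pixels
print(uhd_pixels, full_hd_pixels, ratio)  # 8294400 2073600 4.0
```

Each dimension doubles, so the pixel count quadruples: every single 4K frame carries four 1080p frames' worth of picture data, which is why both filming and playing it back demand so much more horsepower.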
Secondly, very little hardware out there so far has enough processing power to play back 4K video at that same acceptable frame rate.
Thirdly, standard HDMI cables (remember cables? Those long, thin things we used to use to transmit data before we opted to use air instead) don't have the bandwidth to carry 4K video from most devices; there is simply too much data to fit through the cable without it bottlenecking.
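To see why the cable becomes the bottleneck, here's a rough back-of-the-envelope sketch. It assumes uncompressed 24-bit colour and ignores blanking intervals and encoding overhead, so if anything it understates the real requirement; the 10.2 Gbit/s figure is the commonly quoted maximum throughput of HDMI 1.4, the standard cable of the time:

```python
# Back-of-the-envelope comparison of raw (uncompressed) video bit rate
# against HDMI cable capacity. Assumes 24 bits per pixel and ignores
# blanking intervals and encoding overhead, so real figures are higher.
def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

hd_60 = raw_bitrate_gbps(1920, 1080, 60)   # roughly 3 Gbit/s
uhd_60 = raw_bitrate_gbps(3840, 2160, 60)  # roughly 12 Gbit/s

HDMI_1_4_GBPS = 10.2  # commonly quoted maximum throughput of HDMI 1.4
print(uhd_60 > HDMI_1_4_GBPS)  # True: 4K at 60fps won't fit
print(hd_60 > HDMI_1_4_GBPS)   # False: 1080p at 60fps fits comfortably
```

So while 1080p sails through with room to spare, full-rate 4K simply doesn't fit down the older cable, which is why early 4K-over-HDMI setups were limited to lower frame rates.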
The fact of the matter is, while there are plenty of 4K-ready devices currently available, and many people who own them already, very, very few people will have ever actually played back anything at 4K, or indeed anything higher than 1080p, on them. At this moment in time, 4K simply requires too much processing power to film, to transmit and to play back at an affordable price.
It almost goes without saying that 4K will ultimately become our standard video-watching resolution, and having seen it for myself (David Attenborough, naturally), I can tell you it truly is a thing of beauty: something that, once seen, will make 1080p HD look like an old VHS tape. It almost makes real life look low-resolution.
But for now, it remains just out of reach for most of us. I do believe, however, that this will be the year it either makes its name among consumers, or goes and sits on the eternal technology naughty spot along with 3D televisions.