No matter what else they build, consumer electronics companies seem to be prolific manufacturers of jargon. And if you’re shopping for anything from speakers to TVs to phones this year, you’re going to run into a lot of it. That’s why we’ve pulled together some of the most important tech terms for consumers to know in 2016. Some are new, while others have been kicking around for the past couple of years but are now making their way into more products—and into more ads, packaging, and sales pitches from store clerks.

While some of these terms apply to great technologies that you may really like, others may not live up to the hype—at least not yet. One thing that’s fairly predictable: If a product has a lot of jargon attached to it, it’s probably accompanied by a high price tag.

High Dynamic Range (HDR)

In both still photography and video, HDR refers to imaging technologies that try to bring out the details in scenes where there’s a wide range of bright and dark elements. Often, the optimal settings for presenting or capturing objects in bright light obscure what’s happening in the shadows, and vice-versa. HDR has been a feature on cameras (including smartphone cameras) for about five years, but now you’ll be seeing the term applied to quite a number of new televisions, as well.

In a still camera, HDR solves the exposure problem by quickly shooting several photos at various exposures and then forming a composite from the best parts of all of them. When it works well, every part of the image looks vivid and detailed—though in some cases the photos can appear a bit surreal or even fake. You can turn HDR off if you don’t like it, and some newer cameras from Sony and others allow you to adjust the intensity of the HDR effect.
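For readers curious about the mechanics, the bracketing-and-merging idea can be illustrated with a toy "exposure fusion" sketch: each pixel is weighted by how well-exposed it is (closest to mid-gray), and the weighted average forms the composite. This is a simplified demonstration, not the algorithm any particular camera actually uses.

```python
import numpy as np

def fuse_exposures(exposures, sigma=0.2):
    """Blend a stack of differently exposed images (pixel values 0..1),
    favoring pixels that are neither blown out nor lost in shadow."""
    stack = np.stack(exposures)                        # shape: (n, h, w)
    # "Well-exposedness" weight: highest for mid-gray (0.5) pixels
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)                     # normalize per pixel
    return (weights * stack).sum(axis=0)

# Three toy grayscale "exposures" of the same flat scene
dark = np.full((2, 2), 0.1)
mid = np.full((2, 2), 0.5)
bright = np.full((2, 2), 0.9)
fused = fuse_exposures([dark, mid, bright])
```

In this symmetric example the well-exposed middle frame dominates, so the fused result sits near mid-gray; on a real scene, different regions would draw from different frames.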

In a TV, HDR is all about increasing the range between the darkest and brightest elements within an image. The result is a dramatic, lifelike picture, with brighter whites and deeper blacks that really pop off the screen. The key reason that HDR in TVs has lagged behind HDR in still cameras is that the shows have to be shot and broadcast or streamed in HDR, and televisions have to be built to read and display that signal. It took a while for TV manufacturers to settle on a technical standard that all of those players will adhere to. Now that appears to be happening, and any HDR TV will follow something called the SMPTE standard. (Some will also support a second standard called Dolby Vision.) These HDR televisions will be on the market starting in March, along with HDR content from the major studios.

But HDR in TVs will actually have more jargon attached to it—“Ultra HD Premium” sets promise to take full advantage of the technology, while other sets will only be “HDR-capable.” And televisions will still vary in picture quality for lots of reasons that go beyond their dynamic range. Of course, sorting out those performance differences is our job. Stay tuned for test results on the first sets with HDR, which should be available soon.


4K or Ultra HD

Back in 2013, 4K or Ultra HD televisions were a high-priced, new-tech, man-cave indulgence. Today they’ve gone mainstream and will probably make up about one-third of the TVs sold in 2016. You can find quite a few of these TVs in our Ratings for under $1,000 and even some for as little as $500, though they’re still not as cheap as plain old 1080p HDTVs.

Are they worth the extra bucks? That depends on several factors.

On the spec sheet, UHD TVs are quite impressive. They have four times as many pixels (about 8 million) as already-sharp HDTVs. Specifically, an Ultra HD TV has 3,840 pixels horizontally and 2,160 pixels vertically, compared with a 1080p TV's resolution of 1,920 x 1,080. With all of those extra pixels, even the smallest details are visible—the finest strands of hair and the subtle texture of a cotton shirt, for example.
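The "four times as many pixels" claim is just multiplication, which you can verify in a couple of lines:

```python
# Pixel-count arithmetic behind the Ultra HD spec
uhd = 3840 * 2160   # Ultra HD: about 8.3 million pixels
hd = 1920 * 1080    # 1080p HD: about 2.1 million pixels
ratio = uhd / hd    # exactly 4.0
print(uhd, hd, ratio)
```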

To the naked eye, the jump from regular HD to Ultra HD isn't as dramatic as the change from standard definition to HD. In fact, you’re probably not going to appreciate 4K resolution unless you’re looking at 4K content on a model with a screen that’s at least 65 inches measured diagonally, or you’re sitting quite close to the TV.

Also, there isn’t much 4K content available right now, although Netflix, Amazon, DirecTV, and other content providers will be offering more 4K viewing options later in 2016. And new Ultra HD Blu-ray discs and players are starting to roll out.

So, is it time to buy one of these new sets? If you’re a casual television observer shopping for a set with a screen 50 inches or smaller, you’ll save a lot and miss little by going with an HDTV. But if you’re a serious sports fan or movie buff, or plan to use your TV as a part-time computer monitor, an Ultra HD TV is the way to go. And if you can wait a few months, prices are sure to drop. There's also another reason to consider a UHD TV: Many of them will also feature HDR.

Hi-Res Sound

If 4K TV has a linguistic counterpart in audio, it’s something called hi-res (aka “high-def”) sound. And you’re probably going to come across this tech term a lot in 2016: Apple Music is rumored to be adding a high-res music streaming option in the spring.

Most music you download or stream comes in the form of MP3 or AAC files, and the data has been compressed to make it easier to store and download. Both file formats use “lossy” compression, meaning bits of data—presumably those you’ll miss least—are removed. The more data that's taken away, the smaller the file becomes, but the likelihood grows that the sound will be noticeably degraded.

The alternative is hi-res, “lossless” compressed file formats, such as ALAC (Apple Lossless Audio Codec) and FLAC (Free Lossless Audio Codec). These maintain all of the original information, so when a song is uncompressed it should sound exactly the same as the source material. The downside is that these files can be 10 to 15 times the size of AAC and MP3 files (say, 150MB for a FLAC song vs. 12MB for the MP3 or AAC version).
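For the numerically inclined, here is the back-of-envelope math behind those file sizes. The specific bitrates (a 256 kbps MP3, a 96 kHz/24-bit hi-res master, and FLAC's roughly 40 percent lossless savings) are typical assumptions, not measurements of any particular track.

```python
# Rough size arithmetic for a 4-minute stereo track
seconds = 4 * 60

# CD quality: 44.1 kHz sample rate x 16-bit samples x 2 channels
cd_bits_per_sec = 44_100 * 16 * 2
cd_mb = cd_bits_per_sec * seconds / 8 / 1e6     # ~42 MB uncompressed

# Hi-res: 96 kHz x 24-bit x 2 channels, then typical FLAC savings
hires_mb = 96_000 * 24 * 2 * seconds / 8 / 1e6  # ~138 MB uncompressed
flac_mb = hires_mb * 0.6                        # ~83 MB as FLAC

# The same song as a 256 kbps lossy MP3
mp3_mb = 256_000 * seconds / 8 / 1e6            # ~7.7 MB
```

The ratio between the hi-res FLAC and the MP3 lands right around the 10-to-15-times range cited above.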

To enjoy hi-res audio, you’ll need a compatible player, such as the $300 Sony NWZ-A17 Hi-Res Walkman or the HTC One M9 smartphone, along with premium headphones. And you’ll pay a lot more for the music, which you can download from websites including Acoustic Sounds, HDtracks, and Pono Music. Expect to pay about $20 to $25 for an album, compared with $10 for a typical iTunes or Amazon Digital Music album.

So is it worth it? In our testing, Consumer Reports audio engineers could sometimes hear a bit more detail and clarity in hi-res files than in CD-quality and AAC audio files, especially when the music was played through great equipment, such as a pair of $300 Grado Prestige SR325e headphones. But hi-res audio is probably not worth the splurge for casual listeners who store music on a portable device and listen through average gear.

qHD displays, like those on Samsung's flagship Galaxy S7 smartphones, present their content with more than 500 pixels per inch of detail.

qHD (quad High Definition)

Smartphones, like TVs, have been undergoing their own resolution revolution for several years. The first smartphones with true high-definition (1920 x 1080) displays appeared in 2011. Initially, HD was a tech term that applied only to Android smartphones, but the other phone platforms, including Apple’s, have since caught up. The advantages of an HD smartphone display over lower-res ones are obvious: Details in photos and videos become more noticeable, while text on webpages and documents looks crisper and is easier to read.

In the fall of 2014, the screens of some flagship smartphones from LG and Samsung became significantly more refined, boasting a resolution of 1,440 x 2,560 (about 500 or more pixels per inch, or ppi, depending on the phone’s display size). These new displays, designated qHD, are often confused with the displays of 4K televisions, though they’re not quite as sharp. The “quad” part of the term comes from the fact that a qHD screen has four times as many pixels as a 1,280 x 720 (720p) display.
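The ppi figure comes from simple geometry: divide the screen's diagonal pixel count by its diagonal measurement in inches. A quick sketch (the 5.7-inch size here is a hypothetical example, not a specific phone model):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A qHD (1,440 x 2,560) panel on a hypothetical 5.7-inch phone
print(round(ppi(1440, 2560, 5.7)))  # ~515 ppi
# A 720p panel at the same physical size
print(round(ppi(720, 1280, 5.7)))   # ~258 ppi
```

The same resolution on a smaller screen yields a higher ppi, which is why the article's "about 500 or more" figure depends on display size.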

You’d think that a qHD display would mean sharper, eye-popping pictures and more detail. But in our tests, we found that most users don’t notice those extra pixels in everyday use, partly because even the largest phones have relatively small displays. What’s more noticeable—and not in a good way—is when a phone has a very low resolution. Our advice: Choose a model with a screen resolution of at least 720p (roughly 300 ppi).

Augmented Reality (AR)

Virtual Reality is a big buzzword for 2016, with products such as Google Cardboard and the long-awaited Oculus Rift headset receiving lots of attention in the media. But VR has a close relative, known as augmented reality, and it’s a toss-up as to which technology will matter more to the average consumer in the coming years.

Here’s the difference: Virtual reality is similar to what you see in the movie The Matrix—the user is immersed in a digitally created environment (though no Matrix-style brain implant is involved, at least not yet). As you turn your head, you gaze at a complete world that seems to exist in all directions. Augmented Reality (AR), on the other hand, is a real-world view with data overlays. An example would be the backup displays on many cars, which add overlays showing your direction of travel to assist with parallel parking.

Google Glass, which was withdrawn from the market in January 2015, was an augmented reality product—one that annoyed more people than it attracted. Now Microsoft is developing an AR headset called the HoloLens, and the company has shown how it could be used for practical purposes, such as viewing a 3D model at your kitchen table, or allowing you to fight off aliens that seem to emerge from your living room wall.

Because AR is tied to the real world, it avoids problems such as the motion sickness and vertigo sometimes associated with VR headsets. But it remains to be seen whether it will become a technology that has appeal for people in their everyday lives.

USB Type-C connections eliminate the fumbling and squinting that have become a ritual on phones that use micro USB cables.

USB Type-C

In 2016, you’re going to see a growing number of smartphones, tablets, and computers with a new type of connector: USB Type-C. This new cable has a multitude of advantages over the micro USB connectors commonly found on smartphones that aren’t iPhones.

First, just like the Lightning connector on an iPhone, USB-C connectors can be inserted into the phone no matter which way you hold it; there is no "wrong-side up." That eliminates the fumbling and squinting that has become a ritual on phones that use micro USB cables.

But here’s how Type-C is better than the iPhone’s Lightning connector. Type-C has a potentially much higher transfer rate—up to 10 gigabits per second (Gbps)—versus Lightning’s speed limit of about 4 Gbps. That should mean nearly instant transfers for the mega-size photos and HD videos produced by today's high-resolution smartphone cameras.
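Those peak rates translate into transfer times as follows. This is a best-case sketch that ignores real-world protocol overhead, and the 1GB file size is purely illustrative:

```python
# Time to move a file at the quoted peak rates (no protocol overhead)
def transfer_seconds(size_mb, rate_gbps):
    bits = size_mb * 8e6           # megabytes -> bits (decimal units)
    return bits / (rate_gbps * 1e9)

video_mb = 1000                    # a roughly 1GB smartphone video
print(transfer_seconds(video_mb, 10))  # USB Type-C at 10 Gbps: 0.8 s
print(transfer_seconds(video_mb, 4))   # Lightning at ~4 Gbps: 2.0 s
```

In practice, actual throughput depends on the storage on both ends as well as the cable, so real transfers will be slower than these ideal figures.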

What’s more, USB Type-C supports bi-directional power. That means your phone will receive a charge while it’s transmitting files to a compatible TV, printer, or other accessory over the same cable. The bad news: Once your new phone has this connector, you'll need to buy a whole bunch of Type-C adapters to connect it to your old PCs and accessories. Also, there have been reports of some aftermarket cables damaging electronics—we expect the issue to be resolved quickly.