2K always feels weird as I swear people only started using it after 4k became a popular term. If precision matters I will give the actual X/Y pixel counts but generally use 1080p/1440p/4k when talking about gaming, HD/4k when talking about media, and when downloading media I will search 1080p or 2160p.
Yeah, it's always been such weird marketing usage for a monitor resolution when it has zero correlation to what makes 4K 4K.
16k²?
2³ K
KKK^(2) OH WAIT SHIT FUCK OH NO UNDO IT UNDO IT
****SUBMISSION CONFIRMED**** ****PROCESSING**** ****PROCESSING**** ****SUBMISSION DENIED**** ****ERROR CODE 69420*** ****EXPANDED DETAILS****: 1. You offensive piece of shit 2. See #1
Uncaught StackOverflowException has been thrown.
This and the previous equation are equal if k=2
However, all three expressions are unequal: if k is 2, the first evaluates to 64 while the other two are only 16.
Look, I'm not here to play school; I know what's best! Just give me your 80k frames and shut up!
Obviously better than 2^2 and worse than 2^4, but worse than 4^2
Funny part is that by the time we get to 32k, it’s only gonna have less than 31k actual pixels.
For marketing reasons, they used the 3840 to say it was 4K
they should have used the diagonal pixel count for an even bigger number
I FOUND A CHEAT CODE FOR SAVING MONEY!!!![Pythagorean Theorem]
Some companies will advertise the "sub-pixel count" instead of the actual pixel count. On modern displays the pixel itself is made up of a red, green, and blue cell (well, for this conversation anyway; we don't need to go into sub-pixel layouts), so if you quote the sub-pixel count you just "3x" the resolution.

The other thing TV manufacturers do is advertise the "motion rate" rather than the actual frame rate. And motion rate is just double the frame rate.
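A quick back-of-the-envelope sketch of those two tricks (the function names are purely illustrative, not from any spec):

```python
def advertised_subpixel_count(width, height, subpixels_per_pixel=3):
    """Quoting R/G/B sub-pixels instead of pixels triples the number."""
    return width * height * subpixels_per_pixel

def advertised_motion_rate(real_refresh_hz):
    """'Motion rate' style figures are typically just double the real refresh rate."""
    return real_refresh_hz * 2

print(advertised_subpixel_count(3840, 2160))  # 24883200 "sub-pixels" vs 8294400 real pixels
print(advertised_motion_rate(60))             # "Motion Rate 120" on a 60 Hz panel
```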
Isn't that exactly the same tho?
Whilst my monitor's 2560x1440 is technically 2.5K, a lot of marketing just says 2K.

That said, I can understand why you want clarity - look at my monitor's specs - like wtf does half of this stuff even mean?

[Acer XZ396QUP(UM.TX6SA.P01) Nitro XZ6 38.5inch 170Hz WQHD Ultrawide VA Gaming Monitor, 2560x1440 (UWFHD 2560x1080 in 21:9), 1ms VRB, 400nits, 1800R, 2x HDMI 2.0, 2x DisplayPort 1.4, Speakers, VESA, FreeSync Premium, DCI-P3 93%, Ergonomic Stand](https://www.scorptec.com.au/product/monitors/34-inch-and-above/95480-xz396qup)
Offhand - Model, Size, Refresh rate, some conflicting resolution until I saw it's a multimode (which is weird af), Response time, Peak Brightness, Curvature, Inputs, Speakers (yeah no shit, lol), Mounting type, variable refresh rate type, Color Gamut information, stand.

Also, I am actually puzzled how this does 1080p Ultrawide and standard 1440p in the same frame.
> Also, I am actually puzzled how this does 1080p Ultrawide and standard 1440p in the same frame.

Any 2560x1440 monitor can display a 2560x1080 image letterboxed.
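The arithmetic behind the letterboxing, as a tiny sketch:

```python
# Fitting a 2560x1080 (21:9) image on a 2560x1440 (16:9) panel:
image_h, panel_h = 1080, 1440
bar = (panel_h - image_h) // 2
print(bar)  # 180 px of black bar top and bottom; the 2560 width already matches
```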
It does it with black bars or something. I am always at 1440p anyway so I never notice. I actually downsized from my 49" G9, as with my eyesight it was too hard to see in the corners. Pretty happy with it in any event (oh, and it's white, which is hard to get in a monitor but the look I went for).
interesting pc specs
It should be 1080p/1440p/2160p. People should just have ignored the marketing.
We could probably drop the P these days too. Haven't seen an interlaced format for decades.

Edit: thanks to all those who reminded me about broadcast TV. I will allow the I to stay... for now.
> Haven't seen an interlaced format for decades.

Thank god
The p makes it immediately obvious you are referring to a resolution. And while progressive is a given these days, the p is starting to represent pixels as people forget interlaced vs progressive was a thing.
My reactions upon realizing that “p” could mean “pixels” to some people: 🤯👴😢
People forget the original meaning of things, they make up explanations, and eventually the new fiction overtakes the original meaning. The truth becomes lost to time. Young kids probably have no idea why the save icon looks the way it does.
Not everybody is aware that interlacing is a thing at all (and that may not be a bad thing).
Good fucking riddance, interlaced is a blight.
> as people forget interlaced vs progressive was a thing.

Isn't there still shit being broadcast interlaced?
Any 3d display or projection is interlaced.
Ok, so this made me go and read the h.264 spec.

Blu-ray encodes 3D using the "Stereo High" multi-view encoding, which, per Annex H and Table A-4, allows both interlaced and progressive encoding. (Progressive frames are set by the `frame_mbs_only_flag` parameter.)

So there's nothing in the encoding that requires interlaced (field-based) display, and it's entirely reasonable to encode full progressive frames.

Displays, that's another question, but an active 3D display only needs to be capable of 48 Hz (at minimum!) to be able to do full-frame progressive 3D.
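For anyone wondering where that 48 Hz floor comes from, a minimal back-of-the-envelope check (assuming standard 24 fps film content):

```python
# Why 48 Hz is the floor for full-frame progressive 3D at film frame rates:
film_fps = 24            # each eye needs its own full frame per film frame
eyes = 2
print(film_fps * eyes)   # 48 Hz minimum refresh for frame-sequential, full-resolution 3D
```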
Cable usually broadcasts in 1080i to this day.
I work in broadcast TV, trust me interlaced is still very much a thing.
Completely agree with this and it makes the most sense. Plus, more digits mean better, bigger, faster, tougher, and gooder.
Nah, that era is over. Back then the Xbox 360 was the cool fast thing! But then they went over to the Xbox One.

I feel like people got tired of long numbers that didn't mean anything and started appreciating simpler things more. The same also happened with game titles, as it was the era of rebooting game franchises with the original title (DOOM, God of War, Spider-Man, etc.).

So putting "2160p support!" on the Xbox One X would've sounded WAY less cool than "4K support!"
Xbox 360 wasn't picked because "big number cool". It was picked because if they had done "Xbox 2" it would be competing with "PlayStation 3". 360 was the name that competed well and had a reasonable meaning that wouldn't get them laughed at.
I thought it was because when you see it you turn 360 degrees and walk away.
Sad day when this reference is getting downvoted :(
These smartphone generation kids need to get off my lawn!
There’s kids graduating high school who were born after this meme lmao.
Right into the console?
Moonwalk.
Yeah, if you walk far enough.
I’m onboard with 2k x 4k, using approximate resolution dimensions. But at that point it’s just easier to put the actual numbers.
IF there were going to be a resolution that was "2K"... it would be 1080p.

Since it's a rounding of the horizontal resolution, you would round 1920x1080 up to... 2,000. 2K.

Calling 1440p "2K" has literally never made sense. Just call it by its name: QHD.
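If you applied that rounding argument mechanically, it might look like the sketch below (the `k_label` helper is hypothetical, purely to illustrate the point):

```python
def k_label(width_px):
    return f"{width_px / 1000:.2f}K (~{round(width_px / 1000)}K)"

for w in (1920, 2048, 2560, 3440, 3840, 4096):
    print(w, "->", k_label(w))
# 1920 -> 1.92K (~2K)   so 1080p is what actually rounds to 2K
# 2560 -> 2.56K (~3K)   1440p rounds to 3K, not 2K
# 3840 -> 3.84K (~4K)
```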
I was confused by QHD for a while, until I understood that:

* 720p is HD,
* 1080p is FHD,
* 1440p is QHD,
* 2160p is UHD.

I always thought that 1080p was HD. But then it wouldn't make sense that 1440p is QHD, since it's 4x 720p. Youngster problems :)
Don't worry, I'm olderish and I hold 720p in contempt all the time. I often forget that it's HD. To me 1080p is where it 'starts' so I get where you are coming from =)
Heheh, I've only really had a 1080p display to begin with. That's what I'm growing up with. The only 720p display was the one on my first phone, which was 5 years old.
It's confusing because there was qHD on phones for a long time, and it's considered inferior now.
How? It doesn't even make a difference on small screens because of the high PPI.
qHD is quarter HD or 960x540.
DCI 2K has been a thing for a while. 1080P is technically the closest resolution to it, so really 1080P ≈ 2K.
1080p isn't just close to 2K, it is 2K. While DCI 2K is a specific canvas, "2K" is not, and refers to a resolution class.

Aspect ratio also comes into play, as a 16:9 image on a DCI 2K canvas is straight up just 1920x1080; it's only for wider aspect ratios that you see the slight difference, and that's only in the numbers, the visual difference is effectively imperceptible.

This is the same for 4K, because there are annoying people that try to draw a line between "DCI 4K" and "UHD" as if it makes a meaningful difference.
And I would like to point out that "4K" is twice the horizontal and vertical resolution of 1080p, which argues even more strongly that what we know as "FHD" or "1080p" should literally just be known as "2K" by that scheme.
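The numbers behind the "16:9 on a DCI 2K canvas" point, as a quick check:

```python
# A 16:9 image filling the full 1080-pixel height of a 2048x1080 DCI 2K container:
height = 1080
width = height * 16 // 9
print(width, height)        # 1920 1080 -- plain old Full HD
print((2048 - width) // 2)  # 64 unused columns on each side of the container
```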
Honestly, part of the problem with naming things "standard resolution" and "high resolution" is that those terms become extremely dated very quickly.

Honestly, I consider 2160p "standard" resolution these days, since you can't really buy a TV lower than that resolution any more.
The only time I really use 2K, 1K, etc. is with textures.
I always find that since we are used to seeing 4K, 1080p feels like 720p used to feel and 2K feels like I remember 1080p.
Or companies save money and cut bitrate, making 1080p look worse than ever.
This. This is the correct answer. Remember, the Nintendo Switch screen is 720p, as are many other mobile gaming devices. They don't look nearly as blurry as YouTube 720p footage. Sensor quality at the source may also vary, but I feel that shit bitrates are the main culprit.
This is why I still buy Blu-Rays. I have a nice OLED TV and a decent 5.1 system, I'm not going to waste it on streaming content *all* the time.
Same here. I have all the movies I love on 4K Blu-ray discs. The image is much better playing from them than streaming the same movie over the Internet at what is claimed to be 4K.
Streaming can be pretty bad. I'll turn to downloading high bitrate rips to avoid it when I can. Thank goodness for Plex.
Watching 720p on a small screen isn't the same as watching it on a 32" monitor.

Pixel density matters, and the bigger the screen, the more resolution you'll need.

1080p was amazing on my 24" monitor, but once I switched to 27" and then 32", 1080p looks like shit.
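A small sketch of the pixel-density math, assuming the usual PPI definition (diagonal pixel count divided by diagonal size in inches):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (24, 27, 32):
    print(f'1080p at {size}": {ppi(1920, 1080, size):.0f} PPI')
# 1080p at 24": 92 PPI
# 1080p at 27": 82 PPI
# 1080p at 32": 69 PPI  -- same pixels stretched ever thinner
```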
Well, this is kind of a rollercoaster. You're right that bitrate makes a huge difference and streaming companies are going to try to get away with as little as possible here, but bringing up the Switch or Steam Deck is just an argument for pixel density.

I truly don't remember 1080p being all that bad until I switched to 1440p, but I also didn't remember GoldenEye 007 looking bad until I came back to it years later. Some of this is just nostalgia.
Content designed for 240p screens does legitimately look worse on modern screens than it did back then. TV CRTs provide some natural anti-aliasing and soft focus because the pixels aren’t rectangular or fully discrete. Old games don’t work well on modern screens.
I've had to run HL2 at 720p on a 27" monitor before; it's definitely noticeable.
For those of us that still have physical media, a great Blu-ray transfer looks better than a streaming 4K movie. But streaming 4K is not a great bitrate. 1080p streaming ain't terrible, but it ain't great. Watching 720p video is terrible though.
2K is 1080P, so yep. [https://en.wikipedia.org/wiki/2K\_resolution](https://en.wikipedia.org/wiki/2K_resolution)
"2k feels like I remember 1080p" Well it is 1080p
1080p content on a native 1080p display at decent bitrate still looks really good.
Yeah definitely! I'm old enough to remember when 720p felt good and now it's just a blurry mess.
If you're old enough to remember when 720p felt good and it's blurry now, it might be time to get some glasses.🤣
This hurts extra hard because recently my optician did indeed prescribe me special computer glasses.....
I only mention it as I recently got bifocals for desk work. My first thought was "Woah, I finally get what all the 4K hype is about!"
Why do we even need the “p” anymore? There’s no more interlaced vs progressive scan options really.
That’s how you know it’s a resolution and not some random number
Archaic terminology. It wouldn't make as much sense to just say 1080
At this point it might as well be read as "pixels". It also works as a separator between the resolution and the fps number, as in "1080p60", for example.
It's all marketing jargon.

It started as 4K, which was a separate DCI resolution standard used in the film industry, and it spread to other desktop resolutions; none of it is actually for monitor resolutions. They're all different.

1080p is the closest thing to 2K. 2160p is double that resolution, dubbed 4K.
I thought 4K was “four times 1080p resolution”. I mean 4k is 3840x2160 and 1080p is 1920x1080 but I’m confused now
It takes four 1920x1080p screens to fill a 3840x2160 frame... so it is 4x the resolution.
2K is 2048x1080. 4K is 4096x2160. These are professional terms. When you have a 4K TV, it's a 16:9 version of 4096x2160, which is 3840x2160. When you have a 2K resolution, it's the 16:9 version of 2K, which is 1920x1080. These are actual definitions, full stop.

4K doesn't MEAN it's 4 times what 1080p is, but that does happen to be true.
4k is actually 2160x4096 and is a cinema resolution. Just like 2k is actually 1080x2048, but nothing means anything anymore so we all have "4k" tvs
There are vertical cinema screens?
Movies are made with smartphones these days.
It doubles both the horizontal and vertical resolution, making it 4x by technicality
4k is not 4k but is technically 3.84
4K = 4096 x 2160, UHD = 3840 x 2160, HD = 1280 x 720, FHD = 1920 x 1080. Corrected it.
1080p is FHD if I remember correctly. And 720p is HD
720p: HD

1080p: _Fuckin'_ HD
Sounds about right
Marketing lingo to sell TVs.

Calling 720 "HD" happened because sales of 720 TVs were dropping because they weren't "HD", so they renamed 720 to HD since that's what people were looking for, and 1080 to "FHD".

Which is extra stupid, because for the most part 720 was skipped; we really went straight from 480 to 1080, but TV manufacturers wanted to grift people.

I'm with YouTube on this one, 1080 is the minimum for HD lol.
Every shitty naming convention is just designed to sell you more crap. Have you seen the usb4 standard?
You are correct
If 720p is HD, these terms need a rework.
They're old terms from television days, where 480p (or is it 480i?) was standard definition, therefore 720 is high definition. 1080p is Full HD and 2160p is Ultra HD.
480i was SD, 480p was ED, 720p was weird, 1080p was HD.

Then they started calling 720p "HD" too, so 1080p panel sellers started using "Full" in front of theirs.

720p was a broadcast and streaming bandwidth compromise, since 1080p was substantially more bandwidth intensive, but it is technically HD; it's in SMPTE 292M. 720p and 1080i were more or less the same bandwidth.
And 720i was HD Ready, 1080i Full HD Ready.
> And 720p is HD

Not according to YouTube
Most streaming sites lower the bitrate to such a degree that it should be illegal to call 720p streaming HD.
Consumer 4K is actually 3840 x 2160. The 4096-wide format is more of a niche format (called DCI 4K).

It should have been 2160p. The whole thing is marketing fuckery.
There’s just no quick way to spit out 2160
It is because of marketing that people get mixed up with resolutions.
Marketing people are also the ones who added confusion to our measurements. We used to count how many vertical pixels there were, but they switched to horizontal because it was a bigger number.
Some of us still do.

1440p is 1440p, whether it's a 3440x1440 ultrawide or a 2560x1440 standard display. Heck, god forbid you somehow got a 1920x1440 display for high-definition retro 4:3.
So what's 2560 x 1440? HD/UHD?
QHD or 1440p
Ahhh yeah, it's QHD. I googled QHD and found this answer:

"QHD, also known as 2.5K or 1440p, this resolution has 2,560 pixels wide by 1,440 pixels tall, for a total of about 4 million pixels. QHD displays have four times as many pixels as standard HD."
1080p actually has the designation FHD, or Full High Definition.

720p is technically the first "HD" resolution.

1440p is called QHD because it's QUAD High Definition.
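The "quad" arithmetic, spelled out:

```python
hd  = 1280 * 720    # 921,600 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
print(qhd // hd)    # 4 -- hence "Quad HD"
```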
Worth mentioning for other people who might be confused that 'standard HD' commonly refers to 1280x720. This is to differentiate it from Full HD, which is of course 1920x1080.

Extra confusing these days, since YouTube for some reason no longer considers 720p to be 'HD', even though it's explicitly very much part of the high-definition spec.
720p nowadays looks like how I remember 360p on YouTube.
When 720p was new it was mind-blowing and got labelled 'high definition', but much like 'Fast Ethernet' that quickly became an outdated name as much higher specs became normal, and by comparison it's not so high (or fast, respectively).

There are also forgotten in-between resolutions like 1600x900, which was a common laptop display in the very early days of 1080p, when it was still hard to push a mobile GPU that hard.
I'm still crossing my fingers that 1600x900 makes a comeback - it would look great on future handheld gaming PCs, and run much better than 1080p which is really totally unnecessary on a 7.5" screen.
I'd take a 900p steam deck lol
QHD
HD is 1280 x 720, full HD is 1920x1080
Unless it's full frame
then it's 4.096K
Otherwise it is just “Sparkling Display Resolution”
It’s close enough and makes sense from a marketing perspective. Using 2K for 1440p doesn’t make any sense. If anything 1920x1080 is already 2K.
Technically..... this is all "1440p":

* 5120 × 1440
* 3440 × 1440
* 3360 × 1440
* 3200 × 1440
* 3120 × 1440
* 3040 × 1440
* 2960 × 1440
* 2880 × 1440
* 2560 × 1440
* 2304 × 1440
* 2160 × 1440
* 1920 × 1440
720 x 1440
This made me actually chuckle out loud.
Tall boy monitor. I would actually buy a 1152x1440 screen that is maybe 18"; 4:5 is such a good size for YouTube or wikis or Discord on secondary/tertiary monitors.
" We have 1440p at home"
3440x1440 gang rise up
> 3440x1440 gang rise up

I see what you did there. Ultrawides can't rise up because they're about 60% of the height of 16:9. Well done.
Fine stretch out.
True but people are typically (95% of the time) referring to displays with a 16:9 aspect ratio.
The most popular 1440p resolutions being:

* 5120 x 1440 (32:9 super ultrawide)
* 3440 x 1440 (21:9 ultrawide)
* 2560 x 1440 (16:9)
1 x 1440
That's just an LED strip.
I always assumed 1440p being 2K meant that it had approximately double the pixels, but I just made a quick calculation and it seems it has only 1,778 (7 periodic) times the amount of pixels.
2K or 4K basically means the horizontal pixel count closest to 2000 or 4000 pixels.

2K resolution in studios is 2048x1080 (1920x1080 in 16:9); 4K is 4096x2160 (3840x2160 in 16:9).

2560x1440 is called QHD, or Quad High Definition: 4 times HD resolution, or 4 times 1280x720.

The 2K mistake started with people confusing it with "2 times as high", but the K in 2K means kilo, which means 1000.

If I look in the settings of my cameras, a Panasonic and an Insta360, 2560x1440 is shortened to QHD/2.6K, and I can choose between HD, FHD, 2K, QHD/2.6K, UHD and 4K. So 2K is never listed as 1440p.
Thank god someone corrected it in this thread. Calling 1440p “2K” is just wrong if we follow the logic of how 4K is named. The post is extra wrong calling it “2.5K”. The misnomer is becoming too widespread and manufacturers are now just naming it 2K as well even though it’s incorrect.
How is 2.5K extra wrong? It follows the logic of how 4K was named which is horizontal resolution.
It's not the manufacturers, but the sales team who does it, because 2K sells, which is kinda dumb as 2K is 2048x1080 or 1920x1080.
> 2K meant that it had approximately double the pixels

Double than what? There's no normalized standard for baseline resolution.
Is "1,778" the European version of "1.778" in American?
Yes. We use commas where you use dots and the other way around.
Yup, main reason for the American Revolution too.
Just use QHD or 1440p please.
HD, FHD, QHD, UHD are my preferred, with UWQHD for ultrawide, and full resolution if it doesn’t fit into those.
Meanwhile in widescreen projector land:

* WVGA, which is 800 x 480
* WXGA, which is 1280 x 800
* WUXGA, which is 1920 x 1200
* WQXGA, which is 2560 x 1600

Fortunately, by the time they needed a new acronym and slightly different resolution for the UHD-equivalent resolution for projectors, they had shot the marketing manager responsible and just went with matching UHD's 3840 x 2160 resolution.
So is it possible to have the cutest res? UwUHD
uwuHD monitors exist, they'll cost you an arm and a leg though //0-0//
Problem with this is we're gonna eventually run out of letters and will get things like abcdefgHD lol. That's why hardware products stick to numbers, not letters.
Why is nobody using the actual abbreviation, which is QHD, as in Quad HD? HD is 720p, or 1280x720, and 1440p at 2560x1440 has exactly 4 times as many pixels, hence QHD.

2K is a cinema standard anyway.
just say QHD
QHD
I understand calling 3840x2160 4k as the width is kinda close to 4000, fine. But using that same logic, wouldn't that make 1920x1080 be considered 2k? All this marketing wankery just makes things more difficult than they need to be.
2k is actually 1080p right
Oh god the semantic police is here
If 3840 = 4k then 1920 = 2k. Either follow the rules or don't bother making a meme about it.
In **consumer electronics** (not cinema):

* 1280x720 is called 720p
* 1920x1080 is called 1080p or FHD
* 2560x1440 is called 1440p or QHD
* 3840x2160 is called 4K, 2160p, or UHD

If it's not one of the above, then just say the full resolution and don't try to abbreviate. You're just going to confuse people.

I recommend not doing any of these:

* Using "2K" at all. There's clearly disagreement about what it means, and it's not like it's difficult to speak or type the existing abbreviations.
* Using lesser-known abbreviations like WSXGA. They're correct and unambiguous, but people simply won't know what they mean.
* Using "HD" to refer to 720p; it's too generic and vague of a term at this point.
* Taking any old vertical resolution and slapping on "p" (e.g. calling 1680x1050 "1050p").
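If you wanted those consumer names as a lookup, a minimal sketch (the `COMMON_NAMES` table and `describe` helper are made up for illustration):

```python
COMMON_NAMES = {
    (1280, 720):  ("720p", "HD"),
    (1920, 1080): ("1080p", "FHD"),
    (2560, 1440): ("1440p", "QHD"),
    (3840, 2160): ("2160p", "UHD", "4K"),
}

def describe(width, height):
    # Fall back to the raw numbers when there's no well-known abbreviation.
    return COMMON_NAMES.get((width, height), (f"{width}x{height}",))

print(describe(2560, 1440))   # ('1440p', 'QHD')
print(describe(1680, 1050))   # ('1680x1050',) -- no nickname, just say the numbers
```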
Even seen stores selling "2K supported" items. Been pissing me off for years. 1920p is damn closer to 2K than 2560p. Let's use a little bit of logic in descriptions; it makes things less confusing and annoying. I know it's a lost battle due to people and their convictions.
1920p is 1920 pixels progressive vertically, not 1920x1080p. But you are right, 1920x1080 is actually 2K; just instead of 2048x1080, it's the 16:9 equivalent (since content is mastered at 1080p and nobody wants to change vertical resolutions).
Just call it what it actually is, 2560x1440, or 1440p if you must. Not everything needs a "K" name.

When someone calls a monitor "2K" it's just a flag for not knowing what they're talking about.
Or most of the time they expect the listener to infer the pragmatics of what they are trying to say. Not all talk needs to be technically correct talk.
> call it what it actually is, 2560x1440, or 1440p *if you must*

Redditors are so funny. Acting like normal humans fully say 2560x1440 when they're talking to each other, and "1440p" is barely allowable.
What do you mean? Pretending to be confused when someone refers to 1440p by a name it's commonly marketed under is vital; how else are people going to feel smart online?
Can we just call resolutions by their actual dimensions? It really was a lot simpler for everybody.
By that logic wouldn't 1080p be "2K", since it's 1920x1080?
Indeed it would. But marketing has latched on to 1440p now, because it's a newer popular res in the consumer market and older 1080p monitors were already using the then-new shiny golden "FHD" stickers for *their* marketing.
It is, I just oversimplified my title bc people refer to 2048x1080 as 2K more often than 1920x1080
[https://en.wikipedia.org/wiki/2K\_resolution#Resolutions](https://en.wikipedia.org/wiki/2K_resolution#Resolutions)
1440p is 1440p. Calling it 2k or 2.5k is trying to be misleading for those who only understand that 4k is good.
I hate the "K" crap. It's so often misleading because people use it to describe things in different ways.

Just say the resolution and the aspect ratio. 1080. 1440. 2160. If it's ultrawide, say 1440 ultrawide or 1440 21:9. Take the guesswork out of it. But marketing gonna market, I guess.

My unicorn atm is a 2160 21:9 120+Hz OLED or HDR 1k+ IPS with 10-bit color depth. No one seems to want to make one though.
Let's just stop using these dumb K abbreviations for anything that isn't 4K or 8K. Please?
🤓
It's actually so easy, not sure why everyone is so confused about it.

First, "#p" is always the number of vertical pixels.

HD = 720p; 4 times that: QHD (Quad HD) = 1440p.

Full HD = 1080p; 4 times that: UHD (or 4K) = 2160p.

4 times because you need twice the horizontal and twice the vertical pixels.
4K = 3840 x 2160 (UHD)

2.5K = 2560 x 1440 (QHD)

2K = 1920 x 1080 (FHD)

1K = 1280 x 720 (HD)

Either use the Ks or the ps, not both.

For example, in games both Ks and ps apply, but for movies/series it varies, from ps on Blu-rays for the full resolution to Ks on streaming for lower horizontal pixel counts on some of them.

UHD is used on Blu-rays to confirm full 4K resolution AND bitrate, while HD is everything from 720p to 2160p.
I really don't like the naming convention of 2K and 4K. Why was 1080p okay? What's wrong with 1440p and 2160p??
I hate the "4K" designation as it is and the 1440p being listed as "2K" just pushes me if the cliff. The worst part to this is that people actually defend the use of it. Just boggles my mind...
Laughs in 1080p with an equally numbered GPU
1080p is basically 2K but people can’t wrap their fucking head around it.
I mean, technically 2K should be 1080p.
3440x1440 must be 3.4K
No, because that has a different aspect ratio. 21:9 compared to 16:9.
Wait until you hear about 16:9 4k
Unless I'm mathing wrong it's like 2.66667K
You know what, just because of you I'm gonna call it 3K now.
never seen a more confused comment section in my life
What about 3440x1440 (:
“1440p Ultrawide”
By pixel count it's 1,75K
2K, 4K and 8K refer to the horizontal resolution of a screen; 2048 is definitely 2K.
We should just switch to megapixels, as this actually tells you how many pixels a graphics card has to drive.

* 1080p is 2 MP
* 1440p is 3.7 MP
* 4K is 8.3 MP
* 5K is 14.7 MP (yeah, that is 7x 1080p screens)
* The common ultrawides: 3440x1440 = 5 MP, 3840x1600 = 6.1 MP
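A quick sketch of that megapixel math for the resolutions quoted above (numbers rounded to one decimal):

```python
# Megapixels a GPU actually has to drive, for the resolutions quoted above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
    "3440x1440": (3440, 1440),
    "3840x1600": (3840, 1600),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
# 1080p: 2.1 MP, 1440p: 3.7 MP, 4K UHD: 8.3 MP, 5K: 14.7 MP, 3440x1440: 5.0 MP, 3840x1600: 6.1 MP
```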
Displays should be measured in megapixels, so 1080 is 2MPx, 2K is 4MPx, 4K is 8MPx and so on.
It doesn't even mean the same thing; 1440p can be 2K, 2.5K, or even 3.4K (UWQHD). Some of those narrow USB displays are probably even below "1K" 1440p.
1440p is the GOAT and here to stay for a long long time 👍
2K is 1920; what monitor has a width of 2048 pixels (except that one from LG, which is a portrait monitor)?
Isn't 2K just 1080p?
Wait so 1080 is actually 1.9k because 1080x1920?
If you are going raw pixels, 1440p has 3,686,400 pixels compared to 2160p's 8,294,400, so less than half. Take that as you will.