02-23-2020, 02:23 AM   #2
Banana Hammock

Drives: NA
Join Date: Apr 2016
Location: Los Angeles
For the most part it's down to pixel density, volume of production, and input lag.

4K is the same number of pixels whether it's on a 27-inch screen or a 43-inch screen. The difference is that it's much more difficult and more expensive to produce a smaller screen with that many pixels packed onto it. A good analogy is a bunch of straws: say you have 100 straws and you have to store them standing up in two different containers. It'll be easier to fit the 100 straws in the larger container than it would be to squeeze them into the smaller one.
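If you want to put rough numbers on it, here's a quick back-of-the-envelope pixel density calculation (Python, assuming standard 4K UHD at 3840x2160 and the two screen sizes from the example above):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixel density = diagonal length in pixels / diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

# Same 4K resolution, two different panel sizes
print(f"27-inch 4K: {ppi(3840, 2160, 27):.0f} PPI")  # ~163 PPI
print(f"43-inch 4K: {ppi(3840, 2160, 43):.0f} PPI")  # ~102 PPI

Same number of straws, smaller container: the 27-inch panel has to cram roughly 60% more pixels into every inch.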

TVs come in fewer sizes and resolutions than monitors do, so it's easier for manufacturers to pump out large quantities of panels that they can put into their own displays or sell to other manufacturers. Monitors come in a wide array of not only panel sizes but also resolutions and refresh rates, so they're produced in much smaller numbers.

Finally, there's input lag. Input lag is the time it takes for the display to react to the signal it's given (e.g. the time between pressing the menu button on the remote and the menu opening, or between pressing a button and the channel changing). It takes a lot more hardware inside the display to keep that lag down. It really doesn't matter when you're just watching TV, but when you're plugged into a PC, the added milliseconds of input lag are definitely noticeable and can be very disorienting.
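To put the milliseconds in perspective, here's a rough sketch of how much delay that is in frames (the lag figures below are just illustrative, not measurements of any particular TV or monitor):

Code:
def frames_of_delay(input_lag_ms, refresh_hz):
    # How many refresh cycles a given input lag costs at a given refresh rate
    frame_time_ms = 1000 / refresh_hz
    return input_lag_ms / frame_time_ms

for lag_ms in (10, 30, 60):
    print(f"{lag_ms} ms lag = {frames_of_delay(lag_ms, 60):.1f} frames at 60 Hz, "
          f"{frames_of_delay(lag_ms, 144):.1f} frames at 144 Hz")

At TV-style lag the picture can be a few frames behind your mouse, which is why it feels fine from the couch but sloppy at a desk.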

For your use case, a TV is more than fine.

Last edited by Banana Hammock; 02-23-2020 at 02:32 AM..