General Discussion
In reply to the discussion: The difference between normal and high-definition TV, for the non-technical among us.

BelgianMadCow (5,379 posts)

20. I should find the formula
Here's a good part that explains what I'm on about:
To fully understand the implications of high resolution and high definition versus size, we must first understand something called acuity of vision. The Dictionary of Visual Science defines visual acuity as "acuteness or clearness of vision, especially form vision, which is dependent on the sharpness of the retinal focus within the eye, the sensitivity of the nervous elements, and the interpretative faculty of the brain." What this means is that our eyes have a resolution limit. Beyond our ability to see it, increased image resolution is simply a technical exercise and does not play any part in improving the viewing experience. Our visual acuity is unambiguous and relatively simple to measure.
The most common vision measuring tool is called the Snellen chart. An optometrist will ask you to read from a chart while standing 20 feet (or six meters) away from it. The smallest line you can read defines your acuity of vision. This is expressed as a fraction: normal vision is 20/20. 20/10 means that a subject can read, from a distance of twenty feet, the line that a subject with "normal" vision could only read from ten feet. 20/10 vision is therefore twice as good as 20/20; in comparison, 20/40 is twice as bad.
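Since the Snellen fraction is just a ratio of two distances, the "twice as good / twice as bad" arithmetic can be sketched in a couple of lines of Python (the function name here is mine, purely for illustration):

```python
def snellen_acuity(test_distance_ft, reference_distance_ft):
    """Decimal acuity from a Snellen fraction.

    20/20 -> 1.0 (normal), 20/10 -> 2.0 (twice as sharp),
    20/40 -> 0.5 (half as sharp).
    """
    return test_distance_ft / reference_distance_ft

print(snellen_acuity(20, 10))  # 2.0: twice as good as 20/20
print(snellen_acuity(20, 40))  # 0.5: twice as bad as 20/20
```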
Coming to video displays, the human eye's resolution (acuity) is directly proportional to the size of the elements of the image and inversely proportional to the distance from the elements. This relationship is best expressed in degrees.
In simple terms, we can see things that exist within a known angle with the apex at our nose. If you stare straight ahead, you have a stereoscopic field of view of about 100 degrees, or about 50 degrees to the left and right of your nose. We also have a lower limit to what we can resolve. Scientists express this as an angle as well, but it is less than a degree, so it is expressed differently: for angles smaller than 1 degree we use arcminutes and arcseconds. An arcminute is equal to one sixtieth (1/60) of a degree. "Normal" visual acuity is considered to be the ability to recognize an optotype (a letter on the Snellen chart) when it subtends 5 minutes of arc. Applied to displays, the average person cannot resolve two pixels separated by less than 2 arcminutes of angle.
A 42 inch screen is the minimum size at which the pixels are separated by 2 arcminutes of angle if you sit some 6 feet away from it. On smaller screens, the pixels are closer together. Though they can also display images at 1080p resolution, the eye will not be able to appreciate that compared to, say, 720p, even if you sit very near the screen. Both will look the same.
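The geometry behind this — pixel pitch divided by viewing distance, converted to arcminutes — can be sketched in Python. This is my own back-of-the-envelope version of the formula, assuming a 16:9 panel and the small-angle approximation; it is not code from the quoted site:

```python
import math

def pixel_pitch_arcmin(diagonal_in, width_px, height_px, distance_ft):
    """Angular separation between adjacent pixels, in arcminutes.

    Assumes square pixels, so the pitch follows from the diagonal
    and the pixel counts.
    """
    # Pixel pitch: physical inches per pixel along the diagonal.
    diag_px = math.hypot(width_px, height_px)
    pitch_in = diagonal_in / diag_px
    distance_in = distance_ft * 12.0
    # Small-angle approximation: angle (radians) ~ pitch / distance,
    # then convert radians -> degrees -> arcminutes.
    return math.degrees(pitch_in / distance_in) * 60.0

# A 42" panel viewed from 6 feet, at 1080p and at 720p:
print(pixel_pitch_arcmin(42, 1920, 1080, 6))
print(pixel_pitch_arcmin(42, 1280, 720, 6))
```

Running this for a 42" panel at 6 feet gives roughly 0.9 arcminutes per pixel at 1080p and roughly 1.4 at 720p. By the 2-arcminute threshold the quote uses, neither pixel grid is individually resolvable at that distance, which is exactly the point being made.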
Last comment in this thread: http://www.hifivision.com/television/1604-whats-difference-between-hd-ready-full-hd.html
By the way, plasma is just a technology, and you can have an HD-ready or full-HD plasma.
Anyway, to each his own, and if you're happy, you're happy. I just question the everlasting tech drive for "more" when we don't use or need it.
65 replies
The difference between normal and high-definition TV, for the non-technical among us.
Scuba
Sep 2013
OP
Except of course that there is more high quality television programming now than ever before.
Warren Stupidity
Sep 2013
#3
and anyone sitting a normal distance from a normal size screen will NOT see the diff 1080/720p
BelgianMadCow
Sep 2013
#6
Using a site that wants to sell HDTVs, but which uses the formula, you need 720p
BelgianMadCow
Sep 2013
#25
Yes, and that is the choice consumers are making now. Full HD or HD-ready
BelgianMadCow
Sep 2013
#21