1. Ultra-high-definition TV refers to 4K TV, while Full HD TV generally refers to 2K TV. 4K TV currently offers the highest definition on the market, with a resolution of 3840×2160 compared with 1920×1080 for Full HD. In theory the picture carries 4 times the detail of Full HD, and about 9 times that of high definition (HD, 1280×720).
A 2K TV has a resolution of only 1920×1080, just over 2 million pixels, while a 4K TV has a resolution of 3840×2160, about 8.29 million pixels, and can receive, decode, and display video signals at that resolution. Viewers can clearly see every detail and every close-up in the picture, for an immersive viewing experience. The pixel arithmetic is worked through in the short sketch below.
For the same picture, a Full HD TV image tends to look dimmer, with more color distortion and less vividness, whereas an ultra-high-definition TV reproduces colors more faithfully, so the image feels more lifelike. The colors appear purer and more natural, and are more pleasing to the eye.
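To make the pixel arithmetic in point 1 concrete, here is a minimal Python sketch (using only the resolution figures quoted above) that computes the pixel counts and the ratios between HD, Full HD and 4K:

```python
# Minimal sketch: compare the pixel counts of HD, Full HD (2K TV) and 4K UHD.
RESOLUTIONS = {
    "HD (720p)":    (1280, 720),
    "Full HD (2K)": (1920, 1080),
    "4K UHD":       (3840, 2160),
}

def pixel_count(width: int, height: int) -> int:
    """Total number of pixels in one frame."""
    return width * height

hd      = pixel_count(*RESOLUTIONS["HD (720p)"])
full_hd = pixel_count(*RESOLUTIONS["Full HD (2K)"])
uhd     = pixel_count(*RESOLUTIONS["4K UHD"])

print(f"Full HD: {full_hd:,} pixels")           # 2,073,600
print(f"4K UHD : {uhd:,} pixels")               # 8,294,400
print(f"4K vs Full HD: {uhd / full_hd:.0f}x")   # 4x
print(f"4K vs HD     : {uhd / hd:.0f}x")        # 9x
```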
2. Since 1920×1080 has become the high-definition common image format (HDCIF), and UHDTV is a digital video system designed to present film, television drama, variety shows, sports events, concerts and other programs, the UHDTV image format must exceed the quality of the existing HDCIF. The UHDTV image formats are UHDTV1 (3840×2160) and UHDTV2 (7680×4320), which support frame rates such as 50 Hz, 59.94 Hz and 60 Hz, and the system uses progressive scanning.
Sampling is orthogonal, the pixel aspect ratio (PAR) of UHDTV images is 1:1, and the display aspect ratio (DAR) is 16:9 (see the sketch after this point). To maintain compatibility with existing HDTV systems, apart from the pixel counts of the two image formats being 4 times and 16 times that of HDCIF respectively, the colorimetric parameters of the UHDTV system, including the primary color coordinates, reference white, opto-electronic transfer function, and luma/color-difference component equations, remain compatible with existing standards such as ITU-R BT.709 and SMPTE RP 177.
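As a quick check of the aspect-ratio relationship described above, the sketch below (plain Python; the UHDTV1 and UHDTV2 dimensions are the ones quoted in point 2) derives the display aspect ratio from the frame dimensions and a 1:1 pixel aspect ratio:

```python
from fractions import Fraction

# Minimal sketch: with square pixels (PAR = 1:1), the display aspect ratio (DAR)
# is just the width/height ratio of the frame, reduced to lowest terms.
def display_aspect_ratio(width: int, height: int,
                         par: Fraction = Fraction(1, 1)) -> Fraction:
    """DAR = (width / height) * PAR."""
    return Fraction(width, height) * par

for name, (w, h) in {"UHDTV1": (3840, 2160), "UHDTV2": (7680, 4320)}.items():
    dar = display_aspect_ratio(w, h)
    print(name, f"{dar.numerator}:{dar.denominator}")   # both print 16:9
```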
3. Among the three high-definition signal formats, 1080i/50Hz and 1080i/60Hz exceed 1,000 scanning lines, but both use interlaced scanning: the 1080 lines are completed over two scans, so each field actually contains only half the lines, that is, 1080/2 = 540 lines (see the sketch below).
Because a complete frame requires two scans to display, interlaced scanning is inherently limited: slight flicker and line crawl can still appear when displaying fine detail, especially in still images.
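A small illustrative sketch of how interlacing splits one frame into two fields, using the 1080-line frame from point 3 (this only shows the line bookkeeping, not actual video processing):

```python
# Minimal sketch: an interlaced frame is delivered as two fields, each holding
# every other line, so a 1080-line frame becomes two 540-line scans.
TOTAL_LINES = 1080

odd_field  = list(range(1, TOTAL_LINES + 1, 2))   # lines 1, 3, 5, ...
even_field = list(range(2, TOTAL_LINES + 1, 2))   # lines 2, 4, 6, ...

print(len(odd_field), len(even_field))  # 540 540 -> 1080 / 2 = 540 lines per field
```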
Extended information
1. Starting in the 1990s, some standard-definition TVs began to use a 16:9 (1.78:1) aspect ratio, which is close to the 1.85:1 widescreen format used in cinema.
2. At the beginning of the 21st century, new high-definition TVs also adopted the 16:9 aspect ratio, greatly increased the resolution from 720×576 (PAL) of the standard-definition era to 1920×1080, and used multi-channel surround sound. However, they used interlaced scanning, known as 1080i.
3. 1080i can provide more picture detail than 720p, but it is not well suited to fast-moving, dynamic images. Both high-definition standards are 16:9 widescreen with refresh rates of 50 Hz or 60 Hz.
4. So on July 1, 2004, Digital Cinema Initiatives (DCI), formed by seven major Hollywood film companies, revised and released version 4.0 of its technical document as an industry standard.
5. It specifies two levels of digital cinema resolution: DCI 2K (2048×1080, at 24 or 48 frames per second) and DCI 4K (4096×2160, at 24 frames per second). DCI 4K carries more than 4 times the information of high-definition television (see the pixel-count check after this list).
6. 4K therefore secures the technical advantage of digital cinema over high-definition TV, an advantage that will be essential as film and television compete in the future.
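As a rough check of the "more than 4 times" figure in item 5 (counting pixels only, and ignoring frame rate and bit depth), a short Python calculation:

```python
# Minimal sketch: compare DCI cinema pixel counts with 1920x1080 HDTV.
hdtv   = 1920 * 1080   # 2,073,600 pixels
dci_2k = 2048 * 1080   # 2,211,840 pixels
dci_4k = 4096 * 2160   # 8,847,360 pixels

print(f"DCI 2K vs HDTV: {dci_2k / hdtv:.2f}x")  # about 1.07x
print(f"DCI 4K vs HDTV: {dci_4k / hdtv:.2f}x")  # about 4.27x, i.e. more than 4 times
```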
Reference materials: Baidu Encyclopedia, "4K TV"; Baidu Encyclopedia, "HD TV".