Japan Broadcasting Corporation (NHK) continues to advance 8K technologies following the launch of its 8K satellite broadcasting in 2018. The company is also looking toward the future, further evolving its broadcast media toward its "Diverse Vision" concept.
Large 8K displays, augmented reality/virtual reality (AR/VR) devices, three-dimensional (3D) displays, haptic devices, and other emerging devices will become widespread among users in the future, in addition to smartphones and tablets.
The ways in which users engage with programs and content will be even more diverse than they are now (Fig. 1). Media will therefore evolve across various devices along a new axis, presentation dimensions, in addition to the traditional axes of quality and functionality (Fig. 2).
Because of this, NHK Science & Technology Research Laboratories (STRL) is envisioning and conducting research on Diverse Vision, a future media that will enable anyone to view and experience content in forms appropriate to their circumstances and to the devices they use. Through Diverse Vision, users will be able to experience worlds they have not yet seen and deepen bonds with friends and family, with safety and peace of mind assured by trusted information.
NHK conducts wide-ranging research and development to support Diverse Vision, from production to reproduction technologies and from the near to the distant future. The following is an in-depth discussion of the initiatives toward Diverse Vision.
Flexible OLED Display for Large-Screen, Rollable UHD Display
Ultra-thin, lightweight, and flexible displays are crucial components of NHK's Diverse Vision. They accommodate diverse viewing styles and enable users to easily handle large displays and enjoy UHDTV at home (Fig. 3).
In ongoing research conducted in collaboration with Sharp Corporation, NHK has prototyped a 30-inch 4K flexible organic light-emitting diode (OLED) display formed on a plastic film (Fig. 4). The OLED panel is about 0.5mm thick and weighs just 100g, and its minimum bending diameter is about 40mm. A unique point of this research is the industry-first formation of red/green/blue OLEDs in each pixel (i.e., no color filter over white OLEDs is used) on a 30-inch plastic film. The resolution is 147 pixels per inch, and brightness uniformity across the screen is enhanced by dedicated signal processing. NHK STRL will continue research and development on elemental technologies for large, rollable 8K displays.
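As a quick consistency check, the stated pixel density of 147ppi follows directly from the panel's 4K resolution and 30-inch diagonal:

```python
import math

width_px, height_px = 3840, 2160   # 4K UHD resolution
diagonal_in = 30                   # panel diagonal in inches

diagonal_px = math.hypot(width_px, height_px)   # pixels along the diagonal
ppi = diagonal_px / diagonal_in
print(round(ppi))  # 147, matching the stated pixel density
```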
8K Wireless Camera Using mmWave Band
Wireless cameras are an important element of production systems, as they enable expressive production by allowing operators to capture dynamic scenes up close in live broadcasts of sporting events and similar programs. NHK STRL has developed an 8K wireless camera capable of transmitting 8K video at about 200Mbps with low latency in the 42GHz band. An SC-FDE*1 transmission technology was developed that enables stable mobile transmission robust against the nonlinear distortion caused by the power amplifier and performs waveform equalization of received signals in the frequency domain. The complete 8K wireless camera consists of an 8K camera, a High Efficiency Video Coding (HEVC) encoder*2, and a transmitter (Fig. 5).
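The core idea of SC-FDE, equalizing a single-carrier block with one complex tap per frequency bin, can be sketched as below. The block length, cyclic prefix, and channel taps are illustrative assumptions, not parameters of the actual 42GHz system:

```python
import numpy as np

rng = np.random.default_rng(0)

# QPSK symbols for one single-carrier block
N = 64                                  # block length (FFT size)
bits = rng.integers(0, 2, size=(N, 2))
symbols = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)

# A cyclic prefix makes the multipath channel act as circular convolution
cp_len = 8
tx = np.concatenate([symbols[-cp_len:], symbols])

# Hypothetical 3-tap multipath channel impulse response (noiseless here)
h = np.array([1.0, 0.4 + 0.2j, 0.1])
rx = np.convolve(tx, h)[: len(tx)]

# Receiver: drop the CP, then equalize each frequency bin with one tap
rx_block = rx[cp_len:cp_len + N]
H = np.fft.fft(h, N)                         # channel frequency response
eq = np.fft.ifft(np.fft.fft(rx_block) / H)   # zero-forcing equalization

print(np.allclose(eq, symbols))  # True: the block is recovered exactly
```

In the real system the equalizer taps would be estimated from known pilot blocks rather than from a known `h`, and a noise-aware (MMSE) divisor would replace the zero-forcing one.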
8K JPEG-XS Codec with ST 2110 Interface
JPEG-XS is attracting attention as a mezzanine compression technology featuring visually lossless quality and very low latency. NHK has adapted JPEG-XS to 8K video and ST 2110 and has prototyped an encoder and decoder. One or two 8K 60p streams can be transmitted through a single 10GbE link, and the decoder can also provide 4K video extracted from the 8K code stream (Fig. 6). High video quality (a peak signal-to-noise ratio (PSNR) of over 40dB) and very low latency (less than one frame) have been achieved. The compression ratio can be selected as 1/8, 1/12, or 1/16 for test purposes. NHK plans to use the 8K JPEG-XS codec as a scalable interface for multi-resolution production facilities.
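Some rough arithmetic relates the stated compression ratios to the 10GbE link capacity; the 10-bit 4:2:2 sampling assumed here is illustrative, since the exact video format is not specified above:

```python
# Payload arithmetic for 8K 60p video, assuming 10-bit 4:2:2 sampling
# (an assumption; luma plus half-rate Cb/Cr gives 20 bits per pixel).
width, height, fps = 7680, 4320, 60
bits_per_pixel = 10 * 2

raw_bps = width * height * fps * bits_per_pixel   # ~39.8 Gbit/s raw
for ratio in (8, 12, 16):
    gbps = raw_bps / ratio / 1e9
    print(f"1/{ratio}: {gbps:.2f} Gbit/s per 8K 60p stream")
# Under this assumption, one ~5 Gbit/s stream (1/8) fits a 10GbE link,
# and two streams fit at 1/16.
```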
8K High Frame Rate System
NHK STRL has developed high-frame-rate 8K program production equipment as well as transmission and display technologies to achieve clearer and smoother rendering of subject movement (Fig. 7). A complete system, built from equipment developed for a frame rate of 120Hz (twice the frame rate of current BS 8K broadcasting) and covering functions from live on-site production to encoding, satellite transmission, and display, was exhibited at the STRL Open House 2019 held in May (Fig. 8).
In the live production demonstration, an 8K 120Hz camera and low-latency mezzanine-compression IP transmission equipment were used to transmit audio and video from the live venue to STRL. The transmitted video was edited in real time using online equipment, and 22.2ch audio mixing was performed. An 8K 120Hz video encoder compressed the live-produced content in real time at high quality using HEVC/H.265 at bit rates of up to 250Mbps for transmission via the BSAT-4a satellite, which is equipped with a 21GHz-band broadband, high-capacity transponder.
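For a sense of scale, the 250Mbps HEVC stream implies a compression ratio of several hundred to one relative to the raw 8K 120Hz signal; the sketch below assumes 10-bit 4:2:2 sampling for illustration:

```python
# Compression ratio implied by the figures above, assuming 10-bit 4:2:2
# sampling for the raw signal (an assumption; the exact format is not stated).
width, height, fps = 7680, 4320, 120
raw_gbps = width * height * fps * (10 * 2) / 1e9   # ~79.6 Gbit/s uncompressed
encoded_mbps = 250
ratio = raw_gbps * 1000 / encoded_mbps
print(f"~{ratio:.0f}:1 compression")
```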
The signal from the satellite was received and decoded at STRL, and the live video from the venue was presented on a thin, lightweight 88-inch sheet-type OLED display. The 22.2-channel sound was reproduced by a binaural reproduction system*3 using line-array speakers. This experiment was the world's first to produce live 8K 120Hz video and relay the signal via satellite. The demonstration at the STRL Open House showed many visitors that research and development on high-frame-rate 8K has nearly reached a practical level. STRL will continue to improve each aspect of the technology and move forward with research and development aimed at providing a high sense of presence in public viewing of sports events and other programs using high-frame-rate 8K technology.
AR for New TV Viewing Experience
If AR glasses become widespread in the future, what might TV viewing experiences be like? NHK STRL is undertaking research into the utilization of AR technology for media services. One potential new media service is what STRL calls virtual space sharing. Figure 9 shows an example scene of the virtual space sharing service and Fig. 10 illustrates the delivery system featuring synchronized broadcast and Internet transmissions.
In virtual space sharing, TV performers could virtually appear in the living room through AR glasses, commenting on a TV program while standing at actual size right next to the viewer, which gives a realistic sense of presence.
The AR glasses could also record the TV viewing space as a life log; when a past TV program is played back, the people (e.g., family members, including yourself) who watched the same program in the same space can be recreated by the glasses. Another feature is watching TV together with family members and/or friends in remote places while wearing AR glasses.
The effectiveness of virtual space sharing and ways of presenting content for different use cases are currently being researched. NHK STRL is also investigating how the high degree of presentation freedom offered by AR/VR will affect the viewer experience and how AR/VR can be implemented as a media service that enhances the viewing experience.
High-Resolution VR Images
NHK STRL is engaged in research on broadcast applications of high-resolution VR images. In the future, high-resolution VR images that are highly immersive and convey a sense of presence and reality will be enjoyed through head-mounted displays (HMDs), dome displays, and other devices, depending on the viewing style preferred by the user (Fig. 11).
In light of this, NHK STRL demonstrated high-resolution VR images on a large 180-degree cylindrical screen, with a resolution of about 12K pixels horizontally and 4K vertically, at the STRL Open House 2019 (Fig. 12). The displayed images were created by stitching together footage captured by three 8K cameras to obtain high-resolution wide-angle images. STRL is currently studying the requirements for maximizing the viewer's experience and the production technologies for future broadcast services.
Integral 3D Display with Eye-Tracking Technology
The integral 3D imaging technology that NHK STRL is researching makes it possible to view natural 3D images without special glasses. With the aim of providing a service for viewing 3D video on portable devices, STRL has built a system that tracks the viewer's eye position to adaptively display integral 3D images on a small, high-pixel-density display, thereby achieving a wider viewing zone and higher-quality 3D video (Fig. 13).
The horizontal viewing zone was expanded to about 3.3 times (81.4°) and the vertical to about 6.6 times (47.6°) that of the conventional system. The light-ray density for reproducing 3D images was doubled in both the horizontal and vertical directions, improving image quality, by using a small display with a high pixel density (457.7ppi) and a lens array with a long focal length (2.0mm). STRL will continue research and development on technologies for viewing higher-quality, more visible 3D video on portable devices as well as on large displays.
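Dividing the expanded viewing zones by the stated expansion factors recovers the approximate viewing zone of the conventional system:

```python
# Back-calculation of the conventional viewing zone from the figures above.
expanded_h, factor_h = 81.4, 3.3   # horizontal: degrees, expansion factor
expanded_v, factor_v = 47.6, 6.6   # vertical: degrees, expansion factor

conventional_h = expanded_h / factor_h
conventional_v = expanded_v / factor_v
print(f"conventional zone: about {conventional_h:.1f} x {conventional_v:.1f} degrees")
```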
Media Service Reaching Every Device
Due to the diversification of user environments, it is becoming important for content providers to deliver content to various devices (including non-display devices) through various platforms. NHK STRL is conducting research on a media service featuring various devices, the Internet, and other platforms working together. As part of this initiative, STRL has proposed Content-Oriented IoT, a framework that enables content providers to provide content to a variety of IoT devices (Fig. 14). This framework will give users greater opportunities to obtain information from media services. It will also provide users with new experiences as they utilize the features of each device. The proposal includes a Content Description feature, which describes the content being provided (such as titles, content locations, media types, and recommendation information) and is used when the content is presented.
The Content Description makes it possible for a device to decide which type of content to present according to the user environment. For example, a mirror might display web content while a watch provides audio commentary. The Content Description also enables multiple devices to be synchronized with TV programs to enhance the user experience; for example, the lighting color can change with the scene of a TV program, and a cleaning robot can go silent while important scenes are shown.
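A minimal sketch of how such a description might drive device-side selection is shown below; all field names and the selection logic are hypothetical illustrations, not taken from the actual Content-Oriented IoT framework:

```python
# Hypothetical Content Description: the provider lists alternative media
# representations of one piece of content; each device picks what it can present.
content_description = {
    "title": "Evening News",
    "media": {
        "video": "https://example.com/news.mp4",
        "audio_commentary": "https://example.com/news-audio.mp3",
        "web": "https://example.com/news.html",
    },
    "recommendation": ["display", "speaker"],
}

def select_presentation(device_capabilities, description):
    """Return the first media type (and its location) the device can present."""
    preference = ["video", "web", "audio_commentary"]
    for media_type in preference:
        if media_type in device_capabilities and media_type in description["media"]:
            return media_type, description["media"][media_type]
    return None, None

# A mirror with a web display renders web content; a watch falls back to audio.
print(select_presentation({"web"}, content_description))
print(select_presentation({"audio_commentary"}, content_description))
```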
STRL is contributing to the W3C Web of Things Interest Group*4 by considering interoperability with related standards.