
    Toward Cyber-physical Interaction for Natural Connection of Real Space and Cyberspace

    NTT Service Evolution Laboratories

    *The names of the laboratories mentioned in the article may have changed since the time of writing/interview.

    Feature Articles: Media Robotics as the Boundary Connecting Real Space and Cyberspace, NTT Technical Review, March 2021.

    Shigekuni Kondo, Atsushi Sagata, Kenichi Minami, and Akihito Akutsu

    1. Introduction

    By fusing real space and cyberspace through the Innovative Optical and Wireless Network (IOWN) [1], we can expect more precise simulations that enable better predictions, thus expanding the range of human activities. In such an environment, where real space and cyberspace are fused and our lives are fundamentally transformed, it will be possible for anyone, regardless of their information and communication technology (ICT) literacy, to benefit from such predictions. To achieve this, we believe it is essential to have natural cyber-physical interaction, that is, a natural means of presenting information that blends our daily activities with the environment.

    The Feature Articles in this issue introduce the most recent trends and technologies that concern cyber-physical interaction at the boundary between real space and cyberspace.

    2. Overview of R&D on cyber-physical interaction

    Research and development (R&D) on the fusion of real space and cyberspace is already progressing to the stage of practical application in many areas, and various types of content have been produced. In computer gaming, for example, game characters are displayed on smartphone screens superimposed on the real space in front of the user, creating the illusion that the characters exist in real space. There are also many games that immerse the user in game worlds. In sports, for example, there are online bicycle races.

    In the future, it will be possible for people to jump into a virtual world (full-dive) and interact with real space through cyberspace. For example, people will be able to share a realistic sense of a place with others even if no one is actually there, or provide realistic support to people at a distant location, amplifying human knowledge and using human abilities to the maximum extent.

    We believe that the user interface will play an even more important role than before in a future where real space and cyberspace are tightly coupled. Achieving natural integration of people and the environment through natural information presentation, that is, natural cyber-physical interaction, requires technology for presenting and inputting information in ways that do not interfere with human activities, new interaction technology that uses haptic sensory effects, technology for making the utmost use of human motor functions, and other such technologies. Cyber-physical interaction expands and develops individual environments by connecting multiple environments and exchanging between them not only objective information, such as efficiency, quality, and cost, but also subjective information, such as well-being. A core technology for such interaction is control of perception and cognition.

    NTT laboratories will expand R&D on perception and cognition and focus on R&D in the field of cybernetics grounded in physiology.

    3. Current work on cyber-physical interaction

    The following Feature Articles in this issue introduce control technology for perception and cognition, a core technology for cyber-physical interaction that NTT laboratories are currently researching.

    "Improving Depth-map Accuracy by Integrating Depth Estimation with Image Segmentation" [2] introduces a system called HiddenStereo that enables natural three-dimensional (3D) viewing from monocular 2D images. This system is implemented by combining technology for improving the accuracy of depth maps, which represent the distance from the camera to each pixel in the image, and division of the image into regions.

    "Affect-perception Control for Enhancing a Sense of Togetherness for Remote Spectators" [3] introduces elemental technology for going beyond simply transmitting and reproducing a sense of presence at an event venue to be experienced by remote viewers. This technology also captures the emotional responses (emotional actions) of the remote audience and creates a feeling of shared togetherness, interaction, and excitement through emotional feedback.

    "Visible-light Planar Lightwave Circuit Technology and Integrated Laser-light-source Module for Smart Glasses" [4] introduces an ultra-compact RGB (red, green blue) laser-light-source module sized to fit into the temples of smart glasses. The module is implemented with an optical system that bundles light sources that produce the three primary colors of light (RGB) with a circuit that is drastically reduced in size.

    "Fine-grained Hand-posture Recognition for Natural User-interface Technologies" [5] introduces research for establishing finger-shape recognition technology to implement operation of smart glasses through hand gestures.

    "Information-display Method for Reducing Annoyance by Gaze Guidance" [6] introduces an information-display method that both reduces the user’s feeling of annoyance and increases the certainty of information access by using an imposed display technique that gives the user the feeling that the act of reading information was their own choice.

    "Presenting Material Properties with Mid-air Pseudo-haptics" [7] introduces a mid-air pseudo-haptic technology that gives the user the perception of the material properties of virtual objects as the user manipulates them through their own action.

    "Evaluation of Adaptability to Unfamiliar Environments Using Virtual Reality" [8] introduces research on creating technology for evaluating and improving the ability to adapt to the environment to achieve appropriate exercise and prevent accidents involving the elderly while walking or driving.

    4. Conclusion

    We described R&D on technology for cyber-physical interaction at the boundary between real space and cyberspace, particularly the most recent research on control of perception and cognition. We are also investigating other core technologies for cyber-physical interaction, including technology for physiological control, control of emotion and desire, communication of the five basic senses and other human sensations, communication control, and social capital infrastructure.

    Toward making IOWN a reality, NTT laboratories have been working to bring the ongoing work on perception and cognition to maturity and will continue to promote R&D on cyber-physical interaction in the field of cybernetics, aiming at an unprecedented user interface based on human proprioception that seamlessly connects bodies in cyberspace with those in real space.

    References

    [1] NTT Technology Report for Smart World 2020.
      https://www.rd.ntt/e/techreport/
    [2] M. Ono, Y. Kikuchi, T. Sano, and S. Fukatsu, "Improving Depth-map Accuracy by Integrating Depth Estimation with Image Segmentation," NTT Technical Review, Vol. 19, No. 3, pp. 22–26, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa4.html
    [3] T. Sano, M. Makiguchi, H. Nagata, and H. Seshimo, "Affect-perception Control for Enhancing a Sense of Togetherness for Remote Spectators," NTT Technical Review, Vol. 19, No. 3, pp. 27–30, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa5.html
    [4] T. Hashimoto and J. Sakamoto, "Visible-light Planar Lightwave Circuit Technology and Integrated Laser-light-source Module for Smart Glasses," NTT Technical Review, Vol. 19, No. 3, pp. 31–36, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa6.html
    [5] Y. Kubo, "Fine-grained Hand-posture Recognition for Natural User-interface Technologies," NTT Technical Review, Vol. 19, No. 3, pp. 37–39, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa7.html
    [6] R. Saijo, T. Sato, S. Eitoku, and M. Watanabe, "Information-display Method for Reducing Annoyance by Gaze Guidance," NTT Technical Review, Vol. 19, No. 3, pp. 40–44, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa8.html
    [7] T. Kawabe, "Presenting Material Properties with Mid-air Pseudo-haptics," NTT Technical Review, Vol. 19, No. 3, pp. 45–48, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa9.html
    [8] T. Isezaki and T. Watanabe, "Evaluation of Adaptability to Unfamiliar Environments Using Virtual Reality," NTT Technical Review, Vol. 19, No. 3, pp. 49–52, Mar. 2021.
      https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa10.html
