
On Monday, The New York Times published its first augmented reality feature. The article, written by John Branch, includes four AR moments and is a preview of the Winter Olympics in Pyeongchang, South Korea.
Readers can meet world-class Olympic competitors — the figure skater Nathan Chen, the big-air snowboarder Anna Gasser, the short-track speed skater J.R. Celski and the hockey goalie Alex Rigsby — midperformance. Through your phone, the room around you looks just as it is, except the athlete is in it with you.

Augmented reality allows us to bridge the digital and physical worlds; graphical elements can be superimposed on your immediate environment. The Olympics project — a major collaboration among the newsroom, design and product staffs that I led, as The Times’s director of immersive platforms — demonstrates one of AR’s richest benefits: deepening the explanatory value of visual journalism. Scale, for example, is incredibly difficult to represent on your phone screen. By conjuring athletes as if they were in the room, scale is conveyed by the context of your surroundings.
Another advantage is the mode of interaction we provide. Instead of the abstractions of pinch-to-zoom or swipe or click, we simply ask readers to treat the graphic as a physical object. If you want to see the form from another angle, you simply walk around to that area. If you want to see something up close, simply lean in to that spot. News becomes something you can see, literally, from all sides.

Bringing the four Olympians into augmented reality required finding a technique to capture them not just photographically, but also three-dimensionally, creating a photo-real scan that can then be viewed from any angle.

We asked each athlete to demonstrate his or her form at specific moments. Nathan Chen held a pose showing exactly how he positions his arms tightly to his body during his quads to allow his incredible speed of rotation. Alex Rigsby showed us how she arranges her pads to best guard the net from a puck traveling at 70 miles per hour.
For the AR experience, we placed these scans into context — for example, positioning Nathan Chen 20 inches off the ground, the height he reaches midquad, based on photo reference and sometimes motion capture. In your space, this will truly be a distance of 20 inches, because everything is rendered true to scale.
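The Times has not published the code behind the experience, but a minimal sketch shows how true-to-scale placement works in a framework like Apple's ARKit, which powers this kind of experience in iOS apps. ARKit measures the world in meters, so a real measurement like 20 inches carries over directly; the class and model file name below are hypothetical, not our production code.

```swift
import ARKit
import SceneKit

// A minimal sketch of true-to-scale placement in ARKit (hypothetical
// class and model name). ARKit reports positions in meters, so a
// real-world measurement translates directly into scene coordinates.
class QuadViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        // Track the world and detect the floor so the scan can be
        // anchored to a real horizontal surface in the reader's room.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        sceneView.session.run(config)
    }

    // When ARKit detects the floor plane, place the scan on it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor,
              let scan = SCNScene(named: "chen.scn")?.rootNode else { return }
        // 20 inches = 0.508 meters. Because ARKit units are meters,
        // this offset is a true 20 inches in the reader's space.
        scan.position = SCNVector3(0, 0.508, 0)
        node.addChildNode(scan)
    }
}
```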

We also programmed interactivity into the graphics to help you discover more as you walk around. When you crouch down to the level of the speed skater J.R. Celski as he makes his turn, for instance, we highlight the angle he needs to maintain his speed and trajectory.
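A sketch of how that kind of position-triggered detail might be wired up, again in ARKit terms: compare the phone's height against the floor every frame, and reveal the annotation once the reader crouches low enough. The height threshold and node names are illustrative, not our production code.

```swift
import ARKit
import SceneKit

// Sketch of position-triggered interactivity (hypothetical names and
// threshold): an annotation appears only when the reader crouches.
class CelskiViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let floorNode = SCNNode()           // assume parented to the detected floor plane
    let angleHighlightNode = SCNNode()  // assume this holds the lean-angle annotation

    // Called every frame, so the check tracks the reader continuously.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let frame = sceneView.session.currentFrame else { return }
        // The last column of the camera transform is the phone's
        // position in world coordinates, in meters.
        let cameraHeight = frame.camera.transform.columns.3.y
        let floorHeight = floorNode.worldPosition.y
        // Reveal the highlight once the phone drops to within ~0.9 m
        // of the floor, roughly Celski's height in his turn.
        angleHighlightNode.isHidden = (cameraHeight - floorHeight) > 0.9
    }
}
```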

Interacting with augmented reality content, of course, is an unfamiliar notion for most people, so we had to develop an intuitive way to display information around the graphic. Just as readers know how to follow a text story on a phone because we're conditioned to read left to right and top to bottom, we organized information around the way one naturally explores a space: by walking around inside it.

As you circle the athlete, you enter different information “zones” appropriate to the vantage point from which you are viewing them. As you enter a zone — say, looking closely at J.R. Celski’s hand on the ice — you feel a subtle haptic cue on the phone and are presented with a bit of information: in this case, the plastic caps on his fingers that protect them as he leans on the ice for support at high speed.
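A rough sketch of how zone logic like this can be built, with hypothetical names and a hypothetical four-zone split: compute the reader's angle around the athlete each frame, map it to a zone, and fire the haptic cue when the zone changes.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch of information "zones" with a haptic cue on entry
// (the four-zone split and helper names are hypothetical).
class ZoneViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let athleteNode = SCNNode()  // assume this holds the placed scan
    let haptics = UIImpactFeedbackGenerator(style: .light)
    var currentZone = -1

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let frame = sceneView.session.currentFrame else { return }
        let camera = frame.camera.transform.columns.3
        let athlete = athleteNode.worldPosition
        // The reader's angle around the athlete, measured on the floor plane.
        let angle = atan2(camera.z - athlete.z, camera.x - athlete.x)
        // Divide the circle into four 90-degree zones.
        let zone = Int((angle + .pi) / (.pi / 2)) % 4
        if zone != currentZone {
            currentZone = zone
            haptics.impactOccurred()   // the subtle physical cue
            showAnnotation(for: zone)  // e.g., the finger caps up close
        }
    }

    func showAnnotation(for zone: Int) {
        // Swap in the text and highlights for this vantage point.
    }
}
```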

Launching John Branch’s article was not just about this single piece of journalism. It was also about exploring what visual journalism may look like in the near future. We are extending stories beyond the inches of a screen — and in so doing, envisioning a world in which devices begin to disappear and the spaces around us become the information surfaces themselves.