Katy Perry's Super Bowl XLIX halftime show was a large-scale spectacle, and likely the most-watched halftime show in history, with 118.5 million viewers tuning in on Sunday. Projection mapping as an art form excels in these kinds of stadium-sized settings, from Ferris wheels to entire islands, but design studio Lightborne still took on a huge challenge when they agreed to animate University of Phoenix Stadium for Perry's massive performance.
If you didn't watch the halftime show, it featured three main projection-mapped sections: a perspective-bending chess board, a tropical beach party, and an explosive, chaotic finale during which Katy Perry rode through the air on a shooting star.
Katy Perry’s show director, Baz Halpin, approached the studio about doing the show in October 2014. Lightborne had previously designed tour visuals for Deadmau5, Kanye West, and Katy Perry herself, so the event didn't seem out of reach: "Doing the Super Bowl was a simple extension of the team Katy has put together over the years that she trusts," video content director Ben Nicholson tells The Creators Project. "She is very involved, very savvy and watches every frame of everything."
Alongside Super Bowl halftime show director Hamish Hamilton, producer Ricky Kirshner, production designer Bruce Rodgers, lighting designer Bob Barnhart, and server operator Jason Rudolph, Nicholson could focus on the animation process itself and have a good time while doing it. "The 'Dark Horse' perspective mapping section is obviously super fun, and represents the close collaboration with all aspects of the show," he explains. "It was fun in rehearsals to watch the playback and see which dancers 'fell' into the abyss and then respond by adjusting the video to give them places to stand."
When he wasn't rescuing dancers from the projected "abyss," Nicholson's biggest challenge was the sheer scale of the show and the limited time the team had to perfect the visuals. "With perspective mapping, if you are off in your rendered camera view the gag won't work," Nicholson explains. "The surface was huge and there is so little time to rehearse in the actual environment. Preparation and previsualization really made everything work."