Coldplay performed one-off shows at two live music events. The first was the Global Citizen Festival in Central Park, New York City, which also featured Pearl Jam, Beyoncé and Ed Sheeran, and was timed to coincide with the launch of the United Nations' new Global Goals, designed to fight inequality, protect the planet and end extreme poverty by 2030. The second was the iHeartRadio Festival at the MGM Grand Garden Arena in Las Vegas, with a line-up including Sam Smith, Jason Derulo and Duran Duran.
"The band wanted me to produce some screen treatments for these one-off shows that were fresh and new," says Miles. "Having worked with Coldplay for so many years now I have a good idea of what they like and how the lighting and stage moves will work for their classic tracks.
"The main aim was to make some real time effects that could be applied to live cameras. The main bulk of the content I designed is produced using custom generative effects in Ai. Whether the images were generated from masks, stills and clips or generated from the live camera inputs they were all fed through a combination of different Ai effects to produce real time generative content."
Miles approached London-based creative technology experts RES to produce a number of Ai effects, one of which, dubbed the 'Spirascope', created geometric patterns within Miles's specified colour palette that could be reconfigured for different screens and cued in various ways for the famous Coldplay tracks. RES supplied the Ai server rack for the two shows, comprising 2x Ai S4 servers.
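As a rough analogue of what a 'Spirascope'-style effect does, the stand-alone Python/Pillow sketch below draws a spirograph curve and cycles its strokes through a fixed colour palette. The palette, the curve parameters and the spirascope() function are invented for this example and bear no relation to RES's actual Ai patch.

```python
# Illustrative sketch: spirograph-style geometric pattern drawn from a
# fixed colour palette (all values hypothetical).
import math
from PIL import Image, ImageDraw

PALETTE = ["#1b2a6b", "#3fb8af", "#ff3d7f", "#fffce0"]   # hypothetical palette

def spirascope(size=900, R=340, r=97, d=180, steps=6000):
    img = Image.new("RGB", (size, size), "black")
    draw = ImageDraw.Draw(img)
    cx = cy = size // 2
    prev = None
    for i in range(steps):
        t = i * 0.02
        # Hypotrochoid: the classic spirograph curve
        x = cx + (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
        y = cy + (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
        if prev is not None:
            colour = PALETTE[i % len(PALETTE)]   # cycle through the palette
            draw.line([prev, (x, y)], fill=colour, width=2)
        prev = (x, y)
    return img

if __name__ == "__main__":
    spirascope().save("spirascope_sketch.png")
```

Changing the R, r and d values or the palette entries reconfigures the pattern, loosely echoing how a patch of this kind could be re-cued for different screens and tracks.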
"As I was using live camera inputs for the majority of the screen treatments it was important to me that the latency between capture and output was as low as possible," Ben continues. "The S4 has minimal latency using the Datapath input cards. The larger of the two shows, iHeart required three outputs for the various screens in the design so the S4 was the obvious choice within the Ai range."
Both servers had 2x HD-SDI inputs for the live camera feeds. Miles used input 1 to take in the house IMAG line cut and the second input to take in a FOH camera ISO, which he used for specific FX shots.
"I really like the custom effects you can build with Ai, both with the existing library and by adding new custom effects patches," says Miles. "Everything can be easily modified and built on to create infinite possibilities. I also really like the workflow of having an inbuilt 3D visualiser so you can program your show anywhere and then output the same show file in the actual show. I programmed a series of looks on the plane on the way out there, using my Macbook and laptop webcam and then loaded the adapted show file to the server rack on the show day."
(Jim Evans)