The video was created by filmmaker and director Petr Salaba with the help of advanced artificial-intelligence tools, specifically Gen-2 and Midjourney 5.2. This innovative combination of musicianship and artificial intelligence represents a new step in the world of video production.
Petr Salaba, known for his experiments with film and technology, used AI to create a visual accompaniment that follows the rhythm and atmosphere of the song "At the Belvedere Hotel". The result is a hypnotic experience that reaches the viewer in a new, unexpected way.
Band member David Hrbek says: "After the classically animated video for Doctor Foltýn Is Coming and the lyric video for the song Podzim, which Honza Ponorka Ponocný produced for us, we wanted to try something new. And given the nature of the song, which is about longing and emotional unfulfillment, we thought it would be interesting to see how artificial intelligence, which we as laymen know almost nothing about, would relate to this topic. And that's how we got to Petr, who took on the task."
The author of the video, Petr Salaba, adds: "I chose a noir atmosphere of rain, shadows and neon not only because it suits the song, but because it is something image-synthesis systems handle surprisingly well. A tool built to simulate general physical phenomena here becomes a vehicle for emotion, partly because these optical principles recur reliably in the training data on which the artificial intelligence is based. Although our neon-noir atmosphere is sensual, it is in itself a digital stand-in for classic motifs that have appeared in cinema for over 80 years. The lyrics of the song say nothing explicit about new technologies, but there is a strong motif of separation and an inner longing for contact with another person, so the theme of virtual reality quickly came to mind. It can bittersweetly fulfill every wish, yet at the same time it chillingly exposes a certain essential emptiness that it places before us. I also enjoy the resulting synthetic anachronism, from the sexy chiaroscuro of the 1940s, through decadent eighties colors, to the punchline with contemporary VR goggles. Technologically, I used a combination of Stable Diffusion, Midjourney and Gen-2, tools that synthesize a static image from a text prompt and then set it in motion with a high degree of randomness. I did not use any specific visual template or classic animation techniques. My just-completed half-hour film Scalespace, about how genetics, culture and technology connect harmoniously across different time scales, was created in a similar way. The film premiered at the Ji.hlava International Documentary Film Festival at the end of October and will then be screened in special distribution in cinemas and in other cities."
The band SO WHAT? became famous for its unique musical style, and this experiment with artificial intelligence is another step toward creativity and originality. "At the Belvedere Hotel" has already reached listeners with its addictive sound, and now, with an AI-generated visual accompaniment, the song takes on an even deeper dimension.
The video will be available on all major music platforms, and fans can enjoy a unique fusion of music and artificial intelligence. SO WHAT? shows once again that art has no boundaries and can still surprise us.