Particle Collisions at the LHC

Introduction & description of the data

Spacepoints

Barrel

Track

Semiconductor needed the different types of raw data as point objects in 3D Studio Max. They anticipated that 10 TB of data would be required to complete the new Audemars Piguet commission, which would premiere at Art Basel 2018.
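At its simplest, the spacepoint data is a list of 3D coordinates to be instanced as point objects. A minimal sketch of loading such data is below; the whitespace-separated `x y z` column layout is an assumption, as the real ATLAS exports may carry extra columns.

```python
def load_spacepoints(lines):
    """Parse whitespace-separated x y z rows into 3D point tuples.

    The exact column layout is an assumption; real files may carry
    extra columns (detector IDs, timestamps) after the coordinates.
    """
    points = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment rows
        x, y, z = map(float, line.split()[:3])
        points.append((x, y, z))
    return points


sample = [
    "# x y z",
    "12.5 -3.0 104.2",
    "13.1 -2.8 108.9",
]
points = load_spacepoints(sample)
```

In the actual pipeline, each tuple would then be handed to 3ds Max (e.g. via the MaxPlus API) to create a point helper at that position.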

MaxScript

MaxScript is a C-like scripting language for 3D Studio with a badly outdated coding interface. While useful as a sketchpad, it isn't fit for pipeline work. But, like many other artists, Semiconductor use MaxScript for small scripts in their day-to-day work, so we needed to keep the generative code legible so that they could make changes.

Since 2014, 3D Studio has had a Python API, and after some investigation we thought the MaxPlus implementation could be a good fit for the pipeline.

Escaping the 3D Studio Editor

But we didn't want to write Python in the 3D Studio code editor, which turned out to be the same as the MaxScript one. Modern code editors feature all kinds of tooling to make coding simpler, and we use them every day. Why work like it's still the 1990s?

Many years earlier, a package called External MaxScript IDE had opened up 3DS to external editors, and we were pleased to find that some bright spark had already updated the code to work with a modern editor. Unfortunately that version didn't work for us, so we switched to a more recent implementation. After fixing a couple of problems with that, we had a working connection between PyCharm, our Python editor, and 3D Studio.

Pipeline

With our editor sub-pipeline complete, we developed the pipeline itself in a couple of days. It came with a simple config file covering data types, tasks and subtasks, giving fine-grained control over generation, postprocessing and rendering of the ATLAS data in 3D Studio Max.
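A config like that can be handled with Python's standard-library configparser. The section and key names below are illustrative only, not the project's actual schema:

```python
import configparser

# Hypothetical config layout; the real file's sections and keys
# are not documented here, so these names are placeholders.
CONFIG_TEXT = """
[data]
types = spacepoints, tracks

[tasks]
generate = yes
postprocess = yes
render = no
"""

config = configparser.ConfigParser()
config.read_string(CONFIG_TEXT)

# Pull out the data types to process and the per-task switches.
data_types = [t.strip() for t in config["data"]["types"].split(",")]
do_render = config.getboolean("tasks", "render")
```

Keeping the whole run description in one flat file like this means the artists can toggle tasks without touching the Python code.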

PyMidi

After all of the scientific data had been converted and regenerated inside 3DS Max, it became evident that the physical events on the sculpture needed to be synced to the visual events. The sculpture contains 384 Arduinos driving hammers that strike vertically strung wires, much like a piano. As the particles in the 360° video projection crossed the wires, the hammers needed to strike them.

We chose MIDI as the protocol to drive the hammers, so each camera tracking particles through the 3D space needed its trajectory exported as MIDI events (notes in time) that would activate the Arduino assigned to that note.

Since we already had our Python pipeline set up, we could also make use of the wide array of open-source libraries in the Python eco-system. Of these, the most useful was midiutil, a pure Python library that allows the writing of multi-track Musical Instrument Digital Interface (MIDI) files.

So our code really only needed to iterate over the camera nodes in 3DS, then trace along each camera's track, outputting keyframes as MIDI notes on a separate track per camera. The resulting multi-track MIDI file then contained a synchronised representation of all cameras crossing the strings at defined points. As each crossing was a MIDI note in the file, each Arduino could be assigned to a note and thus be triggered as the file was played.
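The core of that tracing step reduces to finding when a camera's sampled position passes each wire. A pure-Python sketch of the crossing logic is below; the sample values and the assumption of linear interpolation between keyframes are illustrative, not taken from the project's code:

```python
def crossing_times(times, xs, wire_x):
    """Return interpolated times at which a sampled trajectory crosses wire_x.

    times, xs: parallel lists of keyframe times and x-positions.
    Linear interpolation between samples is assumed here.
    """
    hits = []
    for (t0, x0), (t1, x1) in zip(zip(times, xs), zip(times[1:], xs[1:])):
        if (x0 - wire_x) * (x1 - wire_x) < 0:  # sign change: crossed the wire
            frac = (wire_x - x0) / (x1 - x0)
            hits.append(t0 + frac * (t1 - t0))
        elif x0 == wire_x:  # sample sits exactly on the wire
            hits.append(t0)
    return hits


# A camera moving left to right past a wire at x = 5.0:
hits = crossing_times([0, 1, 2], [0.0, 4.0, 8.0], 5.0)  # -> [1.25]
```

Each returned time would then become a MIDI note on that camera's track, with the note number identifying the wire (and hence the Arduino) that was crossed.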

It was a truly concise and elegant solution, using existing, stable technologies to provide an easy-to-comprehend bridge between the diverse knowledge sets of the artists, instrument makers, programmers, and electronics engineers working to deliver the project.


See more at semiconductorfilms.com/art/halo