We were contacted by the Barcelona city council to create an interactive exhibition about the rehabilitation of dwellings. While researching similar projects, we came across the Bare Conductive website; seeing how accessible the technology was, we decided to use it.
The aim of the exhibition is to raise public awareness and provide tools that make the benefits of building renovation visible.
For this project, which consists of two interactive panels, we have used two Wall Kits.
The first tests were made with three different materials: a corrugated cardboard panel (the material of the final structure), a solid wood board, and a plain framing cardboard. We followed the Interactive Projection Mapping Installation tutorial on the Bare Conductive website and attached the Electrode Pads to the back of the panel, centered on the squares of copper tape, without screwing them to the front side.

Even though we had not yet figured out how to make the sensors work by proximity rather than by touching the points, we decided to start testing the projection. To do this, we had to update the Touch Board's firmware using the Arduino IDE and the Touch Board installer. It may seem intimidating to work with code, but the process is actually quite simple: we followed a mini tutorial and, after that, we were able to upload the MIDI sketch.
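For anyone curious what that code side looks like: reading the twelve electrodes with Bare Conductive's MPR121 library takes only a few lines. The sketch below is a simplified illustration, not their actual MIDI sketch; the 0x5C I2C address and interrupt pin 4 are the values used in their standard Touch Board examples.

```cpp
#include <Wire.h>
#include <MPR121.h>   // Bare Conductive's MPR121 library

void setup() {
  Serial.begin(57600);
  Wire.begin();
  // 0x5C is the MPR121's I2C address on the Touch Board
  if (!MPR121.begin(0x5C)) {
    Serial.println("error setting up MPR121");
    while (true);     // halt if the touch controller is not found
  }
  MPR121.setInterruptPin(4);  // the Touch Board's MPR121 interrupt pin
}

void loop() {
  MPR121.updateTouchData();
  for (int i = 0; i < 12; i++) {
    if (MPR121.isNewTouch(i)) {
      Serial.print("electrode ");
      Serial.print(i);
      Serial.println(" was touched");
    }
  }
}
```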
Understanding the sensors:
It is worth pointing out that once the MIDI sketch is uploaded to the board, it no longer plays sounds from the microSD card; to have both, the sounds would have to be integrated into the animations. So, since we have two different kinds of interactivity, we left one board with the original MP3 setup and changed the other to MIDI. To avoid mixing them up, we labeled each board: an M for the board with the MIDI configuration and an S for the board that only plays sound.
We also needed a video mapping program; in this case, we used MadMapper. Once everything was set up, we tried to make the proximity sensors work, but it was not going well. We watched some more tutorials and found out that we could better visualize the sensors' sensitivity by using the Grapher.

With the Grapher, everything became much clearer. We were finally able to check whether the sensors were actually working, which had been very difficult to tell before. So we downloaded Processing and updated the code on the Touch Board. The bar graph has an option to show all the sensors together with their reactivity. When we touched the copper tape on the back, the bar dropped to almost a third of its size and turned white; when we only brought a hand closer, the bar turned white but barely dropped at all.
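What the Grapher is plotting, as far as we understand it, is the MPR121's filtered capacitance reading against its slowly adapting baseline: a touch pulls the filtered value far below the baseline, while a nearby hand pulls it down only slightly, which is exactly what we saw in the bars. Here is a rough board-side sketch of how those numbers can be read out over serial (our simplified version, not the exact datastream sketch Bare Conductive provides):

```cpp
#include <Wire.h>
#include <MPR121.h>

void setup() {
  Serial.begin(57600);
  Wire.begin();
  MPR121.begin(0x5C);   // default MPR121 address on the Touch Board
}

void loop() {
  MPR121.updateAll();   // refresh touch, filtered, and baseline data
  for (int i = 0; i < 12; i++) {
    // a big (baseline - filtered) gap means a touch;
    // a small gap means a hand is merely near the electrode
    Serial.print(MPR121.getBaselineData(i) - MPR121.getFilteredData(i));
    Serial.print(i < 11 ? '\t' : '\n');
  }
  delay(50);
}
```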
We ran multiple tests with the copper tape and could not understand why the sensors did not respond to proximity. We tried to control the reactivity by changing the bar limits in the Grapher, but that did not work. We then thought that, by modifying the Arduino code, we could change the sensitivity parameters. We looked for the input and output MIN/MAX values and widened the distance between them: we set the input max to 640, because that is how far down the white column went, and the input min to 280, so that the range between them was sufficiently large. When we reset the board and brought a hand close to where the sensor was, we were pleasantly surprised: the proximity sensor was working. We then changed the parameters of all the sensors and found that they, too, worked at a distance.
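In other words, the sketch maps each electrode's reading from an input range onto an output range, and widening the input range is what lets the small proximity-induced drop produce a usable value. As a hedged illustration of the idea (the function and variable names here are ours, not necessarily the ones in Bare Conductive's midi_interface_generic sketch):

```cpp
// Illustration of the input-range change that unlocked proximity for us.
const int inputMin  = 280;  // lowered so the input range is wide enough
const int inputMax  = 640;  // how far down the bar went on a full touch
const int outputMin = 0;    // MIDI values span 0..127
const int outputMax = 127;

int toMidiValue(int filteredReading) {
  int clamped = constrain(filteredReading, inputMin, inputMax);
  // lower reading = hand closer; invert so "closer" gives a higher value
  return map(clamped, inputMin, inputMax, outputMax, outputMin);
}
```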
After a long design process, we were ready to print the panels and assemble them into two big 3 x 2 m canvases.

The first board to be assembled was the sound board. Using a flashlight, we mapped out the places where the sensors should go, so we could see where to stick the copper tape from the inside. Once all the copper squares were glued over the marked Xs, it was time to attach the electrodes to the back of the panel.
The electrodes and the Touch Board were attached to the back of the panels with small screws and connected to their respective electrode outputs on the board. For organization purposes, we used the numbers that came with the kit to identify every connection, plus a Post-it note recording which sensor each one was connected to.

After this, we took the microSD card from the sound board and read it on our computer with the small USB adapter that also comes with the kit. We renamed the project's voice-over files, transferred them to the microSD card, and put it back in the board. With that settled, and the sensors in place and connected, we plugged the board into a power bank and the speaker. Everything worked fine with the sound board, including the proximity sensors.
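A note for anyone repeating this: with the stock Touch Board MP3 sketch, the file names are what link sounds to sensors. As far as we know, it expects TRACK000.mp3 through TRACK011.mp3, with TRACK000.mp3 playing when electrode E0 is triggered, TRACK001.mp3 for E1, and so on; so "renaming" the voice-overs means giving each one the track number of its electrode.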
Now the biggest challenge: to make the panel work with the projector.
Here, the first task was to line up the projection of the animation with the panel; behind it sit the squares of copper tape and the electrodes wired to the board. As this panel has to send signals to the computer, we bought a small computer, which for the setup is connected to a large screen but which, for the exhibition, could drive a much smaller screen or even an iPad. Since this small machine does not have the same capacity as the computers where we created the animations, the alpha format did not work: the computer could not handle running all the videos with alpha at the same time and crashed, making it impossible to work. So we decided to use mp4 instead, and it really was the solution: the files were about 10x lighter, and the computer and MadMapper worked perfectly again.
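If you need to do the same conversion, one common route (there are many, and yours may differ) is ffmpeg with its standard H.264 settings, for example `ffmpeg -i animation.mov -c:v libx264 -pix_fmt yuv420p animation.mp4`, which drops the alpha channel and produces a much lighter file.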

The second challenge was to figure out how to switch animations depending on which sensor is triggered. In the tutorial, each animation had an alpha channel and its own space, so the control list could be used to activate each one. In our case, the animations were activated, but they sat one under the other, and since everything was now mp4, with no transparency, you could not see anything and it looked like it was not working.
So, to make sure that each animation replaces the previous one, instead of using the control list, we selected the scenes/cues option from the drop-down menu and configured each scene so that, when its sensor is activated, the entire scene replaces the previous one. To activate each animation, we first had to adjust the projection, then create a scene for each animation, and then, for each scene, select the "edit midi controls" option and link each sensor touch to the activation of the corresponding scene.
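Conceptually, what makes this mapping possible is that each electrode sends its own distinct MIDI event, which MadMapper's MIDI-learn can then bind to a scene. As a rough sketch of the idea, using the standard Arduino MIDIUSB library (Bare Conductive's MIDI sketch handles this for you and may do it differently internally; the note numbers here are our own choice):

```cpp
#include <Wire.h>
#include <MPR121.h>
#include <MIDIUSB.h>   // standard Arduino USB-MIDI library

const uint8_t CHANNEL = 0;  // MIDI channel 1

void sendNote(uint8_t note, bool on) {
  uint8_t header = on ? 0x09 : 0x08;              // USB-MIDI event type
  uint8_t status = (on ? 0x90 : 0x80) | CHANNEL;  // note on / note off
  midiEventPacket_t evt = {header, status, note, 127};
  MidiUSB.sendMIDI(evt);
  MidiUSB.flush();
}

void setup() {
  Wire.begin();
  MPR121.begin(0x5C);
  MPR121.setInterruptPin(4);
}

void loop() {
  MPR121.updateTouchData();
  for (uint8_t i = 0; i < 12; i++) {
    // one note per electrode, so MadMapper can tell the sensors apart
    if (MPR121.isNewTouch(i))   sendNote(60 + i, true);
    if (MPR121.isNewRelease(i)) sendNote(60 + i, false);
  }
}
```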
During the Christmas holidays, something happened to the board configuration and the electrodes were being triggered all the time, which stopped us from configuring the interactivity. We noticed that one sensor's bar stayed white (triggered) all the time, so we changed its parameters both in the Grapher sketch and in the Arduino code. We opened the Arduino sketch again, opened the generic MIDI interface, and reconfigured it as we had done at the beginning: we changed the relevant false to true and set the sensor values to 640 and 280 again, reusing our earlier points. We checked that the sensors worked in the Grapher and reset the board several times. After many tests and changes in the Grapher and the Arduino code, we managed to make MadMapper recognize proximity again.
How do you make touches on the panel activate each animation in turn?
We searched a lot on the internet, but we could not find a tutorial that specifically clarified this. Using the Bare Conductive tutorials, we ran several tests until we found a solution. Our problem was that touching the sensors did activate the videos, but we could not make each newly activated video replace, rather than stack under, the others.
With the board turned on and the sensors connected to the computer, we first adjusted the layout to the projection so we could copy it across the other scenes, organizing each animation into its respective Cue column, each renamed and given its own color. To activate the sensors, we had to map the scenes by selecting each column in the scenes tab and then editing it so that the program could learn which sensor activated its respective animation. After this process, we were ready to make the final adjustments and prepare the installation for the inauguration day.
On the 3rd of March 2022, the interactive exhibition was inaugurated at CAATEEB in Barcelona, where it will stay for three months before beginning its itinerant route through other cities in Catalonia. You can check out the full project in our portfolio.
Ana F. Netto – Art Director at Framemov