Introduction by the curators Rosa Casado and Mike Brooks: “The exhibition brings together a selection of major and influential contemporary artworks from within the fields of social robotics, bio electronics and artificial intelligence, that will together inhabit Azkuna Zentroa’s gallery, to create an evolving ecology of interactive and adaptably performative machines. With the artists who have worked with us over the past year to realise this collective of works, we have focused on how these strangely familiar yet extraordinary robots – through their behaviours, and their endeavours to adapt and function within their environment – might both reflect and respond to our own choices and experiences. Perhaps our meetings with them might open new perspectives onto the things we ourselves choose to do and not do, to make and not make happen, as together we navigate and shape the shared spaces we live in.”
I am in the middle of a new Lead Creative Schools project (I ran another one last year). Aimed at promoting a creative approach to teaching and life, the scheme allows a class in a primary school to work with a creative practitioner for approximately ten teaching days on a made-to-measure project. This time I am working with a Year 6 class (10-11 year olds), combining creative technology and outdoor activities. We code micro:bits and upcycle e-waste in the morning, and build dens and mud batteries in the afternoon: a fine balance!
December 2017. The Garage contemporary arts centre invited me to wreck Russian e-waste with local participants and build some post-apocalypse trees as part of their 8th Art Experiment season. This year's theme was Laboratories of Earthly Survival. Curator Snejana Kratseva says:
“Each winter, Garage transforms its galleries into an experimental laboratory for art. Visitors of all ages are invited to participate in hands-on experiences with artists, as well as innovative creative collaborations between peers. Art Experiment is the flagship initiative of Garage Education and Public Programs and attracts students, parents, local residents, and Moscow visitors.
This year will be the eighth annual interactive initiative, focusing on science art and survival ethics. It will consist of hands-on experiments in “hacking” the life sciences, equipping participants with skills in agricultural, biological, genetic and robo-engineering, preparing kids and adults for an imaginary future after the end of the world, and cultivating a future generation of home-grown “garage scientists” who will be able not only to generate new inventions with low-fi materials but to do so while weighing the ethical value of every new discovery.”
I knew there was little hope of getting exotic Soviet-era e-waste, and I was right. We got lots of Hewlett-Packard PCs, a crate of early-2000s Panasonic cameras and various other familiar consumer electronics.
Other artists in the show were Anastasia Potemkina, with a hydroponics installation for growing resilient, apocalypse-resistant plants such as nettles, and the collective Where Dogs Run, whose 20-odd live chicks provided the data for a vintage slide show and a great-looking electronic sculpture based on Dante's Inferno.
Art Experiment, Laboratories of Earthly Survival ran from December 19th 2017 to January 8th 2018.
I found this photo while tidying some drawers. These are two friends in robot costumes for a no-budget sci-fi pilot I shot in 1993 called Euronutrifood. They are supposed to be evil slave robots. Thanks again and respect to the ghosts in the machine: Raphaëlle Paupert-Borne and Matthieu Demouzon.
Atmosphere, Geosphere, Biosphere, Noosphere: the sphere of human thought
now criss-crossing the world in binary strings
AD DA conversions analog to digital --> to? analog? data fit for human understanding
See our mini machines moved by bacterial mud power, among a great selection of cutting-edge projects by international artists and designers, during the Digital Design Weekend at the Victoria and Albert Museum, London, 23rd and 24th September 2017.
Detailed info on our work with microbial fuel cells here.
LAUREN is a project by new media artist Lauren McCarthy. She will impersonate a home automation assistant not unlike Amazon's Alexa, responding to users' vocal commands and acting on their connected domestic environment. Project Lauren will last three days.
“Lauren will control your home for you, attempting to be better than an AI, understanding you as a person”.
I reckon it is a no-brainer for Lauren the HI (human intelligence) to be better than Alexa or Siri, the examples mentioned by the artist on the project's website.
Volunteers might feel more morally observed than they would by an artificial assistant, and may have to deal with interruptions of service due to naps and other very human breaks.
You can apply here if you are interested in hosting Lauren in your home.
— Why do I blog about that?
I have an ongoing interest in the ways machine and human roles overlap or shift: takeover, resistance, harmony, symbiosis, power, delegation, cyber-isation. Lauren is an interesting gesture that reminds us of the as-yet unique touch humans can bring to other humans, in a way machines cannot. My own Am I Robot installation works on a similar principle, injecting HI into a system normally driven by AI or simpler algorithms.
The Wizard of Oz, HI trickster, exposed by Toto's down-to-earth DI (dog intelligence)
Robot fans among the readers will know about Boston Dynamics' Spot Mini, a pretty amazing quadruped robot I mentioned a while ago. On 19th July 2017, an upgraded version was introduced to the clients of Boston Dynamics' new owner, Softbank Japan. Spot has been given an “arm” that looks as much like a neck and jaws as it does an arm.
These characteristics would make Spot Mini an ideal candidate for the Coy-B Wild Robot experiment that has been haunting me for the past ten years, with highs and lows. For reasons of reliability, cost and battery life, I had not thought of Coy-B as a legged robot. Yet something agile and uncanny like Spot Mini would be very good for the job!
The demo shows a robotic creature whose abilities, if not quite at the level of those demonstrated by the coyote that inspired Coy-B, are fluid and fast enough for a very engaging real-time interaction with a human. Loaded with a suitable set of teeth and a wild AI program, of course.
When will it be available at the robot shop around the corner?