Audio: 'Homeward Blues' - AfroLektronik
https://soundcloud.com/afrolektronik/homeward-blues
Untitled
Performance + Tech [Motion Sketchbook]
Using: EbSynth - ebsynth.com
Track: My Flash On You by Love
'Mr Scoff' - Mini sketch 01
Performance + Tech [Motion Sketchbook]
Using: EbSynth - ebsynth.com
Untitled test
Performance + Tech [Motion Sketchbook]
Using: EbSynth - ebsynth.com
https://rive.app/community/files/13638-25829-light-touch/
Rive - Interactive Animation
Light Touch
A simple exploration to learn Rive's design, animation and interactivity abilities.
Techniques / Creative Tech / Software
Smode, Unity, Twine, Rive, Spline, AI workflows,
Ableton Live, Isadora...
There are lots of new tools (software) to get up to speed with.
Music / Sound
Playing with Smode
Live Audio Visual software.
Linking up live sound to live visuals to play in a live setting.
I'm exploring motion graphic visualisation in combination with live performance (Music, Soundscapes)
Getting my head around the live audio/visual process.
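The core of that sound-to-visual link can be sketched as plain code, independent of any one tool. This is only an illustrative sketch, not Smode's (or any software's) actual API: measure the audio level, smooth it, and map it onto a visual parameter.

```typescript
// Root-mean-square level of one buffer of audio samples (roughly 0..1 for normalised audio).
function rms(samples: number[]): number {
  const sum = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sum / samples.length);
}

// Smooth the level between buffers so the visuals don't flicker (simple one-pole filter).
function smooth(previous: number, current: number, amount = 0.8): number {
  return previous * amount + current * (1 - amount);
}

// Map the smoothed level onto a visual parameter range, e.g. brightness 0..100.
function toBrightness(level: number, min = 0, max = 100): number {
  const clamped = Math.min(1, Math.max(0, level));
  return min + clamped * (max - min);
}
```

In a live tool, the same chain is usually built by patching an audio-analysis node into a visual property rather than writing it out like this.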
I've a decent amount of experience in motion design and animation.
I'm hopeful I can transfer those skills to live/immersive/interactive projects.
Performing live will be the really scary part.
(Lots to say about stage fright and mental health and pushing through it)
I'm actually a bit overwhelmed by what is possible in live motion software such as Smode.
I have a great deal of ideas on the horizon of how such technologies and techniques could be utilised for live storytelling, immersive arts, theatre and spaces.
Playing with Unity
Unity is not essential to my workflow as I currently see it unfolding.
I spent some time learning about assembling/lighting scenes, post-processing, setting up automatic cameras and cutscenes, Cinemachine, and volumes. The volumes in particular can be set up to act as trigger points based on the player's location in the world space. Below, I used stock assets, and it didn't take too long to understand the workflow, as I find Unity very intuitive to use.
These functions are particularly interesting for me in the live/interactive performance/experience arena.
The stumbling block came when I attempted to export the scene as a WebGL (Web 3D) build: I got nowhere with it, and nowhere with workarounds.
I can still run files locally in dedicated exhibition/experiences should the need arise.
The Unity capabilities most useful to me have been compartmentalised and may be used for later creations.
Playing with Ebsynth (+Performance tests)
I've used a timeline-scrubbing technique here, which makes this, in effect, a digital puppet that can be manually controlled. Not the most elegant solution, but a start, and the manual aspect is very connective and intuitive to 'drive'. It's fun to use, and the results, if janky, can be enough to sell the effect.
It's not about realism. It's about responsiveness, live feedback, shared experiences in spaces,
bodily integration (Active Participation) via Music / You / Moving image - In Space.
Engaging with the content in such a way is much more immersive and visceral.
There is variation and randomness in the 'scrubbed' performance, yet also agency, spontaneity and a sense of control.
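The scrubbing technique boils down to two small pieces of logic, sketched here as hypothetical TypeScript (the frame counts and easing rate are made up for illustration, not taken from any tool): map a normalised pointer position onto a playhead frame, and ease the playhead toward that target so the puppet moves a little less jankily.

```typescript
// Map a normalised pointer position (0..1) onto a frame index of a pre-rendered animation.
function scrubToFrame(pointer: number, frameCount: number): number {
  const clamped = Math.min(1, Math.max(0, pointer));
  return Math.round(clamped * (frameCount - 1));
}

// Ease the playhead toward the target frame each tick, so sudden pointer
// jumps become smooth puppet motion instead of hard cuts.
function easeToward(current: number, target: number, rate = 0.3): number {
  return current + (target - current) * rate;
}
```

Driving the playhead this way is what gives the performance its mix of randomness and control: the hand is never perfectly steady, but it is always in charge.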
I'm being quick, dirty and lazy at the moment while developing these new workflows.
It's handy, because I really do like the hand-drawn, sketchy aesthetic combined with clean lines and processing. Above, I have composited the live-action shot back in, to reveal only my eyes and mouth. I can drive (animate) my own drawings via video performance, rather than spending more time in software. This is effectively rotoscoping, and I always found that job tedious, so... there's a win for new tech.
Live + Interactive is a constant consideration when exploring Animation / Storytelling combined with Performance / Audience / Space, all working towards creating Intermedia Experiences across (inter-operable) public spaces / buildings / AR (Augmented Reality) / digital / online.
Playing with Rive
Rive can provide interactivity with animation on the web and on screen devices. I'm excited about utilising live animation that an audience can interact with.
https://rive.app/community/files/13638-25829-light-touch/
In its idle state, the eyeball tracks the user's cursor position, providing real-time interaction with animated content. When the cursor clicks on the eyeball, an animation is triggered, then it returns to its idle state. All of this plays within a web page, in the browser.
Such capability could greatly aid online learning, or make for richer, interactive experiences.