
Damogen Furies creative process, DAW, software system?



Huh, strange - it was just posted to the Squarepusher FB feed, but now that post and the YouTube video are gone. Weird. It was just a better version of the NYC livestream with more camera angles and the background visuals cut in.


  On 4/24/2015 at 7:04 PM, weakmassive said:

Huh, strange - it was just posted to the Squarepusher FB feed, but now that post and the YouTube video are gone. Weird. It was just a better version of the NYC livestream with more camera angles and the background visuals cut in.

 

sounds pretty interesting! hope it resurfaces again.

  On 4/24/2015 at 3:27 PM, Ascdi said:

 

 

I'm curious in what way an Orville could be lower-latency than a Reaktor patch running on a sufficiently beefy system. You mean like if it was doing pitch-tracking to convert bass playing into MIDI data? Otherwise the buffer size is the buffer size, yes?

Yes, only for real-time processing of live input; otherwise it wouldn't really make a difference.
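To put rough numbers on "the buffer size is the buffer size", here's a quick sketch of the arithmetic for a buffered host - the sample rate and buffer sizes are just example values, nothing specific to Tom's setup:

# Rough I/O latency estimate for a buffered software host (e.g. a Reaktor patch on a soundcard).
# One buffer in + one buffer out is a common lower bound; real drivers usually add more on top.
SAMPLE_RATE = 44100  # Hz, example value

for buffer_size in (64, 128, 256, 512):
    one_way_ms = buffer_size / SAMPLE_RATE * 1000
    round_trip_ms = 2 * one_way_ms  # input buffer + output buffer
    print(f"{buffer_size:>4} samples: ~{one_way_ms:.1f} ms one-way, ~{round_trip_ms:.1f} ms in->out")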

  On 4/24/2015 at 10:36 PM, John Ehrlichman said:

So that video with all those flashes of Reaktor patches is down? That's funny.

 

Not that one... he took down a video that was posted this morning from the NYC event (that I posted those screenshots from above), but it's back now!

 

https://www.youtube.com/watch?v=e-rhK4l6Ado

Edited by weakmassive

Went and rewatched the Creators Project interview with Tom from 2011, and it shows some Reaktor/VSigFile shots at the end. The Reaktor UI has the same aqua color as the shots I put up earlier from the NYC event the other day.

 

FnqsVmJ.png
o9ZMCCA.png
cFrEKEu.png
CXBqLfc.png
N9Y1lzS.png
e4s2EMO.png
8SJstvZ.png

Edited by weakmassive

What the fack am I looking at?

 

Edit: I'm unfamiliar with Reaktor - are all those items patched together manually, or what? =/

 

Is the code shown at the bottom there a result of the patches, or the other way around?

Edited by StephenG

 


Damn, that looks like a pretty impressive and long-gestating patch - hats off to Tom. I'd love to see a raw video of what this looks like in action; not necessarily an under-the-hood explanation, just what it sounds like before going into a production.

The last two look like Orville editor screenshots, but I've never used one so I'm not 100% sure. If they are, that means my theory is partially right, and Tom is probably boss enough to have all the CC automation coming from Reaktor, so it's all controlled by that patch.

  On 4/25/2015 at 6:44 AM, John Ehrlichman said:

Damn, that looks like a pretty impressive and long-gestating patch - hats off to Tom. I'd love to see a raw video of what this looks like in action; not necessarily an under-the-hood explanation, just what it sounds like before going into a production.

 

The last two look like Orville editor screenshots, but I've never used one so I'm not 100% sure. If they are, that means my theory is partially right, and Tom is probably boss enough to have all the CC automation coming from Reaktor, so it's all controlled by that patch.

I've tried really hard to make out some of the text in the UI panels but can't read much. Yeah, I'd especially love to see how he builds a track using this system. The Q700 still seems to do most of the sequencing (in the NYC video you can see how much he's using it)... and it sounds like all the sound generation/FX are in the software. Keep in mind those screenshots are from a video from 2011, too!

 

Yeah, the last 3 screens are the Eventide editor called VSigFile.

 

  On 4/25/2015 at 5:44 AM, StephenG said:

Edit: I'm unfamiliar with Reaktor - are all those items patched together manually, or what? =/

 

Is the code shown at the bottom there a result of the patches, or the other way around?

Yeah, all patched together manually.

 

As for the code - in VSigFile, you can work in both "graphic" and "sigfile" (the text-only screen) modes. They are just different representations of the same information. I've never used it, just checked out the manual. :)
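The "two views of the same information" idea is easy to picture if you treat the patch as a small graph that can either be drawn as boxes-and-wires or dumped as text. Here's a toy Python illustration of that idea - the module names and the text format are made up and are not VSigFile's actual sigfile syntax:

# Toy illustration: a patch as a graph of modules, rendered either as structured
# data (what a "graphic" view would draw) or as a flat text dump (a "sigfile"-style view).
# Module names and the text format are invented for illustration only.

patch = {
    "modules": ["adc_in", "pitch_shift", "delay", "dac_out"],
    "connections": [("adc_in", "pitch_shift"), ("pitch_shift", "delay"), ("delay", "dac_out")],
}

def as_text(p):
    """Dump the same patch data as plain text, one line per module/connection."""
    lines = [f"module {m}" for m in p["modules"]]
    lines += [f"connect {src} -> {dst}" for src, dst in p["connections"]]
    return "\n".join(lines)

print(as_text(patch))  # same information, just a different representation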

Edited by weakmassive
  On 4/26/2015 at 4:03 AM, logakght said:

he used a banana connected to an arduino i heard from stanley kubrick

When 'gear wizardry' starts to become too literal

  On 4/26/2015 at 3:34 AM, Mesh Gear Fox said:

So these Eventide things are like Max in a rack unit? It would certainly be cool to play around with that sort of stuff without booting the computer up each time. Not enough for me to want one, but I can imagine that before PCs were powerful enough, having some gear like this would be a solid way into that world (assuming you had like 2k or something).

Yes and no - the effects algorithms on them are far higher quality than anything I've heard out of Max/MSP. They also have really good DSP chips, so real-time audio input comes out the other end effected with effectively zero latency. I believe the Orville can do this on 8 separate channels at once, and at that point no computer soundcard in the world would be able to do the same in real time without added latency.

 

From the looks of it, the Eventide VSig editor operates at a higher level than Max/MSP - it doesn't seem like you can get as deep. To me it looks more like the working level of Reaktor modules or Kyma objects.

Edited by John Ehrlichman

Found this old post on xltronic of a Go Plastic-era interview where Tom talks a lot about Reaktor and the Eventide hardware.

Too bad it's Babelfish-translated from German, so it reads like this: "The first software with which I argued correctly was reactor of native of instrument." Also, the original link is dead and not all of the interview was posted in the forum.

 

http://xltronic.com/mb/14152/squarepusher-keys-interview

Tracked down an Ufabulum-era Future Music interview, where Tom talks a bit about what his Reaktor patch does:

 

  Quote

 

Then there's the sample player which I've developed in Reaktor. I started it in 2005 and it has massive scope for real-time processing. So even if you've got the same source samples you can mangle them so you've a wide variety of how they can be used even within a single piece.

The way it works is that you hit a MIDI note, then all these rows here with bigger knobs are individual parameters that can be set and mapped into arrays, and then you hit a different key and it brings up all the values associated with that key. So each key maps to different parts of an array on each parameter.

There's a load of parameters here for distortion and when you hit the different keys you get different distortion parameters that'll be sent to the relevant section of the sample playback as the sample goes out.

There's not a lot of soul to it. It's an exercise in making the absolute most of control data.
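That "each key maps to different parts of an array on each parameter" bit boils down to something like this little Python mock-up - just an illustration of the control scheme as described, not Tom's actual patch (the parameter names and values are made up):

# Minimal sketch of per-key parameter arrays, as described in the interview:
# each incoming MIDI note index recalls its own set of values for every parameter,
# which would then be applied to the sample playback / distortion section.
# Parameter names and values here are invented for illustration.

PARAMS = ["distortion_drive", "distortion_tone", "playback_rate", "start_offset"]

# One array per parameter; position N holds the value recalled by MIDI note N.
arrays = {
    "distortion_drive": [0.1, 0.8, 0.5, 1.0],
    "distortion_tone":  [0.3, 0.2, 0.9, 0.6],
    "playback_rate":    [1.0, 0.5, 2.0, 1.5],
    "start_offset":     [0.0, 0.25, 0.5, 0.75],
}

def values_for_note(note_index):
    """Recall the snapshot of all parameter values associated with one key."""
    return {name: arrays[name][note_index] for name in PARAMS}

# Hitting "key 2" brings up all the values stored at index 2 of every array.
print(values_for_note(2))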

  On 4/28/2015 at 5:49 AM, weakmassive said:

 

Tracked down an Ufabulum-era Future Music interview, where Tom talks a bit about what his Reaktor patch does:

 

  Quote

 

Then there's the sample player which I've developed in Reaktor. I started it in 2005 and it has massive scope for real-time processing. So even if you've got the same source samples you can mangle them so you've a wide variety of how they can be used even within a single piece.

 

The way it works is that you hit a MIDI note, then all these rows here with bigger knobs are individual parameters that can be set and mapped into arrays, and then you hit a different key and it brings up all the values associated with that key. So each key maps to different parts of an array on each parameter.

 

There's a load of parameters here for distortion and when you hit the different keys you get different distortion parameters that'll be sent to the relevant section of the sample playback as the sample goes out.

 

There's not a lot of soul to it. It's an exercise in making the absolute most of control data.

 

That's a great little bit of insight into his working process. The part about having an 'array' of parameters is the same thing I was saying about how one would do this with an Eventide machine: you basically keep the machine on one main preset and instead use a Reaktor CC panel/front end with 6 or so knobs, where the 'array' of data is stored as snapshots inside Reaktor.

Using this technique is basically the only way I've managed to get anything close to the Go Plastic sound in real time... but it turns out my Eclipse can't handle more than 10 minutes of this before it hits an error and just spits the dry signal back out, and I'm not even trying to do program changes. Maybe something is wrong with mine. Reaktor manages to get the job done pretty well though; the internal effects people have made in the user library are pretty damn high quality. A couple of nice spring reverb emulations, too.
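For anyone curious what that CC panel/snapshot idea looks like outside of Reaktor, here's a bare-bones sketch using the mido Python library to fire a stored snapshot of controller values at a hardware unit. The port name, CC numbers and values are placeholders, not actual Eclipse/Orville mappings:

# Bare-bones sketch: store a few "snapshots" of CC values and send one to a
# hardware effects unit, keeping it on a single preset instead of doing program changes.
# The port name, CC numbers and values are placeholders, not real Eventide mappings.
import mido

snapshots = {
    "clean":   {20: 10, 21: 64, 22: 0},     # CC number -> value (0-127)
    "mangled": {20: 120, 21: 15, 22: 100},
}

def send_snapshot(port, name, channel=0):
    """Send every CC in the named snapshot to the connected device."""
    for cc, value in snapshots[name].items():
        port.send(mido.Message("control_change", channel=channel, control=cc, value=value))

if __name__ == "__main__":
    with mido.open_output("Eventide MIDI Out") as port:  # placeholder port name
        send_snapshot(port, "mangled")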

These visuals are pretty fun... the guys who programmed them put their names (Anton Marini, George Toldo, Zak Norman) in the text screens, along with Squarepusher references and other nonsense. Wait_Time_Coleslaw.

 

3tGDdRk.png

 

Frame by frame this video for fun:

Edited by weakmassive