Mixed In Key Satellite
- Banned
- Topic Starter
- 484 posts since 29 Jun, 2020
- Banned
- 11467 posts since 4 Jan, 2017 from Warsaw, Poland
That's pretty clever!
One thing I couldn't find - will it also play all of the unmuted audio that's inside the Satellite in sync with the transport? I hope dragging the stems from the plugin into your own session (as shown in the video) isn't the only way?
- Banned
- Topic Starter
- 484 posts since 29 Jun, 2020
-
- KVRAF
- 1520 posts since 23 Feb, 2017
SneakyBeats wrote: ↑Mon Jan 04, 2021 5:19 pm
AFAIK it works even if you toss it on Audacity, since it just rips the audio from the channel you put it on.
Good thinking! This could be revolutionary. Maybe it would be a good idea if someone set up a thread for people to find others to collaborate with.
Signatures are so early 2000s.
- Banned
- Topic Starter
- 484 posts since 29 Jun, 2020
Kongru wrote: ↑Mon Jan 04, 2021 5:31 pm
Good thinking! This could be revolutionary. Maybe it would be a good idea if someone set up a thread for people to find others to collaborate with.
Just testing it. I can confirm that it works in Bitwig at least.
-
- KVRian
- 1253 posts since 17 Oct, 2018
Whoever figures this out and makes it super accessible will be huge. Collaboration has been thrown around for music for a while, but no one has figured it out properly yet. There are still a lot of workarounds: some use Google Drive or other services like that, some use Splice, but what we need is real-time collaboration on audio. The thing is, the DCC and game industries figured all this out years ago; why can't the audio industry? This is a good first step, though.
What we need is the same level of collaboration tools you see in development tools and DCC apps.
Like for example:
A fellow collaborator and I are working on a project together, same setup on both ends. We are both using some sort of high-quality conferencing tool with minimal latency and high-quality audio just to communicate, or even just to chat. My collaborator is recording some vocals in their DAW session. Once they are done, they just click save in their session.
I can then either do a manual pull request, or the collaboration tool will automatically push to my client. That pull will import the track, plugins and data, with the option of bouncing the audio if plugins are not available on one of the client DAWs. That can be done by the one doing the push if need be. I'm assuming some kind of ad-hoc or 1:1 encrypted connection versus a full Git-like structure, but both could possibly work. That would allow for more real-time collaboration, as you can see changes happening in your session as the other person adds or removes tracks. Of course this should be optional, so you are not seeing everything the other person is doing (that's what the conferencing tool is for), just the end result when they save, automatically or manually.
Versioning is a must. Some modern DAWs already keep track of versions for a project session.
I'm really wondering why something like Perforce or something Git-based isn't being used to build these collaboration tools.
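Nothing about a push/pull workflow like the one described above requires new technology. As a thought experiment, here is roughly what it looks like with plain Git over a folder of bounced stems; the project layout, file names and commit messages below are hypothetical, not anything Satellite actually does:

```shell
# Minimal sketch: version a DAW project folder with Git, treating bounced
# stems as the exchange format. All names here are made up for illustration.
set -e
PROJECT=$(mktemp -d)/demo_project
mkdir -p "$PROJECT/stems"
cd "$PROJECT"
git init -q .
git config user.email "collab@example.com"
git config user.name "Collaborator A"
# Collaborator A bounces a vocal take and commits it...
echo "fake-wav-data" > stems/vocals_take1.wav
git add stems/vocals_take1.wav
git commit -q -m "Add vocal take 1"
# ...and Collaborator B would simply pull from a shared remote.
# Locally we can at least inspect the version history:
git log --oneline
```

The missing piece isn't version control itself; it's a DAW that maps its session format onto something diffable so that a pull can merge tracks instead of overwriting the whole project file.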
Last edited by apoclypse on Tue Jan 05, 2021 5:07 pm, edited 1 time in total.
Studio One // Bitwig // Logic Pro X // Ableton 11 // Reason 11 // FLStudio // MPC // Force // Maschine
- Banned
- 11467 posts since 4 Jan, 2017 from Warsaw, Poland
You mean Satellite doesn't work?
As far as I can tell what they do is:
- you put a hub plugin on some track
- you put "recorder" plugins (satellites) at the ends of other tracks you have - audio, MIDI, whatever
- satellites record whatever is coming into them & transfer the audio to the hub, where it gets synced appropriately to the transport
- anyone on the net with whom you've shared the session will see & hear your uploads in their hub, synced to their transport
- now they can add their own tracks in their own DAW, feed them to their satellites and on to the hub, so that you can also see & hear their tracks synced to your transport in your session
I think. Or am I completely off base here?
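The hub/satellite flow above can be sketched as a tiny simulation, just to make the data flow concrete. The class and method names here are mine, not anything from the actual plugin:

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    track: str
    start_beat: float   # transport position where the clip begins
    audio: bytes        # recorded audio payload

@dataclass
class Hub:
    """Collects clips from local satellites and remote collaborators."""
    clips: list = field(default_factory=list)

    def receive(self, clip: Clip):
        self.clips.append(clip)

    def playing_at(self, beat: float, length_beats: float = 4.0):
        # Return the clips overlapping the given transport position,
        # i.e. what would be heard in sync with the host transport.
        return [c for c in self.clips
                if c.start_beat <= beat < c.start_beat + length_beats]

@dataclass
class Satellite:
    """Sits at the end of a track and forwards recorded audio to the hub."""
    track: str
    hub: Hub

    def record(self, start_beat: float, audio: bytes):
        self.hub.receive(Clip(self.track, start_beat, audio))

hub = Hub()
Satellite("drums", hub).record(0.0, b"...")
Satellite("vocals", hub).record(8.0, b"...")
# At beat 1 only the drum clip overlaps the transport position
print([c.track for c in hub.playing_at(1.0)])   # -> ['drums']
```

The key point the sketch illustrates is that the hub owns timeline placement, so every collaborator's hub can resolve the same clips against its own local transport.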
-
- KVRian
- 1253 posts since 17 Oct, 2018
That's not ideal, imo. It requires too much setup and manual work, and this should be done at the DAW level, not via a plugin you need to keep manually recording into. Like I said, these are things the DCC, developer, and gaming industries figured out a long time ago. Most DCC apps have some sort of version control or Git system built right in, or a plugin that integrates it right into the app. All you need to do is save your file and do a push/pull request on the clients.
antic604 wrote: ↑Tue Jan 05, 2021 4:52 pm
You mean Satellite doesn't work?
As far as I can tell what they do is:
- you put a hub plugin on some track
- you put "recorder" plugins (satellites) at the ends of other tracks you have - audio, midi, whatever
- satellites record whatever is coming into them & transfer the audio to the hub, which gets synced appropriately to transport
- anyone on the net to whom you shared the session, will see & hear your uploads in their hub, synced to their transport
- now they can add their own tracks in their own DAW, feed them to their satellites and transport to hub, so that you can also see & hear their tracks synced to your transport in your session
I think. Or am I completely off base here?
In most game engines the developer, modeler, shader writer, animator and FX artist are all working on the same project at the same time. They can literally drop their files into the project, or work on it directly, and changes will update in almost real time. The animator can see the model update as they're animating the object; the texture/shader developer could be updating the grass texture while you work on the terrain. These are huge datasets (bigger than anything happening at the DAW level per track), and a lot of game companies are doing this remotely now. Moon Studios (Ori and the Blind Forest, Ori and the Will of the Wisps) is a fully remote game studio (they use Unity, which has great versioning tool support). The technology is not new; this stuff has been around for a while. It just hasn't made it to the audio industry yet.
- Banned
- 11467 posts since 4 Jan, 2017 from Warsaw, Poland
Yeah, well. Hard to argue with that. But as an independent effort, working in any DAW and OS without needing any overhaul of audio standards, it's still quite an impressive first step.
-
- KVRian
- 1253 posts since 17 Oct, 2018
Yep. I'm glad people are starting to think about this in the audio world now. I'm just surprised it hasn't been addressed before. I know Bitwig is supposed to be working on something in this regard, and I'm interested to know what direction they take with it.
- KVRAF
- 5506 posts since 23 Aug, 2014 from Boston/Cambridge
Version 2.0
Compatibility
Ableton, Logic Pro X, FL Studio, Cubase, Studio One, Pro Tools, Reason, Reaper, Bitwig, GarageBand and Digital Performer
Pricing and Availability
Satellite Plugins 2.0 is free, available from:
https://mixedinkey.com/satellite/
-
- KVRist
- 39 posts since 12 May, 2016
I appreciate the effort of this plugin and I like how they have tried to make it work across DAWs easily.
However, the few times we tried it, my friend and I really struggled to enjoy it. Being forced to record sounds in gets annoying quickly, especially if you are working on a long loop or stem in your DAW and don't want to 'record it in' after all that work. It's also a step that often didn't work, and I would have to re-record several times until it was right. After having to do this with many stems/loops, we got sick of it, and the spontaneity of the jam and interaction with each other got lost.
Everything is easy to do in the DAW, so why not allow drag-and-drop audio? I guess it might be difficult to implement across different DAWs.
We will continue to try it, but if it doesn't change then we are hoping something else comes up, for example the Bitwig implementation.