Roland Cloud

VST, AU, AAX, CLAP, etc. Plugin Virtual Instruments Discussion

Post

Funkybot's Evil Twin wrote:
fmr wrote:You have to login periodically to verify your subscription. I think that's the Cloud Manager that's behind that.
I did until yesterday, but I no longer think that's actually the case. Let me explain: earlier in the week I tried to update the Cloud Manager, and something went wrong. It uninstalled the old one, but never installed the new one. So for a few days I wasn't running any Cloud Manager software at all. I went to use the System-8 VSTi yesterday, and was still asked to Log In to authorize and after I did, the instrument was authenticated and worked fine. So even though the Cloud Manager wasn't installed at all, the instruments still knew to ask for authorization after a few days and the process worked. I only installed the updated Cloud Manager this morning.
So maybe the protection is built into each instrument after all.

This whole business of how Roland Cloud works is still pretty obscure to me. I was under the impression that Cloud Manager had to run in the background, but according to your experience, it doesn't. :shrug:
Fernando (FMR)

Post

As I said, it's just for downloading, installing and updating the instruments. You can't download them without it. It's *only* ever manually run by me when I want to check for updates. Then it's killed.

You don't even have to use it for installing (at least on the Mac) either. The Cloud Manager just downloads a regular installer package and runs it - but you can abort this, and run an instrument installer manually if you want to.

Edit: BTW, the reason I've not used the new Cloud Manager yet is that every time (for some days now) I try to log in to the Roland Cloud website to download it, the site just crashes with .NET runtime errors... :roll:

Post

fmr wrote:Maybe, but how much power do they squeeze out of those CPUs? Are we talking about "computers" or iPads? Are we talking about a real, multitasking operating system able to run big applications concurrently, or an operating system that runs one application at a time, full screen? Are we trying to go up or to go down?
Since Roland don't really have an equivalent yet, let's take Korg Odyssey. A first-gen quad-core i7 can only manage about 12 notes on an instance before glitching under Windows. Even an old-generation iPhone 6 can manage 8 notes no problem. So it's maybe not as far off from competing as you're thinking, at least when it comes to realtime audio processing. :) In terms of multitasking, that's more about a small touch interface versus a larger desktop display with many floating windows. So more the form factor than any inherent restriction they couldn't change..

Post

beely wrote:They've always said they see them as separate things. What they *are* doing is making it easier for iOS developers to ship a version of their app that works on the Mac. It will still be a limited app compared to a "proper" Mac app, but it will likely boost Mac interest and software availability.

The fact that they aren't deprecating the Mac frameworks in favour of iOS ones, and are instead providing effectively a kind of translation layer mapping the iOS frameworks to the Mac ones, shows they aren't interested in "merging" the two platforms in the way you're suggesting.
Separate things, but let's just run these iOS apps on here anyway "to boost Mac interest"? Because running (primarily) touch-based apps on a device with no touch screen makes so much sense? ;)

iOS devs already need a Mac, so not much sales boosting there, and few are going to drop the money on a laptop to run their iOS apps when a $300 iPad is so much cheaper. I guess someone could figure some sort of mobile/desktop integration justification, I mean if they really had to find a reason..

Good catch on "merge", btw. I said "transition" in the very next sentence, and that's what I actually meant. So, merge only in the PowerPC/Intel sense of having things work alongside each other for a limited time. I do think the destination is iOS only. Of course, when I say iOS, I would expect them to accommodate specific changes for the desktop form factor (floating windows etc.).

And, no, even if this speculation is correct, they're not just going to suddenly kill stuff (frameworks etc.).
The developer roadmap is clear that they are not (at least for the next five years or so) forcing Macs to only use software downloaded from the App Store. It would be a pretty stupid move. What they *are* trying to do is offer more customer protection and security, and I don't think it's a bad thing.
Stupid for whom? How much does it cost Apple? Is this the same company that glues batteries into their systems, turning a simple $20 keyboard replacement into a $700 repair? You know they glue them in so customers don't use those cheap Chinese batteries, right? It's a safety issue! They're only doing it to protect you. No other reasons. ;)
I don't personally agree with this opinion or view of the near-term future.
Neither do I, unless near-term means (at the very least) 5 years out. ARM laptops supposedly arrive in 2020 (per the Bloomberg report), and that will only be the start of a long transition process.
In general a lot of people's views on the future involving this stuff tend to be biased by how they perceive Apple (including my own views, of course.) If you see them as evil (or whatever), you only expect evil things to happen... :P
Well, a lot of how people perceive things is based on past and current actions. That's perhaps a better measure than "feelings". Apple's never shown any tendencies towards control, have they? That's why people are free to install any software they want on iOS, as long as it's from the App Store, which Apple must approve and will take 30% of the money from. No reason at all to think they'll try to do exactly the same on desktop computers if they want to transition them to their own ARM solution. Hehe..

Anyway, people's eyes need to be open when it comes to dropping lots of cash on software. Whatever happens, there are some rocky roads ahead, methinks..

Post

I'm not gonna do a point by point, as it will be a thread diversion and no one needs that! ;)

All I'll say is that from the moment iOS took over as Apple's main focus and source of revenue, people feared that they would apply the same things that make sense on the iOS platform to the Mac.

And ten years on, the Mac is still a separate thing: it doesn't have touch bolted on, it can still run all the software you want (including Unix utilities), and it's much the same as it ever was, with the necessary changes, improvements, and an increase in security (not that all of these things are always brilliant; there have been missteps along the way, as Apple often makes). They haven't so far shown any desire to "iOS-ify" Mac OS, and their messaging has been very clear and consistent that they regard the two things as distinct and for different purposes, and I don't see them going back on that (at least near-term). They *have* worked hard at integrating the two systems better, to make better sense and have consistency between them, which is pretty much a good thing.

I don't worry about iOS-ification of Macs at all, and I think the people who have worried over the past years should probably realise their fears aren't materialising, and maybe start worrying about something else... ;)

As you were... :D

Post

beely wrote:ten years on, the Mac is still a separate thing, doesn't have touch bolted on, can still run all the software
Maybe a bit less of "all the software" when they kill 32-bit support next year, huh? :) Oh, and they're killing OpenGL/OpenCL. But they do have Metal. Hey, wait, isn't that the same thing they use on iOS? Hmmm.. :D
They haven't so far showed any desire to "iOS-ify" Mac OS
You mean, except for allowing it to run iOS apps? The same file system? The same graphics API? The ARM T2 taking over more functions? Nope, none at all. You're also thinking about it in only one direction. Is the same true in the opposite? Didn't they try to market the iPad Pro as a laptop replacement and add features, like a dock, to iOS 11? I'm not sure that worked well for them, so now come the rumours of the ARM laptops. I don't think the plan is to iOS-ify Mac OS either. I think the plan is more to Mac-ify iOS, but only on those desktop/laptop devices it runs on.
I don't worry about IOSification of Macs at all, and I think the people that do over the past years should probably realise their fears aren't happening
Change happens. Change used to happen a lot more with computers. Few complained, because what came was usually better. Then things stayed much the same for a long time. We got used to it. Now, various factors (online, mobile, the breakdown of Moore's Law, the breakdown of common sense) mean more change is coming.

No point fearing what will happen anyway. But what you can do is consider where your money is spent in the context of it all. Though I do think Roland virtual instruments will still likely be there on the other side of it, in one form or another - to put it somewhat back on topic ;) And, btw, I know some devs (including Spectrasonics iirc??..) have said that, in the event of having to shut down online services, they would release a patch removing the online requirement. So if anyone has actual "fears" about not being able to use software, companies can address it if they choose. Though, in Roland Cloud's case, this may be complicated by their legal relationship with Roland..

Post

PAK wrote:
fmr wrote:Maybe, but how much power do they squeeze out of those CPUs? Are we talking about "computers" or iPADs? Are we talking about a real, multitasking operating system able to run big applications concurrently or an operating system that runs one application at a time, full screen? Are we trying to go up or to go down?
Since Roland don't really have an equivalent yet, let's take Korg Odyssey. A first gen i7 quad can only manage about 12 notes on an instance before glitching under Windows. Even an old generation iPhone 6 can manage 8 notes no problem.
Korg Odyssey is NOT a good example. Everybody agrees that the code needs to be optimized and that it should be lighter than it is. Anyway... 12 notes... My i7 3770 (which is far from top of the line nowadays) holds much more than that without problems, and without a single glitch. I can perform a test, if you want, with several instances, and see how much I can squeeze out of it, but I played eMotional Pad, clustering in quick succession with both hands, and the top I got was around 45% CPU (and this was with way more than 12 notes - probably double that, or even more, if you consider the long releases).

But you may have lots of other examples if you have an iPad. For example, Waldorf Nave - this one I know is well coded. How many instances of Nave, and how much polyphony, can you get on an iPad? And in OS X running on a machine with an Intel i7?
Fernando (FMR)

Post

fmr wrote:Korg Odyssey is NOT a good example. Everybody agrees that the code needs to be optimized and that it should be lighter than it is.
Everybody? How did they reach their conclusion? Did they see the source code? Are they well versed in the latest coding techniques? Did they take latency and sample rate into account, host used, whether they're using USB or PCI(e) audio, and did they use features like ASIO guard? Most of all, have they actually made polyphony count comparisons with other software of equivalent quality? EG Diva, Roland Cloud VA polys etc..

Odyssey itself is single-core per instance. It also uses polyphonic unison on presets. This means it can use a lot of polyphony without users realising they're using it. If a synth has poly unison and high CPU complaints, my first thought is whether the user knows this :)
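A toy model of why poly unison sneaks up on people: each held note spawns one voice per unison setting, so the engine renders notes × unison voices, capped at the instance's voice limit. The 16-voice cap below matches the limit mentioned later in the thread; the function itself is just an illustration, not Korg's actual engine logic.

```python
VOICE_LIMIT = 16  # assumed per-instance polyphony cap

def active_voices(notes_held: int, unison: int) -> int:
    """Voices the engine actually renders for a given chord:
    each note spawns `unison` detuned voices, up to the cap."""
    return min(notes_held * unison, VOICE_LIMIT)

# A 4-note chord on a preset with 4-voice unison already saturates
# a 16-voice instance, even though the player only "sees" 4 notes.
print(active_voices(4, 4))   # 16
print(active_voices(4, 1))   # 4
```

So a player comparing "12 notes" on one preset against "12 notes" on another may actually be comparing 12 voices against 48-capped-to-16, which makes raw note counts a poor benchmark.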

My remarks were based on worst-case examples. More typically, Odyssey does 16 notes at about ~65% CPU on a first-gen i7 (EG bump the first preset up to 16-note unison). Now try a preset like 47 Playcode, and you should see the usage rise higher.
I got was around 45% CPU (and this was with way more than 12 notes - probably double than that, or even more, if you consider the long releases).
Well, it wouldn't have been, since it has a 16-note polyphony limit ;) (Unless they changed that recently?)

And, as said, CPU usage varies by presets. I picked 47 as an example, but iirc there's likely heavier ones. Many other assumptions there too (EG sub 4ms output latency, ASIO guard off etc etc.)

Overall, I don't think Odyssey compares too badly to other synths on its level. EG I think it's likely you'll see higher per-note usage from Diva (certainly with Divine, probably with Great.. though its usage depends on the resonance setting used..)

BTW The Odyssey iOS version has an 8 voice limit per instance, at least on the devices I have, though I've never tested how many instances I could use.. (Can't remember what hosting limits Gadget has)
But you may have lot's of other examples if you have an iPad.
It'd depend pretty heavily on which iPad. Ideally you'd want to compare to an iPad Pro, and there's likely to be a new version of that out in September, with a new generation of CPU. There will then be another generation of ARM CPUs after that, before whatever Apple proposes to put out in 2020. At least you can somewhat predict this, due to the iPhone release cycles. So even results from a September 2018 iPad Pro will still be two generations off whatever they're proposing to use in an ARM laptop, assuming the 2020 date even holds up. This also assumes they go with the same design and thermal constraints as the iPad. They may well use something which goes beyond that, or is customized in some other way. BTW, the fact that they're including Intel in the iOS app stuff, rather than skipping straight to ARM, also tells you they're hedging their future bets to some extent.. ;)

But, it'd be interesting to see the results people get, if there were a thread which made direct comparisons.. and maybe a good time to do it if there's new iPad Pros in September(ish). Bit off topic for this one though, at least if/until Roland Cloud arrives on iOS :)

Post

PAK wrote:Odyssey itself is single core per instance.
And that's its problem. Diva has a multicore switch, which can help tremendously depending on which CPU you're sporting.

Post

EvilDragon wrote:
PAK wrote:Odyssey itself is single core per instance.
And that's its problem. Diva has a multicore switch, which can help tremendously depending on which CPU you're sporting.
Yep. Fairly unique at release time too (the only other software which had it, at least that I was aware of, was HALion 4 back then).

Diva doesn't deserve the criticism it got in this department, IMO. High-ish CPU, but good management features (not buried in the UI either), and generally very usable lower quality modes.

(on topic :D) unlike the low-quality mode of the Roland Cloud Jupiter 8. Does anyone have any idea WTF it's doing? You'd think it'd just turn down the internal oversampling, but it actually changes the stereo image etc., so I'm assuming it's doing some sort of voice-clone thing (i.e. processing certain things for only one note, and then applying the result to the others..)

Would be interested if someone knows what's going on there.. ;)
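Purely as a sketch of the speculation above (nothing here is Roland's actual code): a "voice clone" economy mode would run the expensive per-voice stage once on a reference voice and reuse the result for the other voices. That would also explain a changed stereo image, since the per-voice variation that widens the sound gets lost. The call counter just makes the saving visible.

```python
CALLS = {"expensive": 0}

def expensive_stage(sample: float) -> float:
    """Stand-in for a costly per-voice process (e.g. an oversampled filter)."""
    CALLS["expensive"] += 1
    return sample * 0.5  # placeholder DSP

def render(voices, economy=False):
    """Render one sample per voice; in economy mode, clone one voice's result."""
    if economy and voices:
        shared = expensive_stage(voices[0])   # process one voice...
        return [shared] * len(voices)         # ...reuse it for the rest
    return [expensive_stage(v) for v in voices]

out_full = render([1.0, 0.8, 0.6], economy=False)  # 3 expensive calls
out_econ = render([1.0, 0.8, 0.6], economy=True)   # 1 expensive call
```

In the full render each voice keeps its own character; in the economy render all three outputs are identical, which is exactly the kind of flattening you'd hear as a narrower, less "alive" sound.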

Post

PAK wrote:
fmr wrote:Korg Odyssey is NOT a good example. Everybody agrees that the code needs to be optimized and that it should be lighter than it is.
Everybody? How did they reach their conclusion? Did they see the source code? Are they well versed in the latest coding techniques? Did they take latency and sample rate into account, host used, whether they're using USB or PCI(e) audio, and did they use features like ASIO guard? Most of all, have they actually made polyphony count comparisons with other software of equivalent quality? EG Diva, Roland Cloud VA polys etc..
Comparisons were made with other synths considered of comparable quality (yes, including U-He ones). I can say, from my own experience, that I bought Korg Odyssey because I got a bargain offer due to owning the Korg Legacy Collection, but I would NOT buy it on its own. It eats too much CPU power for what it offers, IMO. That's why I (and others) said the code is NOT optimized (or needs to be). But maybe you have more information on that. :shrug:

Anyway, if that is the best Korg has to offer with optimized code, then I think other programmers work miracles in comparison. Nonetheless, my experience is not THAT bad, as I already said. I will try the patch you referred to and will post some conclusions.

(EDITED - LATER) OK, I tried that patch (Playcode). It effectively ate around 79% of the CPU. A poor arpeggio, nothing special sounding, with a polyphony of 16 and a unison of 1.

Now, explain to me what the synth is doing to demand so much CPU. Is it the graphics (some sliders are moving in realtime)? Because I cannot hear anything that justifies so much CPU. Funny enough, I increased the unison to 3, and then to 6, to try to take the CPU to the max, and the CPU load decreased (???). Now, is this strange or what?

Remember I am using an i7 3770 @ 3.4GHz. An iPad Pro costs more than a desktop computer with a CPU like this one :hihi:

According to benchmarks ( http://cpu.userbenchmark.com/Compare/In ... 3647vs1979 ), with an Intel 7700 @ 4.2GHz I would gain around 37%-39% (36% in single-core).
Fernando (FMR)

Post

PAK wrote:unlike the low quality mode of the Roland Cloud Jupiter 8. Does anyone have any idea WTF it's doing? You'd think it'd just turn down the internal oversampling, but it actually changes the stereo image etc, so I'm assuming it's actually doing some sort of voice clone thing (ie processing certain things for only one note, and then applying it to the others..)

Would be interested if someone knows what's going on there.. ;)
No idea. I don't think they've published any details about it, but it definitely doesn't sound as good. It seems some modulations are stripped out too, and the whole thing definitely feels less "alive" than it otherwise does...

Post

PAK wrote:Diva doesn't deserve the criticism it got in this department, IMO. High-ish CPU, but good management features (not buried in the UI either), and generally very usable lower quality modes.
High-ish CPU today, sure, but I'm old enough to remember the time before that multi-core button existed, when we all had slower processors. Did we have the "Draft" quality mode back then? I can't remember that part. But depending on when exactly it was, I may have been on a Core 2 Duo or maybe an early i7, and trust me, when it was released, Diva was definitely high-CPU when played polyphonically. Maxing out the CPU was something you had to actively take measures to avoid.

With some updates, and time, it's aged very well, however.

Post

fmr wrote:Now, explain to me what is the synth doing to demand so much CPU? Is it the graphics (some sliders are moving in realtime)? Because I cannot hear anything that justifies so much CPU. Funny enough, I increased the unison to 3, and then to 6, to try to take the CPU to the max, and the CPU taxing decreased (???)
It means it's doing something intelligent with unison which saves some cycles. The other presets.. I dunno.. You'd need to pick them apart to find the bits causing higher usage. I don't think it's the animation, since some presets seem to move the sliders without much difference, iirc; it's basically modulation (via the sequencer), which will likely increase load. Only worry if the sliders turn super blurry (I'm picturing them doing audio-rate modulation ;) )

I think the polyphony-to-quality ratio is OK myself. The big cause of higher CPU is usually the amount of oversampling, I guess. Many devs have their own secret sauce for this, but the obvious ways are to dodge extra calculations (oversampling only certain bits) or to skip calculations entirely if a setting isn't using them. Special instruction sets (SSE/AVX etc.) are likely harder to implement in code and limited to the CPUs which support them. There might also be some additional reluctance on that front lately, given that devs can also see some of the current trends..
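A back-of-envelope cost model for the point above: oversampled stages multiply per-voice cost by the oversampling factor, while bypassed stages cost roughly nothing. All the unit costs here are made up; only the structure of the calculation reflects the argument.

```python
BASE_COST = 1.0  # arbitrary cost units per stage, per voice, at 1x rate

def voice_cost(oversample: int, stages_used: int, stages_total: int) -> float:
    """Cost of one voice: only the active stages run, at the oversampled rate;
    stages a preset doesn't use are bypassed and cost ~0."""
    bypassed = stages_total - stages_used
    return stages_used * BASE_COST * oversample + bypassed * 0.0

# 8 voices at 4x oversampling: all 6 stages active vs. only 3 active.
print(8 * voice_cost(4, 6, 6))  # 192.0
print(8 * voice_cost(4, 3, 6))  # 96.0 - half the load just by bypassing unused stages
```

This is why two presets on the same synth can differ so much in CPU: the preset decides how many stages are live, and the quality mode decides the oversampling multiplier.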

In the case of Roland, they always had to consider the whole Plug-Out thing, and avoid creating differences between the native plugins and the DSP versions. That may have put some restrictions on how far they could go to reduce CPU, though things seem to have gone very quiet on that whole Plug-Out / System-8 front lately..
Funkybot's Evil Twin wrote:High-ish CPU today, sure, but I'm old enough to remember the time before that multi-core button existed and we all had slower processors.
I thought multi-core was there from launch, but (one check later) it actually came 4.5 months after. Not quite years in the wilderness walking barefoot in the snow, though :roll: :D Plus there might've(??) been a public beta in between with the feature included..
Did we have the "Draft" quality mode back then?
I think so, but I never went lower than Fast. The main thing I remember is people insisting on using Divine mode on older dual-cores. It didn't matter that they usually wouldn't hear the difference from Great mode. They deserved "da best" and they were gonna run it at Divine. The human brain, huh? BTW, even at release 1.0, a Nehalem quad i7 was already 3 years old.

Post

PAK wrote:
Funkybot's Evil Twin wrote:High-ish CPU today, sure, but I'm old enough to remember the time before that multi-core button existed and we all had slower processors.
I thought multi-core was from launch, but (one check later) it actually came 4.5 months after. Not quite years in the wilderness walking barefoot in the snow though :roll: :D Plus there might've(??) been a public beta between then with the feature included..
Don't tell me what those 4.5 months were like. I was there...

...walking 3 miles to school each day, barefoot in the snow, uphill both ways! You damn kids! Now get off my lawn or I'll sic the dog on ya!
