Gaming Computer vs Mac

Configure and optimize your computer for Audio.

Post

He lives on a boat and, if what I’ve read is true, shits in a bucket.

I believe him if he says he has lower energy costs than normal folk.
I lost my heart in Taumatawhakatangihangakoauauotamateaturipukakapikimaungahoronukupokaiwhenuakitanatahu

Post

revvy wrote: Sat May 06, 2023 2:32 am He lives on a boat and, if what I’ve read is true, shits in a bucket.

I believe him if he says he has lower energy costs than normal folk.
Ha ha ha, I can perfectly picture that 😄.

Beyond pure economic reasons, I must admit that green considerations have started to become part of my living habits. I'm not hard-core at all, but if I have a choice between two appliances, one being cleaner than the other, I'll take the cleaner one....
My kids have a big gaming PC with a 3080; it runs hot as hell. To me it is as sinful as a big burger with fries, lol....

Post

PAK wrote: Fri May 05, 2023 5:42 pm You're confusing raw performance with performance per watt.
I'm not confusing anything. Performance per watt is of less than zero significance to me and has been for at least the last 15 years, probably longer. It's a bullshit metric that Apple pushes in your face, knowing you'll be stupid enough to swallow it hook, line and sinker. It's a classic marketing technique that's been around since the 1920s - you create the perception of a problem that needs fixing, even though no-one has ever previously identified that particular thing as a problem, and then you sell the gullible fools a solution. The famous example is Listerine. You can read about it HERE. Apple do this all the time and there seems to be an endless supply of idiots lining up to fall for it.
The energy advantage of RISC has held for those 40+ years. You're right that traditionally it hasn't mattered as much for desktop performance computing. But times they are a changing..
No they aren't. It's even less relevant today than it was 40 years ago. You only have to see how sleek a modern laptop is and compare its performance and battery life to one from a decade ago to understand this. All the big advances have been made; what Apple have done is a day late and a dollar short. If they'd done it 10-12 years ago, it would have been relevant, possibly even amazing, but today their biggest advantage over Intel isn't RISC, it's the 5nm process they are using on their chips. Once Intel get down to the same process node, their performance per watt will see massive gains. But no-one will care, because it has ceased to be an actual problem for most users, if it ever was.
You linked to an article where AMD proclaims its new processor is "just as good as hoity toity Apple, innit". In other news you may have missed that NVidia's graphical performance also kicks Apple's ass. The problem Bones, is that it uses the power equivalent of a small Australian city to do so.
That's not a problem at all, it's the halitosis effect. Electricity has never been more accessible than it is today and there is nowhere in the world where it is so prohibitively expensive as to be any kind of factor worthy of consideration. If it is a problem for you, personally, you can buy a 300W solar panel for well below $100, which will make up any perceived shortfall for free for the next 20 years or so. Double your investment and you'll be getting most of your electricity for free. Or go into a cafe and charge up for the price of a cup of coffee, including a free cup of coffee. The last, stupidest thing to do is spend an extra grand on a laptop to save yourself a few bucks a year in electricity costs. Oh, and for the record, Australia is one of the 10 most expensive countries in the world for electricity, according to Statista.
If you look for anything power related in that article you linked here's what you find: "AMD also boasts of improved efficiency to provide the longest possible time on battery power, although it doesn't provide estimates". Ahhh.. ;)
You mean in the same way that Apple cherry-picks which stats to feed you and which ones to ignore? Why is it OK when Apple does it but not OK when it's AMD? Personally, I'd trust AMD a lot more than I'd trust Apple when it comes to being honest about their products, and I'm certain that when the product actually ships all that information will be freely available from AMD.
Run? I thought it was more of a crawl.
Well, in the 90s everything ran at a crawl, although it was probably your dial-up internet that made it seem slow. No, people don't buy ARM-powered Windows laptops because they aren't stupid; they know there is no point. And because they aren't stupid, the companies who cater to them can't afford to be stupid, either, so they don't bother porting their applications to run on ARM. Apple force change on their customers and their partners. They treat everyone like shit, confident that there will be no shortage of idiots who will lap it up.
MacOS is based on FreeBSD, which is based on Bell Labs Research Unix. MacOS has biggest peepee.
Everyone knows that Apple will steal anything that's not nailed down, it's no secret.
You seem to know a lot about the thread management issues involved here Bones. Maybe you should contact Korg and tell them how to fix things since they've had the same problems since launch?
That's not the problem with Opsix and Wavestate. It is definitely the problem with Cubase, though. But, as I said, when you compare it to the shitstorm that Apple's move to their own chips has caused, it is definitely a point in favour of sticking with Intel and shows how desperately poor your side of the argument is.
Your parsing is not the best. The whole point was that, though it's still ongoing, Apple has (largely) gone through that process. It controls both the hardware AND the software, which also gives it major advantages here.
That just puts them on par with Windows now, where the companies making the components have been writing their own drivers all along. The problem with Apple, though, is that they don't give you any options: you do things their way or not at all. e.g. Apple's graphics drivers have always had excellent multi-monitor support, but it was always at the expense of outright performance, which made Mac a really poor choice for things like game development and visual effects. OTOH, nVidia's Quadro drivers include per-application settings to ensure maximum performance for the application you are using. e.g. If you're working in something cross-platform, like Maya, you get optimised performance under OpenGL but, if you're running something that is Windows-only, like 3DS Max, you get optimisations that maximise performance under DirectX. But for Apple, having to write nVidia and Radeon drivers was too much like hard work, so they dropped support for the industry-standard nVidia cards years ago.
We'll see how things go on the Windows side come time for Windows 12.. ;)
Microsoft learned their lesson with Vista and were given a big reminder with Windows 8, they aren't going to f**k their users over like Apple does. They know we won't put up with it.
Unfortunately, I do know how much it costs, Bones. Do you? A UK household pays approx AU$30 per month just to be connected. That's before you use any electricity. Prices, in places like Germany and the UK, have approximately doubled in 2 years. An "average" household can easily have bills 10-20x what your bill is.
It's the same here, and prices are set to double again in the next few years. That's the price you pay for fear. I looked up last month's invoice and I paid $12 for electricity in April, at 38c per kWh. That puts me on par with Spain, the 5th most expensive in the world. I reckon $5 of that would have gone to my fridge, another $4-$5 for the TV, lights and wi-fi router (4G). That leaves $2-$3 to run my laptop and power the four-speaker set-up I use, which seems about right. I doubt I'd notice the difference if I stopped using my laptop altogether.
Search "British Gas average bill"; it will show you "typical" household costs, and you'll find electricity kWh rates there too.
In June 2022 it was 57c per kWh. That would mean my bill last month would have been $18, not $12. That's a difference of one schooner at the pub on a Friday night. Hardly the end of the world, or a reason to pay through the nose for an overpriced, underperforming laptop.
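Scaling the April bill linearly with the rate change (consumption held constant) is a quick arithmetic check; the figures below are the ones quoted in the post:

```python
# Re-price a known monthly bill at a different kWh rate,
# assuming consumption stays the same.
bill_now = 12.0    # $ paid in April (from the post)
rate_now = 0.38    # $/kWh, current rate
rate_peak = 0.57   # $/kWh, June 2022 rate

kwh_used = bill_now / rate_now          # ~31.6 kWh for the month
bill_peak = kwh_used * rate_peak
print(round(bill_peak, 2))              # 18.0
```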
Once you've managed that feel free to search "total system power usage" benchmark measurements for PC's and Apple M1 etc. Then do the calculations. You won't, of course. :)
I'd be more likely to make an appointment to see my doctor about my halitosis problem.
NOVAkILL : Asus RoG Flow Z13, Core i9, 16GB RAM, Win11 | EVO 16 | Studio One | bx_oberhausen, GR-8, JP6K, Union, Hexeract, Olga, TRK-01, SEM, BA-1, Thorn, Prestige, Spire, Legend-HZ, ANA-2, VG Iron 2 | Uno Pro, Rocket.

Post

I have limited forum experience, but Bones is my favourite troll so far.
He can repeat the same stuff over and over again without any new arguments, doesn't care when everybody points him to reality, and is deaf to any reason.
Love it ...

Sad that he'll be replaced by ChatGPT soon...

Post

PAK wrote: Fri May 05, 2023 7:54 pm 150W per hour x 24 hrs = 3.6kWh per day x 365 is 1314kWh per year. Now times that by the true cost (0.43 US dollars) and we get? $565.02 total cost. The PC, at 400W is 0.4kWh x 24 = 9.6kWH per day. Times 365 = 3,504kWH/yr x 0.43c = $1506.72. That's an operating cost difference of $941.70 in just a single year using your numbers. Even with one tenth of the usage the difference would still be $94.17 per year. For your $30 amount to be true you'd have to use the computer for barely half an hour per day! ;)
That was the consumption per day, not per hour. You won't be doing extremely heavy calculations on it 24 hours a day, 365 days a year. It's a reasonable estimate and if you are a very heavy user you could multiply it by 2, or even by 3, but not much more than that.

And that would imply that you do 10 hours of CPU intensive work every day, 365 days a year, which for most of you just won't happen.
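The day-versus-hour mix-up above is easy to sanity-check with a few lines. The wattage and the 0.43 $/kWh rate are the figures quoted in the thread; the 3-hours-a-day session length is an assumption for illustration:

```python
# Annual electricity cost of a device at a constant draw.
# Reading "150W" as continuous 24h draw (PAK's reading) reproduces
# his $565/yr figure; a realistic daily session is far cheaper.
RATE = 0.43  # $/kWh, rate quoted in the thread

def annual_cost(watts, hours_per_day):
    """Cost of running `watts` for `hours_per_day`, every day of the year."""
    kwh_per_day = watts / 1000 * hours_per_day
    return kwh_per_day * 365 * RATE

print(round(annual_cost(150, 24), 2))  # 565.02 - 24h/day, every day
print(round(annual_cost(150, 3), 2))   # 70.63  - a 3h daily session
```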

Post

2DaT wrote: Sat May 06, 2023 4:47 am That was the consumption per day, not per hour. You won't be doing extremely heavy calculations on it 24 hours a day, 365 days a year. It's a reasonable estimate and if you are a very heavy user you could multiply it by 2, or even by 3, but not much more than that.

And that would imply that you do 10 hours of CPU intensive work every day, 365 days a year, which for most of you just won't happen.
Apparently gaming PCs cost 4 cents per hour on average. I think that's consistent with the discussion. If you take an average of 30 hours a week, you're at about 60 USD per year. Heavy usage, I guess, would be 100 USD..
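The 4-cents-an-hour figure quoted above checks out against the yearly estimate (the rate and hours are the post's numbers, not measurements):

```python
# Yearly running cost at the quoted average rate for a gaming PC.
cost_per_hour = 0.04   # USD/hour, figure quoted in the post
hours_per_week = 30

yearly = cost_per_hour * hours_per_week * 52
print(round(yearly, 2))  # 62.4 - close to the ~60 USD/year claim
```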

Post

Jac459 wrote: Sat May 06, 2023 5:22 am
2DaT wrote: Sat May 06, 2023 4:47 am That was the consumption per day, not per hour. You won't be doing extremely heavy calculations on it 24 hours a day, 365 days a year. It's a reasonable estimate and if you are a very heavy user you could multiply it by 2, or even by 3, but not much more than that.

And that would imply that you do 10 hours of CPU intensive work every day, 365 days a year, which for most of you just won't happen.
Apparently gaming PCs cost 4 cents per hour on average. I think that's consistent with the discussion. If you take an average of 30 hours a week, you're at about 60 USD per year. Heavy usage, I guess, would be 100 USD..
Gaming IS heavy usage. If you take, for example, a big GPU and game at high resolution, you could easily consume 300-500W, and that would be around 10-20 cents an hour.

Gaming is heavy on power, and nobody denies that. For Macs, gaming is not an option, so using gaming power usage for the comparison is out of the question. For a pure CPU workload, such as DAW usage, your power consumption would be around 50-150W for a PC, and I guess for a Mac it would be 20-50W?

You'd have to use your computer 10 hours a day, with a DAW full of heavy plugins running playback all the time, 365 days a year, to make a difference of $100. That's just unrealistic.

Also, you need to measure efficiency per unit of work done, not just power consumption alone. Your PC can do much more work per hour @ 250W than a Mac @ 50W, so you can't compare wattages alone.
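The efficiency-per-work point can be sketched with hypothetical numbers. The 250W and 50W figures come from the post; the task durations are invented purely for illustration:

```python
# Compare energy per job, not instantaneous wattage.
# Hypothetical scenario: the same bounce/render takes the PC 1 hour at 250 W
# and the Mac 4 hours at 50 W. The "hungrier" machine can still use less
# total energy if it finishes fast enough.
def energy_wh(watts, hours):
    """Energy consumed for the whole job, in watt-hours."""
    return watts * hours

pc_wh = energy_wh(250, 1.0)   # 250.0 Wh for the job
mac_wh = energy_wh(50, 4.0)   # 200.0 Wh for the same job
print(pc_wh, mac_wh)
```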

Post

2DaT wrote: Sat May 06, 2023 6:21 am
Jac459 wrote: Sat May 06, 2023 5:22 am
2DaT wrote: Sat May 06, 2023 4:47 am That was the consumption per day, not per hour. You won't be doing extremely heavy calculations on it 24 hours a day, 365 days a year. It's a reasonable estimate and if you are a very heavy user you could multiply it by 2, or even by 3, but not much more than that.

And that would imply that you do 10 hours of CPU intensive work every day, 365 days a year, which for most of you just won't happen.
Apparently gaming PCs cost 4 cents per hour on average. I think that's consistent with the discussion. If you take an average of 30 hours a week, you're at about 60 USD per year. Heavy usage, I guess, would be 100 USD..
Gaming IS heavy usage. If you take, for example, a big GPU and game at high resolution, you could easily consume 300-500W, and that would be around 10-20 cents an hour.

Gaming is heavy on power, and nobody denies that. For Macs, gaming is not an option, so using gaming power usage for the comparison is out of the question. For a pure CPU workload, such as DAW usage, your power consumption would be around 50-150W for a PC, and I guess for a Mac it would be 20-50W?

You'd have to use your computer 10 hours a day, with a DAW full of heavy plugins running playback all the time, 365 days a year, to make a difference of $100. That's just unrealistic.

Also, you need to measure efficiency per unit of work done, not just power consumption alone. Your PC can do much more work per hour @ 250W than a Mac @ 50W, so you can't compare wattages alone.
Well, it's not true that you can't play on a Mac; with Apple Arcade you can play Candy Crush :-).

More seriously, I generally agree with you. If I were on a desktop, I would seriously consider a PC. The freedom to choose components is way better than on a Mac; you still lose on CPU power efficiency, but the raw power can go way beyond Apple if you pay the price.

And Windows 11 vs macOS, really... both are imperfect, both are OK, so I don't care.

For a laptop, on the contrary, it's a different story for me. My previous laptop was an ultra-high-end Razer (2020). I miss its gorgeous 15-inch 4K OLED screen and beautiful design, but it was like having an oven on your lap. The fan was ultra noisy, the temperature was almost unbearable (and fully unbearable while gaming), and the battery life was ridiculous (like 2 hours).
So I can understand that it's better now, the laptop market has evolved, but until I'm really convinced I'll stay on my M2 Pro MBP. It's cheaper than my Razer, way way way faster, silent, and cold. The design is sad like a Ken Loach movie and I can't believe they don't have OLED tech in 2023, but I just don't want to risk running into the same thermal issues I was having.

So far the only person saying that thermal issues on PC laptops are fully over is Bones, and my trust in his objectivity is absolutely 0%.

Post

revvy wrote: Sat May 06, 2023 2:32 am He lives on a boat and, if what I’ve read is true, shits in a bucket.
As I recall he doesn't utilise the bucket for defecatory purposes. He evacuates the Bones bowels at work during the week, then on a weekend, by dint of some mystical self-control (or a starvation diet of very bland food..), doesn't shit at all.

Post

donkey tugger wrote: Sat May 06, 2023 7:01 am
revvy wrote: Sat May 06, 2023 2:32 am He lives on a boat and, if what I’ve read is true, shits in a bucket.
As I recall he doesn't utilise the bucket for defecatory purposes. He evacuates the Bones bowels at work during the week, then on a weekend, by dint of some mystical self-control (or a starvation diet of very bland food..), doesn't shit at all.
and thus KVR rolled over and died...

Post

Yeah, the thing is, PCs are loud. Macs on Apple silicon are silent. The MBA is best for me because it's fanless and, with upgrades to RAM and SSD, powerful enough to meet almost all my needs. If you have crazy projects with hundreds of tracks, work with live orchestral recordings, or work exclusively in 96k, an MBP will have more power to handle anything, but it might have some slight fan noise at times, although nowhere near as loud as a typical gaming PC. I like my studio dead silent; if you do too, you might consider Apple.

Post

machinesworking wrote: Sat May 06, 2023 7:08 am
donkey tugger wrote: Sat May 06, 2023 7:01 am
revvy wrote: Sat May 06, 2023 2:32 am He lives on a boat and, if what I’ve read is true, shits in a bucket.
As I recall he doesn't utilise the bucket for defecatory purposes. He evacuates the Bones bowels at work during the week, then on a weekend, by dint of some mystical self-control (or a starvation diet of very bland food..), doesn't shit at all.
and thus KVR rolled over and died...
'Kicked the bucket', one might say...but not hard enough to disgorge the contents if you please.

Post

I have a PC based on a 7950X, with top-notch components. It costs more than my 16-inch MBP M1 Pro.
I still prefer my MBP for all tasks (except gaming, of course).

Post

donkey tugger wrote: Sat May 06, 2023 7:01 am
revvy wrote: Sat May 06, 2023 2:32 am He lives on a boat and, if what I’ve read is true, shits in a bucket.
As I recall he doesn't utilise the bucket for defecatory purposes. He evacuates the Bones bowels at work during the week, then on a weekend, by dint of some mystical self-control (or a starvation diet of very bland food..), doesn't shit at all.
His avatar pic checks out.
I lost my heart in Taumatawhakatangihangakoauauotamateaturipukakapikimaungahoronukupokaiwhenuakitanatahu

Post

I have both a desktop gaming PC and a MacBook Pro 14, and I prefer using the Mac for audio.

Also, I'm not sure if it's still the case, but the Nvidia drivers used to add a lot of DPC latency on Windows.

If you get enough storage/RAM from the beginning, upgradability isn't such a big deal. At least for me, every time I want to upgrade the computer there's already a new CPU socket and I have to replace most of the parts anyway.

