
Apple 💵 printers go brrrrrrr, Intel 💵 printers go 😢

AMD is absolutely destroying Intel (fact).

Now today, Apple announced their first MacBook using ARM.

I'm gonna buy it.

If you still have stonks in Intel abort mission asap.

The sauce:

https://9to5mac.com/2020/11/10/new-macbook-air-apple-silicon/

Thanked by TerensM

Comments

  • raindog308 Administrator, Veteran

    @SirFoxy said: If you still have stonks in Intel abort mission asap.

    Typo or slang?

    I am not excited about this. Updating all apps, some older apps may not work, a few rounds of headaches and bugs, etc. I'm sure it makes more money for Apple but I'm skeptical of the consumer benefit.

  • SirFoxy Member
    edited November 2020

    @raindog308 said:

    @SirFoxy said: If you still have stonks in Intel abort mission asap.

    Typo or slang?

    I am not excited about this. Updating all apps, some older apps may not work, a few rounds of headaches and bugs, etc. I'm sure it makes more money for Apple but I'm skeptical of the consumer benefit.

    You weren't supposed to find my alternative identity, Retardy McMemester, but yes it is slang.

    One of the most immediate benefits I see is the lower power consumption.

  • raindog308 Administrator, Veteran

    You weren't supposed to find my alternative identity

    Sir Foxy Retardy McMemester has quite a ring to it. You should make a coat of arms.

Thanked by SirFoxy
  • lowendboi Member

    Before y'all get too excited... Apple is comparing their M1 chip against Intel CPUs from 2-3 generations ago, in older Apple devices. And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    All their tests are carefully chosen to make their M1 chip look good, so I'm pretty skeptical of the bold performance claims. I'm sure it won't be slow, but it probably won't be as magical as Apple claims.

    You can simply look up the Intel machines in the fine print on EveryMac, but I've done the legwork for you:

    • 4K video: "1.2GHz quad-core Intel Core i7-based MacBook Air" = i7-1060NG7 in the 2020 MBA, known for severe thermal issues. That plus software decoding (Intel has no hardware support for Apple's ProRes codec) makes this absolutely unfair; see the sketch after this list.
    • Image processing: "3.6GHz quad-core Intel Core i3-based Mac mini" = i3-8100B in Mac Mini 2018, with Intel UHD 630 (facepalm).
    • "fastest CPU core when it comes to lowā€‘power silicon": Don't even know what was being benchmarked against, but okay. I personally care about performance, not "performance per watt," though. Ditto for all the "performance per watt" claims after it.
    • "Machine learning performance": Ok, Intel doesn't have the ML cores that you have, so sure. This is on the same 2018 Mac Mini above.
  • bdl Member

    brrrrrrr

Thanked by SirFoxy
  • stefeman Member

    So it begins..

    The death of x86_64..

Thanked by SirFoxy
  • raindog308 Administrator, Veteran

    @lowendboi said:

    • "Machine learning performance": Ok, Intel doesn't have the ML cores that you have, so sure. This is on the same 2018 Mac Mini above.

    I'm skeptical many MacBook Airs are being used for ML workloads...

  • @stefeman said:
    So it begins..

    The death of x86_64..

    Lmao. No.

  • Lee Veteran
    edited November 2020

    I have a new Mac Mini on order; it arrives on the 25th. I'm fully aware this is 1st gen and I will be upgrading again sooner rather than later, but I wanted to see the performance for myself and really understand how things will go in a post-Intel world.

  • sdglhm Member

    I currently use an Air for my daily tasks and it's more than enough for my workflow. If this actually outperforms my 2017 i5 Air, I'd gladly buy one just because it handles battery life much better.

  • Clouvider Member, Patron Provider

    @lowendboi said: And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    I think their point is to be able to produce silicon that does the specific accelerations that will make the specific job they intend their hardware to be used for better than the competition ;-).

    We will see when it ships.

Thanked by bdl and TimboJones
  • serv_ee Member
    edited November 2020

    @Clouvider said:

    @lowendboi said: And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    I think their point is to be able to produce silicon that does the specific accelerations that will make the specific job they intend their hardware to be used for better than the competition ;-).

    We will see when it ships.

    In that case I hope they will show us how this M1 performs in Cinebench or HandBrake against x64. Oh wait...

    It's just cherrypicking, nothing else.

  • Clouvider Member, Patron Provider

    @serv_ee said:

    @Clouvider said:

    @lowendboi said: And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    I think their point is to be able to produce silicon that does the specific accelerations that will make the specific job they intend their hardware to be used for better than the competition ;-).

    We will see when it ships.

    In that case I hope they will show us how this M1 performs in Cinebench or HandBrake against x64. Oh wait...

    It's just cherrypicking, nothing else.

    I disagree. Had I been a graphic designer (yes, I'm cherrypicking this because Adobe said they will use the new chip's capabilities) I would very much care how my Photoshop and Illustrator perform on the hardware I use. I don't need to care whether this is Intel or Apple; so long as my apps run faster on Apple, I'd take Apple if I can afford it, no? I most certainly wouldn't care how it benched on other parameters that are irrelevant to me.

    They didn't market it as a server CPU, did they?

Thanked by imok, Lee, and PUSHR_Victor
  • serv_ee Member

    @Clouvider said:

    @serv_ee said:

    @Clouvider said:

    @lowendboi said: And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    I think their point is to be able to produce silicon that does the specific accelerations that will make the specific job they intend their hardware to be used for better than the competition ;-).

    We will see when it ships.

    In that case I hope they will show us how this M1 performs in Cinebench or HandBrake against x64. Oh wait...

    It's just cherrypicking, nothing else.

    I disagree. Had I been a graphic designer (yes, I'm cherrypicking this because Adobe said they will use the new chip's capabilities) I would very much care how my Photoshop and Illustrator perform on the hardware I use. I don't need to care whether this is Intel or Apple; so long as my apps run faster on Apple, I'd take Apple if I can afford it, no? I most certainly wouldn't care how it benched on other parameters that are irrelevant to me.

    They didn't market it as a server CPU, did they?

    I guess I worded myself a little wrong.

    better than the competition

    I meant that part. It's not better than the competition if the competition can't even run said test, because <insert company here> uses its own self-made codecs, etc.

    It's basically AVX-512 all over again: Intel is "better" because AMD can't do it. Same scenario in this "benchmark". Although we won't know until Adobe releases their stuff; then it will be clear which arch actually does the tasks quicker.
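
    (To make the AVX-512 analogy concrete, and purely as a toy illustration of my own: a feature-gated benchmark only ever exercises its fast path on CPUs that report the extension, so it says nothing about chips that lack it. GCC/Clang's __builtin_cpu_supports does the runtime check; the kernel here is just a stand-in.)

        /* avx512_dispatch.c -- toy feature-gated dispatch. The "benchmark"
           outcome hinges entirely on whether the CPU reports avx512f. */
        #include <stdio.h>

        /* Portable baseline every x86 CPU can run. */
        static void sum_scalar(const float *a, const float *b, float *out, int n) {
            for (int i = 0; i < n; i++)
                out[i] = a[i] + b[i];
        }

        int main(void) {
            float a[8] = {1,2,3,4,5,6,7,8}, b[8] = {8,7,6,5,4,3,2,1}, out[8];

            if (__builtin_cpu_supports("avx512f")) {
                /* A hand-vectorized AVX-512 kernel would go here; timing
                   only this branch is exactly the cherry-pick described. */
                printf("AVX-512F present: fast path would run\n");
            } else {
                printf("no AVX-512F: scalar fallback\n");
            }
            sum_scalar(a, b, out, 8);
            printf("out[0] = %.1f\n", out[0]);
            return 0;
        }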

    While it's clear that the M1 will get the "per watt" performance crown or whatnot, it's kinda misleading, since x64 will still do the task faster, just with more watts used.

    It's just my personal opinion of course.

    Also keep in mind that Nvidia is in the process of buying ARM, and when Apple starts pissing on their parade, anything can happen.

  • marcelorider Member
    edited November 2020

    @Clouvider said:

    @serv_ee said:

    @Clouvider said:

    @lowendboi said: And for their 4K video tests they are using their own proprietary ProRes 422 codec which Intel doesn't have hardware acceleration for.

    I think their point is to be able to produce silicon that does the specific accelerations that will make the specific job they intend their hardware to be used for better than the competition ;-).

    We will see when it ships.

    In that case I hope they will show us how this M1 performs in Cinebench or HandBrake against x64. Oh wait...

    It's just cherrypicking, nothing else.

    I disagree. Had I been a graphic designer (yes, I'm cherrypicking this because Adobe said they will use the new chip's capabilities) I would very much care how my Photoshop and Illustrator perform on the hardware I use. I don't need to care whether this is Intel or Apple; so long as my apps run faster on Apple, I'd take Apple if I can afford it, no? I most certainly wouldn't care how it benched on other parameters that are irrelevant to me.

    They didn't market it as a server CPU, did they?

    Quoting @Clouvider, it's basically about what you need. You don't buy a Xeon for gaming, for instance; you'd buy a Ryzen or a top-spec gaming processor for that.

    They made that chip to do some things way better than others, for sure; they target their market so they can sell to it much more confidently. Video editors, and perhaps photo editors, would surely love to use the M1 chip if Adobe brings full support for it.
    And that doesn't mean it would do better in browsing benchmarks or Cinebench runs.

    Also, from a performance-per-watt standpoint, that should blow battery life numbers out of the water. When you are on the move, an Air that outperforms every other laptop in battery life and video-editing performance is something you would definitely buy.

    And attention: I'm not saying it is an out-of-this-world CPU, I'm just saying it is marketed for a niche. People know Apple gets that quite right.

  • SirFoxy Member
    edited November 2020

    @bdl said:
    brrrrrrr

    brrrrrrrrrrrrrrrrrrrrrr

Thanked by bdl
  • @sdglhm said:
    I currently use an Air for my daily tasks and it's more than enough for my workflow. If this actually outperforms my 2017 i5 Air, I'd gladly buy one just because it handles battery life much better.

    For me, 6 extra hours of battery is a huge advantage. I usually work in coffee shops or around my uni; I'm usually not at home, so I'd rather not stay glued to an outlet.

  • Mac on ARM sounds like a Raspberry Pi, or an... Apple Pi...

Thanked by TimboJones and Shot2
  • TimboJones Member

    Apple doesn't do server or enterprise stuff, they don't sell to other companies, and they sell single-digit percentages of laptops and Minis compared to PC. They also like a fat margin, so nothing will be "cheap".

    But they control the OS and can do the necessary optimizations to make use of big.LITTLE. Their market is relatively small and they will focus all on that segment alone. There's definitely benefit for the unplugged people, but what about 24/7 powered workstations that used to have Xeons with a shitload of cores and ran $50k fully loaded?
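
    (Side note on the big.LITTLE point: macOS exposes the big/little split to software through QoS classes, and the scheduler can steer low-QoS threads onto the efficiency cores. A minimal sketch using Apple's pthread QoS API follows; it is Apple-only and my own illustration, not anything Apple showed.)

        /* qos_demo.c -- hint the scheduler that a thread is background work.
           Apple platforms only: QoS classes come from <pthread/qos.h>. */
        #include <pthread.h>
        #include <pthread/qos.h>
        #include <stdio.h>

        static void *background_work(void *arg) {
            (void)arg;
            /* Background QoS: on Apple silicon the scheduler may keep this
               thread on the efficiency cores to save power. */
            pthread_set_qos_class_self_np(QOS_CLASS_BACKGROUND, 0);
            puts("running at background QoS");
            /* ...long-running, non-urgent work would go here... */
            return NULL;
        }

        int main(void) {
            pthread_t t;
            if (pthread_create(&t, NULL, background_work, NULL) != 0)
                return 1;
            pthread_join(t, NULL);
            return 0;
        }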

  • @TimboJones said:
    Apple doesn't do server or enterprise stuff, they don't sell to other companies, and they sell single-digit percentages of laptops and Minis compared to PC. They also like a fat margin, so nothing will be "cheap".

    But they control the OS and can do the necessary optimizations to make use of big.LITTLE. Their market is relatively small and they will focus all on that segment alone. There's definitely benefit for the unplugged people, but what about 24/7 powered workstations that used to have Xeons with a shitload of cores and ran $50k fully loaded?

    Problem is that Apple does do the "professional" market. Check out all the studios running Macs, for example.

  • Shot2 Member

    What do you mean by "printers"? Like, devices by Epson, Canon, etc.? Or do you imply that AMD and Apple are somehow Brother in ARMs?

  • SirFoxy Member

    @Shot2 said:
    What do you mean by "printers"? Like, devices by Epson, Canon, etc.? Or do you imply that AMD and Apple are somehow Brother in ARMs?

    money printer go brrrrr

Thanked by kkrajk
  • @SirFoxy said:

    @Shot2 said:
    What do you mean by "printers"? Like, devices by Epson, Canon, etc.? Or do you imply that AMD and Apple are somehow Brother in ARMs?

    money printer go brrrrr

    What a beautiful sound..

  • jsg Member, Resident Benchmarker

    FWIW: I don't know or care about Apple products, but I recently happened to come across some new (?) Apple processor ("A12" or something like that) that led the field in terms of complexity and came close to AMD performance ("close" as in "not orders of magnitude behind, or even on par with Zen 2," iirc).

    @stefeman said:
    So it begins..

    The death of x86_64..

    I guess we'll see that only with RISC-V, once it really takes off. Also, I guess that quite a few ARM-based servers (and the processors themselves) were designed back when there was reason to fear that Intel might push AMD aside or, in other words, because people didn't like depending on a monopoly, not because ARM is such a great architecture.
    I think ARM will stay for many years, but the one sending x86 to rest (6 feet under...) will be RISC-V.

  • There are now synthetic benchmark numbers for the M1 on Geekbench, and I have to say that I'm surprised (in a positive way): https://www.macrumors.com/2020/11/11/m1-macbook-air-first-benchmark/

    It appears that, at least in Geekbench's synthetic benchmarks, the M1's single-core performance isn't bad at all. Now I'm genuinely interested in how this will go...

  • raindog308 Administrator, Veteran

    @TimboJones said:
    Apple doesn't do server or enterprise stuff

    They did once. Their Xserve kit was very nice for the time but it's been dead for more than a decade.
