7
kiki
3y

Apple denies jelly scroll problem on new iPads: https://arstechnica.com/gadgets/...

This is nothing new: Steve Jobs himself told people to “just avoid holding the iPhone 4 that way” in response to users being upset that the iPhone 4 lost signal.

Apple is the WORST at handling user feedback, on par with Microsoft sponsoring concentration camps for immigrants (https://github.com/drop-ice). Though I still stand by my claim that Apple products are engineering marvels.

Comments
  • 0
    I remember some Android phone had a similar issue years ago. I recall the display was mounted upside down on purpose, which made the scrolling look weird.
  • 4
    Apple products aren't engineering marvels. Their engineering is just as shitty as their customer service: https://youtube.com/watch/...

    The only thing that Apple is excellent at is marketing and brainwashing. It's a cult.
  • 0
    @Fast-Nop is there another 900 g 12” laptop out there?
  • 0
    @kiki That's not marvellous engineering because making very small things pretty light isn't difficult. A kitten is even lighter than that and actually more useful.
  • 0
    @Fast-Nop but still, I need my laptop to be that light and thin because I like it. Does the non-Apple world have anything to offer? I’m not being sarcastic, just genuinely interested.
  • 1
    Oh yes, engineering marvels. Which are broken and underpowered by design, and when someone points it out, Apple first deletes any forum post about it, then says the issue is super rare, and only after they get sued into oblivion do they offer a repair program for less than half of the affected devices.

    And people keep rewarding that behaviour by buying everything while iPhone prices start to follow Moore's law.
  • 2
    @kiki several exist.

    The ultrabook trend has died down a bit, but that category always includes models weighing less than 1 kg in total.

    Fujitsu Lifebook UH, LG Gram, Lenovo X1, HP Aero / Dragonfly,
    ...

    Apple isn't as revolutionary as they claim.
  • 1
    @kiki LG Gram. Less than 1kg and 17'' screen.

    Anyone can build a light laptop if it's like 13'' and comes with no cooling.
  • 0
    @kiki I still have an ancient netbook even smaller than that, from the netbook hype back then.

    Given that Apple's current M1 lineup starts at 13.3", I guess even Apple figured out that there's no real market for 12" laptops. 13.3" is already at the extreme end, designed more to be carried between docking stations than to be used as a standalone mobile device.
  • 1
    https://notebookcheck.com/Fujitsu-U...

    E.g. here: 749 grams.

    I think there were more extreme versions, going down to 600 grams, but since I mostly skim news pages, I could be wrong.
  • 1
    Btw, the current M1 laptops are an excellent example of the "marvellous" Apple engineering. Soldered RAM at two times the market rate, and soldered flash at even three times the market rate. Soldered to shut out competitor parts, obviously.

    My Linux laptop with 32GB RAM and 2TB SSD clocks in at half of what an M1 laptop would cost with these specs. Oh wait, 32GB RAM isn't even possible because 16GB is the maximum. That's what my previous PC had back in 2010.
  • 0
    @Fast-Nop I don’t care much about RAM and SSD capacity. I want my laptop to be thin, beautiful, and to weigh less than a kilo. The MacBook 12 is absolutely perfect for me, though the Core m3 is somewhat weak; that’s why I got a MacBook Air with the M1.

    I’m disappointed by what Apple does to its hardware to prevent me from running whatever the hell I want. Yes, I’m kinda starting to look around for a non-Apple laptop, but I don’t like Windows, and the software I use, like Affinity Photo, can’t be installed on Linux without workarounds.
  • 1
    @Fast-Nop I expect more stuff to be integrated into or near the SoC as 3D, interposer, chiplet, and network-on-chip technology gets better. Long physical traces are just too much of a performance bottleneck, especially now that memory and data movement are usually the biggest bottleneck. Phone SoCs already have RAM much closer than desktop SoCs; it's manufacturing difficulties and vendor flexibility that hold them back from fully integrating the thing. Also thermals, I guess, but that's under active development too. You know, as they say in signal integrity: "if it's large enough to be visible to the naked eye, it's going to be a performance bottleneck". Enjoy that pluggable SSD and RAM while you can. It's too late for GPUs, northbridges, southbridges, and network/sound/USB cards.

    It's just like way back when caches sat on a physically separate part that you could swap out, yet nobody complains about caches being fabricated onto the CPU die now, because that's the only way to get single-digit access times. Integration is pretty much the way forward. Apple's just ahead of the curve in bringing it in because they have a vertical slice of the market and a captive ecosystem (and a bunch of very good engineers and a ton of money to spare). I'd expect it to be a growing trend.
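
    To make the "data movement is the bottleneck" point concrete, here's a minimal pointer-chasing sketch (my own illustration, nothing Apple-specific; assumes a POSIX system and something like gcc -O2): walk a dependent chain that fits in cache versus one that spills to DRAM, and the time per load jumps by roughly an order of magnitude.

    /* Minimal pointer-chasing sketch: average latency of dependent loads for a
       working set that fits in cache vs. one that spills to DRAM. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static double chase_ns(size_t n, size_t iters) {
        size_t *next = malloc(n * sizeof *next);
        size_t *perm = malloc(n * sizeof *perm);
        if (!next || !perm) exit(1);

        /* Random permutation so the hardware prefetcher can't predict the walk. */
        for (size_t i = 0; i < n; i++) perm[i] = i;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % (i + 1);
            size_t t = perm[i]; perm[i] = perm[j]; perm[j] = t;
        }
        /* Link the permutation into one big cycle of indices. */
        for (size_t i = 0; i < n; i++)
            next[perm[i]] = perm[(i + 1) % n];

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        size_t p = 0;
        for (size_t i = 0; i < iters; i++)
            p = next[p];                  /* each load depends on the previous one */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        volatile size_t sink = p;         /* keep the loop from being optimized away */
        (void)sink;
        free(next); free(perm);

        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        return ns / (double)iters;        /* nanoseconds per dependent load */
    }

    int main(void) {
        /* ~256 KiB working set (cache-resident) vs. ~256 MiB (DRAM-resident). */
        printf("in cache: %.1f ns per load\n", chase_ns(32 * 1024, 50000000));
        printf("in DRAM:  %.1f ns per load\n", chase_ns(32 * 1024 * 1024, 50000000));
        return 0;
    }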
  • 2
    @RememberMe Apple's flash is not faster than an NVMe SSD. Apple's RAM is not faster than any other LPDDR4. They're just way more expensive for nothing but the Apple sticker.

    Also, they are not invisible to the naked eye - unless you count putting on half of a metal cover sheet as being on the tech forefront because tHiNk DiFfErEnT.
  • 0
    @RememberMe I'm not so sure about that cache analogy. The payoff for integrating much more memory than that is lower compared to the cost and complexity, and in most applications you just don't need it. It will be slower than cache either way. In high-performance applications where this makes sense they're already experimenting with this. But not for touchbars and animoji shit.

    Want more speed in consumer hardware? Put sufficient cooling and power delivery in there and boom, you now have much better performance - or to be more precise, the performance users actually paid for and never received. A large chunk of the rest is shitty software. And you can't fix that with integrated RAM, at least not for more than 1-2 years. Because at that point the software industry will have caught up again with even shittier programs, for which you now need more expensive hardware. Fantastic.
  • 0
    @RememberMe

    In my opinion, your statement is a bit too much of a "mingle-mangle".

    An ARM SoC like the one Apple built with the M1 is nothing spectacular in itself.

    What Apple did was a lot of fine-tuning and, in a nutshell, what AMD and Intel have been doing for decades: squeeze everything out of the die that can be squeezed.

    I'm not saying that it's not an achievement, just that it isn't as spectacular as some press guys try to make it sound.

    The part where Apple excels is their closed ecosystem. And this has a lot to do with why and how the M1 came to be, and why Apple choosing a custom ARM SoC design was, all in all, only logical.

    After all, that's what the M1 is: a custom-tailored, purpose-designed puzzle piece that combines years of effort from multiple vendors.

    Regarding the "SoC" is always a better option...

    The trouble with ARM SoCs is the lack of standardization - Device Tree Blob / Device Tree Source being one example.

    Regarding the interconnects between (CPU) cores and devices: AMD and Intel have been working on that topic for ages.

    E.g. AMD's move from Infinity Fabric to Infinity Architecture.

    Plus, PCI Express has kept its promise to double the bandwidth with each generation:

    2010: 8 GT/s (3.0)
    2017: 16 GT/s (4.0)
    2019: 32 GT/s (5.0)
    2021: 64 GT/s (6.0)

    The problem is increasingly how to actually saturate the available bandwidth - especially software-wise.
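
    To put those transfer rates into perspective, here's a quick back-of-the-envelope conversion from GT/s per lane to usable GB/s (my own sketch; 3.0-5.0 use 128b/130b line encoding, and the 6.0 efficiency is only an approximation for PAM4 plus FLIT/FEC overhead):

    /* Rough PCIe throughput: GT/s times encoding efficiency, divided by 8 bits. */
    #include <stdio.h>

    int main(void) {
        struct { const char *gen; double gt_s; double efficiency; } g[] = {
            { "3.0", 8.0,  128.0 / 130.0 },   /* 128b/130b line encoding */
            { "4.0", 16.0, 128.0 / 130.0 },
            { "5.0", 32.0, 128.0 / 130.0 },
            { "6.0", 64.0, 0.98 },            /* approximate: PAM4, FLIT mode, FEC */
        };
        for (int i = 0; i < 4; i++) {
            double lane_gbs = g[i].gt_s * g[i].efficiency / 8.0;  /* GB/s per lane */
            printf("PCIe %s: ~%.2f GB/s per lane, ~%.0f GB/s for an x16 link\n",
                   g[i].gen, lane_gbs, lane_gbs * 16.0);
        }
        return 0;
    }

    So an x16 5.0 link already moves roughly 63 GB/s per direction - which is exactly why saturating it from software is the hard part.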

    It's good that Apple has become a competitor and is no longer Intel's cash cow, but I think there's a lot of movement going on in every direction.

    E.g. Intel is currently restructuring every department and Gelsinger is very keen to take the crown back, while AMD now has to keep fighting an uphill battle, even against a few of their own "old" engineers (e.g. Mr Koduri).

    All in all, it has become more interesting.

    But I'm really tired of the Apple hype: after all, Apple has built on the knowledge of many - like any other vendor - and is hyped as if they invented everything on their own. Which they didn't...
  • 0
    @Fast-Nop missed my point entirely (also, I should have said power efficiency as the other benefit) - note that I said it's difficult to integrate them on the same die, and I'm pointing out the *trend* compared to fully discrete systems like most laptops. Integration is natural. Phone SoCs already do that. Which is my point here: AS is descended from phone SoCs rather than PCs. If you want to make comparisons, compare there.

    Also, are you looking at standard marketing numbers, stuff like aggregate bandwidth? Those don't mean much; it's regular LPDDR4X memory on a 128-bit bus and a regular SSD at the end of the day, and since it's still going off-chip and through standard memory controllers, it'll look about the same as any other. Look at utilization, power consumption, memory sharing, transaction efficiency, prefetching efficiency, scheduling efficiency, and (most importantly) actual system performance and efficiency under realistic load instead. It's a bit like taking GPU or AI accelerator performance numbers at face value - meaningless without context, and in general you'd be lucky to get even a third of that in a realistic application, and probably worse on average.
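
    As a concrete example of what "realistic load" means: even a simple STREAM-style triad loop (a generic sketch, nothing Apple-specific; assumes a POSIX system and something like gcc -O2) measures sustained bandwidth, which is the number actually worth comparing between machines, not the aggregate figure on the spec sheet.

    /* STREAM-style triad: sustained memory bandwidth under a simple realistic load,
       as opposed to the peak bandwidth quoted on the datasheet. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N    (1u << 25)      /* 32M doubles per array, ~256 MiB each */
    #define REPS 10

    int main(void) {
        double *a = malloc((size_t)N * sizeof *a);
        double *b = malloc((size_t)N * sizeof *b);
        double *c = malloc((size_t)N * sizeof *c);
        if (!a || !b || !c) return 1;
        for (size_t i = 0; i < N; i++) { a[i] = 0.0; b[i] = 1.0; c[i] = 2.0; }

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int r = 0; r < REPS; r++)
            for (size_t i = 0; i < N; i++)
                /* triad: two reads, one write; the scalar varies per rep so the
                   outer loop can't be folded away by the optimizer */
                a[i] = b[i] + (double)(r + 3) * c[i];
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs  = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        double bytes = (double)REPS * 3.0 * N * sizeof(double);
        printf("check %.1f, sustained: %.1f GB/s\n", a[N - 1], bytes / secs / 1e9);

        free(a); free(b); free(c);
        return 0;
    }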

    If you still disagree, then eh, I've exhausted my arguments. I'll claim the engineering is great. Not magic fairy dust, it's "great". I'm no fanboy, I'm talking from my own experience as a daily user of an AS Mac and from the verifiably awesome numbers it puts out. Their scummy policies as a company have nothing to do with how good their silicon engineering teams are.
  • 0
    @deadlyRants agreed, and I agree the cache example was extreme, but that's taking only current applications into account. To get serious performance jumps, you need hardware - software will give you incremental updates at best, and it's too much to expect most people to actually fix software given how the market works. At the end of the day, it's absolute performance/efficiency given the state of the market that matters, not what we "should" be doing.

    As a hardware person, I'm all for better software, but it's basically not going to happen and it doesn't give you the kind of improvement you need to stay in the market.

    @IntrusionCM not sure what your point is, tbh. Also, Apple hate is as annoying as Apple hype. I don't care much either way as such; I'll use pretty much anything based on what the best device is for that use. If that's a Mac, I'll use a Mac.
  • 1
    @RememberMe In the M1, the RAM and the flash are not on the same die as the CPU, so they are not integrated. They are soldered to shut out market competition and demand two and three times the market rate, respectively. That's not great engineering, that's ripping off customers.

    Energy efficiency - a CPU on 5nm is more efficient than the competitors on 7nm or even larger? Well yeah, I didn't say TSMC was bad.

    Take their competitive situation into account as well. Intel has done nothing for years and instead bought back shares, indicating that they had no idea what to do with the money. AMD had been in a tight spot because of Intel's outright criminal anti-competitive measures and has only now wriggled itself out, so that they have considerable R&D money.
  • 1
    @RememberMe I think my brain produced a fart.

    I don't hate Apple _hardware_ per se, but I was pretty annoyed by the hype around the M1 and how some described it as something "revolutionary that only Apple could do".

    Dunno why my brain read that into your post; the mingle-mangle remark was pretty much because Intel and AMD have been working on interconnects for a long time, and PCIe as an interconnect offers bandwidth that is hard to saturate...
  • 1
    @Fast-Nop The M1 is a stretched phone SoC. Phone SoCs have been doing this since forever. What I've been trying to say all along is that, just like phones have basically eliminated customization, expect laptops to do so as well for the reasons outlined above, and this is only the start, because Apple has the ecosystem that will accept this early. Also, it's a bit more than just ye olde soldering - it's a system-in-package design (although yes, you can technically remove stuff manually, unlike in an SoC). As a *trend* it's going to lead to more chiplets, more stacking, and more packaging, so the days of customisable parts are likely numbered for now. That last bit has nothing to do with Apple; it's an industry-wide trend - packaging/integration just tends to be more efficient.

    I don't deny that it lets them make upgrades costly, but stopping the discussion there is premature and leads nowhere w.r.t the engineering. There's more to it than just gouging the market, that just happens to be a "happy" side product.

    Also, 5nm is only half the story - 5nm brings impressive gains, but it isn't everything (look at Qualcomm's botched SD 888). Yes, like I've said before, it's not magic fairy dust or worth the hype it gets, but it's not worth the hate either.

    Also, new M1 Pro and M1 Max chips look pretty formidable, although GPU still looks iffy. Not worth an upgrade for me though, my fanless M1 is still the best for me.
  • 1
    @RememberMe Again, the flash and RAM are NOT integrated. They are NOT on the same die. It is indeed just ye olde soldering that cuts down a little bit of cost, but more importantly allows Apple to rip off its victims, who buy into that crap because they can't purchase these parts on the free market. There is nothing innovative about that approach and no engineering merit. It's just a blatant rip-off, as you'd expect from Apple.

    Leveraging vendor lock-in is in no way innovative, and it rightfully punishes people who have been gullible enough to buy into a proprietary, locked platform in the first place.

    The rest of the world has competition of many different suppliers and hence defined interfaces for interchangeable parts - which also reflects in the price level.
  • 2
    @Fast-Nop well, not much of a conversation here then. Good talk though, as usual.
  • 0
    @RememberMe why not upgrade from, say, the M1 to the M1 Pro?
  • 0
    @kiki because I don't need it. My Air does everything I want from it, which is precisely why I chose it.
  • 0
    @RememberMe mine too, but that 120 Hz screen…
  • 1
    @kiki that *is* nice but not enough to justify a whole upgrade. I don't notice the lack of high refresh rate on my Mac as much since it's a small screen, and I'd probably keep it off anyway for extra battery.

    I have a 34 inch 144Hz gaming monitor anyway, so it's not like I'm missing out.