
Got myself a new work computer. Aside from setting everything back up, it's been an absolute treat. I didn't even have to move to Windows 11.

Why Dell feels the need to put 7TB of garbage, including literal adware that spews notifications, escapes me. All it does is hurt their reputation.

I would have been allowed to build my own from scratch, but I didn't even ask, since it's been so long since I built my last machine that I don't even know where to start hardware-wise these days.

12th gen i7

GTX 1080 with all the video memory I could need

RAM just pouring out of the thing

I'm living the life.

Comments
  • 4
    Dell is confused. They thought "Download More Ram" was "Download More Spam".
  • 2
    Dell. Reputation. Gnnnnihihihih.
  • 0
    @mansur85 don't be the hype guy
  • 0
    @mansur85 nope
  • 0
    @mansur85 don't be so 2021, let people do whatever they want.

    Besides, plain old Ubuntu with Budgie DE is good enough for me 👌
  • 0
    The GPU is of the 4th newest generation. I wonder if they charged your company as if it was the latest and greatest.
  • 0
    @electrineer I chose it because it's what I have at home and I know it works. Also, spending $1000 vs $600 for an unknown variable wasn't worth it for me. I don't do graphics work, but my tools use a lot of graphics memory, so cutting edge isn't terribly necessary.
  • 0
    @mansur85 I work in a Windows environment and our customers use mostly Windows. I'm also the main IT guy, as well as the lead developer, so I need to be able to support all of it with minimal headaches.

    I work in Debian with the Mate DE in a VM at home and at work, which works well. Migrating full-time to Linux would take weeks that my workload doesn't allow.
  • 0
    @mansur85 What do you mean, "no more Linux at home", and then Pop! OS?! Or did you mean "no more Windows at home"?

    Also, yeah, wait before buying any AMD GPU: you don't want to pay the high early price and, on top of that, get a worse product with shitty drivers. Buy later for less money and get a better product.

    Except if you do anything other than rasterised gaming with your GPU - then AMD is out of the question, because they have missed the future.
  • 0
    @mansur85 CUDA/ML/AI, RT, NVENC (for streaming), rendering.
  • 0
    @mansur85 ML/AI is not coming to AMD because the whole software stack is built on CUDA. That's because AMD has missed the boat for at least a decade. Unless they manage to offer something that can run CUDA applications, whatever AMD offers will be DOA. That's the price of being late.

    Similarly for RT - AMD does not have an alternative, but is trying to catch up. I think you're mixing up RT and DLSS.

    OpenCL totally sucks; Blender even removed it. AMD now has the HIP API in Blender instead, but its performance is abysmal compared to Nvidia's, so AMD being unsuited for rendering still holds.

    Actually, AMD is still struggling with total basics such as monitor suspend without a GPU crash and shit. That does get better late in the product cycle, though, which is why you shouldn't buy freshly launched AMD stuff.
  • 0
    @mansur85 Ofc AMD GPUs can do RT - just not with any serious performance. The 7000 series seems to be better at that, but still way behind Nvidia. In fact, the gap has widened because RDNA3 is lackluster while Lovelace is a leap forward.

    The 7040 is irrelevant because it's a laptop chip; you simply won't get anywhere close to 4080 performance in CUDA. Also, the HW is not the point: the SW ecosystem is on CUDA, and nobody will change.

    That's why I wrote that if AMD doesn't offer CUDA compatibility at the application level - at the very least at source level, better yet at binary level - then whatever AMD presents is DOA.
  • 0
    @mansur85 In fact, AMD isn't even trying anymore in laptops, desktops, and dGPUs. Their market share in these domains has gone down, not up.

    AMD doesn't care because these markets are collapsing, so AMD instead focuses on other, growing markets (such as data centre) because that's much more profitable.
  • 0
    @mansur85 I actually doubt that AMD will even be able to do that because they have that awkward split between their RDNA and CDNA product lines.

    Nvidia doesn't have that split, which is why CUDA has taken off so much: you can dual-use a consumer card for CUDA as well. AMD just won't achieve that amount of mindshare with split product lines.

    Also, even if AMD finally presents some useful SW ecosystem, if it's not compatible with CUDA then it's doomed from the beginning.

    So that's two obstacles right there, and you can always count on AMD fucking up in some weird way.

    Btw., I say that as a 6650 XT owner, and the GPU is nice for rasterised gaming on Linux - but it took AMD way too long to get their shit together, with their usual driver quality issues that impact both Linux and Windows.
  • 0
    @mansur85 Sure, AMD will be able to solve that in terms of profit. However, nothing suggests that they would improve their dGPU situation.

    In fact, RDNA3 pretty much falls short of the expectations AMD had stoked, in every way: performance, energy efficiency, drivers. Plus, the pricing is off. But so what - AMD makes its gaming money on consoles.

    So while AMD's recent earnings call was fine considering the market situation, part of the reason was that they pretty much abandoned the collapsing desktop CPU and dGPU markets as well as the laptop market.

    Btw., their earnings are much better than what the GAAP results superficially suggest. The non-GAAP results are more relevant because the difference is mainly the write-off of the stock traded for the Xilinx merger. These aren't actual costs.

    So while AMD seemingly just broke even, the actual organic 2022 profit is a billion dollars, with other segments making up for the losses on Ryzen and Radeon.