31 Comments
  • 9
    Apple will always suck, don't even try to change my mind.
  • 3
    Isn't that statement referring to the 500W just for graphics and 28 teraflops of the new Apple workstation? From reading the description it sounds like it will be a high-end workstation for movies and the like.
  • 2
    Where is ATI/AMD? Is this the full meme?
  • 3
    @FrodoSwaggins Oh, I do use logic. Thing is, after what Apple pulled off in general, I just don't see anything good about them anymore.
  • 3
    @asgs The Apple GPU uses Radeon
  • 2
    A couple folks here talking about power efficiency, but the quote quite clearly states "powerful".
  • 2
    @FrodoSwaggins Their insane prices and that's all I hate them for :D
  • 1
    @FrodoSwaggins I don't agree with that. Nvidia's cards are certainly getting damn expensive and are approaching Apple price-wise, but the quality is better, so they aren't overpriced.
    Nvidia makes the best drivers. They favor Windows operating systems, but regardless of that, the quality speaks for itself.

    AMD has great products, not gonna lie. About a month ago, I decided to build a PC. I compared every single Nvidia gfx card to its AMD counterpart. I was about to buy the RX 5700 XT, but the drivers were shit, even on Linux-based operating systems, where the "amdgpu" driver is normally built into the kernel.
    In the end I bought the RTX 2070 SUPER due to its driver stability, and got extra features such as DLSS 2.0 (it basically renders at a lower resolution, upscales the image, and the built-in AI restores the detail you would normally see if you had set the resolution to, let's say, 4K.
  • 1
    ... That may sound like extra work for nothing, but in gaming this makes a huge difference FPS-wise) and Raytracing on a hardware level (faster calculations and fewer resources wasted at the software level).
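    To put rough numbers on that upscaling idea, here is a sketch using only the standard resolutions (not measured DLSS gains): rasterization cost scales roughly with pixel count, so rendering internally at 1080p and upscaling to 4K skips most of the per-pixel work.

```python
# Sketch: why rendering at a lower resolution and upscaling boosts FPS.
# Rasterization cost scales roughly with the number of pixels shaded.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels at 4K
internal = pixels(1920, 1080)    # 2,073,600 pixels actually rendered

ratio = native_4k / internal
print(f"4K shades {ratio:.0f}x the pixels of 1080p")  # -> 4x
```

    Of course the AI upscaling pass has its own cost, so real-world gains land well below 4x, more like the 20%+ FPS jumps mentioned in the benchmarks.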
  • 0
    On top of that, Nvidia gfx cards are known to be power efficient and consume less than AMD gfx cards on average.

    That being said, AMD gfx cards might be sold at a lower MSRP, but in the long term the power efficiency of the Nvidia gfx cards makes up for it.
  • 0
    What I like about AMD gfx cards is that they endure overclocking very well, unlike Nvidia gfx cards.
  • 1
    @FrodoSwaggins their i-device CPUs have a lot of fixed-function hardware that is very fast at a specific task, but that doesn't make them competitive with Intel offerings unless all you do is transcode H.264 or debayer photos...
  • 2
    but they're right.

    the imagination of their fanboys is better than any hardware in existence.
  • 0
    @FrodoSwaggins whoaa! thanks bruh :)
  • 0
    @Ranchu LOL, that's . . .
  • 0
    @asgs ya that's all ,
  • 0
    @Demolishun lmao .....
  • 0
    @stonestorm yap!
  • 0
    @Midnight-shcode it's the truth bro! hahah
  • 0
    @FrodoSwaggins whoaaah!
  • 0
    @-ANGRY-STUDENT- whoaaa, new science, thanks bruh
  • 1
    @-ANGRY-STUDENT-
    I'm afraid it is mostly marketing bullshit.
    Especially the "Raytracing" is offensive. It does not offload anything!!! A lot of professionals were really stoked about real-time ray tracing. Unfortunately the hardware can't do it at all. What it can do, when games specifically implement it as an extra feature, is produce better lighting. So nothing gets faster by offloading; it's an extra. Don't get me wrong, Nvidia did a great job, but just like with pixel shaders, something feels off when you compare it to the real thing/real life. In some scenes it looks brilliant, in some it misses the mark. Overall it's a great improvement, but it should not be allowed to be called ray tracing. That is just marketing, so people who look it up think this shit will be photorealistic.

    The scaling trick assumes the render engine is doing it wrong. Sometimes this really helps; sometimes you get details that just don't represent well at a lower resolution.
    This, again, is mostly created for marketing purposes. You don't have a 4K screen? Don't worry, we will get you 4K quality anyway. Not even close...
  • 1
    @hjk101 true, I've seen video benchmarks and reviews of DLSS and especially DLSS 2.0.
    DLSS was a miss. It barely changed the FPS at the same resolution, but then DLSS 2.0 was released, and when it was enabled on the gfx card, it made an FPS jump of at least 20% at 1080p.
    At around 4K the trick will barely change the FPS, but only time can tell what the future beholds.

    About the Raytracing feature...
    After building the PC I installed the Minecraft Raytracing beta version of the game (it is not yet available as a stable release), and it made a huge difference visually compared to my AMD RX 560 gfx card, which didn't have that feature.
    I didn't test it performance-wise though, and honestly I am a bit too lazy to do that kind of experiment haha.

    Thus yes, games still need to be optimized for this particular Raytracing feature of those gfx cards (calling GPU-specific hardware-level APIs instead of game APIs (software layer)), but I might be wrong about that.
  • 1
    @-ANGRY-STUDENT-
    No, you are correct: they need to call a specific API for "ray tracing", and yes, it makes things look a lot better. I wholeheartedly love the technology; I just hate the name they gave it.
    I hope 3D software will use it too for better instant previews, and that it will become more accurate over time.
  • 2
    Apple: we have the best CPUs

    Amd: hold my beer
  • 1
    @dontbeevil jaajajajaj 🤣
  • 1
    @-ANGRY-STUDENT- don't know what research you've been doing, but that is the most bullshit reason I've ever heard for not buying AMD... https://youtu.be/IK_Ue4d9CpE
    The 2070S is 6% to 7% faster than the 5700 XT but 20% to 30% more expensive... Even with DLSS it's 9% faster and still not worth the price. Also, you've completely ignored the fact that you can still upscale to 4K with the 5700 XT and use their Radeon Image Sharpening feature, giving you relatively the same result as DLSS...
    https://techspot.com/article/...
    As for the drivers, I've had plenty of issues with my GTX 1060; I even complained on their forums, where no one bothered to help https://nvidia.com/en-us/geforce/...
    Also, Linux is known for always being late at updating drivers, especially on Nvidia hardware, because of their closed-doors approach...
  • 1
    @-ANGRY-STUDENT- I tried lots of distros, and the most stable one I could find that worked with my Nvidia card was Linux Mint, an Ubuntu-based distro; however, when I tried installing Ubuntu itself it glitched out... It seems that AMD cards are more compatible, since their solutions tend to be open source, which appeals to the community...
    Also, personally I would never go for 4K gaming, since even the most powerful cards can barely run at 60 FPS with high settings. Spending all that money on hardware and a monitor just to use some scaling gimmick, be it from AMD or Nvidia, I would just feel robbed... I would much rather buy a 1080p 144 Hz IPS monitor and a decent card like the 5700 XT; with the very powerful Ryzen 3600 CPU that would be an extremely future-proof system
  • 2
    @REXTON haha, another AMD fanboy.

    "Even with DLSS it is 9% faster".
    Idk where you pulled that number from, but that is not true.

    I've been talking about DLSS 1.0 being a miss, but 2.0 being a success.
    With DLSS 2.0 on, you get a performance boost of at least 30% and in some rare cases even up to 90%.
    https://m.youtube.com/watch/...

    "As for the drivers, I have had issues with my [random nvidia card]"

    I don't care about your graphics card. It is not a 2070 Super and it also is not an RX 5700 XT. It has nothing to do with this comparison.

    I do own a 2070 Super myself, and it runs Arch Linux, Kubuntu, and Windows 10 out of the box.

    Here you can read about the AMD driver issues of that specific card.

    https://community.amd.com/thread/...
  • 2
    @REXTON

    Btw, it seems like you have totally ignored my power consumption argument.

    2070 S = 215 W
    RX 5700 XT = 225 W

    This pays off in the long term.

    https://pcmag.com/comparisons/...

    Reread my fucking comment, you illiterate piece of dogshit
  • 1
    @-ANGRY-STUDENT- yes, the fact that I currently own zero AMD products makes me an AMD fanboy...
    Your insults are just proof of how much of a moron you are... The only reason I'm continuing this conversation is that I hope someone with more brains doesn't buy into your bullshit...
    Since 10 W is such a HUMONGOUS difference, why don't you show us some calculations to prove just how much you're spending on power if, let's say, you're gaming 8 hours a day for a whole year... I'd do it myself, but since I'm a proven AMD fanboy I might be biased...
    I said I had trouble with my "random Nvidia card", but if you had bothered to check the thread I posted (guess that also makes you a dog shit), you would've seen that it affected every single Nvidia card at the time... Talk about quality drivers...
    I lol'd when you said you're an Arch user; that explains the toxic attitude
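    Fine, here's the back-of-the-envelope sketch anyway. The 215 W / 225 W figures are the ones quoted above; the 8 hours a day is from this comment; the $0.15/kWh electricity rate is just an assumed example, pick your own local rate:

```python
# Sketch: yearly electricity cost of the 10 W gap between the two cards,
# assuming 8 hours of gaming per day and an example rate of $0.15/kWh.
watt_gap = 225 - 215              # RX 5700 XT vs 2070 Super, from the thread
hours_per_year = 8 * 365          # 2,920 gaming hours per year
kwh_per_year = watt_gap * hours_per_year / 1000
cost_per_year = kwh_per_year * 0.15   # assumed electricity price
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")  # 29.2 kWh -> $4.38
```

    A few dollars a year. Decide for yourself whether that pays off the price difference.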
  • 1
    @REXTON says "that explains the toxic attitude", but is the same person who started this heated discussion by calling my side of the argument "most bullshit".

    Imma tell you one thing:
    If you are not ready to get punched in the face, do not punch other people in the face.

    Do not be a bitch now. I'm heading out.
  • 0
    @REXTON jajajajjaa
  • 1
    Well I feel bad now.
  • 0
    @Ranchu lmao 🤣