This might explain why they didn't want to play with Valve on SteamOS. That's going to be heavily OpenGL focused, due to Linux support.
Arkose: Back in the '90s many games supported multiple renderers, e.g. Deus Ex supported 3dfx Glide, Direct3D, OpenGL and S3 MeTaL in addition to a software renderer; having two renderers in 2014 onwards seems simple by comparison.
This is actually a great example of this problem. It uses the Unreal engine (first generation). This was written for software and Glide rendering, and worked great on 3DFX cards only. DirectX was added to support other cards, but was buggy, less pretty and much slower.

When the engine got ported to Linux they had to work on OpenGL (it existed but was marked 'experimental' - for very good reason). For early Linux ports you either ran Glide (and bought a 3DFX card) or had a hideously broken game. This was made worse when X4 only supported OpenGL, so you had to stick with old software as well.

Only by the end of the engine's life had they fixed the OpenGL renderer. Playing the latest patched versions of these games now hides most of these problems.
_Bruce_: This is actually a great example of this problem. It uses the Unreal engine (first generation). This was written for software and Glide rendering, and worked great on 3DFX cards only. DirectX was added to support other cards, but was buggy, less pretty and much slower.
DirectX wasn't very good at the time but that has been resolved. Regular DirectX will still see a lot of work with or without Mantle since those with Nvidia and older AMD GPUs will be relying on it. AMD are expecting developers to work with both simultaneously (some DirectX code will be directly reusable in Mantle according to the presentation).

Having two more or less equally viable APIs available also gives gamers more options; in the event of problems with the Mantle renderer players could switch to DX11 mode instead.

...of course all of this will depend on AMD being able to convince major developers to implement technology that is only usable by a subset of players.
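To make the dual-renderer idea concrete, here's a rough sketch of how an engine could hide the two APIs behind one interface and fall back to the DX11 path when Mantle isn't usable. All the names here (IRenderer, MantleRenderer, etc.) are made up for illustration; this is not actual DICE, Direct3D or Mantle code.

// Illustrative sketch only - hypothetical class names, not a real engine or the real Mantle API.
#include <iostream>
#include <memory>
#include <string>

class IRenderer {                        // common interface both backends implement
public:
    virtual ~IRenderer() = default;
    virtual void drawFrame() = 0;
};

class D3D11Renderer : public IRenderer {
public:
    void drawFrame() override { std::cout << "frame rendered via Direct3D 11\n"; }
};

class MantleRenderer : public IRenderer {
public:
    void drawFrame() override { std::cout << "frame rendered via Mantle\n"; }
};

// Pick the backend from a user setting and quietly fall back to D3D11 when
// Mantle isn't available (e.g. a non-GCN card) or the player has switched it off.
std::unique_ptr<IRenderer> createRenderer(const std::string& preferred, bool mantleAvailable) {
    if (preferred == "mantle" && mantleAvailable)
        return std::make_unique<MantleRenderer>();
    return std::make_unique<D3D11Renderer>();
}

int main() {
    auto renderer = createRenderer("mantle", false);  // e.g. an Nvidia user who asked for Mantle
    renderer->drawFrame();                            // silently takes the DX11 path instead
}

The game-side code above never touches either API directly, which is roughly why a broken Mantle renderer could just be toggled off without breaking the rest of the game.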
I don't see the problem. Mantle is not getting rid of DX; that will likely not happen any time soon. It works in conjunction with DX11, and both APIs will be usable in Battlefield 4.
crazy_dave: I think the OP is more worried that AMD's new API will reintroduce the bad old days of device fragmentation in graphics for games than proclaiming AMD cards to be the future. :) Myself, I am pretty much GPU agnostic.
StingingVelvet: I know, my upgrade was just on my mind so I turned it into that.

Segregation... yes. And companies don't care, all about the bottom line.
Companies do sometimes care about technologically segregated markets, as segmentation can damage profits.

Look at PhysX; nVidia spent a fortune acquiring a company for a technology that has barely been used, because hardly anyone wanted to take a risk on a technology that only gave a marginal benefit to about half of their customer base. nVidia might not have wasted the money, but it doesn't look like they made a great investment.

In this case the tech offers a potentially massive benefit and has a large market. Even just claiming to be open is likely to encourage developers to adopt the technology, figuring that they'll hit a big market and hoping the last big segment adopts the tech in the meantime.

At best this is a breath of fresh air that will finally give us a cross-platform, game-focused, high-performance graphics API. At worst it fails, AMD can't afford to back down, and it's Glide all over again, but with the fringe benefit of easier console ports for some PC users.
EDIT: Ninja'd, I see others are already discussing PhysX.

crazy_dave: I think the OP is more worried that AMD's new API will reintroduce the bad old days of device fragmentation in graphics for games than proclaiming AMD cards to be the future. :) Myself, I am pretty much GPU agnostic.
Didn't we already have this, e.g. in the form of NVidia PhysX? I recall AMD people complaining about some NVidia-optimized PC games running poorly on AMD hardware, probably due to PhysX. Or else the AMD users would have to switch off PhysX and miss some extra debris flying around, etc.

Maybe AMD feels now is payback time... But as said, end users lose in this fragmentation. I always wondered how AMD was going to use its PC leverage from being the sole graphics hardware provider for both next-gen consoles... and this seems to be it.
Post edited September 26, 2013 by timppu
StingingVelvet: I'm buying a new video card next week or so, but I just can't consider ATI. I've been into nVidia a long time and am just used to it, and the one time I tried an ATI card it underperformed the card I replaced, which was shocking.

Sorry fans :(
It's ok, everyone has a little bit of fanboy inside. For example, I had a terrible experience with Nvidia and I'm not considering buying one in the foreseeable future.

But still, it would be an extremely silly thing to buy an Nvidia card right now. From early benchmarks it seems that the R9 290X is a little bit more powerful than the Titan, and it is supposed to cost $400 less than the Titan, so huge price drops from Nvidia are unavoidable and they will happen very soon. So if I were you, I would wait a few weeks for the price drops.
Post edited September 26, 2013 by Aver
Aver: But still, it would be an extremely silly thing to buy an Nvidia card right now. From early benchmarks it seems that the R9 290X is a little bit more powerful than the Titan, and it is supposed to cost $400 less than the Titan, so huge price drops from Nvidia are unavoidable and they will happen very soon. So if I were you, I would wait a few weeks for the price drops.
Price thing is a decent point. I might wait, not sure.

As for ATI being faster etc., my experience is that despite better tech specs sometimes ATI cards still lag behind because their drivers are much worse. That's a big reason I won't use them again.
Aver: But still, it would be an extremely silly thing to buy an Nvidia card right now. From early benchmarks it seems that the R9 290X is a little bit more powerful than the Titan, and it is supposed to cost $400 less than the Titan, so huge price drops from Nvidia are unavoidable and they will happen very soon. So if I were you, I would wait a few weeks for the price drops.
StingingVelvet: Price thing is a decent point. I might wait, not sure.

As for ATI being faster etc., my experience is that despite better tech specs sometimes ATI cards still lag behind because their drivers are much worse. That's a big reason I won't use them again.
The thing about AMD having worse drivers is a myth to me. Look at the tech forums for Tomb Raider, The Bureau or Hitman and take a look at how many nvidia users are complaining about performance. You might say that TR and Hitman are partnered with AMD. But which games aren't these days? :P Also, The Bureau was partnered with nvidia and still a lot of nvidia users are complaining about performance on its forums.

Whenever I ask someone who claims that nvidia has better drivers what they mean by that, they never know what to answer. Everyone has just heard it, so they say it.

I just care about frames per second. I get more of them for the same amount of cash, so I don't care if someone says I have worse drivers. At least TR didn't crash on me when I turned tessellation on ;).
Post edited September 26, 2013 by Aver
Aver: Whenever I ask someone who claims that nvidia has better drivers what they mean by that, they never know what to answer. Everyone has just heard it, so they say it.
A lot of developers, even Valve themselves, have said nVidia works with them much more than ATI. Also, I tried an ATI card as recently as 2 or 3 years ago during my last upgrade, and despite it being 2 generations ahead of my nVidia card I noticed little to no real improvement in any of my games that didn't have ATI partnerships.

So, in my experience it's not a myth. Things might have changed, but it's hard not to stick with what you know.
Aver: Whenever I ask someone who claims that nvidia has better drivers what they mean by that, they never know what to answer. Everyone has just heard it, so they say it.
StingingVelvet: A lot of developers, even Valve themselves, have said nVidia works with them much more than ATI. Also, I tried an ATI card as recently as 2 or 3 years ago during my last upgrade, and despite it being 2 generations ahead of my nVidia card I noticed little to no real improvement in any of my games that didn't have ATI partnerships.

So, in my experience it's not a myth. Things might have changed, but it's hard not to stick with what you know.
Oh come on. This is just an ass-kissing contest. I can post tons of comments from devs saying that AMD works with them much more than Nvidia, like Eidos Montreal or Irrational Games. They just praise their partners because they are paid to do so. This whole praising is complete bollocks - just look at The Bureau or Resident Evil 6; if nVidia worked closely with devs then maybe those two ports wouldn't be so hideous. It's weird that they wanted their logo on them.
Aver: Oh come on. This is just an ass-kissing contest.
Haha... maybe, maybe. We all go with what we know, I guess.

I'll read some articles about ATI's latest slate and see if it sways me.
StingingVelvet: I'm buying a new video card next week or so, but I just can't consider ATI. I've been into nVidia a long time and am just used to it, and the one time I tried an ATI card it underperformed the card I replaced, which was shocking.

Sorry fans :(
Creative 3D Blaster and 3dfx Voodoo forever!
I've owned two nvidia cards since I started PC gaming. The first was back around 2002 and had horrible image quality that made the screen look like it was covered in a thick coating of dust no matter what I did. The second, which I got a couple of years ago, had a RAM chip just fall off after about 7 months.

I did have a Radeon 9800 die on me after a few years as the power connector on the card burnt, but my other ATI cards since then have been fine.
Fictionvision: I've owned two nvidia cards since I started PC gaming. The first was back around 2002 and had horrible image quality that made the screen look like it was covered in a thick coating of dust no matter what I did. The second, which I got a couple of years ago, had a RAM chip just fall off after about 7 months.

I did have a Radeon 9800 die on me after a few years as the power connector on the card burnt, but my other ATI cards since then have been fine.
I've had pretty bad experiences with ATI HD4850 and HD4870 cards. My first, a PowerColor HD4850, ran too hot and died with artifacts. Then I exchanged it for an XFX HD4870, which crapped out with a white screen in certain games due to crappy RAM. And then I got a Sapphire HD4870, and that one died after some small chip thingy fell off near the VRM. So I'm with NVIDIA now - a Gigabyte GTX 660 TI - and it works fine except for a small problem with the fan construction, which I could fix with a tiny bit of rubber.