The future of DirectX


Logical
03-16-11, 01:41 PM
I've just read a fascinating article over at Bit-tech about the advantages and disadvantages of going through an API to develop games.

Link (http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1)

I didn't realise that DirectX held developers back so much, but would it really be beneficial for developers to code directly to the hardware, with all the different GPU architectures out there? Maybe the standardised OpenCL and DirectX just need a lot of improvement to give developers more freedom. Either way, the future for DirectX doesn't look too great in terms of hardware and software capability.

Ancient76
03-16-11, 01:50 PM
DX has no future.

LydianKnight
03-16-11, 03:12 PM
Free access to the hardware in the PC realm nowadays???? God... NO!

It works for consoles, where you need to optimize your software as much as you can to extract all the 'juice' from the console, but on a PC? I mean, yeah... it would be nice, but not as long as you have the freedom to choose among a myriad of different components.

If the PC ever gets to the point of becoming a kind of 'unified' architecture (and not the horrible mess it is currently), maybe... but optimizing your code to run on such a variety of different configurations? No way... (I wouldn't wish that even on my worst enemy...) xD

stevemedes
03-16-11, 09:14 PM
As previously mentioned, with all that hardware variety, coders are going to need a wrapper.

mojoman0
03-16-11, 09:48 PM
DICE dev repi's input
http://forum.beyond3d.com/showpost.php?p=1535975&postcount=8

I've been pushing for this for years in discussions with all the IHVs: to get lower and lower level control over the GPU resources, to get rid of the serial & intrinsic driver bottleneck, to enable the GPU to set up work for itself, and to tear down both the logical CPU/GPU latency barrier in WDDM and the physical PCI-E latency barrier, to enable true heterogeneous low-latency computing. This needs to be done through both proprietary and standard means over many years going forward.

I'm glad Huddy goes out and talks about it in public as well; he gets it! And it's about time that an IHV talks about this.

This is the inevitable, and not too distant, future, and it will be the true paradigm shift on the PC that will see entire new SW ecosystems being built up, with tools, middleware, engines and the games themselves differentiating in a way not possible at all now.

- Will benefit consumers with more interesting experiences & cheaper hardware (more performance/buck).

- Will benefit developers by empowering unique creative & technical visions and with higher performance (more of everything).

- Will benefit hardware vendors by letting them focus on good core hardware instead of differentiating through software, as well as finally releasing them and us from the shackles of the Microsoft 3-year OS release schedule where new driver/SW/HW functionality "may" get in.


This is something I've been thinking about and discussing with all parties (& some fellow gamedevs) on different levels & aspects over a long period of time; I should really put together a proper blog post going into details soon. This is just a quick half-rant reply (sorry).

The best graphics driver is no graphics driver.
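To make repi's point concrete, here is a minimal sketch in C (with entirely hypothetical names and packet layout, not any real driver API) of the difference between pushing every draw through the driver and letting the game record GPU commands itself:


#include <stddef.h>
#include <stdint.h>

/* Hypothetical packet format a GPU front-end might consume directly.
   Real hardware command formats are proprietary; this is a sketch. */
typedef struct {
    uint32_t opcode;     /* e.g. 1 = DRAW */
    uint32_t param[3];   /* vertex count, vertex buffer handle, ... */
} GpuCmd;

static GpuCmd ring[4096];   /* command ring the GPU reads directly */
static size_t tail;

/* Instead of an API call that drops into the driver (state
   validation, resource patching, a possible kernel transition),
   the game just writes a few words into memory. */
void record_draw(uint32_t vtx_count, uint32_t vb_handle)
{
    GpuCmd *c = &ring[tail++ % 4096];
    c->opcode   = 1;
    c->param[0] = vtx_count;
    c->param[1] = vb_handle;
    c->param[2] = 0;
}

/* One submit per frame hands the whole ring to the hardware,
   instead of thousands of individually validated API calls. */


One submit per frame instead of a driver round-trip per draw is exactly the serial bottleneck and latency barrier he is talking about.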

ViN86
03-16-11, 09:49 PM
I've just read a fascinating article over at Bit-tech about the advantages and disadvantages of going through an API to develop games.

Link (http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1)

I didn't realise that DirectX held developers back so much, but would it really be beneficial for developers to code directly to the hardware, with all the different GPU architectures out there? Maybe the standardised OpenCL and DirectX just need a lot of improvement to give developers more freedom. Either way, the future for DirectX doesn't look too great in terms of hardware and software capability.

It would be a much better landscape if OpenGL were more competitive. Back in the days when OpenGL was a real rival, DX was moving forward at a rapid pace. Now that DX dominates the market, the push isn't there any more.

Toss3
03-17-11, 08:46 AM
*Something about consoles holding us back and how PC gaming is not dying*

Logical
03-17-11, 12:31 PM
Too often I have thought that maybe our GPUs aren't being used to their potential, which may be true in some ways, but it seems to be the API that is actually holding the GPUs back. If this is the case, why do GPU manufacturers build GPU capabilities beyond what the API exposes?

Does that make sense lol?

Bman212121
03-17-11, 02:30 PM
Not really sure what to think about the article. Obviously there is going to be a performance hit any time you use a middleman to get from point A to point B. In that one scenario they might have found an instance where DX isn't optimized for what they need to accomplish. Take tessellation as an example: Nvidia was the first to have everything in place for tessellation and could smoke ATI's cards in benchmarks that were heavy in that area. Go to an actual game and you'll never see that gap, because no one actually uses that type of setup. There are probably other ways to handle the same thing without running into limitations.

I really don't see DX going anywhere anytime soon, as it makes programmers' lives a lot easier. As stated twice already, a developer isn't going to waste their time optimizing their software for a whole lot of different cards. They've already shown how bad they can be when porting software, so if they had to program direct to the hardware, nothing would work right.

Here's another example of why I think this is never going to happen. It is very likely that we will see a shift towards more architectures in the near future. With MS already announcing ARM support for Windows 8, a program will not only need to run on many types of GPUs but also on multiple CPU architectures. Your one program will need to run on AMD, Intel, Qualcomm, Nvidia, TI and probably others. Then on the graphics side you'll need AMD, Intel, Nvidia, PowerVR and possibly more. Even if each company made their own API, you would still need to program for at least 3 different APIs. A company that is making a AAA title might be able to afford the software engineers to handle all of that, but I'm guessing most other software companies simply cannot.

Too often I have thought that maybe our GPUs aren't being used to their potential, which may be true in some ways, but it seems to be the API that is actually holding the GPUs back. If this is the case, why do GPU manufacturers build GPU capabilities beyond what the API exposes?

Does that make sense lol?

That is where they make their money. Innovation. There wasn't an API for surround gaming or 3D monitor support. It made more sense for Nvidia to come up with their own solution to the problem than to rely on someone else's implementation.

Redeemed
03-17-11, 02:35 PM
Not really sure what to think about the article. Obviously there is going to be a performance hit any time you use a middleman to get from point A to point B. In that one scenario they might have found an instance where DX isn't optimized for what they need to accomplish. Take tessellation as an example: Nvidia was the first to have everything in place for tessellation and could smoke ATI's cards in benchmarks that were heavy in that area. Go to an actual game and you'll never see that gap, because no one actually uses that type of setup. There are probably other ways to handle the same thing without running into limitations.

I really don't see DX going anywhere anytime soon, as it makes programmers' lives a lot easier. As stated twice already, a developer isn't going to waste their time optimizing their software for a whole lot of different cards. They've already shown how bad they can be when porting software, so if they had to program direct to the hardware, nothing would work right.

Here's another example of why I think this is never going to happen. It is very likely that we will see a shift towards more architectures in the near future. With MS already announcing ARM support for Windows 8, a program will not only need to run on many types of GPUs but also on multiple CPU architectures. Your one program will need to run on AMD, Intel, Qualcomm, Nvidia, TI and probably others. Then on the graphics side you'll need AMD, Intel, Nvidia, PowerVR and possibly more. Even if each company made their own API, you would still need to program for at least 3 different APIs.

This is essentially my stance on the whole issue.

I know that developers aren't taking anywhere near full advantage of all the CPU and GPU horsepower, not to mention RAM, available on your average desktop system - far from it. So much more could be done to optimize the code and make far greater use of the hardware's abilities.

However, this won't happen due to limited resources... far too many hardware variations for a company to be able to support them all.

That said, I'm absolutely certain that PC gaming would be in far better shape if MS hadn't abandoned it. If MS gave PC gaming as much love as they do Xbox, this platform would be much more utilized by developers, and the games on the platform would reflect it.

Yaboze
03-17-11, 02:37 PM
I'd like to see them prove it, somehow.

Make a bootable DVD for the PC, kind of like an Xbox or PS3 game. Make it a demo. Put no OS on it, just a bootloader with the code, and have the texture data on the DVD with a basic file system. No GUI, no drivers or anything. Write the code right to the metal, as they say.

Then make the same demo in DX11 for Windows using .NET and all the typical MS dev tools.

Let's see how much better it is.

Redeemed
03-17-11, 03:37 PM
I'd like to see them prove it, somehow.

Make a bootable DVD for the PC, kind of like an Xbox or PS3 game. Make it a demo. Put no OS on it, just a bootloader with the code, and have the texture data on the DVD with a basic file system. No GUI, no drivers or anything. Write the code right to the metal, as they say.

Then make the same demo in DX11 for Windows.

Let's see how much better it is.

That would be very interesting. But again, the complexities... do they do it on a GTX 580? GTX 480? Radeon 6970? Radeon 5870? AMD Phenom II X6? Core i7 990X? All of the above? How about how much RAM is in the system or systems? So many hardware variables to account for... with Intel they'd at least have a near-constant chipset, but with AMD... so many different chipsets... it just seems like it'd be a monumental undertaking imo.

Granted, I'm not really a programmer, game developer, etc... it just seems to me that by taking away the API you get a huge mess of variables to have to account for. If anything, DX very well may be the one thing saving PC gaming. Without it we'd have OpenGL, but... would we really be in any better shape then?

frenchy2k1
03-17-11, 04:11 PM
This is a non-issue.
Or rather, it is an unavoidable issue until the platform stabilizes and standardizes. Which makes it impossible, as the strength of the PC *IS* its wide ecosystem of components and their varied configurations.
Until everyone agrees on a single interface or programming language for their chips, this *CANNOT* happen, and given how fast GPUs are still evolving, I doubt we'll see that for years to come.

AMD loves to come out with big news like this, but in reality they tend to act differently. AMD pushed for OpenCL because Nvidia had a head start with CUDA, but even there they trailed Nvidia's driver support in OpenCL (NV released their driver publicly the same day the OpenCL spec became final).

Sure, everyone could benefit from a stable interface to the GPU, much lighter than DirectX or OpenGL, but how do you do this when, even inside a single company, each generation of product is pretty much incompatible with the previous one? The gift of OpenGL/OpenCL and DirectX is that you do NOT have to worry about what you run on. That gift is paid for with performance.
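As a rough illustration of that trade (all names here are made up): the API is essentially a table of function pointers, one set of entry points with a backend per vendor or generation behind it, and every call pays for the indirection plus whatever validation the driver does:


#include <stdio.h>

/* The one interface the game sees; each vendor/generation plugs
   its own implementation in behind it. Names are hypothetical. */
typedef struct {
    void (*set_texture)(int tex);
    void (*draw)(int mesh);
} Backend;

/* A stub backend; an amd_backend, intel_backend etc. would follow
   the same shape, each hiding totally different hardware. */
static void nv_set_texture(int t) { printf("nv: texture %d\n", t); }
static void nv_draw(int m)        { printf("nv: draw %d\n", m); }
static const Backend nv_backend  = { nv_set_texture, nv_draw };

static const Backend *gpu = &nv_backend;  /* chosen at init, e.g. by PCI vendor ID */

void render_mesh(int mesh, int tex)
{
    gpu->set_texture(tex);  /* indirect call + driver-side validation */
    gpu->draw(mesh);        /* ...on every single call: the performance tax */
}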

frenchy2k1
03-17-11, 04:15 PM
And to add on top of that: most games/software are coded to the lowest common denominator. What use is it to have the best and fastest software if your market is only 2% of what it could be?
This is why most games are still coded for DX9: the biggest market is still there, so far... Same reason people will not include physics in their games in a meaningful way: not enough support. You can use it to make things prettier, but if it becomes necessary (blowing up walls...), you need adoption to make sure your game sells.

Armed_Baboon
03-17-11, 07:52 PM
*Something about consoles holding us back and how PC gaming is not dying*

*something further about how developers are absolutely mad for not tailoring their games specifically to the 5% of the market who are PC gamers.*

mojoman0
03-17-11, 08:12 PM
This is why Larrabee apparently was such a good idea. Developers could program all the way down to assembly to maximize the throughput of the processor, not have to worry about compatibility, and instead gain fps with the addition of more and more cores to split processing tasks across.

ViN86
03-17-11, 08:38 PM
This is why Larrabee apparently was such a good idea. Developers could program all the way down to assembly to maximize the throughput of the processor, not have to worry about compatibility, and instead gain fps with the addition of more and more cores to split processing tasks across.

Given how many devs use devkits nowadays, do you think any of them could program in assembly? :lol:

The only one that probably could is Carmack haha.

jlippo
03-18-11, 01:30 AM
This is why Larrabee apparently was such a good idea. Developers could program all the way down to assembly to maximize the throughput of the processor, not have to worry about compatibility, and instead gain fps with the addition of more and more cores to split processing tasks across.
There would have been problems with compatibility: the code would not have worked on anything other than Larrabee.
If you used DX in between, it would have worked with other hardware as well.

Yes, direct access to the metal was fun in the old DOS days, but the problems were there as well.
Anyone who remembers manually tweaking config files to free up enough memory knows what I mean.
We did some fun things though. ;)

LydianKnight
03-18-11, 06:47 AM
Yes, direct access to the metal was fun in the old DOS days, but the problems were there as well.
Anyone who remembers manually tweaking config files to free up enough memory knows what I mean.
We did some fun things though. ;)

Yeah... there were LOTS of quirks to getting all of our code to run, but hey, you have to admit writing this...


xor ah,ah   ; AH = 00h: BIOS "set video mode" function
mov al,13h  ; AL = 13h: mode 13h, 320x200 in 256 colors
int 10h     ; call the BIOS video services interrupt


...to call the almighty 320x200 graphics mode (among other useful tricks) was just... full of awesome :wonder:
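And for anyone who never did it: once mode 13h is set, the framebuffer just sits at segment A000h and you poke bytes into it. A period-style sketch in 16-bit real-mode C (Borland-style far pointer, hypothetical helper name; it won't build on a modern compiler):


/* Mode 13h: 320x200, one byte per pixel, palette-indexed,
   linearly mapped at segment A000h. */
unsigned char far *vga = (unsigned char far *)0xA0000000L;

void putpixel(int x, int y, unsigned char color)
{
    vga[y * 320 + x] = color;   /* no API, no driver: just memory */
}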

jlippo
03-18-11, 08:13 AM
Yeah... there were LOTS of quirks to getting all of our code to run, but hey, you have to admit writing this...


xor ah,ah   ; AH = 00h: BIOS "set video mode" function
mov al,13h  ; AL = 13h: mode 13h, 320x200 in 256 colors
int 10h     ; call the BIOS video services interrupt


...to call the almighty 320x200 graphics mode (among other useful tricks) was just... full of awesome :wonder:
:wonder:
Sweet times and crazy ideas..

Sadly, many of those were too slow at the time, or too much work to test properly.
Like a scene relighting system for adventure games: render or paint z, normal & color to the screen, then light the scene in a second pass. ;)
One friend had screen-space shadows going around the same time (it ran fine on a Pentium 1 or 2?).

Anyone could have an idea and a coder could write a test in reasonable time, without worrying whether the hardware or API gave enough flexibility for it.
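For the curious, the relighting trick jlippo describes is essentially what we now call deferred lighting. A tiny software sketch of the second pass (illustrative names, a single directional light and plain Lambert shading only):


/* Pass 1 (not shown) paints or renders depth, normal and albedo
   for every pixel into a buffer. Pass 2 below relights the whole
   screen from that data, so the light can move without repainting. */
#define W 320
#define H 200

typedef struct {
    float z;                /* depth (unused by this simple light) */
    float nx, ny, nz;       /* surface normal */
    unsigned char r, g, b;  /* albedo */
} GBufferPixel;

static GBufferPixel gbuf[W * H];
static unsigned char frame[W * H * 3];

void light_pass(float lx, float ly, float lz)  /* unit light direction */
{
    for (int i = 0; i < W * H; i++) {
        const GBufferPixel *p = &gbuf[i];
        /* Lambert term from the stored normal. */
        float ndotl = p->nx * lx + p->ny * ly + p->nz * lz;
        if (ndotl < 0.0f) ndotl = 0.0f;
        frame[i * 3 + 0] = (unsigned char)(p->r * ndotl);
        frame[i * 3 + 1] = (unsigned char)(p->g * ndotl);
        frame[i * 3 + 2] = (unsigned char)(p->b * ndotl);
    }
}

/* Move the light, rerun the pass: the painted scene relights
   itself without repainting anything. */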