
DirectX9 programming - Video Mixing Renderer 9

Discussion in 'Developers Area' started by elbows, Jan 7, 2003.

  1. elbows

    elbows PixelRomper

    Are there any DirectX & C++ coders here? I'm about to bite the bullet and learn the language I've avoided for 9 years, but I'm realistic: I know it will take me a long time to make anything vaguely usable/good.

    Anyway I've been looking at the DirectX SDKs for some time now. They introduced the Video Mixing Renderer 7 in Windows XP, and have integrated it into DirectX 9 as VMR9, so it's no longer limited to Windows XP.

    So I'm keen to learn whether these things will be any good for VJ software, as the examples in the SDK are certainly interesting: things like mixing 2 clips together, using video as the faces of a cube, live video shown on a waving flag etc., using your hardware 3D card. I guess I am really looking to see how flexible this pipeline is and what performance is like.

    So I just wondered if anyone else is interested in this stuff; if not I shall have a play around over the next few months and see if I can learn enough C++ to hack the samples into something with more VJ-friendly features as a technology test.

    Also interested in these new shader script type things, wondering whether M$ or nVidia is better so far. Time will tell I guess!
  2. MoRpH

    MoRpH Moderator

    Hahahaha interesting, BeOS was doing both of them years ago and without some whoop-de-doo hardware video card, just bog-standard hardware. Not that I don't think it's great they have finally caught up, just sick of this stupid reliance on extra hardware to get a job done that NEVER needed it b4.

  3. LEVLHED

    LEVLHED gear whoreder

    I'm very interested in seeing what DX9 can bring to the VJ table... having just picked up an ATI AIW 9700 Pro that can utilize DX9. I've read about some of the things in there... apparently thru DX9 we could have "photoshop-like filters applied to a video input"!!! Now, I know absolutely NOTHING about programming, but I sure would like to see some VJ apps catch up and try incorporating these "new" features...
  4. burstingfist

    burstingfist Manipulator of Light

    I am down with software that depends on specific hardware, especially if the end results look and perform better. That new PixelShox software is a great example of how software becomes much more powerful with tight hardware integration.
  5. MoRpH

    MoRpH Moderator

    Hmmmm ok, Photoshop-like filters on live video in... hmmmm Pete has replicated a lot of Photoshop filters in VJo/TZT/CS, oh and you can apply them to his live video input plugin... LOOKS LIKE IT'S ALREADY here AND hardware INDEPENDENT.

    Boooo down with specific hardware dependence, up with USING the power in the stuff we already HAVE.
  6. Bluelive

    Bluelive Its Somalicious

    Using the ActiveMovie interfaces you can build your own render pipeline, including adding your own filters to the pipeline.
    Did some fiddling; it seems nice and simple. One big problem I had when using AVIs was that it took some time before it was ready to run.
    The programming behind it has been in DirectX for a while; the program to manipulate the pipeline is new with DX9.
    Don't know enough about it at the moment to build my own filter though.
    DMOs are fun :)
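    For anyone curious what "building your own pipeline with your own filters" amounts to, here is a minimal, portable sketch of the filter-graph idea: frames flow from a source through a chain of transform filters to a renderer. This is NOT the ActiveMovie/DirectShow API (those are COM interfaces) - every name below is made up purely for illustration:

```cpp
// Conceptual model of a filter graph: NOT DirectShow, just the idea.
#include <cstdint>
#include <functional>
#include <vector>

using Frame  = std::vector<uint8_t>;          // raw pixel bytes of one frame
using Filter = std::function<Frame(Frame)>;   // one stage in the pipeline

// Push a frame through the connected filters in order,
// like source -> effect filters -> renderer in a real graph.
Frame run_pipeline(Frame frame, const std::vector<Filter>& filters) {
    for (const auto& f : filters)
        frame = f(std::move(frame));
    return frame;
}

// Example custom "filter": invert every byte (a trivial video effect).
Frame invert(Frame f) {
    for (auto& b : f) b = 255 - b;
    return f;
}
```

    In the real thing you connect COM filter objects through their pin interfaces and let the graph manager handle format negotiation; writing your own filter means implementing those interfaces, but the chain-of-transforms model is the same.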
  7. Rovastar

    Rovastar /..\

    The main power of DX9 will come when DX9-specific cards are out in force.

    At the moment only a few cards use the DX9-specific instruction set - the ATI Radeon 9700 series. They cost about 300 pounds a pop.

    So laptops using these cards are some way off. DX9 is a very new technology if you want to use it to its full degree.

    That said, there is no real difference in performance between the older DX technology and OpenGL - you can do just as much nice stuff in each.

    I have good resources for either but am preferring OpenGL at the moment.

    But the Video Mixing Renderer stuff might be a useful addition to DX's bow. I will have to look into it in more depth to tell if it is really useful or not.

  8. LEVLHED

    LEVLHED gear whoreder

    The ability to run a live video-in thru the 3D pipeline sure sounds useful to me! You silly buggers are actually sitting there thinking DX9 isn't a big step forward for VJs? Whether it's due to added abilities or added interfaces matters not to me... just give me an app that can utilize it!!! Realtime or not.


    Morph, I think you're just ornery because of your MiroDC10 (or whatever that proprietary capture card is called) woes....
  9. MoRpH

    MoRpH Moderator

    Sounds good to me - just let it run on the hardware I have!!! rather than having to get a new (expensive) video card and stupid desktop box to lug around.
  10. t2k

    t2k New Member

    rendering video into 3d scenes (ie, video on the walls of a tunnel, for instance) is already possible and has been for years


    sorry, no video to "prove" it but if there are doubters i'll render one up later today.

    the only thing DX9 might offer is a *software* API which allows developers access to some sort of newly developed hardware acceleration route, to speed up the process of getting video frames out of the encoded stream and into texture memory where they can then be used for rendering.

    The current state of affairs with OpenGL involves decoding video with the host CPU, then uploading it to the gfx card as a texture - subject to further restrictions like texture size requirements, texture memory limitations, etc. I don't know anything about the DirectX solution to this problem, but my intuition tells me it's probably much the same.

    The ultimate solution for incorporating video into accelerated 3d graphics is for the gfx card to recognize "streaming textures" natively, and allow the streaming textures to be submitted to the card by the host CPU in their compressed form (or captured from a live video input). It's quite an investment for a hardware manufacturer to include chips for decoding compressed video, or capturing live video input, but not uncommon. Many cards include DVD (MPEG-2) decoding chips, and many cards have video inputs you can use for viewing or capturing incoming video.

    However, no chips (to my knowledge) have any sort of path between an MPEG-2 decoder and texture memory, or between a live capture input and texture memory. Unfortunately this means that for live capture the data path for each frame of video must be gfx card input -> gfx memory -> system memory -> back to gfx memory as a texture.
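    To put rough numbers on that roundabout path, here is a back-of-envelope sketch. The frame size and rate are illustrative assumptions of mine (640x480 RGB24 at 25 fps), not figures from any spec:

```cpp
// Bytes shuffled per second for the indirect capture path
// (input -> gfx memory -> system memory -> back to gfx memory)
// versus a hypothetical direct input -> texture DMA path.

long frame_bytes(long width, long height, long bytes_per_pixel) {
    return width * height * bytes_per_pixel;   // one uncompressed frame
}

// Total bus traffic per second, given how many times each frame is moved.
long bytes_per_second(long frame, long fps, int hops) {
    return frame * fps * hops;
}

// Example: frame_bytes(640, 480, 3) is 921600 bytes per frame.
// The indirect path moves each frame 3 times (hops = 3); a direct
// DMA path would move it once (hops = 1), a third of the traffic.
```

    Every extra hop multiplies the bus traffic, which is why a native input-to-texture path would matter so much.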

    I'm not sure if DX9 goes down the road of accelerated video capturing / decoding / texturing, but that is the road we'll all be going down in several years. My guess is that DX9 just adds a bunch of poorly designed interfaces (to the already horrendous glut of poorly designed interfaces that make up DirectX) that take care of the decode / texture upload automagically, by calling on whatever codecs are needed and doing the "busy work" of making sure the right frame is in video memory at the right time.
    It may also compensate for things like the texture not being an acceptable size, an incompatible pixel format, or other details that M$ wants to hide from their developer base.

    Like I said, coders can already handle all those issues themselves if they really want to mix video into their 3d scene.

    Personally I'd like to see an opengl card + extension that supports accelerated texturing of live input. A video card with 5 dv/composite/svideo inputs, for example, would be righteous. Then all those hardware video mixers can be gutted and turned into controllers (as has already happened with audio mixers).
  11. elbows

    elbows PixelRomper

    Cheers for the info. I was aware that it can already be done in software - I have several apps that do just that. I was just curious to see if there are any merits to the DX9 way; I guess from your description I'll have to sit back and see if hardware manufacturers move in that direction.

    One of the M$ DX9 SDK examples is for game creators to see video being used in a 3D scene, so I guess M$ are keen to get people thinking about those possibilities. So I guess it's down to whether game developers want such things, thus encouraging hardware designers to go that route.

    Agree completely about there being no known pathway from video input -> texture on the 3D card. In general I now avoid video capture that's integrated into AGP cards, because it is currently not a great path for the data to flow out of the card then back in again; it's better for now to use a PCI capture card, or so I've found. Of course this may also be partly due to manufacturers just being lazy with their capture hardware and bolting it on in a sloppy way.

    Morph dude, this is the developers forum, so while I agree that there's probably enough VJ apps out there and it would be nicer to see the existing stuff become more mature, I can't personally do anything about that, and I'm interested in playing with this new technology simply to try to catch a glimpse of what the future might hold. Don't expect any functional VJ apps from me in a hurry - I'm still on the Hello World example in C++ :p All I'm gonna do for now is hack around with the examples in the SDK for a bit to gain an understanding of just how good or bad a job Microsoft have done. Sounds like a good reason to waste loads on a true DX9 3D card too :p
  12. MoRpH

    MoRpH Moderator

    Cheers mate, it's all good, just chucking in my 3 cents. Hey, and if someone gets a good mature app out of it, I just might consider making DX9 support part of the shopping list for the next laptop :) (although I'm pretty certain it will be a MAC & @ the rate I'm allowed to buy new laptops it'll be DX15 by then :p )

  13. LEVLHED

    LEVLHED gear whoreder

    well you certainly won't need to justify it to me! :)
  14. sleepytom

    sleepytom VJF Admin

    Hmmm great

    Micro$oft have done it again - several years after everyone else, in a way that requires extra hardware that doesn't exist yet, and will only ever work in winblows (and probably only in XP).
    Has anyone actually noticed anything in this spec that isn't already easily done in software??

    If you want hardware-specific graphics APIs may I recommend OpenGL (seems to be OK for the people who did the effects in Titanic) - it works better than anything Micro$oft have ever made and is fully supported under ALL the popular operating systems.

    "Photoshop style filters on a video line in" - well how fucking clever - EffecTV has been doing this for years using standard computer hardware - and it's fully possible in winblows too. Why not let people do it on old computers and use a sensible cross-platform system to achieve it (oh, somebody already has).

    Really, there are much bigger, more important things going on for VJ effects at the moment than the latest update to DirectX.
  15. elbows

    elbows PixelRomper

    Well if there are more important things going on with effects then talk about them here, rather than just being negative about the stuff I happen to be investigating :p

    As I said, I'm well aware that most of this stuff can be done in software already; I was merely intrigued to see if the way M$ have done it will make any difference to performance, as if you try fancy effects such as some of Pete Warden's at a high resolution, the resulting framerate is unacceptable.

    The stuff I'm initially looking at doesn't require any extra hardware that doesn't exist yet, and isn't limited to WinXP either.

    I'm ready to accept that OpenGL might be better and that M$ might have coded this stuff in a way that doesn't lend itself to high performance, but I'm not going to write it off until I've at least had a little play with it. If you think I'm wasting my time then good for you, I really don't care.
  16. ExInferis

    ExInferis New Member

    I completely agree with sleepytom... I prefer software to be cross-platform... DX9... bullshit (for me)... and it is very buggy... like all M$ products... so if you're preparing to make a VJ app... switch to OpenGL... it's cheaper and much faster and more reliable than DX... and isn't the point of a VJ app that video clips work fast enough on a 500MHz CPU?!
  17. MoRpH

    MoRpH Moderator

    Installed DX9 on my old lappy test machine along with Media Player 9. Talk about rooted......... had to reinstall the OS to get things working again. It wouldn't play anything in DivX Player OR Media Player with either the DivX or MS MPEG-4 codecs; it would play old (Cinepak) stuff but everything else just hung the player app.

    That will teach me for installing MS BETA products.
  18. elbows

    elbows PixelRomper

    Hey, it's fair enough - personally I prefer cross-platform too, it just happens that I like to investigate all options purely out of interest's sake.

    MoRpH, yeah unfortunately I'm not surprised that Media Player caused mad problems - every version of that just keeps getting worse and I wouldn't be surprised if broken DivX is deliberate. Original DivX was just the M$ MPEG-4 codec hacked to allow encoding in normal apps anyway - was it DivX 3.11 content that was broken, or 4.xx and 5.xx DivX as well?

    Anyway I'll try to spend some time this weekend on my original plan, and will post again if only to confirm everyone's performance suspicions and wait for cries of "told you so" ;)
  19. MoRpH

    MoRpH Moderator

    It was all versions of DivX and the MS MPEG-4s. It was damn annoying, especially as it seemed to cascade through apps... :( oh well, fixed now :)
  20. Esotic

    Esotic Fine Ass Posts

    You're A Sissy

    So does everyone here just sit around and bitch all day or do you actually VJ at some point in time?


    I thought this would be informative...
  21. Nema

    Nema New Member

    DirectX is better for maximizing performance on Windows, while OpenGL is better for portability.

    Personally, I am drifting more and more towards DirectX. Pixel and vertex shaders will dominate the next generation of VJ software, and it is somehow surprising that not much VJ software uses these mighty possibilities yet, not even visualJockey R3.5. However, the visualJockey developers are not sleeping! :)
  22. Goz

    Goz New Member

    Sigh. I'm no M$ fan, but come on people, you are slating them when you don't even know what you are talking about. Yes, putting live video on a 3D surface has been around for ages (you could do it in DX3 ... the first version, for example). However, what you can do in DX9 is capture the video input directly from the video in/USB/FireWire/whatever and DMA it directly to video memory. Better still, it does it for you.

    In the old days you had to create a system-memory texture surface, grab the video and render it to this surface. Once it was on the surface you could THEN transfer it to video memory and render with it. i.e. you had to read from the source, write to the texture, then read from the texture and write to VRAM - a double read/write cycle with CPU intervention. Now you can do it in a single read/write cycle (effectively doubling memory bandwidth) and without any CPU interaction. That's good in my mind.

    And yes, before anyone says it, you COULD have done this on your SGI render bitch 10 years ago. But hey, it cost a bit more than a 250 quid GFX card ;)
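    The double versus single read/write cycle can be sketched in portable C++ by counting the bytes the CPU shuffles. This is a conceptual model, not DirectX calls: "VRAM" here is just another buffer, since real DMA and video memory aren't reachable from code like this.

```cpp
// Count bytes the CPU moves on the old (staging) vs new (direct) path.
#include <cstdint>
#include <cstring>
#include <vector>

struct CopyCounter { long bytes_moved = 0; };

void copy_bytes(uint8_t* dst, const uint8_t* src, size_t n, CopyCounter& c) {
    std::memcpy(dst, src, n);
    c.bytes_moved += n;
}

// Old path: capture -> system-memory staging texture -> VRAM (two copies).
long upload_via_staging(const std::vector<uint8_t>& captured,
                        std::vector<uint8_t>& vram) {
    CopyCounter c;
    std::vector<uint8_t> staging(captured.size());
    copy_bytes(staging.data(), captured.data(), captured.size(), c);
    copy_bytes(vram.data(), staging.data(), staging.size(), c);
    return c.bytes_moved;
}

// New path: the frame goes straight to VRAM in one copy (and with real
// DMA, the CPU wouldn't even do that one).
long upload_direct(const std::vector<uint8_t>& captured,
                   std::vector<uint8_t>& vram) {
    CopyCounter c;
    copy_bytes(vram.data(), captured.data(), captured.size(), c);
    return c.bytes_moved;
}
```

    The staging path moves exactly twice the bytes of the direct path, which is the "effectively doubling memory bandwidth" point above.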

  23. LEVLHED

    LEVLHED gear whoreder

    thanks goz for summing up exactly what I've been trying to tell all these knuckleheads for the past year....
  24. Goz

    Goz New Member

    hehehehe 'tis always the way :)

    All I can say to the rest of you is "open your minds". There is a LOT you can do if you are prepared to think outside the box. It's just such a shame so many people prefer a blinkered view of things.
  25. Rovastar

    Rovastar /..\

    Still, it doesn't change the fact that at the moment not enough people are using DX9 for a worthwhile app to be viable.

    But no doubt DX9 is getting more commonplace, and on laptops too - not just a desktop thing anymore.

    I might convert to DX instead of OpenGL - I was considering this anyway for Windows Media Player related stuff - but as ATI are the worst company in the world for OpenGL drivers and cannot even cope with the simplest of commands in the 'standard', I think a new 'standard' of DirectX is the way of the future for me.
  26. Goz

    Goz New Member

    And back to the chicken/egg argument.

    My only issue is that VJs should (IMO) be using cutting-edge technology to produce their results. If people continue to stick to yesterday's technology, little innovation will come and the whole community will move nowhere. Realtime visual effects have become a possibility in the past couple of years in a way they never could have before, and not to use these advantages in a field such as VJ'ing is almost criminal in my mind. But, I guess, that's why I write game renderers for a living ;)

  27. LEVLHED

    LEVLHED gear whoreder

    Bring it on!
  28. burstingfist

    burstingfist Manipulator of Light

    To all you doubters about the power of DirectX, I would say take a look at Pilgrim. It handles complex 3D scenes and high-res video better than anything out there. I mean, big deal, so you gotta buy a $130 video card - if yer serious about VJing it's a worthwhile investment.

    I don't think Pilgrim uses the VMR9, but it uses DirectShow filters for the effects, and they are really quite amazing and perform great. Running 4 layers of 3D and 2 layers of video, my machine was still very functional, so I could surf the net, use Flash, and render clips, all without interrupting the output from Pilgrim.

    Just playing with some of the VMR9 sample apps in the DX9 SDK, I was mixing 3 MPEG-2 movies (1 GB each) without even pegging my CPU. I am all for hardware dependence if it means significant performance gains. I mean, there is so much un-utilized power in video cards; offloading previously CPU-intensive tasks to the video card frees up more CPU for other types of stuff.
  29. Goz

    Goz New Member

    Like 4D fractal noise generation in real time to use as an input feed into the visualisation engine ... mmmmm high detail noise effects :D
  30. Nema

    Nema New Member

    Meanwhile, VMR9 does the job. Everything is explained in the DirectX SDK.

    Yes, Direct3D in combination with DirectShow (using VMR9) is *very* powerful; it makes it possible to have dynamic "streams" as textures in high resolutions.

    Instead of using ATI- or nVidia-specific commands, you can also use HLSL, a common C++-like language for pixel and vertex shaders. You can then "compile" this language at runtime into optimised hardware-specific asm-like commands. The next generation of VJ software will definitely be dominated by pixel and vertex shaders, as they allow all kinds of color and geometry transformations much faster and more precisely than if done with CPU power and system memory.
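    As a rough illustration of what a pixel shader computes, here is a CPU reference for a simple per-fragment luminance (grayscale) transform. The HLSL in the comment is only an approximate sketch; on real hardware this function runs for every pixel in parallel rather than in a loop:

```cpp
// CPU reference for a grayscale pixel shader. Approximate HLSL equivalent:
//   float4 main(float4 c : COLOR) : COLOR {
//       float g = dot(c.rgb, float3(0.299, 0.587, 0.114));
//       return float4(g, g, g, c.a);
//   }
#include <array>
#include <vector>

using Pixel = std::array<float, 4>; // r, g, b, a in [0, 1]

// The per-fragment function: weighted sum of the color channels.
Pixel grayscale(Pixel c) {
    float g = 0.299f * c[0] + 0.587f * c[1] + 0.114f * c[2];
    return {g, g, g, c[3]};
}

// What the GPU does across the whole frame, here as a plain loop.
void shade(std::vector<Pixel>& frame) {
    for (auto& p : frame) p = grayscale(p);
}
```

    The win of doing this in a shader is exactly what the post says: the transform happens per pixel on the card, without touching CPU time or system memory.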
