Discussion in 'Developers Area' started by asterix, Dec 7, 2010.
This may explain a bit of the significance:
And another great example:
OSCeleton now has a mode which is compatible with Quartz Composer's standard OSC Receiver, yay.
Finally realised what the calibration pose reminds me of.
Muuuuahahah! I am cornholio, I need tipi for my bunghole!! :roll:
my mate did this last weekend
Kinect + iInferno (FFGL Fire FX Plugin)
Using the depth image to fuel the fire. OpenNI + PrimeSense Kinect mod; skip to 0:39 for the fire. I guess I'm aiming for more to come over the following weeks, so many ideas now..
..running through my VJ app Euphoria, then output + music grabbed with Fraps; quality goes down but you get the idea
Music: "Ambitude" (unreleased/unfinished track I worked on a couple years ago)
More to come, surely!
Thomas / Intrinsic
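The depth-as-fuel idea above is simple in principle: invert and normalize the depth reading so closer surfaces inject more fuel into the fire sim. A rough sketch of that mapping (my own guess at how such a mapping could work, not the actual plugin code; the function name and the near/far defaults are mine):

```python
def depth_to_fuel(depth_mm, near=500, far=4000):
    """Map a Kinect depth reading (mm) to fire-fuel intensity in 0.0-1.0.
    Closer surfaces burn hotter; anything at or past `far` adds no fuel.
    A reading of 0 means "no depth data" on the Kinect, so it adds none either."""
    if depth_mm <= 0 or depth_mm >= far:
        return 0.0
    clamped = max(depth_mm, near)          # treat anything nearer than `near` as max fuel
    return (far - clamped) / (far - near)  # linear falloff from near (1.0) to far (0.0)
```

You would evaluate this per pixel of the depth map and feed the result into the fire effect's fuel buffer each frame.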
Classy, mate. I like this a lot.
I've actually been playing with some of this jazz myself and I've been having a really hard time getting it to work right with Animata.
I've never used GitHub before, but I'm pretty sure I got things compiled and installed via the terminal.
Any of you kids know a good primer on Quartz?
It's been a lot to wrap my head around.
Check the audio track in the vid posted by PCProject... B&B samples! Coincidence? )
Yum. Looks like early development. Are the legs not tracked yet?
BVH format makes me very interested though!
I'm not doing anything with BVH myself at the moment, so I can't really say. But there is an alternative that may be better:
Thx for the link elbows, this one looks waaays better than the Japanese one.
I've been having a lot of fun playing with VVVV/openNI flavour Kinect.
As far as I can figure, at present OpenNI will do full tracking on two people at a time, and silhouette or body-shape detection on up to six. When you put more people in front of it, both systems get confused.
I'm still hoping to be able to use it in a mob situation with the depth image with more traditional contour tracking though.
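That mob scenario boils down to classic blob tracking on the depth map: keep only pixels in a depth band, then find connected regions and their centroids. A stdlib-only sketch of the idea (in practice you'd reach for OpenCV's contour functions; the function name and thresholds here are mine):

```python
from collections import deque

def depth_blobs(depth, near, far, min_area=4):
    """Threshold a 2D depth grid (mm, as nested lists) to the [near, far]
    band, label 4-connected foreground regions via BFS flood fill, and
    return a list of (centroid_x, centroid_y, area) per region."""
    h, w = len(depth), len(depth[0])
    mask = [[near <= depth[y][x] <= far for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x] or seen[y][x]:
                continue
            # Flood fill to collect one connected component.
            q, cells = deque([(y, x)]), []
            seen[y][x] = True
            while q:
                cy, cx = q.popleft()
                cells.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        q.append((ny, nx))
            if len(cells) >= min_area:  # drop single-pixel depth noise
                mean_y = sum(c[0] for c in cells) / len(cells)
                mean_x = sum(c[1] for c in cells) / len(cells)
                blobs.append((mean_x, mean_y, len(cells)))
    return blobs
```

Run per frame on the (downsampled) depth image and you get one centroid per person-sized blob, with no per-user calibration needed.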
I've just been trying to get MSAFluid working in Processing... after an hour of trying to get it to find the library, I gave up.
Then started reading about openFrameworks and that confused me even more... couldn't even open a file.
Now I have a Kinect sat here doing nothing.... not a good day.
Hi Elbows, just found this thread, great info
thanks for sharing,
am I right in saying the shortest route from Kinect to QC is:
Kinect --> OpenNI --> NITE --> OSCeleton --> Quartz Composer
I want to do some particle systems / traces on limbs for a dancer, that kind of stuff....
pretty similar to the body paint vimeo example.
Last time I checked, yes, at least if you need the skeletal tracking. Also, OSCeleton has been known to have a few issues on the Mac, and from the sound of some posts on their mailing list, a few people have started making alternatives. I haven't had time to look into this further.
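For anyone wiring that chain up, the receiving end ultimately just decodes OSC messages. A minimal sketch, assuming OSCeleton's default `/joint` layout of (joint name, user id, x/y/z floats); check your version's README, since the QC-compatible mode uses different addresses, and the helper names here are my own:

```python
# Assumed OSCeleton default layout:
#   address "/joint", args (joint_name: str, user_id: int, x: float, y: float, z: float)
# Verify against your OSCeleton version before relying on this.

def parse_joint(address, args):
    """Return {user, joint, pos} for a /joint message, else None."""
    if address != "/joint" or len(args) != 5:
        return None
    name, user, x, y, z = args
    return {"user": int(user), "joint": name, "pos": (float(x), float(y), float(z))}

def update_pose(poses, msg):
    """Fold one (address, args) message into a per-user pose dict:
    poses[user_id][joint_name] = (x, y, z), keeping the latest position."""
    joint = parse_joint(*msg)
    if joint is not None:
        poses.setdefault(joint["user"], {})[joint["joint"]] = joint["pos"]
    return poses
```

Feed it messages from whatever OSC library you like listening on OSCeleton's port, and you end up with the latest skeleton per tracked user to drive particles or limb traces.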
Had our first field usage of Kinect in May; here's the vid:
Great work Kyle, I really like the fact that it doesn't look like a tech demo!
For Windows 7 users, Microsoft's version of the SDK is out. Non-commercial use only, but the skeletal tracking probably has some advantages, e.g. no calibration pose required.
Not tried it myself yet, may get the opportunity to do so this evening.
FF1.0 PC new Kinect FreeFrame w/User Detection, free download at VJFX.com
Ok this is a free download from the updated article on VJFX.com with updated tutorial/instructions: FF DepthCam plugins at VJFX.com
The new FF1.0 (PC only) plugin, which I called NIUserKey1, is based on the OpenNI libs and avin2's modded PrimeSense drivers for the Kinect (I suppose it would support other OpenNI-supported devices too). It has 4 parameters:
- one controls foreground (detected user) transparency
- one controls background transparency
- the other two control the saturation of each
..no skeletal detection or anything more yet, but maybe I'll look into that too; any ideas for hybrid/depth FX are welcome. I'll admit I haven't been around VJForums so often lately, a bit much with everything at once, anyway I'll try to drop in more often..
and, merry xmas!
Thomas / Intrinsic
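For the curious, those four parameters reduce to a simple per-pixel rule: the OpenNI user mask decides which alpha/saturation pair applies, and desaturation mixes the color toward its gray luma. A rough reconstruction of the idea (my own sketch, not the plugin's actual code; the Rec. 601 luma weights and function name are my choices):

```python
def apply_userkey(rgb, is_user, fg_alpha, bg_alpha, fg_sat, bg_sat):
    """Composite one RGB pixel (0-255 ints) given a user-mask flag and the
    four NIUserKey1-style parameters (each 0.0-1.0). Returns (r, g, b, a).
    Saturation below 1.0 mixes the color toward its luma gray."""
    alpha, sat = (fg_alpha, fg_sat) if is_user else (bg_alpha, bg_sat)
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma weights
    out = tuple(round(luma + sat * (c - luma)) for c in (r, g, b))
    return out + (round(alpha * 255),)
```

Applied over the whole frame, detected users and the background each get their own transparency and color treatment before the mixer sees the layer.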
Cheers for that, and a merry Christmas to you too! I'm not on a PC and I'm doing skeletal stuff, so I can't really comment on your plugin, but I would think some should find it useful. If only this tech had arrived at a time in the past when VJForums & associated meetings were more alive; it would have been a lot of fun.
Anyway recent versions of OpenNI/NITE have removed the need for calibration pose for skeletal tracking, which is nice, although I've not had a chance to try this yet.
Highly polished and rich Kinect applications remain rather elusive, but in most other senses there has been quite some progress in the year+ since this stuff became a reality. There is now quite a lot more choice, and certainty for those looking to develop stuff commercially.
PrimeSense sensor for developers (has been hard to get hold of)
Asus Xtion (for use with OpenNI rather than Microsoft's flavour of SDK)
XBox360 Kinect sensor (need to buy standalone version to get the USB adaptor)
Kinect for Windows (From 1st Feb, more expensive than 360 version but includes support for recognition at a closer range)
OpenNI (cross-platform, skeletal tracking improved recently, no calibration pose required)
Kinect SDK (Windows 7 or 8, new version due on Feb 1, supports microphone array and speech recognition API)
One of the completely open alternatives that came from the original Kinect hacking (No skeletal tracking but good for getting depth map to do more basic tracking with)
Since Microsoft announced the Kinect for Windows hardware and made clear that the software side of things will remain free for developers, I can now press on with exploring commercial possibilities.
Well, the Microsoft official SDK version 1.0 is out today. I cannot comment on it in detail as it requires the Windows version of the hardware, and I think Microsoft have failed to actually make the hardware available in many countries today. I'm not even sure if many in the US have managed to get hold of it.
Looking at the release notes I don't see any stunning new developments anyway. A few things that could be handy for certain applications but for many the unofficial stuff or OpenNI/NITE stuff from PrimeSense may well continue to be the better option for a number of scenarios.
I stand corrected, for it seems the latest MS SDK does still work with the 360 sensor. The way Microsoft word it is that developers can do this if they install the SDK, but end users will need the Kinect for Windows hardware. In this sense it's more of a licensing issue for developers, although there are technical issues too, as certain capabilities will only work on the Windows version of the hardware.
Anyway, I gave it a quick go, and although I cannot evaluate the near-field mode due to not having the Windows version of the hardware, I could try the rest OK. It didn't impress me more than previous releases, so I'll be sticking with OpenNI for now.
Or in other words, wake me if Microsoft get round to putting the facial expression & head-rotation detection stuff they recently released as part of Avatar Kinect for the Xbox into their Windows SDK.
See http://spacepalette.com for an interesting use of the Kinect to create an instrument for visuals and music. I use the libfreenect library created by the Open Kinect community. I used to do the visuals with Processing, but I now do it with a C/C++ plugin (FreeFrame 1.5 FFGL) running inside Resolume.