
Use Kinect Everywhere

Stream Data

Although only a limited set of software, such as TouchDesigner, offers native Kinect support via the Kinect SDK, you can stream Kinect data from one of these applications to almost anywhere else, including different environments such as Python, Node.js, or web apps built with p5.js.

Within a local network, you can set up one Windows computer as the Kinect host and stream data to the same machine or to different machines, making Kinect data usable on a Raspberry Pi, macOS, or even an Arduino.

Depending on the use case, you can stream the raw camera data as a video feed, or stream the skeletal tracking data as a live data feed. TouchDesigner works well as the host for both.

Stream Image Feed via NDI

NDI (Network Device Interface) is a real-time video-over-IP protocol developed by NewTek. It's designed for sending high-quality, low-latency video and audio over a local network (LAN), with minimal setup and high performance. Read the NDI documentation for more information.

You can use NDI to stream Kinect video feeds (color, depth, IR) from TouchDesigner to:

  • Another TouchDesigner instance (on the same or a different machine)
  • OBS Studio for recording or streaming
  • Unreal Engine, Unity, Max/MSP, etc.
  • Custom apps using NDI SDK or NDI-compatible plugins

Set up an NDI stream in TouchDesigner with the NDI Out TOP; you can create separate streams for different image sources (TOPs) by giving each stream a different name.

[Image: NDI stream setup in TouchDesigner]

Stream Body Tracking Data via OSC

OpenSoundControl (OSC) is a data transport specification (an encoding) for real-time message communication among applications and hardware over a network, typically via UDP or TCP. OSC was originally created as a highly accurate, low-latency, lightweight, and flexible communication method for real-time musical performance, and it is widely used in music, art, motion tracking, lighting, robotics, and more.
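Because OSC is just an encoding, a minimal message can be built by hand to see what actually travels over the wire. The sketch below packs a hypothetical `/kinect/hand_l` address with three float coordinates using only the Python standard library; the address and values are illustrative, not a fixed Kinect schema:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """NUL-terminate and pad bytes to the next 4-byte boundary (OSC string rule)."""
    return data + b"\x00" * (4 - len(data) % 4)

def build_osc_message(address: str, *floats: float) -> bytes:
    """Encode a single OSC message whose arguments are all 32-bit floats."""
    # Address pattern: NUL-terminated, padded to a multiple of 4 bytes
    msg = osc_pad(address.encode())
    # Type tag string: ',' followed by one 'f' per float argument
    msg += osc_pad(("," + "f" * len(floats)).encode())
    # Arguments: big-endian IEEE 754 float32, no extra padding needed
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# A 3D hand position as one OSC message (36 bytes total here)
packet = build_osc_message("/kinect/hand_l", 0.1, 0.2, 0.3)
```

The resulting bytes can be sent as a single UDP datagram to any OSC receiver; full implementations also support bundles, timetags, and more argument types.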

You can use OSC to stream body tracking data from Kinect in TouchDesigner to other software (or vice versa), such as:

  • Unity, Unreal Engine
  • Processing, openFrameworks
  • Max/MSP, Pure Data, Isadora
  • Web apps (via websocket bridge)
  • Python, Node.js apps
  • Other TouchDesigner instances
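On the receiving side, for example a Python script on another machine, a plain UDP socket plus a small parser is enough for simple float-only messages. Below is a minimal sketch assuming the sender emits one OSC message per datagram and that the port number matches whatever you configure on the sending side (both are assumptions; a library such as python-osc handles bundles and the full type system):

```python
import socket
import struct

def parse_osc_floats(packet: bytes):
    """Parse one OSC message containing only float32 arguments.
    Returns (address, [values]). Deliberately not a full OSC parser."""
    def read_string(buf: bytes, pos: int):
        end = buf.index(b"\x00", pos)
        s = buf[pos:end].decode()
        # Skip the terminator plus padding to the next 4-byte boundary
        # (valid because OSC strings always start 4-byte aligned)
        return s, end + (4 - end % 4)

    address, pos = read_string(packet, 0)
    tags, pos = read_string(packet, pos)          # e.g. ",fff"
    values = [struct.unpack(">f", packet[pos + 4 * i: pos + 4 * i + 4])[0]
              for i, tag in enumerate(tags[1:]) if tag == "f"]
    return address, values

def listen(host: str = "0.0.0.0", port: int = 7000):
    """Print incoming OSC float messages; port 7000 is an arbitrary example."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        packet, _ = sock.recvfrom(4096)
        address, values = parse_osc_floats(packet)
        print(address, values)
```

Running `listen()` while TouchDesigner streams to the same port prints one line per tracked channel; the OSC address names depend entirely on how the sender's channels are named.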

TO-DO

Continue Here

Stream with Kinectron for Web

TO-DO

Continue Here

Receiving Data

TO-DO

Continue Here

Unreal and Unity

In game engines, you should be able to use the Kinect SDK directly, though this involves lower-level programming and requires some experience. There are plugins developed for Kinect, but some are paid and some haven't been updated in years. Look for the right plugin for your needs, depending on whether you want skeletal tracking or the depth image.

Unreal

Neo Kinect (paid)
Azure Kinect for Unreal Engine

Unity

Azure Kinect and Femto Bolt Examples for Unity (paid)
Unity_Kinect (free)