# Dark Lab

# About the Dark Lab

The Dark Lab is located on the ground floor of the Greencoat Building, in GB\_G05.

# Opening Hours
| Day | Open | Staffed |
|---|---|---|
| Weekdays | 09:00–21:00 | 09:00–19:00 |
| Saturday | 10:00–18:00 | Unstaffed |
| Sunday | Closed | Closed |
# Staff
![Photo of Lieven van Velthoven](https://wiki.cci.arts.ac.uk/uploads/images/user/2024-04/vwt6XTcZ380EVwF5-lieven-lidar-scan-01-cropped-2.png)
Lieven van Velthoven
Creative Code Specialist Technician
he/him
# Dark Lab Equipment

# Virtual Reality

# Microsoft Kinect (3D depth camera with full-body tracking)

# VR trackers

The Dark Lab has Virtual Reality motion tracking set up in both the main truss grid area and the VR corner, making pretty much the entire lab trackable using our 'Vive VR trackers'. These are small plastic devices that you can attach to whatever you want to track. They send both their position and orientation wirelessly, with very high accuracy, in real time to your computer. You do not need a VR headset to use them!

### How to use them ###

The Vive tracking system is compatible with the main PC-based VR protocols, such as SteamVR and OpenVR/OpenXR. The trackers work with basically any software that supports (one of) these protocols, like Unity, Unreal, TouchDesigner, WebXR, etc., and most programming languages. You'll need to have SteamVR running and logged in.

### Using trackers without a VR headset ###

Thankfully, all our Vive VR trackers can be used even when you do not have a VR headset attached to your computer. For this you will need to edit two text files in your SteamVR folder, as described here: https://github.com/username223/SteamVRNoHeadset or here: https://github.com/n1ckfg/ViveTrackerExample (steps 3-9). A rough sketch of these edits is included at the end of this section.

### To take home ###

We also have a few VR trackers that you can take home - these do not need any 'base stations' to work! This means you can use them anywhere, including outdoors. They rely on two little built-in cameras to function properly, so they do need some light. Also, just like the 'normal' VR trackers, you cannot block their view of the environment.

### What do we have? ###

10x [Vive Tracker 3.0](https://www.vive.com/uk/accessory/tracker3/)
2x 3-Pack HTC VIVE Ultimate VR Tracker and Wireless Dongle
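For reference, the headset-free setup described in 'Using trackers without a VR headset' above usually comes down to two small settings changes. The snippets below are only a rough sketch based on the linked guides: file locations and key names can differ between SteamVR versions, so follow the guides for your install and change only the keys shown (leave the rest of each file untouched).

In `<Steam install>/steamapps/common/SteamVR/drivers/null/resources/settings/default.vrsettings`, enable the 'null' (headless) driver:

```json
{
  "driver_null": {
    "enable": true
  }
}
```

Then in `<Steam install>/steamapps/common/SteamVR/resources/settings/default.vrsettings`, tell SteamVR it does not need a headset (these keys live in the existing "steamvr" section):

```json
{
  "steamvr": {
    "requireHmd": false,
    "forcedDriver": "null",
    "activateMultipleDrivers": true
  }
}
```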
# Projectors

# Our Projectors

### In the Lab ###

We have a number of different video projectors available for use in the Dark Lab.
Some of them are permanently mounted in the truss grid: those are ready and waiting for you to plug in and use!
If you would like to project anywhere else in the Lab, please ask Lieven van Velthoven (Dark Lab / Code technician), and he will see if it is possible to mount one for you. (no guarantees, but do ask!)
Our projectors vary in brightness, throw ratio, zoom, lens shift, imaging technology, etc. We can have a chat about which one might best suit your project.
(these terms are explained below if you would like to learn a bit more!)

### To take home ###

We also have a few projectors that you can take home - namely the BenQ ones listed below. These are bookable through [ORB](https://orb-arts.siso.co), our online loan store and equipment booking system.

### What do we have? ###

Here are the models we currently have. Click on the links to find out more!
2x [Epson EB-L635SU](https://www.epson.co.uk/en_GB/products/projectors/installation/eb-l635su/p/31708#tech-specs) (6000 lumen, LCD, 0.8:1 medium short throw) (these are used for the two main projection screens)
1x [Panasonic PT-VMZ60](https://www.projectorcentral.com/Panasonic-PT-VMZ60U.htm#specs) (6000 lumen, LCD, 1.09-1.77 throw ratio)
1x [Panasonic PT-VMZ71](https://www.projectorcentral.com/Panasonic-PT-VMZ71WU.htm#specs) (7000 lumen, LCD, 1.09-1.77 throw ratio)
2x [NEC P525UL](https://www.sharpnecdisplays.eu/p/datasheet/en/datasheet/rp/p525ul.xhtml) (5000 lumen, LCD, 1.23-2.0 throw ratio)
1x [Optoma EH460ST](https://www.optoma.co.uk/ContentStorage/Documents/5ddee23d-0294-49eb-90fd-5a116be7417c.pdf) (4200 lumen, DLP, 0.5:1 short throw)
8x [BenQ TH671ST](https://www.benq.eu/en-uk/projector/cinema/th671st.html) (3000 lumen, DLP, 0.69-0.83 short throw)

### How to use the Dark Lab projectors ###

The permanently mounted projectors all have an HDMI cable (with USB-C adapter) that you are free to plug into whenever no one else is using them. The two main screens (Area 1 and Area 2) are also hooked up to the two corresponding PCs. Those have a little switch on the desk to choose between input from the PC or your own laptop. The HDMI cables are labeled as 'Projector 1 (HDMI 1)', etc., telling you which projector each cable is connected to and which input to select on the projector itself.
On (or next to) the screens you will find remotes to turn the projectors on and select the correct HDMI input.
(Please make sure to turn them off when you're done, and stick the remote back where it was!)

### Projector terminology ###

#### Throw ratio ####

The so-called 'throw ratio' of a projector specifies how narrow or wide the projection angle of the lens is. In other words, it tells you how big the image will be, depending on the distance from the screen or wall.
Throw ratio is the projector distance divided by the image width. So for example, a throw ratio of 0.5 means that from one meter away it will 'throw' an image 2 meters wide onto the wall (or 1 meter wide from 0.5 meter distance, etc.).
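If you want to estimate image sizes in code, the arithmetic is just a division. Here is a minimal Python sketch of it (the numbers are only examples):

```python
def image_width(distance_m: float, throw_ratio: float) -> float:
    """Projected image width in metres for a given distance and throw ratio."""
    return distance_m / throw_ratio

print(image_width(1.0, 0.5))   # 2.0 m wide from 1 m away
print(image_width(3.0, 1.09))  # ~2.75 m wide from 3 m away
```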
#### LCD vs. DLP ####

There are a few different types of projectors, in the sense of how they actually create the pixels on screen. Each technology has its own strengths and weaknesses:

1. LCD projectors
    - Pros: Amazing colours. No artifacts when taking photos or videos.
    - Cons: Black levels aren't the best (dark grey instead of black).

2. DLP projectors
    - Pros: Black levels are usually better than LCD. Native support for 3D through DLP-Link 3D glasses.
    - Cons: Depending on the shutter speed, problems might arise when trying to take photos or videos (rainbow effect). Some people's eyes are sensitive to this, too. Colour reproduction is often not as good as with LCD.

#### Brightness ####

When it comes to brightness, more is usually better! Thankfully, we have some really bright ones at CCI (up to 7000 lumen). The light output of the projector gets spread over the whole image, so if you make the image bigger (by placing the projector further away from the screen), it will become less bright. When using cameras, it sometimes helps to dial the brightness down a little, in order not to overexpose or blind the camera.

#### Lens shift and mounting ####

Our fancier projectors like the Epsons, Panasonics and NECs have a feature called 'lens shift' (both horizontal and vertical). This allows them to shift the image up/down or left/right without physically moving the projector or distorting the image. Very handy!

Most 'simpler' projectors that do not have lens shift tend to project slightly upward - in the sense that if you put them flat (horizontally; level) onto a floor or table, they will project a rectangular image slightly upward onto the wall. This means that if you want to mount one of those projectors from the ceiling, you can place them upside down so that they project slightly downward onto the wall or screen.

# 3D scanning

# Sound system

# MIDI controllers

# DJ controller

# Leap Motion (hand & finger tracking)

# Cameras and tripods

# 86" Multi-touch Display

[![Viewboard.jpg](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/n5XXB5dbfNDvyV7i-viewboard.jpg)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/n5XXB5dbfNDvyV7i-viewboard.jpg)

# **Viewsonic 86" 4K multi-touch display**

The Dark Lab has been outfitted with a large, two-meter-wide multi-touch screen.
It has a built-in drawing app for sketching and group ideation, but you can also plug in your computer and use it for touch-based interaction - or just as a big 4K monitor!

## **How to use it?**

Use the button panel on the front to power up the screen. It will automatically load up the whiteboard drawing app.

To use your computer instead: just plug in the USB-C cable provided! This should work for both the video and touch data.

Please use the stylus pens whenever you can, as the screen is a giant fingerprint magnet!
Feel free to use finger-based interaction - just use the micro-fiber cloth and screen cleaner spray when you are done (be gentle ;))

### **Using the built-in whiteboard**

The Viewboard has very capable built-in whiteboard functionality. It's vector-based and 'non-destructive' (i.e. you can change edits later on), with an infinite canvas plus a bunch of neat functions.
The styluses have a sharp pointed side and a blunt side. The touch board can tell the difference, and will let you assign different colours depending on which side you used to touch the colour palette.

For some nice videos about its other whiteboard functions see here: [Viewsonic 'Basic Whiteboarding' how-to videos](https://youtube.com/playlist?list=PLDfhD4-dQNN7FEvmgDy7gByk9TJ1ainQF&si=qkBvTu5mjEmb5ZYQ)
You can load / save your sketches onto a memory stick or hard drive by plugging it into the USB ports on the front of the device.
In the whiteboard app or home screen, tap the folder-shaped icon to bring up the options for saving and loading files.

Although you can store files on the device itself, DO NOT store anything personal/sensitive that you do not want others to see!!

### **For more information:**

The ['Viewsonic Education North America' YouTube channel](https://youtube.com/@ViewSonicEducation) has loads of other how-to videos, in case you ever get stuck on anything - including how to use it with Windows and macOS.

### **Enjoy!!**

# DMX lights

# DMX (stage) lights

[![DMX_lights_01_HDR_cropped.jpg](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/xvREBwUiJbtPuFj9-dmx-lights-01-hdr-cropped.jpg)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/xvREBwUiJbtPuFj9-dmx-lights-01-hdr-cropped.jpg)

## What is DMX? ##

DMX (Digital Multiplex), aka DMX512, is the industry-standard protocol for controlling lighting and effects in clubs, theatres and tons of other places. At its core, DMX data consists of a list of 512 numbers, each between 0 and 255, usually referred to as 'channels'. This data gets sent through an XLR-style cable from the lighting controller to the lights.
Most light fixtures (and other DMX-enabled devices) have both a DMX-in and a DMX-out port. This allows them to be daisy-chained together, meaning you can control multiple lights through just one cable. Some lights might have one function (e.g. dimmer), while others might have a whole range of controllable functions like Red, Green, Blue, White, Dimmer, Strobe, Pan, Tilt, etc. This means that each light takes a specific number of channels to control it, and that number will differ between lights. All lights have a function to set their 'DMX address', indicating which of the 512 channels are meant for that light.

For example: if one light takes **eight** channels to control it, and the next one takes **fourteen** channels:

- The first light would be set to DMX **address 1**, and takes DMX **channels 1-8**.
- The second light would then be set to **address 9**, and take **channels 9-22**.
- The third light would be **address 23**, etc.
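If you are driving the lights from code, the same addressing arithmetic applies: a DMX frame is just 512 byte values, and each fixture reads its own consecutive slice starting at its address. Below is a minimal Python sketch of that mapping; the fixture layout and the final `send_frame` call are assumptions for illustration, since actually sending the frame depends on your interface and its library.

```python
# One DMX frame: 512 channels, each 0-255. Index 0 holds channel 1.
frame = bytearray(512)

def set_fixture(frame: bytearray, address: int, values: list[int]) -> None:
    """Write a fixture's channel values starting at its (1-based) DMX address."""
    for offset, value in enumerate(values):
        frame[address - 1 + offset] = value

# Hypothetical layout matching the example above:
set_fixture(frame, 1, [255, 0, 128, 0, 0, 0, 0, 0])  # 8-channel light, channels 1-8
set_fixture(frame, 9, [0] * 14)                       # 14-channel light, channels 9-22

# Sending 'frame' depends on your hardware (USB DMX interface, Art-Net node,
# Arduino DMX shield, ...) and its library, e.g. a hypothetical:
# send_frame(frame)
```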

In general, for controlling *any* light, you will want to look up the User Manual for that model to find out which channels control which function of the light.
Just scroll down to find a table like this:

[![DMX channel list example.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/qIVfExE041HjFiXv-dmx-channel-list-example.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/qIVfExE041HjFiXv-dmx-channel-list-example.png)

## How to use the Dark Lab lights ##

The permanently mounted stage lights in the lab are all daisy-chained together. There is a little switch box where you can select what controls the lights: the lighting desk, a USB DMX interface, or the wireless DMX receiver (ask the Kit Room or a Dark Lab technician for the wireless DMX transmitter).
We also have DMX shields that can plug into your Arduino.

All lights in the Lab are labeled with their DMX (starting) addresses. Please refer to the User Manuals of the lights to know which consecutive channels correspond to which functions:
[Martin Rush Batten 1 Hex User Manual](https://wiki.cci.arts.ac.uk/attachments/13)
[Martin Rush MH6 Wash User Manual](https://wiki.cci.arts.ac.uk/attachments/15)
The Eurolite lighting desk has already been set up to work with our specific lights, so you won't need to worry about addresses and channels if you just want to change the colours!
If you want to incorporate DMX into your projects and make it interactive through code (using e.g. a USB or Arduino DMX interface to connect to the lights), you will need to keep the above in mind though!

## What DMX equipment is available to take home? ##

We also have a few lights and DMX interfaces that you can take home. These are bookable through [ORB](https://orb-arts.siso.co), our online loan store and equipment booking system. Feel free to have a look, or have a go!
The technicians will be happy to help with any questions you might have.


# Virtual Reality

We have a number of VR headsets available to students, including access to high-spec gaming PCs for use with them where necessary.

# Which VR headsets does CCI have?

![Photo of VR area](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2024-01/scaled-1680-/jHIRtkyqxrxWFERI-vr.webp)

- [Valve Index VR headset](https://store.steampowered.com/valveindex)
- [Vive Pro Eye Wireless](https://www.vive.com/uk/product/vive-pro-eye/overview)
- [Vive Focus 3](https://www.vive.com/uk/product/vive-focus3/overview/)

# Using the VIVE Pro 2 Full Kit

[![](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/scaled-1680-/MnMj9OIuCeYkZHFX-image-1739874864498-34-18.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/MnMj9OIuCeYkZHFX-image-1739874864498-34-18.png)

We have the [VIVE Pro 2 headset](https://www.vive.com/uk/product/vive-pro2/overview/), base stations, controllers (HTC Vive controllers and Valve Index controllers), and trackers in the lab.

## Setup

*For the full official tutorial, please refer to: [VIVE Pro 2 Setup Video](https://www.youtube.com/watch?v=VAcPyq6UTws)*

---

## 1. Hardware Connections & Power Check

**Before powering on your PC: Ensure all devices are powered and connected**

- Headset, controllers, Lighthouse base stations, and the Vive link box should all be connected and displaying a solid green light.
- *Link box connections:*
  - **DisplayPort cable** → Graphics card port
  - **USB 3.0 cable** → PC USB port
  - **Power adapter** → Mains socket (requires 9V/1.5A or above)

---

## 2. Software Setup

### Essential Installations

1. **Steam & SteamVR**
   - Download and install the Steam client.
   - Search for and install **SteamVR** from your Steam library.
2. **Room Tracking Configuration**
   - Open SteamVR > **Room Setup**
   - Base stations should be mounted at a height of at least 2m, positioned diagonally across the play area.
   - *Controller pairing:* Press and hold the system button until the LED flashes blue → complete pairing via the SteamVR interface.

---

## 3. Unreal Engine 5 Integration

### Initial Project Setup

1. **Create a New VR Project**
   - Template: `Games > Virtual Reality`
   - Select `Blueprint` and `Starter Content` (for quick testing)
   - **Tip:** Begin with a simple scene (e.g., an empty template) and gradually increase complexity.
2. **Enable Key Plugins**
   - Go to `Edit > Plugins` and enable the following:
     - **OpenXR** (recommended for SteamVR compatibility and future-proofing)
     - **OpenXR Hand Tracking**
     - **OpenXR Vive Tracker** (if using external trackers)
     - **SteamVR** (required for the Vive Pro 2)
     - **Oculus VR** (if Oculus headset compatibility is needed)
     - **VR Expansion Plugin** (for advanced interaction features)
   - *Restart UE5 to apply changes.*
3. **Optimise Project Settings**
   - Open `Edit > Project Settings` and adjust the following:
     - **Rendering**: Enable Forward Rendering and disable Mobile HDR.
     - **XR Settings**: Tick `Start in VR` and enable `Motion Controller Support`.
   - These settings improve performance and ensure VR input responsiveness.

---

## 4. Testing & Troubleshooting

### Launching VR Preview

1. Click the dropdown next to the **Play** button (⚠️ not the default **Play** mode) and select **VR Preview**.
2. Put on the headset to test real-time scene rendering.

---

### ❗ Common Issues & Checks

- **Device Not Detected:**
  - Confirm cables are securely connected and devices are powered.
  - Restart SteamVR or the PC if necessary.
- **Tracking Issues:**
  - Ensure base stations are unobstructed and correctly positioned.
  - Check for reflective surfaces or direct sunlight interfering with the sensors.
- **Performance Lag:**
  - Lower the rendering resolution or disable unnecessary plugins.
  - Update graphics drivers and Unreal Engine to the latest stable version.

With these steps completed, your HTC Vive Pro 2 should be ready for Unreal Engine 5.2+.

# Connecting Vive Trackers to Unreal Engine

For the full official tutorial, please refer to:

- [Pairing Vive Trackers (3.0)](https://www.vive.com/us/support/tracker3/category_howto/pairing-vive-tracker.html)
- [Fixing Live Link XR in Unreal Engine 5.2 and 5.3](https://www.youtube.com/watch?v=GKsPjufVPwg)

### 1. Get Your Trackers Ready

First things first: make sure each Vive Tracker is turned on and paired through SteamVR. **SteamVR Settings > Devices > Pair Controllers > I want to pair a different type of controller... > HTC Vive Tracker**. You'll know it's good to go when the LED light is solid green.

- Click the 3 little lines on the top left of the SteamVR desktop dashboard window to open the **SETTINGS**.

[![](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/scaled-1680-/50L9XXJ2ig0Kd3z3-image-1739880267531.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/50L9XXJ2ig0Kd3z3-image-1739880267531.png)

- Then go to Devices, choose Pair Controller, click on 'I want to pair a different type of controller', then select HTC Vive Tracker.

[![](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/scaled-1680-/T0EQKDHBkbdhzwwH-image-1739879352796-49-04.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/T0EQKDHBkbdhzwwH-image-1739879352796-49-04.png)

- Turn on the Tracker.

[![](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/scaled-1680-/M1quu2QzCA6pzmD8-image-1739879396953-49-48.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-02/M1quu2QzCA6pzmD8-image-1739879396953-49-48.png)

---

### 2. Set Up the Plugin

- Double-check that SteamVR is running.
- Open Unreal Engine and go to **Edit > Plugins**.
- Find and enable **OpenXR Vive Tracker**.

---

### 3. Assign Tracker Roles

After pairing your trackers you will need to **assign them to the correct body part** in SteamVR's Manage Trackers section (Left Foot, Right Foot, etc.). It's best to do them one at a time and label each tracker in some way so you know which body part is paired with that tracker.

Now let's tell SteamVR which tracker does what:

- **SteamVR Settings > Controllers > Manage Trackers**
- Assign the roles like this:
  - **Waist** → `waist`
  - **Left Foot** → `left_foot`
  - **Right Foot** → `right_foot`

---

### 4. Integrate with Unreal Engine

- Open your project's blueprint.
- Locate the **Motion Controller** component.
- Add a **Live Link** component to stream tracker data.
- Make sure the **Role** matches the SteamVR assignments.

---

### 5. Find and Link the Pawn Asset

- Connect the pawn's motion controllers to the corresponding trackers. You can check here for more details: [Fixing Live Link XR in Unreal Engine 5.2 and 5.3](https://www.youtube.com/watch?v=GKsPjufVPwg)
- Open the **Content Browser**.
- Search for **pawn**, and follow the tutorial video.

---

### 6. Test and Calibrate

- Run your project in **VR Preview**.
- Do some basic movements to check if everything tracks correctly.

# Kinect

Kinect is a line of motion-sensing input devices made by Microsoft. It is a depth-sensing camera featuring an RGB camera and a depth sensor.

# Know the Kinect

There is a [Tech Demo](https://www.youtube.com/watch?v=OWzjn656kb4) from Microsoft of what the Kinect can do.
## Kinect can do:

- Skeletal tracking in 3D space with human body joints
- IR image
- Depth image
- Spatial audio recording

## Kinect can’t do:

- High-precision motion capture
- Hand (gesture) / face (landmark) tracking (with only the native Kinect SDK)

**Kinect offers native support on Windows only.**

For macOS users, the alternative is to use a webcam with OpenCV or MediaPipe for tracking. Leap Motion is also an option, but it is limited to hand and gesture tracking on a smaller scale.

There are two versions of Kinect available from the Kit Room: Kinect for Windows V2 and Azure Kinect. The functionality is mostly the same, with some differences in specs and form factor. Kinect for Windows V2 is set up and installed in the Dark Lab.

[![Kinect for Windows V2.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/bhJ5vqlksEBGD2wF-kinect-for-windows-v2.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/bhJ5vqlksEBGD2wF-kinect-for-windows-v2.png)

Kinect for Windows V2

[![Azure Kinect.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/2uPFUInLfR5fd0vy-azure-kinect.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/2uPFUInLfR5fd0vy-azure-kinect.png)

Azure Kinect

# Install Drivers

Kinect for Windows V2 and Azure Kinect require different drivers and SDKs. Identify your Kinect and install the corresponding SDK from below.

[![Kinect for Windows V2.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/bhJ5vqlksEBGD2wF-kinect-for-windows-v2.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/bhJ5vqlksEBGD2wF-kinect-for-windows-v2.png)

*Kinect for Windows V2*

[![Azure Kinect.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/2uPFUInLfR5fd0vy-azure-kinect.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/2uPFUInLfR5fd0vy-azure-kinect.png)

*Azure Kinect*

## Kinect for Windows V2

Download and install the Kinect for Windows SDK 2.0:
[Download Kinect for Windows SDK 2.0 from the Official Microsoft Download Center](https://www.microsoft.com/en-gb/download/details.aspx?id=44561)

After you install the SDK, connect your Kinect to your computer and make sure the power supply is connected; the USB cable only transfers data.

Head to the SDK Browser v2.0 to find the SDKs and documentation. You don’t need to touch the SDKs to use the Kinect, but the installation is required for other applications to run on it.

[![Kinect SDK Browser.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/Ds4e3P56AOE7sgYt-kinect-sdk-browser.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/Ds4e3P56AOE7sgYt-kinect-sdk-browser.png)

Use the Kinect Configuration Verifier to check if the Kinect is working properly. It may take some time to run; if you can see the colour and depth images in the last section, then everything is all set.
[![Verify Kinect Configuration.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/Dl6UK1DZmlmNvqy5-verify-kinect-configuration.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/Dl6UK1DZmlmNvqy5-verify-kinect-configuration.png)

[![Verify Kinect Configuration - good.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/7USNKabA56jAD5sc-verify-kinect-configuration-good.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/7USNKabA56jAD5sc-verify-kinect-configuration-good.png)

You can view your live Kinect feed with Kinect Studio 2.0.

[![Kinect Studio.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/6T4dLGTm0wWn8mkU-kinect-studio.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/6T4dLGTm0wWn8mkU-kinect-studio.png)

## Azure Kinect

The Azure Kinect SDK can be found on GitHub; follow the instructions to download and install the latest version.

[**GitHub - microsoft/Azure-Kinect-Sensor-SDK: A cross platform (Linux and Windows) user mode SDK to read data from your Azure Kinect device.**](https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/develop/docs/usage.md)

Connect the Azure Kinect to your computer. The Azure Kinect can be powered with a standalone power supply or directly over USB-C. Make sure you use the bundled USB-C cable or a quality cable that meets the power delivery and data speed requirements.

Verify the connection and view the live feed with the Azure Kinect Viewer.

[![Azure Kinect Viewer.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/eYnAATiDC8Dddiux-azure-kinect-viewer.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/eYnAATiDC8Dddiux-azure-kinect-viewer.png)

## Troubleshooting

### Kinect doesn’t show up as a device / couldn’t connect to the Kinect

- Check your USB connection
- Check if the Kinect is connected to power
- Try a different USB cable that is known good for data and power

**The light on the Kinect only turns on when an application is actively using the device.**

### Kinect for Windows connects, but loses connection / reboots every couple of minutes

Go to your system sound settings and find Microphone Array Xbox NUI Sensor. Make sure this device is allowed for audio. If it is not allowed, the Kinect won’t initialise properly and will try to reboot every minute.

[![Troubleshoot Microphone permission.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/v76uZusN0xOHNGR2-troubleshoot-microphone-permission.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/v76uZusN0xOHNGR2-troubleshoot-microphone-permission.png)

# Use Kinect in TouchDesigner

# TOP and CHOP

There are generally two different ways of using the Kinect in TouchDesigner: the Kinect TOP and the Kinect CHOP.

`Kinect TOP` offers all image sensor data in a TOP.
`Kinect CHOP` provides skeletal tracking data through different channels in a CHOP.

# TOP

The image data can be accessed through the `Kinect TOP` or `Kinect Azure TOP` respectively. Below is an example of creating a coloured point cloud from the depth and colour images from the Kinect. The project file for this example can be found [here](https://artslondon-my.sharepoint.com/:f:/g/personal/r_wang_arts_ac_uk/EvjnQDbNE25Gmmnno8spEHwBjtGwH8weM9ACEGr_8q4qVQ?e=ul74Ke).
[![Kinect Point Cloud.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/rJimaPc42ekOZAim-kinect-point-cloud.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/rJimaPc42ekOZAim-kinect-point-cloud.png)

# CHOP

`Kinect CHOP` offers the skeletal tracking data for each joint and its x/y/z position in different coordinate spaces through different channels. Use a `Select CHOP` to select the channels you need; [Pattern Matching](https://docs.derivative.ca/Pattern_Matching) is helpful for filtering and selecting multiple channels with conditions.

[![Kinect CHOP and Select.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/l4t6UZA9vBXwi9Za-kinect-chop-and-select.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/l4t6UZA9vBXwi9Za-kinect-chop-and-select.png)

## Multi-player Tracking

A common problem when using Kinect skeletal tracking in an installation is finding the right person to track. When you need to track only one person, setting `Max Player` to 1 passes the handling of player selection to the Kinect; most of the time it will lock on to the closest person, and there is no way to manually switch player. When there are more people in the Kinect tracking space, this can be a problem.

A good approach is to keep `Max Player` at the maximum and create custom logic to filter and select the player you want to track. Every time the Kinect detects a new player in the frame, they are assigned a `p*/id`. You can use this id to keep tracking locked on the same person, no matter their position or how many players are in frame.

For each player, you can use the x/y/z positions of the head joint (or any other joint), `p*/head:*`, to calculate its relative distance to any point in space. By using some maths to draw a boundary or create sorting logic, you can map a physical point in the real world to a point in Kinect coordinate space, so the Kinect only uses the tracking data from the person standing right on that point.

Below is an example of selecting a player based on the relative position to the vertical centre of the frame `x = 0` and a point `x = 0, z = 3`. The project file can be found [here](https://artslondon-my.sharepoint.com/:f:/g/personal/r_wang_arts_ac_uk/EvjnQDbNE25Gmmnno8spEHwBjtGwH8weM9ACEGr_8q4qVQ?e=ul74Ke).

[![Kinect Distance Based Select Player.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/N8Oow5YuF0scsUA4-kinect-distance-based-select-player.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/N8Oow5YuF0scsUA4-kinect-distance-based-select-player.png)
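The same selection logic is easy to prototype in plain Python before wiring it up with CHOPs. The sketch below is only an illustration: the `players` dictionary stands in for head-joint positions you would read from the Kinect CHOP channels, and the target point and radius are made-up values.

```python
import math

# Hypothetical tracking snapshot: player id -> head position (x, y, z) in metres.
# In TouchDesigner these values would come from the p*/head channels of the Kinect CHOP.
players = {
    1: (0.40, 1.60, 2.10),
    2: (-0.05, 1.55, 3.10),
    3: (1.20, 1.70, 4.00),
}

TARGET = (0.0, 3.0)   # target point on the floor plane: x = 0, z = 3
MAX_RADIUS = 0.75     # only accept players within this distance (metres)

def pick_player(players, target=TARGET, max_radius=MAX_RADIUS):
    """Return the id of the player closest to the target point, or None."""
    best_id, best_dist = None, max_radius
    for pid, (x, _y, z) in players.items():
        dist = math.hypot(x - target[0], z - target[1])  # distance in the floor plane
        if dist < best_dist:
            best_id, best_dist = pid, dist
    return best_id

print(pick_player(players))  # -> 2 in this example
```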
# Use Kinect Anywhere

# Stream Data

Although only a limited set of software (for example TouchDesigner) offers native support for the Kinect via the Kinect SDK, you can stream Kinect data from one of these applications to almost anywhere, including to different environments like Python, Node.js or even web apps such as p5.js. Within a local network, you can effectively set up one Windows computer as the Kinect host and stream data to the same machine or to different machines, making it possible to use Kinect data on a Raspberry Pi, macOS or even an Arduino.

Depending on the use case, you can stream the raw camera data as a video feed, or stream the skeletal tracking data as a live data feed. TouchDesigner works well as the host for both.

## Stream Image Feed via NDI

NDI (Network Device Interface) is a real-time video-over-IP protocol developed by NewTek.
It's designed for sending high-quality, low-latency video and audio over a local network (LAN), with minimal setup and high performance. Read the [NDI Documentation](https://docs.ndi.video/all/getting-started/what-is-ndi) for more information.

You can use NDI to stream Kinect video feeds (colour, depth, IR) from TouchDesigner to:

- Another TouchDesigner instance (on the same or a different machine)
- OBS Studio for recording or streaming
- Unreal Engine, Unity, Max/MSP, etc.
- Custom apps using the NDI SDK or NDI-compatible plugins

Set up an NDI stream in TouchDesigner with the `NDI Out TOP`; you can create different streams for different image sources (TOPs) with different names.

[![NDI in TouchDesigner.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/scaled-1680-/10FJaEBLFAcvmyUV-ndi-in-touchdesigner.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-06/10FJaEBLFAcvmyUV-ndi-in-touchdesigner.png)

## Stream Body Tracking Data via OSC

OpenSoundControl (OSC) is a data transport specification (an encoding) for real-time message communication among applications and hardware over a network, typically UDP or TCP. OSC was originally created as a highly accurate, low-latency, lightweight and flexible method of communication for use in real-time musical performance, and is widely used in music, art, motion tracking, lighting, robotics and more.

You can use OSC to stream body tracking data from the Kinect in TouchDesigner to other software (or vice versa), such as:

- Unity, Unreal Engine
- Processing, openFrameworks
- Max/MSP, Pure Data, Isadora
- Web apps (via a WebSocket bridge)
- Python, Node.js apps
- Other TouchDesigner instances

Send OSC data from TouchDesigner with the `OSC Out CHOP` or `OSC Out DAT`. Use the `CHOP` when sending multiple channels straight from a CHOP; use the `DAT` with a custom Python script to further manipulate and format the data before sending.

[![OSC Out Chop.png](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-07/scaled-1680-/0L3MKBQ9QSlbmflB-osc-out-chop.png)](https://wiki.cci.arts.ac.uk/uploads/images/gallery/2025-07/0L3MKBQ9QSlbmflB-osc-out-chop.png)

## Stream with Kinectron for Web

OSC communication is typically implemented with UDP, which is fast and easy for native applications to send and receive over the same local network. However, web applications in the browser run in an isolated sandbox and do not have access to local UDP ports. To get data into your web application, you need a bridge that communicates with your web app through WebSocket or WebRTC.

[Kinectron](https://github.com/kinectron/kinectron) enables real-time streaming of Azure Kinect data to the web browser. Visit the [Kinectron Releases](https://github.com/kinectron/kinectron/releases) page to download the latest version of the server-side application and the client-side library for using the data.

**Note that Kinectron V1.0.0 only supports the Azure Kinect. Support for Kinect for Windows V2 can be found in the older version 0. Find more about [Kinectron V0](https://kinectron.github.io/#/) and its usage examples.**

# Receiving data

OSC is widely supported in a range of applications and programming languages. Find a package or library to receive OSC data, or bind a socket to the UDP port and listen for and parse any OSC messages.

### Unity

Script and examples for receiving OSC messages:
[https://t-o-f.info/UnityOSC/](https://t-o-f.info/UnityOSC/)

### Unreal

Unreal has a built-in OSC plugin; enable it from the plugin manager and start with a Blueprint. Find the documentation here:
[OSC Plug-in Overview](https://dev.epicgames.com/documentation/en-us/unreal-engine/osc-plugin-overview-for-unreal-engine)

### Processing

Use the oscP5 library for Processing:
[oscP5](https://sojamo.de/libraries/oscp5/)

### openFrameworks

openFrameworks has an add-on that supports OSC natively; find the documentation here:
[ofxOsc Documentation](https://openframeworks.cc/documentation/ofxOsc/)

### MaxMSP and Pure Data

Use the `udpsend` and `udpreceive` objects to send and receive OSC messages.

### Ableton

Use the Connection Kit in Max for Live to send and receive OSC data in Ableton. More info here:
[Connection Kit](https://www.ableton.com/en/packs/connection-kit/)

### Python

There are multiple packages available for OSC in Python.
One example is `python-osc`. Install it with `pip install python-osc` and find the documentation here: [python-osc](https://python-osc.readthedocs.io/en/latest/)
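As a minimal sketch of receiving with `python-osc` (the port and the `/p1/head*` address pattern are assumptions; match them to whatever you set on the sending side, e.g. in the OSC Out CHOP):

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_head(address, *args):
    # args holds the values sent on this OSC address, e.g. x/y/z floats
    print(address, args)

dispatcher = Dispatcher()
dispatcher.map("/p1/head*", on_head)   # hypothetical address pattern
dispatcher.set_default_handler(print)  # log anything else that arrives

# Listen on all interfaces, UDP port 9000 (match your sender's settings)
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```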
### Node.js

There are multiple OSC packages for Node.js as well.
One example is `osc`; install it with `npm install osc` and find the documentation here: [osc.js](https://github.com/colinbdclark/osc.js)
With `osc.js`, you can also create a WebSocket bridge that forwards OSC messages to browser applications.

### Browser / Web Application

To use Kinect data in the browser, there are two options:

- Use Kinectron to stream data and receive it with the Kinectron client-side script (full support for the video stream and body tracking data)
- Use `osc.js` in Node.js to create a WebSocket bridge and forward selected tracking data through WebSocket; use the browser's native WebSocket API to receive the data

# Use Kinect Directly in Unreal and Unity

In game engines, you should be able to use the Kinect SDK directly, which still involves some lower-level programming and a fair amount of experience. There are some plugins developed for the Kinect, but some of them are paid and some haven't been updated for years. Look for the right plugin for your needs, depending on whether you want skeletal tracking or depth images.

### Unreal

[Neo Kinect](https://www.fab.com/listings/2b2c6d45-c984-4216-a5d0-35ca97fbd526) (paid)
[Azure Kinect for Unreal Engine](https://github.com/nama-gatsuo/AzureKinectForUE)

### Unity

[Azure Kinect and Femto Bolt Examples for Unity](https://assetstore.unity.com/packages/tools/integration/azure-kinect-and-femto-bolt-examples-for-unity-149700) (paid)
[Unity_Kinect](https://github.com/nuwud/Unity_Kinect) (free)