Dark Lab

About the Dark Lab

The Dark Lab is located on the Ground floor of Greencoat Building, in GB_G05.

Opening Hours

            Open           Staffed
Weekdays    09:00–21:00    09:00–19:00
Saturday    10:00–18:00    Unstaffed
Sunday      Closed         Closed

Staff

Photo of Lieven van Velthoven
Lieven van Velthoven
Creative Code Specialist Technician
he/him

Dark Lab Equipment

Projectors

Our Projectors

In the Lab

We have a number of different video projectors available for use in the Dark Lab.
Some of them are permanently mounted in the truss grid: those are ready and waiting for you to plug in and use!

If you would like to project anywhere else in the Lab, please ask Lieven van Velthoven (Dark Lab / Code technician), and he will see if it is possible to mount one for you. (no guarantees, but do ask!)
Our projectors vary in brightness, throw ratio, zoom, lens shift, imaging technology and so on. We can have a chat about which one might best suit your project.
(these terms are explained below if you would like to learn a bit more!)

To take home

We also have a few projectors that you can take home - namely the BenQ ones listed below. These are bookable through ORB, our online loan store and equipment booking system.

What do we have?

Here are the models we currently have. Click on the links to find out more!
2x Epson EB-L635SU (6000 lumen, LCD, 0.8:1 medium short throw) (these drive the two main projection screens)
1x Panasonic PT-VMZ60 (6000 lumen, LCD, 1.09-1.77 throw ratio)
1x Panasonic PT-VMZ71 (7000 lumen, LCD, 1.09-1.77 throw ratio)
2x NEC P525UL (5000 lumen, LCD, 1.23-2.0 throw ratio)
1x Optoma EH460ST (4200 lumen, DLP, 0.5:1 short throw)
8x BenQ TH671ST (3000 lumen, DLP, 0.69-0.83 short throw)

How to use the Dark Lab projectors

The permanently mounted projectors all have an HDMI cable (with USB-C adapter) that you are free to plug into whenever no one else is using them. The two main screens (Area 1 and Area 2) are also hooked up to the two corresponding PCs, which have a little switch on the desk to choose between input from the PC or your own laptop.

The HDMI cables are labeled 'Projector 1 (HDMI 1)', etc., telling you which projector the cable is connected to and which input to select on the projector itself.
On (or next to) the screens you will find remotes to turn the projectors on and select the correct HDMI input.
(Please make sure to turn them off when you're done, and put the remote back where it was!)

Projector terminology

Throw ratio

The so-called 'throw ratio' of a projector specifies how narrow or wide the projection angle of the lens is. In other words, it tells you how big the image will be, depending on the distance from the screen or wall.
Throw ratio is the projector distance divided by the image width. So for example, a throw ratio of 0.5 means that from one meter away it will 'throw' an image of 2 meters wide onto the wall (or 1 meter wide from 0.5 meter distance, etc.).
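If it helps, here is the same arithmetic as a tiny Python sketch (the numbers are just illustrative):

    # Throw ratio = projector distance / image width,
    # so image width = distance / throw ratio.
    def image_width(distance_m, throw_ratio):
        return distance_m / throw_ratio

    print(image_width(1.0, 0.5))   # 2.0 m wide image from 1 m away (short throw)
    print(image_width(3.0, 1.77))  # roughly 1.7 m wide image from 3 m away (long end of a zoom)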

LCD vs. DLP

There are a few different types of projectors, in the sense of how they actually create the pixels on screen. Each technology has its own strengths and weaknesses:

  1. LCD projectors
    Pros: Amazing colours. No artifacts when taking photos or videos.
    Cons: Black levels aren't the best (dark grey instead of black).
  2. DLP projectors
    Pros: Black levels are usually better than LCD. Native support for 3D through DLP-Sync 3D glasses.
    Cons: Depending on the shutter speed, problems might arise when trying to take photos or videos (rainbow effect). Some people's eyes are sensitive to this, too. Colour reproduction is often not as good as with LCD.

Brightness

When it comes to brightness, more is usually better! Thankfully, we have some really bright ones at CCI (up to 7000 lumen).

The light output of the projector gets spread over the whole image, so if you make the image bigger (by placing the projector further away from the screen), the image becomes less bright.
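As a rough back-of-envelope estimate (ignoring screen gain and other losses), you can divide the lumens by the image area to compare setups; the little Python sketch below is purely illustrative:

    # The same lumens spread over a bigger image gives a dimmer image.
    lumens = 6000                      # e.g. one of the brighter Lab projectors
    for width_m in (1.0, 2.0, 4.0):
        height_m = width_m * 9 / 16    # assuming a 16:9 image
        area_m2 = width_m * height_m
        print(f"{width_m} m wide  ->  roughly {lumens / area_m2:.0f} lux")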

When using cameras, it sometimes helps to dial the brightness down a little, in order not to overexpose or blind the camera.

Lens shift and mounting

Our fancier projectors like the Epsons, Panasonics and NECs have a feature called 'lens shift' (both horizontal and vertical). This allows them to shift the image up/down or left/right without physically moving the projector or distorting the image. Very handy!

Most 'simpler' projectors that do not have lens shift tend to project slightly upward - in the sense that if you put them flat (horizontally; level) onto a floor or table, they will project a rectangular image slightly upward onto the wall. This means that if you want to mount one of those projectors from the ceiling, you can place them upside down so that they project slightly downward onto the wall or screen.

Dark Lab Equipment

86" Multi-touch Display

Viewboard.jpg

Viewsonic 86" 4K multi-touch display

The Dark Lab has been outfitted with a large, two meter wide multi-touch screen.
It has a built-in drawing app for sketching and (groupwise) ideation, but you can also plug in your computer and use it for touch-based interaction - or just as a big 4K monitor!

How to use it?

Use the button panel on the front to power up the screen. It will automatically load up the whiteboard drawing app.

To use your computer instead: just plug in the USB-C cable provided! This should work for both the video and touch data.

Please use the stylus pens whenever you can, as the screen is a giant fingerprint magnet!
Feel free to use finger-based interaction - just use the micro-fiber cloth and screen cleaner spray when you are done (be gentle ;))

Using the built-in whiteboard

The Viewboard has very capable built-in whiteboard functionality. It's vector-based, 'non-destructive' (i.e. you can change edits later on), with infinite canvas plus a bunch of neat functions.
The styluses have a sharp pointed side and a blunt side. The touch board can tell the difference, and will let you assign different colours depending on which side you used to touch the colour palette.

For some nice videos about its other whiteboard functions see here: Viewsonic 'Basic Whiteboarding' how-to videos

You can load / save your sketches onto a memory stick or hard drive by plugging it into the USB ports on the front of the device.
In the whiteboard app or home screen, tap the folder-shaped icon to bring up the options for saving and loading files.

Although you can store files on the device itself, DO NOT store anything personal/sensitive that you do not want others to see!!

For more information:

The 'Viewsonic Education North America' YouTube channel has loads of other how-to videos, in case you ever get stuck on anything - including how to use it with Windows and macOS.

Enjoy!!



Dark Lab Equipment

DMX lights

DMX (stage) lights

DMX_lights_01_HDR_cropped.jpg

What is DMX?

DMX (Digital Multiplex), aka DMX512, is the industry standard protocol for controlling lighting and effects in clubs, theatres and tons of other places.

At its core, DMX data consists of a list of 512 numbers, each between 0 and 255, usually referred to as 'channels'. This data gets sent through an XLR-style cable from the lighting controller to the lights.
Most light fixtures (and other DMX-enabled devices) have both a DMX-in and a DMX-out port. This allows them to be daisy-chained together, meaning you can control multiple lights through just one cable.

Some lights might have just one function (e.g. a dimmer), while others might have a whole range of controllable functions like Red, Green, Blue, White, Dimmer, Strobe, Pan, Tilt, etc. This means that each light takes a specific number of channels to control it, and that number will differ between lights.

All lights have a function to set their 'DMX address', indicating which of the 512 channels are meant for that light.
For example: if one light takes eight channels to control it and the next one takes fourteen, you could set the first to address 1 (using channels 1-8) and the second to address 9 (using channels 9-22).
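To make the addressing concrete, here is a minimal Python sketch of a DMX universe; the fixture layouts below are made up for illustration:

    # A DMX universe is just 512 channel values, each 0-255.
    universe = [0] * 512                       # index 0 = channel 1

    def set_fixture(universe, start_address, values):
        # Write a fixture's values starting at its (1-based) DMX address.
        for offset, value in enumerate(values):
            universe[start_address - 1 + offset] = value

    # An 8-channel fixture at address 1 uses channels 1-8...
    set_fixture(universe, 1, [255, 0, 128, 0, 0, 0, 0, 0])
    # ...so the next 14-channel fixture is addressed at 9 (channels 9-22).
    set_fixture(universe, 9, [0] * 14)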

In general, for controlling *any* light, you will want to look up the User Manual for that model to find out which channels control which function of the light.
Just scroll down to find a table like this:

DMX channel list example.png

How to use the Dark Lab lights

The permanently mounted stage lights in the lab are all daisy-chained together. There is a little switch box where you can select what controls the lights: i.e. the lighting desk, a USB DMX interface, or the wireless DMX receiver (ask the Kit Room or a Dark Lab technician for the wireless DMX transmitter).
We also have DMX shields that can plug into your Arduino.

All lights in the Lab are labeled with their DMX (starting) addresses. Please refer to the User Manuals of the lights to know which consecutive channels correspond to which functions:
Martin Rush Batten 1 Hex User Manual
Martin Rush MH6 Wash User Manual

The Eurolite lighting desk has already been set up to work with our specific lights, so you won't need to worry about addresses and channels if you just want to change the colours!
If you want to incorporate DMX into your projects and make it interactive through code (using e.g. a USB or Arduino DMX interface to connect to the lights), you will need to keep the above in mind though!

What DMX equipment is available to take home?

We also have a few lights and DMX interfaces that you can take home. These are bookable through ORB, our online loan store and equipment booking system. Feel free to have a look, or have a go!
The technicians will be happy to help with any questions you might have.


Virtual Reality

We have a number of VR headsets available to students, along with access to high-spec gaming PCs to drive them where necessary.

Virtual Reality

Which VR headsets does CCI have?

Photo of VR area

Virtual Reality

Using the VIVE Pro 2 Full Kit

We have the VIVE Pro 2 headset and base stations, controllers (HTC Vive controllers and Valve Index controllers), and trackers in the lab.

Setup

For the full official tutorial, please refer to: VIVE Pro 2 Setup Video


1. Hardware Connections & Power Check

Before powering on your PC, ensure all devices are powered and connected.


2. Software Setup

Essential Installations

  1. Steam & SteamVR

    • Download and install the Steam client.
    • Search for and install SteamVR from your Steam library.
  2. Room Tracking Configuration

    • Open SteamVR > Room Setup
    • Base stations should be mounted at a height of at least 2m, positioned diagonally across the play area.
    • Controller pairing: Press and hold the system button until the LED flashes blue → complete pairing via the SteamVR interface.

3. Unreal Engine 5 Integration

Initial Project Setup

  1. Create a New VR Project

    • Template: Games > Virtual Reality
    • Select Blueprint and Starter Content (for quick testing)
    • Tip: Begin with a simple scene (e.g., an empty template) and gradually increase complexity.
  2. Enable Key Plugins

    • Go to Edit > Plugins and enable the following:
      • OpenXR (recommended for SteamVR compatibility and future-proofing)
      • OpenXR Hand Tracking
      • OpenXR Vive Tracker (if using external trackers)
      • SteamVR (required for the Vive Pro 2)
      • Oculus VR (if Oculus headset compatibility is needed)
      • VR Expansion Plugin (for advanced interaction features)
    • Restart UE5 to apply changes.
  3. Optimise Project Settings

    • Open Edit > Project Settings and adjust the following:
      • Rendering: Enable Forward Rendering and disable Mobile HDR.
      • XR Settings: Tick Start in VR and enable Motion Controller Support.

These settings improve performance and ensure VR input responsiveness.


4. Testing & Troubleshooting

Launching VR Preview

  1. Click the dropdown next to the Play button (⚠️ not the default Play mode) and select VR Preview.
  2. Put on the headset to test real-time scene rendering.

❗ Common Issues & Checks

With these steps completed, your HTC Vive Pro 2 should be ready for Unreal Engine 5.2+.

Virtual Reality

Connecting Vive Trackers to Unreal Engine

For the full official tutorial, please refer to:

1. Get Your Trackers Ready

First things first: make sure each Vive Tracker is turned on and paired through SteamVR, via SteamVR Settings > Devices > Pair Controllers > I want to pair a different type of controller > HTC Vive Tracker. You’ll know it’s good to go when the LED light is solid green.


2. Set Up the Plugin


3. Assign Tracker Roles

After pairing your trackers, you will need to assign them to the correct body part in SteamVR's Manage Trackers section (Left Foot, Right Foot, etc.). It's best to do them one at a time and label each tracker in some way so you know which body part is paired with which tracker.

Now let’s tell SteamVR which tracker does what:


4. Integrate with Unreal Engine



6. Test and Calibrate

Kinect

Kinect is a line of motion-sensing input devices made by Microsoft. It is a depth-sensing camera featuring an RGB camera and a depth sensor.

Kinect

Know the Kinect

There is a tech demo from Microsoft of what Kinect can do.

Kinect can do:

Kinect can’t do:

** Kinect offers native support on Windows only

** For macOS users, the alternative is to use a webcam with OpenCV or MediaPipe for tracking. Leap Motion is also an option, but it is limited to hand and gesture tracking at a smaller scale.

There are two versions of Kinect available from the Kit Room: Kinect for Windows V2 and Azure Kinect. The functionality is mostly the same, with some differences in specs and form factor. Kinect for Windows V2 is set up and installed in the Dark Lab.

Kinect for Windows V2.png Kinect for Windows V2

Azure Kinect.png Azure Kinect

Kinect

Install Drivers

Kinect for Windows V2 and Azure Kinect require different drivers and SDKs. Identify your Kinect and install the corresponding SDK from below.

Kinect for Windows V2.png Kinect for Windows V2

Azure Kinect.png Azure Kinect

Kinect for Windows V2

Download and install the Kinect for Windows SDK 2.0

Download Kinect for Windows SDK 2.0 from Official Microsoft Download Center

After you install the SDK, connect your Kinect to your computer and make sure the power supply is connected; the USB cable only transfers data.

Head to SDK Browser v2.0 to find the SDKs and documentation. You don’t need to touch the SDKs to use the Kinect, but the installation is required for other applications to run on top of it.

Kinect SDK Browser.png

Use the Kinect Configuration Verifier to check if the Kinect is working properly. It may take some time to run; if you can see the colour and depth images in the last section, then everything is all set.

Verify Kinect Configuration.png

Verify Kinect Configuration - good.png

You can view your live Kinect feed with Kinect Studio 2.0

Kinect Studio.png

Azure Kinect

The Azure Kinect SDK can be found on GitHub; follow the instructions to download and install the latest version.

GitHub - microsoft/Azure-Kinect-Sensor-SDK: A cross platform (Linux and Windows) user mode SDK to read data from your Azure Kinect device.

Connect the Azure Kinect to your computer. The Azure Kinect can be powered by a standalone power supply or directly over USB-C. Make sure you use the bundled USB-C cable or a quality cable that meets the power delivery and data speed requirements.

Verify the connection and view the live feed from Azure Kinect Viewer.

Azure Kinect Viewer.png

Troubleshooting

Kinect doesn’t show up as a device / can’t connect to the Kinect

** The light on the Kinect only turns on when there’s an application actively using the device

Kinect for Windows connects, but loses connection/reboots every couple of minutes

Go to your system sound settings and find Microphone Array Xbox NUI Sensor. Make sure this device is allowed for audio. If not, the Kinect won’t initialize properly and will try to reboot every minute.

Troubleshoot Microphone permission.png

Kinect

Use Kinect in TouchDesigner

TOP and CHOP

There are generally two different ways of using the Kinect in TouchDesigner: the Kinect TOP and the Kinect CHOP. The Kinect TOP offers all the image sensor data in a TOP; the Kinect CHOP provides skeletal tracking data through different channels in a CHOP.

TOP

The image data can be accessed through Kinect TOP or Kinect Azure TOP respectively.

Below is an example of creating a coloured point cloud from the depth and colour images from the Kinect. The project file for this example can be found here.

Kinect Point Cloud.png

CHOP

The Kinect CHOP offers skeletal tracking data for each joint, with its x/y/z position in different coordinate spaces through different channels. Use a Select CHOP to select the channels you need; Pattern Matching is helpful for filtering and selecting multiple channels with conditions.

Kinect CHOP and Select.png

Multi-player Tracking

A common problem when using Kinect skeletal tracking in an installation is finding the right person to track. When you need to track only one person, setting Max Players to 1 passes the handling of player selection to the Kinect; most of the time it will lock on to the closest person, and there is no way to manually switch player. When there are more people in the Kinect's tracking space, this can be a problem.

A good approach is to keep Max Players at Max and create custom logic to filter and select the player you want to track. Every time the Kinect detects a new player in the frame, they are assigned a p*/id. You can use the id to keep tracking locked on the same person, regardless of their position or how many players are in frame. For each player, you can use the x/y/z positions from the head joint (or any other joint), p*/head:*, to calculate its relative distance to any point in space. With a bit of maths to draw a boundary or create sorting logic, you can map a physical point in the real world to a point in Kinect coordinate space, so that the Kinect only uses the tracking from the person standing right on that point.

Below is an example of selecting a player based on their relative position to the vertical centre of the frame (x = 0) and a point at x = 0, z = 3. The project file can be found here.

Kinect Distance Based Select Player.png
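As a plain-Python illustration of this kind of distance-based selection (not the project file above, and with made-up head positions), the sorting logic boils down to something like this:

    import math

    target = (0.0, 0.0, 3.0)   # the point in Kinect space you care about (x, y, z)

    def closest_player(players, target):
        # players: dict of player id -> (x, y, z) head position from the p*/head:* channels
        best_id, best_dist = None, float("inf")
        for pid, position in players.items():
            dist = math.dist(position, target)
            if dist < best_dist:
                best_id, best_dist = pid, dist
        return best_id

    players = {1: (0.4, 1.6, 2.8), 2: (-1.2, 1.5, 3.5)}
    print(closest_player(players, target))   # -> 1, the player nearest the target point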

Kinect

Use Kinect Anywhere

Stream Data

Although only a limited set of software, for example TouchDesigner, offers native support for the Kinect via the Kinect SDK, you can stream Kinect data from one of these applications to almost anywhere, including different environments such as Python, Node.js or even web apps like p5.js.

Within a local network, you can effectively set up one Windows computer as the Kinect host and stream data to the same machine or to different machines, letting you use the Kinect with a Raspberry Pi, macOS or even an Arduino.

Depending on the use case, you can stream the raw camera data as a video feed, or stream the skeletal tracking data as a live data feed. TouchDesigner works well as the host for both.

Stream Image Feed via NDI

NDI (Network Device Interface) is a real-time video-over-IP protocol developed by NewTek. It's designed for sending high-quality, low-latency video and audio over a local network (LAN), with minimal setup and high performance. Read the NDI Documentation for more information.

You can use NDI to stream Kinect video feeds (color, depth, IR) from TouchDesigner to:

Set up an NDI stream in TouchDesigner with the NDI Out TOP; you can create different streams for different image sources (TOPs) with different names.

NDI in TouchDesigner.png

Stream Body Tracking Data via OSC

OpenSoundControl (OSC) is a data transport specification (an encoding) for realtime message communication among applications and hardware over a network, typically via UDP or TCP. OSC was originally created as a highly accurate, low-latency, lightweight and flexible method of communication for realtime musical performance, and is widely used in music, art, motion tracking, lighting, robotics and more.

You can use OSC to stream body tracking data from Kinect in TouchDesigner to other software (or vice versa), such as:

Send OSC data from TouchDesigner with the OSC Out CHOP or the OSC Out DAT. Use the CHOP when sending multiple channels straight from a CHOP; use the DAT with a custom Python script to further manipulate and format the data before sending.

OSC Out Chop.png
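If you go the DAT route, a rough sketch of the per-frame callback might look like the following; the operator and channel names here are assumptions, so adapt them to your own network:

    # In an Execute DAT: grab head-joint channels from a Select CHOP and
    # send them through an OSC Out DAT named 'oscout1'.
    def onFrameEnd(frame):
        head = op('select_head')             # Select CHOP holding p1/head:tx/ty/tz
        tx = head['p1/head:tx'].eval()
        ty = head['p1/head:ty'].eval()
        tz = head['p1/head:tz'].eval()
        op('oscout1').sendOSC('/p1/head', [tx, ty, tz])
        return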

Stream with Kinectron for Web

OSC communication is typically implemented over UDP, which is fast and easy for native applications to send and receive on the same local network. However, a web application in the browser runs in an isolated sandbox and does not have access to local UDP ports. To get data into your web application, you need a bridge that talks to your web app over WebSocket or WebRTC.
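If you ever want to roll your own bridge instead of using Kinectron (described below), a minimal OSC-to-WebSocket relay in Python could look roughly like this; the ports and package choices (python-osc and websockets) are assumptions, not a fixed recipe:

    import asyncio
    import json

    import websockets
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import AsyncIOOSCUDPServer

    clients = set()

    async def ws_handler(websocket, path=None):
        # Keep track of connected browsers until they disconnect.
        clients.add(websocket)
        try:
            await websocket.wait_closed()
        finally:
            clients.discard(websocket)

    def on_osc(address, *args):
        # Forward every incoming OSC message to all connected browsers as JSON.
        message = json.dumps({"address": address, "args": list(args)})
        for ws in list(clients):
            asyncio.create_task(ws.send(message))

    async def main():
        dispatcher = Dispatcher()
        dispatcher.set_default_handler(on_osc)
        osc = AsyncIOOSCUDPServer(("0.0.0.0", 9000), dispatcher, asyncio.get_running_loop())
        transport, protocol = await osc.create_serve_endpoint()
        async with websockets.serve(ws_handler, "0.0.0.0", 8765):
            await asyncio.Future()           # run forever

    asyncio.run(main())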

Kinectron enables real-time streaming of Azure Kinect data to the web browser. Visit the Kinectron Releases page to download the latest version of the server-side application and the client-side library for using the data.

** Note that Kinectron V1.0.0 only supports the Azure Kinect. Support for Kinect for Windows V2 can be found in the older version 0. Find out more about Kinectron V0 and its usage examples.

Receiving data

OSC is widely supported in a range of applications and programming languages. Find a package or library to receive OSC data, or bind a socket to the UDP port yourself and listen for and parse the OSC messages.

Unity

Script and examples for receiving OSC messages:
https://t-o-f.info/UnityOSC/

Unreal

Unreal has a built-in OSC plugin; enable it from the plugin manager and start with a Blueprint. Find the documentation here:
OSC Plug-in Overview

Processing

Use the oscP5 library for Processing:
oscP5

openFrameworks

openFrameworks has an add-on to support OSC natively; find the documentation here:
ofxOsc Documentation

MaxMSP and Pure Data

Use the udpsend and udpreceive objects to send and receive OSC messages.

Ableton

Use the Connection Kit in Max for Live to send and receive OSC data in Ableton. More info in:
Connection Kit

Python

There are multiple packages available for OSC in Python.
One example is python-osc. Install it with pip install python-osc and find the documentation: python-osc
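A minimal receiver with python-osc might look like this; the address and port are assumptions, so match them to whatever your OSC Out CHOP/DAT is sending:

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_head(address, *args):
        # e.g. address '/p1/head' with (tx, ty, tz) as arguments
        print(address, args)

    dispatcher = Dispatcher()
    dispatcher.map("/p1/head", on_head)     # handler for one address
    dispatcher.set_default_handler(print)   # catch anything else for debugging

    server = BlockingOSCUDPServer(("0.0.0.0", 7000), dispatcher)
    server.serve_forever()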

Node.js

There are multiple OSC packages for Node as well.
One example is osc. Install it with npm install osc and find the documentation: osc.js

With osc.js, you can also create a WebSocket bridge that forwards OSC messages to browser applications.

Browser / Web Application

To use Kinect data in the browser, there are two options

Use Kinect Directly in Unreal and Unity

In game engines, you should be able to use the Kinect SDK directly, although this involves some lower-level programming and a fair amount of experience. There are some plugins developed for the Kinect, but some of them are paid and some haven't been updated for years. Look for the right plugin for your needs, depending on whether you want skeletal tracking or the depth image.

Unreal

Neo Kinect (paid)
Azure Kinect for Unreal Engine

Unity

Azure Kinect and Femto Bolt Examples for Unity (paid)
Unity_Kinect (free)