Samples for Azure Kinect DK programming
OpenCV_OneKinect
Using the OpenCV 4.1.0 function imshow to display raw RGB/IR/Depth data acquired from one Kinect.
OpenCV_TwoKinects
Using the OpenCV 4.1.0 function imshow to display raw RGB/IR/Depth data acquired from two Kinects on one PC.
OpenGL_GLUT_ShowImage
Displaying raw RGB/IR/Depth data as the background in a FreeGLUT OpenGL rendering environment.
OpenGL_GLUT_ArUco_AR
Using the ArUco library to compute the model-view and projection matrices for rendering the augmented scene.
OpenGL_GLFW_GLEW_ArUco_AR
Using GLFW + GLEW with OpenGL 3 GLSL to display raw RGB data from the Kinect as the background and render the ArUco-assisted augmented scene.
OpenGL_GLFW_GLEW_PointCloudRenderer
Using GLFW + GLEW with OpenGL 3 GLSL to display raw RGB data from the Kinect as the background and render the point cloud computed from the raw Depth data.
Aruco_TwoKinects_Calibration_Extrinsics
Using the ArUco library to calibrate the extrinsic matrix between two Kinects. The result is two CSV files storing the transformation matrices "sub => master" and "sub => marker".
OpenCV_TwoKinects_GreenScreen
The code is copied from the Azure Kinect SDK "green screen" example, but this project is based on OpenCV 4.1.0.
OneKinect_Recording_RGB_DEPTH_IR
Records the RGB + Depth + IR streams into an MKV video file.
OneKinect_Playback_RGB_DEPTH_IR
Plays back the MKV video file using OpenCV.
Open3D_OneKinect
Using Open3D to open the Azure Kinect device and show the RGB + Depth images or the point cloud via Open3D's visualization class.
Copyright © 2019 Haipeng WANG
Distributed under the MIT License.
Kinect2Grabber
Azure Kinect DK SDK
PS:
I chose the Visual Studio 2017 Win64 generator when running CMake.
In Visual Studio 2017, building the x64|Release configuration is strongly recommended.
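A typical out-of-source build with that generator might look like the following (paths are illustrative and depend on where you cloned the repository):

```shell
# Configure with the VS2017 64-bit generator, then build Release.
mkdir build
cd build
cmake -G "Visual Studio 2017 Win64" ..
cmake --build . --config Release
```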