OpenNI2 SDK
v2.3.0.81
|
Classes | |
class | CameraSettings |
interface | NewFrameListener |
Static Public Member Functions | |
static VideoStream | create (Device device, SensorType sensorType) |
The VideoStream object encapsulates a single video stream from a device. Once created, it is used to start data flow from the device and to read individual frames of data. This is the central class used to obtain data in OpenNI. It supports manually reading data in a polling loop, as well as event-driven data acquisition through events and a Listener class.
Aside from the video data frames themselves, the class offers a number of functions used for obtaining information about a VideoStream. Field of view, available video modes, and minimum and maximum valid pixel values can all be obtained.
In addition to obtaining data, the VideoStream object is used to set all configuration properties that apply to a specific stream (rather than to an entire device). In particular, it is used to control cropping, mirroring, and video modes.
A valid, initialized device that provides the desired stream type is required to create a stream.
Several video streams can be created to stream data from the same sensor. This is useful if several components of an application need to read frames separately.
While some devices might allow different streams from the same sensor to have different configurations, most devices have a single sensor configuration that is shared by all streams.
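For orientation, here is a minimal end-to-end sketch of the polling workflow described above. It is not part of this reference; the lifecycle method names OpenNI.initialize(), Device.open(), start(), readFrame(), release(), stop(), and destroy() are assumed from the standard OpenNI2 Java wrapper (org.openni).

```java
import org.openni.Device;
import org.openni.OpenNI;
import org.openni.SensorType;
import org.openni.VideoFrameRef;
import org.openni.VideoStream;

public class DepthPollingExample {
    public static void main(String[] args) {
        OpenNI.initialize();                          // load the OpenNI runtime
        Device device = Device.open();                // open the first available device
        VideoStream stream = VideoStream.create(device, SensorType.DEPTH);

        stream.start();                               // begin data generation
        for (int i = 0; i < 100; i++) {
            VideoFrameRef frame = stream.readFrame(); // blocks until a frame is ready
            System.out.println("Frame: " + frame.getWidth() + "x" + frame.getHeight());
            frame.release();                          // return the frame buffer to the stream
        }
        stream.stop();

        stream.destroy();                             // best practice: destroy what you create()
        device.close();
        OpenNI.shutdown();
    }
}
```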
|
inline |
Adds a new Listener to receive this VideoStream's onNewFrame event. See NewFrameListener for more information on implementing an event-driven frame reading architecture.
streamListener | Object which implements NewFrameListener that will respond to this event. |
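A sketch of an event-driven reader. It assumes the listener interface is the nested VideoStream.NewFrameListener with a single onFrameReady(VideoStream) callback, and that the registration methods are addNewFrameListener()/removeNewFrameListener(), as in the standard OpenNI2 Java wrapper.

```java
import org.openni.VideoFrameRef;
import org.openni.VideoStream;

public class DepthListener implements VideoStream.NewFrameListener {
    @Override
    public void onFrameReady(VideoStream stream) {
        // Called on OpenNI's event thread whenever a new frame is available.
        VideoFrameRef frame = stream.readFrame();
        System.out.println("Got frame " + frame.getWidth() + "x" + frame.getHeight());
        frame.release();
    }
}

// Registration and deregistration (device and stream creation omitted):
//   DepthListener listener = new DepthListener();
//   stream.addNewFrameListener(listener);
//   stream.start();
//   ...
//   stream.removeNewFrameListener(listener);
```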
|
inlinestatic |
Creates a stream of frames from a specific sensor type of a specific device. You must supply a reference to a Device that provides the requested sensor type. You can use Device#hasSensor(SensorType) to check whether a given sensor is available on your target device before calling create().
device | A reference to the Device you want to create the stream on. |
sensorType | The type of sensor the stream should produce data from. |
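For example, guarding create() with the hasSensor() check mentioned above (a sketch; the Device is assumed to be already open and initialized):

```java
import org.openni.Device;
import org.openni.SensorType;
import org.openni.VideoStream;

public class StreamFactory {
    /** Creates a depth stream, failing fast if the device has no depth sensor. */
    public static VideoStream createDepthStream(Device device) {
        if (!device.hasSensor(SensorType.DEPTH)) {
            throw new IllegalStateException("Device has no depth sensor");
        }
        return VideoStream.create(device, SensorType.DEPTH);
    }
}
```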
|
inline |
Destroys this stream. This function is currently called automatically by the destructor, but it is considered best practice for applications to call it manually on any VideoStream they called create() for.
|
inline |
Gets an object through which several camera settings can be configured.
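As a hedged illustration only: the accessor is assumed to be named getCameraSettings(), and CameraSettings is assumed to expose the auto-exposure and auto-white-balance toggles found in the standard OpenNI2 wrapper. Camera settings are typically only available on color streams.

```java
import org.openni.VideoStream;

public class CameraSettingsExample {
    /** Disables auto exposure and auto white balance on a (color) stream, if supported. */
    public static void lockExposure(VideoStream colorStream) {
        // getCameraSettings() and the setters below are assumed from the standard wrapper;
        // a null result is assumed to mean the stream exposes no camera settings.
        VideoStream.CameraSettings settings = colorStream.getCameraSettings();
        if (settings != null) {
            settings.setAutoExposureEnabled(false);
            settings.setAutoWhiteBalanceEnabled(false);
        }
    }
}
```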
|
inline |
|
inline |
Returns the underlying stream handle.
|
inline |
Gets the horizontal field of view of frames received from this stream.
|
inline |
Provides the maximum possible value for pixels obtained by this stream. This is most useful for getting the maximum possible value of depth streams.
|
inline |
Provides the smallest possible value for pixels obtained by this VideoStream. This is most useful for getting the minimum possible value that will be reported by a depth stream.
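A small sketch that uses both bounds to normalize raw depth values. The accessor names getMinPixelValue() and getMaxPixelValue() are assumed from the standard OpenNI2 Java wrapper.

```java
import org.openni.VideoStream;

public class DepthRange {
    /** Maps a raw depth sample into [0, 1] using the stream's reported pixel-value range. */
    public static float normalize(VideoStream depthStream, int rawDepth) {
        int min = depthStream.getMinPixelValue();
        int max = depthStream.getMaxPixelValue();
        if (max <= min) {
            return 0f; // degenerate range, nothing to scale against
        }
        return (float) (rawDepth - min) / (float) (max - min);
    }
}
```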
|
inline |
Checks whether mirroring is currently turned on for this stream.
|
inline |
Provides the SensorInfo object associated with the sensor that is producing this VideoStream.
SensorInfo is useful primarily as a means of learning which video modes are valid for this VideoStream.
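For example, listing the video modes that are valid for this stream. This sketch assumes the accessor is named getSensorInfo(), that SensorInfo#getSupportedVideoModes() returns a List<VideoMode>, and that the VideoMode getters match the standard OpenNI2 Java wrapper.

```java
import java.util.List;
import org.openni.SensorInfo;
import org.openni.VideoMode;
import org.openni.VideoStream;

public class ListModes {
    /** Prints every video mode the sensor behind this stream can produce. */
    public static void printSupportedModes(VideoStream stream) {
        SensorInfo info = stream.getSensorInfo();
        List<VideoMode> modes = info.getSupportedVideoModes();
        for (VideoMode mode : modes) {
            System.out.println(mode.getResolutionX() + "x" + mode.getResolutionY()
                    + " @ " + mode.getFps() + " fps, " + mode.getPixelFormat());
        }
    }
}
```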
|
inline |
Gets the sensor type for this stream.
|
inline |
Gets the vertical field of view of frames received from this stream.
|
inline |
Gets the current video mode information for this video stream, including its resolution, FPS, and stream format.
|
inline |
Checks whether this stream supports cropping.
|
inline |
Reads the next frame from this video stream, delivered as a VideoFrameRef. This is the primary method for manually obtaining frames of video data. If no new frame is available, the call will block until one is available. To avoid blocking, use NewFrameListener to implement an event-driven architecture. Another alternative is to use OpenNI#waitForAnyStream(java.util.List, int) to wait for new frames from several streams.
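A sketch of the blocking read path described above, returning the depth value at the center of the frame. The names readFrame(), getData(), getWidth(), getHeight(), and release() are assumed from the standard OpenNI2 Java wrapper, and the frame is assumed to hold little-endian 16-bit depth samples with a row stride of width * 2 bytes.

```java
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
import org.openni.VideoFrameRef;
import org.openni.VideoStream;

public class CenterDepth {
    /** Blocks for the next depth frame and returns the raw depth value at its center. */
    public static int readCenterDepth(VideoStream depthStream) {
        VideoFrameRef frame = depthStream.readFrame();   // blocks until a new frame arrives
        try {
            ShortBuffer pixels = frame.getData().order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
            int centerIndex = (frame.getHeight() / 2) * frame.getWidth() + frame.getWidth() / 2;
            return pixels.get(centerIndex) & 0xFFFF;     // depth samples are unsigned 16-bit
        } finally {
            frame.release();                             // always hand the buffer back
        }
    }
}
```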
|
inline |
Removes a Listener from this VideoStream's listener list. The removed listener will no longer receive new frame events from this stream.
streamListener | The listener object to be removed. |
|
inline |
Disables cropping.
|
inline |
Changes the cropping settings for this stream. You can use the isCroppingSupported() function to make sure cropping is supported before calling this function.
cropping | CropArea object specifying the desired cropping settings. |
|
inline |
Enables or disables mirroring for this stream.
isEnabled | true to enable mirroring, false to disable it. |
|
inline |
Sets the software filter.
softFilter | CLOSE to disable the software filter; OPEN to enable it. |
|
inline |
Enables or disables depth rotation (Atlas devices only).
enable | 1 to enable rotation, 0 to disable it. |
|
inline |
Sets the hole filter. The holeFilter value selects the size of the filter window.
holeFilter | 0 to disable the hole filter; 1 for a 3x3 window, 2 for 5x5, 3 for 7x7, 4 for 9x9. |
|
inline |
Sets the depth transfer format.
inputFormat | 2 for 10-bit (Atlas only), 3 for 11-bit, 4 for 12-bit. |
|
inline |
Sets the maximum depth value.
maxDepth | Maximum depth value. |
|
inline |
Sets the minimum depth value.
minDepth | Minimum depth value. |
|
inline |
Changes the current video mode of this stream. Recommended practice is to use Device#getSensorInfo(SensorType), and then SensorInfo#getSupportedVideoModes() to obtain a list of valid video mode settings for this stream. Then, pass a valid VideoMode to setVideoMode(VideoMode) to ensure correct operation.
videoMode | Desired new video mode for this stream. |
Returns a Status code indicating success or failure of this operation.
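Following the recommended practice above, a sketch that picks a supported 640x480 depth mode and applies it before starting the stream. The getters on Device, SensorInfo, and VideoMode, and the PixelFormat enum, are assumed from the standard OpenNI2 Java wrapper.

```java
import org.openni.Device;
import org.openni.PixelFormat;
import org.openni.SensorType;
import org.openni.VideoMode;
import org.openni.VideoStream;

public class SelectVideoMode {
    /** Switches the stream to a 640x480 DEPTH_1_MM mode if the device supports one. */
    public static void useVga(Device device, VideoStream depthStream) {
        for (VideoMode mode : device.getSensorInfo(SensorType.DEPTH).getSupportedVideoModes()) {
            if (mode.getResolutionX() == 640 && mode.getResolutionY() == 480
                    && mode.getPixelFormat() == PixelFormat.DEPTH_1_MM) {
                depthStream.setVideoMode(mode);   // apply before calling start()
                return;
            }
        }
        // No matching mode: leave the current video mode untouched.
    }
}
```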
|
inline |
Starts data generation from this video stream.
|
inline |
Stops data generation from this video stream.