Control and Management of Audio/Video Streams and Real-time ORB Protocols

This document describes the design and implementation of an audio/video streaming application based on the OMG specification for the Control and Management of A/V Streams. A TAO-based implementation of the CORBA A/V streaming specification is available in the TAO release. A PDF paper describing the design and performance of TAO's A/V Streaming Service is available, as are additional papers by the DOC group and BBN Technologies that use TAO's A/V Streaming Service.


Traditional CORBA middleware is well suited for objects that communicate using request/response semantics. An increasing range of applications, e.g., multimedia applications, require the transfer of a continuous stream of information between objects.

Current streaming applications use proprietary mechanisms to establish and control streams, as well as to transport the data. The framework developed for this project is designed to allow the creation of interoperable streams. This design facilitates the independent development of client and server components.

In this document, we describe the various interfaces and interactions in the CORBA streaming framework. In addition, we discuss the design and implementation of a prototype audio/video streaming application that uses this framework. This prototype is based on TAO, which is a real-time CORBA ORB.

CORBA Streaming Framework Components

Figure 1 illustrates the key components of the CORBA Streaming framework.

Figure 1: CORBA Streaming Framework Components

The important roles of the framework components depicted above are explained below. The corresponding OMG interface name is given in parentheses.

Multimedia Device Factory (MMDevice)

The MMDevice component abstracts the notion of a multimedia device. This device might be physical (such as a video camera) or logical (such as a program that reads video clips from a file).

The MMDevice is responsible for creating new endpoints in response to requests to create new streams. The endpoint typically consists of a pair of objects -- a Virtual Device, or VDev, which encapsulates the device-specific parameters of the connection, and the StreamEndpoint, which encapsulates the transport-specific parameters of the connection. The roles of VDev and StreamEndpoint are described in more detail below.
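The factory role of the MMDevice can be sketched in plain C++. This is a conceptual illustration, not the actual TAO API: the class and member names (MMDeviceSketch, create_endpoint, media_format) are hypothetical, and a real MMDevice returns CORBA object references rather than value structs.

```cpp
#include <memory>
#include <string>
#include <utility>

// Device-specific half of a stream endpoint (codec, media format).
struct VDev {
  std::string media_format;   // e.g. "MPEG-1" or "ULAW"
};

// Transport-specific half of a stream endpoint (addressing information).
struct StreamEndpoint {
  std::string host;
  unsigned short port;
};

// MMDevice-style factory: each request for a new stream yields a fresh
// (VDev, StreamEndpoint) pair.
class MMDeviceSketch {
public:
  MMDeviceSketch(std::string format, std::string host)
    : format_(std::move(format)), host_(std::move(host)), next_port_(5000) {}

  std::pair<VDev, StreamEndpoint> create_endpoint() {
    StreamEndpoint ep{host_, next_port_};
    ++next_port_;                      // each stream gets its own port
    return { VDev{format_}, ep };
  }

private:
  std::string format_;
  std::string host_;
  unsigned short next_port_;
};
```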

The MMDevice is also the component where various policies governing the creation of the VDev and StreamEndpoint objects can be implemented. Currently, the implementation of MMDevice provides two concurrency strategies: a process-based strategy, which creates a new process for each stream, and a reactive strategy, which handles all streams within a single process.

Virtual Device (VDev)

The Virtual Device represents the device-specific parameters of a stream endpoint. For instance, a particular device might wish to support only "MPEG-1" compression, or only "ULAW" sound.

The stream establishment phase provides for negotiation regarding these parameters. If the negotiation fails, the stream can be torn down immediately.
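The negotiation during stream establishment can be sketched as a search for a media format both VDevs support. This is a simplified illustration under assumed names (negotiate_format is not an OMG operation); the real protocol exchanges configuration via the VDev interfaces.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Each VDev advertises the media formats its device supports; stream setup
// succeeds only if both parties share at least one format. Returns the
// agreed format, or an empty string if negotiation fails -- in which case
// the stream would be torn down immediately.
std::string negotiate_format(const std::vector<std::string>& a,
                             const std::vector<std::string>& b) {
  for (const std::string& fmt : a)
    if (std::find(b.begin(), b.end(), fmt) != b.end())
      return fmt;                      // first format acceptable to both
  return "";                           // no common format: tear down
}
```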

The current prototype (explained below) supports MPEG-1 video streams, and ULAW audio.

Stream Controller (StreamCtrl)

The Stream Controller (StreamCtrl) abstraction is used to bind the supplier and consumer of a stream (such as a video camera and a display). This abstraction also provides a standard mechanism to access stream controls, such as stop, pause, and play. In addition, the abstraction allows the specification of the Quality of Service (QoS) parameters (such as frames/sec) associated with the stream.

The current implementation of the StreamCtrl interface allows a server to associate IDL interfaces with a multimedia device. Once associated, a client can use the stream controller to gain access to the control interface, and use it to control the stream.
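The control operations exposed through the stream controller can be sketched as a small state machine. The class and operation names here (StreamController, start, pause, resume, stop) are hypothetical stand-ins, not the OMG IDL operations.

```cpp
#include <string>

// StreamCtrl-style controller sketch: standard control operations on a
// bound stream, tracked as a simple playback state.
class StreamController {
public:
  void start()  { state_ = "playing"; }
  void pause()  { if (state_ == "playing") state_ = "paused"; }
  void resume() { if (state_ == "paused") state_ = "playing"; }
  void stop()   { state_ = "stopped"; }
  const std::string& state() const { return state_; }

private:
  std::string state_ = "stopped";
};
```

A client holding a reference to such a controller can drive the stream without knowing anything about the underlying device or transport.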

Stream Endpoint (StreamEndpoint)

The StreamEndpoint is an abstraction that encapsulates the transport-specific parameters of a stream. For instance, a stream that uses UDP as the transport protocol will use a host name and a port number to identify each endpoint.

The current implementation of the StreamEndpoint provides a flexible way for applications to define and exchange such transport-level parameters.
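For a UDP flow, the transport-level parameters reduce to a host and a port. The sketch below shows one simple textual encoding that two endpoints could exchange during stream setup; the struct and function names are hypothetical, and TAO's actual wire representation differs.

```cpp
#include <string>

// Transport parameters for a UDP stream endpoint.
struct UdpAddress {
  std::string host;
  unsigned short port;
};

// Encode the address as "host:port" for exchange during stream setup.
std::string encode(const UdpAddress& a) {
  return a.host + ":" + std::to_string(a.port);
}

// Parse a "host:port" string back into its components.
UdpAddress decode(const std::string& s) {
  std::string::size_type colon = s.rfind(':');
  return { s.substr(0, colon),
           static_cast<unsigned short>(std::stoi(s.substr(colon + 1))) };
}
```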

Prototype Implementation of an Audio/Video Streaming Service

Using TAO, we have developed a prototype of a CORBA-based audio/video streaming application that uses the various components and interfaces discussed above.

The prototype is a client/server application. The client contacts the server and requests that the server establish streams for a particular movie (i.e. a video stream and an audio stream). The server is responsible for sending audio/video packets to the client. The client side then decodes and plays these packets.

The following describes the various components of the client and server in more detail.

Server Architecture

Figure 2 illustrates the architecture of the audio/video server.

Figure 2: CORBA Audio/Video Server architecture

The server uses a process-based concurrency model. The main server process contains the MMDevice endpoint factory (described earlier), which creates connection handlers in response to client connections. Each connection causes the creation of one audio process and one video process.
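The process-based model described above can be sketched with POSIX fork(): the main server process spawns one child per stream and reaps them on completion. This is a minimal POSIX illustration, not the server's actual code; in the real server each child transmits media packets for the lifetime of the stream.

```cpp
#include <sys/wait.h>
#include <unistd.h>

// Spawn one child process per stream type (audio and video), as the
// process-based server does per client connection. Returns the number of
// children reaped by the parent.
int spawn_stream_handlers() {
  const char* streams[] = { "audio", "video" };
  for (const char* s : streams) {
    pid_t pid = fork();
    if (pid == 0) {
      // Child: in the real server this process would stream 's' packets
      // to the client until the stream is torn down.
      (void)s;
      _exit(0);
    }
  }
  // Parent: reap both children.
  int reaped = 0;
  int status = 0;
  while (wait(&status) > 0)
    ++reaped;
  return reaped;
}
```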

The video/audio processes are responsible for the following tasks:

Client Architecture

Figure 3 depicts the key components in the client architecture:

Figure 3: CORBA Audio/Video Client architecture

The client also uses a process-based concurrency architecture. The individual processes are described next:

Future Work

This section describes our plans for future work:

Nagarajan Surendran & Sumedh Mungee