
I am building an RTSP streaming server in Java using Xuggler, but I'm not sure how to implement correct RTP packetization.

My current approach is to call readNextPacket(packet) on the input container, then build an RTP packet whose payload is packet.getData(), with an appropriate header (payload type derived from the stream index, timestamp from getTimestamp(), etc.), and send it.
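The header construction described above can be sketched as follows. This is a minimal illustration of the fixed 12-byte RTP header per RFC 3550, not Xuggler code; the payload type 14 (MPEG audio, RFC 3551) and the field values in main are illustrative assumptions.

```java
import java.nio.ByteBuffer;

public class RtpHeader {

    // Packs the fixed 12-byte RTP header (RFC 3550, section 5.1).
    // No CSRC list, no extension; V=2, P=0, X=0, CC=0.
    static byte[] build(int payloadType, int seq, long timestamp,
                        long ssrc, boolean marker) {
        ByteBuffer buf = ByteBuffer.allocate(12);
        buf.put((byte) 0x80);                                   // V=2, P=0, X=0, CC=0
        buf.put((byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F)));
        buf.putShort((short) seq);                              // 16-bit sequence number
        buf.putInt((int) timestamp);                            // 32-bit RTP timestamp
        buf.putInt((int) ssrc);                                 // 32-bit SSRC
        return buf.array();
    }

    public static void main(String[] args) {
        // Illustrative values: PT 14 = MPEG audio per RFC 3551.
        byte[] h = build(14, 1, 90000L, 0xCAFEBABEL, false);
        System.out.println(h.length);
        System.out.println((h[0] >> 6) & 0x3);
    }
}
```

The payload bytes (e.g. from packet.getData()) would then be appended after these 12 bytes.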

Can someone provide a practical example of how to turn an IPacket into a correct RTP payload, in the most input-format-independent way? The documentation is a bit lacking on this.
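One part that is genuinely format-independent is mapping the container's timestamps onto the RTP clock. A hedged sketch of that rescaling; the 1/1000 time base and 90 kHz clock rate below are illustrative assumptions, not values taken from the question:

```java
public class RtpTimestamp {

    // Rescales a packet timestamp from the stream's time base
    // (tbNum/tbDen seconds per tick, e.g. 1/1000 for milliseconds)
    // to ticks of the RTP clock (e.g. 90000 Hz for MPEG media).
    static long toRtpClock(long pts, int tbNum, int tbDen, int clockRate) {
        return pts * tbNum * clockRate / tbDen;
    }

    public static void main(String[] args) {
        // 500 ms on a 1/1000 time base at a 90 kHz RTP clock.
        System.out.println(toRtpClock(500, 1, 1000, 90000));
    }
}
```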

  • I believe it depends on the input format. E.g. see tools.ietf.org/html/rfc5219 on how to set up an RTP payload for MP3 files. There are other RFCs documenting other formats. In any case, the input format is very important: you do not want to split an MP3 frame between two packets, for example. Commented Jan 29, 2015 at 22:16
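As the comment notes, the payload framing is format-specific. For MPEG audio, the base framing is defined in RFC 2250: each packet carries an integral number of whole frames, prefixed by a 4-byte payload header (16 reserved bits plus a 16-bit fragmentation offset, both zero when frames are not split). A minimal sketch of that framing (RFC 2250, not RFC 5219's loss-tolerant variant):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class Mp3Payload {

    // Wraps one or more whole MP3 frames in the RFC 2250 MPEG audio
    // payload: a 4-byte header (MBZ + fragmentation offset, both zero
    // when frames are kept whole) followed by the frame bytes.
    static byte[] wrap(byte[]... frames) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(new byte[] { 0, 0, 0, 0 });   // MBZ = 0, frag offset = 0
        for (byte[] frame : frames) {
            out.write(frame);
        }
        return out.toByteArray();
    }
}
```

The result goes after the 12-byte RTP header; keeping the fragmentation offset at zero is what forces whole frames per packet.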

1 Answer


I've seen code that uses javax.media (JMF) to implement an RTP server.

import java.awt.Button;
import java.awt.Frame;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.io.File;
import java.io.IOException;
import java.net.MalformedURLException;

import javax.media.CannotRealizeException;
import javax.media.DataSink;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.NoDataSinkException;
import javax.media.NoDataSourceException;
import javax.media.NoProcessorException;
import javax.media.PlugInManager;
import javax.media.Processor;
import javax.media.ProcessorModel;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;

class MediaConvertion {

    private final MediaLocator mediaLocator;
    private DataSink dataSink;
    private Processor mediaProcessor;

    // Transcode the input to DVI audio and emit it as raw RTP.
    private static final Format[] FORMATS =
            new Format[] { new AudioFormat(AudioFormat.DVI_RTP) };

    private static final ContentDescriptor CONTENT_DESCRIPTOR =
            new ContentDescriptor(ContentDescriptor.RAW_RTP);

    public MediaConvertion(String url) {
        // url is the RTP destination, e.g. an rtp:// locator.
        mediaLocator = new MediaLocator(url);
    }

    public void setDataSource(DataSource ds) throws IOException,
            NoProcessorException, CannotRealizeException, NoDataSinkException {
        // The realized processor handles decoding, transcoding and
        // RTP packetization for the configured output format.
        mediaProcessor = Manager.createRealizedProcessor(
                new ProcessorModel(ds, FORMATS, CONTENT_DESCRIPTOR));
        // The data sink pushes the packetized output to the RTP destination.
        dataSink = Manager.createDataSink(mediaProcessor.getDataOutput(),
                mediaLocator);
    }

    public void startTransmitting() throws IOException {
        mediaProcessor.start();
        dataSink.open();
        dataSink.start();
    }

    public void stopTransmitting() throws IOException {
        dataSink.stop();
        dataSink.close();
        mediaProcessor.stop();
        mediaProcessor.close();
    }
}

public class MediaConverterExample extends Frame implements ActionListener {

    Button st_stream;
    static MediaConvertion mdcon;

    public static void main(String[] args) throws IOException,
            NoProcessorException, CannotRealizeException, NoDataSinkException,
            MalformedURLException, NoDataSourceException {
        // Register the MP3 decoder so MPEG input can be transcoded.
        Format input1 = new AudioFormat(AudioFormat.MPEGLAYER3);
        Format input2 = new AudioFormat(AudioFormat.MPEG);
        Format output = new AudioFormat(AudioFormat.LINEAR);
        PlugInManager.addPlugIn("com.sun.media.codec.audio.mp3.JavaDecoder",
                new Format[] { input1, input2 }, new Format[] { output },
                PlugInManager.CODEC);
        // args[0] = RTP destination URL, args[1] = input media file.
        File mediaFile = new File(args[1]);
        DataSource source = Manager.createDataSource(
                new MediaLocator(mediaFile.toURI().toURL()));
        mdcon = new MediaConvertion(args[0]);
        mdcon.setDataSource(source);
        new MediaConverterExample();
    }

    public MediaConverterExample() {
        st_stream = new Button("Start Streaming");
        add(st_stream);
        st_stream.addActionListener(this);
        setSize(200, 300);
        setVisible(true);
    }

    public void actionPerformed(ActionEvent ae) {
        try {
            mdcon.startTransmitting();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
