-3 votes
0 answers
27 views

I have a Chrome extension made with React, and the same web app on a website. Each app should send and receive a stream over WebRTC. With the Chrome extension it works well; it sends and receives. On the ...
asked by Slava709
0 votes
0 answers
44 views

I'm trying to test low-latency video streaming using FFmpeg's WHIP output to stream to an aiortc-based Python server. The ICE connection completes successfully, but the DTLS handshake fails with "...
asked by jac
0 votes
0 answers
33 views

I'm developing a VS Code extension that loads Yjs inside a WebView. VS Code WebViews cannot import ES modules due to CSP restrictions, so I cannot use: import * as Y from "https://esm.run/yjs...
asked by dont stalk
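A common workaround for the CSP restriction above is to ship a non-ESM (UMD/IIFE) bundle of Yjs with the extension and load it through a plain `<script>` tag whose source is rewritten with `webview.asWebviewUri(...)`, with the CSP allowing `webview.cspSource`. A minimal sketch of the HTML side (the URI values and file names here are placeholders, not the asker's actual setup):

```javascript
// Hypothetical helper: builds WebView HTML that loads a locally bundled,
// non-ESM build of yjs via a plain <script> tag. In a real extension,
// scriptUri would come from webview.asWebviewUri(...) and cspSource from
// webview.cspSource; both values below are stand-ins.
function buildWebviewHtml(scriptUri, cspSource) {
  return `<!DOCTYPE html>
<html>
<head>
  <meta http-equiv="Content-Security-Policy"
        content="default-src 'none'; script-src ${cspSource};">
</head>
<body>
  <!-- yjs.js is a UMD/IIFE bundle that attaches a global Y object -->
  <script src="${scriptUri}"></script>
</body>
</html>`;
}

// Example with placeholder values:
const html = buildWebviewHtml(
  'vscode-resource://fake/media/yjs.js',
  "'self'"
);
```

The key design point is that a classic script tag is not subject to the ESM-import restriction, so `Y` becomes available as a global inside the WebView.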
-1 votes
1 answer
48 views

I’m building an SFU in C#, using the SIPSorcery library for handling WebRTC media streams. I need to calculate the exact time difference between an incoming audio RTP packet from client A and a video ...
asked by Matin
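The standard way to align packets from different streams (and different RTP clock rates) is to map each RTP timestamp to wallclock time using the stream's most recent RTCP Sender Report, which pairs an NTP time with the RTP timestamp at that instant. A sketch of that mapping (field names are illustrative, not SIPSorcery's API; the math follows RFC 3550):

```javascript
// Map an RTP timestamp to wallclock milliseconds using the last RTCP
// Sender Report (SR) seen for that stream. sr.ntpMs / sr.rtpTimestamp are
// assumed fields holding the SR's NTP time and RTP timestamp.
function rtpToWallclockMs(rtpTimestamp, sr, clockRateHz) {
  // elapsed RTP ticks since the SR, with 32-bit unsigned wraparound
  const deltaTicks = (rtpTimestamp - sr.rtpTimestamp) >>> 0;
  return sr.ntpMs + (deltaTicks / clockRateHz) * 1000;
}

// Audio at 48 kHz and video at 90 kHz: map both to wallclock, then subtract.
const audioSr = { ntpMs: 1_000_000, rtpTimestamp: 960_000 };
const videoSr = { ntpMs: 1_000_000, rtpTimestamp: 180_000 };

const audioMs = rtpToWallclockMs(960_000 + 48_000, audioSr, 48_000); // +1 s
const videoMs = rtpToWallclockMs(180_000 + 45_000, videoSr, 90_000); // +0.5 s
const skewMs = audioMs - videoMs;
```

Until the first SR arrives on both streams there is no common clock, so an SFU typically buffers or reports "unsynchronized" for that window.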
0 votes
0 answers
72 views

I'm using the WebRTC example for Android. I find that AudioTrack is always used in the Java layer. Checking the native code, I found that WebRTC's native code calls isLowLatencyInputSupported() by ...
asked by Chandler
1 vote
0 answers
41 views

I am trying to build a realtime voice translation React Native application using mediasoup. My question is how to pass the WebRTC audio stream from the mediasoup server to an audio translation pipeline made ...
asked by Nitin Sharma
0 votes
1 answer
64 views

I'm building a peer-to-peer chat app using WebRTC and a WebSocket signaling server. Everything works fine up to registration — I can connect to the server, and I see "registered" in the ...
asked by kuu huu
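When a signaling flow stalls right after registration, the usual culprit is the relay step: the server must forward offer/answer/candidate messages to the named peer, or neither side ever receives the SDP. A minimal in-memory sketch of that routing logic (message shape — "register", "to", "from" — is an assumption; match it to your own schema):

```javascript
// Minimal signaling relay: registers peers by name and forwards
// offer/answer/candidate messages to the target peer's send function.
function createSignalingRouter() {
  const peers = new Map(); // name -> send function

  return function route(from, msg, send) {
    switch (msg.type) {
      case 'register':
        peers.set(msg.name, send);
        send({ type: 'registered', name: msg.name });
        break;
      case 'offer':
      case 'answer':
      case 'candidate': {
        const target = peers.get(msg.to);
        if (target) target({ ...msg, from }); // tag sender so peer can reply
        else send({ type: 'error', reason: `unknown peer ${msg.to}` });
        break;
      }
      default:
        send({ type: 'error', reason: `unknown type ${msg.type}` });
    }
  };
}

// Usage with two in-memory "sockets" standing in for WebSocket connections:
const route = createSignalingRouter();
const inboxA = [], inboxB = [];
route('a', { type: 'register', name: 'a' }, m => inboxA.push(m));
route('b', { type: 'register', name: 'b' }, m => inboxB.push(m));
route('a', { type: 'offer', to: 'b', sdp: '...' }, m => inboxA.push(m));
```

If "registered" arrives but the offer never reaches the other client, logging the relay branch (the `offer`/`answer`/`candidate` cases) is the first place to look.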
1 vote
0 answers
44 views

Question: I'm implementing a layered SFU architecture using Pion WebRTC in Go. The setup is as follows. Architecture: the Master SFU receives a single upstream client stream (audio + video); only ...
asked by yternal
0 votes
0 answers
62 views

I am working on an Electron pet project of mine to send files over the local network. For the actual sending part I chose to use wrtc via Simple-Peer, and it's on the backend (maybe weird, I know), so I ...
asked by Vlad Karelov
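For file transfer over a DataChannel (which is what Simple-Peer uses under the hood), a frequent stumbling block is message size: large sends must be split into chunks, with ~16 KiB the commonly cited safe cross-implementation limit (a convention, not a hard spec value), and backpressure handled via `bufferedAmount`. The chunking half can be sketched in plain Node:

```javascript
// Split a file buffer into DataChannel-sized chunks. 16 KiB is the
// commonly used safe message size; the sender would then write chunks
// while watching peer.bufferedAmount and pausing when it grows.
const CHUNK_SIZE = 16 * 1024;

function* chunkBuffer(buf, chunkSize = CHUNK_SIZE) {
  for (let offset = 0; offset < buf.length; offset += chunkSize) {
    yield buf.subarray(offset, Math.min(offset + chunkSize, buf.length));
  }
}

// Example: a 40 KiB payload becomes chunks of 16 KiB, 16 KiB, and 8 KiB.
const file = Buffer.alloc(40 * 1024, 1);
const chunks = [...chunkBuffer(file)];
```

The receiver concatenates chunks in order (ordered, reliable delivery is the DataChannel default) and needs some framing — e.g. a small JSON header with the file name and size — to know when a transfer is complete.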
0 votes
0 answers
48 views

I got the library to work ('react-native-webrtc'), and I can receive an audio stream. But on iOS, the mic permission is turned on and I can see the orange dot in the top right corner of the screen ...
asked by 20 Credi
0 votes
0 answers
34 views

How do you debug audio output issues, like hearing sound from both speakers when only one is desired, in mobile WebRTC? I'm developing one-to-one audio calling with WebRTC + React. When connected, each ...
asked by VYSHNAV MV
0 votes
1 answer
63 views

I have an Android Webview that has a meeting running inside of it. When I'm in full-screen mode, everything works perfectly. However, when I enter the PiP mode, I lose the audio. Participants can hear ...
asked by Mohammad Ihraiz
0 votes
0 answers
20 views

I'm implementing a WebRTC conference call feature using JSIP, and I'm trying to record both the local audio stream and multiple remote audio streams using MediaRecorder. This works fine in 1:1 calls, ...
asked by Bhumesh Deekonda
0 votes
0 answers
47 views

I built an AI voice agent with a TTS->LLM->STT pipeline; it should make outbound calls and interact with customers. How do I use the Amazon contact center with Kinesis Video Streams to manage this ...
asked by Zaki
0 votes
0 answers
82 views

I need help figuring out why I am unable to authenticate WebRTC clients with my TURN server. Server: I installed coturn on FreeBSD via $ pkg install turnserver. I have the following settings written in ...
asked by Duncan Britt
