1 vote
0 answers
35 views

I wrote a Swift macOS app to control a PCI audio device. The code switches input and output channels by default. As soon as I launch the Audio MIDI Setup utility and switch channels, my code stops ...
Purgen • 11
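The excerpt is cut off, but one common ingredient for this kind of problem is a CoreAudio property listener that fires when Audio MIDI Setup changes the device. A minimal sketch, assuming the device's AudioObjectID is already known and that the preferred stereo channel pair is the property being changed (both are assumptions):

```swift
import CoreAudio
import Foundation

// Sketch: get notified when Audio MIDI Setup changes a device's preferred
// stereo channel pair. `deviceID` is assumed to be obtained elsewhere.
func watchPreferredStereoChannels(on deviceID: AudioObjectID) {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyPreferredChannelsForStereo,
        mScope: kAudioDevicePropertyScopeOutput,
        mElement: kAudioObjectPropertyElementMain
    )
    AudioObjectAddPropertyListenerBlock(deviceID, &address, DispatchQueue.main) { _, _ in
        // Re-read the channel configuration here and reapply the app's own routing.
        print("Preferred stereo channels changed on device \(deviceID)")
    }
}
```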
2 votes
0 answers
61 views

I'm building an audio app with Xcode, using the AudioPlayer class from the AudioKit package to handle playback. When using AudioPlayer's seek() to seek to a specific point in an audio file, it works, ...
Giles.Spreadborough
3 votes
0 answers
68 views

I maintain a macOS app that needs to recognize devices using the device's UID. To do this, I retrieve the device's UID using the kAudioDevicePropertyDeviceUID property, store that, and use it for ...
Matt Green • 2,122
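For reference, a minimal sketch of reading kAudioDevicePropertyDeviceUID with the plain CoreAudio C API; the helper name is mine, not from the question:

```swift
import CoreAudio
import Foundation

// Sketch: read a device's persistent UID string.
func deviceUID(for deviceID: AudioObjectID) -> String? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyDeviceUID,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var uid: CFString? = nil
    var size = UInt32(MemoryLayout<CFString?>.size)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &uid)
    guard status == noErr, let result = uid else { return nil }
    return result as String
}
```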
0 votes
1 answer
59 views

My app can play Standard MIDI Files with an AVAudioUnitMIDIInstrument. I've been setting a sound font like this, where soundfontURL is a Swift URL object: AudioUnitSetProperty(audioUnit, ...
arlomedia • 9,133
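The snippet is truncated, but for comparison: AVAudioUnitSampler, a subclass of AVAudioUnitMIDIInstrument, can load a SoundFont without calling AudioUnitSetProperty directly. A minimal sketch; the placeholder path stands in for the question's soundfontURL:

```swift
import AVFoundation
import AudioToolbox

// Sketch: load a SoundFont into an AVAudioUnitSampler instead of setting
// kMusicDeviceProperty_SoundBankURL by hand.
let soundfontURL = URL(fileURLWithPath: "/path/to/soundfont.sf2") // placeholder
let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

do {
    try sampler.loadSoundBankInstrument(at: soundfontURL,
                                        program: 0,
                                        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                        bankLSB: UInt8(kAUSampler_DefaultBankLSB))
    try engine.start()
} catch {
    print("Failed to load sound bank: \(error)")
}
```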
4 votes
1 answer
72 views

I am working on an application to detect when an audio device is being used. This app runs on macOS. For macOS versions starting with Sonoma I can use this code: int getAudioProcessPID(AudioObjectID process) ...
RuLoViC • 903
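For reference, a Swift sketch of the Sonoma-era approach the question refers to, assuming the kAudioProcessPropertyPID selector from the macOS 14 SDK; it does not help on earlier systems, which is presumably what the question is about:

```swift
import CoreAudio
import Foundation

// Sketch (macOS 14+): read the owning process ID of an AudioObject that
// represents a process.
func audioProcessPID(_ process: AudioObjectID) -> pid_t? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioProcessPropertyPID,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var pid: pid_t = -1
    var size = UInt32(MemoryLayout<pid_t>.size)
    guard AudioObjectGetPropertyData(process, &address, 0, nil, &size, &pid) == noErr else {
        return nil
    }
    return pid
}
```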
1 vote
0 answers
42 views

What I do: build the AudioDriverKit example driver from Apple and install it on an M1 iPad. Allow the driver in Settings. Open a browser and go to YouTube. Play some video. Stop playback and ...
tuple_cat • 1,396
1 vote
1 answer
106 views

I'm building a macOS audio batch processor in Swift using AVFoundation. I want to combine two stereo WAV files into a single 4-channel WAV file (either at 44100 Hz or 48000 Hz sample rate). I've tried: ...
Arjun • 134
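One way to do this entirely offline is to read both files into deinterleaved buffers and copy their channels into a 4-channel buffer before writing. A minimal sketch, assuming both inputs are stereo and share a sample rate; all names are illustrative:

```swift
import AVFoundation

// Sketch: interleave two stereo files into one 4-channel WAV.
func mergeToQuad(fileAURL: URL, fileBURL: URL, outputURL: URL) throws {
    let fileA = try AVAudioFile(forReading: fileAURL)
    let fileB = try AVAudioFile(forReading: fileBURL)
    let frames = AVAudioFrameCount(min(fileA.length, fileB.length))

    guard let bufA = AVAudioPCMBuffer(pcmFormat: fileA.processingFormat, frameCapacity: frames),
          let bufB = AVAudioPCMBuffer(pcmFormat: fileB.processingFormat, frameCapacity: frames) else {
        return
    }
    try fileA.read(into: bufA, frameCount: frames)
    try fileB.read(into: bufB, frameCount: frames)

    // 4-channel output format with an explicit quadraphonic layout.
    guard let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Quadraphonic) else {
        return
    }
    let quadFormat = AVAudioFormat(standardFormatWithSampleRate: fileA.processingFormat.sampleRate,
                                   channelLayout: layout)
    let outFile = try AVAudioFile(forWriting: outputURL, settings: quadFormat.settings)

    guard let outBuf = AVAudioPCMBuffer(pcmFormat: outFile.processingFormat, frameCapacity: frames) else {
        return
    }
    outBuf.frameLength = frames

    // Channels 0-1 come from file A, channels 2-3 from file B.
    let bytesPerChannel = Int(frames) * MemoryLayout<Float>.size
    for ch in 0..<2 {
        memcpy(outBuf.floatChannelData![ch], bufA.floatChannelData![ch], bytesPerChannel)
        memcpy(outBuf.floatChannelData![ch + 2], bufB.floatChannelData![ch], bytesPerChannel)
    }
    try outFile.write(from: outBuf)
}
```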
1 vote
0 answers
34 views

I need to make a multi-track audio editor in which the speed, pitch, or volume of every individual audio clip can be changed. My idea is to make each clip an AVAudioPlayerNode (in ...
luckysmg • 389
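That design maps onto AVAudioEngine as one AVAudioPlayerNode plus one AVAudioUnitTimePitch per clip, all mixed by the main mixer. A minimal sketch with illustrative names:

```swift
import AVFoundation

// Sketch: one player and one time-pitch unit per clip, feeding the mixer.
let engine = AVAudioEngine()

func addClip(_ file: AVAudioFile, rate: Float, pitchCents: Float, volume: Float) -> AVAudioPlayerNode {
    let player = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.rate = rate          // playback speed (1/32 ... 32)
    timePitch.pitch = pitchCents   // pitch shift in cents (-2400 ... 2400)
    player.volume = volume         // per-clip volume

    engine.attach(player)
    engine.attach(timePitch)
    engine.connect(player, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil, completionHandler: nil)
    return player
}
```

After engine.start(), call play() on each returned player; per-clip volume works because AVAudioPlayerNode adopts AVAudioMixing.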
1 vote
0 answers
37 views

I'm using AVFoundation to make a multi-track editor app that can insert multiple tracks and clips, including scaling some clips to change their speed (also, I'm not sure whether AVFoundation ...
luckysmg • 389
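If the editor stays in AVFoundation's composition model rather than AVAudioEngine, a speed change on a clip is normally expressed by scaling its time range. A minimal sketch with illustrative names:

```swift
import AVFoundation

// Sketch: insert one audio clip into an AVMutableComposition and scale its
// time range to double its playback speed.
func makeComposition(clipURL: URL) throws -> AVMutableComposition {
    let asset = AVURLAsset(url: clipURL)
    let composition = AVMutableComposition()
    guard let srcTrack = asset.tracks(withMediaType: .audio).first,
          let dstTrack = composition.addMutableTrack(withMediaType: .audio,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid) else {
        return composition
    }
    let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
    try dstTrack.insertTimeRange(fullRange, of: srcTrack, at: .zero)

    // Halving the duration doubles the playback speed of that range.
    let halfDuration = CMTimeMultiplyByRatio(asset.duration, multiplier: 1, divisor: 2)
    dstTrack.scaleTimeRange(fullRange, toDuration: halfDuration)
    return composition
}
```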
0 votes
0 answers
62 views

I observed that the aggregate device works normally in standard microphone mode. However, in voice isolation mode, when retrieving input channels, only the first sub-device's input channel count in ...
user22997196
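For reference, the usual way to count a device's input channels, which is where the voice-isolation discrepancy would show up; a minimal sketch using kAudioDevicePropertyStreamConfiguration:

```swift
import CoreAudio
import Foundation

// Sketch: count a device's input channels from its input stream configuration.
func inputChannelCount(of deviceID: AudioObjectID) -> Int {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyStreamConfiguration,
        mScope: kAudioDevicePropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain
    )
    var size: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }

    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size),
                                               alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }

    let listPtr = raw.bindMemory(to: AudioBufferList.self, capacity: 1)
    let buffers = UnsafeMutableAudioBufferListPointer(listPtr)
    return buffers.reduce(0) { $0 + Int($1.mNumberChannels) }
}
```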
1 vote
1 answer
46 views

I love the syntax for the new CoreAudio API. Unfortunately it seems to leak like a sieve. Is there any way to avoid the leaks without needing to go back to the old API? This code leaks. func ...
Devin Roth
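The leaking code is not shown, so only a general, hedged note: when leaks come from autoreleased temporaries accumulating in a loop, wrapping each iteration in an explicit autoreleasepool sometimes keeps memory flat. Whether that applies to this API is an assumption:

```swift
import Foundation

// General mitigation sketch, not specific to the question's API: drain
// autoreleased temporaries each iteration so they cannot pile up.
for _ in 0..<10_000 {
    autoreleasepool {
        // ... query the audio objects here ...
    }
}
```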
1 vote
1 answer
81 views

I am using SimplyCoreAudio to switch the clock source. The code is as follows: debugPrint("nullAudioDevice.clockSourceID \(nullAudioDevice.clockSourceID)") if let cid = ...
user22997196
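For comparison, the same switch expressed with the underlying CoreAudio property that SimplyCoreAudio wraps; kAudioDevicePropertyClockSource takes a UInt32 clock source ID, and the global scope used here is an assumption:

```swift
import CoreAudio
import Foundation

// Sketch: set a device's clock source with plain CoreAudio. `deviceID` and
// `clockSourceID` are assumed inputs.
func setClockSource(_ clockSourceID: UInt32, on deviceID: AudioObjectID) -> Bool {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyClockSource,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var sourceID = clockSourceID
    let status = AudioObjectSetPropertyData(deviceID, &address, 0, nil,
                                            UInt32(MemoryLayout<UInt32>.size), &sourceID)
    return status == noErr
}
```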
1 vote
0 answers
39 views

I'm trying to use AudioKit on tvOS, but I'm running into a compiler issue. When I try to compile my project, I get the following error in Settings.swift: 'isiOSAppOnMac' is only available in tvOS ...
user3578473
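The generic fix for this class of error is an availability guard around the call; whether AudioKit's Settings.swift can simply be patched this way is an assumption, but the pattern itself looks like this:

```swift
import Foundation

// Availability-guard pattern for the "only available in tvOS 14.0" error.
var isRunningAsiOSAppOnMac: Bool {
    if #available(iOS 14.0, tvOS 14.0, macOS 11.0, *) {
        return ProcessInfo.processInfo.isiOSAppOnMac
    }
    return false
}
```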
1 vote
1 answer
61 views

I've recently become interested in generating audio again, but I'm having a bit of trouble. I've been following this tutorial and converting it into Swift: https://gist.github.com/gcatlin/...
Ash • 9,443
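A higher-level route than porting the gist's C render callback is AVAudioSourceNode, which takes a Swift render block. A minimal sine-wave sketch:

```swift
import AVFoundation

// Sketch: a 440 Hz sine generator driven by an AVAudioSourceNode render block.
let engine = AVAudioEngine()
let sampleRate = engine.outputNode.inputFormat(forBus: 0).sampleRate
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)

var phase = 0.0
let frequency = 440.0
let amplitude: Float = 0.25

let source = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    let phaseIncrement = 2.0 * Double.pi * frequency / sampleRate
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase)) * amplitude
        phase += phaseIncrement
        if phase >= 2.0 * Double.pi { phase -= 2.0 * Double.pi }
        for buffer in buffers {
            buffer.mData!.assumingMemoryBound(to: Float.self)[frame] = sample
        }
    }
    return noErr
}

engine.attach(source)
engine.connect(source, to: engine.mainMixerNode, format: monoFormat)
do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}
```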
0 votes
0 answers
47 views

I've been working on a C# application that interfaces with the Windows Audio APIs to check the processes used by the primary render device. However, I keep encountering an issue where the call to ...
Carsten • 2,341
