If the question "iOS – AVAudioEngine: multiple AVAudioInputNodes do not play in perfect sync" interests you, this article is for you. We walk through that question in detail, and also cover sample source code for android.media.audiofx.AudioEffect.Descriptor, sample source code for android.media.audiofx.AudioEffect, coordinating/synchronizing input/output timestamps with AVAudioEngine on macOS/iOS, and an AVAudioEngine downsampling problem.
Contents:

- iOS – AVAudioEngine: multiple AVAudioInputNodes do not play in perfect sync
- Sample source code for android.media.audiofx.AudioEffect.Descriptor
- Sample source code for android.media.audiofx.AudioEffect
- AVAudioEngine: coordinating/synchronizing input/output timestamps on macOS/iOS
- An AVAudioEngine downsampling problem
iOS – AVAudioEngine: multiple AVAudioInputNodes do not play in perfect sync
The engine graph looks like this:

```
//AVAudioPlayerNode1 -->
//AVAudioPlayerNode2 -->
//AVAudioPlayerNode3 --> AVAudioMixerNode --> AVAudioUnitVarispeed ---> AVAudioOutputNode
//AVAudioPlayerNode4 -->        |
//AVAudioPlayerNode5 -->     AudioTap
//                              |
//                      AVAudioPCMBuffers
```
I am loading the samples and scheduling them at the same time with the following code:
```objc
- (void)scheduleInitialAudioBlock:(SBScheduledAudioBlock *)block {
    for (int i = 0; i < 5; i++) {
        // fetches the right audio file path to be played
        NSString *path = [self assetPathForChannel:i trackItem:block.trackItem];
        AVAudioPCMBuffer *buffer = [self bufferFromFile:path];
        [block.buffers addObject:buffer];
    }

    AVAudioTime *time = [[AVAudioTime alloc] initWithSampleTime:0 atRate:1.0];
    for (int i = 0; i < 5; i++) {
        [inputNodes[i] scheduleBuffer:block.buffers[i]
                               atTime:time
                              options:AVAudioPlayerNodeBufferInterrupts
                    completionHandler:nil];
    }
}

- (AVAudioPCMBuffer *)bufferFromFile:(NSString *)filePath {
    NSURL *fileURL = [NSURL fileURLWithPath:filePath];
    NSError *error;

    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:fileURL
                                                    commonFormat:AVAudioPCMFormatFloat32
                                                     interleaved:NO
                                                           error:&error];
    if (error) {
        return nil;
    }

    AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat
                                                             frameCapacity:(AVAudioFrameCount)audioFile.length];
    [audioFile readIntoBuffer:buffer frameCount:(AVAudioFrameCount)audioFile.length error:&error];
    if (error) {
        return nil;
    }
    return buffer;
}
```
I only see the problem on device (I am testing on an iPhone 5s), and I cannot figure out why the audio files play out of sync. Any help would be greatly appreciated.
**Answer**
We eventually sorted the problem out with the following code:
```objc
AVAudioTime *startTime = nil;

for (AVAudioPlayerNode *node in inputNodes) {
    if (startTime == nil) {
        const float kStartDelayTime = 0.1; // sec
        AVAudioFormat *outputFormat = [node outputFormatForBus:0];
        AVAudioFramePosition startSampleTime = node.lastRenderTime.sampleTime
                                             + kStartDelayTime * outputFormat.sampleRate;
        startTime = [AVAudioTime timeWithSampleTime:startSampleTime
                                             atRate:outputFormat.sampleRate];
    }
    [node playAtTime:startTime];
}
```
This gives each player node enough time to load its buffers and fixed all of our audio sync problems. (At a 44.1 kHz output format, for example, the 0.1 s delay simply adds 4410 frames to the start sample time.) Hope this helps others!
Well, the problem is that you retrieve player.lastRenderTime on every pass of the for-loop, right before calling playAtTime:

So you actually get a different now-time for every player!

The way you are doing it, you might as well start all players straight in the loop with play: or playAtTime:nil, and you would get the same result: losing sync...

For the same reason, your players go out of sync by different amounts on different devices, depending on the speed of the machine ;-) Your now-times are random magic numbers, so don't assume they will always work just because they happen to work in your scenario. Even the smallest delay caused by a busy run loop or CPU will throw you out of sync again...
Solution:
What you really have to do is take one discrete snapshot of now = player.lastRenderTime before the loop, and use this very same anchor to get a batch-synchronized start for all of your players.

This way you don't even need to delay your players' start. Admittedly, the system will clip some leading frames (the same number for every player, of course ;-) to compensate for the difference between your captured now (which is actually already past) and the actual playTime (which still lies slightly ahead in the near future), but eventually all your players start exactly in sync, as if you had really started them at your captured now. These clipped frames are almost never noticeable, and you keep a fully responsive start...

If you happen to need those frames, because of clicks or artifacts at file/segment/buffer start, then shift your now into the future by starting the players with a delay. Of course you then get a small lag after pressing the start button, although still in perfect sync...
Conclusion:
The point here is to have one single now-reference for all players, and to call the playAtTime:now methods as soon as possible after capturing that reference. The bigger the gap, the bigger the portion of clipped leading frames, unless you add a reasonable start delay to your now-time, which of course costs responsiveness: a delayed start after pressing your start button.

And always be aware of the fact that whatever latency the audio buffering mechanisms may produce, on whatever device, it does not affect the synchronicity of any number of players if the start is done in the proper way described above! It doesn't delay your audio either; merely the window that actually lets you hear the audio opens at a later point in time...
Please note:

- If you choose the un-delayed (super-responsive) start option and, for whatever reason, a big delay occurs between capturing now and the actual start of your players, you will clip off a large leading portion (up to about 300 ms / 0.3 s) of your audio. That means that when you start your player it starts at once, but it does not resume from the position where you last paused it; instead it resumes (up to ~300 ms) later in the audio. So the acoustic perception is that pause/play cuts off a portion of your audio on the fly, even though everything is perfectly in sync.
- Because the start delay you provide in the playAtTime:now + myProvidedDelay method call is a fixed constant value (it does not get dynamically adjusted to accommodate buffering latency or other varying parameters under heavy system load), even the delayed option with a provided delay time smaller than about 300 ms may lead to clipped leading audio samples, if the device-dependent preparation time exceeds your provided delay time.
- The maximum amount of clipping is (by design) no larger than these ~300 ms. To get a proof, just force controlled (sample-accurate) clipping of leading frames by, e.g., adding a negative delay time to now, and you will perceive a growing clipped audio portion as you increase this negative value. Every negative value bigger than ~300 ms gets trimmed to ~300 ms. So a provided negative delay of 30 seconds leads to the same behavior as negative values of 10, 6, 3 or 1 seconds, and of course also negative 0.8 s and 0.5 s down to ~0.3 s. This works well for demonstration purposes, but negative delay values should not be used in production code.
Note:

The most important thing in a multi-player setup is to keep your player.pause calls in sync as well. As of June 2016 there is still no synchronized exit strategy in AVAudioPlayerNode.

Even a little method lookup, or logging something to the console between two player.pause calls, may force the latter to execute one or even more frames/samples later than the former. So your players would not actually stop at the same relative position in time. And above all: different devices yield different behavior...

If you then restart them in the above (synced) manner, these out-of-sync current player positions from your last pause will, on every playAtTime:, get force-synced to your new now-position, which essentially means you propagate the lost samples/frames into the future on every restart of your players. This of course adds up with every new start/pause cycle and widens the gap. Do this fifty or a hundred times and you get a nice delay effect without using an effect audio unit ;-)

Because we have no (system-provided) control over this factor, the only remedy is to put all calls to player.pause in tight sequence, one right after the other, with nothing in between; see the sketch right after this paragraph. Do not put them in a for-loop or anything similar: that would be a guarantee for ending up out of sync at the next pause/start of your players...

Whether keeping these calls together is a 100% perfect solution, or whether the run loop under heavy CPU load might by chance interfere, force a separation of the pause calls and cause frame drops, I don't know. At least in weeks of messing with the AVAudioNode API I could by no means force my multi-player setup out of sync; still, I don't feel quite comfortable or safe with this unsynced, random-magic-number pause solution...
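A minimal sketch of that pause pattern in Swift (playerA through playerD are assumed AVAudioPlayerNode instances, not names from the original post):

```swift
// Pause all players back to back: no logging, no lookups, no loop,
// nothing between the calls that could delay one player behind another.
playerA.pause()
playerB.pause()
playerC.pause()
playerD.pause()
```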
Code example and alternatives:
If your engine is already running, you have the @property lastRenderTime available in AVAudioNode, your player's superclass. This is your ticket to 100% sample-frame-accurate sync...
```objc
AVAudioFormat *outputFormat = [playerA outputFormatForBus:0];

const float kStartDelayTime = 0.0; // seconds - in case you wanna delay the start

AVAudioFramePosition now = playerA.lastRenderTime.sampleTime;

AVAudioTime *startTime = [AVAudioTime timeWithSampleTime:(now + (kStartDelayTime * outputFormat.sampleRate))
                                                  atRate:outputFormat.sampleRate];

[playerA playAtTime:startTime];
[playerB playAtTime:startTime];
[playerC playAtTime:startTime];
[playerD playAtTime:startTime];
[player...
```
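For reference, a Swift rendering of the same anchor pattern (a sketch: the engine is assumed to be running, and `players` is an assumed array of AVAudioPlayerNode; the key point is that lastRenderTime is sampled exactly once):

```swift
import AVFoundation

// Take one discrete snapshot of "now" before starting anyone.
// lastRenderTime is only non-nil while the engine is running.
let outputFormat = players[0].outputFormat(forBus: 0)
let startDelay = 0.0 // seconds - in case you want to delay the start
let now = players[0].lastRenderTime!.sampleTime

let startTime = AVAudioTime(
    sampleTime: now + AVAudioFramePosition(startDelay * outputFormat.sampleRate),
    atRate: outputFormat.sampleRate)

// Every player gets the very same anchor.
for player in players {
    player.play(at: startTime)
}
```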
By the way, you can achieve the same 100% sample-frame-accurate result with the AVAudioPlayer/AVAudioRecorder classes...
```objc
NSTimeInterval startDelayTime = 0.0; // seconds - in case you wanna delay the start

NSTimeInterval now = playerA.deviceCurrentTime;
NSTimeInterval startTime = now + startDelayTime;

[playerA playAtTime:startTime];
[playerB playAtTime:startTime];
[playerC playAtTime:startTime];
[playerD playAtTime:startTime];
[player...
```
Without a startDelayTime, the first 100–200 ms of all players would get clipped off, because the start command actually takes its time getting through the run loop, even though the players have already started (well, been scheduled) 100% in sync. With startDelayTime = 0.25, though, you are good to go. And never forget to prepareToPlay your players in advance, so that at start time no additional buffering or setup has to be done; just starting them ;-)
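A Swift sketch of that AVAudioPlayer variant (playerA through playerD are assumed AVAudioPlayer instances), preparing ahead of time and snapshotting deviceCurrentTime once:

```swift
import AVFoundation

// Prepare well in advance so play(atTime:) only has to start, not buffer.
let players = [playerA, playerB, playerC, playerD]
players.forEach { $0.prepareToPlay() }

// One snapshot of the shared device clock, plus headroom for the run loop.
let startDelay: TimeInterval = 0.25
let startTime = playerA.deviceCurrentTime + startDelay

players.forEach { $0.play(atTime: startTime) }
```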
Sample source code for android.media.audiofx.AudioEffect.Descriptor
```java
@TargetApi(18)
private static boolean isAcousticEchoCancelerExcludedByUUID() {
    for (Descriptor d : getAvailableEffects()) {
        if (d.type.equals(AudioEffect.EFFECT_TYPE_AEC)
                && d.uuid.equals(AOSP_ACOUSTIC_ECHO_CANCELER)) {
            return true;
        }
    }
    return false;
}
```

```java
@TargetApi(18)
private static boolean isNoiseSuppressorExcludedByUUID() {
    for (Descriptor d : getAvailableEffects()) {
        if (d.type.equals(AudioEffect.EFFECT_TYPE_NS)
                && d.uuid.equals(AOSP_NOISE_SUPPRESSOR)) {
            return true;
        }
    }
    return false;
}
```

```java
private static Descriptor[] getAvailableEffects() {
    if (cachedEffects != null) {
        return cachedEffects;
    }
    // The caching is best effort only - if this method is called from several
    // threads in parallel, they may end up doing the underlying OS call
    // multiple times. It's normally only called on one thread so there's no
    // real need to optimize for the multiple threads case.
    cachedEffects = AudioEffect.queryEffects();
    return cachedEffects;
}
```

```java
private static boolean isEffectTypeAvailable(UUID effectType) {
    Descriptor[] effects = getAvailableEffects();
    if (effects == null) {
        return false;
    }
    for (Descriptor d : effects) {
        if (d.type.equals(effectType)) {
            return true;
        }
    }
    return false;
}
```
Sample source code for android.media.audiofx.AudioEffect
```java
@Override
public void onDestroy() {
    if (D) Logger.d(TAG, "Destroying service");
    super.onDestroy();

    final Intent audioEffectsIntent = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
    audioEffectsIntent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
    audioEffectsIntent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
    sendBroadcast(audioEffectsIntent);

    mAlarmManager.cancel(mShutdownIntent);
    mPlayerHandler.removeCallbacksAndMessages(null);
    mPlayer.release();
    mPlayer = null;
    mAudioManager.abandonAudioFocus(mAudioFocusListener);
    mMediaSession.release();
    mPlayerHandler.removeCallbacksAndMessages(null);
    unregisterReceiver(mIntentReceiver);
    mWakeLock.release();
}
```

```java
public void pause() {
    if (D) Log.d(TAG, "Pausing playback");
    synchronized (this) {
        mPlayerHandler.removeMessages(FADEUP);
        if (mIsSupposedToBePlaying) {
            final Intent intent = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
            intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
            intent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
            sendBroadcast(intent);

            mPlayer.pause();
            notifyChange(META_CHANGED);
            setIsSupposedToBePlaying(false, true);
        }
    }
}
```
```java
/**
 * Temporarily pauses playback.
 */
public void pause() {
    if (mPlayerHandler == null) return;
    LogUtils.d(TAG, "Pausing playback");
    synchronized (this) {
        if (mPlayerHandler != null) {
            mPlayerHandler.removeMessages(MusicServiceConstants.FADEUP);
        }
        if (mIsSupposedToBePlaying) {
            final Intent intent = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
            intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
            intent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
            sendBroadcast(intent);

            if (mPlayer != null) {
                mPlayer.pause();
            }
            setIsSupposedToBePlaying(false, true);
            stopShakeDetector(false);
        }
    }
}
```
```java
public static void openEqualizer(@NonNull final Activity activity) {
    final int sessionId = MusicPlayerRemote.getAudioSessionId();
    if (sessionId == AudioEffect.ERROR_BAD_VALUE) {
        Toast.makeText(activity, activity.getResources().getString(R.string.no_audio_ID),
                Toast.LENGTH_LONG).show();
    } else {
        try {
            final Intent effects = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
            effects.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, sessionId);
            effects.putExtra(AudioEffect.EXTRA_CONTENT_TYPE, AudioEffect.CONTENT_TYPE_MUSIC);
            activity.startActivityForResult(effects, 0);
        } catch (@NonNull final ActivityNotFoundException notFound) {
            Toast.makeText(activity, activity.getResources().getString(R.string.no_equalizer),
                    Toast.LENGTH_SHORT).show();
        }
    }
}
```

```java
private void startEffectsPanel() {
    try {
        final Intent effects = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
        effects.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getContext().getPackageName());
        effects.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, MusicUtils.getAudioSessionId());
        effects.putExtra(AudioEffect.EXTRA_CONTENT_TYPE, AudioEffect.CONTENT_TYPE_MUSIC);
        startActivityForResult(effects, REQUEST_EQ);
    } catch (final ActivityNotFoundException ignored) {
        Toast.makeText(getActivity(), "No system equalizer found", Toast.LENGTH_SHORT).show();
    }
}
```

```java
/**
 * @param path The path of the file, or the http/rtsp URL of the stream
 *             you want to play
 * return True if the <code>player</code> has been prepared and is
 *        ready to play, false otherwise
 */
void setDataSource(final String path) {
    Logger.d(TAG, "setDataSourceImpl, path: " + path);
    try {
        mCurrentMediaPlayer.reset();
        mCurrentMediaPlayer.setOnPreparedListener(this);
        mCurrentMediaPlayer.setDataSource(path);
        mCurrentMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mCurrentMediaPlayer.prepareAsync();
        preparing = true;
        mService.get().mIsSupposedToBePlaying = false;
    } catch (final IOException e) {
        e.printStackTrace();
    }
    mCurrentMediaPlayer.setOnCompletionListener(this);
    mCurrentMediaPlayer.setOnErrorListener(this);
    mCurrentMediaPlayer.setOnBufferingUpdateListener(this);

    final Intent intent = new Intent(AudioEffect.ACTION_OPEN_AUDIO_EFFECT_CONTROL_SESSION);
    intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
    intent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, mService.get().getPackageName());
    mService.get().sendBroadcast(intent);

    //return true;
    resetBufferPercent();
    mService.get().notifyChange(PLAYSTATE_CHANGED);
}
```
```java
/**
 * @param player The {@link MediaPlayer} to use
 * @param path   The path of the file, or the http/rtsp URL of the stream
 *               you want to play
 * @return True if the <code>player</code> has been prepared and is
 *         ready to play, false otherwise
 */
private boolean setDataSourceImpl(@NonNull final MediaPlayer player, @NonNull final String path) {
    if (context == null) {
        return false;
    }
    try {
        player.reset();
        player.setOnPreparedListener(null);
        if (path.startsWith("content://")) {
            player.setDataSource(context, Uri.parse(path));
        } else {
            player.setDataSource(path);
        }
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.prepare();
    } catch (Exception e) {
        return false;
    }
    player.setOnCompletionListener(this);
    player.setOnErrorListener(this);

    final Intent intent = new Intent(AudioEffect.ACTION_OPEN_AUDIO_EFFECT_CONTROL_SESSION);
    intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, player.getAudioSessionId());
    intent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, context.getPackageName());
    intent.putExtra(AudioEffect.EXTRA_CONTENT_TYPE, AudioEffect.CONTENT_TYPE_MUSIC);
    context.sendBroadcast(intent);
    return true;
}
```
```java
@TargetApi(18)
private boolean effectTypeIsVoIP(UUID type) {
    if (!WebRtcAudioUtils.runningOnJellyBeanMR2OrHigher()) return false;

    return (AudioEffect.EFFECT_TYPE_AEC.equals(type) && isAcousticEchoCancelerSupported())
            || (AudioEffect.EFFECT_TYPE_NS.equals(type) && isNoiseSuppressorSupported());
}
```
```java
@Override
public AudioEffect create(int audioSession) {
    try {
        return new LoudnessEnhancer(audioSession);
    } catch (RuntimeException e) {
        // NOTE: some devices don't support the LoudnessEnhancer class and may throw an
        // exception (the ME176C throws IllegalArgumentException)
        Log.w(TAG, "Failed to instantiate loudness enhancer class", e);
    }
    return null;
}
```
```java
@Override
public void onDestroy() {
    super.onDestroy();
    Extras.getInstance().setWidgetPosition(100);
    audioWidget.cleanUp();
    audioWidget = null;
    Equalizers.EndEq();
    BassBoosts.EndBass();
    Virtualizers.EndVirtual();
    Loud.EndLoudnessEnhancer();
    Reverb.EndReverb();
    receiverCleanup();
    Extras.getInstance().eqSwitch(false);
    audioManager.abandonAudioFocus(this);
    removeProgress();
    fastplay = false;
    isPlaying = false;
    paused = false;
    stopMediaplayer();
    if (!Extras.getInstance().hideLockscreen()) {
        if (mediaSessionLockscreen != null) {
            mediaSessionLockscreen.release();
            mediaSessionLockscreen = null;
        }
    }
    Intent i = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
    if (Helper.isActivityPresent(this, i)) {
        i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, audioSession());
        i.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, this.getPackageName());
        sendBroadcast(i);
    } else {
        Log.d(TAG, "no activity found");
    }
    if (!Extras.getInstance().hideNotify()) {
        removeNotification();
    }
}
```

```java
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    int id = item.getItemId();
    switch (id) {
        case android.R.id.home:
            FragmentManager fm = getSupportFragmentManager();
            if (fm.getBackStackEntryCount() > 0) {
                fm.popBackStack();
            } else {
                fragmentLoader(setContainerId(), setFragment());
            }
            return true;
        case R.id.system_eq:
            Intent intent = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
            if (intent.getAction() != null && Helper.isActivityPresent(MainActivity.this, intent)) {
                intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, musicXService.audioSession());
                startActivityForResult(intent, EQ);
            } else {
                Toast.makeText(this, "No app found to handle equalizer", Toast.LENGTH_SHORT).show();
            }
            break;
        case R.id.play_save_queue:
            multiQueuePlay();
            break;
    }
    return super.onOptionsItemSelected(item);
}
```

```java
@Override
public void onDestroy() {
    // Check that we're not being destroyed while something is still playing.
    if (chatHead != null) windowManager.removeView(chatHead);
    if (isPlaying()) {
        Log.e(LOGTAG, "Service being destroyed while still playing.");
    }
    if (mShakeAction != 0) mSensorManager.unregisterListener(this);

    // release all MediaPlayer resources, including the native player and wakelocks
    Intent i = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
    i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
    i.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
    sendBroadcast(i);

    pause();
    mPlayer.release();
    mPlayer = null;
    mAudioManager.abandonAudioFocus(mAudioFocusListener);
    setRemoteControlClient();

    // make sure there aren't any other messages coming
    mDelayedStopHandler.removeCallbacksAndMessages(null);
    mCurrentMediaPlayerHandler.removeCallbacksAndMessages(null);
    unregisterReceiver(mIntentReceiver);
    dismissAllNotifications();
    super.onDestroy();
}
```
```java
public void setDataSource(String path) {
    try {
        mOnlineDuration = 0;
        mCurrentMediaPlayer.reset();
        if (path.startsWith("content://")) {
            mCurrentMediaPlayer.setDataSource(MediaPlaybackService.this, Uri.parse(path));
        } else {
            mCurrentMediaPlayer.setDataSource(path);
        }
        mCurrentMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mCurrentMediaPlayer.prepareAsync();
        mCurrentMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mOnlineDuration = mCurrentMediaPlayer.getDuration();
                mp.setOnCompletionListener(listener);
                mp.setOnErrorListener(errorListener);
                setOnInfoListener();

                Intent i = new Intent(AudioEffect.ACTION_OPEN_AUDIO_EFFECT_CONTROL_SESSION);
                i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
                i.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
                sendBroadcast(i);

                mIsInitialized = true;
                notifyChange(META_CHANGED);
                notifyChange(END_BUFFERING);
            }
        });
    } catch (IOException ex) {
        mIsInitialized = false;
        return;
    }
}
```
```java
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    super.onCreateOptionsMenu(menu);
    MenuInflater inflater = getMenuInflater();
    inflater.inflate(R.menu.starred, menu);
    MenuInflater inflater2 = getMenuInflater();
    inflater2.inflate(R.menu.share, menu);

    Intent i = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
    if (getPackageManager().resolveActivity(i, 0) != null) {
        MenuInflater inflater1 = getMenuInflater();
        inflater1.inflate(R.menu.equalizer, menu);
    }
    MenuInflater inflater3 = getMenuInflater();
    inflater3.inflate(R.menu.settings, menu);

    // Menu.add requires (group, id, order, title); the order arguments were
    // lost in extraction and are restored as 0 here.
    if (!MusicUtils.getBooleanPref(this, "radiomode", false)) {
        SubMenu menu1 = menu.addSubMenu(0, ADD_TO_PLAYLIST, 0, R.string.add_to_playlist);
        if (!MusicUtils.getBooleanPref(this, "radiomode", false)) {
            MusicUtils.makePlaylistMenuOnline(this, menu1);
        }
        menu.add(1, SEARCH_LYRICS, 0, R.string.search_lyrics_menu_short);
    }
    menu.add(1, ABOUT + 2, 0, R.string.carmode_menu_short);
    if (mSettings.getBoolean(PreferencesActivity.POPUP_ON, false))
        menu.add(1, ABOUT + 1, 0, R.string.go_popup);
    menu.add(1, ABOUT, 0, R.string.about_menu_short);
    menu.add(1, EXIT, 0, R.string.exit_menu);
    return true;
}
```
```java
@Override
public void onDestroy() {
    if (D) Log.d(TAG, "Destroying service");
    super.onDestroy();

    // Remove any sound effects
    final Intent audioEffectsIntent = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
    audioEffectsIntent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
    audioEffectsIntent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
    sendBroadcast(audioEffectsIntent);

    mAlarmManager.cancel(mShutdownIntent);
    mPlayerHandler.removeCallbacksAndMessages(null);
    if (TimberUtils.isJellyBeanMR2())
        mHandlerThread.quitSafely();
    else
        mHandlerThread.quit();

    mPlayer.release();
    mPlayer = null;
    mAudioManager.abandonAudioFocus(mAudioFocusListener);
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP)
        mSession.release();

    getContentResolver().unregisterContentObserver(mMediaStoreObserver);
    closeCursor();
    unregisterReceiver(mIntentReceiver);
    if (mUnmountReceiver != null) {
        unregisterReceiver(mUnmountReceiver);
        mUnmountReceiver = null;
    }
    mWakeLock.release();
}
```
```java
@Override
public void onDestroy() {
    super.onDestroy();
    instance = null;

    if (currentPlaying != null) currentPlaying.setPlaying(false);
    lifecycleSupport.onDestroy();

    try {
        Intent i = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
        i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, audioSessionId);
        i.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
        sendBroadcast(i);
    } catch (Throwable e) {
        // Froyo or lower
    }

    mediaPlayer.release();
    if (nextMediaPlayer != null) {
        nextMediaPlayer.release();
    }
    mediaPlayerLooper.quit();
    shufflePlayBuffer.shutdown();
    effectsController.release();

    if (bufferTask != null) {
        bufferTask.cancel();
        bufferTask = null;
    }
    if (nextPlayingTask != null) {
        nextPlayingTask.cancel();
        nextPlayingTask = null;
    }
    if (proxy != null) {
        proxy.stop();
        proxy = null;
    }
    Notifications.hidePlayingNotification(this, this, handler);
    Notifications.hideDownloadingNotification(this, handler);
}
```

```java
/**
 * Respond to clicks on actionbar options
 */
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    switch (item.getItemId()) {
        case R.id.action_search:
            onSearchRequested();
            break;
        case R.id.action_settings:
            startActivityForResult(new Intent(this, SettingsHolder.class), 0);
            break;
        case R.id.action_eqalizer:
            final Intent intent = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
            if (getPackageManager().resolveActivity(intent, 0) == null) {
                startActivity(new Intent(this, SimpleEq.class));
            } else {
                intent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, MusicUtils.getCurrentAudioId());
                startActivity(intent);
            }
            break;
        case R.id.action_shuffle_all:
            shuffleAll();
            break;
        default:
            return super.onOptionsItemSelected(item);
    }
    return true;
}
```

```java
@Override
public void onDestroy() {
    // Check that we're not being destroyed while something is still playing.
    if (mIsSupposedToBePlaying) {
        Log.e(LOGTAG, "Service being destroyed while still playing.");
    }

    // release all MediaPlayer resources, including the native player and wakelocks
    Intent i = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
    i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, getAudioSessionId());
    i.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
    sendBroadcast(i);

    mPlayer.release();
    mPlayer = null;
    mAudioManager.abandonAudioFocus(mAudioFocusListener);
    mAudioManager.unregisterRemoteControlClient(mRemoteControlClient);

    // make sure there aren't any other messages coming
    mDelayedStopHandler.removeCallbacksAndMessages(null);
    mMediaplayerHandler.removeCallbacksAndMessages(null);

    if (mCursor != null) {
        mCursor.close();
        mCursor = null;
    }
    updateAlbumBitmap();
    unregisterReceiver(mIntentReceiver);
    if (mUnmountReceiver != null) {
        unregisterReceiver(mUnmountReceiver);
        mUnmountReceiver = null;
    }
    mWakeLock.release();
    super.onDestroy();
}
```

```java
/**
 * Stops media playback
 */
public synchronized void stop() {
    // Check if a player exists otherwise there is nothing to do.
    if (mCurrentMediaPlayer != null) {
        // Check if the player for the next song exists already
        if (mNextMediaPlayer != null) {
            // Remove the next player from the currently playing one.
            mCurrentMediaPlayer.setNextMediaPlayer(null);
            // Release the MediaPlayer, not usable after this command
            mNextMediaPlayer.release();
            // Reset variables to clean internal state
            mNextMediaPlayer = null;
            mSecondPrepared = false;
            mSecondPreparing = false;
        }

        // Check if the currently active player is ready
        if (mCurrentPrepared) {
            /*
             * Signal android desire to close audio effect session
             */
            Intent audioEffectIntent = new Intent(AudioEffect.ACTION_CLOSE_AUDIO_EFFECT_CONTROL_SESSION);
            audioEffectIntent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, mCurrentMediaPlayer.getAudioSessionId());
            audioEffectIntent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, mPlaybackService.getPackageName());
            mPlaybackService.sendBroadcast(audioEffectIntent);
            Log.v(TAG, "Closing effect for session: " + mCurrentMediaPlayer.getAudioSessionId());
        }

        // Release the current player
        mCurrentMediaPlayer.release();
        // Reset variables to clean internal state
        mCurrentMediaPlayer = null;
        mCurrentPrepared = false;
    }
}
```

```java
private void dumpAudioEffectsState() {
    AudioEffect.Descriptor[] effects = AudioEffect.queryEffects();
    Log.v(TAG, "Found audio effects: " + effects.length);
    for (AudioEffect.Descriptor effect : effects) {
        Log.v(TAG, "AudioEffect: " + effect.name
                + " connect mode: " + effect.connectMode
                + " implementor: " + effect.implementor);
    }
}
```
```java
/** @see android.media.audiofx.AudioEffect.Descriptor#uuid */
protected boolean hasFeature(UUID uuid) {
    // Build an array of UUIDs to search through.
    if (null == uuids || uuids.length == 0) {
        uuids = new UUID[descriptors.length];
        int i = -1;
        for (AudioEffect.Descriptor descriptor : descriptors)
            uuids[++i] = descriptor.uuid;
    }
    return (Arrays.binarySearch(uuids, uuid) >= 0);
}
```

```java
/** @see android.media.audiofx.AudioEffect.Descriptor#name */
protected String[] getDescriptorNames() {
    // Build an array of descriptor names.
    String[] names = new String[descriptors.length];
    int i = -1;
    for (AudioEffect.Descriptor descriptor : descriptors)
        names[++i] = descriptor.name;
    return names;
}
```
```java
private void release(AudioEffect effect) {
    if (null != effect) {
        effect.setControlStatusListener(null);
        effect.setEnableStatusListener(null);
        if (effect instanceof Equalizer) {
            ((Equalizer) effect).setParameterListener(null);
        } else if (effect instanceof BassBoost) {
            ((BassBoost) effect).setParameterListener(null);
        } else if (effect instanceof Virtualizer) {
            ((Virtualizer) effect).setParameterListener(null);
        }
        effect.release();
    }
}
```

```java
public static Intent getSystemEqIntent(Context c) {
    Intent systemEq = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
    systemEq.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, c.getPackageName());

    ActivityInfo info = systemEq.resolveActivityInfo(c.getPackageManager(), 0);
    if (info != null && !info.name.startsWith("com.android.musicfx")) {
        return systemEq;
    } else {
        return null;
    }
}
```

```java
/**
 * Checks whether the current device is capable of instantiating and using an
 * {@link android.media.audiofx.Equalizer}
 * @return True if an Equalizer may be used at runtime
 */
public static boolean hasEqualizer() {
    for (AudioEffect.Descriptor effect : AudioEffect.queryEffects()) {
        if (EQUALIZER_UUID.equals(effect.type)) {
            return true;
        }
    }
    return false;
}
```

```java
private void sendPlayingStatusChangedBroadcast(boolean isPlaying) {
    if (isPlaying) {
        final Intent audioEffectIntent = new Intent(AudioEffect.ACTION_OPEN_AUDIO_EFFECT_CONTROL_SESSION);
        audioEffectIntent.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, player.mediaPlayer.getAudioSessionId());
        audioEffectIntent.putExtra(AudioEffect.EXTRA_PACKAGE_NAME, getPackageName());
        sendBroadcast(audioEffectIntent);
    }
    buildNotification();

    Intent statusChangedIntent = new Intent();
    statusChangedIntent.setAction(ACTION_PLAYING_STATUS_CHANGED);
    statusChangedIntent.putExtra(RadioPlaybackService.KEY_IS_PLAYING, isPlaying);
    sendBroadcast(statusChangedIntent);
}
```
```java
/**
 * Respond to clicks on actionbar options
 */
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    switch (item.getItemId()) {
        case R.id.action_search:
            onSearchRequested();
            break;
        case R.id.action_settings:
            startActivityForResult(new Intent(this, SettingsHolder.class), 0);
            break;
        case R.id.action_eqalizer:
            Intent i = new Intent(AudioEffect.ACTION_DISPLAY_AUDIO_EFFECT_CONTROL_PANEL);
            i.putExtra(AudioEffect.EXTRA_AUDIO_SESSION, MusicUtils.getCurrentAudioId());
            startActivityForResult(i, 0);
            break;
        case R.id.action_shuffle_all:
            shuffleAll();
            break;
        default:
            return super.onOptionsItemSelected(item);
    }
    return true;
}
```
AVAudioEngine: coordinating/synchronizing input/output timestamps on macOS/iOS
I may not be able to answer your question, but I believe a property not mentioned in your question does report additional latency information.

I've only worked at the HAL/AUHAL layers (never with AVAudioEngine), but in discussions about computing overall latencies, a couple of audio device/stream properties come up: kAudioDevicePropertyLatency and kAudioStreamPropertyLatency.

Poking around a bit, I see those properties mentioned in the documentation for AVAudioIONode's presentationLatency property (https://developer.apple.com/documentation/avfoundation/avaudioionode/1385631-presentationlatency). I expect that the hardware latency reported by the driver will show up there. (I suspect that the standard latency property reports the latency for an input sample to appear in the output of a "normal" node, and that the IO case is special.)

It's not in the context of AVAudioEngine, but here is one message from the CoreAudio mailing list that talks about using the low-level properties and may provide some additional background: https://lists.apple.com/archives/coreaudio-api/2017/Jul/msg00035.html
An AVAudioEngine downsampling problem
I have a problem downsampling audio taken from the microphone. I am sampling from the microphone with AVAudioEngine, using the following code:
```swift
assert(self.engine.inputNode != nil)
let input = self.engine.inputNode!
let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: 8000, channels: 1, interleaved: false)
let mixer = AVAudioMixerNode()
engine.attach(mixer)
engine.connect(input, to: mixer, format: input.inputFormat(forBus: 0))

do {
    try engine.start()
    mixer.installTap(onBus: 0, bufferSize: 1024, format: audioFormat,
                     block: { (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        // some code here
    })
} catch let error {
    print(error.localizedDescription)
}
```
This code works very well on an iPhone 5s, because the microphone input there is 8000 Hz and the buffer gets filled with the microphone's data.
The problem is that I want to be able to record from an iPhone 6s (and newer), whose microphone records at 16000 Hz. Oddly, if I connect the mixer node to the engine's mainMixerNode (with the following code):
engine.connect(mixer, to: mainMixer, format: audioFormat)
it actually works: the buffers I receive are in the 8000 Hz format and the sound is downsampled perfectly. The only problem is that the sound also comes out of the speaker, which I don't want (and if I don't make the connection, the buffers are empty).
Does anyone know how to solve this?

Any help, input or ideas are greatly appreciated.
Answer 1

I solved this by simply changing the mixer's volume to 0:
mixer.volume = 0
This let me harness the power of the engine's main mixer to resample any sample rate to the one I need, without hearing the microphone feedback loop coming straight out of the speaker. Let me know if anyone needs clarification.

Here is my code now:
```swift
assert(self.engine.inputNode != nil)
let input = self.engine.inputNode!
let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: 8000, channels: 1, interleaved: false)
let mixer = AVAudioMixerNode()
engine.attach(mixer)
engine.connect(input, to: mixer, format: input.inputFormat(forBus: 0))
mixer.volume = 0
engine.connect(mixer, to: mainMixer, format: audioFormat)

do {
    try engine.start()
    mixer.installTap(onBus: 0, bufferSize: 1024, format: audioFormat,
                     block: { (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        // some code here
    })
} catch let error {
    print(error.localizedDescription)
}
```
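An alternative worth knowing about (not from the original answer; a sketch only, where `engine` is an assumed AVAudioEngine instance): tap the input in its native format and downsample each buffer yourself with AVAudioConverter, so nothing has to be routed toward the output at all:

```swift
import AVFoundation

let input = engine.inputNode
let inputFormat = input.inputFormat(forBus: 0) // e.g. 16000 Hz on newer devices
let targetFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                 sampleRate: 8000, channels: 1, interleaved: false)!
let converter = AVAudioConverter(from: inputFormat, to: targetFormat)!

input.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { buffer, _ in
    // Size the output buffer according to the sample-rate ratio.
    let ratio = targetFormat.sampleRate / inputFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let outBuffer = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                           frameCapacity: capacity) else { return }

    // Feed the tap buffer to the converter exactly once per callback.
    var consumed = false
    var error: NSError?
    converter.convert(to: outBuffer, error: &error) { _, outStatus in
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return buffer
    }
    // outBuffer now holds 8000 Hz mono float audio; process it here.
}
```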
That wraps up our look at "iOS – AVAudioEngine: multiple AVAudioInputNodes do not play in perfect sync". Thanks for your patience. For more on the android.media.audiofx.AudioEffect.Descriptor samples, the android.media.audiofx.AudioEffect samples, coordinating/synchronizing AVAudioEngine input/output timestamps on macOS/iOS, or the AVAudioEngine downsampling problem, please search this site.