Commit 06a5a6a

Browse files
committed
New translations
1 parent 1eec4b9 commit 06a5a6a

File tree

1 file changed: +3 −3 lines changed

en-US/dita/RTC-NG/API/api_imediaengine_pullaudioframe.dita

Lines changed: 3 additions & 3 deletions

@@ -25,13 +25,13 @@
 </section>
 <section id="detailed_desc">
 <title>Details</title>
-<p>Before calling this method, call the <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> method to notify the app to enable and set the external audio sink.</p>
+<p>Before calling this method, call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to notify the app to enable and set the external audio rendering.</p>
 <p>After a successful call of this method, the app pulls the decoded and mixed audio data for playback.</p>
 <note type="attention">
 <ul>
 <li>Call this method after joining a channel.</li>
-<li>Both this method and <xref keyref="onPlaybackAudioFrame"/> callback can be used to get audio data after remote mixing. It should be noted that after calling <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the app no longer receives data from the <apiname keyref="onPlaybackAudioFrame"/> callback. Therefore, you should choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business requirements. The specific distinctions between them are as follows:<ul>
-<li>After a successful method call, the app automatically pulls the audio data from the SDK. After setting the audio data parameters, the SDK adjusts the frame buffer and avoids problems caused by jitter in the external audio playback.</li>
+<li>Both this method and <xref keyref="onPlaybackAudioFrame"/> callback can be used to get audio data after remote mixing. Note that after calling <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the app no longer receives data from the <apiname keyref="onPlaybackAudioFrame"/> callback. Therefore, you should choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business requirements. The specific distinctions between them are as follows:<ul>
+<li>After calling this method, the app automatically pulls the audio data from the SDK. By setting the audio data parameters, the SDK adjusts the frame buffer to help the app handle latency, effectively avoiding audio playback jitter.</li>
 <li>The SDK sends the audio data to the app through the <apiname keyref="onPlaybackAudioFrame"/> callback. Any delay in processing the audio frames may result in audio jitter.</li>
 </ul></li>
 <li>This method is only used for retrieving audio data after remote mixing. If you need to get audio data from different audio processing stages such as capture and playback, you can register the corresponding callbacks by calling <xref keyref="registerAudioFrameObserver"/>.</li>
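The pull model described in the changed lines can be sketched as follows. This is an illustrative mock, not the Agora SDK API: `MockAudioSink`, `feed`, and `pull_audio_frame` are hypothetical names chosen to show how a pull-driven frame buffer lets the app control timing and pad short reads with silence, rather than reacting to a push-style `onPlaybackAudioFrame` callback.

```python
# Hypothetical sketch (not the real SDK): the app drives rendering by
# pulling fixed-size frames from a buffer that the SDK would keep filled.
from collections import deque

class MockAudioSink:
    """Stands in for the SDK-side frame buffer in pull mode."""

    def __init__(self, samples_per_frame: int):
        self.samples_per_frame = samples_per_frame
        self._buffer = deque()

    def feed(self, samples):
        # In the real pipeline, the SDK fills this with decoded, mixed audio.
        self._buffer.extend(samples)

    def pull_audio_frame(self):
        # Pull model: the app requests exactly one frame when it needs it.
        # A short buffer is padded with silence (0) so playback stays steady,
        # which is how a frame buffer absorbs jitter in external rendering.
        return [self._buffer.popleft() if self._buffer else 0
                for _ in range(self.samples_per_frame)]

sink = MockAudioSink(samples_per_frame=4)
sink.feed([1, 2, 3, 4, 5, 6])
print(sink.pull_audio_frame())  # -> [1, 2, 3, 4]
print(sink.pull_audio_frame())  # -> [5, 6, 0, 0]  (padded with silence)
```

By contrast, in the callback (push) model the SDK decides when audio arrives; if the app is slow to process a pushed frame, playback jitters, which is the trade-off the note above describes.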
