Hello,
We have a customer testing their Java application on the Apalis-IMX8 (i.MX8QM) HW paired with an Ixora baseboard.
Part of their application uses the SGTL5000 codec chip to play audio out of the headphone jack using the javax.sound.sampled APIs.
A Java error is encountered any time a SourceDataLine is opened, regardless of the wave-file format presented; the error indicates the hw_params aren’t being set correctly.
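For reference, the failing call path on the Java side is just the textbook open sequence; a minimal sketch is below (the format values are my own illustration of a typical 48 kHz, 16-bit stereo wave file, not the customer's exact code):

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class OpenLineSketch {
    public static void main(String[] args) throws Exception {
        // Assumed format for illustration: 48 kHz, 16-bit, 2-channel, signed, little-endian
        AudioFormat fmt = new AudioFormat(48000f, 16, 2, true, false);
        SourceDataLine line = AudioSystem.getSourceDataLine(fmt);
        // open() is where the Java sound stack (ALSA provider) negotiates hw_params;
        // this is the call that fails against the SGTL5000 card.
        line.open(fmt);
        line.start();
        // ... write PCM data with line.write(...), then drain and close ...
        line.drain();
        line.close();
    }
}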
After instrumenting the current Toradex branch of the Linux Kernel here:
https://git.toradex.com/cgit/linux-toradex.git/tree/sound/core/pcm_native.c?h=toradex_5.4-2.3.x-imx#n2116
I noticed that the sound constraints calculation ends up with a bad number of periods when the buffer size gets locked to 16384 frames (as it does with almost any 2-channel audio).
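For scale: 2-channel S16_LE is 4 bytes per frame, so if the PCM's preallocated DMA buffer is 64 KiB (an assumption on my part, but one that matches the ranges speaker-test reports below), 65536 / 4 = 16384 frames of maximum buffer, which at 48000 Hz is only 16384 / 48000 ≈ 341 ms. Mono S16_LE is 2 bytes per frame, which is why the 1-channel runs below report a 32768-frame maximum instead.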
Using the current Toradex Multimedia Image (2021-10) I can illustrate the difficulty using the standard ALSA utils.
Command that works:
aplay -v -Dsysdefault:CARD=apalisimx8qmsgt /usr/share/sounds/alsa/Front_Left.wav
Its setup is:
stream : PLAYBACK
access : RW_INTERLEAVED
format : S16_LE
subformat : STD
channels : 1
rate : 48000
exact rate : 48000 (48000/1)
msbits : 16
buffer_size : 24000
period_size : 6000
period_time : 125000
tstamp_mode : NONE
tstamp_type : MONOTONIC
period_step : 1
avail_min : 6000
period_event : 0
start_threshold : 24000
stop_threshold : 24000
silence_threshold: 0
silence_size : 0
boundary : 6755399441055744000
appl_ptr : 0
hw_ptr : 0
NOTE the period_time of 125000 us: that is 1/4 of the total buffer time of 500000 us. That maximum buffer time is aplay's default, per its man page (https://linux.die.net/man/1/aplay):
-B, --buffer-time=# Buffer duration is # microseconds. If no buffer time and no buffer size is given then the maximal allowed buffer time but not more than 500ms is set.
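In frames, that default works out exactly as the dump above shows: 500000 us at 48000 Hz is 24000 frames of buffer, and 24000 / 4 periods = 6000 frames per period, i.e. the period_time of 125000 us.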
Let’s try a standard speaker-test using the SGTL5000 codec:
speaker-test -Dsysdefault:CARD=apalisimx8qmsgt --test wav
speaker-test 1.2.1
Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 1 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 132 to 32768
Period size range from 66 to 16380
Using max buffer size 32768
Periods = 4
Unable to set nperiods 4 for playback: Invalid argument
Setting of hwparams failed: Invalid argument
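The numbers it picked don't look unreasonable on their face: 32768 frames / 4 periods = 8192 frames per period, comfortably inside the reported 66-16380 period-size range, yet setting nperiods to 4 is rejected anyway. That is what points me at the constraint refinement in the kernel rather than at speaker-test's own arithmetic.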
If I specify the 500000us buffer time that aplay uses as a default, then the command works:
speaker-test --buffer 500000 -Dsysdefault:CARD=apalisimx8qmsgt --test wav
speaker-test 1.2.1
Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 1 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 132 to 32768
Period size range from 66 to 16380
Requested period time 125000 us
Periods = 4
was set period_size = 6000
was set buffer_size = 24000
0 - Front Left
Time per period = 1.001557
0 - Front Left
Time per period = 1.499898
0 - Front Left
If I try to use 2 channels, the command fails again:
speaker-test --buffer 500000 -c2 -Dsysdefault:CARD=apalisimx8qmsgt --test wav
speaker-test 1.2.1
Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 2 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 72 to 16384
Period size range from 36 to 8190
Requested buffer time 500000 us
Unable to set buffer time 500000 us for playback: Invalid argument
Setting of hwparams failed: Invalid argument
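This particular failure is at least explainable by the arithmetic above: 500000 us at 48000 Hz is 24000 frames, which exceeds the 16384-frame maximum buffer size (about 341333 us), so only buffer times up to roughly 341 ms can fit.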
I can adjust the buffer down to 325000us and then speaker-test works:
speaker-test --buffer 325000 -c2 -Dsysdefault:CARD=apalisimx8qmsgt --test wav
speaker-test 1.2.1
Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 2 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 72 to 16384
Period size range from 36 to 8190
Requested buffer time 325000 us
Periods = 4
was set period_size = 3900
was set buffer_size = 15600
0 - Front Left
1 - Front Right
Time per period = 2.763582
0 - Front Left
If I provide basically ANY period-time hint, then speaker-test works.
This all comes down to the buffer_size, buffer_time, period_size, period_time and num_periods calculations failing in sound/core/pcm_native.c in the linux-toradex.git kernel (linked above).
For the Java APIs, there doesn't seem to be a way to set a period-time hint like this, so there doesn't appear to be any “easy” way to get audio out of the headphone jack on Apalis-IMX8/Ixora. If the customer runs the same Java code on a Raspberry Pi 4 or on their own PC, it “just works”.
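The only sizing hint javax.sound.sampled exposes at all is the buffer size (in bytes) passed to open(); here is a minimal sketch of forcing that under the 16384-frame limit (the 15600-frame figure is my own pick, mirroring the 325 ms case that worked for speaker-test, and whether the ALSA provider actually turns this into hw_params the driver accepts is exactly what seems to go wrong):

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class BufferHintSketch {
    public static void main(String[] args) throws Exception {
        AudioFormat fmt = new AudioFormat(48000f, 16, 2, true, false);
        SourceDataLine line = AudioSystem.getSourceDataLine(fmt);
        // SourceDataLine.open(AudioFormat, int) takes the requested buffer size in BYTES:
        // 15600 frames * 4 bytes/frame = 62400 bytes, roughly the 325 ms that worked above.
        line.open(fmt, 15600 * 4);
        line.start();
        // ... write PCM data, then drain and close ...
        line.drain();
        line.close();
    }
}

Even if a hint like that helps, it only works around the constraint calculation rather than fixing it.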
Here is a simple ClipPlayerDemo.java file which works out of the box for all other audio codecs I’ve tested EXCEPT the SGTL5000 on Apalis-IMX8:
/*
 * Based on documentation / code found at the following locations:
 *  - Overview of the Sampled Package (The Java™ Tutorials > Sound)
 *  - javax.sound.sampled (Java Platform SE 8)
 *  - How to play back audio in Java with examples
 */
import java.io.File;
import java.io.IOException;

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioPermission;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.Control;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Line;
import javax.sound.sampled.LineEvent;
import javax.sound.sampled.LineListener;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.UnsupportedAudioFileException;

public class ClipPlayerDemo implements LineListener {

    /* Buffer size for SourceDataLine writing */
    private static final int BUFFER_SIZE = 4096;

    /* Notifies the wait loop that play has completed */
    private boolean playCompleted = false;

    /* Play the file through a Clip obtained from the selected mixer. */
    private void playClip(File audioFile, Mixer.Info minfo) {
        try {
            AudioInputStream audioStream = AudioSystem.getAudioInputStream(audioFile);
            Clip line = (Clip) AudioSystem.getClip(minfo);
            line.addLineListener(this);
            line.open(audioStream);
            Control[] controls = line.getControls();
            for (int b = 0; b < controls.length; b++) {
                System.out.println("===C desc=[" + controls[b].toString() + "]");
            }
            line.start();
            while (!playCompleted) {
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException ex) {
                    ex.printStackTrace();
                }
            }
            line.close();
            audioStream.close();
        } catch (UnsupportedAudioFileException ex) {
            System.out.println("ERROR: CLIP: The specified audio file is not supported.");
        } catch (LineUnavailableException ex) {
            System.out.println("ERROR: CLIP: Audio line for playing back is unavailable.");
            ex.printStackTrace();
        } catch (IOException ex) {
            System.out.println("ERROR: Error playing the audio file.");
        }
    }

    /* Play the file through a SourceDataLine obtained from the selected mixer. */
    private void playSourceDataLine(File audioFile, Mixer.Info minfo) {
        Mixer mixer = AudioSystem.getMixer(minfo);
        byte[] bytesBuffer = new byte[BUFFER_SIZE];
        int bytesRead = -1;
        try {
            AudioInputStream audioStream = AudioSystem.getAudioInputStream(audioFile);
            SourceDataLine line = (SourceDataLine) mixer.getLine(new Line.Info(SourceDataLine.class));
            line.open(audioStream.getFormat());
            Control[] controls = line.getControls();
            for (int b = 0; b < controls.length; b++) {
                System.out.println("===C desc=[" + controls[b].toString() + "]");
            }
            System.out.println("SourceDataLine opened.");
            line.start();
            System.out.println("SourceDataLine started.");
            while ((bytesRead = audioStream.read(bytesBuffer)) != -1) {
                line.write(bytesBuffer, 0, bytesRead);
            }
            line.drain();
            System.out.println("SourceDataLine stopped.");
            line.close();
            System.out.println("SourceDataLine closed.");
            audioStream.close();
        } catch (UnsupportedAudioFileException ex) {
            System.out.println("ERROR: LINE: The specified audio file is not supported.");
        } catch (LineUnavailableException ex) {
            System.out.println("ERROR: LINE: Audio line for playing back is unavailable.");
            ex.printStackTrace();
        } catch (IOException ex) {
            System.out.println("ERROR: Error playing the audio file.");
        }
    }

    public void play(String audioFilePath, Mixer.Info minfo) {
        File audioFile = new File(audioFilePath);
        System.out.println("Playing " + audioFilePath + " ...");
        playClip(audioFile, minfo);
        playSourceDataLine(audioFile, minfo);
    }

    @Override
    public void update(LineEvent event) {
        LineEvent.Type type = event.getType();
        if (type == LineEvent.Type.OPEN) {
            System.out.println("Clip opened.");
        } else if (type == LineEvent.Type.START) {
            this.playCompleted = false;
            System.out.println("Clip started.");
        } else if (type == LineEvent.Type.STOP) {
            this.playCompleted = true;
            System.out.println("Clip stopped.");
        } else if (type == LineEvent.Type.CLOSE) {
            System.out.println("Clip closed.");
        }
    }

    public static void main(String[] args) {
        Mixer.Info[] mixers = AudioSystem.getMixerInfo();
        Mixer.Info outputMixerInfo = null;
        if (args.length != 2) {
            System.out.println("Usage: <device> <file>");
        } else {
            System.out.println("Usage: device:[" + args[0] + "] file:[" + args[1] + "]");
            /* Pick the first mixer whose name contains the requested device string. */
            for (int i = 0; i < mixers.length; i++) {
                System.out.println("=MI name=[" + mixers[i].getName()
                        + "], desc=[" + mixers[i].getDescription() + "]");
                if (mixers[i].getName().contains(args[0])) {
                    outputMixerInfo = mixers[i];
                    break;
                }
            }
            new ClipPlayerDemo().play(args[1], outputMixerInfo);
        }
    }
}
On Apalis-IMX8 with OpenJDK-11 installed, it can be run with something like:
java ClipPlayerDemo.java "hw:2,0" "/usr/share/sounds/alsa/Front_Left.wav"
On a Raspberry Pi 4 it can be run with:
java ClipPlayerDemo.java "Headphones [plughw:1,0]" "/usr/share/sounds/alsa/Front_Left.wav"
Let me know if you need more detail.
– Mike