Apalis-imx8 SGTL5000 codec issues

Hello,

We have a customer testing their Java application on Apalis iMX8 (i.MX8QM) hardware paired with an Ixora carrier board.

Part of their application uses the SGTL5000 codec chip to play audio out of the headphone jack using the javax.sound.sampled APIs.

A Java error is encountered any time a SourceDataLine is opened, for any wave-file format presented. The error indicates the hw_params aren't being set correctly.

After instrumenting the current Toradex branch of the Linux Kernel here:
https://git.toradex.com/cgit/linux-toradex.git/tree/sound/core/pcm_native.c?h=toradex_5.4-2.3.x-imx#n2116
I noticed that the sound constraint calculation ends up with a bad period count when the buffer size gets locked to 16384 frames (as it does with almost any 2-channel audio).

Using the current Toradex Multimedia Image (2021-10), I can illustrate the problem with the standard ALSA utilities.

Command that works:

aplay -v -Dsysdefault:CARD=apalisimx8qmsgt /usr/share/sounds/alsa/Front_Left.wav

Its setup is:
stream : PLAYBACK
access : RW_INTERLEAVED
format : S16_LE
subformat : STD
channels : 1
rate : 48000
exact rate : 48000 (48000/1)
msbits : 16
buffer_size : 24000
period_size : 6000
period_time : 125000
tstamp_mode : NONE
tstamp_type : MONOTONIC
period_step : 1
avail_min : 6000
period_event : 0
start_threshold : 24000
stop_threshold : 24000
silence_threshold: 0
silence_size : 0
boundary : 6755399441055744000
appl_ptr : 0
hw_ptr : 0
NOTE: the period_time of 125000 is 1/4 of the total buffer time of 500000 us. That maximum buffer time is documented here: https://linux.die.net/man/1/aplay
-B, --buffer-time=#

Buffer duration is # microseconds If no buffer time and no buffer size is given then the maximal allowed buffer time but not more than 500ms is set.
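The hw_params above can be reproduced arithmetically. This is a small sketch (class and method names are mine, not from any Toradex or ALSA API) showing how the 500 ms default cap yields the 24000-frame buffer and, with the 4 periods seen here, the 6000-frame period that aplay reports:

```java
// Sketch: derive aplay's reported hw_params from its documented defaults.
public class AplayDefaults {
    // Frames needed to hold `micros` microseconds of audio at `rate` Hz.
    static long framesFor(int rate, long micros) {
        return rate * micros / 1_000_000;
    }

    public static void main(String[] args) {
        long buffer = framesFor(48000, 500_000); // 500 ms cap -> 24000 frames
        long period = buffer / 4;                // 4 periods   -> 6000 frames
        System.out.println(buffer + " / " + period);
    }
}
```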

Let’s try a standard speaker-test using the SGTL5000 codec:

speaker-test -Dsysdefault:CARD=apalisimx8qmsgt --test wav

speaker-test 1.2.1

Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 1 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 132 to 32768
Period size range from 66 to 16380
Using max buffer size 32768
Periods = 4
Unable to set nperiods 4 for playback: Invalid argument
Setting of hwparams failed: Invalid argument

If I specify the 500000us buffer time that aplay uses as a default, then the command works:

speaker-test --buffer 500000 -Dsysdefault:CARD=apalisimx8qmsgt --test wav

speaker-test 1.2.1

Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 1 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 132 to 32768
Period size range from 66 to 16380
Requested period time 125000 us
Periods = 4
was set period_size = 6000
was set buffer_size = 24000
0 - Front Left
Time per period = 1.001557
0 - Front Left
Time per period = 1.499898
0 - Front Left

If I try and use 2 channels, the command fails again:

speaker-test --buffer 500000 -c2 -Dsysdefault:CARD=apalisimx8qmsgt --test wav

speaker-test 1.2.1

Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 2 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 72 to 16384
Period size range from 36 to 8190
Requested buffer time 500000 us
Unable to set buffer time 500000 us for playback: Invalid argument
Setting of hwparams failed: Invalid argument
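One plausible reading of this failure, consistent with the ranges printed above: at 2 channels the driver caps the buffer at 16384 frames, but 500 ms at 48 kHz needs 24000 frames, so the requested buffer time cannot be honored and the constraint resolution falls over. A quick arithmetic check (names are mine, illustrative only):

```java
// Sketch: check requested buffer times against the 16384-frame cap the
// driver reports for 2-channel playback.
public class BufferFit {
    static long framesFor(int rate, long micros) {
        return rate * micros / 1_000_000;
    }

    static boolean fits(int rate, long micros, long maxFrames) {
        return framesFor(rate, micros) <= maxFrames;
    }

    public static void main(String[] args) {
        System.out.println(fits(48000, 500_000, 16384)); // 24000 frames: false
        System.out.println(fits(48000, 325_000, 16384)); // 15600 frames: true
    }
}
```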

I can adjust the buffer down to 325000us and then speaker-test works:

speaker-test --buffer 325000 -c2 -Dsysdefault:CARD=apalisimx8qmsgt --test wav

speaker-test 1.2.1

Playback device is sysdefault:CARD=apalisimx8qmsgt
Stream parameters are 48000Hz, S16_LE, 2 channels
WAV file(s)
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 72 to 16384
Period size range from 36 to 8190
Requested buffer time 325000 us
Periods = 4
was set period_size = 3900
was set buffer_size = 15600
0 - Front Left
1 - Front Right
Time per period = 2.763582
0 - Front Left

If I provide essentially ANY period-time hint, speaker-test works.

This all comes down to the buffer_size, buffer_time, period_size, period_time, and num_periods calculations failing in sound/core/pcm_native.c in the linux-toradex.git kernel (the Linux kernel for Apalis, Colibri and Verdin modules).

For the Java APIs, setting the period time this way seems to be problematic, and there doesn't appear to be any easy way to get audio out of the headphone jack on Apalis-iMX8/Ixora. If the customer tests their Java code on a Raspberry Pi 4 or their own PC, it just works.
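For reference, there is one Java-level knob that may be worth trying (untested on this board): javax.sound.sampled lets the application request a buffer size explicitly via SourceDataLine.open(AudioFormat, int bufferSizeInBytes), which is the closest analogue to aplay's --buffer-time. A sketch, with helper names of my own invention; main() only does the size arithmetic so it runs without audio hardware:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class ExplicitBuffer {
    // Bytes needed for `micros` microseconds of audio in the given format.
    static int bufferBytes(AudioFormat fmt, long micros) {
        long frames = (long) ((double) fmt.getFrameRate() * micros / 1_000_000.0);
        return (int) (frames * fmt.getFrameSize());
    }

    // Open the line with an explicit buffer request instead of the default.
    // Not called from main(), since it needs real audio hardware.
    static SourceDataLine openWithBuffer(AudioFormat fmt, long micros)
            throws LineUnavailableException {
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(
                new DataLine.Info(SourceDataLine.class, fmt));
        line.open(fmt, bufferBytes(fmt, micros));
        return line;
    }

    public static void main(String[] args) {
        // 16-bit stereo at 48 kHz: frame size is 4 bytes.
        AudioFormat fmt = new AudioFormat(48000f, 16, 2, true, false);
        System.out.println(bufferBytes(fmt, 325_000)); // 15600 frames * 4 bytes
    }
}
```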

Here is a simple ClipPlayerDemo.java file which works out of the box for all other audio codecs I’ve tested EXCEPT the SGTL5000 on Apalis-IMX8:


import java.io.File;
import java.io.IOException;

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioPermission;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.Control;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Line;
import javax.sound.sampled.LineEvent;
import javax.sound.sampled.LineListener;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.UnsupportedAudioFileException;

public class ClipPlayerDemo implements LineListener {
/* Buffer size for SourceDataLine writing */
private static final int BUFFER_SIZE = 4096;

/* Notifies wait loop that play has completed */
private boolean playCompleted = false;

private void playClip(File audioFile, Mixer.Info minfo) {
    try {
        AudioInputStream audioStream = AudioSystem.getAudioInputStream(audioFile);
        Clip line = (Clip)AudioSystem.getClip(minfo);

        line.addLineListener(this);
        line.open(audioStream);

        Control[] controls = line.getControls();
        for (int b = 0; b < controls.length; b++) {
            System.out.println("===C desc=[" + controls[b].toString() + "]");
        }

        line.start();

        while (!playCompleted) {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException ex) {
                ex.printStackTrace();
            }
        }

        line.close();
        audioStream.close();
    } catch (UnsupportedAudioFileException ex) {
        System.out.println("ERROR: CLIP: The specified audio file is not supported.");
    } catch (LineUnavailableException ex) {
        System.out.println("ERROR: CLIP: Audio line for playing back is unavailable.");
        ex.printStackTrace();
    } catch (IOException ex) {
        System.out.println("ERROR: Error playing the audio file.");
    }
}

private void playSourceDataLine(File audioFile, Mixer.Info minfo) {
    Mixer mixer = AudioSystem.getMixer(minfo);
    byte[] bytesBuffer = new byte[BUFFER_SIZE];
    int bytesRead = -1, div = 1;

    try {
        AudioInputStream audioStream = AudioSystem.getAudioInputStream(audioFile);
        SourceDataLine line = (SourceDataLine) mixer.getLine(new Line.Info(SourceDataLine.class));

        line.open(audioStream.getFormat());

        Control[] controls = line.getControls();
        for (int b = 0; b < controls.length; b++) {
            System.out.println("===C desc=[" + controls[b].toString() + "]");
        }

        System.out.println("SourceDataLine opened.");
        line.start();
        System.out.println("SourceDataLine started.");

        while ((bytesRead = audioStream.read(bytesBuffer)) != -1) {
            line.write(bytesBuffer, 0, bytesRead);
        }

        line.drain();
        System.out.println("SourceDataLine stopped.");
        line.close();
        System.out.println("SourceDataLine closed.");
        audioStream.close();
    } catch (UnsupportedAudioFileException ex) {
        System.out.println("ERROR: LINE: The specified audio file is not supported.");
    } catch (LineUnavailableException ex) {
        System.out.println("ERROR: LINE: Audio line for playing back is unavailable.");
        ex.printStackTrace();
    } catch (IOException ex) {
        System.out.println("ERROR: Error playing the audio file.");
    }
}

public void play(String audioFilePath, Mixer.Info minfo) {
    File audioFile = new File(audioFilePath);
    System.out.println("Playing " + audioFilePath + " ...");

    playClip(audioFile, minfo);
    playSourceDataLine(audioFile, minfo);
}

@Override public void update(LineEvent event) {
    LineEvent.Type type = event.getType();

    if (type == LineEvent.Type.OPEN) {
        System.out.println("Clip opened.");
    } else if (type == LineEvent.Type.START) {
        this.playCompleted = false;
        System.out.println("Clip started.");
    } else if (type == LineEvent.Type.STOP) {
        this.playCompleted = true;
        System.out.println("Clip stopped.");
    } else if (type == LineEvent.Type.CLOSE) {
        System.out.println("Clip closed.");
    }
}

public static void main(String[] args) {
    Mixer.Info[] mixers = AudioSystem.getMixerInfo();
    Mixer.Info outputMixerInfo = null;

    if (args.length != 2) {
        System.out.println("Usage: <device> <file>");
    } else {
        System.out.println("Usage: device:[" + args[0] + "] file:[" + args[1] + "]");

        for (int i = 0; i < mixers.length; i++) {
            Mixer mixer = AudioSystem.getMixer(mixers[i]);
            System.out.println("=MI name=[" + mixers[i].getName() + "], desc=[" + mixers[i].getDescription() + "]");

            if (mixers[i].getName().contains(args[0])) {
                outputMixerInfo = mixers[i];
                break;
            }
        }

        new ClipPlayerDemo().play(args[1], outputMixerInfo);
    }
}

}

On Apalis-IMX8 with OpenJDK-11 installed, it can be run with something like:
java ClipPlayerDemo.java "hw:2,0" "/usr/share/sounds/alsa/Front_Left.wav"

On a Raspberry Pi 4 it can be run with:
java ClipPlayerDemo.java "Headphones [plughw:1,0]" "/usr/share/sounds/alsa/Front_Left.wav"

Let me know if you need more detail.

– Mike

Hello Mike,
I tried your command examples on my board and could see the problems you mention. I’ll investigate what’s going on and I will get back to you if I have other questions or if I find something useful.
Thank you for your detailed description so far.

Best Regards,
Rafael Beims

Thank you Rafael,

Don’t hesitate to reach out if you need more information.

– Mike

Hi @mike-foundries,
I did a bit more research on this and found something interesting. The default PCM output device configured on the Apalis is a plug device. Plug devices can perform automatic format conversions when needed to make sure the format is supported by the underlying hardware.
If we just use the default device (without the -Dsysdefault:CARD= option), your examples all work correctly and I can hear audio coming out of the headphone output.
Based on that, I compiled your Java application to check whether for some reason it wasn't possible to just let the default device (the plug device instead of the hw device) be used.
I ran some tests and found that it is indeed possible to play audio from the Java app by doing something like:

 java ClipPlayerDemo "default" "/usr/share/sounds/alsa/Front_Left.wav"

I also made a quick change to the app to print out the selected mixer device when found:

if (mixers[i].getName().contains(args[0])) {
    outputMixerInfo = mixers[i];

    System.out.println("selected: " + outputMixerInfo.getName());
    break;
}

After doing that I could see that when I run the example with the same command line you did I get the following:

root@apalis-imx8:/tmp# java ClipPlayerDemo "hw:2,0" "/usr/share/sounds/alsa/Front_Left.wav"    
Usage: device:[hw:2,0] file:[/usr/share/sounds/alsa/Front_Left.wav]
=MI name=[apalisimx8qmsgt [default]], desc=[Direct Audio Device: apalis-imx8qm-sgtl5000, 59050000.sai-sgtl5000 sgtl5000-0, ]
=MI name=[imxspdif [plughw:0,0]], desc=[Direct Audio Device: imx-spdif, S/PDIF PCM snd-soc-dummy-dai-0, S/PDIF PCM snd-soc-dummy-dai-0]
=MI name=[apalisimx8qmsgt [plughw:1,0]], desc=[Direct Audio Device: apalis-imx8qm-sgtl5000, 59050000.sai-sgtl5000 sgtl5000-0, ]
=MI name=[imxaudiohdmitx [plughw:2,0]], desc=[Direct Audio Device: imx-audio-hdmi-tx, imx8 hdmi i2s-hifi-0, ]
selected: imxaudiohdmitx [plughw:2,0]
Playing /usr/share/sounds/alsa/Front_Left.wav ...
ERROR: CLIP: Audio line for playing back is unavailable.
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 48000.0 Hz, 16 bit, mono, 2 bytes/frame, little-endian not supported.
	at com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.implOpen(Unknown Source)
	at com.sun.media.sound.AbstractDataLine.open(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.open(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.open(Unknown Source)
	at ClipPlayerDemo.playClip(ClipPlayerDemo.java:39)
	at ClipPlayerDemo.play(ClipPlayerDemo.java:111)
	at ClipPlayerDemo.main(ClipPlayerDemo.java:153)
ERROR: LINE: Audio line for playing back is unavailable.
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 48000.0 Hz, 16 bit, mono, 2 bytes/frame, little-endian not supported.
	at com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(Unknown Source)
	at com.sun.media.sound.AbstractDataLine.open(Unknown Source)
	at com.sun.media.sound.AbstractDataLine.open(Unknown Source)
	at ClipPlayerDemo.playSourceDataLine(ClipPlayerDemo.java:77)
	at ClipPlayerDemo.play(ClipPlayerDemo.java:112)
	at ClipPlayerDemo.main(ClipPlayerDemo.java:153)

In this specific case the audio output that’s being selected is the hdmi one, instead of the headphone one. Calling it like this plays audio but still gives me an error:

root@apalis-imx8:/tmp# java ClipPlayerDemo "hw:1,0" "/usr/share/sounds/alsa/Front_Left.wav"
Usage: device:[hw:1,0] file:[/usr/share/sounds/alsa/Front_Left.wav]
=MI name=[apalisimx8qmsgt [default]], desc=[Direct Audio Device: apalis-imx8qm-sgtl5000, 59050000.sai-sgtl5000 sgtl5000-0, ]
=MI name=[imxspdif [plughw:0,0]], desc=[Direct Audio Device: imx-spdif, S/PDIF PCM snd-soc-dummy-dai-0, S/PDIF PCM snd-soc-dummy-dai-0]
=MI name=[apalisimx8qmsgt [plughw:1,0]], desc=[Direct Audio Device: apalis-imx8qm-sgtl5000, 59050000.sai-sgtl5000 sgtl5000-0, ]
selected: apalisimx8qmsgt [plughw:1,0]
Playing /usr/share/sounds/alsa/Front_Left.wav ...
ERROR: CLIP: Audio line for playing back is unavailable.
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 48000.0 Hz, 16 bit, mono, 2 bytes/frame, little-endian not supported.
	at com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.implOpen(Unknown Source)
	at com.sun.media.sound.AbstractDataLine.open(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.open(Unknown Source)
	at com.sun.media.sound.DirectAudioDevice$DirectClip.open(Unknown Source)
	at ClipPlayerDemo.playClip(ClipPlayerDemo.java:39)
	at ClipPlayerDemo.play(ClipPlayerDemo.java:111)
	at ClipPlayerDemo.main(ClipPlayerDemo.java:153)
===C desc=[Master Gain with current value: 0.0 dB (range: -80.0 - 6.0206)]
===C desc=[Mute Control with current value: False]
SourceDataLine opened.
SourceDataLine started.
SourceDataLine stopped.
SourceDataLine closed.

So, with this I have a followup question:
Do you have a specific reason for using the hardware device directly, or would it suffice to use the "default" device (or a previously configured "plug" device, which could be set up in /etc/asound.conf)?
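For illustration, a minimal /etc/asound.conf along those lines might look like this (a hypothetical sketch, not the actual file shipped on the image; the card index 1 is assumed from the mixer listings above and must match your board):

```
# Hypothetical minimal /etc/asound.conf: route "default" through a plug
# device so ALSA converts formats before they reach the SGTL5000.
pcm.!default {
    type plug
    slave.pcm "hw:1,0"
}

ctl.!default {
    type hw
    card 1
}
```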

Please let me know your thoughts.

Regards,
Rafael Beims

Hi Rafael,

The quick answer to your question about "default" device usage is that when I tested it, the audio wasn't coming out of the headphones, and I didn't check how it was routed/configured. :disappointed:

I appreciate your effort and the follow up. Do you have an example of /etc/asound.conf configuring the headphones as the default plug device?

The customer's carrier board will have the codec dedicated to the line out, so I think this is a good solution.

Thanks,
Mike

Hi Mike,

As far as I can see, our default asound.conf currently routes through the plug device to the headphone output. It would be great if you could test it on your end to check whether the behavior matches what I saw here.

Hello Rafael,

The customer is using Java and ALSA tool configurations in a container, so in this case we are not using any of the meta-toradex-* Yocto layers. Could you point me to your asound.conf? I can copy it into the container for testing.

When I tested ALSA with the default configurations, I wasn't getting sound out of the headphones using "default".

Here’s the asound.conf (2.8 KB) file that I have on my apalis.

Hey @mike-foundries, do you have any feedback about this issue? Did it end up working with the changed asound.conf file?

Hello @rafael.tx, I have tested the asound.conf and it sort of works.

It does indeed route the default output to the SGTL5000 codec as a Direct Audio device.

But it seems that audio is muted until I run this once:

amixer -Dsysdefault:CARD=apalisimx8qmsgt sset "Headphone" unmute

I can adjust the MUTE and MASTER_GAIN controls in the Java sample, but they do not enable audio. Everything I've tried still ends up with silence coming out of the headphone jack until I run the above amixer command.

Here’s an example of the Java code I’m adding:

// Requires javax.sound.sampled.BooleanControl and FloatControl imports.
BooleanControl mute = (BooleanControl) line.getControl(BooleanControl.Type.MUTE);
mute.setValue(false);
FloatControl gain = (FloatControl) line.getControl(FloatControl.Type.MASTER_GAIN);
gain.setValue(0);

Thank you,
– Mike

Hi @mike-foundries
Sorry for the delay. It seems that the ALSA outputs are muted by default, as you already noticed.
I did not verify this, but I think the reason your code doesn't disable the mute is that you have to unmute the headphone output itself, and the line variable holds a mixer pointer for the plughw device, not the headphone out.
Anyway, you could also try to save the state of the headphone output after disabling the mute with:

$ alsactl store
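On the Java side, one possible (untested on this board) alternative to shelling out to amixer is the javax.sound.sampled Port API, which exposes mixer ports such as Port.Info.HEADPHONE separately from the PCM line, matching the distinction described above. A sketch; whether the Apalis mixer actually exposes a headphone port with a MUTE control is an assumption:

```java
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.BooleanControl;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Port;

public class UnmuteHeadphone {
    // Try to unmute the headphone port, if the platform exposes one.
    static boolean unmute() throws LineUnavailableException {
        if (!AudioSystem.isLineSupported(Port.Info.HEADPHONE)) {
            return false; // no headphone port exposed by this mixer
        }
        Port port = (Port) AudioSystem.getLine(Port.Info.HEADPHONE);
        port.open();
        // Throws IllegalArgumentException if the port has no MUTE control.
        BooleanControl mute =
                (BooleanControl) port.getControl(BooleanControl.Type.MUTE);
        mute.setValue(false); // unmute the port itself, not the PCM line
        port.close();
        return true;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(unmute() ? "unmuted" : "no headphone port");
    }
}
```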

Hi,

I think I'm facing the same problem, but our application requires using the hardware device directly via PortAudio.
I tried the pa_devs example from PortAudio to list the available devices.
With PortAudio compiled with '--enable-debug-output', the output of pa_devs for the analog audio device and the HDMI device is:

FillInDevInfo: Filling device info for: apalis-imx8qm-sgtl5000: - (hw:1,0)
GropeDevice: collecting info ..
Expression 'alsa_snd_pcm_hw_params_set_buffer_size_near( pcm, hwParams, &alsaBufferFrames )' failed in '../portaudio/src/hostapi/alsa/pa_linux_alsa.c', line: 922
Host error description: Invalid argument
FillInDevInfo: Failed groping hw:1,0 for capture
FillInDevInfo: Filling device info for: imx-audio-hdmi-tx: - (hw:2,0)
GropeDevice: collecting info ..
Expression 'alsa_snd_pcm_hw_params_set_buffer_size_near( pcm, hwParams, &alsaBufferFrames )' failed in '../portaudio/src/hostapi/alsa/pa_linux_alsa.c', line: 922
Host error description: Invalid argument
FillInDevInfo: Failed groping hw:2,0 for playback

Can you give me any tips for further investigation?

Best regards,
Jonas

Hi @jonas-licht ,

Sorry for the delay in answering.

Are you still facing the same issue? If yes, can you please create a new community question?

Thanks in advance,
Daniel Morais