diff mbox series

kselftests/alsa: pcm - move more configuration to configuration files

Message ID 20221201173333.2494019-1-perex@perex.cz
State Accepted
Commit 348d09fcd1b6ba455141882daf1d50ff33cd0bf8
Series kselftests/alsa: pcm - move more configuration to configuration files

Commit Message

Jaroslav Kysela Dec. 1, 2022, 5:33 p.m. UTC
Obtain all test parameters from the configuration files. The defaults
are defined in the pcm-test.conf file. The test count and parameters
may be variable per specific hardware.

Also, handle alt_formats field now (with the fixes in the format loop).
It replaces the original "automatic" logic which is not so universal.

The code may be further extended to skip various tests based
on the configuration hints, if the exact PCM hardware parameters
are not available for the given hardware.

Cc: Mark Brown <broonie@kernel.org>
Cc: Shuah Khan <shuah@kernel.org>
Cc: linux-kselftest@vger.kernel.org
Signed-off-by: Jaroslav Kysela <perex@perex.cz>
---
 tools/testing/selftests/alsa/Makefile         |   2 +-
 tools/testing/selftests/alsa/alsa-local.h     |   3 +
 tools/testing/selftests/alsa/conf.c           |  26 ++++-
 .../alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf  |   8 ++
 tools/testing/selftests/alsa/pcm-test.c       | 102 ++++++++++++------
 tools/testing/selftests/alsa/pcm-test.conf    |  16 +++
 6 files changed, 121 insertions(+), 36 deletions(-)
 create mode 100644 tools/testing/selftests/alsa/pcm-test.conf

Comments

Mark Brown Dec. 1, 2022, 7:52 p.m. UTC | #1
On Thu, Dec 01, 2022 at 06:33:33PM +0100, Jaroslav Kysela wrote:

> Obtain all test parameters from the configuration files. The defaults
> are defined in the pcm-test.conf file. The test count and parameters
> may be variable per specific hardware.

> Also, handle alt_formats field now (with the fixes in the format loop).
> It replaces the original "automatic" logic which is not so universal.

> The code may be further extended to skip various tests based
> on the configuration hints, if the exact PCM hardware parameters
> are not available for the given hardware.

> --- a/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
> +++ b/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
> @@ -55,6 +55,14 @@ card.hda {
>  				period_size 24000
>  				buffer_size 192000
>  			}
> +			test.time3 {
> +				access RW_INTERLEAVED
> +				format S16_LE
> +				rate 44100
> +				channels 2
> +				period_size 24000
> +				buffer_size 192000
> +			}

I really do think we should be giving these tests names which help people
understand what the tests are intending to cover; it'll make it easier
both to understand the results and to maintain the configurations going
forward.  Or at least commenting things, but names are probably better.
Since the timeN is also used to figure out what type of test we're doing,
that'd mean either adding an explicit test_type field

	pcm.test.48k2_S16 {
		test_type time

or block

	pcm.test.time.48k2_S16

or alternatively adding a human readable name field

	pcm.test.time1 {
		description "48kHz Stereo S16_LE"

which is more readable but does mean that automated systems aren't going
to surface the meaningful name for users so readily - you get things
like

	https://linux.kernelci.org/test/plan/id/6388c0cba8274c94402abd12/
	https://linux.kernelci.org/test/plan/id/6388ce6efef77e61ab2abd10/

so there's a UI barrier before people see the test.

mixer-test is kind of "fun" in how many test results it can generate on
bigger systems but hey, and there's some output corruption going on in
the first link which loses us the capture tests.  I have toyed with the
idea of putting the control names into the mixer test names, but some of
the test systems currently struggle with parsing spaces in the test
name.

I do see this is all kind of baked into snd_config_get_type()
unfortunately so perhaps the new description/name field is the best
option here?  We could add that incrementally.

>  	for (pcm = pcm_list; pcm != NULL; pcm = pcm->next) {
> -		test_pcm_time1(pcm, "test.time1", "S16_LE", 48000, 2, 512, 4096);
> -		test_pcm_time1(pcm, "test.time2", "S16_LE", 48000, 2, 24000, 192000);
> +		cfg = pcm->pcm_config;
> +		if (cfg == NULL)
> +			cfg = default_pcm_config;
> +		cfg = conf_get_subtree(cfg, "test", NULL);
> +		if (cfg == NULL)
> +			continue;
> +		snd_config_for_each(i, next, cfg) {

I can see the benefit in moving the defaults to a configuration file
instead of code but rather than having it be an either/or it seems much
better to have the board specific configuration file extend the
defaults, resulting in us looping over both files if we've got both.
We'd need to have something that avoided collisions, perhaps the
simplest thing would be to just add an element into the printed test
name for the source of the config so we get output like:

	ok 1 test.default.time1.0.0.0.PLAYBACK
	ok 2 test.system.time1.0.0.0.PLAYBACK

That does mean that the system test list can't replace the generic test
list but like I said elsewhere I think that would be a good thing for
clarity anyway ("X works on system A but not the very similar system B,
what's broken about system B...").

> --- /dev/null
> +++ b/tools/testing/selftests/alsa/pcm-test.conf
> @@ -0,0 +1,16 @@
> +pcm.test.time1 {
> +	format S16_LE
> +	alt_formats [ S32_LE ]
> +	rate 48000
> +	channels 2
> +	period_size 512
> +	buffer_size 4096
> +}
> +pcm.test.time2 {
> +	format S16_LE
> +	alt_formats [ S32_LE ]
> +	rate 48000
> +	channels 2
> +	period_size 24000
> +	buffer_size 192000
> +}

It's probably especially important that anything in a default
configuration should skip on the constraints not being satisfied since
we've no idea what the hardware we're running on is.  Rather than
requiring skipping to be explicitly configured perhaps we could just set
a flag based on if we're reading the default tests or a system specific
file, I'm not sure I see a sensible use case for system specific tests
specifying a configuration that can't be satisfied.  Doing things that
way the flag could either mean we add skipping or that we report two
results for each configured test:

	not ok 1 test.system.time1.0.0.0.PLAYBACK.constraints
	ok 2 test.system.time1.0.0.0.PLAYBACK # SKIP

which is perhaps marginally simpler to implement and makes it clearer in
the results if it was a straight up logic failure rather than a timing
failure.

I would also like to see 44.1kHz, 96kHz and at least one mono and one 6
channel configuration added (in my patches I added 8kHz mono since it's
the most common practical mono format, and 8kHz stereo so if 8kHz mono
doesn't work it's a bit more obvious whether it's mono or 8kHz that's broken).
That could definitely be done incrementally though.
Jaroslav Kysela Dec. 1, 2022, 8:42 p.m. UTC | #2
On 01. 12. 22 20:52, Mark Brown wrote:
> On Thu, Dec 01, 2022 at 06:33:33PM +0100, Jaroslav Kysela wrote:
> 
>> Obtain all test parameters from the configuration files. The defaults
>> are defined in the pcm-test.conf file. The test count and parameters
>> may be variable per specific hardware.
> 
>> Also, handle alt_formats field now (with the fixes in the format loop).
>> It replaces the original "automatic" logic which is not so universal.
> 
>> The code may be further extended to skip various tests based
>> on the configuration hints, if the exact PCM hardware parameters
>> are not available for the given hardware.
> 
>> --- a/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
>> +++ b/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
>> @@ -55,6 +55,14 @@ card.hda {
>>   				period_size 24000
>>   				buffer_size 192000
>>   			}
>> +			test.time3 {
>> +				access RW_INTERLEAVED
>> +				format S16_LE
>> +				rate 44100
>> +				channels 2
>> +				period_size 24000
>> +				buffer_size 192000
>> +			}
> 
> I really do think we should be giving these tests names which help people
> understand what the tests are intending to cover; it'll make it easier
> both to understand the results and to maintain the configurations going
> forward.  Or at least commenting things, but names are probably better.
> Since the timeN is also used to figure out what type of test we're doing,
> that'd mean either adding an explicit test_type field

I'm fine with the new / more descriptive / names. I just concentrated on
the functionality.

>>   	for (pcm = pcm_list; pcm != NULL; pcm = pcm->next) {
>> -		test_pcm_time1(pcm, "test.time1", "S16_LE", 48000, 2, 512, 4096);
>> -		test_pcm_time1(pcm, "test.time2", "S16_LE", 48000, 2, 24000, 192000);
>> +		cfg = pcm->pcm_config;
>> +		if (cfg == NULL)
>> +			cfg = default_pcm_config;
>> +		cfg = conf_get_subtree(cfg, "test", NULL);
>> +		if (cfg == NULL)
>> +			continue;
>> +		snd_config_for_each(i, next, cfg) {
> 
> I can see the benefit in moving the defaults to a configuration file
> instead of code but rather than having it be an either/or it seems much
> better to have the board specific configuration file extend the
> defaults, resulting in us looping over both files if we've got both.
> We'd need to have something that avoided collisions, perhaps the
> simplest thing would be to just add an element into the printed test
> name for the source of the config so we get output like:
> 
> 	ok 1 test.default.time1.0.0.0.PLAYBACK
> 	ok 2 test.system.time1.0.0.0.PLAYBACK
> 
> That does mean that the system test list can't replace the generic test
> list but like I said elsewhere I think that would be a good thing for
> clarity anyway ("X works on system A but not the very similar system B,
> what's broken about system B...").

As for the generic tool, I would like to have an option to control all mechanisms:

1) only default config
2) only hw specific config
3) default + hw specific configs (merged)

A new field in the pcm configuration may be added for this to build the 
per-device-stream configuration. If we merge those two configs, I think that 
only a field which will skip the default config is sufficient to cover all 
possibilities. The test names are a good identification of what was executed.

>> --- /dev/null
>> +++ b/tools/testing/selftests/alsa/pcm-test.conf
>> @@ -0,0 +1,16 @@
>> +pcm.test.time1 {
>> +	format S16_LE
>> +	alt_formats [ S32_LE ]
>> +	rate 48000
>> +	channels 2
>> +	period_size 512
>> +	buffer_size 4096
>> +}
>> +pcm.test.time2 {
>> +	format S16_LE
>> +	alt_formats [ S32_LE ]
>> +	rate 48000
>> +	channels 2
>> +	period_size 24000
>> +	buffer_size 192000
>> +}
> 
> It's probably especially important that anything in a default
> configuration should skip on the constraints not being satisfied since
> we've no idea what the hardware we're running on is.  Rather than
> requiring skipping to be explicitly configured perhaps we could just set
> a flag based on if we're reading the default tests or a system specific
> file, I'm not sure I see a sensible use case for system specific tests
> specifying a configuration that can't be satisfied.  Doing things that
> way the flag could either mean we add skipping or that we report two
> results for each configured test:
> 
> 	not ok 1 test.system.time1.0.0.0.PLAYBACK.constraints
> 	ok 2 test.system.time1.0.0.0.PLAYBACK # SKIP
> 
> which is perhaps marginally simpler to implement and makes it clearer in
> the results if it was a straight up logic failure rather than a timing
> failure.
> 
> I would also like to see 44.1kHz, 96kHz and at least one mono and one 6
> channel configuration added (in my patches I added 8kHz mono since it's
> the most common practical mono format, and 8kHz stereo so if 8kHz mono
> doesn't work it's a bit more obvious whether it's mono or 8kHz that's broken).
> That could definitely be done incrementally though.

I don't really have any objections about this.

As Takashi already applied your set, I'll try to merge my code with yours and 
keep the added functionality.

				Thanks,
					Jaroslav
Mark Brown Dec. 1, 2022, 10:55 p.m. UTC | #3
On Thu, Dec 01, 2022 at 09:42:52PM +0100, Jaroslav Kysela wrote:

> As for the generic tool, I would like to have an option to control all mechanisms:

> 1) only default config
> 2) only hw specific config
> 3) default + hw specific configs (merged)

> A new field in the pcm configuration may be added for this to build the
> per-device-stream configuration. If we merge those two configs, I think that
> only a field which will skip the default config is sufficient to cover all
> possibilities. The test names are a good identification of what was executed.

That's not quite what I implemented (see below...) but I think it's
not too far off.

> As Takashi already applied your set, I'll try to merge my code with yours
> and keep the added functionality.

I've just prepared a branch which I'm running through my tests now which
just does a revert, applies your patch and then does most of what I
said.  I'll send it later assuming everything looks OK enough on my test
farm (probably in ~1-2 hours).  All this working together is great but
one of us needs to move to a different timezone.  :P  I think we're
*roughly* on the same page here, just some edges to iron out.

For the config merging/selection thing I think for the merge we could
either add a section in the per-board config which overrides defaults or
just go with what I did where we add the test list, as well as adding
command line options to select only the per-board or only the
per-system.  Does any of that line up with what you were thinking?

The following changes since commit 7d721baea138696d5a6746fb5bce0a510a91bd65:

  kselftest/alsa: Add more coverage of sample rates and channel counts (2022-12-01 20:02:14 +0100)

are available in the Git repository at:

  https://git.kernel.org/pub/scm/linux/kernel/git/broonie/ci.git alsa-pcm-test-hacks

for you to fetch changes up to 9e96c58e581313bd639fd51f37c8e831d7b4a05c:

  kselftest/alsa: pcm - Add more coverage by default (2022-12-01 22:46:30 +0000)

----------------------------------------------------------------
Jaroslav Kysela (1):
      kselftest/alsa: pcm - move more configuration to configuration files

Mark Brown (6):
      kselftest/alsa: pcm - Drop recent coverage improvement changes
      kselftest/alsa: pcm - Always run the default set of tests
      kselftest/alsa: pcm - skip tests when we fail to set params
      kselftest/alsa: pcm - Support optional description for tests
      kselftest/alsa: pcm - Provide descriptions for the default tests
      kselftest/alsa: pcm - Add more coverage by default

 tools/testing/selftests/alsa/Makefile              |   2 +-
 tools/testing/selftests/alsa/alsa-local.h          |   3 +
 tools/testing/selftests/alsa/conf.c                |  26 ++-
 .../alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf       |  43 +++--
 tools/testing/selftests/alsa/pcm-test.c            | 205 ++++++++++++++-------
 tools/testing/selftests/alsa/pcm-test.conf         |  63 +++++++
 6 files changed, 250 insertions(+), 92 deletions(-)
 create mode 100644 tools/testing/selftests/alsa/pcm-test.conf

Patch

diff --git a/tools/testing/selftests/alsa/Makefile b/tools/testing/selftests/alsa/Makefile
index a8c0383878d3..77fba3e498cc 100644
--- a/tools/testing/selftests/alsa/Makefile
+++ b/tools/testing/selftests/alsa/Makefile
@@ -14,7 +14,7 @@  TEST_GEN_PROGS := mixer-test pcm-test
 
 TEST_GEN_PROGS_EXTENDED := libatest.so
 
-TEST_FILES := conf.d
+TEST_FILES := conf.d pcm-test.conf
 
 include ../lib.mk
 
diff --git a/tools/testing/selftests/alsa/alsa-local.h b/tools/testing/selftests/alsa/alsa-local.h
index 65f197ea9773..de030dc23bd1 100644
--- a/tools/testing/selftests/alsa/alsa-local.h
+++ b/tools/testing/selftests/alsa/alsa-local.h
@@ -12,6 +12,7 @@ 
 
 snd_config_t *get_alsalib_config(void);
 
+snd_config_t *conf_load_from_file(const char *filename);
 void conf_load(void);
 void conf_free(void);
 snd_config_t *conf_by_card(int card);
@@ -20,5 +21,7 @@  int conf_get_count(snd_config_t *root, const char *key1, const char *key2);
 const char *conf_get_string(snd_config_t *root, const char *key1, const char *key2, const char *def);
 long conf_get_long(snd_config_t *root, const char *key1, const char *key2, long def);
 int conf_get_bool(snd_config_t *root, const char *key1, const char *key2, int def);
+void conf_get_string_array(snd_config_t *root, const char *key1, const char *key2,
+			   const char **array, int array_size, const char *def);
 
 #endif /* __ALSA_LOCAL_H */
diff --git a/tools/testing/selftests/alsa/conf.c b/tools/testing/selftests/alsa/conf.c
index c7ffc8f04195..d7aafe5a1993 100644
--- a/tools/testing/selftests/alsa/conf.c
+++ b/tools/testing/selftests/alsa/conf.c
@@ -125,7 +125,7 @@  static int dump_config_tree(snd_config_t *top)
 	snd_output_close(out);
 }
 
-static snd_config_t *load(const char *filename)
+snd_config_t *conf_load_from_file(const char *filename)
 {
 	snd_config_t *dst;
 	snd_input_t *input;
@@ -235,7 +235,7 @@  static bool test_filename1(int card, const char *filename, const char *sysfs_car
 	snd_config_t *config, *sysfs_config, *card_config, *sysfs_card_config, *node;
 	snd_config_iterator_t i, next;
 
-	config = load(filename);
+	config = conf_load_from_file(filename);
 	if (snd_config_search(config, "sysfs", &sysfs_config) ||
 	    snd_config_get_type(sysfs_config) != SND_CONFIG_TYPE_COMPOUND)
 		ksft_exit_fail_msg("Missing global sysfs block in filename %s\n", filename);
@@ -446,3 +446,25 @@  int conf_get_bool(snd_config_t *root, const char *key1, const char *key2, int de
 		ksft_exit_fail_msg("key '%s'.'%s' is not an bool\n", key1, key2);
 	return !!ret;
 }
+
+void conf_get_string_array(snd_config_t *root, const char *key1, const char *key2,
+			   const char **array, int array_size, const char *def)
+{
+	snd_config_t *cfg;
+	char buf[16];
+	int ret, index;
+
+	ret = conf_get_by_keys(root, key1, key2, &cfg);
+	if (ret == -ENOENT)
+		cfg = NULL;
+	else if (ret < 0)
+		ksft_exit_fail_msg("key '%s'.'%s' search error: %s\n", key1, key2, snd_strerror(ret));
+	for (index = 0; index < array_size; index++) {
+		if (cfg == NULL) {
+			array[index] = def;
+		} else {
+			sprintf(buf, "%i", index);
+			array[index] = conf_get_string(cfg, buf, NULL, def);
+		}
+	}
+}
diff --git a/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf b/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
index 0a83f35d43eb..5b40a916295d 100644
--- a/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
+++ b/tools/testing/selftests/alsa/conf.d/Lenovo_ThinkPad_P1_Gen2.conf
@@ -55,6 +55,14 @@  card.hda {
 				period_size 24000
 				buffer_size 192000
 			}
+			test.time3 {
+				access RW_INTERLEAVED
+				format S16_LE
+				rate 44100
+				channels 2
+				period_size 24000
+				buffer_size 192000
+			}
 		}
 		CAPTURE {
 			# use default tests, check for the presence
diff --git a/tools/testing/selftests/alsa/pcm-test.c b/tools/testing/selftests/alsa/pcm-test.c
index 6e7dfc395b98..e973b03ae1fd 100644
--- a/tools/testing/selftests/alsa/pcm-test.c
+++ b/tools/testing/selftests/alsa/pcm-test.c
@@ -31,7 +31,6 @@  struct pcm_data {
 	struct pcm_data *next;
 };
 
-int num_pcms = 0;
 struct pcm_data *pcm_list = NULL;
 
 int num_missing = 0;
@@ -200,7 +199,6 @@  static void find_pcms(void)
 					pcm_data->pcm_config = conf_get_subtree(card_config, key, NULL);
 					pcm_data->next = pcm_list;
 					pcm_list = pcm_data;
-					num_pcms++;
 				}
 			}
 		}
@@ -219,17 +217,15 @@  static void find_pcms(void)
 	snd_config_delete(config);
 }
 
-static void test_pcm_time1(struct pcm_data *data,
-			   const char *cfg_prefix, const char *sformat,
-			   long srate, long schannels,
-			   long speriod_size, long sbuffer_size)
+static void test_pcm_time(struct pcm_data *data, const char *test_name, snd_config_t *pcm_cfg)
 {
 	char name[64], key[128], msg[256];
 	const char *cs;
 	int i, err;
 	snd_pcm_t *handle = NULL;
 	snd_pcm_access_t access = SND_PCM_ACCESS_RW_INTERLEAVED;
-	snd_pcm_format_t format;
+	snd_pcm_format_t format, old_format;
+	const char *alt_formats[8];
 	unsigned char *samples = NULL;
 	snd_pcm_sframes_t frames;
 	long long ms;
@@ -237,27 +233,23 @@  static void test_pcm_time1(struct pcm_data *data,
 	unsigned int rrate;
 	snd_pcm_uframes_t rperiod_size, rbuffer_size, start_threshold;
 	timestamp_t tstamp;
-	bool pass = false, automatic = true;
+	bool pass = false;
 	snd_pcm_hw_params_t *hw_params;
 	snd_pcm_sw_params_t *sw_params;
 
 	snd_pcm_hw_params_alloca(&hw_params);
 	snd_pcm_sw_params_alloca(&sw_params);
 
-	cs = conf_get_string(data->pcm_config, cfg_prefix, "format", sformat);
+	cs = conf_get_string(pcm_cfg, "format", NULL, "S16_LE");
 	format = snd_pcm_format_value(cs);
 	if (format == SND_PCM_FORMAT_UNKNOWN)
 		ksft_exit_fail_msg("Wrong format '%s'\n", cs);
-	rate = conf_get_long(data->pcm_config, cfg_prefix, "rate", srate);
-	channels = conf_get_long(data->pcm_config, cfg_prefix, "channels", schannels);
-	period_size = conf_get_long(data->pcm_config, cfg_prefix, "period_size", speriod_size);
-	buffer_size = conf_get_long(data->pcm_config, cfg_prefix, "buffer_size", sbuffer_size);
-
-	automatic = strcmp(sformat, snd_pcm_format_name(format)) == 0 &&
-			srate == rate &&
-			schannels == channels &&
-			speriod_size == period_size &&
-			sbuffer_size == buffer_size;
+	conf_get_string_array(pcm_cfg, "alt_formats", NULL,
+				alt_formats, ARRAY_SIZE(alt_formats), NULL);
+	rate = conf_get_long(pcm_cfg, "rate", NULL, 48000);
+	channels = conf_get_long(pcm_cfg, "channels", NULL, 2);
+	period_size = conf_get_long(pcm_cfg, "period_size", NULL, 4096);
+	buffer_size = conf_get_long(pcm_cfg, "buffer_size", NULL, 16384);
 
 	samples = malloc((rate * channels * snd_pcm_format_physical_width(format)) / 8);
 	if (!samples)
@@ -287,16 +279,29 @@  static void test_pcm_time1(struct pcm_data *data,
 					   snd_pcm_access_name(access), snd_strerror(err));
 		goto __close;
 	}
+	i = -1;
 __format:
 	err = snd_pcm_hw_params_set_format(handle, hw_params, format);
 	if (err < 0) {
-		if (automatic && format == SND_PCM_FORMAT_S16_LE) {
-			format = SND_PCM_FORMAT_S32_LE;
-			ksft_print_msg("%s.%d.%d.%d.%s.%s format S16_LE -> S32_LE\n",
-					 cfg_prefix,
-					 data->card, data->device, data->subdevice,
-					 snd_pcm_stream_name(data->stream),
-					 snd_pcm_access_name(access));
+		i++;
+		if (i < ARRAY_SIZE(alt_formats) && alt_formats[i]) {
+			old_format = format;
+			format = snd_pcm_format_value(alt_formats[i]);
+			if (format != SND_PCM_FORMAT_UNKNOWN) {
+				ksft_print_msg("%s.%d.%d.%d.%s.%s format %s -> %s\n",
+						 test_name,
+						 data->card, data->device, data->subdevice,
+						 snd_pcm_stream_name(data->stream),
+						 snd_pcm_access_name(access),
+						 snd_pcm_format_name(old_format),
+						 snd_pcm_format_name(format));
+				samples = realloc(samples, (rate * channels *
+							    snd_pcm_format_physical_width(format)) / 8);
+				if (!samples)
+					ksft_exit_fail_msg("Out of memory\n");
+				snd_pcm_format_set_silence(format, samples, rate * channels);
+				goto __format;
+			}
 		}
 		snprintf(msg, sizeof(msg), "snd_pcm_hw_params_set_format %s: %s",
 					   snd_pcm_format_name(format), snd_strerror(err));
@@ -362,7 +367,7 @@  static void test_pcm_time1(struct pcm_data *data,
 	}
 
 	ksft_print_msg("%s.%d.%d.%d.%s hw_params.%s.%s.%ld.%ld.%ld.%ld sw_params.%ld\n",
-			 cfg_prefix,
+			 test_name,
 			 data->card, data->device, data->subdevice,
 			 snd_pcm_stream_name(data->stream),
 			 snd_pcm_access_name(access),
@@ -411,7 +416,7 @@  static void test_pcm_time1(struct pcm_data *data,
 	pass = true;
 __close:
 	ksft_test_result(pass, "%s.%d.%d.%d.%s%s%s\n",
-			 cfg_prefix,
+			 test_name,
 			 data->card, data->device, data->subdevice,
 			 snd_pcm_stream_name(data->stream),
 			 msg[0] ? " " : "", msg);
@@ -420,19 +425,35 @@  static void test_pcm_time1(struct pcm_data *data,
 		snd_pcm_close(handle);
 }
 
-#define TESTS_PER_PCM 2
-
 int main(void)
 {
 	struct pcm_data *pcm;
+	snd_config_t *global_config, *default_pcm_config, *cfg, *pcm_cfg;
+	snd_config_iterator_t i, next;
+	int num_pcm_tests = 0, num_tests;
+	const char *test_name, *test_type;
 
 	ksft_print_header();
 
+	global_config = conf_load_from_file("pcm-test.conf");
+	default_pcm_config = conf_get_subtree(global_config, "pcm", NULL);
+	if (default_pcm_config == NULL)
+		ksft_exit_fail_msg("default pcm test configuration (pcm compound) is missing\n");
+
 	conf_load();
 
 	find_pcms();
 
-	ksft_set_plan(num_missing + num_pcms * TESTS_PER_PCM);
+	for (pcm = pcm_list; pcm != NULL; pcm = pcm->next) {
+		cfg = pcm->pcm_config;
+		if (cfg == NULL)
+			cfg = default_pcm_config;
+		num_tests = conf_get_count(cfg, "test", NULL);
+		if (num_tests > 0)
+			num_pcm_tests += num_tests;
+	}
+
+	ksft_set_plan(num_missing + num_pcm_tests);
 
 	for (pcm = pcm_missing; pcm != NULL; pcm = pcm->next) {
 		ksft_test_result(false, "test.missing.%d.%d.%d.%s\n",
@@ -441,10 +462,25 @@  int main(void)
 	}
 
 	for (pcm = pcm_list; pcm != NULL; pcm = pcm->next) {
-		test_pcm_time1(pcm, "test.time1", "S16_LE", 48000, 2, 512, 4096);
-		test_pcm_time1(pcm, "test.time2", "S16_LE", 48000, 2, 24000, 192000);
+		cfg = pcm->pcm_config;
+		if (cfg == NULL)
+			cfg = default_pcm_config;
+		cfg = conf_get_subtree(cfg, "test", NULL);
+		if (cfg == NULL)
+			continue;
+		snd_config_for_each(i, next, cfg) {
+			pcm_cfg = snd_config_iterator_entry(i);
+			if (snd_config_get_id(pcm_cfg, &test_name) < 0)
+				ksft_exit_fail_msg("snd_config_get_id\n");
+			test_type = conf_get_string(pcm_cfg, "type", NULL, "time");
+			if (strcmp(test_type, "time") == 0)
+				test_pcm_time(pcm, test_name, pcm_cfg);
+			else
+				ksft_exit_fail_msg("unknown test type '%s'\n", test_type);
+		}
 	}
 
+	snd_config_delete(global_config);
 	conf_free();
 
 	ksft_exit_pass();
diff --git a/tools/testing/selftests/alsa/pcm-test.conf b/tools/testing/selftests/alsa/pcm-test.conf
new file mode 100644
index 000000000000..473a19251b49
--- /dev/null
+++ b/tools/testing/selftests/alsa/pcm-test.conf
@@ -0,0 +1,16 @@ 
+pcm.test.time1 {
+	format S16_LE
+	alt_formats [ S32_LE ]
+	rate 48000
+	channels 2
+	period_size 512
+	buffer_size 4096
+}
+pcm.test.time2 {
+	format S16_LE
+	alt_formats [ S32_LE ]
+	rate 48000
+	channels 2
+	period_size 24000
+	buffer_size 192000
+}