
View Issue Details
ID: 0007857
Project: TestLink
Category: New Feature
View Status: public
Date Submitted: 2017-02-22 10:49
Last Update: 2017-05-13 19:12
Reporter: ikhaliq
Assigned To:
Priority: high
Severity: feature request
Reproducibility: N/A
Status: new
Resolution: open
Platform:
OS:
OS Version:
Product Version: 1.9.15 (2015 Q4)
Fixed in Version:
Summary: 0007857: Enhance configuration support at test plan level
Description: Currently, we have the following two configurations available at test plan level:
* Builds / Releases
* Platforms

During testing of embedded systems, one needs to use combinations of software and hardware. For example, the following are some parameters that need to be considered during test execution:
* Host
* Product Type/Variant
* Build version, e.g. nightly, test/release candidate, etc.
* Target
* Architecture
* Toolchain
* Debug support, e.g. JTAG, built-in, etc.
* Build configuration, i.e. Debug/Release

And there can be many others like these. So, with only the currently available parameters during execution (Build/Release, Platform), it becomes very difficult to incorporate and maintain the different variations/combinations within a single test plan when we need to execute generic test cases on the different combinations.
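As a rough illustration of how quickly these combinations multiply, here is a minimal Python sketch; the parameter names and values are hypothetical examples drawn from the list above, not an actual TestLink data model:

    from itertools import product

    # Hypothetical parameters and values, drawn from the list above.
    parameters = {
        "host": ["Windows 10 64bit", "Ubuntu 16.04 64bit", "RHEL 7 64bit"],
        "architecture": ["ARM", "PPC", "MIPS"],
        "toolchain": ["GCC", "MinGW"],
        "debug_support": ["JTAG", "built-in"],
        "build_config": ["Debug", "Release"],
    }

    # Every combination would have to be tracked somehow within a single test plan.
    combinations = list(product(*parameters.values()))
    print(len(combinations))  # 3 * 3 * 2 * 2 * 2 = 72 combinations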

One can use, for example, Host as the Platform and the build version as the Build/Release, but the other parameters cannot be accommodated appropriately.

Another option is to use separate test plans and incorporate these parameters by creating multiple platforms, but this is complex and difficult to maintain, because we would need to create multiple test plans for shared/generic test cases.

The suggestion in this issue is to add more flexible configuration support at the test plan level, so that the complexity of maintaining test cases for different variations is reduced.

Different options similar to Platform could be added at the test project level (with the choice to add them to a given test plan), e.g. Target, Software, etc.

Or, at least, a very generic option such as "Configuration" could be introduced that behaves like the current "Platform" option. Having this option would certainly help us reduce complexity and improve filtering.

The new configuration field would need to have the same behavior as the Platform parameter in the following areas:
* test plan creation
* test plan execution
* reporting
Tags: No tags attached.
Database (MySQL, Postgres, etc): MySQL
Browser:
PHP Version:
TestCaseID:
QA Team - Task Workflow Status: TBD
Attached Files:
* Screen Shot 2017-05-13 at 20.03.39.png (63,966 bytes) 2017-05-13 18:06
* Screen Shot 2017-05-13 at 20.01.31.png (98,767 bytes) 2017-05-13 18:06
* customFields (2).xml (6,296 bytes) 2017-05-13 18:06
* TestLink_Configuration_feature.png (74,917 bytes) 2017-05-13 18:43

- Relationships
related to 0001650 (closed, fman): Using Custom fields for TestPlan assignment for manage test automation data

-  Notes
(0026015)
fman (administrator)
2017-02-22 16:37

If configuration is like platform, then you already have the solution.

More detailed info, usage, and examples are needed.
Also, the impacts on execution and reports need to be provided.
(0026038)
ikhaliq (reporter)
2017-02-27 10:07

Okay, I'll add more details here.
(0026104)
fman (administrator)
2017-03-13 20:38

any news ?
(0026140)
ikhaliq (reporter)
2017-03-20 12:06

Yes, I did not forget about it. I'll provide the details here soon.
(0026240)
ikhaliq (reporter)
2017-04-04 09:03
edited on: 2017-04-04 09:04

Example use case: Test Execution Management for Debugger Testing for Embedded Systems

Product:
* Test Alpha

Host Platforms:
* Windows 7 64bit
* Windows 7 32bit
* Windows 8 64bit
* Windows 8 32bit
* Windows 10 64bit
* Windows 10 32bit
* Ubuntu 16.04 64bit
* CentOS 7 64bit
* RHEL 7 64bit
* Fedora 23 64bit
* etc...

Toolchains:
* GCC
* MinGW
* etc...

Target Architectures:
* ARM
* PPC
* MIPS
* etc...

Target Boards:
* ARM
** Board A
** Board B
* PPC
** Board C
** Board D
* MIPS
** Board E
* etc...

Debug connection:
* Built-in
* JTAG
* External/3rd party support
* etc...

Workflow Example:

The product "Test Alpha" version 1.0 is released with the specifications mentioned above. Now, we would receive multiple builds so, I use the "Build/Releases" option from testlink to map to the product builds, to mark my test execution on different builds:
* Build 1
* Build 2
* Build 3
etc...

With the currently available functionality, I use the "Platforms" option to map the host operating systems. However, I have no easy way to map the other parameters, i.e. toolchains, target architectures, boards, etc.

Please note that this is a very common use case for IDE testing of embedded software. The valid combinations of the different parameters can easily vary. For example, MIPS + built-in debug connection might be an expected scenario, but PPC + built-in might not be applicable for a particular release.
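As an illustration only, such applicability rules could be expressed along these lines (a minimal Python sketch; the exclusion rule is hypothetical and based solely on the PPC + built-in example above):

    from itertools import product

    architectures = ["ARM", "PPC", "MIPS"]
    debug_connections = ["Built-in", "JTAG", "External/3rd party"]

    def is_applicable(arch, debug):
        # Hypothetical rule for this release: built-in debug is not applicable on PPC.
        return not (arch == "PPC" and debug == "Built-in")

    # Keep only the combinations that are applicable for the release.
    applicable = [(arch, debug)
                  for arch, debug in product(architectures, debug_connections)
                  if is_applicable(arch, debug)]
    print(applicable)  # 8 of the 9 combinations remain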

I could possibly create a separate test plan for each unique combination, but this would be really hectic to maintain, since we expect product releases periodically, with changes in the supported parameters.

Suggested Feature:

There a "project-level" field available which is generically named as Configuration. All required configurations can be added in the project and picked up as desired when preparing test plan.

Suggested Workflow:

Assumption:
* There is a project named "Test Alpha" which exists in TestLink.

Steps:
1- User goes to the Desktop page.
2- Under the "Test Project" category, "Configuration Management" is chosen.
3- This opens a new page, where the "Create Configuration" button is pressed.
3.1- User provides a name for the configuration, e.g. Target Architectures.
3.2- User provides all the possible/desired values in its value field, e.g. ARM, PPC, MIPS, etc.
4- User repeats the same steps to add other required configurations, e.g. Toolchains, Target Boards, etc.
5- Now, the user creates a new test plan by following the current process.
6- Under the test plan contents:
6.1- User adds platforms.
6.2- User adds the configurations available in the project.
6.3- User adds test cases:
6.3.1- User selects the test suite for adding test cases.
6.3.2- Apart from the Platforms list, a list of the configurations available for the test plan appears in the test case selection pane.
6.3.3- User chooses the desired combinations for the test cases, i.e. platform and configurations. Platform has higher precedence for filtering purposes.
6.3.4- User completes the desired addition of test cases.
6.4- User adds the build for execution.
6.5- Now, the execution process is started.
6.5.1- User chooses the test plan. This filters the Platform list as per the current behavior. A combo named Configuration appears below it, automatically populated with the configurations selected for the given platform.
Example: For the RHEL 7 64bit platform, the following combinations were chosen for different test cases:
Target architecture: ARM; Boards: Board A, Board B; Toolchains: GCC, MinGW. So it shows the list as:
ARM / Board A / GCC
ARM / Board B / GCC
ARM / Board B / MinGW
6.5.2- User chooses the desired platform and build.
6.5.3- User chooses the desired combination, and the corresponding filtered test cases appear (very much like filtering with multiple keywords/custom fields) based on the chosen platform and configuration combination.
6.5.4- User completes the test execution for all desired combinations.
6.6- In the reporting view, the user sees the test execution results filtered by Platform > available configuration (the same as in the test execution case).

This is the crude workflow I have in mind. Of course, its feasibility depends on the implementation details. So, I would be happy with any user-friendly workflow that gives me the flexibility to test use cases like the above.

Note: This feature would be entirely optional, for the complex scenarios. The default workflow could be kept the same as it is now.
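To make step 6.5.1 concrete, here is a minimal Python sketch of the data the Configuration combo could be populated from, using the RHEL 7 64bit example above; the structure and names are purely illustrative, not a proposed implementation:

    # Configurations selected per platform at test plan design time (illustrative only).
    selected_configurations = {
        "RHEL 7 64bit": [
            ("ARM", "Board A", "GCC"),
            ("ARM", "Board B", "GCC"),
            ("ARM", "Board B", "MinGW"),
        ],
    }

    def configuration_choices(platform):
        # Labels shown in the Configuration combo for the chosen platform (step 6.5.1).
        return [" / ".join(combo) for combo in selected_configurations.get(platform, [])]

    print(configuration_choices("RHEL 7 64bit"))
    # ['ARM / Board A / GCC', 'ARM / Board B / GCC', 'ARM / Board B / MinGW']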

(0026324)
fman (administrator)
2017-04-27 05:08
edited on: 2017-04-27 05:13

Try playing with custom fields for test cases that are used at the test plan design scope.
See the related issue.

(0026325)
ikhaliq (reporter)
2017-04-27 06:28

I already tried using custom fields but was not able to achieve what was needed. I will look into the linked issue further...
(0026348)
ikhaliq (reporter)
2017-05-13 14:51
edited on: 2017-05-13 14:56

Hi Francisco,

I have gone through the linked issues, and unfortunately they do not cover our requirements. I have also tried playing around with custom fields (CFs), but they do not fulfill the needs either.

To reiterate, we have the following workflow requirements:
* User switches to the Desktop page and sees "Configuration Management" (or any other appropriate name) under the "Test Project" category.
* After clicking the Add New Configuration option, the user can create different value types, i.e. string, list, multi-selection list... the same as the CF configuration options, but these should be available for test plan execution just like Platform.
* After creating as many configurations as required, each with a different name, the user can add them to the required test plan, as is done for platforms.
* Now, the user sees the list of the given configurations and their values at the test case selection level.
* During test case execution, the test cases can be filtered according to the selections made at the test plan design level.

Long story short, custom fields do not create a test case iteration for each value the way platforms do. If chosen for a test case at design level, they do not even appear in the execution history (as is apparent from the current design). So, there is a need for support that is not bound to a specific name and can offer multiple values for test case execution.

Please let me know what you think.

(0026349)
ikhaliq (reporter)
2017-05-13 15:06
edited on: 2017-05-13 15:08

Another way of saying this is that the current Platform feature could be changed to allow flexible configurations:
* Instead of having a fixed "Platform" name, the name should be variable and assigned by the user.
* For a given platform, a list of possible values can be provided.

So, what CFs facilitate at the test case design level is required at the test plan design, execution and reporting levels, i.e. multiple platform/configuration combinations appearing at test plan design and execution.

(0026350)
fman (administrator)
2017-05-13 17:29

The linked issue is about CUSTOM FIELDS TO BE PRESENT AT TEST PLAN DESIGN, and these custom fields can be assigned a different value for each test case that is LINKED TO THE TEST PLAN.
This means that you can add a sort of different configuration for each test case you are linking to the test plan.

regarding
>> If chosen for a test case at design level, they do not even appear in the execution
>> history (as is apparent from the current design).
Your configuration must be wrong, because a CF manageable at DESIGN TIME can be configured to be SHOW ONLY (read-only) at execution time.

>> So, there is a need for support that is not bound to a specific name and can offer
>> multiple values for test case execution.
Unfortunately, I do not understand the meaning of this sentence.

The same applies to your note 0026349.
 
>> 6.3.2- Apart from the Platforms list, a list of the configurations available for the test plan appears in the test case selection pane.
This happens when you use CFs available at test plan design.

I'm going to add some screenshots.
(0026351)
ikhaliq (reporter)
2017-05-13 17:36

If we had the demo (http://demo.testlink.org/latest/) running, I could explain the limitations of CFs more clearly...
(0026352)
ikhaliq (reporter)
2017-05-13 17:40

Let me see if I can share some clear example...
(0026353)
fman (administrator)
2017-05-13 18:05

Do you have time for a short Skype call?
(0026354)
fman (administrator)
2017-05-13 18:05

Workflow with CF AT TEST PLAN DESIGN LEVEL IS:

1. You add test cases to the test plan.
2. After you add a test case to the test plan, you will be able to assign values for the CF AT TEST PLAN DESIGN.

See attachments.
(0026355)
fman (administrator)
2017-05-13 18:24

We can talk about a custom development sponsored (i.e. paid for) by your company.
Just let me know.
(0026356)
ikhaliq (reporter)
2017-05-13 18:54
edited on: 2017-05-13 18:59

Yes, I have already tried the CF workflow at test plan design level.

The CFs map very well to the needs in terms of defining different parameters, i.e. Architecture, Toolchain, etc., but when it comes to assigning these to test cases, they are limiting.

From your example, although we can set the value of CFs for a given test case:
* We won't have any record of the values used in the execution history (once test execution is complete). As you said, only those CFs which were chosen to be available at the test execution level will be available in the history.
* If I want to repeat the same test (a bunch of test cases, in fact) for the different combinations, i.e. Architecture, Toolchain, etc., I can't do that with CFs. And this is one of the major use cases for requesting this support.

I have attached a "possible" workflow diagram (http://mantis.testlink.org/file_download.php?file_id=4715&type=bug) which reflects what I have in mind to solve these needs. Steps 1-3 are already covered by CFs, but step 4 is what is currently missing.

Another way of saying this is that the current implementation of "Platforms" could be modified so that platforms are created the way CFs are created, while still being used in their current workflow. Then we would have multiple "so-called" platforms (preferably with the ability to assign a name to each option) appearing at the test execution level.

Off the top of my head, I don't see it impacting current users of TestLink much. They would have more control in test plan design and execution.

(0026357)
fman (administrator)
2017-05-13 19:08

The impact on the TestLink architecture is not light, because the testplan_tcversions table needs to be changed to hold additional information regarding configuration, and this affects a lot of tables and features.
Implementation will not be easy.

>> * If I want to repeat the same test (a bunch of test cases, in fact) for the different
>> combinations, i.e. Architecture, Toolchain, etc., I can't do that with CFs. And this is
>> one of the major use cases for requesting this support.
You can do this if you define the CF to be available AT EXECUTION TIME.
But a lot of reports would need to be changed, because the latest-execution concept would need to change.

As already stated, if your company is willing to pay for the development we can talk; otherwise, the effort is too high to be done on the regular roadmap.
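For illustration only, the kind of extra dimension described above might look roughly like this (a Python sketch; the class and field names are hypothetical and do not reflect the actual TestLink schema):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TestPlanTCVersionLink:
        # Hypothetical model of the link between a test plan and a test case version.
        testplan_id: int
        tcversion_id: int
        platform_id: Optional[int] = None
        configuration_id: Optional[int] = None  # the additional configuration information

    @dataclass
    class Configuration:
        # Hypothetical user-defined configuration, analogous to a platform but with a
        # user-assigned name and a set of values.
        config_id: int
        testplan_id: int
        name: str   # e.g. "Target Architecture / Board / Toolchain"
        value: str  # e.g. "ARM / Board A / GCC"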
(0026358)
ikhaliq (reporter)
2017-05-13 19:12

Francisco,

Thanks a lot for being responsive. I'll discuss it internally, and let's see what I can arrange.

Thanks,
Irfan

- Issue History
Date Modified Username Field Change
2017-02-22 10:49 ikhaliq New Issue
2017-02-22 16:36 fman QA Team - Task Workflow Status => TBD
2017-02-22 16:36 fman Severity major => feature request
2017-02-22 16:37 fman Note Added: 0026015
2017-02-22 16:38 fman Assigned To => fman
2017-02-22 16:38 fman Status new => feedback
2017-02-27 10:07 ikhaliq Note Added: 0026038
2017-02-27 10:07 ikhaliq Status feedback => assigned
2017-03-13 20:38 fman Note Added: 0026104
2017-03-13 20:38 fman Status assigned => feedback
2017-03-20 12:06 ikhaliq Note Added: 0026140
2017-03-20 12:06 ikhaliq Status feedback => assigned
2017-03-20 14:45 fman Assigned To fman =>
2017-04-04 09:03 ikhaliq Note Added: 0026240
2017-04-04 09:04 ikhaliq Note Edited: 0026240
2017-04-27 05:08 fman Note Added: 0026324
2017-04-27 05:08 fman Assigned To => fman
2017-04-27 05:08 fman Status assigned => feedback
2017-04-27 05:13 fman Relationship added related to 0001650
2017-04-27 05:13 fman Note Edited: 0026324
2017-04-27 06:28 ikhaliq Note Added: 0026325
2017-04-27 06:28 ikhaliq Status feedback => assigned
2017-05-13 14:51 ikhaliq Note Added: 0026348
2017-05-13 14:56 ikhaliq Note Edited: 0026348
2017-05-13 15:06 ikhaliq Note Added: 0026349
2017-05-13 15:08 ikhaliq Note Edited: 0026349
2017-05-13 17:29 fman Note Added: 0026350
2017-05-13 17:29 fman Status assigned => feedback
2017-05-13 17:36 ikhaliq Note Added: 0026351
2017-05-13 17:36 ikhaliq Status feedback => assigned
2017-05-13 17:40 ikhaliq Note Added: 0026352
2017-05-13 18:05 fman Note Added: 0026353
2017-05-13 18:05 fman Note Added: 0026354
2017-05-13 18:06 fman File Added: Screen Shot 2017-05-13 at 20.03.39.png
2017-05-13 18:06 fman File Added: Screen Shot 2017-05-13 at 20.01.31.png
2017-05-13 18:06 fman File Added: customFields (2).xml
2017-05-13 18:24 fman Note Added: 0026355
2017-05-13 18:24 fman Status assigned => feedback
2017-05-13 18:43 ikhaliq File Added: TestLink_Configuration_feature.png
2017-05-13 18:54 ikhaliq Note Added: 0026356
2017-05-13 18:54 ikhaliq Status feedback => assigned
2017-05-13 18:59 ikhaliq Note Edited: 0026356
2017-05-13 19:08 fman Note Added: 0026357
2017-05-13 19:09 fman Assigned To fman =>
2017-05-13 19:09 fman Status assigned => feedback
2017-05-13 19:12 ikhaliq Note Added: 0026358
2017-05-13 19:12 ikhaliq Status feedback => new


