DPTASM 18: Schematic Representation of DPM’s Productized Test Automation Service Model

For better clarity, if you have not already done so, you may want to take a look at DPTASM 16: Basis of DPM’s Productized Test Automation Service Model before going through the Schematic Representation detailed below.

Figure 2: Tests In, Results Out aspect of DPM’s Productized Test Automation Service Model
Tests In, Results Out: Figure 2 represents a scenario in which, when a new need for automated test execution is identified, Translated Tests are created by a Product or Project Team and submitted to the Test Automation Development and Engineering Team. A Translated Test is a Test Case broken down into a list of tiny UI Action and Verification steps. More information on Translated Tests is available here.
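
To make the structure of a Translated Test a little more concrete, here is a minimal Python sketch; the step kinds, field names and the login example are illustrative assumptions for this documentation, not part of the model’s own specification.

    from dataclasses import dataclass, field
    from typing import List

    # One atomic step of a Translated Test: either a UI Action (e.g. "click",
    # "type") or a Verification. Step kinds and fields are assumptions made
    # for illustration only.
    @dataclass
    class Step:
        kind: str          # "ui_action" or "verification"
        description: str   # human-readable instruction
        target: str = ""   # UI element the step refers to, if any
        value: str = ""    # input value or expected result, if any

    # A Translated Test: a Test Case broken down into a list of tiny steps.
    @dataclass
    class TranslatedTest:
        test_case_id: str
        title: str
        steps: List[Step] = field(default_factory=list)

    # A hypothetical login Test Case expressed as a Translated Test.
    login_test = TranslatedTest(
        test_case_id="TC-101",
        title="User can log in with valid credentials",
        steps=[
            Step("ui_action", "Type the user name", target="username_field", value="alice"),
            Step("ui_action", "Type the password", target="password_field", value="secret"),
            Step("ui_action", "Click the Login button", target="login_button"),
            Step("verification", "Verify that the dashboard page is displayed", target="dashboard_page"),
        ],
    )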

The Translated Test is the input part. The service delivered is the output part. On the inside, the service primarily comprises creating Composed Tests and executing those Composed Tests. On the outside, the service primarily comprises reporting the Results from Automatic Test Execution. What the end user receives are the Results. It is important to notice here that the input to the Test Automation process is received in a highly standardized form and the output is delivered as a service, again, in a highly standardized form. An appropriate set of tools and technologies, for example a Private Cloud, may be used to build the infrastructure that facilitates and automates the entire “Tests In, Results Out” process. That would further enhance the ‘scalability’ and ‘speed of communication’ aspects of this mechanism tremendously.
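
Continuing the hypothetical TranslatedTest sketch above, the “Tests In, Results Out” flow could be outlined as below; the compose and execute functions are deliberate placeholders for whatever the chosen Test Automation Tool “T” and Framework “F” would actually do.

    # A deliberately simplified sketch of "Tests In, Results Out".
    # Composition and execution are stubs; a real implementation would
    # delegate to the chosen Test Automation Tool and Framework.

    def compose_test(translated_test):
        """Turn a Translated Test into a Composed (automated) Test."""
        return {
            "test_case_id": translated_test.test_case_id,
            "script": [step.description for step in translated_test.steps],
        }

    def execute_test(composed_test):
        """Run a Composed Test and collect a result."""
        return {
            "test_case_id": composed_test["test_case_id"],
            "status": "passed",                      # placeholder outcome
            "steps_executed": len(composed_test["script"]),
        }

    def report_results(results):
        """Deliver the Results to the Service Consumer in a standardized form."""
        for result in results:
            print(f"{result['test_case_id']}: {result['status']} "
                  f"({result['steps_executed']} steps)")

    # Tests In -> Compose -> Execute -> Results Out
    incoming_tests = [login_test]                    # submitted by a Product/Project Team
    composed = [compose_test(t) for t in incoming_tests]
    results = [execute_test(c) for c in composed]
    report_results(results)                          # prints "TC-101: passed (4 steps)"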

Ingredients of ‘Time to User’: In the case of DPM’s Productized Test Automation Service Model, every transaction leading to an instance of service delivery begins with a requirement submitted in the form of a Translated Test and ends with output delivered in the form of Execution Results. Within that scope, the Product or Project Team invests time only on translating each test into a set of more atomic steps, generating a Translated Test for each test in question. And the Test Automation Development and Engineering Team invests time on, pretty much, only the following aspects:

o   Creating the Composed Test
o   Testing the Automated Test (Composed Test)
o   Automatic Test Execution and Results Reporting

All the other activities take place outside the scope of each transaction leading to an instance of service delivery mentioned above. That means those other activities do not become part of the core transaction in the service delivery process and, therefore, do not get an opportunity to negatively impact the Time to User aspect of the transaction. Moreover, those other activities are performed in the context of just one, and not “n”, instance(s) of Test Automation Infrastructure. So, the total time invested in those other activities is “(t/n)”, where “t” is the total time that would be required to perform those other activities, including all the maintenance and enhancements, if there were “n” instances of Test Automation Infrastructure to be taken care of.
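
Read purely as arithmetic, the two points above can be sketched as follows; every figure below is invented for the sake of illustration and is not a number claimed in this documentation.

    # Illustrative arithmetic only; all figures are invented for this sketch.

    # Inside one transaction (one instance of service delivery):
    translate_test       = 2.0   # hours spent by the Product/Project Team
    create_composed_test = 3.0   # hours spent by the Test Automation Dev & Eng Team
    test_composed_test   = 1.0
    execute_and_report   = 0.5
    time_to_user = translate_test + create_composed_test + test_composed_test + execute_and_report
    print(time_to_user)          # 6.5 hours, unaffected by infrastructure maintenance

    # Outside the transactions: maintenance and enhancement of the infrastructure.
    t = 1200   # total hours/year that "n" duplicated infrastructure instances would need
    n = 10     # instances (one per consumer) that would otherwise have to be maintained
    print(t / n)                 # 120.0 hours/year, because only one shared instance exists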

Figure 3: DPM’s Productized Test Automation Request Service Model
A case of Extreme Reuse: When a productized service is built and delivered around an infrastructure instead of duplicating the elements of that infrastructure, it automatically becomes a case of extreme reuse that is not difficult to understand. In the case of this Productized Service Model, the Common Infrastructure Artifacts need not be physically instantiated, replicated or repeated in any way to achieve the benefits of ‘Reuse’. Every instance of service provided through delivery of ‘Real Results’ to multiple ‘Service Consumers’ achieves the benefits of ‘Reuse’. It is important to observe that Test Automation, as a Business Process and an area of Software Engineering Practices Application, inherently lends itself to being delivered as a Productized Service.

In the case of DPM’s Productized Test Automation Service Model, a standardized Request comes in as an input and what goes out is, again, a standardized instance of Service as an output. This has been depicted in Figure 3: DPM’s Productized Test Automation Request Service Model.

Change Management and Release Management: Maintenance, Change Management, Release Management and Version Control are done strictly based on a predefined set of product-centric and productized service-centric process guidelines established in the context of DPM’s Productized Test Automation Service Model.

Figure 4: DPM’s Productized Test Automation Cost Benefit Model
A case of Extreme Cost Reduction: In our example, stated in DPTASM 16: The Basis of DPM’s Productized Test Automation Service Model, the total cost incurred by each Product or Project Team on Test Automation is “x” and the quantified total benefit derived by each Product or Project Team from Test Automation is “X” when DPM’s Productized Test Automation Service Model is not deployed in the organization. However, once this model is deployed, as depicted in Figure 4: DPM’s Productized Test Automation Cost Benefit Model, the benefit “X” to each Product or Project Team from Test Automation remains constant.

And the cost “x” incurred by each Product or Project Team on Test Automation gets reduced to (x/n), where “n” represents the number of Product or Project Teams participating in DPM’s Productized Test Automation Service Sharing Scheme, the teams behind the “n” or more software products or applications whose tests need to be, and are technically possible to be, automated using just one mainstream Test Automation Tool “T” and Test Automation Framework “F”. And the interesting part is this: the cost (x/n) keeps getting even smaller as more Product or Project Teams participate in DPM’s Productized Test Automation Service Scheme.
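
As a small numeric illustration of the (x/n) behaviour, here is a sketch; the cost figure and the team counts below are invented for the example.

    # Illustrative arithmetic only; the cost figure and team counts are invented.
    x = 100_000   # yearly Test Automation cost each team would bear on its own

    # The benefit "X" to each team stays constant; only the cost gets divided.
    for n in (1, 2, 5, 10, 20):              # participating Product or Project Teams
        print(f"n = {n}: cost per team = {x / n:,.2f}")

    # n = 1: cost per team = 100,000.00
    # n = 2: cost per team = 50,000.00
    # n = 5: cost per team = 20,000.00
    # n = 10: cost per team = 10,000.00
    # n = 20: cost per team = 5,000.00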

Author's Note: In case you have a question around DPM's Productized Test Automation Service Model, please feel free to get in touch with the author. There is, however, a possibility that the answer to your question is already available in one of the previous sections of this documentation. The author states that this model results in a reduction in cost to the extent of 20% to 70%, a 30% to 70% decrease in “Time to the End-User”, a 15% to 65% gain in productivity and an impressive 25% to 75% increase in quality. To a certain extent, these claims sound too good to be true. If I were you, I would want to know how these numbers have been arrived at. Also, you have probably noticed that there are no references cited to support these claims. Let me clarify this here. These are based on my experiences and observations, gathered since I started working as a Test Automation Practitioner in the year 2002. You don’t have to believe what I have to say. However, I would like to encourage and invite you to apply your own reasoning and experiences in validating these claims, to ascertain how this model might actually work in your own specific context. And I am sure you will be able to see the reality behind the picture that I intended to show. Thank you for your interest. I really appreciate your curiosity.
