Auto test generator: a framework to generate test cases from requirements in natural language

Bibliographic details
Year of defense: 2019
Main author: PINA, Thaís Melise Lopes (Lattes: http://lattes.cnpq.br/7648955393706189)
Advisor: SAMPAIO, Augusto Cezar Alves (Lattes: http://lattes.cnpq.br/3977760354511853)
Co-advisor: BARROS, Flávia de Almeida (Lattes: http://lattes.cnpq.br/5390541720896559)
Examining committee: Not informed by the institution
Document type: Master's dissertation
Access type: Open access
License: Attribution-NonCommercial-NoDerivs 3.0 Brazil (http://creativecommons.org/licenses/by-nc-nd/3.0/br/)
Language: English
Degree-granting institution: Universidade Federal de Pernambuco (UFPE)
Graduate program: Programa de Pos Graduacao em Ciencia da Computacao
Department: Not informed by the institution
Country: Brazil
Keywords (Portuguese): Engenharia de software; Especificação de requisitos; Linguagem natural controlada
Access link: https://repositorio.ufpe.br/handle/123456789/33916
Abstract: Testing is essential in the software development process. However, it is also one of the most costly tasks, so test automation has become the goal of much research. Since the test design, implementation, and execution phases depend substantially on the system requirements, it is of the utmost importance that requirements text be standardized and clear. However, most companies write these documents in free natural language, which entails (lexical and structural) ambiguity and gives rise to different interpretations. One option to mitigate this problem is the use of a Controlled Natural Language (CNL), aiming at standardized and accurate texts. A CNL is a subset of a natural language that uses a lexicon restricted to a particular domain and follows grammatical rules that guide the elaboration of sentences, thus reducing ambiguity and enabling mechanized processing, such as the automatic generation of test cases from CNL requirements. This work, in the software testing area, presents the Auto Test Generator (ATG), a tool that assists the writing of requirements and the automatic generation of test cases written in English, which are then automatically translated into test scripts using an automation framework. From a requirement written in the CNL, the ATG creates a Use Case (UC). Thanks to the standardization of the language, it is possible to perform a consistency and dependency analysis for each UC step, using a graph of associations (dependencies and cancellations) between test actions. Test cases are generated from the UCs automatically and transparently to the user. The ATG was developed and evaluated in partnership with Motorola Mobility. Experimental evaluations were performed: from the seven requirements analyzed, 34 test cases were created in total. The generated test cases comprised 151 steps, which were passed to Zygon (a proprietary test automation tool) to be automated. As a result, 131 test steps were correctly automated (86% of the total given as input).
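To illustrate the kind of consistency and dependency analysis described in the abstract, the sketch below shows, in Python, how a graph of associations (dependencies and cancellations) between test actions could be checked against a sequence of UC steps. This is only an illustrative sketch with hypothetical action names and data structures; it is not the ATG implementation, whose code is not part of this record.

# Illustrative sketch only (hypothetical action names); not the ATG code.
from dataclasses import dataclass, field

@dataclass
class ActionGraph:
    # depends_on[a]: actions that must have happened before action 'a'.
    # cancels[a]: actions whose effect is undone when 'a' is performed.
    depends_on: dict = field(default_factory=dict)
    cancels: dict = field(default_factory=dict)

    def add_dependency(self, action, prerequisite):
        self.depends_on.setdefault(action, set()).add(prerequisite)

    def add_cancellation(self, action, cancelled):
        self.cancels.setdefault(action, set()).add(cancelled)

    def check(self, steps):
        """Return inconsistency messages for a sequence of test steps."""
        satisfied = set()
        problems = []
        for i, step in enumerate(steps, start=1):
            for pre in self.depends_on.get(step, set()):
                if pre not in satisfied:
                    problems.append(f"step {i} '{step}' requires '{pre}' first")
            satisfied.add(step)
            # A cancellation undoes the effect of earlier actions.
            satisfied -= self.cancels.get(step, set())
        return problems

if __name__ == "__main__":
    graph = ActionGraph()
    graph.add_dependency("send message", "open chat app")
    graph.add_cancellation("close chat app", "open chat app")

    print(graph.check(["open chat app", "send message", "close chat app"]))  # []
    print(graph.check(["open chat app", "close chat app", "send message"]))
    # -> ["step 3 'send message' requires 'open chat app' first"]

In a pipeline of this general shape, a step ordering that violates a dependency, or that relies on an action already cancelled, would be flagged before the generated test case is handed to the automation back end.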