Performance Testing for Cloud Computing with Dependent Data Bootstrapping

Sen He, Tianyi Liu, Palden Lama, Jaewoo Lee, In Kee Kim, Wei Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

To effectively utilize cloud computing, cloud practice and research require accurate knowledge of the performance of cloud applications. However, due to random performance fluctuations, obtaining accurate performance results in the cloud is extremely difficult. To handle this random fluctuation, prior research on cloud performance testing relied on a non-parametric statistical tool called bootstrapping to design stop criteria. In this paper, however, we show that the basic bootstrapping employed by prior work overlooks the internal dependency within cloud performance test data, which leads to inaccurate performance results. We then present Metior, a novel automated cloud performance testing methodology designed around the statistical tools of block bootstrapping, the law of large numbers, and autocorrelation. These statistical tools allow Metior to properly account for the internal dependency within cloud performance test data; they also provide better coverage of cloud performance fluctuation and reduce testing cost. Experimental evaluation on two public clouds showed that 98% of Metior's tests provided performance results with less than 3% error. Metior also significantly outperformed existing cloud performance testing methodologies in both accuracy and cost, with up to a 14% increase in the accurate test count and up to a 3.1x reduction in testing cost.
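The abstract's key point is that a basic (i.i.d.) bootstrap mis-handles autocorrelated performance measurements, whereas a block bootstrap resamples contiguous runs of data and so preserves short-range dependency. The paper's actual Metior algorithm is not spelled out here; the following is only a minimal illustrative sketch of a moving-block bootstrap confidence interval for mean latency, with all function names, parameters (e.g., block_len), and the synthetic AR(1) "latency" data being assumptions for illustration.

```python
import numpy as np

def block_bootstrap_ci(samples, block_len=20, n_resamples=2000, ci=0.95, seed=None):
    """Confidence interval for the mean of an autocorrelated series via the
    moving-block bootstrap: resample overlapping blocks of length block_len
    instead of individual points, preserving local dependency."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    n_blocks = int(np.ceil(n / block_len))
    starts = np.arange(n - block_len + 1)          # all overlapping block starts
    means = np.empty(n_resamples)
    for i in range(n_resamples):
        chosen = rng.choice(starts, size=n_blocks, replace=True)
        resample = np.concatenate([samples[s:s + block_len] for s in chosen])[:n]
        means[i] = resample.mean()
    lo, hi = np.quantile(means, [(1 - ci) / 2, 1 - (1 - ci) / 2])
    return samples.mean(), (lo, hi)

# Illustration only: autocorrelated latency measurements from an AR(1) process.
rng = np.random.default_rng(0)
latencies = np.empty(500)
latencies[0] = 100.0
for t in range(1, 500):
    latencies[t] = 100 + 0.8 * (latencies[t - 1] - 100) + rng.normal(0, 5)

mean, (lo, hi) = block_bootstrap_ci(latencies, block_len=25, seed=1)
print(f"mean latency ~ {mean:.1f} ms, 95% CI ~ [{lo:.1f}, {hi:.1f}] ms")
```

Because neighboring cloud measurements taken under the same transient contention tend to be correlated, an i.i.d. bootstrap on the same data would typically report an overly narrow interval, which is the failure mode the paper attributes to prior stop criteria.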

Original language: English (US)
Title of host publication: Proceedings - 2021 36th IEEE/ACM International Conference on Automated Software Engineering, ASE 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 666-678
Number of pages: 13
ISBN (Electronic): 9781665403375
DOIs
State: Published - 2021
Externally published: Yes
Event: 36th IEEE/ACM International Conference on Automated Software Engineering, ASE 2021 - Virtual, Online, Australia
Duration: Nov 15 2021 – Nov 19 2021

Publication series

Name: Proceedings - 2021 36th IEEE/ACM International Conference on Automated Software Engineering, ASE 2021

Conference

Conference: 36th IEEE/ACM International Conference on Automated Software Engineering, ASE 2021
Country/Territory: Australia
City: Virtual, Online
Period: 11/15/21 – 11/19/21

Keywords

  • Cloud computing
  • Non-parametric statistics
  • Performance testing
  • Resource contention
  • Single point of estimations

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Safety, Risk, Reliability and Quality
  • Control and Optimization
