Search : [ keyword: cloud computing (클라우드 컴퓨팅) ] (12)

Service Level Agreement Specification Model of Software and Its Mediation Mechanism for Cloud Service Broker

Taewoo Nam, Keunhyuk Yeom

http://doi.org/

An SLA (Service Level Agreement) is an essential factor that must be guaranteed to provide reliable and consistent service to users in a cloud computing environment. In particular, a contract between the user and the service provider based on an SLA is important in environments that use a cloud service brokerage. Cloud computing is classified into IaaS, PaaS, and SaaS according to the type of IT resources the service provides. Existing SLAs have difficulty reflecting the quality factors of software, because they consider only factors of the physical network environment and offer no methodological approach. In this paper, we suggest a method for specifying the quality characteristics of software and propose a mechanism and structure for exchanging SLA specifications between the service provider and the consumer. We define a meta-model for SLA specification at the SaaS level, and the quality requirements of SaaS are described with the proposed specification language. Through case studies, we verify that the proposed specification language can express a variety of software quality factors. Using a UDDI-based mediation process and architecture to interchange these specifications, they are stored in a repository of quality specifications and exchanged at service binding time.
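
The abstract does not reproduce the proposed specification language itself, so the following is only a minimal sketch of the idea it describes: SaaS-level quality requirements written as explicit terms that a broker can compare when the provider's and the consumer's SLA specifications are exchanged at binding time. All class, field, and metric names below are illustrative assumptions, not the paper's notation.

    # Minimal sketch (not the paper's actual specification language): a SaaS-level
    # SLA expressed as quality-factor terms, plus the kind of matching a broker
    # could perform at service binding time. All names here are assumptions.
    from dataclasses import dataclass

    @dataclass
    class QualityTerm:
        metric: str        # e.g. "availability", "responseTime"
        operator: str      # ">=", "<=", or "=="
        value: float
        unit: str

    def satisfies(offer: QualityTerm, demand: QualityTerm) -> bool:
        """True if a provider's offered term meets a consumer's demanded term."""
        if offer.metric != demand.metric or offer.unit != demand.unit:
            return False
        if demand.operator == ">=":
            return offer.value >= demand.value
        if demand.operator == "<=":
            return offer.value <= demand.value
        return offer.value == demand.value

    # Hypothetical SaaS SLA terms exchanged between provider and consumer.
    provider_offer = [QualityTerm("availability", ">=", 99.9, "percent"),
                      QualityTerm("responseTime", "<=", 200.0, "ms")]
    consumer_demand = [QualityTerm("availability", ">=", 99.5, "percent"),
                       QualityTerm("responseTime", "<=", 300.0, "ms")]

    matched = all(any(satisfies(o, d) for o in provider_offer) for d in consumer_demand)
    print("SLA match at binding time:", matched)   # True

In the paper's architecture, such specifications would be registered in a UDDI-based quality-specification repository and resolved through the broker rather than compared in a single process; the in-memory matching above only stands in for that mediation step.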

Distributed Table Join for Scalable RDFS Reasoning on Cloud Computing Environment

Wan-Gon Lee, Je-Min Kim, Young-Tack Park

http://doi.org/

A knowledge service system needs to infer new knowledge from the knowledge it holds in order to provide effective services, and most knowledge service systems express that knowledge in terms of ontologies. Because the volume of knowledge information in the real world is becoming massive, effective techniques for handling massive ontology data are drawing attention. This paper presents methods for inferring massive ontology data at the RDFS level on a cloud computing environment and evaluates their performance. The RDFS inference approaches proposed in this paper are a MapReduce method based on RDFS meta tables and a method that uses only cloud computing memory, without MapReduce, in a distributed file computing environment. The paper explains the inference system structure of each technique, the construction of the meta tables according to the RDFS inference rules, and the algorithms of the inference strategies. To evaluate the suggested methods, we perform experiments with the LUBM data sets, a standard benchmark for evaluating ontology inference and search speed. For LUBM6000, the meta-table-based RDFS inference technique required 13.75 minutes (inferring 1,042 triples per second) to complete the total inference, whereas the method using cloud computing memory needed 7.24 minutes (inferring 1,979 triples per second), about twice as fast.
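
As a rough illustration of the table-join idea in the abstract (and not the authors' MapReduce or in-memory distributed implementation), the sketch below applies the RDFS subclass entailment rule rdfs9 by joining a small schema "meta table" of rdfs:subClassOf statements against rdf:type instance triples; the class and instance names are made-up, LUBM-like placeholders.

    # Minimal sketch of the table-join idea behind RDFS rule rdfs9
    # ((s rdf:type C1) + (C1 rdfs:subClassOf C2) => (s rdf:type C2)),
    # not the authors' distributed MapReduce / in-memory implementation.
    from collections import defaultdict

    # "Meta table": schema triples (rdfs:subClassOf), small enough to be
    # replicated to every worker in a distributed setting.
    sub_class_of = defaultdict(set)
    sub_class_of["GraduateStudent"].add("Student")   # hypothetical LUBM-like classes
    sub_class_of["Student"].add("Person")

    def closure(meta):
        """Transitive closure of the subClassOf meta table (computed once, up front)."""
        changed = True
        while changed:
            changed = False
            for c, supers in list(meta.items()):
                new = set()
                for s in supers:
                    new |= meta.get(s, set()) - supers
                if new:
                    supers |= new
                    changed = True
        return meta

    # Instance triples (s, rdf:type, C); in the paper these form the large,
    # partitioned side of the join.
    type_triples = [("ex:alice", "GraduateStudent"), ("ex:bob", "Student")]

    meta = closure(sub_class_of)
    inferred = {(s, sup) for s, c in type_triples for sup in meta.get(c, set())}
    print(inferred)
    # three inferred rdf:type triples: ex:alice->Student, ex:alice->Person, ex:bob->Person

The point of the split is that the schema meta table stays small and can be held by every node, while the instance triples are partitioned; the single-process version above only illustrates the rule application itself.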


Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal
