2022
Journal Article: Simona Bernardi, Abel Gómez, José Merseguer, Diego Perez-Palacin, José I. Requeno. DICE simulation: a tool for software performance assessment at the design stage. In: Automated Software Engineering, vol. 29, pp. 36, 2022, ISSN: 1573-7535.
@article{Bernardi:AUSE:2022,
title = {DICE simulation: a tool for software performance assessment at the design stage},
author = {Simona Bernardi and Abel G\'{o}mez and Jos\'{e} Merseguer and Diego Perez-Palacin and Jos\'{e} I. Requeno},
url = {https://rdcu.be/cJ2Wt},
doi = {10.1007/s10515-022-00335-z},
issn = {1573-7535},
year = {2022},
date = {2022-03-28},
urldate = {2022-03-28},
journal = {Automated Software Engineering},
volume = {29},
pages = {36},
abstract = {In recent years, we have seen many performance fiascos in the deployment of new systems, such as the US health insurance web. This paper describes the functionality and architecture, as well as success stories, of a tool that helps address these types of issues. The tool allows assessing software designs regarding quality, in particular performance and reliability. Starting from a UML design with quality annotations, the tool applies model-transformation techniques to yield analyzable models. Such models are then leveraged by the tool to compute quality metrics. Finally, quality results, over the design, are presented to the engineer, in terms of the problem domain. Hence, the tool is an asset for the software engineer to evaluate system quality through software designs. While leveraging the Eclipse platform, the tool uses UML and the MARTE, DAM and DICE profiles for the system design and the quality modeling.},
keywords = {Data-Intensive Applications (DIA), DICE, Model-Driven Engineering (MDE), performance evaluation tools, software performance, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {article}
}
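The pipeline the abstract above describes (annotated UML design, transformation to an analyzable model, computation of metrics) can be pictured with a toy example. The Python sketch below reduces a single annotated application node to an M/M/1 queue and computes two metrics of the kind such a tool reports; the function and parameter names are illustrative assumptions, not the tool's API, and the real tool derives Petri net models from the full design.

# Toy stand-in for the analysis step: one node, annotated with an
# arrival rate and a service rate, reduced to a single M/M/1 queue.
# Illustrative only; the actual tool analyzes models derived from UML.
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Closed-form M/M/1 utilization and mean response time."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrivals outpace service")
    utilization = arrival_rate / service_rate
    mean_response_time = 1.0 / (service_rate - arrival_rate)
    return {"utilization": utilization,
            "mean_response_time_s": mean_response_time}

print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
# {'utilization': 0.8, 'mean_response_time_s': 0.5}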
Proceedings Article: Abel Gómez, Jordi Cabot, Xavier Pi. Hacia la (semi)automatización en la Industria 4.0 mediante UML y AsyncAPI. In: A. Goñi Sarriguren (Ed.): Actas de las XXVI Jornadas de Ingeniería del Software y Bases de Datos (JISBD 2022), SISTEDES, 2022.
@inproceedings{Gomez:JISBD:2022,
title = {Hacia la (semi)automatizaci\'{o}n en la Industria 4.0 mediante UML y AsyncAPI},
author = {Abel G\'{o}mez and Jordi Cabot and Xavier Pi},
editor = {A. Go\~{n}i Sarriguren},
url = {http://hdl.handle.net/11705/JISBD/2022/572},
year = {2022},
date = {2022-01-01},
urldate = {2022-01-01},
booktitle = {Actas de las XXVI Jornadas de Ingenier\'{i}a del Software y Bases de Datos (JISBD 2022)},
publisher = {SISTEDES},
abstract = {El uso y despliegue de los llamados sistemas ciberf\'{i}sicos ha calado profundamente en la industria, dando lugar a la Industria 4.0. T\'{i}picamente, las arquitecturas de la Industria 4.0 muestran un acoplamiento muy bajo entre sus componentes, siendo distribuidas, as\'{i}ncronas, y gui\'{a}ndose la comunicaci\'{o}n por eventos. Estas caracter\'{i}sticas, diferentes de las de arquitecturas que hasta ahora eran el foco de las t\'{e}cnicas de modelado, conllevan la necesidad de dotar a la Industria 4.0 de nuevos lenguajes y herramientas que permitan un desarrollo m\'{a}s eficiente y preciso. En este art\'{i}culo, proponemos el uso de UML para el modelado de este tipo de arquitecturas y una serie de transformaciones que permiten automatizar su procesamiento. M\'{a}s concretamente, presentamos un perfil UML para la Industria 4.0, as\'{i} como una transformaci\'{o}n de modelos capaz de generar una descripci\'{o}n abstracta \textemdashempleando la especificaci\'{o}n AsyncAPI\textemdash de las interfaces de programaci\'{o}n que subyacen a la arquitectura. A partir de dicha descripci\'{o}n abstracta en AsyncAPI, generamos el c\'{o}digo que da soporte a dichas interfaces de forma autom\'{a}tica.},
keywords = {AsyncAPI, Industry, Model Transformation (MT), Publish-Subscribe, UML Profiles, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {inproceedings}
}
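The transformation chain sketched in this entry ends in an AsyncAPI description of the architecture's programming interfaces. As a rough illustration of that intermediate artifact, the Python sketch below emits a minimal AsyncAPI 2.x document for one publish/subscribe channel; the channel, message, and field names are invented for the example and are not taken from the paper.

import json

# Hypothetical minimal AsyncAPI 2.x document for a single publish/subscribe
# channel, of the general kind a UML-to-AsyncAPI transformation would emit.
asyncapi_doc = {
    "asyncapi": "2.6.0",
    "info": {"title": "Plant Floor Events", "version": "1.0.0"},
    "channels": {
        "factory/line1/temperature": {
            "subscribe": {  # consumers receive temperature readings
                "message": {
                    "name": "TemperatureReading",
                    "payload": {
                        "type": "object",
                        "properties": {
                            "sensorId": {"type": "string"},
                            "celsius": {"type": "number"},
                        },
                    },
                }
            }
        }
    },
}

print(json.dumps(asyncapi_doc, indent=2))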
2019
Conference: Simona Bernardi, Juan L. Domínguez, Abel Gómez, Christophe Joubert, José Merseguer, Diego Perez-Palacin, José I. Requeno, Alberto Romeu. A Systematic Approach for Performance Assessment Using Process Mining: An Industrial Experience Report (Abstract). In: Actas de las XXIV Jornadas de Ingeniería del Software y Bases de Datos (JISBD 2019), Sistedes, Cáceres, Spain, 2019.
@conference{Bernardi:JISBD:2019,
title = {A Systematic Approach for Performance Assessment Using Process Mining: An Industrial Experience Report (Abstract)},
author = {Simona Bernardi and Juan L. Dom\'{i}nguez and Abel G\'{o}mez and Christophe Joubert and Jos\'{e} Merseguer and Diego Perez-Palacin and Jos\'{e} I. Requeno and Alberto Romeu},
editor = {Jennifer P\'{e}rez},
url = {http://hdl.handle.net/11705/JISBD/2019/019},
year = {2019},
date = {2019-09-02},
booktitle = {Actas de las XXIV Jornadas de Ingenier\'{i}a del Software y Bases de Datos (JISBD 2019)},
address = {C\'{a}ceres, Spain},
publisher = {Sistedes},
abstract = {Software performance engineering is a mature field that offers methods to assess system performance. Process mining is a promising research field applied to gain insight on system processes. The interplay of these two fields opens promising applications in the industry. In this work, we report our experience applying a methodology, based on process mining techniques, for the performance assessment of a commercial data-intensive software application. The methodology has successfully assessed the scalability of future versions of this system. Moreover, it has identified bottleneck components and replication needs for fulfilling business rules. The system, an integrated port operations management system, has been developed by Prodevelop, a medium-sized software enterprise with high expertise in geospatial technologies. The performance assessment has been carried out by a team composed of practitioners and researchers. Finally, the paper offers a deep discussion on the lessons learned during the experience, which will be useful for practitioners to adopt the methodology and for researchers to find new routes.},
keywords = {Complex Event Processing (CEP), Petri net (PN), Process Mining, Software Perfomance, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
Conference: Gwendal Daniel, Abel Gómez, Jordi Cabot. UMLto[No]SQL: Mapping Conceptual Schemas to Heterogeneous Datastores. In: 2019 13th International Conference on Research Challenges in Information Science (RCIS), pp. 215-227, IEEE, 2019, ISBN: 978-1-7281-4844-1.
@conference{Daniel:RCIS:2019,
title = {UMLto[No]SQL: Mapping Conceptual Schemas to Heterogeneous Datastores},
author = {Gwendal Daniel and Abel G\'{o}mez and Jordi Cabot},
editor = {Manuel Kolp and Jean Vanderdonckt and Monique Snoeck and Yves Wautelet},
doi = {10.1109/RCIS.2019.8877094},
isbn = {978-1-7281-4844-1},
year = {2019},
date = {2019-05-29},
booktitle = {2019 13th International Conference on Research Challenges in Information Science (RCIS)},
pages = {215--227},
publisher = {IEEE},
abstract = {The growing need to store and manipulate large volumes of data has led to the blossoming of various families of data storage solutions. Software modelers can benefit from this growing diversity to improve critical parts of their applications, using a combination of different databases to store the data based on access, availability, and performance requirements. However, while the mapping of conceptual schemas to relational databases is a well-studied field of research, there are few works that target the role of conceptual modeling in multiple and diverse data storage settings. This is particularly true when dealing with the mapping of constraints in the conceptual schema. In this paper we present the UMLto[No]SQL approach that maps conceptual schemas expressed in UML/OCL into a set of logical schemas (either relational or NoSQL ones) to be used to store the application data according to the data partition envisaged by the designer. Our mapping also covers the database queries required to implement and check the model’s constraints. UMLto[No]SQL takes care of integrating the different data stores, and provides a modeling layer that enables a transparent manipulation of the data using conceptual level information.},
keywords = {Model Partitioning, Model Persistence, Model-Driven Engineering (MDE), NoSQL, RDBMS, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
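The core idea of this entry, mapping one conceptual schema to several logical ones, can be sketched in a few lines. In the hypothetical example below, scalar attributes of a class go to a relational table while the full object doubles as a document for a NoSQL store; the class and attribute names are invented, and the real approach derives the partitioning from UML/OCL models rather than Python dictionaries.

# Illustrative mapping of one conceptual class to two datastores, in the
# spirit of UMLto[No]SQL. The split between stores is invented here.
conceptual_class = {
    "name": "Customer",
    "attributes": {"id": "Integer", "name": "String", "profile": "JSON"},
}

def to_sql_ddl(cls: dict) -> str:
    """Relational mapping: scalar attributes become typed columns."""
    types = {"Integer": "INTEGER", "String": "VARCHAR(255)"}
    cols = [f"{a} {types[t]}" for a, t in cls["attributes"].items()
            if t in types]
    return (f"CREATE TABLE {cls['name']} (\n  "
            + ",\n  ".join(cols + ["PRIMARY KEY (id)"]) + "\n);")

def to_document(cls: dict, values: dict) -> dict:
    """Document-store mapping: the whole object becomes one document."""
    return {"_type": cls["name"], **values}

print(to_sql_ddl(conceptual_class))
print(to_document(conceptual_class,
                  {"id": 1, "name": "Ada", "profile": {"tier": "gold"}}))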
Journal Article: Abel Gómez, Ricardo J. Rodríguez, María-Emilia Cambronero, Valentín Valero. Profiling the publish/subscribe paradigm for automated analysis using colored Petri nets. In: Software & Systems Modeling, vol. 18, no. 5, pp. 2973-3003, 2019, ISSN: 1619-1374.
@article{Gomez2019b,
title = {Profiling the publish/subscribe paradigm for automated analysis using colored Petri nets},
author = {Abel G\'{o}mez and Ricardo J. Rodr\'{i}guez and Mar\'{i}a-Emilia Cambronero and Valent\'{i}n Valero},
doi = {10.1007/s10270-019-00716-1},
issn = {1619-1374},
year = {2019},
date = {2019-01-22},
journal = {Software \& Systems Modeling},
volume = {18},
number = {5},
pages = {2973-3003},
abstract = {UML sequence diagrams are used to graphically describe the message interactions between the objects participating in a certain scenario. Combined fragments extend the basic functionality of UML sequence diagrams with control structures, such as sequences, alternatives, iterations, or parallels. In this paper, we present a UML profile to annotate sequence diagrams with combined fragments to model timed Web services with distributed resources under the publish/subscribe paradigm. This profile is exploited to automatically obtain a representation of the system based on Colored Petri nets using a novel model-to-model (M2M) transformation. This M2M transformation has been specified using QVT and has been integrated in a new add-on extending a state-of-the-art UML modeling tool. Generated Petri nets can be immediately used in well-known Petri net software, such as CPN Tools, to analyze the system behavior. Hence, our model-to-model transformation tool allows for simulating the system and finding design errors in early stages of system development, which enables us to fix them at these early phases and thus potentially saving development costs.},
keywords = {CPN Tools, Model Transformation (MT), Model-Driven Engineering (MDE), Petri net (PN), Publish-Subscribe, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {article}
}
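To give a feel for the kind of model the transformation in this entry produces, here is a toy token game over a publish/subscribe interaction, loosely mimicking a colored Petri net: places hold tuples acting as colored tokens and a delivery transition consumes them. It is a hand-written illustration, not the paper's QVT output, and the place and topic names are assumptions.

from collections import defaultdict

# Places hold "colored" tokens (topic, payload); the deliver transition
# fires once per matching subscription. Illustrative only; the paper
# generates real colored Petri nets for CPN Tools.
places = {"published": [], "inbox": defaultdict(list)}
subscriptions = {"sensor/temp": ["display", "logger"]}  # topic -> subscribers

def publish(topic: str, payload: object) -> None:
    places["published"].append((topic, payload))  # add a colored token

def deliver() -> None:
    """Fire the delivery transition for every published token."""
    while places["published"]:
        topic, payload = places["published"].pop(0)
        for sub in subscriptions.get(topic, []):
            places["inbox"][sub].append((topic, payload))

publish("sensor/temp", 21.5)
deliver()
print(dict(places["inbox"]))
# {'display': [('sensor/temp', 21.5)], 'logger': [('sensor/temp', 21.5)]}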
2018
Journal Article: Simona Bernardi, Juan L. Domínguez, Abel Gómez, Christophe Joubert, José Merseguer, Diego Perez-Palacin, José I. Requeno, Alberto Romeu. A systematic approach for performance assessment using process mining: An industrial experience report. In: Empirical Software Engineering, vol. 23, no. 6, pp. 3394-3441, 2018, ISSN: 1573-7616.
@article{Bernardi:EmSE:2018,
title = {A systematic approach for performance assessment using process mining: An industrial experience report},
author = {Simona Bernardi and Juan L. Dom\'{i}nguez and Abel G\'{o}mez and Christophe Joubert and Jos\'{e} Merseguer and Diego Perez-Palacin and Jos\'{e} I. Requeno and Alberto Romeu},
url = {http://rdcu.be/Jz3J},
doi = {10.1007/s10664-018-9606-9},
issn = {1573-7616},
year = {2018},
date = {2018-03-21},
journal = {Empirical Software Engineering},
volume = {23},
number = {6},
pages = {3394--3441},
abstract = {Software performance engineering is a mature field that offers methods to assess system performance. Process mining is a promising research field applied to gain insight on system processes. The interplay of these two fields opens promising applications in the industry. In this work, we report our experience applying a methodology, based on process mining techniques, for the performance assessment of a commercial data-intensive software application. The methodology has successfully assessed the scalability of future versions of this system. Moreover, it has identified bottleneck components and replication needs for fulfilling business rules. The system, an integrated port operations management system, has been developed by Prodevelop, a medium-sized software enterprise with high expertise in geospatial technologies. The performance assessment has been carried out by a team composed of practitioners and researchers. Finally, the paper offers a deep discussion on the lessons learned during the experience, which will be useful for practitioners to adopt the methodology and for researchers to find new routes.},
keywords = {DICE, Experience Report, Modeling and Analysis of Real Time and Embedded systems (MARTE), Petri net (PN), Process Mining, Simulation, Software Perfomance, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {article}
}
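The measurement step behind such an assessment can be pictured with a small example: given an event log with activity start and end times, per-activity service times are the raw material for the performance models. The sketch below uses an invented log format (case id, activity, start, end); real process mining tooling works on application logs and standard formats such as XES.

from collections import defaultdict
from statistics import mean

# Toy version of the measurement step: derive per-activity mean service
# times from an event log. Log contents are invented for illustration.
event_log = [  # (case_id, activity, start_s, end_s)
    ("c1", "ParseMessage", 0.0, 0.4),
    ("c1", "StoreEvent",   0.4, 1.9),
    ("c2", "ParseMessage", 0.1, 0.6),
    ("c2", "StoreEvent",   0.6, 2.3),
]

durations = defaultdict(list)
for _case, activity, start, end in event_log:
    durations[activity].append(end - start)

for activity, ds in durations.items():
    print(f"{activity}: mean service time {mean(ds):.2f}s over {len(ds)} events")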
2016
Conference: Abel Gómez, José Merseguer. Una herramienta para evaluar el rendimiento de aplicaciones intensivas en datos. In: Actas de las XXI Jornadas de Ingeniería del Software y Bases de Datos (JISBD 2016), SISTEDES, Salamanca, Spain, 2016.
@conference{Gomez:JISBD:2016,
title = {Una herramienta para evaluar el rendimiento de aplicaciones intensivas en datos},
author = {Abel G\'{o}mez and Jos\'{e} Merseguer},
editor = {Jes\'{u}s Garc\'{i}a Molina},
url = {http://hdl.handle.net/11705/JISBD/2016/026},
year = {2016},
date = {2016-09-13},
booktitle = {Actas de las XXI Jornadas de Ingenier\'{i}a del Software y Bases de Datos (JISBD 2016)},
publisher = {SISTEDES},
address = {Salamanca, Spain},
abstract = {Las aplicaciones intensivas en datos (AID) que usan tecnolog\'{i}as de Big Data se est\'{a}n convirtiendo en una parte importante del mercado de desarrollo de software. Sin embargo, las t\'{e}cnicas --y su automatizaci\'{o}n-- para el asesoramiento de la calidad para este tipo de aplicaciones son claramente insuficientes. El proyecto DICE H2020 tiene como objetivo definir metodolog\'{i}as y crear herramientas para desarrollar y monitorizar AID mediante t\'{e}cnicas de ingenier\'{i}a dirigida por modelos. En este art\'{i}culo presentamos un componente clave del proyecto DICE: su herramienta de simulaci\'{o}n. Esta herramienta es capaz de evaluar el rendimiento de AID simulando su comportamiento mediante modelos de redes de Petri. Como complemento, existe a disposici\'{o}n un v\'{i}deo mostrando la herramienta en http://tiny.cc/z1qzay.},
keywords = {Computer Aided Design (CASE), Data-Intensive Applications (DIA), DICE, Model-Driven Engineering (MDE), Modeling and Analysis of Real Time and Embedded systems (MARTE), Petri net (PN), Simulation, UML Profiles, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
Conference: Abel Gómez, José Merseguer, Elisabetta Di Nitto, Damian A. Tamburri. Towards a UML Profile for Data Intensive Applications. In: Proceedings of the 2nd International Workshop on Quality-Aware DevOps (QUDOS 2016), co-located with the ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA'16), pp. 18-23, ACM, New York, NY, USA, 2016, ISBN: 978-1-4503-4411-1 (Saarbrücken, Germany).
@conference{Gomez:QUDOS:2016,
title = {Towards a UML Profile for Data Intensive Applications},
author = {Abel G\'{o}mez and Jos\'{e} Merseguer and Elisabetta Di Nitto and Damian A. Tamburri},
doi = {10.1145/2945408.2945412},
isbn = {978-1-4503-4411-1},
year = {2016},
date = {2016-07-21},
booktitle = {Proceedings of the 2nd International Workshop on Quality-Aware DevOps, co-located with ACM SIGSOFT International Symposium on Software Testing and Analysis 2016 (ISSTA'16)},
pages = {18--23},
publisher = {ACM},
address = {New York, NY, USA},
series = {QUDOS 2016},
abstract = {Data intensive applications that leverage Big Data technologies are rapidly gaining market trend. However, their design and quality assurance are far from satisfying software engineers' needs. In fact, a CapGemini study shows that only 13% of organizations have achieved full-scale production for their Big Data implementations. We aim at addressing an early design and a quality evaluation of data intensive applications, being our goal to help software engineers on assessing quality metrics, such as the response time of the application. We address this goal by means of a quality analysis tool-chain. At the core of the tool, we are developing a Profile that converts the Unified Modeling Language into a domain specific modeling language for quality evaluation of data intensive applications.},
note = {Saarbr\"{u}cken, Germany},
keywords = {Computer Aided Design (CASE), Data-Intensive Applications (DIA), DICE, Model-Driven Engineering (MDE), Modeling and Analysis of Real Time and Embedded systems (MARTE), UML Profiles, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
Conference: Abel Gómez, Christophe Joubert, José Merseguer. A Tool for Assessing Performance Requirements of Data-Intensive Applications. In: Actas de las XXIV Jornadas de Concurrencia y Sistemas Distribuidos (JCSD 2016), pp. 159-169, Godel S. L., Granada, Spain, 2016, ISBN: 978-84-16478-90-3.
@conference{Gomez:JCSD:2016,
title = {A Tool for Assessing Performance Requirements of Data-Intensive Applications},
author = {Abel G\'{o}mez and Christophe Joubert and Jos\'{e} Merseguer},
editor = {Miguel J. Hornos Barranco},
url = {https://abel.gomez.llana.me/wp-content/uploads/2017/11/gomez-jcsd-2016.pdf},
isbn = {978-84-16478-90-3},
year = {2016},
date = {2016-06-15},
booktitle = {Actas de las XXIV Jornadas de Concurrencia y Sistemas Distribuidos (JCSD 2016)},
pages = {159--169},
publisher = {Godel S. L.},
address = {Granada, Spain},
abstract = {Big Data is becoming a core asset for present economy and businesses, and as such, Data-Intensive Applications (DIA) that use Big Data technologies are becoming crucial products in the software development market. However, quality assurance of such applications is still an open issue. The H2020 DICE project aims to define a quality-driven framework for developing DIA based on model-driven engineering (MDE) techniques. In this paper we present a key component of the DICE Framework, the DICE Simulation Tool. The tool is able to simulate the behavior of a DIA to assess its performance using a Petri net model. To showcase its capabilities we use the Posidonia Operations case study, a real-world scenario brought from one of our industrial partners. In addition to this paper, a video demonstrating the tool is available at http://tiny.cc/z1qzay.},
keywords = {Computer Aided Design (CASE), Data-Intensive Applications (DIA), DICE, Modeling and Analysis of Real Time and Embedded systems (MARTE), Petri net (PN), Posidonia Operations, UML Profiles, Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
2010
Conference: Abel Gómez, Isidro Ramos. Cardinality-Based Feature Modeling and Model-Driven Engineering: Fitting them Together. In: Fourth International Workshop on Variability Modelling of Software-intensive Systems (VaMoS 2010), ICB Research Report no. 37, Institut für Informatik und Wirtschaftsinformatik (ICB), Essen, Germany, 2010, ISSN: 1860-2770 (Linz, Austria).
@conference{Gomez:VaMoS:2010,
title = {Cardinality-Based Feature Modeling and Model-Driven Engineering: Fitting them Together},
author = {Abel G\'{o}mez and Isidro Ramos},
editor = {David Benavides and Don Batory and Paul Gr\"{u}nbacher},
url = {http://www.wi-inf.uni-duisburg-essen.de/FGFrank/download/icb/ICBReportNo37.pdf},
issn = {1860-2770},
year = {2010},
date = {2010-01-01},
booktitle = {Fourth International Workshop on Variability Modelling of Software-intensive Systems \textendash Proceedings},
number = {37},
publisher = {ICB Research Reports},
address = {Essen, Germany},
organization = {Institut f\"{u}r Informatik und Wirtschaftsinformatik (ICB)},
series = {VaMoS 2010},
abstract = {Feature Modeling is a technique which uses a specific visual notation to characterize the variability of product lines by means of diagrams. In this sense, the arrival of metamodeling frameworks in the Model-Driven Engineering field has provided the necessary background to exploit these diagrams (called feature models) in complex software development processes. However, these frameworks (such as the Eclipse Modeling Framework) have some limitations when they must deal with software artifacts at several abstraction layers. This paper presents a prototype that allows the developers to define cardinality-based feature models with constraints. These models are automatically translated to Domain Variability Models (DVM) by means of model-to-model transformations. Thus, such models can be instantiated, and each different instantiation is a configuration of the feature model. This approach allows us to take advantage of existing generative programming tools, query languages and validation formalisms; and, what is more, DVMs can play a key role in MDE processes as they can be used as inputs in complex model transformations.},
note = {Linz, Austria},
keywords = {Feature Modeling (FM), Model-Driven Architecture (MDA), Model-Driven Engineering (MDE), Object Constraint Language (OCL), Query/View/Transformation (QVT), Software Product Lines (SPL), Unified Modeling Language (UML)},
pubstate = {published},
tppubtype = {conference}
}
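The central notion of this last entry, cardinality-based features, can be pictured with a small configuration check: each feature carries an allowed occurrence interval and a configuration is valid when every count falls inside its interval. The feature names and the check below are invented for illustration; the prototype described above instead translates feature models into Domain Variability Models and validates configurations with OCL.

# Minimal sketch of a cardinality-based feature model and a configuration
# check. Feature names and bounds are hypothetical examples.
feature_model = {
    # feature -> (min, max) occurrences allowed in a configuration
    "Screen": (1, 1),   # mandatory, exactly one
    "Camera": (0, 2),   # optional, up to two (cardinality-based)
    "GPS":    (0, 1),   # optional
}

def is_valid(configuration: dict) -> bool:
    """A configuration maps each feature to its number of instances."""
    return all(lo <= configuration.get(f, 0) <= hi
               for f, (lo, hi) in feature_model.items())

print(is_valid({"Screen": 1, "Camera": 2}))  # True
print(is_valid({"Camera": 1}))               # False: mandatory Screen missing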