Conference Information
RE 2025: International Requirements Engineering Conference
https://conf.researchr.org/home/re-2025
Submission Date:
2025-03-03
Notification Date:
2025-05-23
Conference Date:
2025-09-01
Location:
Valencia, Spain
Edition:
33
CCF: b   CORE: a   QUALIS: a1   Views: 29348   Followers: 29   Attendees: 10

Call for Papers
RE 2025 welcomes original papers focusing on traditional RE topics, such as requirements elicitation, analysis, prioritisation, documentation, validation, evolution, and maintenance. In addition, this year we particularly encourage submissions addressing the theme “Future-proofing Requirements Engineering”. This theme focuses on innovating requirements engineering by embracing AI, DevOps, sustainability, security, personalization, and agile practices. It aims to equip professionals with the tools and methodologies needed to address the evolving challenges and opportunities in software development, ensuring robust, user-centric, and adaptable systems.

Categories of Research Papers

The RE 2025 Research Track invites original submissions of research papers in two categories: Solution-focused papers and Evaluation-focused papers.

Solution-focused Papers present novel or significantly improved solutions for requirements-related problems. This includes new approaches or theories, novel tools, modelling languages, infrastructures, or other technologies. All requirements-related activities, such as elicitation, prioritisation, or analysis, are in scope. These papers are mainly evaluated based on the significance of the problem addressed, the novelty of the solution in comparison with existing work, clarity of presentation, technical soundness, and evidence of its benefits. A solution-focused paper does not require a thorough validation, but a preliminary evaluation is expected that shows the effectiveness, ease of use, or other relevant quality attributes of the proposed solution.

Evaluation-focused Papers empirically assess phenomena, theories or real-world artefacts (e.g., methods, techniques, or tools) relevant to requirements engineering. These papers apply empirical software engineering approaches, such as experiments, experimental simulations, case studies, surveys, systematic literature reviews, and others to report on qualitative and/or quantitative data, findings and results. The discussion of lessons learned can complement the empirical results. The evaluation criteria for these papers focus on the soundness of the research questions, the appropriateness and correctness of the study design and data analysis, and considerations of threats to validity. Replication studies are welcome.

Review Criteria

Each category of paper has its own review criteria, which reviewers will use for evaluation. Authors are encouraged to study these criteria as well. We also encourage them to read the paper “The ABC of Software Engineering Research” by Klaas-Jan Stol and Brian Fitzgerald, available in Open Access (https://dl.acm.org/doi/10.1145/3241743), which highlights the inherent limitations of each study type. This is to guide the authors in their study design, and to help reviewers determine which aspects of the study design are open to criticism and which are not.

Review Criteria: Solution-focused Papers

• Novelty: to what extent is the proposed solution novel with respect to the state of the art? To what extent is related work considered? To what extent did the authors clarify their contribution?

• Potential Impact: is the potential impact on research and practice clearly stated? Is the potential impact convincing? Has the proposed solution been preliminarily evaluated to show its potential impact (effectiveness, ease of use, or other relevant quality attributes of the proposed solution)?

• Soundness: has the novel solution been developed following a well-motivated approach? Are the design or methodological choices of the proposed solution justified? Did the authors clearly state the research questions? Does the preliminary evaluation of the solution use rigorous and appropriate research methods? Are the conclusions of the preliminary evaluation logically derived from the data? Did the authors discuss the limitations of the proposed solution? Did the authors discuss the threats to validity of the preliminary evaluation?

• Verifiability: did the authors provide guidelines on how to reuse their artifacts and replicate their results? Did the authors share their software, if any? Did the authors share their data?

• Presentation: is the paper clearly presented and well-structured? To what extent can the content of the paper be understood by the general RE public? If highly technical content is presented, did the authors make an effort to also summarise their proposal in an intuitive way?

Review Criteria: Evaluation-focused Papers

• Novelty: to what extent is the study novel with respect to the related literature? To what extent is related literature considered? To what extent did the authors clarify their contribution? To what extent does the study contribute to extending the body of knowledge in RE?

• Potential Impact: is the potential impact on research and practice clearly stated? Is the potential impact convincing? Was the study carried out in a representative setting?

• Soundness: Are the research methods justified? Are the research methods adequate for the problem at hand? Did the authors clearly state the research questions, data collection, and analysis? Are the conclusions of the evaluation logically derived from the data? Did the authors discuss the threats to validity?

• Verifiability: did the authors provide guidelines on how to reuse their artifacts and replicate their results? Did the authors share their software? Did the authors share their data?

• Presentation: is the paper clearly presented and well-structured? To what extent can the content of the paper be understood by the general RE public? If highly technical content is presented, did the authors make an effort to also summarise their study in an intuitive way?

Open Science Policy

The RE 2025 Research Track has an open science policy with the steering principle that all research results should be accessible to the public and, if possible, empirical studies should be reproducible. In particular, we actively support the adoption of open data and open source principles and encourage all contributing authors to disclose (anonymized and curated) data to increase reproducibility and replicability. Note that sharing research data is not mandatory for submission or acceptance. However, sharing is expected to be the default, and non-sharing needs to be justified. We recognize that reproducibility or replicability is not a goal in qualitative research and that, similar to industrial studies, qualitative studies often face challenges in sharing research data. For guidelines on how to report qualitative research to ensure the assessment of the reliability and credibility of research results, see the Q&A page. Upon submission to the research track, authors are asked:

• to make their data available to the program committee (via upload of supplemental material or a link to an anonymous repository) – and provide instructions on how to access this data in the paper; or
• to include in the paper an explanation as to why this is not possible or desirable; and
• to indicate if they intend to make their data publicly available upon acceptance.

Supplementary material can be uploaded via the EasyChair site or anonymously linked from the paper submission. Although PC members are not required to look at this material, we strongly encourage authors to use supplementary material to provide access to anonymized data, whenever possible. Authors are asked to carefully review any supplementary material to ensure it conforms to the double-anonymous policy (see submission instructions). For example, code and data repositories may be exported to remove version control history, scrubbed of names in comments and metadata, and anonymously uploaded to a sharing site to support review. One resource that may be helpful in accomplishing this task is this blog post.
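
As an illustration only (this is not an official RE 2025 tool, and all names and paths below are hypothetical), the anonymization steps described above can be sketched in Python: copy a repository snapshot without its version-control history, scrub author names from text files, and pack the result into an archive for anonymous upload.

    import shutil
    import zipfile
    from pathlib import Path

    # Hypothetical values; replace with your own repository path and author names.
    AUTHOR_NAMES = ["Alice Example", "Bob Example"]   # strings to scrub from files
    SRC = Path("my-re2025-artifact")                  # original repository checkout
    DST = Path("anonymized-artifact")                 # anonymized snapshot to upload

    def anonymize_repo(src: Path, dst: Path) -> None:
        # 1. Export a snapshot without version-control history (drop the .git folder).
        shutil.copytree(src, dst, ignore=shutil.ignore_patterns(".git"))
        # 2. Scrub author names from text files (comments, metadata, READMEs).
        for path in dst.rglob("*"):
            if not path.is_file():
                continue
            try:
                text = path.read_text(encoding="utf-8")
            except (UnicodeDecodeError, PermissionError):
                continue  # skip binary or unreadable files
            for name in AUTHOR_NAMES:
                text = text.replace(name, "ANONYMIZED")
            path.write_text(text, encoding="utf-8")
        # 3. Pack the scrubbed snapshot for upload to an anonymous sharing site.
        with zipfile.ZipFile(dst.with_suffix(".zip"), "w", zipfile.ZIP_DEFLATED) as zf:
            for path in dst.rglob("*"):
                zf.write(path, path.relative_to(dst))

    if __name__ == "__main__":
        anonymize_repo(SRC, DST)

A script like this cannot catch everything: file names, commit messages, and embedded documents may still leak identities, so a manual pass over the snapshot remains necessary before uploading.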

Artifacts

The authors of accepted papers will have the opportunity to increase the visibility of their artifacts (software and data) and to obtain an artifact badge. Upon acceptance, the authors can submit their artifacts, which will be evaluated by a committee that determines their sustained availability and reusability.
Last updated by Dou Sun on 2024-09-16
Acceptance Rate
Year | Submitted | Accepted | Accepted (%)
2012 | 106 | 26 | 24.5%
2011 | 138 | 23 | 16.7%
2010 | 153 | 27 | 17.6%
2009 | 131 | 25 | 19.1%
2008 | 164 | 38 | 23.2%
2007 | 172 | 29 | 16.9%
2006 | 181 | 42 | 23.2%
2005 | 175 | 35 | 20%
2004 | 99 | 27 | 27.3%
2003 | 135 | 25 | 18.5%
2002 | 192 | 33 | 17.2%
2001 | 127 | 26 | 20.5%
1999 | 80 | 19 | 23.8%
1997 | 76 | 21 | 27.6%
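
As a purely illustrative check (not part of the original listing), the Accepted (%) column is simply the ratio of accepted to submitted papers, e.g. for the 2012 row:

    # Illustrative only: how the Accepted (%) column is derived for the 2012 row.
    submitted, accepted = 106, 26
    rate = 100 * accepted / submitted
    print(f"{rate:.1f}%")  # -> 24.5%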
Related Conferences
CCF | CORE | QUALIS | Abbrev | Full Name | Submission Date | Notification Date | Conference Date
- | - | - | HEART | International Symposium on Highly Efficient Accelerators and Reconfigurable Technologies | 2022-03-15 | 2022-04-25 | 2022-06-09
- | - | - | ICCAIS | International Conference on Control, Automation and Information Sciences | 2014-10-19 | 2014-11-02 | 2014-11-05
- | b | b2 | ICEBE | International Conference on e-Business Engineering | 2023-08-11 | 2023-09-15 | 2023-11-04
- | - | - | TMCM | International Conference on Test, Measurement and Computational Method | 2017-05-11 | - | 2017-05-21
- | - | - | ICIEI | International Conference on Information and Education Innovations | 2022-02-05 | 2022-02-25 | 2022-04-14
- | - | - | SGSES | Asia Conference on Smart Grids and Sustainable Energy Systems | 2024-10-18 | 2024-10-25 | 2024-11-06
- | - | - | ADAPTIVE | International Conference on Adaptive and Self-Adaptive Systems and Applications | 2021-02-05 | 2021-02-28 | 2021-04-18
- | - | - | SOTICS | International Conference on Social Media Technologies, Communication, and Informatics | 2024-06-17 | 2024-08-04 | 2024-09-29
c | b | b1 | ICWE | International Conference on Web Engineering | 2024-01-26 | 2024-03-22 | 2024-06-17
- | - | - | CyCon | International Conference on Cyber Conflict | 2014-10-01 | 2014-10-31 | 2015-06-03