Conference Information
HRI 2024: International Conference on Human-Robot Interaction
https://humanrobotinteraction.org/2024/
Submission Date: 2023-09-29
Notification Date: 2023-12-01
Conference Date: 2024-03-11
Location: Boulder, Colorado, USA
Years: 19
QUALIS: a2

Call For Papers
The ACM/IEEE International Conference on Human-Robot Interaction is a premier, highly-selective venue presenting the latest advances in Human-Robot Interaction. The 19th Annual HRI conference theme is “HRI in the real world.” The conference seeks contributions from a broad set of perspectives, including technical, design, behavioral, theoretical, and methodological, that advance fundamental and applied research in human-robot interaction. Full papers will be archived in the ACM Digital Library.

Submissions are open and can be made at new.precisionconference.com/submissions

The HRI 2024 conference has five themes: Theory and Methods, Design, Technical, Systems, and User Studies. Each of these themes is aligned with a specific type of knowledge contribution. Authors are strongly encouraged to read through the track descriptions, as the assignment of topics to tracks and the philosophy of how work will be evaluated within each track (especially the Technical track) have changed substantially from prior years.

Theory and Methods
The primary contribution for the Theory and Methods track is to further the conceptual foundations of HRI. This work helps us to expand upon the ways we think about human-robot interaction (theory) and the ways we engage in doing human-robot interaction research (methods). A key benefit of theory and methods work lies not only in articulating what we know but also in generating new ways of seeing the field and posing new questions.
Unlike in previous years, this track will not include reproducibility. Reproducibility studies should be directed to the User Studies track. Theory papers focus on elucidating or connecting fundamental HRI principles beyond individual interfaces or projects, new theoretical concepts in HRI, literature reviews, etc. Such contributions may include meta-analyses of previous findings, narrative or philosophical arguments introducing theoretical or philosophical concepts, detailing underlying interaction paradigms, or providing new interpretations of previously known results.
Methods papers may include new ways of studying HRI, with a focus on developing novel evaluation methodologies (e.g., new questionnaires) or on analyzing existing research and methods derived from original or surveyed empirical research.
Successful papers in this track will clearly detail how they transform our current fundamental understanding of human-robot interaction and why the work is significant and has potential for impact. As appropriate, work must be defended by clear and sound arguments and a thorough reflective analysis of the contribution with respect to the existing state of the art.

We provide the following guidelines to authors about expectations for papers submitted to the theory and methods track:
1. To the extent that methods papers pertain to new metrics or methods of evaluation, they might include studies; however, the focus of the paper should be on evaluating the novelty and contribution of the concepts and methods rather than on generalizable knowledge, which is typical of “user studies” contributions.
2. Theory papers are not expected to have user studies.
3. Methods for generating design belong in the Design Track; methods for measuring or studying HRI belong in the Theory and Methods track.
4. If the contribution is a clarification or improvement of an existing theory or method, the paper likely belongs in the Short Contributions track.

Design
The Design track brings together design-centric research contributions to human-robot interaction. This includes novel design approaches; the design of new robot morphologies and appearances, behavior paradigms, or interaction techniques; and understanding new contexts for interaction that yield unique or improved interaction experiences with, or abilities for, robots. This year, the Design track also explicitly includes papers focusing on design solutions, which describe an outcome in rigorous detail, as well as papers focusing on design-driven methodologies, such as participatory design, sustainable design, and design for manufacturability, which affect the manner in which designers approach HRI design. Research on the design process itself, or proposing new design products, strategies, frameworks, or models relevant to human-robot interaction, belongs in the Design track. Submissions must fully describe their design outcomes or process to enable detailed review and replication of the design process. Further, successful papers will have evaluation appropriate to the work, for example end-user evaluation or a critical reflection on the design process or methodology.

Technical
The primary knowledge contribution of papers submitted to the technical track is expected to be a novel algorithm (formalized through pseudocode), mathematical model (formalized through a set of equations), hardware element, or human-robot interface, and should provide enough detail to allow reproducibility. Evaluations for technical papers should be suitable to the type of contribution and do not need to include user studies unless appropriate.
We provide the following guidelines to authors about expectations for papers submitted to the technical track:

1. If a submitted paper contains a user study to empirically evaluate or demonstrate the primary knowledge contribution, the authors should make clear in the paper why it is indeed a “technical” contribution rather than a “studies” contribution.
2. If the novelty or main contribution of a paper lies in the integration of disparate hardware or software capabilities (e.g., a virtual/augmented/mixed-reality HRI system or a cognitive architecture), the paper likely belongs in the Systems track.
3. If the novelty or main contribution of a paper lies in the development of a code-based implementation of an existing technical contribution, rather than in the algorithm or model itself, the paper likely belongs in the Short Contributions track.

Technical work may be evaluated in a variety of ways, non-exhaustively including proofs of correctness, proof-of-concept demonstrations, evaluation in simulation (for example with a Wizard-of-Oz approach), or user studies. It is critical to ensure that the claims made by the paper are backed up by the evaluation. This means that while a well-designed and appropriate user study with a deployed robot may be advantageous, it is not necessary. Further, a poorly designed and inappropriate user study presented in a technical paper may be worse than including no user study at all. However:

1. Authors of submitted papers whose evaluation does not support the paper’s claims (e.g., strong claims of an algorithm’s effect backed up only by a proof-of-concept demonstration or an underpowered user study) might be asked to revise their claims.
2. Authors whose paper does not contain a demonstration or study of a physical robot interacting with humans should provide a “pathway to deployment” describing what would need to be done to reach successful deployment onto a physical platform. Such papers would still need either an analytical or a computational/empirical evaluation.

Systems
Because HRI is by its nature an integrative discipline, the Systems track focuses on contributions that consist of a synthesis of underlying techniques to achieve system-level HRI behavior. Generally speaking, a Systems paper is one whose contribution is best observed and measured through the performance of an integrated system, usually including a robot, rather than through component-level testing. This could involve novel system design or the integration of novel techniques or components to enable new system-level functionality. The contribution of a Systems track paper could include demonstration of the usefulness of a component or technique in the larger system, achievement of new system-level capabilities, or enhanced system performance. Authors should emphasize the novelty and significance of their contribution. In particular, authors should highlight what wider systems knowledge, best practices, lessons learned, etc. can be derived from their work. Papers’ evaluations should be appropriate to the systems contribution being presented, potentially including demonstrations based on scenarios or case studies, user studies, or technical performance evaluations. Similar to the Technical track, a well-designed user study can help support a paper’s claims, but is not strictly necessary for acceptance. Please note that:

1. If the primary contribution of a paper is an algorithm or a mathematical model, or is best measured via component-level testing rather than a system-level evaluation, then the paper likely belongs in the Technical track.
2. If the paper’s contribution is about better understanding human behavior or preferences, the paper may belong in the User Studies track.
3. If the contribution is limited to a description of a system design or dataset, the paper likely belongs in the Short Contributions track.

User Studies
The primary knowledge contribution of papers submitted to the User Studies track is expected to be new knowledge about human-robot interactions, based on a study conducted with people. These studies may take a variety of forms, including needfinding studies and other ethnographic and qualitative studies; exploratory, theory-building, or replicative laboratory or field studies; industry case studies; and so forth. These studies may use autonomous robots or Wizard-of-Oz control, or in some cases may not involve a physical robot at all. What is most important is the size of the knowledge contribution the study produces for the human-robot interaction community. Because the primary knowledge contribution in this track is expected to be knowledge from a study with humans:

1. If a submitted paper contains an evaluation of an implemented system, it should be clear why the insights from the study with humans are the primary knowledge contribution rather than the system itself, and thus why the paper belongs in the User Studies track rather than the Technical or Systems track.
2. If the paper’s main contribution is the creation of a new survey measure or other scientific tool, the paper likely belongs in the Theory and Methods track.
3. If the paper’s main contribution is a design process, or the description of a series of design steps used to design a robotic system, then the paper likely belongs in the Design track. For example, if an author performs a series of co-design workshops, the resulting paper would belong in the User Studies track if its contribution focused on the scientific insights gleaned from the analysis of those workshops, but would belong in the Design track if its contribution focused on the quality of the design process itself.
4. If the paper’s main contribution is the dataset rather than the experimental results themselves, the paper likely belongs in the Short Contributions track.

Short Contributions (Datasets and Code)
To foster Open Science best practices, the HRI conference is accepting Short Contributions (4 pages, excluding references) focused on code and dataset sharing. As with HRI full papers, HRI Short Contributions will be fully and rigorously reviewed, and will be archived in the ACM Digital Library and IEEE Xplore.
These submissions will be centered around a software package or dataset of significance for the HRI community. The software or dataset must be open-source or open-access at the time of review, and will be assessed for documentation and accessibility as part of the review process. In specific cases (e.g., a dataset containing privacy-sensitive data), the code or dataset might not be directly downloadable; in such cases, the authors must outline in their submission reasonable steps for other researchers to access the data (e.g., requesting specific ethical approval). Note that software or datasets that cannot be shared due to intellectual property issues, including proprietary licensing, cannot be submitted.
Last updated by Dou Sun on 2023-09-20
Related Conferences
Rank (CCF/CORE/QUALIS) | Short | Full Name | Submission | Notification | Conference
 | ICICCT' | International Conference on Inventive Communication and Computational Technologies | 2018-03-20 | 2018-03-30 | 2018-04-20
 | EMET | International Conference on Energy Material and Energy Technology | 2024-04-20 | | 2024-11-18
b | NSS | International Conference on Network and System Security | 2023-03-03 | 2023-05-05 | 2023-08-14
 | 5G Summit | 5G Summit | 2014-09-14 | | 2015-01-06
 | ICEEE | International Conference on E-Learning and E-Technologies in Education | 2015-08-25 | 2015-08-27 | 2015-09-10
 | ICAIR | International Conference on Artificial Intelligence and Robotics | 2023-01-10 | 2023-02-15 | 2023-07-13
 | COSITE | International Conference on Computer System, Information Technology, and Electrical Engineering | 2021-08-15 | 2021-09-01 | 2021-10-20
 | RAICoM | International Conference on Recent Advances in Intelligent and Connected Mobility | 2018-11-20 | 2019-01-05 | 2019-02-15
 | ICCSE' | International Conference on Computer and Software Engineering | 2013-10-31 | 2013-11-30 | 2014-03-13
 | BDHI | International Conference on Big Data & Health Informatics | 2022-12-17 | 2022-12-27 | 2023-01-02
Related Journals
CCF | Full Name | Impact Factor | Publisher | ISSN
b | Human–Computer Interaction | | Taylor & Francis | 0737-0024
a | ACM Transactions on Computer-Human Interaction | | ACM | 1073-0516
 | Advances in Materials Science and Engineering | 1.299 | Hindawi | 1687-8434
c | Data Science and Engineering | | Springer | 2364-1185
 | Operations Research Letters | 1.154 | Elsevier | 0167-6377
 | International Journal of Child-Computer Interaction | | Elsevier | 2212-8689
 | Brain-Computer Interfaces | | Taylor & Francis | 2326-263X
a | International Journal of Human-Computer Studies | 4.866 | Elsevier | 1071-5819
 | Journal of Cloud Computing | | Springer | 2192-113X
 | Data Science Journal | | ICSU | 1683-1470