1st International Workshop on Data-driven Creation of Immersive Experiences

at the ACM Interactive Media Experiences Conference (IMX 2023), 13-15 June, 2023



The aim of the DataImmers-2023 workshop is to address the increasing importance and relevance of richly granular and semantically expressive descriptive metadata about multimodal content assets in the media value chain. This metadata should facilitate the selection and organisation of these content assets as part of immersive content experiences, both in automated ways (e.g. automated insertion of multimodal content from the Web into a journalistic immersive experience for a breaking news story) and semi-automated ways (e.g. creative authors searching for and re-using assets in a local media asset management system (MAMS) to produce a theatrical or cultural immersive experience).

Such descriptive metadata needs to be extracted (from different modalities), modelled (according to shared vocabularies and schemas) and managed (in appropriate storage tools with expressive query support) before it can be meaningfully used to discover and organise content assets for new, innovative data-driven immersive content experiences. This should include means to adapt, merge or remix content according to its usage purpose in the immersive experience, as well as support for content personalisation in the immersive content creation stage (i.e. authoring an experience with a choice of different content items which are selected and adapted at play-out according to a user profile or user actions).
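To make the pipeline above concrete, the following is a minimal illustrative sketch (not a workshop deliverable or a prescribed approach): two hypothetical content assets described with schema.org terms in JSON-LD-style dicts, plus a toy selection step that filters assets at play-out against a user profile. All asset names, keywords and the `select_assets` helper are invented for illustration.

```python
# Hypothetical descriptive metadata for two content assets, modelled with
# schema.org vocabulary (JSON-LD-style dicts). In practice such metadata would
# be extracted automatically and stored in a queryable metadata repository.
assets = [
    {
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": "City flood aerial footage",
        "keywords": ["flood", "news", "aerial"],
        "encodingFormat": "video/mp4",
    },
    {
        "@context": "https://schema.org",
        "@type": "ImageObject",
        "name": "Museum exhibit panorama",
        "keywords": ["culture", "museum", "360"],
        "encodingFormat": "image/jpeg",
    },
]

def select_assets(assets, user_interests):
    """Toy play-out selection: keep assets whose keywords overlap the
    interests recorded in a user profile."""
    return [a for a in assets if set(a["keywords"]) & set(user_interests)]

# A hypothetical user profile drives which assets are inserted at play-out.
profile = {"interests": ["news", "weather"]}
chosen = select_assets(assets, profile["interests"])
print([a["name"] for a in chosen])  # only the flood footage matches
```

A real system would of course replace the keyword intersection with expressive queries over a metadata store; the sketch only shows how shared-vocabulary descriptions enable profile-driven selection.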

The workshop will solicit the latest research and development in all areas around the creation and management of descriptive metadata for multimodal content in immersive experiences, as well as approaches that use this metadata to select and adapt content according to its purpose and use in the immersive experience. It aims to support the growth of a community of researchers and practitioners interested in creating an ecosystem of descriptive metadata for content which can support future data-driven creation of immersive experiences.

Topics for the workshop include, but are not limited to:

  • Extraction and modelling of descriptive metadata about multimodal content assets, including image, audio, traditional 2D video, as well as 360 and volumetric video (decomposition, semantic representation, categorization, annotation, emotion/mood etc.).
  • Shared schemas and vocabularies (incl. Linked Data) for descriptive metadata that are also supported by tools for metadata storage, query and browsing engines for asset discovery, and creation tools for the insertion of (adapted/merged) assets into immersive environments.
  • Curation of this data throughout the media value chain including commercial and contractual issues for re-use in immersive experience creation.
  • Combination of sets of interrelated content assets to form one multi-faceted content item for insertion into an immersive environment, and adaptation of individual content assets to fit into this combination, e.g. automated summarization of (2D, 360 or volumetric) video.
  • Selection and adaptation of media assets according to user profiles or user actions at play-out on the basis of the technical and descriptive metadata of those assets.
  • Visual analytics of the extracted descriptive metadata to aid creators in content discovery (e.g. clustering content assets around topics), including generating representations of those visual analytics that can be inserted into immersive environments (e.g. pinpointing locations discussed in the assets in a content repository on a 3D globe).

DataImmers-2023 will continue from the successful DataTV workshops held at IMX 2019 and 2021, where a range of topics related to data-driven personalisation of television were presented, as reported in the workshop proceedings, and which also led to a Special Issue on Data Driven Personalisation of Television Content in the Multimedia Systems journal.


DataImmers-2023 invites two types of submission. Both submission types will be handled via a dedicated EasyChair page. Full papers will have an oral presentation at the workshop, and short papers may be presented as either a poster or a demo at the workshop. All accepted papers will be included in the Workshop Proceedings.

Full papers

These are to be between 7000 and 9000 words in the SIGCHI Proceedings Format, with a 150-word abstract, describing original research covering at least one of the workshop topics, to be presented in the oral session. We expect papers to present data-driven solutions which are completed or close to completion.

Short papers

These are 3500-5500 words in the SIGCHI Proceedings Format, with a 150-word abstract. Papers should describe work in progress or demos, to be included in the poster and demo session. Submitters will be asked to provide links to the work to be presented, to outline in the short paper why it is relevant to the topic of Data-driven Creation of Immersive Experiences, and to identify whether the submission is for a poster or a demo to be shown at the workshop. We expect new concepts and early work in progress to be reported here.



Submission dates:

  • Paper submission by 24 April 2023
  • Notification of Acceptance by 3 May 2023
  • Camera-ready submission by 15 May 2023


Detailed program coming soon


Workshop organisers:

Lyndon Nixon, MODUL Technology GmbH, Austria
Vasileios Mezaris, CERTH-ITI, Greece
Niall Murray, Technological University of the Shannon, Ireland
