OptIntegral | Advertisement displays manufactured by hybrid in-mould integration

Duration: 01-02-2015 / 31-01-2018

Horizon 2020 – No 643956

The OptIntegral project was launched to develop advanced LED advertising displays using in-mould hybrid integration of TOLAE (Thin, Organic and Large Area Electronics) and photonics components.
The project will combine the intrinsic TOLAE benefits of thin, lightweight, bendable structures and in-mould integration with automated, highly reproducible assembly to produce a revolutionary optical design concept that delivers better display resolution, lower costs and energy savings.

The aim of OptIntegral is to prove the viability of the technology and to demonstrate the flexibility, sustainability and cost-effectiveness of this revolutionary manufacturing process. This should enable a wide range of diverse LED display products to be manufactured competitively within the EU labour market.


QoE-Net | Innovative Quality of Experience Management in Emerging Multimedia Services

Duration: 01-01-2015 / 31-12-2018

A Marie Skłodowska-Curie Initial Training Network

The project addresses the need for models, methodologies and tools for effective management of QoE along the whole chain of design, production, delivery and control of multimedia services. Particular attention will be devoted to three applications: mobile gaming, social TV and web services.
This figure shows the interaction between the QoE-Net research areas and the eco-system of factors in multimedia service provisioning, where the quality of experience should be strongly considered:

  • Content and context-aware coding: coding procedures for optimised QoE on the server side, aware of the content to be compressed and protected from errors, and of the context in which the content will be consumed.
  • Context and network-aware delivery: media delivery and transmission mechanisms able to optimise QoE from the network side.
  • Content and context-aware decoding/displaying: decoding procedures and display techniques for optimised QoE on the user-terminal side.
  • QoE monitoring, control and management: the monitoring, control and management activities to be conducted in each of the three blocks described above.
  • QoE definition, modelling and evaluation: the definition and characterisation of QoE, its modelling, and the evaluation of QoE models based on subjective user tests.

As shown in the figure, in addition to the key technical aspects addressed along the media delivery chain, non-technical aspects with a strong impact on QoE will also be investigated. These include: the environmental and contextual impact on QoE; the psychological and social impact on user perception and, consequently, on QoE; the business impact (e.g. cost and financial aspects); the impact of different services; the impact of aesthetics, art and design; and the impact of the display and rendering of multimedia content. All of these investigations will be guided by the three key applications.
These factors are tightly interlinked, and managing QoE requires a complete understanding of all the issues involved. For this reason, the ESRs (Early Stage Researchers) in the network will study the whole QoE eco-system, together with an in-depth investigation of the issues covering one or more specific factors.


ARI(VA)² | Augmented Reality for Vehicle Architecture and Virtual Assessment

Duration: 01-06-2008 / 31-05-2011

Project ID: 4 000


The ARI(VA)² project aims to define and develop a generic integrated software and hardware system and augmented reality applications, tested on an industrial application: Vehicle Architecture with comprehensive Virtual Assessment.

Project developments:
The objectives of ARI(VA)² will be realized through the development and validation of virtual reality technologies, both software and hardware. These developments involve the following steps:

  • Development and industrialization of a large field-of-view (FOV) helmet
  • Comprehensive hardware and software architectures for industrial Augmented Reality applications
  • Industrial application process analysis
  • Study of observation and virtual exploration in proximal space


Expected results:
  • Validated technology
  • Increased performance of virtual prototyping
  • Commercial development of virtual technology

The participation of Holografika is supported by the Hungarian Miksa Déri programme, established with the support of the National Office for Research and Technology.


Tangible3D | Tangible Holographic 3D Objects with Virtual Touch

Duration: 01-04-2014 / 31-03-2016


The Tangible3D project will create the first interactive, tactile, true-3D displays by integrating a complete light-field 3D display, with all parts of the scene directly available for interaction (from Holografika), with a non-contact tactile system that allows users to feel objects in mid-air (from Ultrahaptics). The combined system will allow users to reach into the viewing volume to touch virtual objects, and even to feel gradients of haptic feedback as their hands penetrate virtual objects.


LiveRay | Live Light Field Recording and Display

Duration: 01-05-2016 / 30-04-2018


The project creates the first integrated light-field camera and light-field display system, enabling the direct capture of 3D light fields with a purpose-built camera and their real-time visualization on a glasses-free light-field 3D display.

Light field is an emerging concept for representing rich 3D visual information, able to capture real-world phenomena with unprecedented image quality. A light field describes visual information as a set of light rays that pass through a (sensor or display) surface; in addition to the position and color of each light ray, its direction is also recorded. Both light-field cameras and light-field displays exist on the market today (from Raytrix and Holografika, respectively). While light-field displays can reproduce hologram-like, full-color, real-time representations of anything that can be captured in light-field format, light-field cameras capture 4D information about a subject, which can be used to reconstruct the depth of the recorded scene for every pixel, as well as all-in-focus images, all from a single shot. This project is the first to combine the two technologies, generating completely new and exciting business opportunities.
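The light-field representation described above can be illustrated with a minimal sketch. This is not the project's actual data format, just a hypothetical illustration of the idea that each sample records where a ray crosses the sensor/display surface, in which direction it travels, and what color it carries:

```python
from dataclasses import dataclass

# Hypothetical minimal light-field sample: one ray crossing the
# sensor/display surface. (x, y) is the crossing position on the
# surface; (theta, phi) are the ray's direction angles; rgb is its
# color. A regular sampling over these four geometric coordinates
# gives the "4D information" the text mentions, from which per-pixel
# depth can be estimated by comparing how a scene point shifts
# between nearby directions (parallax).
@dataclass
class LightRay:
    x: float
    y: float
    theta: float
    phi: float
    rgb: tuple

# A light field is then simply a collection of such rays.
light_field = [
    LightRay(x=0.10, y=0.20, theta=0.00, phi=0.05, rgb=(255, 128, 0)),
    LightRay(x=0.10, y=0.20, theta=0.02, phi=0.05, rgb=(255, 130, 2)),
]
```

A conventional camera only records position and color, collapsing all directions into one pixel; keeping the direction per ray is what allows depth reconstruction and refocusing from a single shot.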

National projects
