Neural Style Transfer

Digital Masterpieces, a start-up of the Hasso Plattner Institute for Digital Engineering, has recently finished the first full version of a digital imaging system based on neural style transfer. The app, freely available as “BeCasso”, instantly converts digital photographs into “pieces of artwork”, offering a large number of styles derived from world-class painters.
The BeCasso technology received the Best Paper Award at the SIGGRAPH Asia 2017 Symposium on Mobile Graphics and Interactive Applications (MGIA), where it also received the Best Demo Award for the contribution "Pictory - Neural Style Transfer and Editing with CoreML".
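BeCasso's exact pipeline is proprietary, but the underlying idea of neural style transfer (Gatys et al.) is to optimize an image so that its CNN activations match a photo's content while its Gram-matrix statistics match a painting's style. The following minimal NumPy sketch shows only the two loss terms; the function names and toy activation shapes are illustrative, with random arrays standing in for real VGG feature maps:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map: channel-wise correlations.
    features: (channels, height*width) array of CNN activations."""
    c, n = features.shape
    return features @ features.T / n

def style_loss(gen_feats, style_feats):
    """Mean squared difference between Gram matrices (the style term)."""
    g_gen, g_style = gram_matrix(gen_feats), gram_matrix(style_feats)
    return float(np.mean((g_gen - g_style) ** 2))

def content_loss(gen_feats, content_feats):
    """Mean squared difference between raw activations (the content term)."""
    return float(np.mean((gen_feats - content_feats) ** 2))

# Toy example: random "activations" stand in for VGG features.
rng = np.random.default_rng(0)
gen = rng.standard_normal((64, 32 * 32))
style = rng.standard_normal((64, 32 * 32))
# In full style transfer, this weighted sum would be minimized
# with respect to the generated image via backpropagation.
total = content_loss(gen, gen) + 1e3 * style_loss(gen, style)
```

In practice these losses are computed on activations from several layers of a pretrained network and minimized by gradient descent on the image pixels; mobile systems typically replace the optimization with a feed-forward network trained per style.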


This technology delivers features required by future digital brushes, digital paints, and digital imaging solutions. As a service-oriented system, it can also be integrated as a component into complex workflows of image-based or video-based processing systems.

Research Retreat

The Research Retreat aims at

  • bringing together researchers from computer graphics, multimodal and multimedia technologies and visual interactive digital media technologies;
  • focusing on dedicated key innovation topics with small groups of researchers from members and associates;
  • developing proposals for joint research projects and joint development activities;
  • setting the base for joint publications and research agendas in “bottom-up” mode;
  • strengthening network building and alumni networks.

The Research Retreat also aims to strengthen knowledge about the partners’ activities and research projects. In particular, it provides a platform to present, discuss, and relate the participants’ own research, such as ongoing or completed work on Master’s or Ph.D. theses. The Research Retreat also offers a platform to invite leading corporate partners from industry who can give students and researchers insights into industry perspectives. Prof. Dr. Jürgen Döllner (Hasso Plattner Institute) leads the Research Retreat, with its office in Málaga, Spain.

REPLICATE – cReative-asset harvEsting PipeLine to Inspire Collective-AuThoring and Experimentation

REPLICATE’s main goal is to stimulate and support collaborative creativity for everyone, anywhere and anytime (ubiquitous co-creativity). To achieve this goal, the H2020 project, in which the partner Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute (HHI) is participating, will address different aspects that currently complicate the process of content creation and collaborative co-creation.

REPLICATE aims to build upon leading research into the use of smartphones and their sensors to deliver robust image-based 3D reconstruction of objects and their surroundings via highly visual, tactile, and haptic user interfaces. In order to deliver real-time, interactive tools for high-quality 3D asset production, the project will balance device-based against cloud-based computational loading, rather than simply replacing regular computers with mobile devices. In this way, REPLICATE will enable everyone to take part in the creative process, anywhere and anytime, through a seamless user experience that ranges from capturing the real world, through modifying and adjusting objects for flexible use, to repurposing them in co-creative mixed-reality (MR) workspaces to form novel creative media, or through physical expression via rapid prototyping.
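REPLICATE's actual scheduling strategy is not described here, but the device-vs-cloud trade-off it mentions can be illustrated with a simple latency model: offload a task only when upload time plus round-trip latency plus cloud compute beats on-device compute. All names and capability figures below are hypothetical placeholders; a real system would profile these per device and network:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    flops: float          # estimated compute cost of the task
    payload_bytes: float  # data that must be uploaded if offloaded

# Illustrative capability figures (assumptions, not measured values).
DEVICE_FLOPS_PER_S = 5e9     # modest mobile GPU
CLOUD_FLOPS_PER_S = 5e11     # server GPU
UPLINK_BYTES_PER_S = 2e6     # mobile uplink
CLOUD_RTT_S = 0.05           # network round trip

def offload(task: Task) -> bool:
    """Offload only when total cloud time (transfer + RTT + compute)
    is lower than on-device compute time."""
    local = task.flops / DEVICE_FLOPS_PER_S
    remote = (task.payload_bytes / UPLINK_BYTES_PER_S
              + CLOUD_RTT_S
              + task.flops / CLOUD_FLOPS_PER_S)
    return remote < local

# Dense multi-view reconstruction: heavy compute, moderate upload.
recon = Task("dense_reconstruction", flops=5e12, payload_bytes=20e6)
# Live camera-pose tracking: light compute, latency-critical.
tracking = Task("pose_tracking", flops=1e8, payload_bytes=1e6)
```

Under these assumed figures, the heavy reconstruction task is sent to the cloud while the lightweight tracking task stays on the device, which matches the interactive-latency requirement the project describes.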

More information can be obtained at the project web site.

DAKARA - Design and application of an ultra-compact, energy-efficient and reconfigurable camera matrix for spatial analysis

Within the DAKARA project, an ultra-compact, energy-efficient, and reconfigurable camera matrix is being developed. In addition to standard color images, it provides accurate depth information in real time, forming the basis for various applications in the automotive industry (autonomous driving), production, and many more. The ultra-compact camera matrix is composed of 4x4 individual cameras on a wafer and is equipped with wafer-level optics, resulting in an extremely compact design of approx. 10 x 10 x 3 mm. This is made possible by the innovative camera technology of AMS Sensors Germany GmbH.

The configuration as a camera matrix captures the scene from sixteen slightly displaced perspectives and thus allows the scene geometry (a depth image) to be calculated from them by means of the light-field principle. Because such calculations are computationally very intensive, close integration of the camera matrix with an efficient embedded processor is required to enable real-time applications. The depth image calculations, which are researched and developed by the partner DFKI (Department Augmented Vision), can be carried out in real time within the camera system's electronics in a resource-conserving manner. Potential applications benefit significantly from the fact that the depth information is made available to them, in addition to the color information, without further calculations on the user side.

Thanks to the ultra-compact design, it is possible to integrate the new camera into very small and/or delicate components and use it as a non-contact sensor. The structure of the camera matrix is reconfigurable, so that a more specific layout can be used depending on the application. In addition, the depth image computation can also be reconfigured and thus respond to specific requirements for the depth information.
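DFKI's embedded algorithms are not detailed here, but the light-field principle the paragraph describes can be sketched as a toy plane sweep: for each pixel, test a set of disparity hypotheses and keep the one at which the sixteen shifted views agree best with a reference camera. Everything below is an illustrative simplification (integer disparities, grayscale images, wrap-around shifts via `np.roll`), not the actual DAKARA pipeline:

```python
import numpy as np

def depth_from_camera_grid(views, max_disp=4):
    """Toy plane-sweep depth estimation for a camera matrix.
    views: dict mapping (row, col) grid position -> 2D grayscale image.
    For each integer disparity hypothesis d, each neighbour view is
    shifted back by (row*d, col*d) pixels; the disparity with the
    lowest photometric error against the reference (top-left) camera
    wins per pixel -- the light-field principle in miniature."""
    ref = views[(0, 0)]
    h, w = ref.shape
    best_cost = np.full((h, w), np.inf)
    best_disp = np.zeros((h, w), dtype=int)
    for d in range(max_disp + 1):
        cost = np.zeros((h, w))
        for (r, c), img in views.items():
            if (r, c) == (0, 0):
                continue
            # A camera offset by (r, c) in the grid sees a point at
            # disparity d shifted by (r*d, c*d) pixels.
            shifted = np.roll(img, shift=(r * d, c * d), axis=(0, 1))
            cost += (shifted - ref) ** 2
        update = cost < best_cost
        best_cost[update] = cost[update]
        best_disp[update] = d
    return best_disp  # larger disparity = closer surface

# Synthetic demo: a fronto-parallel scene whose true disparity is 2.
rng = np.random.default_rng(1)
ref = rng.standard_normal((16, 16))
views = {(r, c): np.roll(ref, (-2 * r, -2 * c), axis=(0, 1))
         for r in range(4) for c in range(4)}
disparity = depth_from_camera_grid(views)
# Every pixel recovers the true disparity of 2 in this synthetic case.
```

A real embedded implementation would use sub-pixel disparities, calibrated camera geometry instead of pure pixel shifts, and cost aggregation over windows, but the aggregate-and-pick-the-best-hypothesis structure is the same.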

For more information, visit the project website.