On 6 May 2022, the 3IT held its annual Partner & Friends event: the 3IT Summit! The 3IT team was thrilled to welcome many old and new faces to this well-established and successful format, which had been on hold since the start of the pandemic. This year’s 3IT Summit focused on the most recent projects launched by the 3IT Partners and included live showcases in the Immersive Media Experience Laboratory TiME Lab as well as a collective immersive live experience in 6DoF on VR headsets.
The program of the event included the following talks: The projects TeleSTAR and MultiARC presented the perks of AR-based clinical education. In a panel discussion, the consortium partners of the project VoluProf discussed how Mixed Reality could enable a more efficient way of direct, interactive and user-oriented teaching via photo- and audiorealistic avatars of real lecturers for remote learning. And the project Virtual LiVe showed the possibilities of virtualizing live events via audiovisual immersion, followed by technical demo tours through the project’s successfully applied technological tool kit.
TeleSTAR (Telepresence for Surgical Assistance and Training using Augmented Reality) utilizes a digital surgical microscope and augmented/mixed reality to provide new tools for teaching and intraoperative assistance. Standard procedures as well as rare and complicated ones can be visually annotated and augmented to make every step of the surgery transparent to a wide audience at the same time. A bidirectional communication platform allows the audience to ask the surgeon questions. This concept can be scaled to other surgical domains and existing degree programs.
The project MultiARC is developing an interactive, multimodal augmented reality system for computer-assisted surgery in ear, nose and throat treatment. Input/output data handling, hybrid multimodal image analysis, and a bidirectional interactive augmented reality (AR) and mixed reality (MR) interface for local and remote surgical assistance are part of the framework. The benefits are manifold and are expected to lead to significantly improved patient outcomes.
The speaker Eric L. Wisotzky is a research associate at the Computer Vision & Graphics Group of the Vision & Imaging Technologies Department.
Workshop Report „Live from the Other Room: Martin Kohlstedt“
Live from The Other Room is a VR-concert venue in which musicians perform in volumetric video, live streamed to audiences worldwide. Fans can book tickets to live shows in virtual reality, alone or with friends. Audience members can move freely through the space, communicate with each other and with the artist directly. Custom creative concepts are designed in collaboration with the artists, and musicians are recorded in a comfortable environment with direct audience feedback. The immersive experiences will be streamable on the most widespread 6DoF-VR headsets.
Since the beginning of the COVID pandemic many musicians have been trying to bring their music to audiences remotely. Some have been singing in front of webcams, some have made avatars of themselves and performed in popular multiplayer video games like Fortnite. These performances sparked some interest, but they are not always artistically interesting to serious musicians who thrive on audience presence and authentic communication.
Humans have been performing for each other for thousands of years, and a concert is a cultural expression that can affect a person profoundly – alone and/or in a collective context. Currently in its development stage, LFTOR aims to explore the elementary aspects of musical performance and what constitutes a satisfying experience for both artist and audience. Exactly replicating a real life concert situation in VR is technically not feasible and not useful, whereas a well-considered virtual space offers an opportunity to avoid the current disadvantages of established music industry practices and create a more inclusive standard.
Martin Kohlstedt is a German composer and pianist, known for his mixture of classical and avant-garde music, improvised on stage during passionate performances.
The two speakers were Harmke Heezen of creative studio High Road Stories (Berlin) and Oliver Ihrens of communication agency Radar Media (Bochum).
The official program was followed by live tours within the facilities of 3IT/Fraunhofer HHI. The participants of the event were able to witness the 6DoF-VR experience „Live from the Other Room: Martin Kohlstedt“ collectively in small groups. Another highlight was the presentation of the Virtual LiVe hybrid concert that took place at Kesselhaus Berlin in December 2021 and was simultaneously broadcast in 360 degrees to the Planetarium Bochum as well as to audiences at home in real time. Pieces of this concert were shown with 3D sound in a 180° projection in Fraunhofer HHI’s Immersive Media Experience Laboratory TiME Lab, giving guests an immersive glimpse of what the future of hybrid live events might look like.
The live tours were followed by a casual get-together, where our guests were able to share their experiences in the field of immersive imaging technologies and make new plans for possible future collaborations.
It was a great day with familiar and new faces, exciting talks and contributions, and impressive showcases.
Efiport is a provider of technology-supported education solutions, offering learning efficiency to global corporations, SMEs and educational institutions. Their EdTech (Education Technology) service provides video production and livestreaming and holds regular online events in this field. Their latest Meetup addressed how Virtual and Augmented Reality can be applied in education to create engaging and lasting learning experiences. The target audience consists mainly of researchers and industry experts, but the event format is open to a wider audience as well.
3IT project manager Maria Ott was invited to present the 3IT Partner Network and the 3IT project TeleSTAR at the latest edition of efiport’s EdTech Meetup. She focused on the motivation and state of the art of the project, as well as one of its previous showcases from November 2020: At that time, the project team had successfully transmitted a cochlear implantation surgery to four remote locations in parallel (Munich, Rotterdam, and two locations in Berlin). Using an Augmented Reality based 3D live stream from the ENT clinic of Berlin’s renowned University Hospital Charité, they enabled more than 60 participants and trainees to follow the live stream with a maximum latency of 800 milliseconds.
TeleSTAR, launched in January 2020 by a team of experts from TU Delft, Charité – Universitätsmedizin Berlin, Fraunhofer HHI and ARRI Medical, uses a fully digital surgical microscope and computer vision algorithms to develop new AR-based training and teaching methods and tools, leading to a new level of educational transparency. The system offers new ways to visualize relevant surgical information, allowing surgeons to improve and explain their decisions. Surgical decisions can be easily visualized during course units, on high-resolution screens or a head-mounted display, in any local or remote location.
Following the presentation, a networking session allowed for further explanation of the 3IT’s scope and relevance in the field of Virtual and Augmented Reality and 3D live streaming.
The presentation is still available to watch on YouTube:
TeleSTAR is funded by the EIT-Health. EIT-Health is supported by the EIT, a body of the European Union.
On 23 and 25 April 2022, the project “Virtual LiVe” (Virtualization of Live Events through Audio-Visual Immersion) presented a further validation step with a 360 degree keyed broadcast of INDUSTRIE, an experimental musical act by the ensemble PHØNIX16. Virtual LiVe aims to digitally complement classic event formats and to explore how immersive media technologies (e.g. 3D-Audio, 360°-Video, light fields, Volumetric Video) can be utilized to benefit different types of event formats and thus push the envelope in hybrid event realizations.
Performative Industrial Music with an Agenda
INDUSTRIE by PHØNIX16 is a performative musical act that reflects on the relationship between music and industrialization as well as on industrial processes in music. The piece juxtaposes two composers who, each in their individual way, work with technical extensions or the machinization of the performer.
The two works that are designed for VR glasses could not come from more opposite directions. Mexican artist Marisol Jiménez studied composition with Brian Ferneyhough, then turned to multidisciplinary work with self-constructed sound sculptures – experimental electroacoustic instruments that could all sit in a large noise orchestra.
From the opposite corner moves Christophe Guiraud: He comes from Noise, but then discovered his great passion for Ars Subtilior (14th century). Since then, these two inspirations have been part of his work. In his VR work for PHØNIX16, the audience is immersed in an abstract and colorful 3D chapel, where an equally peculiar Renaissance movement, colored by quarter tones, swells, billows and lives. The singers hang from the ceiling like bats. Gravity and balance are vocally and surreally disturbed.
Marisol Jiménez’s piece also works with materials: six gray fringed carpets that form a cube. This cube is lined with an acoustic carpet that the six musicians weave in an entanglement of cables. Sound streams out of machines. A drone blossoms, takes shape and deforms. Only the viewer gives this sculpture its dimensionality. Both works question, problematize, and exhibit the protected space offered by VR glasses. The show provides an immersive and at the same time transgressive experience.
INDUSTRIE is part of the series Dead On Arrival (D.O.A.) with contemporary music for voices. D.O.A. challenges the impending irrelevance of the voice in current music and society. With its D.O.A. series, PHØNIX16 tries to initiate a comprehensible paradigm shift in order to find a new status for music for voices, nationally and internationally.
Fraunhofer Technology: Blurring the Boundaries between Musical Performance, Storytelling and Broadcasting Tools
D.O.A. is discursive and sometimes subversive, but in any case inclusive contemporary music for voices and a possible starting point for social or cultural debate. The team of PHØNIX16 partnered up with the team of the 3IT-hosted Project Virtual LiVe to examine how the use of cutting-edge immersive technology could contribute to their experimental format.
In the production of INDUSTRIE, the artists performed around the OmniCam-360 and were recorded by this panoramic camera in 360 degrees in a green screen studio. The virtual background that was inserted was constructed based on paintings by Iris Terdjiman.
The OmniCam-360, developed at Fraunhofer HHI, is a scalable, mirror-based multi-camera system that allows the recording of live video in a 360° panoramic format. In addition, Fraunhofer HHI developed the technology to transfer the video material in real time to end devices – from TV monitors to VR glasses. This Real Time Stitching Engine (RTSE) is a software-based solution for the real-time processing of ultra-HD panoramic recordings.
In the context of INDUSTRIE, the end-to-end camera system was used in particular to test the practicability and benefits of high-resolution 360 degree live keying. Keying is the process of removing the green screen background: once keyed, the background becomes fully transparent, which allows that transparent area to be filled with a different image or video.
During the production of the musical act the RTSE provided a real-time preview in which parts of the 360 degree stage that were not green were masked in green. This created a fully green background and allowed the planned video to be inserted on site. This material in turn was made available to the creative director and operators, who were able to immediately identify critical scenes and find solutions on the spot. To ensure a smooth production flow, a Blackmagic ATEM Mini was additionally used for live keying during the production. The ATEM Mini can transmit keyed material in HD for live streaming, whereas processing the panorama within the RTSE allows resolutions of 10K by 5K with a color depth of 32 bits per channel.
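The keying rule itself is easy to illustrate. The sketch below applies a naive per-pixel green-dominance test in plain Python; the function names and thresholds are our own illustration, not part of the RTSE or the ATEM Mini, which process full video frames in hardware-accelerated pipelines with soft edge handling.

```python
def is_green_screen(r, g, b, green_min=100, dominance=40):
    """True if an RGB pixel belongs to the green screen: the green
    channel is bright and clearly dominates red and blue.
    Thresholds are illustrative, not production values."""
    return g >= green_min and g - r > dominance and g - b > dominance

def chroma_key(frame, background):
    """Replace green-screen pixels in `frame` with `background`.
    Both arguments are rows of (r, g, b) tuples of equal size."""
    return [
        [bg if is_green_screen(*px) else px for px, bg in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]
```

Real keyers additionally work in a luma/chroma color space and blend edge pixels, but the core idea is this per-pixel decision.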
The broadcast of the two pieces was available online on 23 and 25 April 2022. The real-time keying was tested successfully during the production, adding another layer to Virtual LiVe’s collective toolbox of immersive media technologies.
The soloist ensemble PHØNIX16 was founded in Berlin in 2012 and is a collective of singers committed to the expansion and performance of current and contemporary music for/with/without voice, especially by composers still alive. The fact that the singers see themselves as soloists and as part of a collective to the same extent has a decisive influence on the sound, the music-making and the rehearsal work. PHØNIX16 combines voices in an unusual way; the division and instrumentation of the ensemble varies from solo to 16 voices. The repertoire consists of works for voices a cappella, as well as voices plus electronics, feed, instruments, video/film, manipulations, objects and/or machines.
About Virtual LiVe
The Fraunhofer Institute for Open Communication Systems FOKUS, the Fraunhofer Institute for Telecommunications – Heinrich Hertz Institute HHI and the Fraunhofer Institute for Integrated Circuits IIS are collaborating on the project Virtual LiVe – Virtualization of Live Events through Audiovisual Immersion. The project is hosted by the 3IT – Innovation Center for Immersive Imaging Technologies – and supported by the ‘KMU-akut’ program “Research for SMEs” of the Fraunhofer-Gesellschaft. The aim of the project is to digitally complement classic event formats by using new immersive media technologies (e.g. 3D-Audio, 360°-Video streaming, interactive low-latency streaming and advanced stream analytics) and to add value compared to typical video streams via the Internet. Find more information on the project here and watch the Making-Of of the previous event here.
In the “Virtual LiVe” project, the three Fraunhofer Institutes FOKUS, HHI and IIS are working with the creative industry and companies from the event sector. Together, they realized their first experimental laboratory concert within the framework of the project. On December 11 2021, the highly anticipated event took place at different locations simultaneously, showcasing the benefits of the implementation of immersive media technology into classic event formats.
The past couple of months have been tremendously exciting within the project Virtual LiVe. All teams were operating at full speed to ensure a smooth and technically accurate workflow. How did it all start? In the first quarter of this year, the project partners Fraunhofer FOKUS, Fraunhofer HHI and Fraunhofer IIS set out to extend and enhance hybrid live events with cutting-edge immersive technologies. The ambitious goal meant exploring new territory: in order to blur the boundaries between physical and digital perception, high-end Fraunhofer technologies were to be implemented and intertwined, thus advancing the digitization of classic event formats.
Together, the project partners invited all the stakeholders at the beginning of the project, analysed dozens of questionnaires and conducted workshops and colloquia to determine the exact needs of the industry. The findings led to the realization of the project’s first hybrid live event: In collaboration with the artist The Dark Tenor the event finally took place in the Kesselhaus at Kulturbrauerei Berlin, in the Planetarium Bochum, in the digital cinema at Fraunhofer IIS and online.
The whole setup of the concert was a one-of-a-kind event and received an overwhelmingly warm response from the various audiences at different locations.
Kesselhaus at Kulturbrauerei – on Site Location with a Digital Twist
The artist Billy Andrews – The Dark Tenor – gave the premiere concert of his Classic Roxx Tour at the Kesselhaus in Berlin. About 150 guests enjoyed the show on site with The Dark Tenor’s full band line-up – a first since the start of the pandemic. The outstanding novelty: he was accompanied by the artists Queenz of Piano, who played from StudioMix, a different location in Berlin. They were incorporated into the concert via an ultra-low-latency audio connection utilizing the Digital Stage platform. This enabled the artists to play music together live “on stage”, even though they were not located in the same place. The Queenz of Piano were additionally projected from their studio onto a screen in the Kesselhaus. The musical and visual interplay provided a truly immersive, hybrid experience as the ensemble played together from different locations.
“The Classic RoXX Tour premiere was a resounding success and, in collaboration with the Fraunhofer Institutes, has created new possibilities for how music and art can co-exist over the Internet despite being at a distance from each other. I’m excited to see that technology is helping artists interact. It will definitely become an integral part of what we do at some point!” (Billy Andrews, The Dark Tenor)
„This was certainly something very special, even for the Kesselhaus. For the first time, musicians from different places made music together here at the same time, as if they were standing on the same stage. In addition to the spectators present here, hundreds more watched the show online and participated vigorously. A great experience.“ (Sören Birke, Kesselhaus Berlin)
“With the use of Fraunhofer technologies and those of the partners, we have managed to create a very special concert experience – unique and first of its kind. We are proud to have realized this project with these partners and very happy about the great feedback from experts, artists, but also from the actual fans.” (Stephan Steglich, Fraunhofer FOKUS)
Planetarium Bochum – Live Stream of the Concert in 360 Degrees on a Planetarium Dome
Hybrid live events offer participants the opportunity to attend performances by artists even if they are unable to travel to the venue in person – for many diverse reasons. In the context of this validation project, the project partners have gone one step further and opened up new venues for cross-connected artistic performances by means of their technological expertise.
The event in Berlin’s Kesselhaus was recorded using the panorama camera OmniCam, a scalable, mirror-based multi-camera system that, together with its Real Time Stitching Engine (RTSE), is capable of capturing a high-resolution 360° video stream and transmitting it in real time.
At the Planetarium Bochum, video images from 4K close-up cameras were combined with those from the 360° camera using a special DomePlayer and geometrically adjusted in real time for the dome shape, to be displayed in the planetarium’s 6K×6K dome projection using its 11 high-resolution projectors. Depending on the song, this projection was further enriched with planetarium-specific content such as dome illumination effects or the razor-sharp starry sky of the Zeiss star projector. Properties of all displayed video media, such as position, rotation, size, zoom and transparency, were animated interactively in real time. Depending on the situation and the mood of each song, this approach provided a fitting and varied image that could visually even exceed the on-site experience.
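The geometric adjustment for the dome comes down to a coordinate mapping. The sketch below shows the standard warp from a square fisheye “domemaster” frame back to the equirectangular panorama pixel it should sample; it is a simplified CPU-side illustration under our own assumptions, not the DomePlayer’s actual implementation, which performs an equivalent warp on the GPU in real time.

```python
import math

def dome_to_pano(x, y, size, pano_w, pano_h):
    """Map pixel (x, y) of a size*size 'domemaster' fisheye frame to the
    equirectangular panorama pixel it should sample, or return None for
    pixels outside the dome circle."""
    u = (x + 0.5) / size * 2 - 1       # [-1, 1] across the dome
    v = (y + 0.5) / size * 2 - 1
    r = math.hypot(u, v)               # distance from the zenith (dome centre)
    if r > 1:
        return None                    # outside the projected hemisphere
    theta = r * math.pi / 2            # polar angle: 0 at zenith, pi/2 at horizon
    phi = math.atan2(v, u)             # azimuth in [-pi, pi]
    px = int((phi + math.pi) / (2 * math.pi) * pano_w) % pano_w
    py = min(int(theta / math.pi * pano_h), pano_h - 1)
    return px, py
```

Pixels near the centre of the dome image sample the top rows of the panorama (the zenith); the rim of the circle samples the horizon line.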
To complete the immersive experience, the concert sound was produced with MPEG-H Audio. A 3D-Audio mix was created in real time. With the help of a Spatial Audio Designer, an MPEG-H scene was authored, then encoded and pushed to the cloud. The audio and video streams were decoded in the planetarium and then distributed to more than 60 loudspeakers across the entire planetarium dome. The different video and audio streams were synchronized frame-accurately to ensure a perfect audio-visual experience.
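The principle behind frame-accurate synchronization of several streams can be sketched in a few lines: all sinks agree on one playout deadline that even the slowest delivery path can meet, and each sink delays its own stream by the difference. The function below is a generic illustration with invented names, not the production MPEG-H/OmniCam signal chain.

```python
def playout_delays(path_latencies, safety_margin=0.1):
    """Given per-stream delivery latencies in seconds, pick one playout
    deadline every sink can meet (slowest path plus a safety margin) and
    return the extra delay each sink must add so that all streams start
    in sync. Latency values and the margin are illustrative."""
    deadline = max(path_latencies.values()) + safety_margin
    return {name: deadline - latency for name, latency in path_latencies.items()}
```

In practice the remaining sub-frame error is removed by snapping each stream’s start to a common timestamp on a shared clock.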
„The 360° live concert was impressive – as an audience member in the Planetarium Bochum, you got a very direct impression of the atmosphere in the “Kesselhaus”! The interaction of 360° video and 3D audio created a highly immersive feeling in the dome. This was confirmed by our audience, which also asked for more events like this. The collaboration with the Fraunhofer Institutes in this experimental concert setting was consistently congenial, creative, and technically impressive.“ (Susanne Hüttemeister, Planetarium Bochum)
„Immersive rooms like planetariums are excellent platforms to create the feeling of finding oneself at the live location since we have the possibility to present immersive video and 3D-Audio in the highest quality to the audience. Such a hybrid social event, where the line between the real event and the remote audience is blurred, could be a solution for the “new normal” and contribute to overcome constraints of pandemic conditions or simply when the real event is fully booked. Even more, it transforms our perception of live events.“ (Christian Weißig, Fraunhofer HHI)
“It was a big challenge for all of us to make this high-profile experience possible, as it was the first time – worldwide – that a concert was streamed live and displayed in 360° on a dome with this variety of functions and top-notch audio-video quality! We are all the happier that everything worked out as planned and that the audience feedback was so overwhelmingly positive!“ (Manuel Schiewe, Fraunhofer FOKUS)
“A live concert is a perfect match for a 3D audio production. Listeners that can reproduce 3D-Audio – which can be as easy as using headphones – can enjoy the ambience of the concert hall and the noise of the crowd. The Dark Tenor fired up his audience and motivated them to sing along. This auditory experience has been successfully transported to audiences at the various other locations.” (Ulli Scuda, Fraunhofer IIS)
Interactive Concert Streaming to Remote Locations
More than 600 participants from 16 countries followed the concert virtually via the Internet. For a truly hybrid experience it was important to give viewers at home the opportunity to enjoy the concert in a way that comes as close as possible to a live experience. The FAMIUM-DASH solution used here allows video streams to be transmitted with low latency and adaptive quality, including auditory or visual feedback. In this way, the remote audience was not only able to watch the concert online with a small delay of a few seconds, but also to send their applause to the Kesselhaus in real time via digital emojis and video chats. A Public Fan Channel, displayed on a big screen in the Kesselhaus during the concert, emphasized the interconnectedness of the different audiences. Additionally, viewers were able to set up their own private video chat rooms and invite friends to watch the concert together. The FAMIUM SAND solution provides advanced streaming analytics and video player coordination and ensured smooth playback of the low-latency video stream during this hybrid event.
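A common technique low-latency streaming players use to stay only a few seconds behind the live edge is latency-driven playback-rate control: if playback drifts behind the target latency the player speeds up imperceptibly, and if it gets too close to the edge it slows down. Whether FAMIUM-DASH uses exactly this rule is not stated in the article; the sketch below is a generic illustration, and all parameter names and values are our assumptions.

```python
def playback_rate(live_latency, target=3.0, band=0.5, max_adjust=0.05):
    """Return the playback rate for the current live latency (seconds).

    Within the tolerance band around the target latency, play at 1.0x.
    Outside it, nudge the rate by a few percent, which is inaudible but
    pulls latency back toward the target over a few seconds."""
    if live_latency > target + band:
        return 1.0 + max_adjust   # fell behind the live edge: catch up
    if live_latency < target - band:
        return 1.0 - max_adjust   # too close to the edge: back off
    return 1.0
```

The same controller runs continuously in the player loop, so latency is regulated without rebuffering or visible jumps.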
Additionally, the 360-degree panorama live stream of the OmniCam that was transmitted to the planetarium was also provided online for audiences at home in a resolution of up to 4K. With this additional stream, the viewers at home were able to navigate in the panorama of the live concert at Kesselhaus and immerse themselves into the show.
Once an audio mix is produced, MPEG-H audio can be rendered by a wide range of playback devices. For on-site monitoring of the encoded signal, a tablet received the stream from the internet and decoded it for binaural headphone reproduction. At the same time, the very same bit stream was sent to a digital cinema in Erlangen, where project partners watched the show with 3D sound. The laboratory concert demonstrated how a production workflow built on cutting-edge technologies can provide truly immersive real-time content for audiences on site, at remote event locations such as planetariums or cinemas, and at home.
“Soon after we started the live transmission of the concert stream, viewers immediately started providing feedback by clicking on the several feedback buttons and asked to enter our moderated public Fan Channel to appear on stage in the Kesselhaus. They were able to experience smooth low-latency video playback of the live concert on their player platform of choice, including web browsers on mobile and desktop devices, set-top boxes, TVs via HbbTV, and even the dome in the Planetarium in Bochum.“ (Louay Bassbouss, Fraunhofer FOKUS)
The laboratory concert was a great success. The different technologies were put to the test in this experimental interaction and showed how to push the boundaries of hybrid live events. The second validation project is already being planned, so look forward to more exciting news from the Virtual LiVe project.
“With our collaborative lab concert, we have proven that hybrid live events are technically feasible. Now, together with our partners, we need to transform the technical complexity of such events into easily manageable and scalable solutions in order to provide artists and event organizers a tool for future and commercially successful hybrid productions.” (Robert Seeliger, Fraunhofer FOKUS)
„We are already planning further to investigate technological vanguard setups for fancy ideas like volumetric recording at concerts and trade shows. This will allow an audience equipped with head-mounted displays to move around virtually. They will decide individually from which direction they experience the concert or show floor. We are looking forward to sharing our findings with you!” (Siegfried Fößel, Fraunhofer IIS)
Did you miss the concert or the livestream? Watch the announcement video of the event and stay tuned for the video of the live event!
3IT’s longtime partner Rohde & Schwarz offers technological solutions for 5G broadcast! All details summed up in a free eBook!
5G certainly promises new and original technological and business opportunities. It also brings new broadcast and multicast capabilities to the whole ecosystem, enabling further new applications and more granular business potential.
With 5G broadcast, network operators have an exciting opportunity to make their infrastructure more dynamic and intelligent, delivering better quality of service and, as a result, a higher quality of experience.
With the lower latency and higher flexibility that 5G broadcast/multicast offers, the consumer experience can be improved with more real-time apps. 5G broadcast gives network operators a simple way to use network resources more efficiently to create new business opportunities and reach a wider audience. It opens up a whole new world of possibilities.
It not only delivers media and entertainment to smartphones but also reaches smart vehicles with OTA updates, in-car media and entertainment, and map updates. Many application segments can be implemented efficiently using 5G multicast, either in downlink-only mode or in combination with an uplink channel, using the same infrastructure.
Live event multicasting benefits particularly from this feature, and 5G broadcast can also transmit public-safety multicasts, such as urgent weather and community information, simplifying the relationship between community members and public authorities.
Several other services can be optimized using multicast over 5G, including OTA multicast for centralized configuration and control, live commerce, and rural e-learning where no internet connection is otherwise available. In addition, 5G multicast enables venue casting, where consumers can combine the live experience with the comfort of home.
Rohde & Schwarz has prepared an eBook which gives an overview of the 5G Broadcast/Multicast technology and the benefits it has to offer. It highlights challenges and solutions for broadcast network operators, mobile network operators and content providers in a world of ever-growing media consumption.
Watch a teaser video about the 5G broadcast/multicast eBook here.
The goal of the project is to develop a Mixed Reality application that is suitable for everyday use within the context of knowledge transfer. The idea is to create a novel and more efficient way of direct, interactive and user-oriented teaching via photo- and audiorealistic avatars of real lecturers for remote learning.
Since the beginning of the pandemic in 2020, online teaching has taken on a completely new meaning and relevance. At the same time the weaknesses and limitations of the technological tools that have been hitherto used in the context of education have been exposed: traditional online teaching, where content is delivered via static video and audio scripts, is not sufficient to provide an interactive and engaging learning experience for users.
This is the point where VoluProf comes into play: the project partners want to create a space of future learning that combines traditional lecturing and online learning environments, so that lecturers can be experienced not only on video but also virtually and physically. In other words, the user participates in a lesson with a virtual professor by means of Augmented Reality glasses, for example in their living room, while also being able to interact with them via eye contact and verbal communication.
Which criteria will play a role in the project realization?
Generating a photorealistic and animatable volumetric representation of the lecturer.
Location-independent applicability through transmission via 5G networks and direct integration into mixed reality devices on the market, as well as subjective optimization of image quality at minimal bit rate.
Natural and fast synthesis of the lecturer’s original voice.
Answering ethical questions about the implications of the new way of knowledge transfer.
Media reception and effects research for the analysis and optimization of user acceptance.
Educational research on the optimal and user-adaptive use of the new way of knowledge transfer.
VoluProf strives for a ubiquitous and democratic use of technology, enabling accessibility for everyone to photo- and audiorealistic avatars of lecturers in particular by using 5G networks and advanced streaming technologies. By creating an immersive knowledge space, an improved learning experience is made possible, in which the limits of human-to-human interactions in digital learning are transcended.
About the project partners
The project partners contribute in-depth knowledge from a range of fields: language technology software (Aristech GmbH), 5G networks (Deutsche Telekom AG), Artificial Intelligence and educational technology (DFKI), video and image coding and the processing of volumetric data (Fraunhofer HHI), empirical user research (University of Rostock), and the production and post-production of Volumetric Video data (Volucap GmbH). This breadth guarantees a very interesting collaboration with exciting outcomes.
Are you interested in the future of Virtual and Mixed Reality Learning? Or are you interested in becoming a partner of the 3IT? Do not hesitate to contact the 3IT’s team here.
From 25 to 27 July 2021, the “6th ITG/VDE Graduate Summer School on Video Coding and Processing” took place at the CINIQ Center and the 3IT, after being postponed three times. The event, originally scheduled for October 2020, was organized by the ITG Technical Committee MT 2 “Image Communication and Processing” (spokesperson: Prof. André Kaup) and offered participants the opportunity to exchange ideas on current topics in the fields of video coding, image processing and multimedia communication. Thanks to an elaborate hygiene concept, the event could take place on site with 32 participants. As the Summer School was offered in a hybrid format, up to 20 international participants were able to join remotely at the same time.
The three-day event opened with guided tours at the 3IT, at the CINIQ Center and in the TiME-Lab and was complemented by a get-together where the participants had the opportunity to exchange their research topics and network with each other.
The second day (Session Day 1) opened with a keynote speech by Jonathan Pfaff (Fraunhofer HHI) on machine learning methods for video compression. Altogether, there were three sessions with a total of 13 presentations, covering topics from intra-coding and machine learning methods to 3D video, as well as various subjects such as multispectral image sensors, the effects of video streaming on climate change, and implementations of video coding methods. In the evening, the participants were taken on a boat tour to further exchange ideas and get to know each other even better over dinner with a view of the government district, Museum Island and the East Side Gallery.
On the third and last day (Session Day 2), there were two more sessions with seven presentations on 3D video and interframe coding. After all presentations, the participants of the SVCP voted for the two best presentations. The winners of the “Best Presentation Award” were Dominik Mehlem and Jens Schneider, both from RWTH Aachen University. Furthermore, thanks to a sponsorship, participants have the opportunity to win the “Joint Research Incubator Award”, which will be presented at next year’s Summer School from 3 to 5 July 2022.
The SVCP ended for the organizers and participants with a joint lunch at the 3IT.
All information on SVCP 2021 and its presentations can be found here.
The event, moderated by AC Coppens from THE CATALYSTS, aimed to identify the needs of the culture and events sectors with regard to live events. Due to the pandemic, most cultural, business and other events have been taking place online or in hybrid formats. In this context, the kick-off event served as a platform for players from different industries to come together, discuss challenges and exchange ideas on the topic of live events.
Virtual LiVe originated from and is financed by the KMU-akut program “Research for SMEs” of the Fraunhofer-Gesellschaft, with the objective of fostering the innovative strength of institutions and SMEs from the cultural and events sectors by providing high-end technology solutions and innovations. However, the project does not intend to replace physical live events. Rather, it aims to supplement and extend live events by combining the virtual with the physical realm, as presented by Angela Raguse from Fraunhofer IIS and the Business Area Digital Media during the introduction of the project.
According to the renowned keynote speakers (Kavaye Ozong, ARTE; Harmke Heezen, High Road Stories; Oliver Ihrens, Radar Media; and Thomas Bedenk, Freelancer), small and large businesses as well as cultural institutions suddenly faced similar problems with live events due to COVID-19. This has led to an intersectoral surge in demand for combining Virtual Reality with real live events in hybrid formats.
Subsequently, the technological possibilities of Virtual LiVe were presented, focusing on how they can be combined within the framework of the project. Siegfried Foessel from Fraunhofer IIS, Stephan Steglich from Fraunhofer FOKUS and Christian Weißig from Fraunhofer HHI talked about the spectrum of their respective solutions that can contribute to the development of a practicable toolbox of technologies.
All panel speakers described the difficulties and challenges they had been facing since the beginning of the COVID-19 pandemic, as events, concerts and plays had been cancelled. None of the participants had comprehensive prior knowledge of transferring an analog event into the digital sphere. In particular, the implementation of new technologies in both cultural and business events often presented an obstacle in terms of the legal and administrative situation in Germany as well as logistics. What’s more, latency remained a predominant issue.
However, all speakers were united by a common interest in technological innovation, the implementation of Virtual and Mixed Reality as well as hybrid event formats. While according to some speakers, a virtual or digital event would never be able to replace a real event, the majority of the speakers acknowledged the innovative potential of immersion, even for speculative fiction and new ideas. Hybrid event formats and the combination of virtual and analog spaces, it was argued, could in turn promote the emergence of innovative opportunities.
On another note, a lively debate arose concerning the accessibility of technology with regard to age, race and class. Some speakers argued that Virtual Reality could play a pivotal role in including individuals who, for various reasons, have never attended a concert or gone to see a theater play.
When asked about their dream technology, the first answer was an entanglement of physical and digital events through the use of 3D technologies. From a social perspective, the participants wished for a path on which the culture and business sectors could be united, creating an interdisciplinary forum without competition. To sum up, the speakers asked for a technology that can bring together different artists from around the globe on barrier-free platforms – with low latency and in a borderless space between viewers and performers, while the audience is immersed in interactive virtual spaces.
The different contributions and perspectives led to an interesting debate and to the exchange of important ideas between the developers of innovative technologies and the event providers who make use of, or in this case are in need of, these technologies. We are looking forward to the further development of this exciting project that brings together very different protagonists!
Want to stay updated? Sign up for the Virtual LiVe newsletter and stay tuned!
On 13 April 2021, numerous partners of the 3IT came together in the second Partner Meeting, which took place online. The partners of the 3IT met virtually and were able to brief each other about news and innovations as well as to decide upon future common projects and the direction of the 3IT.
The first Partner Meeting was inaugurated last year and has superseded the Steering Committee Meeting. The Steering Committee Meeting, chaired by Dr. Ralf Schäfer, consisted of the 3IT’s gold and platinum partners and was in charge of the 3IT’s strategic direction. The Partner Meeting, the 3IT’s new participative format, is open to all 3IT partners. It takes place up to four times a year and thus introduces a new approach to decision-making processes.
The 3IT offers its partners the unique possibility of collaborating in the field of immersive imaging technologies in a pre-competitive environment: as a virtual network, as a platform and as a venue. As part of the network, the 3IT partners enjoy multiple advantages: The 3IT serves both as a communication platform for providers, users and a broad audience and as a marketing instrument for advertising, sales and PR. Moreover, the 3IT provides its partners with a development platform and testbed for immersive imaging technologies, applications and infrastructures. Additionally, the partners are given a platform for knowledge exchange through workshops and conferences. Last but not least, the 3IT’s partners are supported in research projects on industry-relevant applications – with the possibility of technical advice and financial support from the Federal Ministry of Economic Affairs (BMWi).
Throughout the Partner Meeting, the partners had the chance to discuss their views and needs within the network in a very friendly and relaxed atmosphere. Naturally, the current pandemic situation has affected nearly all partners and equally the activities of the 3IT. Consequently, the 3IT has significantly expanded its PR work, its social media presence, and the participation and presentation of its partners at virtual events.
The 2nd Partner Meeting succeeded in sparking a lively discussion about new ideas for projects and workshops, just as the previous Partner Meeting did. All participating partners were inspired by the mutual exchange and left the meeting with numerous stimulating thoughts in mind.
We are very much looking forward to our next partner meeting, which will take place in October of this year! If you are interested in staying up to date, subscribe to our newsletter.
Not a partner yet, but interested in joining the next meeting? Feel free to write us a mail.
The aim of the project Virtual LiVe is to digitize and enhance classic event formats that require physical presence by using new immersive media technologies (e.g. 3D audio, 360° video, light fields, volumetric video). In the context of this new research project, the different needs of the culture and events industries with regard to virtual formats for live events will be elaborated in workshops. This will form the basis for revolutionizing the field of virtual live events as well as for expanding the experience of physical events.
The expertise of the three institutes Fraunhofer FOKUS, Fraunhofer HHI and Fraunhofer IIS in their specific core fields allows for a highly efficient and synergetic approach, which enables practical commercial applications and possibilities for SMEs. Virtual LiVe is funded within the framework of the KMU-akut program “Research for SMEs” of the Fraunhofer-Gesellschaft.
The goal of Virtual LiVe is to develop a platform in the form of a toolbox, which provides applicable and scalable high-end technological solutions for the different players’ event streaming needs. Thanks to such a toolbox, SMEs will be able to put together a customized program according to the requirements of their respective event. To be more precise, they will obtain a technology-based solution for their specific format’s realization. Virtual LiVe thus seeks to enhance the existing range of virtual live events with a focus on audiovisual immersion, but also including accessibility (payment barriers, hardware integration) as well as legal aspects.
The project will be introduced at a hybrid kick-off event at the facilities of the 3IT – Innovation Center for Immersive Imaging Technologies and the CINIQ Center on 26 May 2021. On this occasion, renowned players from the cultural and events industries will take part, presenting the status quo of digital event formats and specifying concrete requirements for technologies.
Are you interested in participating and shaping the future of digital event formats together with the project partners or in becoming a partner of the 3IT? Don’t hesitate to contact 3IT’s project manager, Maria Ott, here.
Stay tuned for more information on Virtual LiVe and subscribe for its exclusive newsletter here.