3IT’s longtime partner Rohde & Schwarz offers technological solutions for 5G broadcast! All details are summed up in a free eBook!
5G promises to open up new and original technological and business opportunities. In addition, 5G brings new broadcast and multicast capabilities to the whole ecosystem, enabling further applications and granular business potential.
With 5G broadcast, network operators have an exciting opportunity to make their infrastructure more dynamic and intelligent, delivering a better quality of service and, as a result, a higher quality of experience.
With the lower latency and higher flexibility that 5G broadcast/multicast offers, the consumer experience can be improved with more real-time apps. 5G broadcast gives network operators a simple way to use network resources more efficiently to create new business opportunities and reach a wider audience. It opens up a whole new world of possibilities.
It not only delivers media and entertainment to smartphones but also reaches smart vehicles with OTA updates, in-car media and entertainment, and map updates. Many application segments can be implemented efficiently using 5G multicast, either in downlink-only mode or in combination with an uplink channel, using the same infrastructure.
Live event multicasting benefits particularly from this feature, and 5G broadcast can also transmit public safety multicasts, such as urgent weather and community information, simplifying communication between community members and public authorities.
Several other services could be optimized with multicast over 5G, including OTA multicast for centralized configuration and control, live commerce and rural e-learning where no internet connection is available or possible. In addition, 5G multicast enables venue casting, where consumers can combine the live experience with home comfort.
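Why multicast uses network resources so much more efficiently than conventional streaming can be shown with a simple back-of-the-envelope calculation. The sketch below is purely illustrative (the bitrate and viewer figures are assumptions, not taken from the eBook): unicast delivery scales linearly with the audience, while a broadcast/multicast transmission is a single copy regardless of how many viewers tune in.

```python
def unicast_bandwidth(bitrate_mbps: float, viewers: int) -> float:
    """Unicast delivers a separate copy of the stream to every viewer,
    so the total load grows linearly with the audience."""
    return bitrate_mbps * viewers


def broadcast_bandwidth(bitrate_mbps: float, viewers: int) -> float:
    """Broadcast/multicast transmits a single copy over the air,
    independent of how many devices receive it."""
    return bitrate_mbps


# Illustrative assumption: a 5 Mbit/s live stream, 10,000 viewers in one area.
stream_mbps = 5.0
viewers = 10_000
print(unicast_bandwidth(stream_mbps, viewers))    # 50000.0 Mbit/s in total
print(broadcast_bandwidth(stream_mbps, viewers))  # 5.0 Mbit/s in total
```

The gap between the two numbers is exactly the efficiency argument made above: the larger the audience for the same content, the more a network operator gains from delivering it as a broadcast.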
Rohde & Schwarz has prepared an eBook which gives an overview of the 5G Broadcast/Multicast technology and the benefits it has to offer. It highlights challenges and solutions for broadcast network operators, mobile network operators and content providers in a world of ever-growing media consumption.
Watch a teaser video about the 5G broadcast/multicast eBook here.
The goal of the project is to develop a Mixed Reality application that is suitable for everyday use within the context of knowledge transfer. The idea is to create a novel and more efficient way of direct, interactive and user-oriented teaching via photo- and audiorealistic avatars of real lecturers for remote learning.
Since the beginning of the pandemic in 2020, online teaching has taken on a completely new meaning and relevance. At the same time, the weaknesses and limitations of the technological tools hitherto used in education have been exposed: traditional online teaching, where content is delivered via static video and audio scripts, is not sufficient to provide an interactive and engaging learning experience for users.
This is the point where VoluProf comes into play: the project partners want to create a space of future learning in which traditional lecturing and online learning environments are combined in a way that lets lecturers be experienced not only on video but also virtually and physically. In other words, the user participates in a lesson with a virtual professor by means of Augmented Reality glasses, for example in their living room, while also being able to interact with them via eye contact and verbal communication.
Which criteria will play a role in the project realization?
Generating a photorealistic and animatable volumetric representation of the lecturer.
Location-independent applicability through transmission via 5G networks and direct integration into mixed reality devices on the market, as well as subjective optimization of image quality at minimal bit rate.
Natural and fast synthesis of the lecturer’s original voice.
Answering ethical questions about the implications of the new way of knowledge transfer.
Media reception and effects research for the analysis and optimization of user acceptance.
Educational research on the optimal and user-adaptive use of the new way of knowledge transfer.
VoluProf strives for a ubiquitous and democratic use of technology, enabling accessibility for everyone to photo- and audiorealistic avatars of lecturers in particular by using 5G networks and advanced streaming technologies. By creating an immersive knowledge space, an improved learning experience is made possible, in which the limits of human-to-human interactions in digital learning are transcended.
About the project partners
The project partners’ complementary in-depth expertise, ranging from language technology software (Aristech GmbH), 5G networks (Deutsche Telekom AG), artificial intelligence and educational technology (DFKI), video, image coding and processing of volumetric data (Fraunhofer HHI) and empirical user research (University of Rostock) to production and post-production of volumetric video data (Volucap GmbH), guarantees a very interesting collaboration with exciting outcomes.
Are you interested in the future of Virtual and Mixed Reality Learning? Or are you interested in becoming a partner of the 3IT? Do not hesitate to contact the 3IT’s team here.
From 25 to 27 July 2021, the “6th ITG/VDE Graduate Summer School on Video Coding and Processing” took place at CINIQ Center and 3IT, after being postponed three times. The event, originally scheduled for October 2020, was organized by the ITG Technical Committee MT 2 “Image Communication and Processing” (spokesperson: Prof. André Kaup) and offered participants the opportunity to exchange ideas on current topics in the fields of video coding, image processing and multimedia communication. Thanks to an elaborate hygiene concept, the event could take place on site with 32 participants. As the Summer School was offered in a hybrid format, up to 20 international participants were able to join at the same time.
The three-day event opened with guided tours at the 3IT, at the CINIQ Center and in the TiME-Lab and was complemented by a get-together where the participants had the opportunity to present their research topics and network with each other.
The second day (Session Day 1) included a keynote speech from Jonathan Pfaff (Fraunhofer HHI) on machine learning methods for video compression. Altogether, there were three sessions with a total of 13 presentations covering topics from intra-coding and machine learning methods to 3D video and various topics such as multispectral image sensors, effects of video streaming on climate change, and implementation of video coding methods. In the evening, the participants were taken on a boat tour to further exchange ideas and get to know each other even better over dinner with a view of the government district, Museum Island and the East Side Gallery.
On the third and last day (Session Day 2), there were two more sessions with seven presentations on 3D video and interframe coding. At the end of all presentations, the participants of SVCP voted for the two best presentations. The winners of the “Best Presentation Award” were Dominik Mehlem and Jens Schneider, both from RWTH Aachen University. Furthermore, thanks to a sponsorship, participants have the opportunity to win the “Joint Research Incubator Award”, which will be presented at next year’s Summer School from 3 to 5 July 2022.
The SVCP ended for the organizers and participants with a joint lunch at the 3IT.
All information on SVCP 2021 and its presentations can be found here.
The event, moderated by AC Coppens from THE CATALYSTS, aimed to identify the needs of the culture and events sectors regarding live events. Due to the pandemic, most cultural, business and other events have been taking place online or in hybrid formats. In this context, the kick-off event served as a platform for players from different industries to come together, discuss challenges and exchange ideas on the topic of live events.
Virtual LiVe originated from and is financed by the KMU-akut program “Research for SMEs” of the Fraunhofer-Gesellschaft, with the objective of fostering the innovative strength of institutions and SMEs from the cultural and events sectors by providing high-end technology solutions and innovations. However, the project does not intend to supersede or replace physical live events. Rather, it aims to supplement and extend live events by combining the virtual with the physical realm, as presented by Angela Raguse from Fraunhofer IIS and the Business Area Digital Media during the introduction of the project.
According to the renowned keynote speakers (Kavaye Ozong, ARTE; Harmke Heezen, High Road Stories; Oliver Ihrens, Radar Media and Thomas Bedenk, Freelancer), different small and large businesses as well as cultural institutions suddenly faced similar problems with live events due to COVID-19. This has led to an intersectoral surge in demand for combining Virtual Reality with real live events in hybrid formats.
Subsequently, the technological possibilities of Virtual LiVe and their combination within the framework of the project were presented. Siegfried Foessel from Fraunhofer IIS, Stephan Steglich from Fraunhofer FOKUS and Christian Weißig from Fraunhofer HHI talked about the spectrum of their respective solutions that can contribute to the development of a practicable toolbox of technologies.
All panel speakers described the difficulties and challenges they had been facing since the beginning of the COVID-19 situation, as events, concerts and plays had been cancelled. None of the participants had comprehensive prior knowledge of transferring an analog event into the digital sphere. In particular, implementing new technologies at both cultural and business events often presented an obstacle, given the legal and administrative situation in Germany as well as logistics. What’s more, latency remained a predominant issue.
However, all speakers were united by a common interest in technological innovation, the implementation of Virtual and Mixed Reality as well as hybrid event formats. While according to some speakers, a virtual or digital event would never be able to replace a real event, the majority of the speakers acknowledged the innovative potential of immersion, even for speculative fiction and new ideas. Hybrid event formats and the combination of virtual and analog spaces, it was argued, could in turn promote the emergence of innovative opportunities.
On another note, a lively debate arose concerning the accessibility of technology with regard to age, race and class. Some speakers argued that Virtual Reality could play a pivotal role in including individuals who, for various reasons, have never attended a concert or seen a theater play.
Asked about their dream technology, the speakers’ first answer was an entanglement of physical and digital events through the use of 3D technologies. From the social perspective, the participants wished for a path where the culture and business sectors could be united, thus creating an interdisciplinary forum without competition. To sum up, the speakers asked for a technology that can bring together different artists from around the globe on platforms without barriers – with low latency and in a borderless space between viewers and performers, while the audience gets immersed in interactive virtual spaces.
The different contributions and perspectives led to an interesting debate and an exchange of important ideas between the developers of innovative technologies and the event providers who use – or, in this case, need – these technologies. We are looking forward to the further development of this exciting project, which brings together very different protagonists!
Want to keep updated? Sign up for the Virtual LiVe Newsletter and stay tuned!
On 13 April 2021, numerous partners of the 3IT came together in the second Partner Meeting, which took place online. The partners of the 3IT met virtually and were able to brief each other about news and innovations as well as to decide upon future common projects and the direction of the 3IT.
The first Partner Meeting was inaugurated last year and has superseded the Steering Committee Meeting. The Steering Committee Meeting, chaired by Dr. Ralf Schäfer, consisted of 3IT’s gold and platinum partners and was in charge of the 3IT’s strategic direction. The Partner Meeting, the 3IT’s new participative format, is open to all 3IT partners. It takes place up to four times a year and thus introduces a new approach to decision-making processes.
The 3IT offers its partners the unique possibility to collaborate in the field of immersive imaging technologies in a pre-competitive environment: as a virtual network, as a platform and as a venue. As part of the network, the 3IT partners enjoy multiple advantages: The 3IT serves both as a communication platform for providers, users and a broad audience and as a marketing instrument for advertising, sales and PR. Moreover, the 3IT provides a development platform and testbed for immersive imaging technologies, applications and infrastructures for its partners. Additionally, the partners are given a platform for knowledge exchange through workshops and conferences. Last but not least, 3IT’s partners are supported in research projects on industry-relevant applications – with a possibility for technical advice and financial support from the Federal Ministry of Economic Affairs (BMWi).
Throughout the Partner Meeting, the partners had the chance to discuss their views and needs within the network in a very friendly and relaxed atmosphere. Obviously, the current pandemic situation has affected nearly all partners, as well as the activities of the 3IT. Consequently, the 3IT has significantly increased its PR work, its social media presence, and the participation and presentation of its partners at virtual events.
The 2nd Partner Meeting succeeded in sparking a lively discussion about new ideas for projects and workshops, just as the last Partner Meeting did. All participating partners were inspired by the mutual exchange and left the meeting with numerous stimulating thoughts in mind.
We are very much looking forward to our next partner meeting, which will take place in October of this year! If you are interested in staying up to date, subscribe to our newsletter.
Not a partner yet, but interested in joining the next meeting? Feel free to write us a mail.
The aim of the project Virtual LiVe is to digitize and enhance classic event formats that require physical presence, using new immersive media technologies (e.g. 3D audio, 360° video, light fields, volumetric video). In the context of the new research project, the different needs of the culture and events industries regarding virtual formats for live events will be elaborated in workshops. This will form the basis for revolutionizing the field of virtual live events as well as expanding the experience of physical events.
The expertise of the three institutes Fraunhofer FOKUS, Fraunhofer HHI and Fraunhofer IIS in their specific core fields allows for a highly efficient and synergetic approach, which enables practical commercial applications and possibilities for SMEs. Virtual LiVe is funded within the framework of the ‘KMU-akut’ program “Research for SMEs” of the Fraunhofer-Gesellschaft.
The goal of Virtual LiVe is to develop a platform in the form of a toolbox, which provides applicable and scalable high-end technological solutions for the different players’ event streaming needs. Thanks to such a toolbox, SMEs will be able to put together a customized program according to the requirements of their respective event. To be more precise, they will obtain a technology-based solution for their specific format’s realization. Virtual LiVe thus seeks to enhance the existing range of virtual live events with a focus on audiovisual immersion, but also including accessibility (payment barriers, hardware integration) as well as legal aspects.
The project will be introduced through a hybrid kick-off event at the facilities of the 3IT – Innovation Center for Immersive Technologies and CINIQ Center on 26 May 2021. On this occasion, renowned players from the cultural and event industries will also take part and will present both the status quo of digital event formats and specify concrete requirements for technologies.
Are you interested in participating and shaping the future of digital event formats together with the project partners or in becoming a partner of the 3IT? Don’t hesitate to contact 3IT’s project manager, Maria Ott, here.
Stay tuned for more information on Virtual LiVe and subscribe for its exclusive newsletter here.
The scalable, mirror-based multi-camera system OmniCam-360 is a Fraunhofer HHI technology that allows the recording of live video in 360-degree panoramic format as well as 3D video and audio content. The videos it generates are ideal for immersive applications!
Due to the Corona pandemic, the opening project could not take place live at the Haus der Berliner Festspiele. However, with the OmniCam-360, a unique and immersive visual experience was created nonetheless.
The project “Environment” explored the special pandemic situation and was created accordingly: The musicians spent 84 days together in quarantine and created a show that the audience was able to listen to live from their homes.
To find out how the OmniCam-360 works, contact us and schedule a meeting or a guided tour in the 3IT Showroom in Berlin!
On 23 March 2021, 3IT project manager Maria Ott presented our partner network at this year’s Innovatour Congress. The congress was dedicated to everything surrounding the latest innovations from the fields of technology, mobility, life science, fintech and insurtech, with the goal of bringing together smaller and larger companies and players working in tech. From 23 to 26 March, the participants came together from diverse areas of the tech industry, ranging from young startups and innovation hubs to large companies and research institutions.
On the first of four congress days, everything related to “Consumer Electronics” was the topic of discussion. Maria Ott gave a short insight into the Fraunhofer Heinrich Hertz Institute and its research fields and then focused on introducing the 3IT with its showroom, partners, projects, aims and the displayed cutting-edge demonstrators in the 3IT facilities.
In the presentation of projects and technologies at the 3IT, Maria Ott spoke about 360-degree panorama video and audio production, the most recent 3IT project TeleSTAR, which focuses on AR-based 3D live broadcasting from the OR, and the streaming of interactive volumetric video productions, and gave an outlook on future projects.
In addition to the 3IT and other companies, 3IT partners such as Sony and Fraunhofer IIS attended and presented their projects as well. Innovatour’s goal of bringing together global technology companies and innovative startups, thus providing the opportunity to network and present interesting technologies, was certainly achieved! Many questions were asked and answered, and new connections were made. The 3IT team is happy to have been able to join this exciting new event format!
For today’s Holocaust Remembrance Day, the 3IT would like to recall a special project of its partners Fraunhofer HHI and Volucap: the first volumetric interview with a contemporary witness of the Holocaust, filmed in 2019 in the Volucap studio in Potsdam-Babelsberg and titled “ERNST GRUBE – THE LEGACY”.
The short film was realized as a cooperation of UFA, under the direction of UFA Technology head and producer Ernst Feiler and UFA SHOW & FACTUAL producer Philipp Grieß, together with Dr. Oliver Schreer, Head of the Immersive Media & Communication group at the Vision & Imaging Technologies department of Fraunhofer Heinrich Hertz Institute.
As the last survivors of the horrors of National Socialism are aging, it becomes more and more important to find ways to preserve their voices and keep them alive for subsequent generations. Holocaust survivor Ernst Grube’s account was recorded with the help of volumetric video and its underlying ‘3D Human Body Reconstruction’ technology, developed by Fraunhofer HHI. In the volumetric studio of Volucap GmbH, which has made a name for itself, Ernst Grube narrates his personal experience in Nazi Germany and his imprisonment in the Theresienstadt concentration camp. Using 16 camera pairs and substantial computing power, a lifelike three-dimensional depiction of Ernst Grube’s testimony was created, allowing for direct integration into a virtual world.
The portrayal of Ernst Grube not only preserves a singular voice, it also approaches this weighty topic as a virtual reality experience. This is particularly relevant for reaching a younger audience, to whom the Nazi atrocities and the Holocaust might seem distant. The short film “Ernst Grube – The Legacy” is thus a pivotal contribution to a vivid German culture of remembrance.
The intersection of cutting-edge technology and socially relevant topics further proves the relevance of immersive imaging technologies and calls for new perspectives on how these technologies can contribute to society. Find out more about “Ernst Grube – The Legacy” in this interesting talk.
Berlin, 15 November 2020 – For the second time, 3IT’s partner project TeleSTAR successfully transmitted the surgery of a cochlear implantation via an augmented reality based 3D live stream from the ENT clinic of Berlin’s renowned university hospital Charité. This time, the stream was broadcast to four lecture halls across Europe, allowing 45 viewers to witness the surgery live. Students from the ENT department of Erasmus MC in Rotterdam, one of the most prestigious research hospitals in Europe, as well as from the living lab of trauma surgery at the university hospital of Ludwig Maximilian University in Munich, were able to follow the surgery live for teaching purposes with a latency of 800 ms. Moreover, other surgeons from the field of visceral and transplant surgery as well as medical master students could observe the surgery in 3IT’s 3D cinema, equally with a latency of 800 ms. The additional broadcast to Charité’s intranet gave students the advantage of observing the surgery live with a latency of a mere 20 ms. In addition, 67 viewers followed the 2D YouTube stream, which had a latency of 8 s.
This time, an improved audio concept helped to relieve the operating senior surgeon, Dr. Florian Uecker. Once again, the 3D stream was supported by synchronous audio commentary and an intraoperative annotation mode using AR. Thus, every single step was accompanied by explanations, making it more comprehensible for the students. By means of the intraoperative annotation mode, surgeons were able to append visual information to the live stream in the form of sketches, references and anatomy measurements. Moreover, the students had the opportunity to ask the second assistant surgeon, Dr. Sophia Häusler, questions from the various remote lecture halls.
As the OR is a time- and resource-constrained environment, live surgeries already enjoy a long tradition within medical education. However, the integration of innovative AR approaches facilitates groundbreaking new teaching methods. The live transfer did not only enable a larger audience to observe the surgery live. In times of COVID-19, this new teaching format becomes pivotal, making high-quality distance learning possible without putting anybody at unnecessary risk.
The insertion of a cochlear implant is a standard operation in ENT surgery to treat deafness. It is well suited for demonstrating and testing AR technology, as the surgery can be performed within a short time. The fully digital surgical microscope “ARRISCOPE” from the company Munich Surgical Imaging (formerly ARRI Medical) was used for the visualizations, as it enables high-resolution 3D images, radiation-free, image-based measurement of the anatomy and the acquisition of multispectral image data.
TeleSTAR is funded by EIT Health. EIT Health is supported by the EIT, a body of the European Union.
Find all information on our 3IT project TeleSTAR here.
Find the press release on the 3D live surgery here.
Dr. R.M. (Mick) Metselaar, ENT surgeon at Erasmus MC: “I’m definitely hoping for a follow-up.”
Prof. Dr. Reiner Kunz, visceral surgeon, Berlin-Tempelhof: “Congratulations on this absolutely impressive demonstration of the Arriscope System on the occasion of cochlear implantation. In my opinion, extremely good image, excellent 3-dimensionality and the additional attributes such as ROI markings, tissue recognition etc. were demonstrated very effectively and in my opinion already very well suited for routine operation.”
Prof. Dr. John v.d. Dobbelsteen, faculty of Biomedical Engineering at TU Delft: “Students and clinical trainees need to experience the full spectrum of modern medical technology. TeleSTAR offers surgical trainees and medical technology developers this important exposure.”
N.N., Biomedical Engineer, TU Delft: “Amazing experience to observe a surgery like this. I found it very useful to be able to ask the surgeon exactly where we should be focusing on, this was only further helped by the ability for the surgeons to point out exact locations using the annotation mode.”
Dr. med. Florian C. Uecker, leading senior physician, ENT clinic at Charité Berlin, Virchow-Campus: “This technology provides an unparalleled quality of remote teaching and communication, which is particularly valuable given the current COVID-19 pandemic.”
Dr. med. Steffen Dommerich, leading senior physician, ENT clinic at Charité, Campus Mitte: “A surgical broadcast with all this additional information makes the whole thing appear much more descriptive and increases the learning experience.”
Dr. med. Philipp Arens, ENT senior physician, ENT clinic at Charité, Campus Mitte: “A fascinating system, which gives a deep insight into the patient that is almost the same as if you were operating yourself.”
Dr. PM Ute Morgenstern, Biomedical Engineer, TU Dresden: “This is really a very good way to provide a realistic environment for medical students far away from the surgical location, especially if the analogue training possibilities are now limited due to Corona. Above all – what interests us as biomedical engineers – the possibility to convey technical information about the medical environment in such a way that engineering students or practitioners can see exactly the interface between their research/development and the clinical user and can therefore approach their own work with more knowledge, motivation and criticism.”