
The Future of 3D Virtual Studio Production and Live Streaming in Singapore

by William

With the rapid development of technology, it may well be possible 50 years from now to produce an entire film, actors and environments included, in a studio without ever going on location. Undoubtedly, there are endless possibilities for 3D virtual studio production in the future.

In 3D virtual studio production, actors or presenters are filmed in front of a blue or green screen, a technique the film industry has used for a long time. The difference now is that the actors are composited into a 3D environment, either in real time or for recorded material. Keying out the blue or green screen and creating the composite is done using a technique called chroma keying. In the 3D environment, virtual cameras can produce camera angles that would be impossible in a real studio, and the camera data can be saved and reused later to return to exactly the same position in the virtual environment.
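To make the chroma keying step concrete, here is a minimal sketch in Python using OpenCV and NumPy. It is illustrative only: the HSV green thresholds and the file names are assumptions that would be tuned to a particular studio's lighting, not values from any specific virtual set system.

```python
import cv2
import numpy as np

def chroma_key(foreground_bgr, background_bgr,
               lower_green=(35, 80, 80), upper_green=(85, 255, 255)):
    """Composite a green-screen foreground over a rendered background.

    Both images must share the same resolution; the HSV thresholds are
    illustrative starting points, tuned per studio lighting in practice.
    """
    hsv = cv2.cvtColor(foreground_bgr, cv2.COLOR_BGR2HSV)
    # Pixels that look like the green screen
    green_mask = cv2.inRange(hsv, np.array(lower_green), np.array(upper_green))
    # Soften the matte edge slightly to reduce fringing
    green_mask = cv2.GaussianBlur(green_mask, (5, 5), 0)
    alpha = (255 - green_mask).astype(np.float32) / 255.0  # 1.0 = keep presenter
    alpha = alpha[..., None]                                # broadcast over BGR
    composite = foreground_bgr * alpha + background_bgr * (1.0 - alpha)
    return composite.astype(np.uint8)

if __name__ == "__main__":
    fg = cv2.imread("presenter_greenscreen.png")   # hypothetical file names
    bg = cv2.imread("virtual_set_render.png")
    cv2.imwrite("composite.png", chroma_key(fg, bg))
```

In a live virtual studio the same keying and compositing runs on every frame, typically on the GPU, but the principle is identical.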

Although the underlying techniques were invented in the 1960s, the modern concept of 3D virtual studio production only became prevalent in the mid-1990s, as computers grew more powerful and 3D graphics technology matured. Thanks to these advances, 3D virtual studio production has become a cost-effective solution for professional filmmaking, video production, presentations, and even corporate and educational material. It offers a faster turnaround than conventional filmmaking, and it has become more affordable for professional productions as the costs of 3D graphic artists, animators, and 3D software have fallen compared with previous years.

Overview of 3D Virtual Studio Production

A three-dimensional virtual studio is an environment that is simulated and created with the help of computers. Virtual studio environments have many advantages over real studios. For example, virtual sets exist as 3D graphics files, so they can easily be shared, modified, and transported to other locations. Virtual sets also allow for computer graphics and effects, tools that are migrating from the real world to the virtual world at an increasing rate. This brings greater consistency between the different parts of a show, greater flexibility in the types of effects that can be put on air, and more cost-effective effects production. Computer-generated effects and graphics are becoming an increasingly important part of the branding of a TV show, and with increased globalization and therefore increased competition between shows, the ability to maintain strong, consistent branding across many types of media is crucial. A virtual studio can also be used as a tool to plan and storyboard a real studio set: a director can walk through the virtual set to decide camera angles before the physical set is constructed, which alone can save costs by preventing mistakes during the construction phase.

Importance of Live Streaming Production

3D virtual studio production can provide professional-looking studio sets for a fraction of the cost and time of building a physical studio, and live compositing allows shows across many genres to have the look and feel of a pre-produced program.

A 3D virtual studio can host a wide variety of shows and events. One of its biggest benefits is that it can be less expensive and less time-consuming than building a physical studio of the same design, which allows small independent live streaming channels to have a professional-looking studio set at an affordable price. High-end virtual set systems such as Orad and Immersia allow for live compositing of real presenters with a virtual set: presenters are shot against a green screen backdrop, and the live compositor keys out the green and places them in the virtual set in real time. These feeds can be combined with overlaid pre-recorded motion-captured characters, making it possible to produce a range of genres, from news to sports and talk shows. It is still a niche market, but it is a more efficient alternative to creating pre-produced television shows of the same genres.

Content delivery networks have made live streaming easily accessible to anyone with an internet connection, and growing high-bandwidth internet penetration worldwide has already seen more people turn to the internet as their source of news and entertainment.

Many companies know there is high demand for live streaming and are turning to it as a new marketing tool, but they often have little idea how to turn their ideas into production. The traditional approach, both in the field and in post-production, is not very efficient for live streaming. Ideally, they would like their live stream to have the look and feel of a pre-produced television show or documentary, at a cost below that of producing those traditional media. Many of these companies are now discovering that 3D animation technology, virtual studio sets, and live compositing are the solutions they are looking for.

Significance of Singapore as a Hub for Virtual Studio Production and Live Streaming

Singapore is already an internationally renowned financial hub, and the importance of virtual TV and live streaming production to the television and mass communication industry has set the stage for future developments in the country. With the rise of virtual TV in recent years, lower production costs and productions drawn from both local and international sources have brought more vibrant activity to the industry, creating a more varied and competitive market. Virtual TV gives companies a platform to produce content that showcases their products as infotainment, documentaries, or short films, and with the mass availability of online video, global demand for better-quality video and production services keeps increasing.

Live streaming of sports, business events, education, and entertainment has become a platform for mass communication in the information age, and the availability of such services brings more business and job opportunities to an ever-growing industry in Singapore. Virtual TV and live streaming production is an innovative step into the future for the Mediapolis development. With increasing demand for information, education, and entertainment through digital media, the setting up of a 3D media city will further strengthen Singapore’s status as a global city for the arts and digital entertainment. As live streaming services develop, consumers will find it easier to keep in close touch with current events and niche-specific content on a global scale. Using 3D virtual TV and live streaming services will become the norm for media producers reaching out to clients in both the government and private sectors, and the ability to deliver varied services through digital media will increase job opportunities and raise competitiveness in local and international markets.

Advancements in 3D Virtual Studio Production

Virtual studio and VR technologies are still developing rapidly. Work can be broadly classified into using technology for better content production or for more realistic content visualization, though there is interchange between the two areas. On the visualization side, the focus is on creating and enhancing realism in the virtual environment. I2R’s virtual newsroom appears very realistic as a result of extensive room scanning and novel lighting simulation techniques. Lighting is a critical factor in realism, and simplified lighting methods are being developed to achieve a desired level of realism without extensive effort. Posture and gesture are other fundamental aspects of communication that are often overlooked. The ultimate vision in this area is real-time, video-driven animation of realistic 3D avatars that react to and generate spoken natural language in a virtual environment while capturing the user’s intended mood and tone. Such technology has potential far beyond news presentation.

The earliest virtual studio systems were fairly primitive, with basic computer graphics (CG) overlaying video. More advanced is the notion of constructing the entire studio environment in CG and then using camera motion capture so that the movement of the real camera drives the virtual viewpoint. A key benefit is that the same video can be used to drive any number of different virtual cameras, providing more angles on the action than would be feasible in a real studio. Singapore’s Institute for Infocomm Research (I2R) has made significant progress on perception-based 3D video processing, which infers the 3D structure and appearance of a scene from 2D video. AI techniques are then used to automate processes, for example creating a simplified 3D environment for sports analysis or extracting the visual elements of a cockpit from 2D flight simulator footage. This takes the idea of automatic 2D-to-3D conversion a stage further, with potential far beyond the immediate application. Development is now moving into these latter areas with the formation of a Virtual Reality Applications Development Centre (VRADC).
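As a rough illustration of how tracked camera data can drive a virtual camera, the sketch below (Python/NumPy, with a hypothetical per-frame tracking record of position, pan, tilt, and lens field of view) builds the view and projection matrices a renderer would use to draw the virtual set from the real camera's point of view. The data format and axis conventions are assumptions for illustration, not those of any particular tracking system.

```python
import numpy as np

def rotation_pan_tilt(pan_deg, tilt_deg):
    """Rotation for a camera head: pan about the vertical (Y) axis,
    then tilt about the camera's horizontal (X) axis."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    pan = np.array([[np.cos(p), 0, np.sin(p)],
                    [0, 1, 0],
                    [-np.sin(p), 0, np.cos(p)]])
    tilt = np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t), np.cos(t)]])
    return tilt @ pan

def view_matrix(position, pan_deg, tilt_deg):
    """4x4 world-to-camera matrix from tracked camera position and angles."""
    R = rotation_pan_tilt(pan_deg, tilt_deg)
    V = np.eye(4)
    V[:3, :3] = R
    V[:3, 3] = -R @ np.asarray(position, dtype=float)
    return V

def projection_matrix(vfov_deg, aspect, near=0.1, far=500.0):
    """Perspective projection matching the real lens's field of view."""
    f = 1.0 / np.tan(np.radians(vfov_deg) / 2.0)
    return np.array([[f / aspect, 0, 0, 0],
                     [0, f, 0, 0],
                     [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
                     [0, 0, -1, 0]])

# One frame of (hypothetical) tracking data: the virtual renderer would apply
# these matrices so the CG set and the live video line up.
frame = {"position": [0.0, 1.6, 4.0], "pan": 12.5, "tilt": -3.0, "vfov": 42.0}
V = view_matrix(frame["position"], frame["pan"], frame["tilt"])
P = projection_matrix(frame["vfov"], aspect=16 / 9)
```

Saving these per-frame records is also what allows a shot to be revisited later from exactly the same virtual position, as noted earlier.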

Realistic Virtual Sets and Environments

New software now allows realistic 3D models and environments to be created cost-effectively. Previous technologies required time-consuming data entry with digitizers to build 3D models and environments, and the resulting data was often incomplete and lacking in detail. The new software uses photographs or video footage to create realistically textured 3D models. One example is the Maestro still store for the VizRT virtual set system, which uses texture-mapped polygon technology to import complex 3D objects created in external graphics applications. This can give TV producers the near impossible: bringing the outside inside. One such production was the Rugby World Cup 2015 studio, where ITV Sport recreated Brighton Pier in its virtual studio. With this software and technology, TV producers can take the audience to places or events that previously could only have been reached through on-location filming. The desire to capture the reality of a venue, site, or event has made virtual reconstruction a pivotal topic for 3D modelers, and taking the virtual setting to places that surpass reality in appeal has also come into the spotlight. Highly detailed, lifelike models have set a new standard for 3D virtual environments, one that will continue to be pushed higher.

Interactive Features and Immersive Experiences

The creation of 3D virtual studios has opened new doors for interactive features and immersive experiences. Raffles Design Institute has already introduced a 3D virtual newsroom for its broadcast journalism students and hopes to implement a full 3D virtual studio at a later date. The virtual newsroom allows broadcast students to explore and experience the real-life events they would be reporting on. The advantage is that students can pre-plan their shoot or story and simulate scenarios before a live production, which reduces time and effort during the production phase (Hossain, 2006). Before virtual studios, researching or visualizing a location required a great deal of time, effort, and money; in media such as television, film, or games, location shooting is a difficult and time-consuming task, and it is often not a practical option for students due to time, cost, and resource constraints. Simulating locations in a 3D environment changes all of this: it allows people to experience and interact with an environment or situation. This is especially useful for media producers, who can pre-visualise an event and plan production using simulations, saving time and cost across media production. Simulation can also be used in teaching and learning to educate students or workers: step-by-step simulations can teach new tasks, and assessment of those tasks can take place in a virtual environment without any risk (Ndong, Laurette, & Link, 2002).

Integration of Augmented Reality (AR) and Virtual Reality (VR) Technologies

The fusion of AR and VR technologies with virtual studio production and live streaming can make both more versatile and expansive. The ability of AR to overlay virtual elements onto a real environment is already used in sports broadcasts in the form of the first-down line. A more advanced example from entertainment is The Black Eyed Peas’ use of AR during their live performance at the 2011 Super Bowl halftime show. The group’s vocalist Fergie performed a rendition of “Sweet Child O’ Mine” on a real stage in front of a live stadium audience, while a giant robot counterpart of Fergie appeared to rise from beneath the stage, an AR element that only viewers watching at home could see. Such a concept could be appealing for narrative-style programming such as talk shows or sitcoms, where not every scene would need a purpose-built virtual set. The technology would also become more feasible for television production as consumer-level AR interfaces provide more accessible ways of creating AR elements; a simple example might be a news anchor working from home and using an AR interface to appear as if reporting from a studio set. By the time such technology is commonplace, the distinction between virtual and real studio production may become somewhat blurred. VR, for its part, would generally be an extension of what has been described above and would be more applicable to entertainment than information-based content because of its solitary, immersive nature. An example would be a recorded music performance: the viewer could put on a VR headset and be placed within the audience, or even on stage with the band.

Live Streaming Production Techniques

Multi-camera setups are used in traditional TV production to provide various viewing angles of a scene, capturing emotion and showing more detail of what is happening. This technique, however, is rarely used in current live streaming because it is more costly and complex to produce. One way to simplify it would be an AI director programmed to switch between cameras and automatically capture the best moments of an event. This concept has already been explored by companies such as Microsoft (HoloLens Academy Awards) and IBM (the Wimbledon Tennis Championships) using machine learning. It can be done by analyzing player movement data to determine points of interest, then using that information to control a virtual camera (attached to a spectator view) in a game to capture highlights. When implementing this, it is best done in a non-intrusive manner so that the game remains the main focus of the stream.
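As a hedged sketch of the AI-director idea, the Python below scores each virtual camera by its distance to a simple point of interest (the centroid of player positions) and only cuts when another camera is clearly better, so the output does not flick between shots every frame. The camera positions, player coordinates, and scoring rule are invented for illustration and are not taken from the systems mentioned above.

```python
import numpy as np

def centre_of_action(player_positions):
    """Use the centroid of player positions as a simple point of interest.
    A real system might track the ball or use learned saliency instead."""
    return np.mean(np.asarray(player_positions, dtype=float), axis=0)

def pick_camera(cameras, player_positions, current, hysteresis=0.8):
    """Return the index of the camera to cut to.

    cameras: list of (x, y, z) virtual camera positions (illustrative only).
    hysteresis < 1 means a new camera must be meaningfully closer to the
    action before we switch, so the broadcast does not cut on every frame.
    """
    poi = centre_of_action(player_positions)
    dists = [np.linalg.norm(np.asarray(cam, dtype=float) - poi) for cam in cameras]
    best = int(np.argmin(dists))
    if best != current and dists[best] < hysteresis * dists[current]:
        return best
    return current

# Toy example: three virtual cameras around a court and one frame of players.
cameras = [(0, 8, 20), (25, 6, 0), (-25, 6, 0)]
players = [(3, 0, 2), (5, 0, -1), (2, 0, 4)]
shot = pick_camera(cameras, players, current=0)
```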

With the arrival of 5G, we can only expect live-streamed media to improve further and demand to grow. Mediacorp’s chief technology officer, Suhaimi Sulong, envisions content being streamed in both 2D and 3D with multiple camera angles. Given that gaming communities are a huge audience for live-streamed media, we can foresee game developers using live streaming as a platform to showcase development progress and provide in-game developer commentary to improve engagement with fans. With this anticipated growth in mind, we explore live stream production techniques that can be used in a virtual studio environment to improve broadcast quality and meet the high expectations of future viewers.

Live streaming has become a popular channel of media consumption, supported by platforms such as Twitch, Microsoft Mixer, and YouTube Gaming. With the growing demand, it is no surprise to see TV broadcasting companies moving into live-streamed media. In a trend noted across Southeast Asia, Mediacorp recently live-streamed behind-the-scenes footage of the 2017 Star Awards on its Facebook page using a 360-degree camera, letting viewers experience it in VR. The now-popular Dota 2 e-sports competition, The International, has also gained traction with its live stream, which offers various ways to view in-game statistics directly on the video.

Multi-Camera Setups for Dynamic Broadcasting

The primary form of dynamic and effective broadcasting lies in multi-camera setups: using more than one camera and cutting instantaneously between them to obtain different shots and angles of the subject. In conventional television production, this takes the form of a vision mixer that receives multiple video feeds and selects between them. In a virtual studio and live streaming context, the same workflow can be emulated in a 3D environment, with multiple virtual cameras placed and controlled just as in a real multi-camera setup. The advantage is that any camera angle is obtainable. In a real studio, it may not be possible to get a camera into a particular position, or the view may be obstructed by scenery; in a 3D environment, say a sporting event in a stadium, a bird’s-eye view of the action can be achieved without a cable-suspended camera or even being in the stadium. With the rise of 3D television, the ability to produce such content is also growing, since it is not subject to the same restrictions as real broadcasting. The approach could also take the form of multiple studios in different locations simultaneously producing content for live streaming. An important recent development relevant to multi-camera setups is NDI (Network Device Interface) from NewTek, a method for sending video data over a network in real time. NewTek describes it as a paradigm shift equivalent to the move from tape to non-linear editing. The technology can be applied to virtual studio production, and in a multi-location scenario NDI allows video to be sent and received over a network with negligible loss in quality; for example, multiple feeds from an event venue could be sent to a central location for production and streaming.
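A software vision mixer of the kind described can be sketched very simply. The minimal Python class below holds a set of frame sources, which in practice might be NDI streams, capture cards, or virtual cameras rendered by a 3D engine, and returns whichever one is currently selected as the programme output. The source names and the callable interface are assumptions made for illustration.

```python
from typing import Callable, Dict, Optional
import numpy as np

Frame = np.ndarray  # e.g. an HxWx3 image from a camera or render engine

class VisionMixer:
    """Minimal programme-feed switcher for a multi-camera setup."""

    def __init__(self) -> None:
        self.sources: Dict[str, Callable[[], Frame]] = {}
        self.live: Optional[str] = None

    def add_source(self, name: str, grab_frame: Callable[[], Frame]) -> None:
        # grab_frame returns the latest frame from this input (an NDI stream,
        # a capture card, or a virtual camera rendered by the 3D engine).
        self.sources[name] = grab_frame
        if self.live is None:
            self.live = name

    def cut_to(self, name: str) -> None:
        if name not in self.sources:
            raise KeyError(f"unknown source: {name}")
        self.live = name

    def programme_frame(self) -> Frame:
        return self.sources[self.live]()

# Usage sketch with dummy sources standing in for real feeds.
mixer = VisionMixer()
mixer.add_source("wide", lambda: np.zeros((1080, 1920, 3), dtype=np.uint8))
mixer.add_source("birds_eye", lambda: np.zeros((1080, 1920, 3), dtype=np.uint8))
mixer.cut_to("birds_eye")
frame = mixer.programme_frame()
```

The same switching logic works whether the feeds are physical cameras in one studio or virtual cameras contributed from several locations over a network.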

Real-Time Graphics and Visual Effects

Interactive elements, as proposed by Mirukuru, can be seen as a type of visual effect, but they deserve their own category because they engage the audience and are actively used by the viewer, distinguishing them from passive visual effects. Mirukuru argues that in the future, audiences will want to be active participants in their viewing experience. As newer generations of viewers are increasingly computer literate and have been raised on video games, computer graphics, and the internet, there is a growing demand for more interactive and user-controlled programming, with greater interactivity both with the content and between viewers themselves. During a live basketball telecast, the ESPN Full Court interactive 3D system by Fakespace Systems enabled viewers to manipulate a virtual camera and watch the same game from a different perspective to the main broadcast, and this proved to be a popular feature. The same idea translates to a virtual studio environment in which an interview is mixed into a sports telecast: the interview is filmed in front of a green screen with the virtual set keyed in, and using a system like Full Court the viewer could switch between the live action and the interview, providing a more entertaining experience and increasing ratings for the network. This is just one simple application, and there is much potential for development in this field. By employing these techniques, content producers should be able to deliberately shape viewer interaction and provide a more fulfilling viewing experience all round.

Enhancing Audience Engagement through Interactive Elements

The world is moving towards integrating more and more interactive elements into day-to-day programming to increase audience engagement and keep viewers involved throughout the content. A study by Leung and Wei suggests that this concept can be extended to the development of interactive elements in TV content, and we contend that integrating such elements can increase audience engagement with the program. Interactive elements include those implemented in the traditional TV format to engage the audience in a specific program, as well as those used to foster a more integrated viewer/program experience. Traditional examples include televoting, SMS participation in surveys and games, and chat sessions, while non-traditional examples include interactive storylines and alternate reality games. Over the past two decades, many TV shows have tried to include interactive storylines using online platforms, but few succeeded because of the complexity of implementation at the time.

Singapore’s Role in the Virtual Studio Production and Live Streaming Industry

The digital media space is a relatively new venture for Singapore’s media industry, and there are still many unexplored opportunities that could yield a high return. Singapore needs to remain globally competitive in a rapidly changing media environment, anticipating and responding to new trends and exploiting new opportunities. With global trends moving towards digital and interactive media, this is a golden opportunity Singapore must take advantage of. The traditional audio-visual and film industry is already being pushed in new directions by digital content, and in the near future digital content may well become the primary platform for audio-visual material, a shift away from content aired on TV or shown in cinemas even though the audio-visual medium itself remains the same. The fusion of 3D digital content and TV is already evident, with programs like ReBoot and CG movies becoming more popular with children and families. This will be a great shift for the current media industry, and a virtual studio production industry could provide an alternative platform for local media content.

Singapore has now focused its attention on the interactive and digital media space, a largely untapped niche in the media production industry. This space includes various forms of immersive digital media that are beginning to change the way content is received, and it aligns with the government’s initiative of exploring new media as the sector continues to grow. 3D digital content is an attractive and lucrative industry, and Singapore aspires to be its regional leader. Singapore’s MDA has mapped out a 10-year blueprint that aims to transform Singapore into a global digital media capital. The direction described in this article is well aligned with that initiative, and its success would be a good testament to the level of competency and expertise Singapore has achieved. Step two of the national blueprint is to develop internationally competitive industry clusters; a well-entrenched virtual studio and live streaming industry could be fully integrated with the media industry and become a key driver of the future of media production in Singapore, providing an alternative cutting-edge platform for content and opening up new jobs and roles in the industry.

Singapore’s Technological Infrastructure and Support

There are three major aspects to technological infrastructure. First, Singapore has the technology and hardware to support virtual studio production, meaning the processing capability and storage capacity to handle large volumes of digital content. Graphics rendering is computationally heavy, and supporting a virtual production set requires considerable storage for large volumes of digital images and video; this is fast becoming a reality with increasingly powerful PCs and affordable storage devices. Second, Singapore boasts a highly efficient and pervasive broadband network, with internet penetration reaching over 70% of households. Known as the Next Generation Nationwide Broadband Network (NGNBN), it is an ultra-high-speed, pervasive, and intelligent network that will enhance and transform WiMAX and fixed-line broadband connections island-wide, which can only benefit real-time live streaming of events and content delivery to end users. Third is software development and system integration. Singapore has become increasingly involved in R&D and software development, gaining substantial expertise across vertical industries including gaming, animation, advertising, and media production. This includes the development of 3D graphics engines, virtual reality tools, content management systems, and integration software that links systems together for efficient content creation and delivery. As virtual studio technology takes the form of a new media platform, contestable R&D grants will open doors for software development in digital media, promising a future of highly integrated virtual studio systems that are cost-effective and easy to use.

Collaborations and Partnerships with Global Industry Players

To gain an edge and become a strong player in the virtual studio and live streaming industry, companies in Singapore have started forging collaborations and partnerships with international industry players in this field. By partnering with renowned global firms, local companies can take part in global projects and gain access to the latest technology and R&D. The recent partnership between *scape and US-based Ambient Performance to build a mixed-reality studio provides a platform for young talent in the media industry to venture into 3D production and gain valuable experience. Another example is the collaboration between Exploit Technologies, the strategic marketing and commercialisation arm of the Agency for Science, Technology and Research (A*STAR), Nanyang Technological University, and UK-based Oxford Metrics Group; their joint effort to develop a technology platform for gesture-driven, real-time character animation could put the local media industry at the forefront of the technology. Joining the move into 3D production is local visual effects company CosmoStudios, a co-founding member of the 3D Media Alliance (Singapore), which specialises in stereoscopic 3D production. Dr. Marc Coderre, Managing Director of CosmoStudios, believes that alliances such as this will “push the local industry to specialise and help in the sharing of 3D information and technology.” Involvement in global projects and technology transfer could elevate the standards and capabilities of the local media industry, and with rising demand for 3D content, the convergence of global efforts in 3D media will provide a huge potential market and a leading edge for media professionals in Singapore.

Government Initiatives and Incentives for Virtual Studio Production and Live Streaming

In Singapore, the government plays a huge role in developing and promoting emerging industries, and over the years many government bodies have launched initiatives and incentives to support the growth of the media industry. In an effort to position Singapore as a leading interactive digital media hub, the Media Development Authority (MDA) has launched various grants and schemes to develop talent, content, R&D, and infrastructure. Initiated in 2009, the MDA’s five-year, $230 million Interactive Digital Media R&D (IDM21) Programme aimed to grow Singapore’s IDM R&D capabilities and support the development of innovative IDM solutions and technologies, in line with Singapore’s vision of becoming a global digital media city by 2015. One key initiative under IDM21 is the development of a digital media R&D cluster at Fusionopolis, which aims to create a vibrant, collaborative R&D environment for the IDM industry by housing public and private research institutes, global MNCs, and local enterprises; their proximity will encourage public-private partnerships and collaboration on innovative digital media solutions. By 2013, the MDA had also launched the Public Service Media (PSM) Research & Development (R&D) Programme to build digital media R&D capabilities in the public service media sector and to develop and trial innovative solutions that enhance the quality and productivity of public service media content.
