Future Technology.
In 1992, the American sci-fi writer Neal Stephenson coined the term "metaverse" in his book Snow Crash, which depicts a dystopian future world where the rich escape into an alternative, connected 3D reality. In October 2021, Facebook CEO Mark Zuckerberg announced the company's vision of building an alternate virtual existence for individuals using augmented reality (AR) and virtual reality (VR) technologies. Adaptation is all about volume, creation, and origination. So, in the metaverse, how do we virtualise production?
How did it develop? Not overnight, and we are not there yet. The Zensar team of metaverse developers provides businesses with the support and tools to bring virtual content and experiences into the real world. Infosys has introduced the Infosys Metaverse Foundry to navigate the unusual confluence of technologies such as XR, DLT, 5G, AI, IoT, and more. The tool builders, software developers, world builders, artists, 3D modellers, game developers, users, and investors are the proverbial owners of the metaverse.
The journey began with tiny steps: the focus moved from analog to digital, then to worlds built virtually with CGI, and then to combinations of real-life action and virtual production. What is happening right now is work on the principles of creating the metaverse, which firms such as PwC and Gartner are taking up.
Analog was manual and slow. The product had to be shipped; there was a pre-shoot, a photo shoot, and post-production, and the output was analog. This could be used in print, social media, digital channels, and point of sale (POS). When there was a new product, or a change was needed, one had to reshoot, which involved a lot of cost.
Digital helped create a digital twin. There was a virtual product library, which helped create variations and customise for the customer. The output could go anywhere: a 360-degree turntable, print, social media, AR, digital platforms, TVC, and more. A new product or change needed only a new render, which involved less cost. A digital brewery, kitchen, travel setting, and much more could be created.
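The idea of a virtual product library can be sketched as a small data model: one digital twin, many rendered variations for many output channels. This is a minimal illustration only; the names `DigitalTwin` and `add_variant` are assumptions for the sketch, not a real rendering API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: one digital twin of a product, from which many
# variations (colourways x output channels) are generated as renders
# instead of reshoots.

@dataclass
class DigitalTwin:
    name: str
    base_colour: str
    variants: list = field(default_factory=list)

    def add_variant(self, colour: str, output: str) -> dict:
        """Record one variation (e.g. a colourway) for a given output channel."""
        variant = {"colour": colour, "output": output}
        self.variants.append(variant)
        return variant

# One shoot-free "reshoot": new colourways across several channels.
bottle = DigitalTwin(name="beer-bottle", base_colour="amber")
for colour in ["amber", "green", "matte-black"]:
    for channel in ["print", "social", "turntable-360"]:
        bottle.add_variant(colour, channel)

print(len(bottle.variants))  # 3 colours x 3 channels = 9 renders from one twin
```

The point of the structure is that the twin is created once; every new variation is a cheap entry in the library rather than a new physical shoot.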
CGI models can be used for adaptation. They are not just two-dimensional; they become digital assets. You can show a flower blooming, and the same asset can be reused throughout the year.
Things moved from a mix of digital and CGI to, eventually, a full CGI environment with CGI backplates.
Different locations can be picked for different products, including different people. Unreal Engine was used, and the output is so realistic and photoreal that it can be rendered automatically, in real time. Once the full environment is created, this gives the client the flexibility to get additional content and changes.
Virtual production has many uses. With a TV screen behind the set, wheels can be animated and vehicles can appear to move with ETS, using Unreal Engine and artists working in real time. It is done using a motion-control camera.
Today, Amazon has the biggest virtual production stage in America, in LA, for Amazon Prime Video shows, and they have come up with Stage 5.
Bricks and mortar - Lego - plastics.
- Sustainability is one of the biggest benefits of virtual production: it offsets carbon emissions.
- With Unreal Engine driving the dynamics, Expertise Travel Services drives become easier and faster.
- Correction and recreation are cheaper, easier, and faster.
- It comes at a lower cost and is quicker to deliver.
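The cost claim above can be made concrete with simple arithmetic. All figures below are made-up illustrative assumptions, not real production budgets; the sketch only shows why a re-render per change beats a reshoot per change.

```python
# Hypothetical cost comparison: analog reshoot vs. digital re-render.
# Every number here is an assumed, illustrative figure.
reshoot_cost = 20_000    # assumed: shipping + crew + studio per change
rerender_cost = 1_500    # assumed: artist time per change on a digital twin
changes_per_year = 10    # assumed: product updates needing new content

analog_total = reshoot_cost * changes_per_year
digital_total = rerender_cost * changes_per_year

print(analog_total, digital_total)        # 200000 15000
print(analog_total // digital_total)      # roughly 13x cheaper under these assumptions
```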