For any VR solution, ease of deployment is essential. tOG-VR meets this requirement with flexible options for workflow, creation and output.
The camera output is fed through the Live renderer and keyed internally, simplifying the chain. With Chroma, Matte and Segment keyers available in the tOG-VR software, optimum results are achieved even on the most demanding keying applications. Of course, if this is not enough, keying externally through a third-party keyer is always possible.
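In outline, a chroma keyer builds a per-pixel matte from how close each pixel's colour is to the key colour, then blends the foreground over the virtual background. The sketch below is illustrative only, assuming a simple RGB-distance matte with a soft tolerance band; it is not tOG-VR's actual keying algorithm, and the function and parameter names are ours.

```python
def chroma_matte(pixel, key=(0, 255, 0), inner=60.0, outer=140.0):
    """Per-pixel matte: 0.0 = fully keyed out (background), 1.0 = foreground.

    The RGB distance from the key colour is mapped through a soft
    inner/outer tolerance band, giving soft edges rather than a hard cut.
    (Illustrative sketch only -- not tOG-VR's actual keyer.)
    """
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    alpha = (dist - inner) / (outer - inner)
    return max(0.0, min(1.0, alpha))

def composite(fg, bg, alpha):
    """Blend a foreground pixel over the virtual background using the matte."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))
```

A pure green-screen pixel yields a matte of 0 and is replaced by the virtual set; colours far from the key pass through untouched.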
With the WebControl option, the virtual studio can be controlled from anywhere on the network through a Chrome browser: choosing media content, creating playlists and controlling playout. A single control point can manage not only multiple cameras in your virtual studio, but multiple studios as well.
Rapid loading time
Change the whole virtual set background seamlessly in seconds. Cut to Camera B, load a new set in Camera B, and cut back to a completely different set.
tOG-VR enables rendering of foreground graphics such as overlays, images and text/tickers from different static cameras, with all the power of the Live overlay system, at the same time as rendering the Virtual Background. This cuts down on complexity: there is no need for a separate overlay system. It also means the same control method used for the Virtual Studio (e.g. WebControl) can control foreground graphics too.
.fbx Import: tOG-VR supports full import of 3D sets or graphics created in popular third-party applications such as 3DS Max, Maya and Cinema 4D as .fbx files. As elements such as mesh, materials, animations, lighting and scene hierarchy are imported with the .fbx files, most of the design process for tOG-VR can be completed without specific training.
Gives presenters a feedback monitor to allow them to interact closely with virtual graphics.
No Tracking – It is, of course, possible to use VR with static camera(s). For this you would just need a regular 3d-Live licence. We even throw in the Lens Calibration and GAP tools for free!
No tracking, but with Billboarding – At first glance, this stunning technique would leave you convinced there is a lot of expensive camera tracking equipment in use. But the reality is it’s just a static locked-off camera with talent in front of a blue screen. All the clever stuff is handled in the Live render engine. Download our Guide to VR to find out how.
Encoded PTZ Heads – These robust and relatively low-cost items are the backbone of quality VR, especially for virtual graphics at sporting events, with Vinten offering a wide range of products available around the world.
Cranes and Jibs – VR really comes together when the camera starts to track in 3D space, with Stype and ncam delivering good solutions if this is the effect you need.
Full Freedom of Movement – The utopia of VR, but at a cost of course. We work closely with Motion Analysis, having a full rig in our demo and test studio. We particularly like the accuracy, meaning the most demanding AR graphics are rock solid and exactly where they should be. That said, there are options, including the Mo-Sys Star Tracker system as used by the BBC in the 2014 Scottish Referendum.
Creation of custom shaders that can be dynamic and controllable via the user interface
Ability to apply full-screen shader effects or individual geometry effects, such as bump map, stretch, warp, edge detect and outline glows.
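To give a feel for what a full-screen effect like edge detect does, the sketch below runs a classic Sobel edge-magnitude pass over a small greyscale image. It is a CPU illustration only, assuming a list-of-rows image; real shaders of this kind run per pixel on the GPU, and this is not tOG-VR shader code.

```python
def sobel_edge(img):
    """Minimal edge-detect pass (Sobel gradient magnitude) over a greyscale
    image given as a list of rows of intensities. Border pixels are left 0.
    (Illustrative only -- a real shader runs this per pixel on the GPU.)
    """
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Flat regions come out as zero while intensity boundaries produce strong responses, which is exactly the outline information an outline-glow effect builds on.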
As the camera lens focus changes, the changes are reflected in the Virtual Studio. For example, if the camera focuses on a real object in the foreground, the Virtual Studio background is rendered out of focus to match. This depth-of-field effect adds to the realism of the Virtual Set.
Real time depth of field
Changes in depth of field are processed in real time.
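The amount of blur at a given depth follows from standard thin-lens optics: the circle-of-confusion diameter grows as a subject moves away from the focus distance. The sketch below uses that textbook formula; the function and parameter names are ours for illustration, not tOG-VR API names.

```python
def circle_of_confusion(f_mm, n_stop, focus_mm, subject_mm):
    """Thin-lens circle-of-confusion diameter (mm) for a subject at
    subject_mm when the lens is focused at focus_mm.

    A larger value means the renderer should blur the virtual set more
    at that depth. (Standard optics formula; names are illustrative.)
    """
    aperture = f_mm / n_stop  # entrance-pupil diameter from the f-number
    return aperture * (abs(subject_mm - focus_mm) / subject_mm) \
                    * (f_mm / (focus_mm - f_mm))
```

A subject exactly at the focus distance gives zero blur, and the blur diameter increases the further the virtual background sits behind (or in front of) the focus plane.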
High Quality Lighting effects
Per-pixel, Phong-based lighting effects.
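The Phong model named above combines ambient, diffuse and specular terms at each pixel. A minimal sketch of that standard formula, assuming unit direction vectors and illustrative coefficient names (not tOG-VR's shading code):

```python
def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=32):
    """Classic Phong intensity at one pixel: ambient + diffuse + specular.

    All direction vectors are unit 3-tuples pointing away from the surface.
    (Textbook model only -- coefficients and names are illustrative.)
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    d = dot(normal, light_dir)
    diffuse = max(0.0, d)
    # Reflect the light direction about the normal: R = 2(N.L)N - L
    r = tuple(2 * d * n - l for n, l in zip(normal, light_dir))
    specular = max(0.0, dot(r, view_dir)) ** shininess
    return ka + kd * diffuse + ks * specular
```

Evaluating this per pixel, rather than per vertex, is what keeps highlights and shading smooth across large virtual-set surfaces.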
Able to run a calibration process for each studio camera in order to deal with lens distortions.
Handling CCD, lens and barrel distortion.
A process allowing the capture of each individual lens distortion characteristics.
This is vital for good VR, as each lens is individual. Without such a process, VR graphics will drift across the lens's FOV.
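Radial (barrel/pincushion) distortion of this kind is commonly described by the Brown model, where calibration estimates per-lens coefficients. A minimal sketch of that generic model in normalised image coordinates, assuming two radial coefficients; this is not tOG-VR's internal calibration format.

```python
def distort_radial(x, y, k1, k2):
    """Apply the Brown radial-distortion model in normalised image
    coordinates: distorted = undistorted * (1 + k1*r^2 + k2*r^4).

    A per-lens calibration estimates k1 and k2, so virtual graphics stay
    locked to the real scene across the lens's FOV. (Generic model;
    illustrative only.)
    """
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

With a negative k1 (barrel distortion, typical of wide lenses), points near the edge of frame are pulled towards the centre, which is exactly the warp the renderer must replicate so graphics do not drift.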
For VR to work, the renderers must know the exact X,Y,Z position of each real camera.
The GAP process provides this, and as it is a five-minute procedure, users can be confident that if cameras are de-rigged or moved to another position, their positions can quickly be redefined.
RT Software only qualifies hardware that is proven to meet the exacting demands of live broadcast for reliability, durability and performance.
Desktop/side and rack mount systems are offered from:
Laptops are offered from:
RT Software’s solutions harness the power of Quadro GPUs made by nVidia, the world leader in visual computing technology.
To ensure the quality of RT Software’s rendered graphics is maintained all the way to video output, our solutions use the highest quality video input and output cards from DVS and nVidia.
Click here for the qualified hardware specification.