This post continues the series dedicated to generating synthetic visual data for AI training. Check out the previous article Generating Synthetic Data for Artificial Intelligence Training.
When generating data, it is essential to diversify it visually. Even with a single object – in this case, an armchair – we can create hundreds of photos, each of them unique. All we need to do is add a few randomizations to the rendering process, e.g., changing the wood and upholstery textures, rotating the object, or changing the camera angle.
We can apply any number of these and similar treatments, and each additional one multiplies the number of unique images. We can add varied backgrounds, lighting, color saturation, cropping, and many other randomizations, because both Unreal Engine and Omniverse offer a wide range of tools for managing objects in the scene and for post-processing effects.
In our experiment, we will use a simple scene with one object and only three randomizations:
Our armchair consists of two sections – wooden and upholstered. For each of them, we prepare an individual set of textures: five wood textures and 15 upholstery textures.
In Omniverse, to invoke the randomization, we write a script that references the relevant parts of the chair and the files we want to use as textures for those parts. For this purpose, we define the wooden part as “Section0” and the upholstered part as “Section1”:
Script:
import omni.replicator.core as rep

# Wooden part of the armchair
Section0 = rep.get.prim_at_path("/Root/BP_Chair/SM_Chair/Section0")

def randomize_Section0_textures():
    with Section0:
        rep.randomizer.texture(
            textures=[
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture1.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture2.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture3.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture4.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture5.jpg",
            ],
        )
    return Section0.node

rep.randomizer.register(randomize_Section0_textures)

# Upholstered part of the armchair
Section1 = rep.get.prim_at_path("/Root/BP_Chair/SM_Chair/Section1")

def randomize_Section1_textures():
    with Section1:
        rep.randomizer.texture(
            textures=[
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture1.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture2.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture3.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture4.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture5.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture6.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture7.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture8.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture9.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture10.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture11.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture12.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture13.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture14.jpg",
                "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures/Texture15.jpg",
            ],
        )
    return Section1.node

rep.randomizer.register(randomize_Section1_textures)

with rep.trigger.on_frame(rt_subframes=90):
    rep.randomizer.randomize_Section0_textures()
    rep.randomizer.randomize_Section1_textures()
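Since the texture files differ only by index, the two lists can also be built programmatically instead of being written out path by path. A minimal sketch, assuming the same Texture{N}.jpg naming convention used above:

```python
# Build the texture path lists from the shared naming convention
# (assumes the Texture{N}.jpg files from the script above).
BASE = "omniverse://localhost/Projects/Post_UE_Omni/Materials/Textures"

wood_textures = [f"{BASE}/Texture{i}.jpg" for i in range(1, 6)]    # 5 wood textures
fabric_textures = [f"{BASE}/Texture{i}.jpg" for i in range(1, 16)]  # 15 upholstery textures

print(len(wood_textures), len(fabric_textures))  # 5 15
```

These lists can then be passed directly as the `textures=` arguments, which keeps the script short when the texture sets grow.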
In the next step, we add chair rotation – more precisely, we randomize the angle by which the chair is rotated about the Z axis. Across the 360° range, we can see up to 360 unique whole-degree views:
Script:
import omni.replicator.core as rep

chair = rep.get.prim_at_path("/Root/BP_Chair/SM_Dining_Set_Chair2")

def rotate_chair():
    with chair:
        rep.modify.pose(rotation=rep.distribution.uniform((0, 0, 0), (0, 0, 360)))
    return chair.node

rep.randomizer.register(rotate_chair)

with rep.trigger.on_frame():
    rep.randomizer.rotate_chair()
Result:
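At this point it is easy to see how the randomizations multiply. Counting only the texture draws and whole-degree rotations described so far, the number of distinct combinations is:

```python
# Unique combinations from the randomizations so far:
# 5 wood textures x 15 upholstery textures x 360 whole-degree Z rotations.
wood, fabric, rotations = 5, 15, 360
combinations = wood * fabric * rotations
print(combinations)  # 27000
```

Every further randomization (camera, lighting, background) multiplies this count again.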
Finally, we add camera movement – a random deviation in the Y and Z axes. The range of deviations is adjusted so that at least a fragment of the armchair remains in the frame even at the extreme values. The effect is that the chair’s position within the frame varies, so it is not always centered, while we avoid situations where the chair falls out of the camera’s view. We did not use the option of tracking the object with the camera, because that would consistently place the chair in the center of the frame. In a real data generation scenario, we would use several cameras with different settings to increase the variety of shots. In the current experiment, we assumed a single camera, but configured it to provide the greatest possible variety of generated data.
Script:
import omni.replicator.core as rep

camera = rep.get.prim_at_path("/Root/Cam_1")

def rotate_camera():
    with camera:
        rep.modify.pose(rotation=rep.distribution.uniform((80, 0, -108), (98, 0, -72)))
    return camera.node

rep.randomizer.register(rotate_camera)

with rep.trigger.on_frame(rt_subframes=10):
    rep.randomizer.rotate_camera()
Result:
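Outside Omniverse, the per-axis uniform draw that `rep.distribution.uniform` performs can be sketched in plain Python, using the camera angle ranges from the script above. This makes it easy to verify that every sampled rotation stays within the chosen bounds:

```python
import random

# Sketch of a per-axis uniform draw over the camera angle ranges
# from the script above: (80, 0, -108) to (98, 0, -72).
low, high = (80, 0, -108), (98, 0, -72)
rotation = tuple(random.uniform(lo, hi) for lo, hi in zip(low, high))

# Each component lies within its own [low, high] interval.
assert all(lo <= r <= hi for r, lo, hi in zip(rotation, low, high))
print(rotation)
```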
We created an analogous scene with identical randomizations in Unreal Engine. In this case, instead of writing code, we used Blueprints, Unreal’s visual programming system: a graphical, very intuitive interface for building scripts from nodes that represent various functions and actions.
In the first step, we created a Blueprint for the armchair with three nodes: one for randomly drawing the wood texture, a second for drawing the upholstery texture, and a third for drawing the armchair’s rotation around the Z axis. The texture sets appear here as “variables”, grouped into two arrays: “Material_Wood” and “Material_Fabric”:
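The logic of those three Blueprint nodes can be expressed as a short Python sketch. The array names Material_Wood and Material_Fabric follow the post; the material values and the function name are hypothetical placeholders:

```python
import random

# Hypothetical sketch of the armchair Blueprint: draw one material from each
# array and a random Z rotation. Material values are placeholder strings.
Material_Wood = [f"Wood_{i}" for i in range(1, 6)]      # 5 wood materials
Material_Fabric = [f"Fabric_{i}" for i in range(1, 16)]  # 15 upholstery materials

def randomize_chair():
    wood = random.choice(Material_Wood)      # node 1: draw wood texture
    fabric = random.choice(Material_Fabric)  # node 2: draw upholstery texture
    z_rotation = random.uniform(0, 360)      # node 3: draw Z rotation
    return wood, fabric, z_rotation
```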
Result:
We created a second Blueprint for the camera with the same motion characteristics as we previously did for the camera in Omniverse:
Result:
As you can see, we can achieve the same effects in both environments, although using completely different methods.
Which tool we choose to generate synthetic data – Omniverse or Unreal – remains a matter of individual preference. While both environments are friendly to graphic designers and artists at the stage of arranging objects and building the scene, they clearly diverge at the stage of generating data and introducing randomization. Omniverse hands this task over to programmers, who can use Python to manipulate the data further, whereas Unreal allows artists without programming skills to manipulate objects conveniently using Blueprints. As for the final visual effect, we are satisfied that Omniverse matches Unreal in quality.