segmentation maps are bad (disconnected) #1160

Open · mustachemo opened this issue Nov 20, 2024 · 0 comments
Labels: question (Question, not yet a bug ;))

Describe the issue

I'm using BlenderProc to build a dataset for a semantic segmentation task. I have to render the objects at different distances, so most of the time they appear very small in the image. The resulting segmentation maps come out disconnected (a single object is split into several separate regions), which is bad for model training.
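
For a rough sense of scale, here is a back-of-the-envelope sketch using the FOV, resolution and distance range from the code below; the object extent of 10 scene units is an assumption for illustration, not a value from the issue:

import numpy as np

# camera parameters taken from the minimal example below
fov_rad = np.deg2rad(0.4)   # full field of view
resolution = 3072           # square image
object_extent = 10.0        # ASSUMED object size in scene units (not stated in the issue)

for distance in (50_000, 200_000):
    # small-angle approximation: angular size ~ extent / distance
    angular_size = object_extent / distance
    pixels = angular_size / (fov_rad / resolution)
    print(f"distance={distance}: object spans roughly {pixels:.0f} px")

Even with this assumed size, the object covers only a few dozen pixels at the far end of the distance range, so thin parts end up only a pixel or two wide in the rendered mask.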

Minimal code example

import os
import random
from glob import glob

import blenderproc as bproc
import numpy as np

# sample_sun_loc, create_tracking and write_image are project-specific helpers (not shown here).


def render_scene(scene_dir, distance="random", num_renders=100, resolution=3072):
    print(f"scene_dir: {scene_dir}")
    scene = glob(str(scene_dir) + "/*.blend")[0]

    # load the objects from the .blend file into the scene
    objs = bproc.loader.load_blend(scene)

    # define a sun light and set its location and energy level
    light = bproc.types.Light()
    light.set_type("SUN")
    light.set_location(sample_sun_loc())
    # decrease the energy level to mimic d1-images
    light.set_energy(5)

    # define the camera intrinsics (0.4 degree field of view)
    field_of_view = np.deg2rad(0.4)
    bproc.camera.set_intrinsics_from_blender_params(lens=field_of_view, lens_unit="FOV", clip_end=np.inf)
    bproc.camera.set_resolution(resolution, resolution)

    # create one camera pose per render; each pose corresponds to one image
    for _ in range(num_renders):
        if isinstance(distance, str):
            # "random" -> sample a distance once and reuse it for all poses
            distance = random.randint(50000, 200000)
        position = np.random.uniform([-distance, -distance, -distance], [distance, distance, distance])
        x, y = position[0], position[1]
        # project the sampled point onto the upper half of a sphere of radius `distance`
        z = np.sqrt(np.abs(distance**2 - x**2 - y**2))
        position = [x, y, z]
        # create_tracking returns the rotation that points the camera at the origin
        quat, rot_matrix = create_tracking(position, [0, 0, 0])
        matrix_world = bproc.math.build_transformation_mat(position, rot_matrix)
        bproc.camera.add_camera_pose(matrix_world)

    # activate segmentation output and render the whole pipeline
    bproc.renderer.enable_segmentation_output(
        map_by=["category_id", "instance", "name"], default_values={"category_id": 1}, pass_alpha_threshold=0.01
    )
    data = bproc.renderer.render()

    satellite_name = os.path.splitext(os.path.basename(scene))[0]
    output_dir = os.path.join("output", satellite_name)
    os.makedirs(output_dir, exist_ok=True)

    bproc.writer.write_hdf5(output_dir, data)
    bproc.clean_up()

    for hdf5_file in glob(output_dir + "/*.hdf5"):
        write_image(hdf5_file, "output")
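
To quantify the fragmentation, here is a minimal inspection sketch; it assumes the segmentation map is stored under the key "category_id_segmaps" in the written hdf5 files (the usual name in BlenderProc output) and that the background id is 0 — adjust both if your files differ:

import h5py
import numpy as np
from scipy import ndimage

def count_components(hdf5_path, key="category_id_segmaps"):
    # load one written segmentation map and count connected regions per category
    with h5py.File(hdf5_path, "r") as f:
        segmap = np.array(f[key])
    for category in np.unique(segmap):
        if category == 0:
            continue  # assumed background id
        _, num_components = ndimage.label(segmap == category)
        print(f"{hdf5_path}: category {category} -> {num_components} connected component(s)")

Running this over a few of the written files shows whether an object that should be a single region is being split into many small components.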

Files required to run the code

No response

Expected behavior

(screenshot attached in the original issue)

BlenderProc version

4.2
