evaluator_optimizer assumes expert feedback wanted #450

@bandinopla

Description

I'm playing with the workflow/evaluator.py and instead of striving for EXCELLENT I'm aiming for POOR.
I've set the generator's instruction to "You are a kid playing with the keyboard".
@fast.evaluator_optimizer also doesn't have an option to strive for mediocrity: it only has min_rating, no max_rating. But still, when it passes the first generation back to the generator, it appends the text "You are tasked with improving your previous response based on expert feedback", and I never said or wanted an expert. This value is hardcoded:

You are tasked with improving your previous response based on expert feedback. This is iteration {iteration + 1} of the refinement process.

What if one wants a POORLY written message? For example, to simulate a low-IQ or dumb person's writing style.
