infer/vllm/static: Align LLM.generate with modern vLLM API
Description: Refactored input handling to replace the deprecated `prompt_token_ids` argument with the modern `prompts` parameter, passing token IDs in the `PromptDict` format. This resolves a `TypeError` on vLLM 0.6.0+, aligns calls to `LLM.generate` with its `Sequence[PromptType]` signature, and updates `RandomGenerator` to yield fully compliant prompt structures.
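
A minimal sketch of the migrated prompt format. This assumes the `PromptDict` structures mentioned above match vLLM's `TokensPrompt` TypedDict shape (`{"prompt_token_ids": [...]}`, a plain dict at runtime); `random_token_prompts` is a hypothetical stand-in for `RandomGenerator`, and the parameters (`vocab_size`, `seq_len`, `seed`) are illustrative:

```python
import random
from typing import Dict, Iterator, List


def random_token_prompts(n: int, vocab_size: int = 32000,
                         seq_len: int = 16, seed: int = 0) -> Iterator[Dict[str, List[int]]]:
    """Yield prompts as token-ID dicts compatible with vLLM 0.6.0+.

    Each yielded dict has the shape {"prompt_token_ids": [...]}, so the
    whole batch can be passed via the `prompts` parameter:

        llm.generate(prompts=list(random_token_prompts(4)), sampling_params=sp)

    rather than through the removed `prompt_token_ids=` keyword argument.
    """
    rng = random.Random(seed)
    for _ in range(n):
        yield {"prompt_token_ids": [rng.randrange(vocab_size) for _ in range(seq_len)]}
```

Yielding the dict form directly (instead of bare token-ID lists) keeps the generator's output usable as `Sequence[PromptType]` without a conversion step at the call site.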