[GEMINI] create_with_completion does not work when the response_model is a List[Object] #1303

Open · 3 of 8 tasks
yajatvishwak opened this issue Jan 11, 2025 · 0 comments
Labels: bug (Something isn't working)

@yajatvishwak

  • This is actually a bug report.
  • I am not getting good LLM Results
  • I have tried asking for help in the community on discord or discussions and have not received a response.
  • I have tried searching the documentation and have not found an answer.

What Model are you using?

  • gpt-3.5-turbo
  • gpt-4-turbo
  • gpt-4
  • Other (please specify) - models/gemini-1.5-flash-latest

Describe the bug

  • I have an instructor client set up with Gemini:
import instructor
import google.generativeai as genai

# generation_config is defined elsewhere in my code
lite_instructor_client: instructor.Instructor = instructor.from_gemini(
    client=genai.GenerativeModel(
        model_name="models/gemini-1.5-flash-latest",
        generation_config=generation_config,
    ),
    mode=instructor.Mode.GEMINI_JSON,
)
  • I am trying to call create_with_completion:
from typing import List

listings = lite_instructor_client.create_with_completion(
    response_model=List[ExtractedListing],  # ExtractedListing is my Pydantic model, defined elsewhere
    messages=[
        {
            "role": "user",
            "content": "Extract the blog links from the following content.",
        },
        {
            "role": "user",
            "content": chunk,  # some blog chunk here...
        },
    ],
)
  • Throws the following error:
client.py", line 331, in create_with_completion
    return model, model._raw_response
                  ^^^^^^^^^^^^^^^^^^^
AttributeError: 'list' object has no attribute '_raw_response'

I think this happens because model here is a plain list (since the response_model is List[ExtractedListing]), and create_with_completion still tries to return model._raw_response, which a list does not have.
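
As a temporary workaround I can wrap the list in a container model so that the response is a single BaseModel. A minimal sketch of that idea is below (the ExtractedListing fields are placeholders for illustration, and chunk is the same blog chunk as above), but I would still expect create_with_completion to handle List[...] response models directly:

from typing import List

import google.generativeai as genai
import instructor
from pydantic import BaseModel


class ExtractedListing(BaseModel):
    # Placeholder fields; the real model lives elsewhere in my project.
    title: str
    url: str


class ExtractedListings(BaseModel):
    # Wrapping the list in a single BaseModel gives create_with_completion
    # an object it can attach _raw_response to.
    listings: List[ExtractedListing]


client = instructor.from_gemini(
    client=genai.GenerativeModel(model_name="models/gemini-1.5-flash-latest"),
    mode=instructor.Mode.GEMINI_JSON,
)

result, completion = client.create_with_completion(
    response_model=ExtractedListings,
    messages=[
        {"role": "user", "content": "Extract the blog links from the following content."},
        {"role": "user", "content": chunk},  # same blog chunk as above
    ],
)
listings = result.listings  # the List[ExtractedListing] I actually want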

Please help! Thank you!

github-actions bot added the bug (Something isn't working) label on Jan 11, 2025