Replies: 2 comments 7 replies
-
I have the same problem now. Did you find a solution?
-
The reason for this error is that the model does not follow the prompt: its response does not conform to the expected format in which the community report should be organized.
-
I ran the Christmas story and thought it worked well. Then I used a different test set and realized the community report wasn't generated at all (well, there is one entry). I was curious, so I checked the demo input's report and found the error messages below. I looked at the string but don't know what's wrong with the format. The community report was not generated for id 42 (and many other communities have similar problems).
It looks like the issue is due to the maximum number of tokens the LLM can handle. In this example, the JSON returned by the model does not contain all the required fields (because of the length of the 'summary' section).
Meanwhile, final_community_report requires the fields below:
Is there anything I can do to work around the issue (other than using a larger model :) )?
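One possible workaround (a sketch, not GraphRAG's actual code): validate the model's JSON before accepting it, and treat a truncated or incomplete response as a signal to retry (e.g. with a higher max-token setting or a prompt asking for a shorter summary). The field names here are assumptions based on the error above, not the library's confirmed schema.

```python
import json

# Hypothetical set of fields final_community_report is expected to contain;
# adjust to match the actual error message from your run.
REQUIRED_FIELDS = {"title", "summary", "findings", "rating", "rating_explanation"}


def parse_report(raw: str):
    """Return the parsed report dict if complete, else None so the caller can retry."""
    try:
        report = json.loads(raw)
    except json.JSONDecodeError:
        # Response was cut off mid-JSON, e.g. because the token limit was hit.
        return None
    missing = REQUIRED_FIELDS - report.keys()
    return None if missing else report


# A truncated response fails validation instead of silently dropping fields:
truncated = '{"title": "Community 42", "summary": "a very long summ'
print(parse_report(truncated))  # None

complete = json.dumps({field: "placeholder" for field in REQUIRED_FIELDS})
print(parse_report(complete) is not None)  # True
```

The caller could loop a few times on a None result before giving up, which at least turns a silently missing report into an explicit, retriable failure.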