Description
In the README, the MCP section says to use "actions/ai-inference@v1.2", but this is not a valid tag/version number.
Also, it would be great if there were an action that could batch repo context. I am building a workflow that auto-generates a README for the repo on each commit. To do this I enable the GitHub MCP server and give ai-inference@v1 a prompt (with a GitHub MCP token) telling it to look for certain files and analyze their contents in order to write the README. I hit the max token limit (4,000), but I have no control over how the repo context sent via MCP is batched, and no insight into how many tokens that MCP repo context takes up, because the action doesn't log it. Could you update the action so it can handle batching the MCP repo context? This might need to be a separate action that then sends the MCP context to the ai-inference action in batched form... not sure?
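
For context, here is a stripped-down sketch of the kind of workflow I'm describing. The MCP-related input names (`enable-github-mcp`, `github-mcp-token`), the `models: read` permission, and the secret name are based on my reading of the README and may not be exact; the prompt is just illustrative.

```yaml
name: Auto-generate README

on:
  push:
    branches: [main]

permissions:
  contents: read
  models: read   # assumed requirement for GitHub Models inference

jobs:
  generate-readme:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Ask the model to write a README using repo context pulled via the GitHub MCP server.
      - name: Generate README
        id: inference
        uses: actions/ai-inference@v1
        with:
          # Input names below are my best guess from the README and may differ.
          enable-github-mcp: true
          github-mcp-token: ${{ secrets.GITHUB_MCP_PAT }}  # hypothetical secret name
          max-tokens: 4000   # this is roughly where I hit the token limit
          prompt: |
            Look for the project's manifest files, source directories, and any
            existing docs in this repository, analyze their contents, and write
            a complete README.md for the project.
```

With a setup like this, the MCP repo context plus the prompt can blow past the 4,000-token limit, and there's no input to control how that context is chunked or any log line showing its size.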