# prompt-engineering-tool

Here are 13 public repositories matching this topic...

🚀 Browser-based tool for creating reusable sets of context for LLMs. Improves response quality and speed while reducing token usage. Privacy-first; works with any LLM (Claude, GPT-4, Gemini). Stop re-explaining your codebase to AI (and your team members).

  • Updated Jun 29, 2025
  • TypeScript
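
The description above is about bundling project context once and reusing it across conversations. A minimal sketch of that idea follows, assuming a hypothetical `ContextSet` abstraction; the names and API are illustrative only and are not taken from this repository.

```python
# Minimal sketch of a "reusable context set": a named bundle of files and notes
# rendered into a single prompt preamble that can be pasted in front of any LLM query.
# ContextSet, add_file, add_note, and render are hypothetical names for illustration.
from dataclasses import dataclass, field
from pathlib import Path


@dataclass
class ContextSet:
    name: str
    snippets: list[str] = field(default_factory=list)

    def add_file(self, path: str) -> None:
        # Store the file's contents with a small header so the model knows its origin.
        text = Path(path).read_text(encoding="utf-8")
        self.snippets.append(f"### {path}\n{text}")

    def add_note(self, note: str) -> None:
        self.snippets.append(note)

    def render(self) -> str:
        # Concatenate once; the result can be reused across conversations,
        # so the codebase does not have to be re-explained every time.
        return f"# Context set: {self.name}\n\n" + "\n\n".join(self.snippets)


if __name__ == "__main__":
    ctx = ContextSet("billing-service")
    ctx.add_note("The service exposes a REST API and stores invoices in Postgres.")
    # ctx.add_file("src/models.py")  # add real files as needed
    print(ctx.render())
```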

Contextual Memory Intelligence for AI Systems: persistent memory, cognitive tools, and adaptive reasoning capabilities for LLMs. An experimental memory system (see MemMimic for an optimized version).

  • Updated Jun 13, 2025
  • Python
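
To make "persistent memory" concrete, here is a minimal sketch of a memory store that survives across sessions and is recalled by keyword overlap before a prompt is built. It is a generic illustration under those assumptions, not this repository's design; a real system would typically use embeddings rather than word overlap.

```python
# Minimal sketch of persistent memory for an LLM assistant: facts are stored as
# JSON on disk and recalled by simple keyword overlap before each prompt.
import json
from pathlib import Path


class MemoryStore:
    def __init__(self, path: str = "memories.json"):
        self.path = Path(path)
        self.memories: list[str] = (
            json.loads(self.path.read_text()) if self.path.exists() else []
        )

    def remember(self, fact: str) -> None:
        # Append the fact and persist immediately so it survives restarts.
        self.memories.append(fact)
        self.path.write_text(json.dumps(self.memories, indent=2))

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Rank stored facts by how many query words they share; keyword overlap
        # keeps the sketch dependency-free, unlike an embedding-based retriever.
        words = set(query.lower().split())
        scored = sorted(
            self.memories,
            key=lambda m: len(words & set(m.lower().split())),
            reverse=True,
        )
        return scored[:k]


if __name__ == "__main__":
    store = MemoryStore()
    store.remember("The user prefers concise answers with code examples.")
    print(store.recall("How should answers be formatted?"))
```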

CRoM (Context Rot Mitigation)-EfficientLLM is a Python toolkit designed to optimize the context provided to Large Language Models (LLMs). It provides a suite of tools to intelligently select, re-rank, and manage text chunks to fit within a model's context budget while maximizing relevance and minimizing performance drift.

  • Updated Sep 2, 2025
  • Python
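
The description above amounts to selecting and re-ranking chunks so the most relevant ones fit a context budget. Below is a minimal greedy sketch of that idea; the relevance scoring and token counting are deliberately naive stand-ins, and none of it reflects CRoM's actual algorithm.

```python
# Minimal sketch of context-budget packing: score candidate chunks for relevance
# to the query, then greedily keep the best ones that still fit the token budget.
def pack_context(query: str, chunks: list[str], budget_tokens: int) -> list[str]:
    query_words = set(query.lower().split())

    def relevance(chunk: str) -> float:
        # Fraction of the chunk's words that also appear in the query.
        words = chunk.lower().split()
        return len(query_words & set(words)) / (len(words) or 1)

    def tokens(chunk: str) -> int:
        # Crude stand-in for a real tokenizer.
        return len(chunk.split())

    selected, used = [], 0
    for chunk in sorted(chunks, key=relevance, reverse=True):
        cost = tokens(chunk)
        if used + cost <= budget_tokens:
            selected.append(chunk)
            used += cost
    return selected


if __name__ == "__main__":
    docs = [
        "Invoices are generated nightly by the billing cron job.",
        "The frontend uses React with a Redux store.",
        "Billing errors are retried three times before alerting.",
    ]
    print(pack_context("how does billing retry work", docs, budget_tokens=30))
```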
