Local LLM Inference:
  1. Ollama: LLM inference with a small memory footprint (see the API sketch after this list).
  2. LM Studio
  3. Nexa SDK
  4. Transformer Lab
  5. Clean UI
  6. bitnet.cpp - inference framework for running 1-bit LLMs on CPU
  7. ...
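A minimal sketch of calling a locally running Ollama server over its REST API (assumes the default port 11434 and an already-pulled model named "llama3"; both are assumptions, adjust as needed):

    import requests

    # Ask the local Ollama server for a single, non-streamed completion
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": "Why run LLMs locally?", "stream": False},
    )
    print(resp.json()["response"])  # the generated text
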
Most popular Vector DBs:
  1. Chroma (see the query sketch after this list)
  2. Milvus
  3. Cassandra (supports vector search)
  4. Weaviate
  5. Pgvector (extension)
  6. Oracle (vector search support announced on 13th September)
  7. ...
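A minimal sketch of storing and querying documents with Chroma's embedded in-memory client (collection name and texts are illustrative assumptions; uses chromadb's default embedding function):

    import chromadb

    client = chromadb.Client()            # in-memory client, nothing persisted
    col = client.create_collection("notes")

    # Store two small documents; Chroma embeds them with its default embedder
    col.add(
        ids=["1", "2"],
        documents=["Ollama runs LLMs locally.", "Chroma is a vector database."],
    )

    # Nearest-neighbour search over the stored embeddings
    hits = col.query(query_texts=["local inference"], n_results=1)
    print(hits["documents"])
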
RAG frameworks:
  1. LangChain (see the retrieve-then-generate sketch after this list)
  2. LlamaIndex
  3. RAGFlow
  4. Haystack
  5. ...
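A framework-free sketch of the retrieve-then-generate loop that these RAG frameworks automate, reusing the Chroma collection and Ollama endpoint from the sketches above (prompt wording and model name are assumptions, not any framework's actual API):

    import requests

    def rag_answer(col, question: str) -> str:
        # 1. Retrieve the most relevant stored documents for the question
        hits = col.query(query_texts=[question], n_results=2)
        context = "\n".join(hits["documents"][0])

        # 2. Generate an answer grounded in the retrieved context
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": "llama3", "prompt": prompt, "stream": False},
        )
        return resp.json()["response"]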