
WhyHow.ai


WhyHow


Knowledge Table

Knowledge Table is an open-source package designed to simplify extracting and exploring structured data from unstructured documents. It enables the creation of structured knowledge representations, such as tables and graphs, using a natural language query interface.

With customizable extraction rules, fine-tuned formatting options, and data traceability through provenance displayed in the UI, Knowledge Table is adaptable to various use cases.
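As a purely illustrative sketch (the document names, questions, and answers below are invented, not real output from the tool), an extraction result is a table in which each row is a source document, each column is a natural-language question, and each cell holds the extracted answer with provenance back to the source text:

Document             What is the company name?    What year was it founded?
acme_10k.pdf         Acme Corp                     1998
globex_report.pdf    Globex Inc                    2004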

  1. Create an Unstructured.io account.


  2. Clone the Knowledge Table repository.

gh repo clone whyhow-ai/knowledge-table
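If the GitHub CLI (gh) is not installed, cloning with plain git works just as well:

git clone https://github.com/whyhow-ai/knowledge-table.git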

  3. Edit the .env.sample file.

cd ~/knowledge-table/backend
nano .env.sample
# -------------------------
# LLM CONFIG
# -------------------------
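# 768 matches the embedding dimension of nomic-embed-text (see OLLAMA_EMBEDDING_MODEL below)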
DIMENSIONS=768
EMBEDDING_PROVIDER=ollama
LLM_PROVIDER=ollama
# OPENAI_API_KEY={your-openai-key}  # Commented out as not needed
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
OLLAMA_MODEL=llama3  # or whatever Ollama model you want to use for generation

# -------------------------
# VECTOR DATABASE CONFIG
# -------------------------
VECTOR_DB_PROVIDER=qdrant
INDEX_NAME=knowledge_table

# -------------------------
# Milvus Config
# Applicable if VECTOR_DB_PROVIDER=milvus-lite
# -------------------------
# MILVUS_DB_URI=./milvus_demo.db
# MILVUS_DB_TOKEN={your-milvus-token}

# -------------------------
# Qdrant Config
# Applicable if VECTOR_DB_PROVIDER=qdrant
# -------------------------
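# Assumes a Qdrant instance reachable locally on its default HTTP port (6333)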
QDRANT_LOCATION="http://localhost:6333"
# QDRANT_PORT=
# QDRANT_GRPC_PORT=
# QDRANT_PREFER_GRPC=
# QDRANT_URL=
# QDRANT_HTTPS=
# QDRANT_API_KEY=
# QDRANT_PREFIX=
# QDRANT_TIMEOUT=
# QDRANT_HOST=
# QDRANT_PATH=

# -------------------------
# QUERY CONFIG
# -------------------------
QUERY_TYPE=hybrid

# -------------------------
# DOCUMENT PROCESSING CONFIG
# -------------------------
LOADER=pypdf
CHUNK_SIZE=512
CHUNK_OVERLAP=64

# -------------------------
# UNSTRUCTURED CONFIG
# -------------------------
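# Paste the API key from the Unstructured.io account created in step 1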
UNSTRUCTURED_API_KEY=[API_KEY]

  4. Save the file as .env (in nano: Ctrl+O, change the filename to .env, press Enter, then exit with Ctrl+X).
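Before starting the stack, it is worth confirming that the models referenced in the config are available to Ollama (this assumes Ollama is already installed and listening on its default port):

ollama pull nomic-embed-text
ollama pull llama3
curl http://localhost:11434/api/tags   # lists the models Ollama has locally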

  5. Change back to the repository root and start the stack with Docker Compose.

cd ~/knowledge-table
docker-compose up -d --build
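A quick sanity check after the build finishes (the exact service names and published ports come from the repository's docker-compose.yml, so treat them as assumptions rather than guarantees):

docker-compose ps        # all services should report a running state
docker-compose logs -f   # follow the logs if anything fails to start

Once the containers are up, the Knowledge Table UI is served on the frontend port shown by docker-compose ps; open it in a browser, upload a document, and add a question column to confirm the Ollama and Qdrant settings are being picked up.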


Unstructured | The Unstructured Data ETL for Your LLM: https://unstructured.io
Knowledge Table repository: https://github.com/whyhow-ai/knowledge-table