AgenticSeek

Manus-like AI Powered by DeepSeek R1 Agents

A fully local alternative to Manus AI – a voice-enabled AI assistant that codes, explores your filesystem, browses the web, and corrects its mistakes, all without sending a byte of data to the cloud.

Features

- 100% local: no data leaves your machine
- Voice-enabled assistant
- Autonomous web browsing
- Filesystem exploration
- Writes code and corrects its own mistakes

Installation

Make sure you have ChromeDriver and Docker installed.
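You can quickly confirm both are available from a terminal (exact version output varies by platform):

chromedriver --version
docker --version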

Run Locally on Your Machine

Step 1: Clone the Repository

git clone https://github.com/Fosowl/agenticSeek.git
cd agenticSeek
mv .env.example .env

Step 2: Create a Virtual Environment

python3 -m venv agentic_seek_env
source agentic_seek_env/bin/activate
# On Windows: agentic_seek_env\Scripts\activate

Step 3: Install Package

./install.sh
# Or manually: pip3 install -r requirements.txt

Step 4: Download Models

ollama pull deepseek-r1:7b
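If your hardware allows, you can pull a larger variant instead, which we recommend below for better performance; if you do, set provider_model accordingly in Step 5:

ollama pull deepseek-r1:14b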

Step 5: Run the Assistant

Start the Ollama server:

ollama serve

Update config.ini:

[MAIN]
is_local = True
provider_name = ollama
provider_model = deepseek-r1:7b

Start services and run:

./start_services.sh
python3 main.py

Note: We recommend DeepSeek 14B or larger for better performance.

Run the LLM on Your Own Server

Step 1: Set Up the Server

On your server, find its IP address (you will need it in Step 2):

ip a | grep "inet " | grep -v 127.0.0.1 | awk '{print $2}' | cut -d/ -f1

Clone the repository on the server and launch the LLM server script:

git clone https://github.com/Fosowl/agenticSeek.git
cd agenticSeek
python3 server_ollama.py

Step 2: Run on Your Computer

On your personal computer, clone the repository and update config.ini, replacing x.x.x.x with the server IP address from Step 1:

[MAIN]
is_local = False
provider_name = server
provider_model = deepseek-r1:14b
provider_server_address = x.x.x.x:5000
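Before launching, you can sanity-check that the server is reachable from your computer. This is only a quick probe and assumes the server script listens on port 5000 as configured above:

curl -s -o /dev/null -w "%{http_code}\n" http://x.x.x.x:5000
# Any HTTP status code (even 404) means the port is reachable; a connection error means it is not.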

Start services and run:

./start_services.sh
python3 main.py

Run with an API

To use a cloud API instead of a local model, clone the repository and update config.ini:

[MAIN]
is_local = False
provider_name = openai
provider_model = gpt-4o
provider_server_address = 127.0.0.1:5000
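Most API providers require a key. The exact way AgenticSeek reads it is not shown here, but OpenAI clients conventionally pick up the OPENAI_API_KEY environment variable, so exporting it before launching is a reasonable starting point (the variable name is an assumption):

export OPENAI_API_KEY="your-api-key-here"  # assumption: the key is read from the environment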

Start services and run:

./start_services.sh
python3 main.py

Contribute

We’re looking for developers to improve AgenticSeek! Check out open issues on GitHub.

License: GPL-3.0

Join Us