Chat with Feishu Using Codex: A Remote Dev Assistant That Actually Gets Things Done
Intro
If you've ever wished you could chat with an AI assistant directly inside Feishu (Lark) without leaving your team's messaging platform, this project is for you.
I stumbled across Codex Remote Feishu on GitHub, and it’s a neat little bridge that connects OpenAI Codex (or any compatible LLM) with Feishu’s bot API. No fluff, no heavy infrastructure — just a clean way to run coding and reasoning tasks inside your team’s chat.
It’s not a full-blown “AI agent” framework, but it’s exactly the kind of tool that makes you go “why didn’t I think of that?” when you see it working.
What It Does
In short, Codex Remote Feishu lets you send messages to a Feishu bot, which forwards them to a remote Codex instance (OpenAI's Codex or a compatible local model) and relays the response back into the chat.
The magic is that it’s remote — the actual AI processing happens on your own machine or a server you control, not inside Feishu’s cloud. You just point the bot at your endpoint, and it works like a shared AI assistant for your whole team.
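Conceptually, the relay is a small piece of glue code: unwrap the incoming Feishu message event, hand the text to the model, and package the answer as a reply. Here's a minimal sketch of that core step, assuming a Feishu v2 message event shape and a generic `ask_model` callable standing in for the actual Codex call (names are illustrative, not the project's real API):

```python
import json


def build_reply(event: dict, ask_model) -> dict:
    """Extract the user's text from a Feishu message event and build a
    reply payload using the given model callable (hypothetical helper)."""
    # Feishu delivers message text as a JSON string inside the event
    content = json.loads(event["message"]["content"])
    prompt = content.get("text", "").strip()

    # ask_model would be your call to the remote Codex endpoint
    answer = ask_model(prompt)

    # Shape mirrors what Feishu's send-message API expects: a receive_id,
    # a msg_type, and JSON-encoded content
    return {
        "receive_id": event["message"]["chat_id"],
        "msg_type": "text",
        "content": json.dumps({"text": answer}),
    }
```

The real project handles auth, event verification, and streaming on top of this, but the round trip boils down to that one function.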
Why It’s Cool
- Private & Customizable – The AI runs on your own hardware or your own OpenAI key, so no third party sees your prompts or data. You can swap models, tweak prompts, or even add custom functions without any Feishu plugin limits.
- Real-time Streaming – Responses stream back as they're generated (as a series of incremental messages), so it feels natural in a chat. No waiting for a full reply before you see anything.
- Zero Setup for Teammates – They just @mention the bot in a group or DM. No plugins, no extra accounts, no CLI. It’s as easy as talking to a person.
- Multi-turn Context – The bot remembers previous messages in a thread, so you can ask follow-ups like “fix that bug from earlier” and it actually works.
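That last point, multi-turn context, is usually just per-thread bookkeeping: keep a rolling window of recent messages keyed by thread ID and prepend them to each new prompt. A minimal sketch (my own illustration, not the project's actual implementation):

```python
from collections import defaultdict


class ThreadMemory:
    """Keep a rolling window of messages per chat thread so
    follow-ups like "fix that bug from earlier" carry context."""

    def __init__(self, max_turns: int = 20):
        self.max_turns = max_turns
        self.history = defaultdict(list)

    def add(self, thread_id: str, role: str, text: str) -> None:
        turns = self.history[thread_id]
        turns.append({"role": role, "content": text})
        # Drop the oldest turns so the prompt stays within context limits
        del turns[:-self.max_turns]

    def context(self, thread_id: str) -> list:
        """Messages to send to the model ahead of the new prompt."""
        return list(self.history[thread_id])
```

Each Feishu thread gets its own window, so two conversations in different groups never bleed into each other.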
How to Try It
The repo is straightforward and well-documented. Here’s the quick path:
- Clone the repo:
  git clone https://github.com/kxn/codex-remote-feishu
- Install dependencies (Python 3.8+):
  pip install -r requirements.txt
- Set up a Feishu custom bot in your admin console (get a webhook URL and verification token).
- Configure environment variables: FEISHU_APP_ID, FEISHU_APP_SECRET, OPENAI_API_KEY (or your own API endpoint).
- Run the server:
  python main.py
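Put together, the whole setup might look like this (placeholder values throughout; the exact variable names your deployment expects may differ, so check the README):

```shell
git clone https://github.com/kxn/codex-remote-feishu
cd codex-remote-feishu
pip install -r requirements.txt

# Credentials from the Feishu admin console and your model provider
export FEISHU_APP_ID="cli_xxx"
export FEISHU_APP_SECRET="your-app-secret"
export OPENAI_API_KEY="sk-..."

python main.py
```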
That’s it. Once running, your Feishu bot will start responding.
For a more detailed walkthrough, check the README.md — there’s even a Dockerfile if you prefer containers.
Final Thoughts
This project solves a real pain point: bringing AI assistance into an existing team chat without forcing everyone to learn a new tool or expose sensitive data. The implementation is lean, uses standard APIs, and the code is clean enough to hack on.
If you’re already on Feishu (or Lark) and want a shared AI coding buddy that respects your data, this is worth spinning up in an afternoon. It’s not flashy, but it’s solid. And sometimes that’s exactly what you need.
Built with 💻 from @githubprojects
Repository: https://github.com/kxn/codex-remote-feishu