MiroFish Logo

A Simple and Universal Swarm Intelligence Engine, Predicting Anything


English | 中文文档

⚡ Project Overview

MiroFish is a next-generation AI prediction engine built on multi-agent technology. From real-world seed information (breaking news, draft policies, financial signals) it automatically constructs a high-fidelity parallel digital world, in which thousands of agents, each with its own persona, long-term memory, and behavioral logic, interact freely and evolve socially. From a god's-eye view you can inject variables on the fly and trace precisely how the future unfolds: rehearse tomorrow in a digital sandbox, and make decisions that have already survived a hundred simulated battles.

You provide: seed material (a data-analysis report, or an interesting piece of fiction) plus a natural-language description of what you want predicted.
MiroFish returns: a detailed prediction report, plus a high-fidelity digital world you can interact with in depth.

Our Vision

MiroFish aims to be a swarm-intelligence mirror of reality. By capturing the collective emergence that arises from individual interactions, it pushes past the limits of traditional forecasting:

  • At the macro level: a rehearsal lab for decision-makers, letting policies and PR campaigns fail safely at zero risk
  • At the micro level: a creative sandbox for individuals; whether you are simulating a novel's ending or chasing a wild what-if, it stays fun, playful, and within reach

From serious forecasting to playful simulation, we let every "what if" show its outcome, making it possible to predict anything.

🌐 Live Demo

Visit the live demo environment to try a prepared simulation of a trending public-opinion event: mirofish-live-demo

📸 Screenshots

Screenshot 1 Screenshot 2
Screenshot 3 Screenshot 4
Screenshot 5 Screenshot 6

🎬 Demo Videos

1. Wuhan University public-opinion simulation + a walkthrough of the MiroFish project

MiroFish Demo Video

Click the image to watch the full demo video of a prediction run on the Wuhan University Public-Opinion Report generated with BettaFish (微舆)

2. Predicting the lost ending of Dream of the Red Chamber (《红楼梦》)

MiroFish Demo Video

Click the image to see MiroFish's in-depth prediction of the lost ending, built from the several hundred thousand characters of the novel's first 80 chapters

More examples, such as financial forecasting and current-affairs simulations, are on the way...

🔄 Workflow

  1. Graph construction: extract real-world seed info & inject individual and group memories & build the GraphRAG index
  2. Environment setup: extract entity relations & generate personas & an environment-configuration Agent injects simulation parameters
  3. Simulation: run both platforms in parallel & parse the prediction query automatically & update time-ordered memories dynamically
  4. Report generation: the ReportAgent uses a rich toolset to interact deeply with the post-simulation environment
  5. Deep interaction: chat with any inhabitant of the simulated world & chat with the ReportAgent
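The five stages can be sketched as one pipeline. Everything below (`SimulationRun`, `run_pipeline`, the toy graph) is an illustrative stand-in, not MiroFish's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class SimulationRun:
    """Illustrative container for one run (hypothetical names)."""
    seed_material: str                              # news report, novel text, ...
    prediction_query: str                           # natural-language prediction need
    graph: dict = field(default_factory=dict)       # stage 1: GraphRAG-style index
    agents: list = field(default_factory=list)      # stage 2: generated personas
    timeline: list = field(default_factory=list)    # stage 3: time-ordered memory

def run_pipeline(seed: str, query: str, rounds: int = 3) -> SimulationRun:
    run = SimulationRun(seed_material=seed, prediction_query=query)
    # 1. graph construction (toy data in place of real extraction)
    run.graph = {"entities": ["A", "B"], "relations": [("A", "knows", "B")]}
    # 2. environment setup: one persona per extracted entity
    run.agents = [{"name": e, "persona": f"profile of {e}"} for e in run.graph["entities"]]
    # 3. simulation rounds append to the time-ordered memory
    for step in range(rounds):
        run.timeline.append({"step": step, "events": []})
    return run  # stages 4-5 (report, interaction) would consume this object

demo = run_pipeline("breaking news text", "What happens next week?")
```

In the real system each stage is agent-driven; the sketch only shows how the artifacts of one stage feed the next.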

🚀 Quick Start

Option 1: Deploy from Source (Recommended)

Prerequisites

| Tool | Version | Notes | Check |
| --- | --- | --- | --- |
| Node.js | 18+ | Frontend runtime, includes npm | `node -v` |
| Python | ≥3.11, ≤3.12 | Backend runtime | `python --version` |
| uv | latest | Python package manager | `uv --version` |

1. Configure environment variables

```shell
# Copy the example config file
cp .env.example .env

# Edit .env and fill in the required API keys
```

Required environment variables:

```shell
# LLM API configuration (any LLM API that speaks the OpenAI SDK format works)
# Recommended: the qwen-plus model on Alibaba Bailian: https://bailian.console.aliyun.com/
# Note: token usage is heavy; start with simulations of fewer than 40 rounds
LLM_API_KEY=your_api_key
LLM_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
LLM_MODEL_NAME=qwen-plus

# Zep Cloud configuration
# The monthly free tier is enough for light use: https://app.getzep.com/
ZEP_API_KEY=your_zep_api_key
```
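Since any OpenAI-SDK-compatible endpoint works, these are plain `KEY=VALUE` pairs. A stdlib-only sketch of loading them (the parser and the `.env.demo` file name are illustrative, not MiroFish code):

```python
import os

def load_env(path: str = ".env") -> dict:
    """Minimal .env parser: KEY=VALUE lines; blank lines and '#' comments ignored."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Write a sample file, load it, and export the values to the process environment
with open(".env.demo", "w") as f:
    f.write("# LLM config\n"
            "LLM_API_KEY=sk-test\n"
            "LLM_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1\n"
            "LLM_MODEL_NAME=qwen-plus\n")
cfg = load_env(".env.demo")
os.environ.update(cfg)

# An OpenAI-SDK client would then be constructed as (not executed here):
# client = OpenAI(api_key=cfg["LLM_API_KEY"], base_url=cfg["LLM_BASE_URL"])
```

In practice `npm run dev` loads the real `.env` for you; this only shows what the variables mean to an OpenAI-format client.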

2. Install dependencies

```shell
# One-shot install of all dependencies (root + frontend + backend)
npm run setup:all
```

Or install step by step:

```shell
# Install Node dependencies (root + frontend)
npm run setup

# Install Python dependencies (backend; a virtual environment is created automatically)
npm run setup:backend
```

3. Start the services

```shell
# Start frontend and backend together (run from the project root)
npm run dev
```

Service endpoints:

  • Frontend: http://localhost:3000
  • Backend API: http://localhost:5001

Start individually:

```shell
npm run backend   # backend only
npm run frontend  # frontend only
```

Option 2: Docker Deployment

```shell
# 1. Configure environment variables (same as the source deployment)
cp .env.example .env

# 2. Pull the images and start
docker compose up -d
```

By default, Docker reads .env from the repository root and maps ports 3000 (frontend) / 5001 (backend)

docker-compose.yml provides mirror registry URLs in comments; swap them in if you need faster pulls

📬 Get in Touch

QQ Group

 

The MiroFish team is continuously hiring for full-time and intern positions. If multi-agent applications interest you, send your resume to: mirofish@shanda.com

📄 Acknowledgements

MiroFish is strategically supported and incubated by Shanda Group!

MiroFish's simulation engine is powered by OASIS; our sincere thanks to the CAMEL-AI team for their open-source contributions!

📈 Project Stats

Star History Chart

⚡ Temporal Inversion Engine (ChaosEngine Fork)

MiroFish predicts the future by simulating it forward.
This fork adds Temporal Inversion — attacking uncertainty from the opposite direction.

```
FORWARD (MiroFish):
  Seed →→→→ uncertainty increasing →→→→ Predictions

INVERTED (ChaosEngine):
  Preconditions ←←←← certainty increasing ←←←← Outcome

TEMPORAL PINCER:
  Forward →→→→→ ⚡ TURNSTILE ⚡ ←←←←← Inverted
                  (convergence)
```

How It Works

  1. MiroFish runs normally — generates forward predictions
  2. Inversion engine takes the prediction DAG and applies Bayes' theorem to reverse every edge:
    • Forward: P(Effect | Cause) — from simulation
    • Inverted: P(Cause | Effect) — computed via Bayes
  3. Shannon entropy computed in both directions — gradient reveals the Turnstile (point of no return)
  4. Temporal Pincer: √(forward × inverted) = highest confidence prediction
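Steps 2-4 boil down to a few lines of probability arithmetic. A self-contained sketch with invented numbers (function names are ours, not from the codebase):

```python
import math

def invert_edge(p_effect_given_cause: float, p_cause: float, p_effect: float) -> float:
    """Bayes' theorem: P(Cause | Effect) = P(Effect | Cause) * P(Cause) / P(Effect)."""
    return p_effect_given_cause * p_cause / p_effect

def entropy(p: float) -> float:
    """Shannon entropy (bits) of a binary event with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def pincer(forward: float, inverted: float) -> float:
    """Temporal pincer: geometric mean of the two directions, sqrt(forward * inverted)."""
    return math.sqrt(forward * inverted)

p_fwd = 0.8                                            # P(Effect | Cause) from the forward run
p_inv = invert_edge(p_fwd, p_cause=0.5, p_effect=0.6)  # P(Cause | Effect) ≈ 0.667
confidence = pincer(p_fwd, p_inv)                      # combined confidence ≈ 0.730

# Near-certain events carry less entropy; the entropy gradient along the DAG
# is what locates the Turnstile (point of no return).
low_entropy = entropy(0.95) < entropy(0.55)
```

Applying `invert_edge` to every edge of the prediction DAG, then scanning `entropy` in both directions, is the whole of the math core; no LLM call is involved.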

What You Get (in 46ms, 0 LLM calls)

| Analysis | Question Answered |
| --- | --- |
| Necessity | "What HAD to be true for this outcome?" |
| Turnstile | "When does the outcome become inevitable?" |
| Sensitivity | "Which single factor changes the outcome most?" |
| Monte Carlo | "How stable is this prediction?" (confidence intervals) |
| do-Calculus | "If we FORCE this to change, what happens?" (intervention vs observation) |
| Markov Blanket | "What's the minimum set to monitor?" (75%+ compression) |
| Robustness | "How many things must break for the prediction to flip?" |
| Counterfactual | "What if this factor didn't exist?" |
| KL Divergence | "Do forward and inverted views agree?" |

Agent Verification (optional, requires LLM API)

After math inversion, 4 adversarial agents debate each finding:

  • Prosecutor: tries to disprove
  • Defender: provides evidence
  • Devil's Advocate: finds edge cases
  • Judge: scores survival (0-1)

Findings that survive → confidence boosted. Killed → DAG probabilities adjusted → re-inversion. Loop until convergence.
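The survive/kill loop can be sketched as follows; the scorer here is a stub standing in for the Judge's LLM-based 0-1 score, and the DAG re-inversion step is only noted in a comment:

```python
def verify(findings, score, threshold=0.5, max_rounds=5):
    """Keep findings whose judge score clears the threshold; loop until stable."""
    surviving = list(findings)
    for _ in range(max_rounds):
        kept = [f for f in surviving if score(f) >= threshold]
        if kept == surviving:   # convergence: nothing was killed this round
            return kept
        surviving = kept        # killed findings would trigger DAG re-inversion here
    return surviving

# Toy scorer: longer claims "survive" (a stand-in for the Judge's verdict)
result = verify(["solid claim with evidence", "weak"],
                score=lambda f: min(1.0, len(f) / 10))
```

In the real loop `score` would run the Prosecutor/Defender/Devil's Advocate debate and return the Judge's number; the control flow is the same.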

Quick Start

```shell
# Math core only (no API key needed)
cd backend/test_inversion
python test_true_inversion.py

# API endpoint
curl -X POST http://localhost:5001/api/inversion \
  -H "Content-Type: application/json" \
  -d '{"nodes": [...], "edges": [...]}'
```

Inspired By

Christopher Nolan's Tenet — temporal pincer movement.
"What's happened, happened. But what had to be true?"
