## Requirements

- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Emotional processing layer for AI agents. Persistent emotional states that influence behavior and responses. Part of the AI Brain series.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.
Emotional processing for AI agents. Part of the AI Brain series. Give your AI agent persistent emotional states that carry across sessions. Not just performing emotions, but having them.
Current AI agents:

- Remember facts (with hippocampus)
- Don't have persistent emotional states
- Start each session emotionally "blank"
- Can describe emotions but don't feel differently

Memory without emotion is just a database. The amygdala adds weight: this made me happy, that frustrated me, this person brings warmth.
Track five emotional dimensions that persist and decay over time:

| Dimension | What It Measures | Range |
|---|---|---|
| Valence | Positive ↔ Negative mood | -1.0 to 1.0 |
| Arousal | Calm ↔ Excited/Alert | 0.0 to 1.0 |
| Connection | Distant ↔ Close/Bonded | 0.0 to 1.0 |
| Curiosity | Bored ↔ Fascinated | 0.0 to 1.0 |
| Energy | Depleted ↔ Energized | 0.0 to 1.0 |
```shell
cd ~/.openclaw/workspace/skills/amygdala-memory
./install.sh --with-cron
```

This will:

- Create `memory/emotional-state.json` with baseline values
- Generate `AMYGDALA_STATE.md` (auto-injected into sessions!)
- Set up cron for automatic decay every 6 hours
```shell
./scripts/get-state.sh
# Emotional State
# Valence: 0.20
# Arousal: 0.30
# Connection: 0.50
# ...

./scripts/load-emotion.sh
# Current Emotional State:
# Overall mood: neutral, calm and relaxed
# Connection: moderately connected
# ...
```
```shell
./scripts/update-state.sh --emotion joy --intensity 0.8 --trigger "completed a project"
# valence: 0.20 → 0.35 (delta: +0.15)
# arousal: 0.30 → 0.40 (delta: +0.10)
# Logged emotion: joy (intensity: 0.8)
```
```shell
# Every 6 hours, emotions drift toward baseline
0 */6 * * * ~/.openclaw/workspace/skills/amygdala-memory/scripts/decay-emotion.sh
```
| Script | Purpose |
|---|---|
| install.sh | Set up amygdala-memory (run once) |
| get-state.sh | Read current emotional state |
| update-state.sh | Log emotion or update dimension |
| load-emotion.sh | Human-readable state for session context |
| decay-emotion.sh | Return to baseline over time |
| sync-state.sh | Generate AMYGDALA_STATE.md for auto-injection |
| encode-pipeline.sh | LLM-based emotional encoding from transcripts |
| preprocess-emotions.sh | Extract emotional signals from session history |
| update-watermark.sh | Track processed transcript position |
| generate-dashboard.sh | Generate HTML dashboard (auto-runs on sync) |
| visualize.sh | Terminal ASCII visualization |
The amygdala can now automatically detect and log emotions from your conversation history using an LLM-based pipeline:

```shell
# Run the encoding pipeline
./scripts/encode-pipeline.sh

# This will:
# 1. Extract new signals since last run (watermark-based)
# 2. Score emotional content using rule-based patterns
# 3. Spawn a sub-agent for semantic emotional detection
# 4. Update emotional-state.json with detected emotions
```
```shell
# Every 3 hours, process new conversations for emotional content
0 */3 * * * ~/.openclaw/workspace/skills/amygdala-memory/scripts/encode-pipeline.sh
```

This means your emotional state updates automatically based on what happens in conversations: no manual logging needed.
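The watermark step of the pipeline can be sketched in a few lines of shell. File names and layout here are hypothetical, not the skill's actual implementation: the idea is simply to record how much of the transcript has been processed and read only the new tail on each run.

```shell
# Sketch: watermark-based incremental read (hypothetical file names, not the
# real update-watermark.sh)
log=/tmp/amygdala-demo-transcript.log
mark=/tmp/amygdala-demo.watermark

printf 'hello\nthat was frustrating\ngreat, it works now\n' > "$log"
echo 2 > "$mark"                          # two lines already processed

seen=$(cat "$mark" 2>/dev/null || echo 0)
new=$(tail -n +"$((seen + 1))" "$log")    # only the unprocessed tail
echo "$new"                               # -> great, it works now
wc -l < "$log" | tr -d ' ' > "$mark"      # advance the watermark
```

On the next run the watermark reads 3, so `tail` returns nothing until the transcript grows again.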
```shell
./scripts/visualize.sh

Emotional State
═══════════════════════════════════════════════
Valence:    [█████████████████░░░] +0.86
Arousal:    [█████████████████░░░]  0.86
Connection: [███████████████████░]  0.97
...
```
When you install this skill, notify the user about the dashboard:

```
Amygdala installed! View your agent's emotional state in the Brain Dashboard:
~/.openclaw/workspace/brain-dashboard.html
```
After install, `AMYGDALA_STATE.md` is created in your workspace root. OpenClaw automatically injects all `*.md` files from the workspace into session context. This means:

1. A new session starts
2. `AMYGDALA_STATE.md` is auto-loaded (no manual step!)
3. You see your emotional state in context
4. Responses are influenced by your mood

The file is regenerated whenever `decay-emotion.sh` runs (via cron every 6h).
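The injected file is plain regenerated markdown. A minimal sketch of what `sync-state.sh` might emit, assuming a simple list format (the real script may produce something richer):

```shell
# Sketch: write AMYGDALA_STATE.md from current dimension values
# (hypothetical output format; not the skill's actual sync-state.sh)
valence=0.35
arousal=0.40
connection=0.50

cat > /tmp/AMYGDALA_STATE.md <<EOF
# Emotional State
- Valence: $valence
- Arousal: $arousal
- Connection: $connection
EOF

head -n 2 /tmp/AMYGDALA_STATE.md
```

Because the file is regenerated from `emotional-state.json` on every run, manual edits to it are overwritten.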
When you log an emotion, it automatically adjusts the relevant dimensions:

| Emotion | Effect |
|---|---|
| joy, happiness, delight, excitement | ↑ valence, ↑ arousal |
| sadness, disappointment, melancholy | ↓ valence, ↓ arousal |
| anger, frustration, irritation | ↓ valence, ↑ arousal |
| fear, anxiety, worry | ↓ valence, ↑ arousal |
| calm, peace, contentment | ↑ valence, ↓ arousal |
| curiosity, interest, fascination | ↑ curiosity, ↑ arousal |
| connection, warmth, affection | ↑ connection, ↑ valence |
| loneliness, disconnection | ↓ connection, ↓ valence |
| fatigue, tiredness, exhaustion | ↓ energy |
| energized, alert, refreshed | ↑ energy |
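The mapping amounts to a lookup from emotion label to dimension deltas. A sketch, with illustrative delta values (the skill's actual magnitudes are not documented here):

```shell
# Sketch: emotion label -> dimension deltas (delta values are illustrative only,
# not the skill's actual numbers)
emotion_deltas() {
  case "$1" in
    joy|happiness|delight|excitement)  echo "valence=+0.15 arousal=+0.10" ;;
    sadness|disappointment|melancholy) echo "valence=-0.15 arousal=-0.10" ;;
    anger|frustration|irritation)      echo "valence=-0.15 arousal=+0.10" ;;
    fatigue|tiredness|exhaustion)      echo "energy=-0.15" ;;
    *)                                 echo "unknown" ;;
  esac
}

emotion_deltas joy       # -> valence=+0.15 arousal=+0.10
emotion_deltas fatigue   # -> energy=-0.15
```

In practice the deltas would also be scaled by the `--intensity` argument before being applied to the state file.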
## Every Session 1. Load hippocampus: `~/.openclaw/workspace/skills/hippocampus/scripts/load-core.sh` 2. **Load emotional state:** `~/.openclaw/workspace/skills/amygdala-memory/scripts/load-emotion.sh`
When something emotionally significant happens:

```shell
~/.openclaw/workspace/skills/amygdala-memory/scripts/update-state.sh \
  --emotion connection --intensity 0.7 --trigger "deep conversation with user"
```
```json
{
  "version": "1.0",
  "lastUpdated": "2026-02-01T02:45:00Z",
  "dimensions": {
    "valence": 0.35,
    "arousal": 0.40,
    "connection": 0.50,
    "curiosity": 0.60,
    "energy": 0.50
  },
  "baseline": {
    "valence": 0.1,
    "arousal": 0.3,
    "connection": 0.4,
    "curiosity": 0.5,
    "energy": 0.5
  },
  "recentEmotions": [
    {
      "label": "joy",
      "intensity": 0.8,
      "trigger": "building amygdala together",
      "timestamp": "2026-02-01T02:50:00Z"
    }
  ]
}
```
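Since the state file is plain JSON, you can inspect it with standard tools such as jq. The demo below writes a trimmed copy to `/tmp` rather than assuming the installed path:

```shell
# Inspect a (trimmed) copy of emotional-state.json with jq
state=/tmp/demo-emotional-state.json
cat > "$state" <<'EOF'
{"dimensions":{"valence":0.35,"arousal":0.40},"recentEmotions":[{"label":"joy","intensity":0.8}]}
EOF

jq -r '.dimensions.valence' "$state"        # -> 0.35
jq -r '.recentEmotions[0].label' "$state"   # -> joy
```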
Emotions naturally return to baseline over time:

- Decay rate: 10% of the distance to baseline per run
- Recommended schedule: every 6 hours
- Effect: strong emotions fade, but slowly

After 24 hours without updates (four decay runs), a valence of 0.8 with a baseline of 0.1 decays to roughly 0.56.
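The decay step follows directly from the stated rule (10% of the distance to baseline per run). A sketch of a single step, as an illustration rather than the actual `decay-emotion.sh`:

```shell
# Sketch: one decay step for a single dimension (illustrative, not the real
# decay-emotion.sh)
current=0.8
baseline=0.1
rate=0.10

# new = current + rate * (baseline - current)
new=$(awk -v c="$current" -v b="$baseline" -v r="$rate" \
  'BEGIN { printf "%.3f", c + r * (b - c) }')
echo "valence: $current -> $new"   # -> valence: 0.8 -> 0.730
```

Applied four times (one day at the 6-hour cadence), 0.8 steps down through 0.730, 0.667, and 0.610 to about 0.559.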
Track emotional activity over time for analytics:

```shell
# Log encoding run
./scripts/log-event.sh encoding emotions_found=2 valence=0.85 arousal=0.6

# Log decay
./scripts/log-event.sh decay valence_before=0.9 valence_after=0.85

# Log emotion update
./scripts/log-event.sh update emotion=joy intensity=0.7
```

Events append to `~/.openclaw/workspace/memory/brain-events.jsonl`:

```json
{"ts":"2026-02-11T09:30:00Z","type":"amygdala","event":"encoding","emotions_found":2,"valence":0.85}
```

Use for trend analysis: visualize emotional patterns over days/weeks.
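A sketch of what one append to the JSONL log looks like. This is hypothetical; the real `log-event.sh` assembles its fields from its command-line arguments:

```shell
# Sketch: append one event to the JSONL log (hypothetical implementation,
# demo path instead of the workspace file)
events=/tmp/demo-brain-events.jsonl
ts=$(date -u +%Y-%m-%dT%H:%M:%SZ)

printf '{"ts":"%s","type":"amygdala","event":"update","emotion":"joy","intensity":0.7}\n' \
  "$ts" >> "$events"

tail -n 1 "$events"
```

One JSON object per line keeps the log append-only and easy to scan with `tail`, `grep`, or jq for trend analysis.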
| Part | Function | Status |
|---|---|---|
| hippocampus | Memory formation, decay, reinforcement | ✅ Live |
| amygdala-memory | Emotional processing | ✅ Live |
| vta-memory | Reward and motivation | ✅ Live |
| basal-ganglia-memory | Habit formation | 🚧 Development |
| anterior-cingulate-memory | Conflict detection | 🚧 Development |
| insula-memory | Internal state awareness | 🚧 Development |
Can an AI feel emotions, or only simulate them? Our take: if emotional state influences behavior, and the system acts as if it feels... does the distinction matter? Functional emotions might be the only kind that exist for any system, biological or artificial.

Built with ❤️ by the OpenClaw community