
MongoDB WiredTiger Corruption: A Cautionary Tale of Adding MCP Servers to LibreChat
How a simple MCP config change crashed my entire LibreChat setup - twice
Sometimes the most frustrating bugs are the ones that punch you in the face twice before you learn to duck. I ran into a somewhat mysterious issue while trying to integrate the Pal MCP server (formerly known as Zen MCP server) into my local LibreChat install on my MacBook. What should have been a straightforward configuration change ended up corrupting MongoDB and taking down my entire local LLM infrastructure. Twice.
In this blog post, I document how Pal MCP + LibreChat caused MongoDB to catastrophically fail - not once, but twice - and how I worked around it.
Context
I use LibreChat as a core part of my LLM workflows. If you're not familiar with it, LibreChat is a self-hosted chat interface that lets you work with multiple LLM providers via its integration with OpenRouter. I run it via Docker Compose on my Mac, which spins up several containers, including MongoDB for data persistence.
I also rely heavily on the Pal MCP server for "council of LLMs" sessions - essentially getting multiple models to weigh in on problems, debate solutions, and reach consensus. It's become indispensable for complex reasoning tasks, especially for controversial topics, ranging from:
- the petty ("is a hot dog a sandwich?" or "is chocolate actually healthy?")
- to the genuinely contentious: gun control policy, climate policy trade-offs, the ethics of capital punishment, abortion rights, cryptocurrency regulation, immigration reform, or when (if ever) military intervention and war are morally defensible.
Having multiple models debate these issues surfaces blind spots and steelmans arguments I'd otherwise miss. When models disagree, I learn more than when they don't.
The Setup That Should Have Worked
I was trying to integrate the Pal MCP server into my LibreChat installation. The configuration seemed straightforward enough - just add the appropriate settings to librechat.yaml and restart the containers.
What could go wrong? Well, everything.
I added the Pal MCP server configuration to my librechat.yaml:
```yaml
mcpServers:
  pal:
    command: sh
    args:
      - -c
      - "for p in $(which uvx 2>/dev/null) $HOME/.local/bin/uvx /opt/homebrew/bin/uvx /usr/local/bin/uvx uvx; do [ -x \"$p\" ] && exec \"$p\" --from git+https://github.com/BeehiveInnovations/pal-mcp-server.git pal-mcp-server; done; echo 'uvx not found' >&2; exit 1"
    env:
      PATH: "/usr/local/bin:/usr/bin:/bin:/opt/homebrew/bin:~/.local/bin"
      OPENROUTER_API_KEY: "<INSERT_KEY_HERE>"
```

After updating the config and restarting the containers, MongoDB's Docker container refused to boot. The logs revealed something ugly:

```
{"t":{"$date":"2025-12-23T19:01:11.620+00:00"},"s":"E", "c":"WT", "id":22435, "ctx":"initandlisten","msg":"WiredTiger error message","attr":{"error":-31802,"message":{"ts_sec":1766516471,"ts_usec":620540,"thread":"1:0x7fb37d73e400","session_dhandle_name":"file:WiredTiger.wt","session_name":"connection","category":"WT_VERB_DEFAULT","log_id":1000000,"category_id":12,"verbose_level":"ERROR","verbose_level_id":-3,"msg":"__wti_block_read_off:308:WiredTiger.wt: fatal read error","error_str":"WT_ERROR: non-specific WiredTiger error","error_code":-31802}}}
{"t":{"$date":"2025-12-23T19:01:11.620+00:00"},"s":"E", "c":"WT", "id":22435, "ctx":"initandlisten","msg":"WiredTiger error message","attr":{"error":-31804,"message":{"ts_sec":1766516471,"ts_usec":620606,"thread":"1:0x7fb37d73e400","session_dhandle_name":"file:WiredTiger.wt","session_name":"connection","category":"WT_VERB_DEFAULT","log_id":1000000,"category_id":12,"verbose_level":"ERROR","verbose_level_id":-3,"msg":"__wti_block_read_off:308:the process must exit and restart","error_str":"WT_PANIC: WiredTiger library panic","error_code":-31804}}}
{"t":{"$date":"2025-12-23T19:01:11.620+00:00"},"s":"F", "c":"ASSERT", "id":23089, "ctx":"initandlisten","msg":"Fatal assertion","attr":{"msgid":50853,"location":"src/mongo/db/storage/wiredtiger/wiredtiger_util.cpp:644:9:int mongo::{anonymous}::mdb_handle_error_with_startup_suppression(WT_EVENT_HANDLER*, WT_SESSION*, int, const char*)"}}
{"t":{"$date":"2025-12-23T19:01:11.620+00:00"},"s":"F", "c":"ASSERT", "id":23090, "ctx":"initandlisten","msg":"\n\n***aborting after fassert() failure\n\n"}
{"t":{"$date":"2025-12-23T19:01:11.620+00:00"},"s":"F", "c":"CONTROL", "id":6384300, "ctx":"initandlisten","msg":"Writing fatal message","attr":{"message":"Got signal: 6 (Aborted).\n"}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31380, "ctx":"initandlisten","msg":"BACKTRACE","attr":{"bt":{"backtrace":[{"a":"563CE7A76D47","b":"563CDDB8E000","o":"9EE8D47","s":"_ZN5mongo15printStackTraceEv","C":"mongo::printStackTrace()","s+":"37"},{"a":"563CE7A55C45","b":"563CDDB8E000","o":"9EC7C45","s":"_ZN5mongo12_GLOBAL__N_115printErrorBlockEv","C":"mongo::(anonymous namespace)::printErrorBlock()","s+":"225"},{"a":"563CE7A55D55","b":"563CDDB8E000","o":"9EC7D55","s":"abruptQuit","s+":"85"},{"a":"7FB37E203330","b":"7FB37E1BE000","o":"45330"},{"a":"7FB37E25CB2C","b":"7FB37E1BE000","o":"9EB2C","s":"pthread_kill","s+":"11C"},{"a":"7FB37E20327E","b":"7FB37E1BE000","o":"4527E","s":"gsignal","s+":"1E"},{"a":"7FB37E1E68FF","b":"7FB37E1BE000","o":"288FF","s":"abort","s+":"DF"},{"a":"563CE7A43F70","b":"563CDDB8E000","o":"9EB5F70","s":"_ZN5mongo12_GLOBAL__N_19callAbortEv","C":"mongo::(anonymous namespace)::callAbort()","s+":"3C"},{"a":"563CE7A46631","b":"563CDDB8E000","o":"9EB8631","s":"_ZN5mongo14fassert_detail6failedENS0_5MsgIdENS_24WrappedStdSourceLocationE","C":"mongo::fassert_detail::failed(mongo::fassert_detail::MsgId, mongo::WrappedStdSourceLocation)","s+":"F9"},{"a":"563CE28D4FA5","b":"563CDDB8E000","o":"4D46FA5","s":"_ZN5mongo12_GLOBAL__N_141mdb_handle_error_with_startup_suppressionEP18__wt_event_handlerP12__wt_sessioniPKc.cold","C":"mongo::(anonymous namespace)::mdb_handle_error_with_startup_suppression(__wt_event_handler*, __wt_session*, int, char const*) [clone .cold]","s+":"11"},{"a":"563CE2A81E00","b":"563CDDB8E000","o":"4EF3E00","s":"__eventv","s+":"1310"},{"a":"563CE2A82284","b":"563CDDB8E000","o":"4EF4284","s":"__wt_panic_func","s+":"156"},{"a":"563CE28E86BD","b":"563CDDB8E000","o":"4D5A6BD","s":"__wti_block_read_off.cold","s+":"134"},{"a":"563CE2069C96","b":"563CDDB8E000","o":"44DBC96","s":"__wti_block_extlist_read","s+":"96"},{"a":"563CE28DF06F","b":"563CDDB8E000","o":"4D5106F","s":"__wti_block_checkpoint_extlist_dump","s+":"22F"},{"a":"563CE1E7410E","b":"563CDDB8E000","o":"42E610E","s":"__wti_block_read_off","s+":"3AE"},{"a":"563CE2069C96","b":"563CDDB8E000","o":"44DBC96","s":"__wti_block_extlist_read","s+":"96"},{"a":"563CE28E5ACB","b":"563CDDB8E000","o":"4D57ACB","s":"__wti_block_extlist_read_avail","s+":"2B"},{"a":"563CE28DE78C","b":"563CDDB8E000","o":"4D5078C","s":"__wt_block_checkpoint_load","s+":"2AC"},{"a":"563CE28F45D9","b":"563CDDB8E000","o":"4D665D9","s":"__bm_checkpoint_load","s+":"39"},{"a":"563CE291279B","b":"563CDDB8E000","o":"4D8479B","s":"__wt_btree_open","s+":"E6B"},{"a":"563CE297D609","b":"563CDDB8E000","o":"4DEF609","s":"__wt_conn_dhandle_open","s+":"379"},{"a":"563CE1FE2298","b":"563CDDB8E000","o":"4454298","s":"__wt_session_get_dhandle","s+":"C8"},{"a":"563CE1FE267A","b":"563CDDB8E000","o":"445467A","s":"__wt_session_get_dhandle","s+":"4AA"},{"a":"563CE1FE2157","b":"563CDDB8E000","o":"4454157","s":"__wt_session_get_btree_ckpt","s+":"7C7"},{"a":"563CE1E10C57","b":"563CDDB8E000","o":"4282C57","s":"__wt_curfile_open","s+":"177"},{"a":"563CE2A6CB59","b":"563CDDB8E000","o":"4EDEB59","s":"__session_open_cursor_int","s+":"599"},{"a":"563CE1E3D536","b":"563CDDB8E000","o":"42AF536","s":"__wt_open_cursor","s+":"56"},{"a":"563CE1F943D0","b":"563CDDB8E000","o":"44063D0","s":"__wt_metadata_cursor_open","s+":"70"},{"a":"563CE1F65869","b":"563CDDB8E000","o":"43D7869","s":"__wt_metadata_cursor","s+":"49"},{"a":"563CE29773A8","b":"563CDDB8E000","o":"4DE93A8","s":"wiredtiger_open","s+":"1EE8"},{"a":"563CE289FFC9","b":"563CDDB8E000","o":"4D11FC9"
,"s":"_ZN5mongo18WiredTigerKVEngine15_openWiredTigerERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_","C":"mongo::WiredTigerKVEngine::_openWiredTiger(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)","s+":"79"},{"a":"563CE28ACC12","b":"563CDDB8E000","o":"4D1EC12","s":"_ZN5mongo18WiredTigerKVEngineC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES8_PNS_11ClockSourceENS_22WiredTigerKVEngineBase16WiredTigerConfigERKNS_20WiredTigerExtensionsEbbbb","C":"mongo::WiredTigerKVEngine::WiredTigerKVEngine(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, mongo::ClockSource*, mongo::WiredTigerKVEngineBase::WiredTigerConfig, mongo::WiredTigerExtensions const&, bool, bool, bool, bool)","s+":"812"},{"a":"563CE285B81A","b":"563CDDB8E000","o":"4CCD81A","s":"_ZNK5mongo12_GLOBAL__N_117WiredTigerFactory6createEPNS_16OperationContextERKNS_19StorageGlobalParamsEPKNS_21StorageEngineLockFileEbbb","C":"mongo::(anonymous namespace)::WiredTigerFactory::create(mongo::OperationContext*, mongo::StorageGlobalParams const&, mongo::StorageEngineLockFile const*, bool, bool, bool) const","s+":"2DA"},{"a":"563CE3D9960C","b":"563CDDB8E000","o":"620B60C","s":"_ZN5mongo23initializeStorageEngineEPNS_16OperationContextENS_22StorageEngineInitFlagsEbbbPNS_14BSONObjBuilderE","C":"mongo::initializeStorageEngine(mongo::OperationContext*, mongo::StorageEngineInitFlags, bool, bool, bool, mongo::BSONObjBuilder*)","s+":"44C"},{"a":"563CE3D42477","b":"563CDDB8E000","o":"61B4477","s":"_ZN5mongo7catalog40startUpStorageEngineAndCollectionCatalogEPNS_14ServiceContextEPNS_6ClientENS_22StorageEngineInitFlagsEPNS_14BSONObjBuilderE","C":"mongo::catalog::startUpStorageEngineAndCollectionCatalog(mongo::ServiceContext*, mongo::Client*, mongo::StorageEngineInitFlags, mongo::BSONObjBuilder*)","s+":"E7"},{"a":"563CE20B7B96","b":"563CDDB8E000","o":"4529B96","s":"_ZN5mongo12_GLOBAL__N_114_initAndListenEPNS_14ServiceContextE","C":"mongo::(anonymous namespace)::_initAndListen(mongo::ServiceContext*)","s+":"6D6"},{"a":"563CE20BA092","b":"563CDDB8E000","o":"452C092","s":"_ZN5mongo12_GLOBAL__N_113initAndListenEPNS_14ServiceContextE","C":"mongo::(anonymous namespace)::initAndListen(mongo::ServiceContext*)","s+":"22"},{"a":"563CE20BF78E","b":"563CDDB8E000","o":"453178E","s":"_ZN5mongo11mongod_mainEiPPc","C":"mongo::mongod_main(int, char**)","s+":"E1E"},{"a":"563CE20A80E7","b":"563CDDB8E000","o":"451A0E7","s":"main","s+":"9"},{"a":"7FB37E1E81CA","b":"7FB37E1BE000","o":"2A1CA"},{"a":"7FB37E1E828B","b":"7FB37E1BE000","o":"2A28B","s":"__libc_start_main","s+":"8B"},{"a":"563CE20A7FC5","b":"563CDDB8E000","o":"4519FC5","s":"_start","s+":"25"}],"processInfo":{"mongodbVersion":"8.2.2","gitVersion":"594f839ceec1f4385be9a690131412d67b249a0a","compiledModules":[],"uname":{"sysname":"Linux","release":"6.12.54-linuxkit","version":"#1 SMP PREEMPT_DYNAMIC Tue Nov 4 21:39:03 UTC 2025","machine":"x86_64"},"somap":[{"b":"563CDDB8E000","path":"/usr/bin/mongod","elfType":3,"buildId":"9EE04132F3F41053B388D9BA9B3762DCB713C1A4"},{"b":"7FB37E1BE000","path":"/lib/x86_64-linux-gnu/libc.so.6","elfType":3,"buildId":"274EEC488D230825A136FA9C4D85370FED7A0A5E"}]}}},"tags":[]}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"563CE7A76D47","b":"563CDDB8E000","o":"9EE8D47","s":"_ZN5mongo15printStackTraceEv","C":"mongo::printStackTrace()","s+":"37"}}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"563CE7A55C45","b":"563CDDB8E000","o":"9EC7C45","s":"_ZN5mongo12_GLOBAL__N_115printErrorBlockEv","C":"mongo::(anonymous namespace)::printErrorBlock()","s+":"225"}}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"563CE7A55D55","b":"563CDDB8E000","o":"9EC7D55","s":"abruptQuit","s+":"85"}}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"7FB37E203330","b":"7FB37E1BE000","o":"45330"}}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"7FB37E25CB2C","b":"7FB37E1BE000","o":"9EB2C","s":"pthread_kill","s+":"11C"}}}
{"t":{"$date":"2025-12-23T19:01:11.912+00:00"},"s":"I", "c":"CONTROL", "id":31445, "ctx":"initandlisten","msg":"Frame","attr":{"frame":{"a":"7FB37E20327E","b":"7FB37E1BE000","o":"4527E","s":"gsignal","s+":"1E"}}}The Diagnosis 🩺
After feeding the logs to Claude, the verdict was clear: WiredTiger storage engine corruption, bad enough that mongod could not even start. (One correction to Claude's initial read: it called this a segmentation fault with exit code 139, but the logs actually show signal 6 - SIGABRT. mongod deliberately aborts via fassert() when WiredTiger panics, which surfaces as container exit code 134, not 139.)

The key errors in the logs were:

- `WT_PANIC: WiredTiger library panic`
- `fatal read error` on `WiredTiger.wt`
- `the process must exit and restart`

WiredTiger is MongoDB's default storage engine, and when it panics like this, your data is likely compromised. The corruption was severe enough that the MongoDB container was stuck in a boot loop.
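You can confirm the crash loop yourself rather than take the diagnosis on faith. A quick check (the container name `chat-mongodb` is an assumption - `docker ps -a` will show yours):

```bash
# Exit code 134 = 128 + 6 (SIGABRT), matching "Got signal: 6 (Aborted)" in the logs
docker inspect chat-mongodb --format '{{.State.Status}} {{.State.ExitCode}}'
docker logs chat-mongodb --tail 20   # the WT_PANIC lines repeat on every restart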
The Fix (Nuclear Option 💣)
Warning: This solution causes data loss. Back up everything first if you have any hope of recovery.
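If you do want that chance, the usual trick is a throwaway container that tars up the data volume before you destroy it. A minimal sketch, assuming a named volume (`librechat_mongodb_data` is a guess - check `docker volume ls`; some LibreChat setups bind-mount a local `./data-node` directory instead):

```bash
# Copy the (possibly corrupt) MongoDB data out before nuking it
docker run --rm \
  -v librechat_mongodb_data:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf "/backup/mongo-$(date +%F).tar.gz" -C / data
```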
Since the database was corrupted beyond repair, I had to nuke everything and start fresh:
DANGER AHEAD! PROCEED WITH CAUTION
```bash
# Stop all containers and remove volumes
docker-compose down -v

# Remove orphaned volumes
docker volume prune -f

# Check what's left
docker volume ls

# The nuclear option - removes everything
docker system prune -a --volumes
```

That last command is particularly destructive. It removes:
- all stopped containers
- all networks not used by at least one container
- all anonymous volumes not used by at least one container
- all images without at least one container associated to them
- all build cache
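In hindsight, there's also a gentler option if you only need to reset the database: remove just the MongoDB data and leave images, networks, and other volumes alone. A sketch under the same assumption as the backup above (verify the volume name with `docker volume ls` first):

```bash
# Targeted reset: drop only MongoDB's data, keep everything else
docker compose down
docker volume rm librechat_mongodb_data   # hypothetical name -- verify first
```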
After the purge, I rebuilt everything from scratch:
```bash
docker compose up -d
```

LibreChat came back up with a fresh MongoDB instance. All my chat history was gone, but at least it was working.
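Before declaring victory, a quick health check doesn't hurt (again, `chat-mongodb` is an assumed container name):

```bash
# Verify all containers are up and MongoDB answers a ping
docker compose ps
docker exec chat-mongodb mongosh --quiet --eval 'db.runCommand({ ping: 1 })'
# → { ok: 1 }
```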
The Mystery Remains 🕵🏻‍♂️
Here's the frustrating part: I still don't know why adding the Pal MCP server configuration corrupted MongoDB. Common sense says a config change shouldn't touch the database files at all.
The fact that it happened twice with the same config change suggests it's not random, but I haven't been able to pin down the root cause (yet!).
Here's what Sonnet 4.5 suggested - which, spoiler, turned out to be the heart of the eventual fix (see the end of this post):
The issue is with your PAL MCP server configuration - specifically the shell command and path handling. The complex shell script is likely causing resource issues or hanging during startup, which is blocking MongoDB initialization.
The Problem
Your PAL MCP config uses a shell loop that:
- Searches multiple paths for `uvx`
- Clones and installs from git
- Has `~/.local/bin` in PATH (tilde expansion doesn't work in Docker)

This is probably timing out or consuming resources during container startup, preventing MongoDB from initializing properly.
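The tilde point, at least, is easy to verify: environment values in a Docker config never pass through a shell, so `~` is not expanded. A one-liner sketch (the alpine image and paths are just for illustration):

```bash
# The literal "~" survives into PATH, so ~/.local/bin is never actually searched
docker run --rm -e 'PATH=/usr/local/bin:/usr/bin:/bin:~/.local/bin' \
  alpine sh -c 'echo "$PATH"'
# → /usr/local/bin:/usr/bin:/bin:~/.local/bin
```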
Solution
Replace your PAL MCP configuration with a simpler, Docker-friendly approach:
```yaml
mcpServers:
  pal:
    command: uvx
    args:
      - --from
      - git+https://github.com/BeehiveInnovations/pal-mcp-server.git
      - pal-mcp-server
    env:
      OPENROUTER_API_KEY: "<INSERT_KEY_HERE>"
```
What's Next
I'm now back to square one: LibreChat running without the Pal MCP integration. Both tools are essential to my workflow:
LibreChat's Chain of Agents
I also depend heavily on LibreChat's no-code Agent Chain. Think Unix pipes, but for LLMs: each prompt-model combo is a Unix command doing one job, and the Agent Chain is the pipe between them, carrying output from one prompt to the next.

In other words: an agent chain is a no-code pipeline of prompts, where each prompt does one thing and does it well (via an LLM model that specializes in that task) and hands its output to the next...
In the Unix philosophy:

```bash
cat file.txt | transform | format
```

chains commands via pipes. In LibreChat:

```
extraction prompt (via gpt5.2) → analyze (via sonnet 4.5) → summarize (via gemini 3.5 pro)
```

chains prompt-model combos via the agent chain.
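For the terminally Unix-brained, here is roughly what such a chain looks like as actual pipes against OpenRouter's OpenAI-compatible API. This is a toy sketch only - the model slugs are illustrative, curl/jq are assumed to be installed, and it's emphatically not how LibreChat implements agent chains internally:

```bash
#!/bin/sh
# ask MODEL PROMPT -> prints the model's reply on stdout
ask() {
  curl -s https://openrouter.ai/api/v1/chat/completions \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$(jq -n --arg m "$1" --arg p "$2" \
          '{model: $m, messages: [{role: "user", content: $p}]}')" |
    jq -r '.choices[0].message.content'
}

# Each step does one job and hands its output to the next
extracted=$(ask "openai/gpt-4o-mini" "Extract the key claims from: $(cat file.txt)")
analysis=$(ask "anthropic/claude-sonnet-4.5" "Analyze these claims: $extracted")
ask "google/gemini-2.5-pro" "Summarize this analysis: $analysis"
```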
Pal MCP's Council of LLMs
For complex decisions, I want multiple models with different perspectives to analyze the problem. Pal MCP makes this possible by orchestrating multi-model consensus.
Update: I have since figured out how to safely add Pal MCP to LibreChat without corrupting everything - see the instructions below. If you've integrated these two tools a different way, I'd love to hear how you did it.
Have you experienced similar issues with LibreChat, MCP servers, or MongoDB in Docker? Reach out - I'm curious if this is a known issue or if I'm just uniquely unlucky.
Until I landed on the configuration below, the two ran separately: Pal MCP in Claude Code, and my MongoDB slept peacefully.
How to get Pal MCP working in LibreChat
This is what my bare-minimum librechat.yaml looks like (feel free to remove the parts you don't need). Make sure to replace <INSERT_OPENROUTER_KEY_HERE> with your OpenRouter API key:
```yaml
version: 1.2.8

cache: true

endpoints:
  agents:
    recursionLimit: 100     # raise the default agent step limit (25) to 100
    maxRecursionLimit: 500  # allow up to 500 when adjusting via the UI
  custom:
    - name: "OpenRouter"
      apiKey: "<INSERT_OPENROUTER_KEY_HERE>"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["gpt-3.5-turbo"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "OpenRouter"

mcpServers:
  memory:
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-memory"
  pal:
    command: uvx
    args:
      - --from
      - git+https://github.com/BeehiveInnovations/pal-mcp-server.git
      - pal-mcp-server
    env:
      OPENROUTER_API_KEY: "<INSERT_OPENROUTER_KEY_HERE>"
```

The recursionLimit and maxRecursionLimit settings are there to fix this error, should you hit it:
```
Something went wrong. Here's the specific error message we encountered: An error occurred while processing the request: Recursion limit of 25 reached without hitting a stop condition. You can increase the limit by setting the "recursionLimit" config key. Troubleshooting URL: https://langchain-ai.github.io/langgraphjs/troubleshooting/errors/GRAPH_RECURSION_LIMIT/
```
And this is my docker-compose.override.yml:

```yaml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
    command: sh -c "apk add --no-cache git && pip install --break-system-packages uv && npm run backend"
```