I started reading and thought it wasn’t that bad, just what I expected from closed-source software, but then came this:
Team Memory Sync, an unreleased internal project. There’s a bidirectional sync service (src/services/teamMemorySync/index.ts) that connects local memory files to api.anthropic.com/api/claude_code/team_memory, providing a way to share memories with other team members within an organization. The service includes a secret scanner (secretSanner.ts) that uses regexes for around 40 known token and API key formats (AWS, Azure, GCP, etc.), but sensitive data that doesn’t match those regexes could still be exposed to other team members through memory sync.
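To make the limitation concrete, here is a minimal sketch of what a regex-allowlist secret scanner looks like. The patterns and function names below are illustrative assumptions, not the actual contents of secretSanner.ts:

```typescript
// Hypothetical sketch of a pattern-based secret scanner.
// These regexes are examples of well-known token formats; the real
// service reportedly checks ~40 of them.
const SECRET_PATTERNS: RegExp[] = [
  /AKIA[0-9A-Z]{16}/,          // AWS access key ID
  /ghp_[A-Za-z0-9]{36}/,       // GitHub personal access token
  /xox[baprs]-[A-Za-z0-9-]+/,  // Slack token
];

function containsKnownSecret(text: string): boolean {
  return SECRET_PATTERNS.some((p) => p.test(text));
}

// A well-known token format is caught:
containsKnownSecret("key=AKIAIOSFODNN7EXAMPLE"); // true
// ...but an internal password or bespoke token slips through:
containsKnownSecret("db_password=hunter2");      // false
```

The failure mode is inherent to the approach: anything secret that doesn’t look like one of the enumerated formats (internal credentials, customer data, plain passwords) passes the scanner and would sync.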
This seems like a great idea!
On one hand, yeah, it’s bad, but from the wording it seems like it’s meant for organizations, that is, for work.
If you’re putting sensitive data into an AI service your employer provides for work, I have no notes.