What Are Signals?
Signals are raw, captured moments from your development sessions: aha moments, decisions, and friction points. They're the atomic units of insight that eventually cluster into patterns. Not every signal becomes a pattern, and that's okay.
Recent activity: 11 in the last 7 days; breakdown: 48%, 44%, and 8% of signals.
- **Cold Start Problem IS PL's entire value proposition**
- **Beren's Hive Mind = PL's Architecture**
- **The Supervision Paradox creates demand for PL**
- **Priority mapping for PL features**
- **Mark articles 165, 168, 179 as high-confidence validation**
- **CS193p has principles embedded in its materials**
- **Review screen needs a "Your Principles" panel**
- **Freeform drag-drop over guided wizard**
- **Lazy analysis (after organize, not during upload)**
- **Dismissed suggestions collapse, don't disappear**
- **Gaps = principle violations, not external standards**
- The relevance scoring system creates compound signals
- Watch list as living hypothesis tracker
- Use slash commands instead of shell scripts for digest/trends
- Create `intake-hn.sh` as placeholder + fetch pattern
- Six topic filters aligned with articles/ structure
- Trend tracking with explicit watch list
- Shell script can't call MCP tools directly
- No deduplication between INBOX and existing articles
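The missing-deduplication friction noted above could be addressed with a normalized-URL check before an INBOX item is accepted. This is a sketch under assumed inputs, not the existing `intake-hn.sh` behavior:

```python
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Strip scheme, query string, and trailing slash so minor URL variants match."""
    parts = urlsplit(url)
    return (parts.netloc + parts.path).rstrip("/").lower()

def is_duplicate(candidate: str, existing_urls: list[str]) -> bool:
    """True if the candidate URL already exists under any trivial variant."""
    seen = {normalize(u) for u in existing_urls}
    return normalize(candidate) in seen

# Example: the same article with and without a query string is caught.
existing = ["https://example.com/articles/165?ref=hn"]
print(is_duplicate("http://example.com/articles/165/", existing))  # True
```

Keying on netloc + path (rather than the raw string) is what lets `http://` vs `https://` and tracking parameters collapse to one entry.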
- Current parser suffers from task conflict
- Presence head decoupling = garbage collection optimization
- Exemplar search enables "sessions like THIS one" queries
- Implement SAM3-inspired presence detection in parser
- Create exemplar-based search using pattern matching
- Document detector/tracker dual architecture
- Script syntax error in exemplar search
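An exemplar-based "sessions like THIS one" query can be sketched as nearest-neighbor search over per-session vectors. The embeddings are assumed to already exist; how they are produced is out of scope here:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def sessions_like(exemplar, sessions, top_k=2):
    """Rank stored sessions by similarity to the exemplar session's vector."""
    ranked = sorted(sessions.items(), key=lambda kv: cosine(exemplar, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Hypothetical session vectors, purely for illustration.
sessions = {
    "parser-refactor": [0.9, 0.1, 0.0],
    "ui-kanban":       [0.1, 0.9, 0.2],
    "parser-presence": [0.8, 0.2, 0.1],
}
print(sessions_like([0.85, 0.15, 0.05], sessions))  # ['parser-refactor', 'parser-presence']
```

The exemplar is itself a session vector, which is what makes the query "like THIS one" rather than a keyword search.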
- **Two Truths doctrine maps directly to three-tier model**
- **Lakatos's "degenerating research program" is a critical success test**
- **This session itself may be the problem it describes**
- **Pattern surfacing dilemma connects to multiple sessions**
- **Added Lakatos Test to Week 1 gate criteria**
- **20+ directional relationships exist even in "simplified" views**
- **Lens-switching is the feature, not multi-lens display**
- **Adopt lens-based connection mapping for session relationships**
- Vector DB could automate the "2+ references across contexts" promotion criteria
- Tension between automated discovery and manual connection-making
- Stuck sessions analysis is a killer use case
- Hold vector DB implementation until Week 3 (after Week 1 gate passes)
- Document the concept now, implement later if valuable
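The "2+ references across contexts" promotion rule is mechanical enough to automate. A minimal sketch, assuming each signal carries the set of distinct contexts that reference it (the data shape and names are hypothetical):

```python
def promotable(references_by_signal: dict[str, set[str]],
               threshold: int = 2) -> list[str]:
    """A signal becomes a pattern candidate once enough distinct contexts cite it."""
    return [signal for signal, contexts in references_by_signal.items()
            if len(contexts) >= threshold]

# Illustrative data: contexts are sessions, articles, etc.
refs = {
    "cold-start-is-the-value-prop": {"session-a", "article-165"},
    "lazy-analysis":                {"session-b"},
}
print(promotable(refs))  # ['cold-start-is-the-value-prop']
```

Counting *distinct* contexts (a set, not a tally) is the important detail: two mentions inside one session shouldn't trigger promotion.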
- All major AI providers converged on same structured outputs approach
- Structured outputs eliminate the "prompt engineering for JSON" anti-pattern
- Three-layer validation strategy maps perfectly to garbage collection model
- Use structured outputs for session knowledge extraction
- Choose Claude Sonnet 4.5 as primary extraction model
- Three-tier schema design (SessionExtraction, IntellectualInputTracking, PatternEmergence)
- Had to read 5 separate documentation sources to synthesize best practices
- Not yet clear how to handle "partial success" extractions
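The three-layer validation idea can be sketched without any provider SDK: parse, check structure, then check content. The layer boundaries and the required field names below are assumptions, not the actual `SessionExtraction` schema:

```python
import json

REQUIRED_FIELDS = {"aha_moments", "decisions", "friction_points"}

def validate_extraction(raw: str):
    """Three layers: (1) JSON parses, (2) required keys present, (3) not all empty."""
    try:
        data = json.loads(raw)                      # layer 1: syntactic
    except json.JSONDecodeError:
        return None, "invalid JSON"
    missing = REQUIRED_FIELDS - data.keys()         # layer 2: structural
    if missing:
        return None, f"missing fields: {sorted(missing)}"
    if not any(data[f] for f in REQUIRED_FIELDS):   # layer 3: semantic
        return None, "empty extraction"
    return data, None

ok, err = validate_extraction(
    '{"aha_moments": ["x"], "decisions": [], "friction_points": []}')
print(err)  # None
```

Layer 3 is also where a "partial success" policy would live: the sketch above accepts an extraction as long as any field is non-empty, but that cutoff is a design choice, not a given.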
- Task table pattern perfectly maps to knowledge management needs
- This is a meta-dogfooding moment (building knowledge system using knowledge system principles)
- Table views answer experiment question "which principles are dead weight?"
- Create comprehensive PRD with user stories before coding
- Augment kanban with table view (toggle), don't replace
- Phase implementation (Inputs Week 1, Sessions Week 2, Patterns Week 3)
- The garbage collection model mirrors how human brains learn
- Representational change is binary (cold→hot), not gradual (warmer→warmer)
- Visual leaderboard could track when representational changes happen
- Keep "Aha Moments" section in session template (already there, now validated)
- Validation must come from cross-references, not subjective ratings
- Shadcn template perfectly aligns with Visual Leaderboard concept
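The binary cold→hot framing implies a one-way state flip driven by cross-references, not a score that drifts upward. A minimal sketch with an assumed threshold of two cross-references (the class and rule are illustrative):

```python
class Principle:
    """Hot/cold is a binary flag flipped by cross-references, not a gradual score."""

    def __init__(self, name: str):
        self.name = name
        self.hot = False
        self.cross_refs = 0

    def add_cross_reference(self):
        self.cross_refs += 1
        if self.cross_refs >= 2:   # assumed threshold: validated by 2+ cross-references
            self.hot = True        # flips once; there is no "warmer" in between

p = Principle("lazy-analysis")
p.add_cross_reference()
print(p.hot)   # False
p.add_cross_reference()
print(p.hot)   # True
```

Note that the flip is driven entirely by cross-reference count, matching the rule that validation comes from cross-references rather than subjective ratings.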
- Created CLAUDE.md for repository guidance
- Identified Shadcn UI template for Visual Leaderboard implementation