notebooklm-skill: The Open-Source Pipeline From Research to Content at Zero Cost
The only open-source tool connecting trend discovery, NotebookLM research, AI content creation, and multi-platform publishing in one zero-cost pipeline.
What if you could go from “I want to write about X” to a published article, podcast, slide deck, and social posts — all automated, all cited, and all at zero API cost?
That’s exactly what notebooklm-skill does.
The Problem With AI Content Pipelines
Most AI content workflows have two painful gaps:
- AI-generated content lacks real citations. LLMs hallucinate sources. Even RAG pipelines require expensive embedding infrastructure.
- Research tools charge per query. Perplexity, You.com, and similar services bill for every search. Scale up, and costs compound fast.
Google’s NotebookLM solves both problems — it grounds research in real sources with verifiable citations, generates 10 different artifact types (podcasts, slide decks, reports, quizzes, and more), and it’s completely free.
The missing piece? Automation. NotebookLM has no public API. It’s a manual, browser-only tool.
Until now.
What notebooklm-skill Does
notebooklm-skill is the only open-source tool that chains the full content creation workflow:
Trend Discovery → NotebookLM Deep Research → AI Content Creation → Multi-Platform Publishing
Built on notebooklm-py v0.3.4, it wraps NotebookLM’s internal HTTP APIs. After a one-time browser login, all subsequent operations are pure HTTP — no browser automation overhead per call.
Dual-Mode Architecture
The tool works in two modes:
| Mode | How It Works | Compatible With |
|---|---|---|
| Claude Code Skill | Drop SKILL.md into .claude/skills/ — auto-detected by trigger phrases | Claude Code |
| MCP Server | Run python3 mcp-server/server.py — stdio or HTTP mode | Claude Code, Cursor, Gemini CLI, any MCP client |
This means the same tool works whether you’re a Claude Code user, a Cursor user, or building your own MCP-compatible agent.
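For a rough sense of what an MCP client exchanges with the server over stdio, here is a sketch of building a JSON-RPC 2.0 `tools/call` request, the message shape MCP uses for tool invocation. The tool name `nlm_create_notebook` is a hypothetical example for illustration, not a confirmed name from the project's tool list:

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request line, as MCP clients send over stdio."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical tool name, for illustration only.
line = build_tool_call("nlm_create_notebook", {"title": "Research Topic"})
```

The server replies on stdout with a matching JSON-RPC response; any MCP-compatible client (Claude Code, Cursor, Gemini CLI) handles this framing for you.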
The Numbers
| Category | Count |
|---|---|
| CLI Commands | 11 commands |
| MCP Tools | 13 tools |
| Pipeline Workflows | 5 end-to-end pipelines |
| Artifact Types | 10 (all generated by Google’s servers) |
| Source Types | 11 (URLs, YouTube, PDFs, Google Drive, and more) |
| API Cost | $0 |
10 Artifact Types — All Free
Every artifact is generated by Google’s infrastructure at zero cost to you:
| Artifact | Format | Use Case |
|---|---|---|
| Podcast (Audio Overview) | M4A | Deep dive, brief, critique, or debate formats |
| Video | MP4 | Visual content |
| Slide Deck | PDF | Presentations |
| Report | Markdown | Long-form research report |
| Study Guide | Markdown | Learning guide format |
| Quiz | JSON | Educational assessment |
| Flashcards | JSON | Study material |
| Mind Map | JSON tree | Visual knowledge mapping |
| Infographic | PNG | Visual summaries |
| Data Table | CSV/JSON | Structured data extraction |
The podcast alone supports four formats: deep dive (15-30 min), brief (3-5 min), critique (10-20 min), and debate (two opposing viewpoints, 10-20 min).
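When downloading artifacts in bulk, it helps to map each type to the file extension from the table above so outputs get predictable names. A minimal sketch; the dictionary keys are illustrative labels, not the tool's actual identifiers:

```python
# Extensions taken from the artifact table above; keys are illustrative labels.
ARTIFACT_EXTENSIONS = {
    "podcast": "m4a",
    "video": "mp4",
    "slide_deck": "pdf",
    "report": "md",
    "study_guide": "md",
    "quiz": "json",
    "flashcards": "json",
    "mind_map": "json",
    "infographic": "png",
    "data_table": "csv",
}

def output_filename(notebook_title: str, artifact: str) -> str:
    """Derive a download filename like 'research-topic.podcast.m4a'."""
    slug = notebook_title.lower().replace(" ", "-")
    return f"{slug}.{artifact}.{ARTIFACT_EXTENSIONS[artifact]}"
```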
5 Pipeline Workflows
These are the pre-built end-to-end workflows:
1. Research → Article
Feed URLs → NotebookLM creates a notebook → asks 5 research questions → Claude writes a 1,000-2,000 word cited article.
2. Research → Social Posts
URLs → notebook → summary → platform-specific posts for Threads, Instagram, and Facebook.
3. Trend → Content
Integrates with trend-pulse to discover trending topics, research them in NotebookLM, and generate content automatically.
4. Batch Digest
RSS feeds → notebook → categorized newsletter digest with Q&A.
5. Generate All Artifacts
URLs → notebook → generates and downloads all 10 artifact types in one shot.
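Scripting one of these pipelines from another program is straightforward, since each is a CLI invocation. A sketch that builds the research-to-article command shown later in this post (it assumes you are running inside a checkout of the repo and that the flags behave as documented):

```python
import subprocess  # used by the commented-out run below

def research_to_article_cmd(urls, questions=5, fmt="markdown"):
    """Build the CLI invocation for the research-to-article pipeline."""
    return [
        "python3", "scripts/pipeline.py", "research-to-article",
        "--urls", *urls,
        "--questions", str(questions),
        "--format", fmt,
    ]

cmd = research_to_article_cmd(["https://example.com/article1"])
# subprocess.run(cmd, check=True)  # uncomment inside a checkout of the repo
```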
YouTube Video Synthesis
One of the most impressive capabilities: turning slides + podcast into a YouTube-ready video.
The included make_video.sh script:
- Generates a slide deck (PDF) and podcast (M4A) from NotebookLM
- Converts PDF slides to PNG frames with `pdftoppm`
- Composites slides with audio using `ffmpeg`
- Outputs H.264 video with AAC audio and `+faststart` for web streaming
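The conversion and muxing steps can be sketched as command builders in Python. This mirrors the general shape of a `pdftoppm` + `ffmpeg` slideshow pipeline; the actual flags in `make_video.sh` may differ:

```python
def pdftoppm_cmd(pdf_path: str, prefix: str = "slide"):
    """Convert each PDF page to a PNG frame: slide-1.png, slide-2.png, ..."""
    return ["pdftoppm", "-png", pdf_path, prefix]

def ffmpeg_cmd(audio_path: str, seconds_per_slide: int = 10, out: str = "video.mp4"):
    """Composite the PNG frames with the podcast audio into a web-ready MP4."""
    return [
        "ffmpeg",
        "-framerate", f"1/{seconds_per_slide}",    # hold each slide N seconds
        "-i", "slide-%d.png",
        "-i", audio_path,
        "-c:v", "libx264", "-pix_fmt", "yuv420p",  # H.264 video
        "-c:a", "aac",                             # AAC audio
        "-shortest",
        "-movflags", "+faststart",                 # moov atom up front for streaming
        out,
    ]
```

`+faststart` moves the MP4 index to the front of the file so playback can begin before the download finishes, which is why it matters for YouTube and web embeds.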
Authentication: No OAuth, No API Keys
The setup is refreshingly simple:
```shell
pip install -r requirements.txt
python3 -m playwright install chromium
python3 -m notebooklm login   # Opens browser, log in once
```
That’s it. The browser session is saved locally. All subsequent calls are pure HTTP. No Google Cloud project, no OAuth app, no API keys. The session persists for weeks.
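Since the session eventually expires, automation that runs unattended may want to check session freshness before each run. A sketch under stated assumptions: the session-file path is hypothetical (the real location is managed by notebooklm-py), and the 21-day cutoff is just a conservative reading of "persists for weeks":

```python
import time
from pathlib import Path

# Hypothetical location; the actual session file is managed by notebooklm-py.
SESSION_FILE = Path.home() / ".notebooklm" / "session.json"
MAX_AGE_DAYS = 21  # conservative cutoff for a session that "persists for weeks"

def needs_login(session_file: Path = SESSION_FILE, max_age_days: int = MAX_AGE_DAYS) -> bool:
    """True if the saved browser session is missing or older than max_age_days."""
    if not session_file.exists():
        return True
    age_days = (time.time() - session_file.stat().st_mtime) / 86400
    return age_days > max_age_days
```

A cron job could call this first and trigger `python3 -m notebooklm login` only when a fresh login is actually needed.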
Quick Start
As a Claude Code Skill:
```shell
mkdir -p .claude/skills/notebooklm
cp -r notebooklm-skill/SKILL.md .claude/skills/notebooklm/
cp -r notebooklm-skill/scripts/ .claude/skills/notebooklm/scripts/
```
Then just tell Claude: “Research Claude Code best practices using NotebookLM and write an article.”
As an MCP Server:
```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "python3",
      "args": ["/path/to/notebooklm-skill/mcp-server/server.py"]
    }
  }
}
```
CLI usage:
```shell
# Create notebook from URLs
python3 scripts/notebooklm_client.py create \
  --title "Research Topic" \
  --urls "https://example.com/article1" "https://example.com/article2"

# Generate a podcast
python3 scripts/notebooklm_client.py podcast --notebook "Research Topic"

# Full research-to-article pipeline
python3 scripts/pipeline.py research-to-article \
  --urls "https://..." --questions 5 --format markdown
```
How It Compares
| Tool | Research | Artifacts | Automation | Cost |
|---|---|---|---|---|
| notebooklm-skill | NotebookLM (cited) | 10 types | Full pipeline | Free |
| Perplexity API | Per-query search | Text only | Manual | $5-200/mo |
| LangChain/LlamaIndex | DIY RAG | DIY | Custom code | LLM API costs |
| Zapier + NotebookLM | None | None | Partial | $20+/mo |
| NotebookLM UI | Manual | 10 types | None | Free |
The unique position: cited research + artifact generation + full automation + zero cost.
Related Open-Source Projects
notebooklm-skill is part of a broader open-source ecosystem we’re building:
- trend-pulse — Real-time trending topic discovery from 7 free sources (Google Trends, Hacker News, Reddit, and more). Powers the `trend-to-content` pipeline in notebooklm-skill. Works as an MCP server.
- cf-browser — 9 browser tools for Claude Code via Cloudflare Browser Rendering. Screenshot, scrape, crawl, and extract content from any webpage. One-command deployment.
Together, these three tools form a complete content pipeline: discover trends (trend-pulse) → research in depth (notebooklm-skill) → capture web content (cf-browser) → create and publish (notebooklm-skill).
What’s Next
The project is MIT-licensed and actively maintained. Planned additions include more MCP tools (nlm_generate, nlm_download, nlm_list_artifacts), additional pipeline recipes, and deeper integrations with content management systems.
GitHub: claude-world/notebooklm-skill
Demo (Chinese): youtu.be/6M3K4sxahdE
Demo (English): youtu.be/q1kj_OccaVE
If you’re building content pipelines and tired of paying per API call, give it a try. Star it on GitHub if you find it useful — it helps others discover the project. The zero-cost angle alone makes it worth exploring, and the full automation pipeline makes it a game changer.