
notebooklm-skill: The Open-Source Pipeline From Research to Content at Zero Cost

The only open-source tool connecting trend discovery, NotebookLM research, AI content creation, and multi-platform publishing in one zero-cost pipeline.

March 14, 2026 · 8 min read · By Claude World

What if you could go from “I want to write about X” to a published article, podcast, slide deck, and social posts — all automated, all cited, and all at zero API cost?

That’s exactly what notebooklm-skill does.

The Problem With AI Content Pipelines

Most AI content workflows have two painful gaps:

  1. AI-generated content lacks real citations. LLMs hallucinate sources. Even RAG pipelines require expensive embedding infrastructure.
  2. Research tools charge per query. Perplexity, You.com, and similar services bill for every search. Scale up, and costs compound fast.

Google’s NotebookLM solves both problems — it grounds research in real sources with verifiable citations, generates 10 different artifact types (podcasts, slide decks, reports, quizzes, and more), and it’s completely free.

The missing piece? Automation. NotebookLM has no public API. It’s a manual, browser-only tool.

Until now.

What notebooklm-skill Does

notebooklm-skill is the only open-source tool that chains the full content creation workflow:

Trend Discovery → NotebookLM Deep Research → AI Content Creation → Multi-Platform Publishing

Built on notebooklm-py v0.3.4, it wraps NotebookLM’s internal HTTP APIs. After a one-time browser login, all subsequent operations are pure HTTP — no browser automation overhead per call.

Dual-Mode Architecture

The tool works in two modes:

| Mode | How It Works | Compatible With |
| --- | --- | --- |
| Claude Code Skill | Drop `SKILL.md` into `.claude/skills/` — auto-detected by trigger phrases | Claude Code |
| MCP Server | Run `python3 mcp-server/server.py` — stdio or HTTP mode | Claude Code, Cursor, Gemini CLI, any MCP client |

This means the same tool works whether you’re a Claude Code user, a Cursor user, or building your own MCP-compatible agent.

The Numbers

| Category | Count |
| --- | --- |
| CLI Commands | 11 |
| MCP Tools | 13 |
| Pipeline Workflows | 5 end-to-end pipelines |
| Artifact Types | 10 (all generated by Google’s servers) |
| Source Types | 11 (URLs, YouTube, PDFs, Google Drive, and more) |
| API Cost | $0 |

10 Artifact Types — All Free

Every artifact is generated by Google’s infrastructure at zero cost to you:

| Artifact | Format | Use Case |
| --- | --- | --- |
| Podcast (Audio Overview) | M4A | Deep dive, brief, critique, or debate formats |
| Video | MP4 | Visual content |
| Slide Deck | PDF | Presentations |
| Report | Markdown | Long-form research report |
| Study Guide | Markdown | Learning guide format |
| Quiz | JSON | Educational assessment |
| Flashcards | JSON | Study material |
| Mind Map | JSON tree | Visual knowledge mapping |
| Infographic | PNG | Visual summaries |
| Data Table | CSV/JSON | Structured data extraction |

The podcast alone supports four formats: deep dive (15-30 min), brief (3-5 min), critique (10-20 min), and debate (two opposing viewpoints, 10-20 min).
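To make the table concrete, here is a small sketch of how a downloader might route each artifact type to a file extension. The mapping mirrors the table above, but the function and dictionary names are our own illustration, not part of notebooklm-skill's actual API:

```python
# Illustrative only: maps each NotebookLM artifact type (from the table
# above) to the file extension a downloader might use. Names here are
# hypothetical, not notebooklm-skill's real API.
ARTIFACT_EXTENSIONS = {
    "podcast": ".m4a",
    "video": ".mp4",
    "slide_deck": ".pdf",
    "report": ".md",
    "study_guide": ".md",
    "quiz": ".json",
    "flashcards": ".json",
    "mind_map": ".json",
    "infographic": ".png",
    "data_table": ".csv",
}

def output_filename(notebook_title: str, artifact: str) -> str:
    """Build a safe local filename for a downloaded artifact."""
    if artifact not in ARTIFACT_EXTENSIONS:
        raise ValueError(f"unknown artifact type: {artifact}")
    slug = "".join(c if c.isalnum() else "-" for c in notebook_title.lower()).strip("-")
    return f"{slug}-{artifact}{ARTIFACT_EXTENSIONS[artifact]}"
```

A call like `output_filename("Research Topic", "podcast")` yields `research-topic-podcast.m4a`, keeping a batch download of all ten artifacts tidy on disk.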

5 Pipeline Workflows

These are the pre-built end-to-end workflows:

1. Research → Article

Feed URLs → NotebookLM creates a notebook → asks 5 research questions → Claude writes a 1,000-2,000 word cited article.
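The shape of this workflow, state threaded through three stages, can be sketched in a few lines. The step functions below are stubs standing in for the real notebooklm-skill calls; their names and signatures are illustrative, not the actual API:

```python
# A minimal sketch of the research-to-article flow as a chain of steps.
# Each function is a stub for a real notebooklm-skill operation.
def create_notebook(urls):
    # Real step: NotebookLM creates a notebook from the source URLs.
    return {"notebook": "demo", "sources": list(urls)}

def ask_questions(state, n=5):
    # Real step: the skill asks n research questions and collects
    # grounded, cited answers from NotebookLM.
    state["answers"] = [f"answer {i + 1}" for i in range(n)]
    return state

def write_article(state):
    # Real step: Claude drafts a 1,000-2,000 word cited article from
    # the notebook's grounded answers.
    state["article"] = "\n\n".join(state["answers"])
    return state

def research_to_article(urls, questions=5):
    state = create_notebook(urls)
    state = ask_questions(state, n=questions)
    return write_article(state)
```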

2. Research → Social Posts

URLs → notebook → summary → platform-specific posts for Threads, Instagram, and Facebook.

3. Trend → Content

Integrates with trend-pulse to discover trending topics, research them in NotebookLM, and generate content automatically.

4. Batch Digest

RSS feeds → notebook → categorized newsletter digest with Q&A.
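The first hop of this workflow, pulling items out of an RSS feed before they are fed into a notebook, can be sketched with only the standard library. The feed XML below is a toy example, and this parsing code is our own sketch rather than what the pipeline script actually runs:

```python
# Sketch of extracting item titles from an RSS 2.0 feed, the raw input
# to the batch-digest workflow. Standard library only.
import xml.etree.ElementTree as ET

def rss_item_titles(rss_xml: str) -> list[str]:
    root = ET.fromstring(rss_xml)
    # RSS 2.0 layout: <rss><channel><item><title>...</title></item></channel></rss>
    return [item.findtext("title", default="") for item in root.iter("item")]

FEED = """<rss version="2.0"><channel><title>Demo</title>
<item><title>First post</title></item>
<item><title>Second post</title></item>
</channel></rss>"""
```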

5. Generate All Artifacts

URLs → notebook → generates and downloads all 10 artifact types in one shot.

YouTube Video Synthesis

One of the most impressive capabilities: turning slides + podcast into a YouTube-ready video.

The included make_video.sh script:

  1. Generates a slide deck (PDF) and podcast (M4A) from NotebookLM
  2. Converts PDF slides to PNG frames with pdftoppm
  3. Composites slides with audio using ffmpeg
  4. Outputs H.264 video with AAC audio, +faststart for web streaming
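Steps 3 and 4 boil down to one ffmpeg invocation. The command builder below is our own illustration of what that invocation roughly looks like; the real make_video.sh may use different flags or frame timing:

```python
# Builds an ffmpeg argument list for compositing PNG slide frames with
# an M4A audio track into a web-ready H.264/AAC MP4. Illustrative only;
# the actual make_video.sh may differ in flags and timing.
def build_ffmpeg_cmd(frames_pattern: str, audio: str, out: str,
                     secs_per_slide: float = 10.0) -> list[str]:
    return [
        "ffmpeg",
        "-framerate", str(1.0 / secs_per_slide),  # one slide every N seconds
        "-i", frames_pattern,                     # e.g. slides-%02d.png from pdftoppm
        "-i", audio,                              # podcast M4A from NotebookLM
        "-c:v", "libx264",                        # H.264 video
        "-pix_fmt", "yuv420p",                    # broad player compatibility
        "-c:a", "aac",                            # AAC audio
        "-shortest",                              # stop at the shorter stream
        "-movflags", "+faststart",                # moov atom up front for web streaming
        out,
    ]
```

The `+faststart` flag is what makes the output stream immediately in a browser: it moves the MP4 index to the front of the file so playback can begin before the download finishes.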

See it in action in the demo videos linked at the end of this post.

Authentication: No OAuth, No API Keys

The setup is refreshingly simple:

pip install -r requirements.txt
python3 -m playwright install chromium
python3 -m notebooklm login  # Opens browser, log in once

That’s it. The browser session is saved locally. All subsequent calls are pure HTTP. No Google Cloud project, no OAuth app, no API keys. The session persists for weeks.
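Under the assumption that the saved session is a JSON object of cookie names and values (the real notebooklm-py storage format may well differ), reusing it for pure-HTTP calls looks roughly like this:

```python
# Sketch of reusing a locally saved browser session for plain HTTP
# requests. Assumes cookies are stored as a flat {name: value} JSON
# object; notebooklm-py's actual session format may differ.
import json
from pathlib import Path

def cookie_header(session_file: str) -> str:
    """Turn saved {name: value} cookies into a Cookie header string."""
    cookies = json.loads(Path(session_file).read_text())
    return "; ".join(f"{name}={value}" for name, value in cookies.items())
```

Every subsequent request just attaches that header, which is why no browser, OAuth app, or API key is needed after the one-time login.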

Quick Start

As a Claude Code Skill:

mkdir -p .claude/skills/notebooklm
cp notebooklm-skill/SKILL.md .claude/skills/notebooklm/
cp -r notebooklm-skill/scripts/ .claude/skills/notebooklm/scripts/

Then just tell Claude: “Research Claude Code best practices using NotebookLM and write an article.”

As an MCP Server:

{
  "mcpServers": {
    "notebooklm": {
      "command": "python3",
      "args": ["/path/to/notebooklm-skill/mcp-server/server.py"]
    }
  }
}

CLI usage:

# Create notebook from URLs
python3 scripts/notebooklm_client.py create \
  --title "Research Topic" \
  --urls "https://example.com/article1" "https://example.com/article2"

# Generate a podcast
python3 scripts/notebooklm_client.py podcast --notebook "Research Topic"

# Full research-to-article pipeline
python3 scripts/pipeline.py research-to-article \
  --urls "https://..." --questions 5 --format markdown

How It Compares

| Tool | Research | Artifacts | Automation | Cost |
| --- | --- | --- | --- | --- |
| notebooklm-skill | NotebookLM (cited) | 10 types | Full pipeline | Free |
| Perplexity API | Per-query search | Text only | Manual | $5-200/mo |
| LangChain/LlamaIndex | DIY RAG | DIY | Custom code | LLM API costs |
| Zapier + NotebookLM | None | None | Partial | $20+/mo |
| NotebookLM UI | Manual | 10 types | None | Free |

The unique position: cited research + artifact generation + full automation + zero cost.

Part of a Broader Ecosystem

notebooklm-skill is part of a broader open-source ecosystem we’re building:

  • trend-pulse — Real-time trending topic discovery from 7 free sources (Google Trends, Hacker News, Reddit, and more). Powers the trend-to-content pipeline in notebooklm-skill. Works as an MCP Server.
  • cf-browser — 9 browser tools for Claude Code via Cloudflare Browser Rendering. Screenshot, scrape, crawl, and extract content from any webpage. One-command deployment.

Together, these three tools form a complete content pipeline: discover trends (trend-pulse) → research in depth (notebooklm-skill) → capture web content (cf-browser) → create and publish (notebooklm-skill).

What’s Next

The project is MIT-licensed and actively maintained. Planned additions include more MCP tools (nlm_generate, nlm_download, nlm_list_artifacts), additional pipeline recipes, and deeper integrations with content management systems.

GitHub: claude-world/notebooklm-skill Demo (Chinese): youtu.be/6M3K4sxahdE Demo (English): youtu.be/q1kj_OccaVE

If you’re building content pipelines and tired of paying per API call, give it a try. Star it on GitHub if you find it useful — it helps others discover the project. The zero-cost angle alone makes it worth exploring, and the full automation pipeline makes it a game changer.