How to Automatically Download New YouTube Shorts from Your Favorite Channels
5/25/2025 | Team CapsAI

Keeping your offline library up to date manually can be a chore. Automate the entire process so that whenever a creator you follow publishes a new Short, it’s downloaded to your device or server without lifting a finger. Here’s how to set it up in 2025:
1. Use yt-dlp with a Simple Cron Job
Platform: Cross-Platform CLI
Link & Docs: https://github.com/yt-dlp/yt-dlp
- Install yt-dlp
pip install yt-dlp
- Create a Text File of Channel URLs
List each channel’s Shorts feed URL in channels.txt, one per line (if you only know a channel’s ID, see the note at the end of this section):
https://www.youtube.com/feeds/videos.xml?playlist_id=PLSHORTS_FOR_ChannelA
https://www.youtube.com/feeds/videos.xml?playlist_id=PLSHORTS_FOR_ChannelB
- Write a Shell Script (download_shorts.sh)
#!/usr/bin/env bash
# Read each feed URL from channels.txt and download any new Shorts it lists.
while IFS= read -r feed; do
  # Extract video URLs from the feed (requires xmlstarlet; YouTube feeds
  # use the Atom namespace, so bind it explicitly and print one URL per line)
  vids=$(curl -s "$feed" | xmlstarlet sel -N a="http://www.w3.org/2005/Atom" \
    -t -m "//a:entry/a:link[@rel='alternate']" -v @href -n)
  for url in $vids; do
    yt-dlp --download-archive archive.txt \
      -f bestvideo+bestaudio --merge-output-format mp4 "$url"
  done
done < channels.txt
The --download-archive archive.txt option prevents re-downloading Shorts you already have.
- Schedule with Cron: This crontab entry runs the script at the top of every hour, grabbing any new Shorts from your channels.
0 * * * * /path/to/download_shorts.sh
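Note: if you don’t have a Shorts playlist ID for a channel, YouTube also publishes a per-channel Atom feed keyed by channel ID. It lists all recent uploads, not just Shorts, so pair it with a URL or duration filter in your script:
https://www.youtube.com/feeds/videos.xml?channel_id=UCxxxxxxxxxxxxxxxxxxxxxx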
2. Leverage the YouTube Data API & Python
Platform: Cross-Platform, Requires API Key
Docs & Signup: https://developers.google.com/youtube/v3
- Enable YouTube Data API in Google Cloud and get an API key.
- Install Dependencies
pip install google-api-python-client yt-dlp
- Python Script (auto_shorts.py)
from googleapiclient.discovery import build
import subprocess, time

API_KEY = 'YOUR_API_KEY'
CHANNEL_IDS = ['UCxxxxx', 'UCyyyyy']
ytdl_opts = ['yt-dlp', '-f', 'best', '--download-archive', 'archive.txt']

youtube = build('youtube', 'v3', developerKey=API_KEY)

def fetch_shorts(channel_id):
    # Search for videos published in the last hour. The API has no
    # Shorts-specific filter; videoDuration='short' limits results to
    # videos under four minutes.
    published_after = time.strftime('%Y-%m-%dT%H:%M:%SZ',
                                    time.gmtime(time.time() - 3600))
    req = youtube.search().list(part='id', channelId=channel_id,
                                type='video', videoDuration='short',
                                order='date', maxResults=10,
                                publishedAfter=published_after)
    return ['https://youtu.be/' + item['id']['videoId']
            for item in req.execute()['items']]

if __name__ == '__main__':
    for cid in CHANNEL_IDS:
        for url in fetch_shorts(cid):
            subprocess.run(ytdl_opts + [url])
- Automate
Schedule this with cron or Windows Task Scheduler to run every hour or as often as you like.
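One caveat: videoDuration='short' means “under four minutes,” not “is a Short.” If you want to download only true Shorts, a minimal sketch like this (reusing the youtube client from auto_shorts.py; the 180-second cutoff is an assumption based on the current Shorts length limit) checks each candidate’s actual duration first:
import re

def is_short(video_id, max_seconds=180):
    # videos.list returns an ISO 8601 duration, e.g. 'PT45S' or 'PT2M10S';
    # assumes the `youtube` client built in auto_shorts.py above
    resp = youtube.videos().list(part='contentDetails', id=video_id).execute()
    duration = resp['items'][0]['contentDetails']['duration']
    m = re.fullmatch(r'PT(?:(\d+)M)?(?:(\d+)S)?', duration)
    if not m:
        return False  # hours-long or unparsable duration: not a Short
    minutes, seconds = int(m.group(1) or 0), int(m.group(2) or 0)
    return minutes * 60 + seconds <= max_seconds
Call is_short(video_id) before handing each URL to yt-dlp.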
3. No-Code Automation with Zapier or Make (Integromat)
Platforms: Web-Based Services
Links:
- Zapier: https://zapier.com/apps/youtube/integrations
- Make: https://www.make.com/en/integrations/youtube
- Trigger: “New Video in Channel” on the YouTube app; filter for Shorts (e.g., video URL contains /shorts/).
- Action: a Webhook or Code by Zapier module that calls yt-dlp on your server (a receiver sketch follows this list), or saves the URL to a Dropbox/Google Drive folder.
- Run Frequency: Zapier polls every 5–15 minutes (depending on plan) to detect new Shorts.
- Output: Use FTP/SSH or cloud-storage integrations to push downloaded files to your NAS.
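For the webhook route, here is a minimal sketch of a receiver your Zap could call, built on Python’s standard library; the port and the expected JSON body ({"url": ...}) are assumptions to match to your Zap’s configuration:
import json, subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class DownloadHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"url": "https://www.youtube.com/shorts/..."}
        length = int(self.headers.get('Content-Length', 0))
        try:
            url = json.loads(self.rfile.read(length)).get('url', '')
        except ValueError:
            url = ''
        if url.startswith('https://'):
            # Run yt-dlp in the background so the webhook returns quickly
            subprocess.Popen(['yt-dlp', '--download-archive', 'archive.txt', url])
            self.send_response(202)
        else:
            self.send_response(400)
        self.end_headers()

if __name__ == '__main__':
    HTTPServer(('0.0.0.0', 8080), DownloadHandler).serve_forever()
Before exposing this to the internet, add at least a shared-secret check on the request.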
4. Use RSS-to-Email + Email Parser + Script
- Subscribe to Channel RSS: Add each Shorts RSS feed to an RSS-to-email service (e.g., Feedrabbit, Blogtrottr).
- Email Parser: Configure an email parser (Zapier Email Parser, Mailparser.io) to extract the Shorts URL from incoming emails.
- Webhook Trigger: When a new email arrives, the parser hits a webhook that runs your download script (via a small serverless function or IFTTT Webhooks).
This method requires minimal coding but leverages familiar email workflows.
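If you end up writing the extraction yourself, the parser’s core job is just pulling the Shorts URL out of the email body. A sketch of that step (the exact email format varies by RSS-to-email service, so the pattern is an assumption):
import re

def extract_shorts_urls(email_body):
    # Match both youtube.com/shorts/<id> and youtu.be/<id> style links
    pattern = r'https://(?:www\.)?(?:youtube\.com/shorts/|youtu\.be/)[\w-]{11}'
    return re.findall(pattern, email_body)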
5. Best Practices & Tips
- Maintain an Archive File: Always use --download-archive to skip duplicates.
- Monitor Logs: Redirect script output to a log file and set up alerts for errors.
- Split by Channel: Create separate output folders per channel for organized storage.
- Use a VPS or Home Server: Offload downloads to a dedicated machine that’s always online.
- Rate Limiting: Insert short sleeps (sleep 5) between yt-dlp calls to avoid being throttled by YouTube (a combined sketch follows this list).
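Several of these tips combine naturally into one download wrapper. A minimal Python sketch (the downloads/ root and the 5-second pause are assumptions; %(channel)s is a standard yt-dlp output-template field):
import subprocess, time

def download(url):
    subprocess.run([
        'yt-dlp',
        '--download-archive', 'archive.txt',                # skip duplicates
        '-o', 'downloads/%(channel)s/%(title)s.%(ext)s',    # one folder per channel
        url,
    ])
    time.sleep(5)  # brief pause between calls to avoid throttling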