Schroedinger-Hat/youtube-to-anchorfm

How to set up a cron/public API to automate posting?

pooriaarab opened this issue · 7 comments

Hi,

I want to use n8n or Zapier to automate posting from YouTube to Anchor.

How can I go about it?

I think you can use a component in your n8n or Zapier flow that updates the episode.json in your repository. See if you can figure out how to script that, since I haven't used n8n or Zapier myself.
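For example, both tools have generic HTTP-request components that could call GitHub's contents API to overwrite episode.json; the resulting commit would then trigger the action. A rough, untested sketch in shell, where OWNER, REPO, GITHUB_TOKEN, and VIDEO_ID are placeholders your flow would fill in:

# Untested sketch: overwrite episode.json via GitHub's contents API.
# OWNER, REPO, GITHUB_TOKEN and VIDEO_ID are placeholders for your setup.
SHA=$(curl -s -H "Authorization: token $GITHUB_TOKEN" \
    "https://api.github.com/repos/$OWNER/$REPO/contents/episode.json" | jq -r .sha)
CONTENT=$(printf '{"id": "%s"}' "$VIDEO_ID" | base64)
curl -s -X PUT -H "Authorization: token $GITHUB_TOKEN" \
    "https://api.github.com/repos/$OWNER/$REPO/contents/episode.json" \
    -d "{\"message\": \"new episode\", \"content\": \"$CONTENT\", \"sha\": \"$SHA\"}"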

Wabri commented

@pooriaarab any news on this issue?

You can, if necessary, add your own code to detect new episodes, update episode.json, and commit it via a GitHub Action. Just use the on.schedule option.

Example script (untested) to get the last episode:

echo "{'id': '$(youtube-dl -j PLoXdlLuaGN8ShASxcE2A4YuSto3AblDmX --playlist-items 1 \
    | jq .display_id -r)'} > episode.json"

In the example below, that would go in the Fetch latest data step, in its run section:

    - name: Fetch latest data
      run: |-
        echo "{\"id\": \"$(youtube-dl -j PLoXdlLuaGN8ShASxcE2A4YuSto3AblDmX --playlist-items 1 | jq -r .display_id)\"}" > episode.json
        git commit -am "new episode" || exit 0 # exit cleanly: git commit errors when episode.json is unchanged

You'll need to install youtube-dl and jq first, probably via sudo apt install; a sketch of such a step is below.
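For an Ubuntu runner, an install step might look like this (a sketch; jq comes from apt, and the pip build of youtube-dl is usually newer than the apt one):

    - name: Install dependencies
      run: |-
        sudo apt-get update
        sudo apt-get install -y jq python3-pip
        sudo pip3 install youtube-dl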

The commit made (and pushed) inside the scheduled action will then trigger the upload action.
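One caveat: as far as I know, pushes made with the default GITHUB_TOKEN deliberately do not trigger other workflows, to prevent infinite loops. If the upload action doesn't fire, check out the repo with a personal access token instead (the secret name here is hypothetical):

    - name: Check out this repo
      uses: actions/checkout@v2
      with:
        token: ${{ secrets.PAT }} # a personal access token with repo scope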

Example (I've been using this one for years now: https://github.com/weltonrodrigo/NYTdiff/actions):

name: Scrape latest data
on:
  push:
  workflow_dispatch:
  schedule:
    - cron: '6,17,26,46,57 * * * *'
jobs:
  scheduled:
    runs-on: ubuntu-18.04
    steps:
    - name: Check out this repo
      uses: actions/checkout@v2
    - name: Cache
      uses: actions/cache@v2.1.2
      with:
        path: |
          ./.venv
          ./pha*
        key: venv
    - name: Fetch latest data
      env:
        TESTING: ${{ secrets.TESTING}}
        NYT_TWITTER_CONSUMER_KEY: ${{ secrets.NYT_TWITTER_CONSUMER_KEY}}
        NYT_TWITTER_CONSUMER_SECRET: ${{ secrets.NYT_TWITTER_CONSUMER_SECRET}}
        NYT_TWITTER_ACCESS_TOKEN: ${{ secrets.NYT_TWITTER_ACCESS_TOKEN}}
        NYT_TWITTER_ACCESS_TOKEN_SECRET: ${{ secrets.NYT_TWITTER_ACCESS_TOKEN_SECRET}}
        NYT_API_KEY: ${{ secrets.NYT_API_KEY}}
        RSS_URL: ${{ secrets.RSS_URL}}
        PHANTOMJS_PATH: ${{ secrets.PHANTOMJS_PATH}}
      run: |-
        envsubst < run_diff.sh > run.sh
        bash run.sh
    - name: Commit and push if it changed
      run: |-
        git config user.name "Automated"
        git config user.email "actions@users.noreply.github.com"
        git add titles.db nytdiff.log
        timestamp=$(date -u)
        git commit -m "Latest data: ${timestamp}" || exit 0
        git push

Apparently, Google Cloud Run allows request timeouts of up to 15 minutes, which seems more than enough for a run of this code. https://cloud.google.com/run/docs/configuring/request-timeout?hl=pt-br

The free tier is pretty good for a once-a-day run: https://cloud.google.com/run/pricing

We'd need to create a container with an HTTP server in front of the code, possibly with authentication. Maybe take the episode and show config from a POST request authenticated with a bearer token.

This would make it possible to run this with a curl request or a simple custom CLI client.
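Something like this, where the service URL, the /episode path, and the token are all hypothetical:

# Hypothetical sketch: trigger an episode upload on the Cloud Run service.
curl -X POST "https://youtube-to-anchorfm-xxxxx-uc.a.run.app/episode" \
    -H "Authorization: Bearer $API_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"id": "dQw4w9WgXcQ"}'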

Can we add this response to the doc/wiki README? @abe-101 @matevskial

Then we can close this issue

I just want to throw out some caution: GitHub is very quick to block/ban accounts that run an action that triggers another action, when the content of the action doesn't relate to enhancing the codebase (like testing...).

Closing this issue as no development is required