actions/upload-pages-artifact

Download pages artifact

hfhbd opened this issue · 5 comments

hfhbd commented

Is there a corresponding action to download the actual website/pages artifact? The uploaded github-pages artifact attached to the CI run expires after a few days.

Although this action's retention period for uploaded artifacts defaults to roughly 24 hours, you can extend it to up to 90 days (unless your organization enforces a shorter limit) using the retention-days input parameter:

retention-days:
  description: "Duration after which artifact will expire in days."
  required: false
  default: "1"
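
For example, a hypothetical upload step could raise it to the maximum; the path value here is just illustrative:

- name: Upload Pages artifact
  uses: actions/upload-pages-artifact@v1
  with:
    path: ./_site        # illustrative: wherever your built site lives
    retention-days: 90   # keep the artifact for up to 90 days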

Moving beyond that to your general question:

There is currently no way to retrieve or reconstruct a Pages site's artifact after it has been deployed (or at least once the artifact's retention period has expired).

💡 It is an interesting idea, though! Definitely open to considering adding backend support and a separate Action for it in the future, but it's not something we can prioritize right now. 😕

FWIW, I've added an item to the Pages team backlog to consider this in the future. No promises that it will happen or any forecasted timeline, though. ❤️

If you wanted to work around it for now (without using a gh-pages-style branch 😓), you could probably put together a less performant option by:

  1. Including some sort of manifest file (JSON, YAML, XML, whatever) cataloguing all of the files to be included in your Pages artifact, and then including that manifest in the artifact as well (see the sketch after this list)
  2. Creating an Action or script that later downloads that manifest file and then downloads each of the catalogued assets (recreating the directory structure, etc.).

It's not pretty, but just wanted to pitch the idea in case it helps. 🤞🏻 🤷🏻
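
To make that concrete, here is a rough sketch of how both halves could work; the manifest file name, the Pages URL placeholders, and the shell plumbing are my own assumptions, not an official recommendation:

# At build time: catalogue every file in the site directory into a
# plain-text manifest and ship it inside the Pages artifact.
- name: Generate manifest
  run: |
    cd _site
    find . -type f ! -name manifest.txt | sed 's|^\./||' > manifest.txt

# Later, from any other workflow: fetch the manifest from the live site
# and re-download each catalogued asset, recreating the directory layout.
# OWNER/REPO and the _site path are hypothetical placeholders.
- name: Reconstruct site from manifest
  run: |
    base="https://OWNER.github.io/REPO"
    mkdir -p restored && cd restored
    curl -fsSL "$base/manifest.txt" -o manifest.txt
    while IFS= read -r file; do
      mkdir -p "$(dirname "$file")"
      curl -fsSL "$base/$file" -o "$file"
    done < manifest.txt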

hfhbd commented

Thank you for your answer and the workaround; I will try it.

I will just add my use case for your internal backlog item:

I want to document the API of my library. The library has different versions, and the documentation (Javadoc) supports different API versions with a UI version picker. At the moment, I need to store the old API docs on a docs branch in order to keep and link to them.
With the new GitHub Actions deploy/upload Pages artifact actions, there is no need for a separate docs branch, but to link to the old docs, you first need to fetch them.

The workaround of checking out old tags of the library and building the documentation during a new release often doesn't work easily due to dependency, JVM, or OS support issues; for example, older versions of the library don't support a current JVM, so you would need to switch to an older JVM too.

RebeccaStevens commented

@hfhbd Did you find a nice way to do this?

hfhbd commented

@RebeccaStevens We tried the index and download option, but it didn't work well due to the download overhead, so we keep a checked-in docs folder, which is updated/kept in sync by our build tool. Not ideal, but better than having a gh-pages branch.

Here is a workaround:

Stop doing:

  1. actions/upload-pages-artifact@v1
  2. actions/deploy-pages@v2

Instead do:

  1. actions/cache/save@v3
  2. actions/upload-pages-artifact@v1
  3. actions/deploy-pages@v2

And download with (see the combined sketch below):

  1. actions/cache/restore@v3
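
Putting that together, a minimal sketch of both sides could look like the following; the step layout, paths, and cache key are illustrative choices of mine, not spelled out above:

# Publishing workflow: stash the built site in the Actions cache before
# uploading and deploying it. The ./_site path and the cache key are
# illustrative.
- name: Save built site to cache
  uses: actions/cache/save@v3
  with:
    path: ./_site
    key: github-pages-${{ github.sha }}

- name: Upload Pages artifact
  uses: actions/upload-pages-artifact@v1
  with:
    path: ./_site

- name: Deploy to GitHub Pages
  uses: actions/deploy-pages@v2

# Any later workflow: pull the previously built site back out of the cache.
# restore-keys falls back to the most recently saved entry when the exact
# key is not found.
- name: Restore previously built site
  uses: actions/cache/restore@v3
  with:
    path: ./_site
    key: github-pages-${{ github.sha }}
    restore-keys: |
      github-pages-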