This is a library for the developer tool for exploring and testing the Adobe Asset Compute service. To use this, see adobe/aio-cli-plugin-asset-compute.
Access to at least one blob storage container is required for using the development tool. Currently, we only support AWS S3 and Azure Blob Storage.
Note: This cloud storage can be shared across developers and custom workers projects. You do not need a separate cloud storage container per project.
- If you do not already have an account, follow the steps here to create a storage account.
- If you do not already have a container, create a new container:
  - Navigate to your storage account in the Azure portal.
  - In the left menu for the storage account, scroll to the Blob service section, then select Containers.
  - Select the `+ Container` button and choose a name for the container.
  - Select the access level (we recommend private).
  - Select OK to create the container.
- Retrieve the Azure storage account key by following the steps here.
- Set the Azure Storage environment variables in your `.env` file:

```
AZURE_STORAGE_ACCOUNT=
AZURE_STORAGE_KEY=
AZURE_STORAGE_CONTAINER_NAME=
```
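If you prefer the command line over the Azure portal, the container and account key can also be obtained with the Azure CLI. This is a sketch only; the resource group, account, and container names are placeholders to replace with your own:

```sh
# Create a blob container in an existing storage account (assumes you are logged in via `az login`)
az storage container create \
  --name my-asset-compute-container \
  --account-name mystorageaccount \
  --auth-mode login

# Print the first account key; use it as AZURE_STORAGE_KEY
az storage account keys list \
  --resource-group my-resource-group \
  --account-name mystorageaccount \
  --query "[0].value" --output tsv
```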
For complete information on setting up your AWS account and S3 bucket, see the documentation here.
- Create a new bucket under `Services > S3 > Create bucket`.
- Ensure the user has access to the bucket.
  - To check: go to `Services > S3` and search for the bucket (e.g. `my-bucket`). You should be able to see the bucket, click on it, and upload a file.
- Add a bucket policy:
  - The minimal permissions someone needs to grant your user are described by a template S3 policy like the one below. Replace the `BUCKET` and `Principal` values with your own:
```json
{
    "Id": "ExamplePolicy01",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ExampleStatement01",
            "Action": [
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:PutObject"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::BUCKET/*", // replace with your own bucket
                "arn:aws:s3:::BUCKET"    // replace with your own bucket
            ],
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:user/Dave" // replace with your own user ARN
                ]
            }
        }
    ]
}
```
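If you administer the bucket yourself, the policy can also be attached from the command line instead of the console. A sketch, assuming the document above is saved locally as `policy.json` with your own bucket name and principal filled in:

```sh
# Attach the bucket policy (strip the inline comments first; bucket policies must be plain JSON)
aws s3api put-bucket-policy --bucket my-bucket --policy file://policy.json
```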
- Retrieve the AWS keys for your user. You might have them stored locally somewhere; otherwise, create new AWS keys for your user:
  - Go to `Services > IAM > Users`.
  - Search for your username and click on the user.
  - Go to the `Security credentials` tab.
  - If you created keys previously, you will see them listed, but AWS will not show a secret key again; if you have lost it, delete an old key first.
  - Click `Create access key`.
  - Copy the access key and the secret key and store them safely (e.g. in your password manager).
- Fill in the AWS S3 credentials in your `.env` file:

```
S3_BUCKET=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
```
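To confirm the credentials and bucket are set up correctly before running the dev tool, a quick listing with the AWS CLI can help. This is a sketch; the bucket name and region are placeholders:

```sh
# List the bucket with the same credentials the dev tool will use
AWS_ACCESS_KEY_ID=<your access key id> \
AWS_SECRET_ACCESS_KEY=<your secret access key> \
aws s3 ls s3://my-bucket --region us-east-1
```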
- Be granted System Admin or Developer Role access in the Experience Cloud Organization (this can be set by a System Admin in the Admin Console).
- Log in to the Adobe Developer Console.
  - Make sure you are in the same Adobe Experience Cloud Organization as your AEM as a Cloud Service integration.
  - For reference, here is the Adobe Developer Console documentation.
- Create a new project and subscribe to the services needed for Asset Compute:
  - Go back to the Organization's main Projects page.
  - Click "Create new project" => "Empty project".
  - Go into your new project and add the following APIs and services (you must add each service one at a time):
    - Click on "Add to Project" => "API" and add each of these services one at a time: "Asset Compute", "IO Events", "IO Events Management".
    - Click on "Add to Project" => "Runtime" to add Adobe IO Runtime to your project.
      Note: You will be prompted to create a private key. Please save it to a safe place on your machine. This file path will be the `ASSET_COMPUTE_PRIVATE_KEY_FILE_PATH` environment variable in your `.env` file.
- Retrieve and format credentials for Asset Compute service development:
  Navigate to the Asset Compute project or workspace you created above and press the "Download" button to download your credentials. Save this file to a secure location on your machine; it is needed to use the Asset Compute development tool. Use its file path as the `ASSET_COMPUTE_INTEGRATION_FILE_PATH` environment variable in the `.env` file.
Yarn is required for installing dependencies for this project. See internal dependencies to learn more about this.
Please set the following credentials in a `.env` file in the root of the `/server` folder.
For more information on setting up credentials, see Cloud Storage Container and Adobe I/O Console Technical Integrations above.
```
# Path to the AIO Integration File JSON (defaults to the current working directory + `console.json`; only applicable if running in the context of an App Builder application)
ASSET_COMPUTE_INTEGRATION_FILE_PATH=

# Path to the Private Key file for the AIO Integration
ASSET_COMPUTE_PRIVATE_KEY_FILE_PATH=

# S3 credentials
S3_BUCKET=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=

# Azure Storage credentials
AZURE_STORAGE_ACCOUNT=
AZURE_STORAGE_KEY=
AZURE_STORAGE_CONTAINER_NAME=

# Optional (can be used during development)
# ASSET_COMPUTE_DEV_PORT=

# Optional Asset Compute endpoint (defaults to the Asset Compute production endpoint)
# ASSET_COMPUTE_URL=
```
- cd into `/server`.
- Make sure to do a clean `yarn install` in `/server`.
- If this is your first time using the dev tool, or there have been changes to the UI since you last used it, run `npm run build`.
- Run `npm run start-prod` (the commands are sketched in sequence below).
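Put together, a production-mode run looks roughly like this (a sketch, assuming you start from the repository root):

```sh
cd server
yarn install          # clean install of the server dependencies
npm run build         # only needed on first use or after UI changes
npm run start-prod    # start the dev tool in production mode
```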
NOTE: Only use dev mode if you are developing or adding changes to the devtool itself. Please use production mode if you are just using the devtool.
- Make sure to set the environment variable `ASSET_COMPUTE_DEV_TOOL_ENV='development'`.
- If it is your first time using the dev tool or you made changes to the UI, cd into `/client` and run `yarn install`.
- cd into the `/server` directory (remember to do a `yarn install` there as well).
- Run `npm run start-dev` (see the sketch after this list).

Note: The backend port defaults to 9000. If you already have something running on that port or would like to change it, set the `ASSET_COMPUTE_DEV_PORT` environment variable. WARNING: this updates the `package.json` in `/client`.
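In sequence, a development-mode session might look like this (a sketch, assuming a fresh checkout and starting from the repository root):

```sh
export ASSET_COMPUTE_DEV_TOOL_ENV='development'

# Only needed on first use or after UI changes
cd client
yarn install

# Start the backend in development mode
cd ../server
yarn install
npm run start-dev
```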
- Make sure to do a clean `yarn install` in both `/server` and `/client`.
- cd into `/server`.
- Run `npm run build`.
- Update the version number in `package.json` (inside `/server`) to reflect the next chronological version number.
- Push the updated `package.json` to master.
- Create the git tag for the release.
- Run `npm publish --access public --dry-run`, then `npm publish --access public` to release to npm (a sketch of the full sequence follows).
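For reference, the release steps above might run roughly as follows (a sketch; the version number is a placeholder and the tag naming is an assumption, not a documented convention of this repo):

```sh
cd client && yarn install && cd ..
cd server && yarn install
npm run build

# After bumping the version in server/package.json and pushing it to master
git tag v1.2.3                           # placeholder version
git push origin v1.2.3

npm publish --access public --dry-run    # verify the package contents first
npm publish --access public
```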
As stated by Adobe Reactor Extensions Core, the `client` in this project depends on the @react/react-spectrum package, which in turn depends on the @react/collection-view package. Neither package is published to the public npm repository. In order to support their installation and use, they have been included in this project as tar files. Each file has been prepended with the following message regarding the license:

```
/**
 * Use of this code is governed by the Adobe Terms of Use and
 * Adobe Developer Additional Terms, and the license attached
 * to this repo does not apply.
 */
```
While changing package.json to point to the tar file for @react/react-spectrum is simple and natively supported by npm, this is not the case with @react/collection-view, since it is a dependency of @react/react-spectrum. To solve this problem, this project uses Yarn for installing dependencies since it natively supports selective dependency resolutions.
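For illustration, the Yarn feature in question is the `resolutions` field in `package.json`. A minimal sketch (the tar file paths below are assumptions, not the exact files shipped in this repo):

```json
{
  "dependencies": {
    "@react/react-spectrum": "file:./tars/react-spectrum.tgz"
  },
  "resolutions": {
    "@react/collection-view": "file:./tars/collection-view.tgz"
  }
}
```

With the `resolutions` entry in place, `yarn install` resolves the transitive @react/collection-view dependency to the local tar file instead of trying to fetch it from the npm registry.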
Contributions are welcome! Read the Contributing Guide for more information.
This project is licensed under the Apache V2 License. See LICENSE for more information.