On-chain deployment checklist
This section details a checklist of requirements that must be satisfied so that your AI agent can be properly deployed on-chain.
Required FSM App skill models
Ensure that your skill extends the following classes in the models.py file:
- packages.valory.skills.abstract_round_abci.models.BenchmarkTool
- packages.valory.skills.abstract_round_abci.models.BaseParams
You can define custom arguments for the skill, if required. If no custom arguments are needed, simply alias the base classes:
from packages.valory.skills.abstract_round_abci.models import BenchmarkTool
from packages.valory.skills.abstract_round_abci.models import BaseParams
# (...)
YourBenchmarkTool = BenchmarkTool
YourSkillParams = BaseParams
If your skill requires custom arguments, subclass BaseParams instead:
from typing import Any

from packages.valory.skills.abstract_round_abci.models import BenchmarkTool
from packages.valory.skills.abstract_round_abci.models import BaseParams
# (...)
YourBenchmarkTool = BenchmarkTool


class YourSkillParams(BaseParams):
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        self.your_custom_arg_1: str = self._ensure("your_custom_arg_1", kwargs, str)
        self.your_custom_arg_2: str = self._ensure("your_custom_arg_2", kwargs, str)
        # (...)
        super().__init__(*args, **kwargs)
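Note that any argument read with _ensure must also have a matching entry under the skill's params args in skill.yaml, as required by the checklist in the next subsection. A minimal sketch, assuming the two hypothetical arguments your_custom_arg_1 and your_custom_arg_2 defined above (the placeholder values are illustrative only):
models:
  params:
    args:
      your_custom_arg_1: some_placeholder_value
      your_custom_arg_2: some_placeholder_value
      # (...)
    class_name: YourSkillParams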
Required arguments and overrides
Ensure that your FSM App skill, agent blueprint and AI agent configuration files (skill.yaml, aea-config.yaml, and service.yaml, respectively) define the appropriate arguments and overrides:
- skill.yaml: Must define default/placeholder values for the arguments associated with the YourSkillParams class.
- aea-config.yaml: Must define overrides for the valory/abci connection, the valory/ledger connection, the valory/p2p_libp2p_client connection, and your FSM App skill. Environment variables used for agent-blueprint-level overrides can use the simplified syntax ${<type>:<default_value>}.
- service.yaml: Must define overrides for the valory/ledger connection and your FSM App skill (optionally, also for the valory/p2p_libp2p_client connection). Environment variables used for AI-agent-level overrides use the syntax ${<env_var_name>:<type>:<default_value>}. They will be exported as their upper-case JSON path in the agent blueprint Docker container (see the illustration after the configuration examples below). See also the AI agent level overrides and multiple overrides sections for more information.
name: <your_skill_name>
author: <author>
version: <version>
type: skill
# (...)
models:
  benchmark_tool:
    args:
      log_dir: /logs
    class_name: YourBenchmarkTool
  params:
    args:
      setup:
        all_participants:
        - '0x0000000000000000000000000000000000000000'
        safe_contract_address: '0x0000000000000000000000000000000000000000'
        consensus_threshold: null
      tendermint_url: http://localhost:26657
      tendermint_com_url: http://localhost:8080
      service_registry_address: null
      share_tm_config_on_startup: false
      on_chain_service_id: null
      # (...)
    class_name: YourSkillParams
# (...)
---
public_id: valory/abci:0.1.0
type: connection
config:
  target_skill_id: <author>/<your_skill_name>:<version>
  host: ${ABCI_HOST:str:localhost}
  port: ${ABCI_PORT:int:26658}
  use_tendermint: ${ABCI_USE_TENDERMINT:bool:false}
---
public_id: valory/ledger:0.19.0
type: connection
config:
  ledger_apis:
    ethereum:
      address: ${str:http://localhost:8545}
      chain_id: ${int:31337}
      poa_chain: ${bool:false}
      default_gas_price_strategy: ${str:eip1559}
---
public_id: valory/p2p_libp2p_client:0.1.0
type: connection
config:
  nodes:
  - uri: ${P2P_URI:str:acn.autonolas.tech:9005}
    public_key: ${P2P_PUBLIC_KEY:str:02d3a830c9d6ea1ae91936951430dee11f4662f33118b02190693be835359a9d77}
  - uri: ${P2P_URI:str:acn.autonolas.tech:9006}
    public_key: ${P2P_PUBLIC_KEY:str:02e741c62d706e1dcf6986bf37fa74b98681bc32669623ac9ee6ff72488d4f59e8}
cert_requests:
- identifier: acn
  ledger_id: ethereum
  message_format: '{public_key}'
  not_after: '2025-01-01'
  not_before: '2024-01-01'
  public_key: ${P2P_PUBLIC_KEY:str:02d3a830c9d6ea1ae91936951430dee11f4662f33118b02190693be835359a9d77}
  save_path: .certs/acn_cosmos_9005.txt
---
public_id: <author>/<your_skill_name>:<version>
type: skill
models:
  benchmark_tool:
    args:
      log_dir: ${str:/benchmarks}
  params:
    args:
      setup:
        all_participants: ${list:[]}
        safe_contract_address: ${str:'0x0000000000000000000000000000000000000000'}
        consensus_threshold: ${int:null}
      tendermint_url: ${str:http://localhost:26657}
      tendermint_com_url: ${str:http://localhost:8080}
      service_registry_address: ${str:null}
      share_tm_config_on_startup: ${bool:false}
      on_chain_service_id: ${int:null}
      # (...)
# (...)
---
public_id: valory/ledger:0.19.0
type: connection
config:
  ledger_apis:
    ethereum:
      address: ${ETHEREUM_ADDRESS:str:http://localhost:8545}
      chain_id: ${ETHEREUM_CHAIN_ID:int:31337}
      poa_chain: ${ETHEREUM_POA_CHAIN:bool:false}
      default_gas_price_strategy: ${DEFAULT_GAS_PRICE_STRATEGY:str:eip1559}
---
public_id: <author>/<your_skill_name>:<version>
type: skill
models:
  benchmark_tool:
    args:
      log_dir: ${LOG_DIR:str:/benchmarks}
  params:
    args:
      setup:
        all_participants: ${ALL_PARTICIPANTS:list:["0x...","0x...","0x...","0x..."]}
        safe_contract_address: ${SAFE_CONTRACT_ADDRESS:str:0x...}
        consensus_threshold: ${CONSENSUS_THRESHOLD:int:null}
      tendermint_url: ${str:http://localhost:26657}
      tendermint_com_url: ${str:http://localhost:8080}
      service_registry_address: ${SERVICE_REGISTRY_ADDRESS:str:0x...}
      share_tm_config_on_startup: ${SHARE_TM_CONFIG_ON_STARTUP:bool:false}
      on_chain_service_id: ${ON_CHAIN_SERVICE_ID:int:1}
      # (...)
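As an illustration of the AI-agent-level syntax above, an override such as chain_id: ${ETHEREUM_CHAIN_ID:int:31337} is resolved from the environment variable ETHEREUM_CHAIN_ID (parsed as an int) if it is set at deployment time, and falls back to the default 31337 otherwise. The resolved values are exported inside the agent blueprint Docker container under their upper-case JSON path; as a purely illustrative assumption, an argument like models.params.args.setup.safe_contract_address would end up under a name along the lines of MODELS_PARAMS_ARGS_SETUP_SAFE_CONTRACT_ADDRESS, possibly prefixed with the component type and name depending on the framework version.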
Important
Recall that when deploying an on-chain AI agent using autonomy deploy from-token, a number of arguments (under setup) are overridden with the values registered in the Autonolas Protocol:
# (...)
models:
  params:
    args:
      setup:
        all_participants: # Overridden with the registered values
        safe_contract_address: # Overridden with the registered values
        consensus_threshold: # Overridden with the registered values
For local deployments, the argument consensus_threshold can take either of the following values:
- null: the framework automatically calculates consensus_threshold as \(\lceil (2N+1)/3 \rceil\), where \(N=\) len(all_participants).
- Any value \(M\) such that \(\lceil (2N+1)/3 \rceil \leq M \leq N\).
Otherwise, the framework will raise an error and the app will not start.
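For example, with \(N=4\) participants and consensus_threshold set to null, the framework computes \(\lceil (2\cdot 4+1)/3 \rceil = 3\); any explicit value \(M\) with \(3 \leq M \leq 4\) would also be accepted.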
Publish and mint packages
Ensure that your component, agent blueprint, and AI agent packages are published to the IPFS registry:
- Push your components using the autonomy push command.
- Publish your agent blueprints using the autonomy publish command.
- Publish your AI agents using the autonomy publish command.
Ensure that your component, agent blueprint, and AI agent packages are minted on-chain in the Autonolas Protocol.
Check the deployment readiness of the AI agent using:
$ autonomy analyse service --public-id PUBLIC_ID
or, if you want to check the deployment readiness of an on-chain AI agent:
$ autonomy analyse service --token-id TOKEN_ID
Publish Docker images (optional)
You can build the Docker images for the AI agent using the autonomy build-image command. Alternatively, the images are built automatically when the AI agent is deployed using the autonomy deploy from-token command.
If you want to pin a stable hash or a stable version of the runtime image, you can provide the hash/version via the --image-version option of the autonomy deploy build command.
Ensure that the image exists before running the deployment:
$ docker pull <author>/oar_runtime_<ai_agent_name>:<ipfs_hash_ai_agent>