Deploying models for InstructLab
After training completes, you can choose to deploy the trained model. The following sections describe deploying to IBM watsonx on IBM Cloud and to RHEL-AI on IBM Cloud.
Deploying the model to Watsonx on IBM Cloud
- Sign up for IBM watsonx as a Service.
- If you do not have a project yet, create one.
- Add a connection to the Object Storage data source in IBM Cloud.
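Optionally, before wiring the bucket into watsonx, you can confirm that the trained model artifacts are actually visible in Object Storage. The snippet below is a minimal sketch that reuses the bucket-listing call from the RHEL-AI section later on this page; the bearer token, bucket name, and endpoint are placeholders to replace with your own values.

# Minimal sketch: list the trained model objects in the bucket (placeholder values)
BEARER_TOKEN="XXX"
CUSTOMER_BUCKET="XXX"
COS_ENDPOINT=https://s3.direct.us-east.cloud-object-storage.appdomain.cloud
curl -G "$COS_ENDPOINT/$CUSTOMER_BUCKET" \
  --data-urlencode "list-type=2" \
  --data-urlencode "prefix=trained_models/" \
  -H "Authorization: Bearer $BEARER_TOKEN"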
Deploying the model to RHEL-AI on IBM Cloud
- Using the IBM Cloud CLI, get a bearer token (a sketch for capturing the token into a shell variable follows this list):

ibmcloud iam oauth-tokens
- Update the variables in the following bash script and run it (a quick check of the downloaded files also follows this list):
#!/usr/bin/env bash

# Replace variable with the bearer token
BEARER_TOKEN="XXX"
# Replace variable with the Object Storage bucket name
CUSTOMER_BUCKET="XXX"
# Replace variable with the Object Storage endpoint
COS_ENDPOINT=https://s3.direct.us-east.cloud-object-storage.appdomain.cloud
# Replace variable with the model ID
MODEL_PREFIX="trained_models/XXX/model/"
# Replace variable with the model directory path
MODEL_DIR=/root/model/modeltest

curl -v -G "$COS_ENDPOINT/$CUSTOMER_BUCKET" \
  --data-urlencode "list-type=2" \
  --data-urlencode "prefix=$MODEL_PREFIX" \
  -H "Authorization: Bearer $BEARER_TOKEN" >/tmp/rawxml.txt

cat /tmp/rawxml.txt | awk '{split($0,a,"<Key>"); for (i=1; i<=length(a); i++) print a[i]}' >/tmp/keysonnewline.txt

mkdir -p "$MODEL_DIR"

while read -r line; do
  if [[ "$line" != "trained_models"* ]]; then
    continue
  fi
  KEY_TO_DOWNLOAD=$(echo "$line" | awk -F '<' '{print $1}')
  FILE_NAME=$(basename "$KEY_TO_DOWNLOAD")
  curl -X "GET" "$COS_ENDPOINT/$CUSTOMER_BUCKET/$KEY_TO_DOWNLOAD" \
    -H "Authorization: Bearer $BEARER_TOKEN" >"${MODEL_DIR}/$FILE_NAME"
done </tmp/keysonnewline.txt
- Then use the ilab commands to serve and chat with the model:

ilab model serve --model-path $MODEL_DIR -- --tensor-parallel-size 1 --host 0.0.0.0 --port 8080
ilab model chat --endpoint-url http://localhost:8080/v1 -m $MODEL_DIR
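If you prefer to capture the bearer token from the first step into a shell variable rather than copying it by hand, the following is a minimal sketch; it assumes the token appears as the last field of the "IAM token" line printed by ibmcloud iam oauth-tokens.

# Minimal sketch (assumes the token is the last field of the "IAM token" line)
BEARER_TOKEN=$(ibmcloud iam oauth-tokens | awk '/IAM token/ {print $NF}')
echo "${BEARER_TOKEN:0:20}..."   # sanity check: print only the start of the token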
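After the download script finishes, it is worth checking that the model directory is populated before serving. This is a minimal sketch; the exact file names depend on the trained model.

# Minimal sketch: confirm the model files were downloaded
ls -lh "$MODEL_DIR"
# A servable checkpoint normally includes a config.json alongside the weight files
test -f "$MODEL_DIR/config.json" && echo "config.json found" || echo "config.json missing"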
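Once ilab model serve is running, you can also smoke-test the endpoint outside of ilab model chat. The sketch below assumes the server exposes an OpenAI-compatible API at the same /v1 URL used by the chat command, with the model registered under the $MODEL_DIR path.

# Minimal sketch: list the served model, then send a single chat request
curl http://localhost:8080/v1/models
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$MODEL_DIR\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}"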